Create Apps faster with AI: GPT Engineer
- GPT-Engineer
- GPT-Pilot
- SmolDev
GPT-Engineer
- https://pypi.org/project/gpt-engineer/
- https://github.com/gpt-engineer-org/gpt-engineer
- License (MIT): https://github.com/gpt-engineer-org/gpt-engineer?tab=MIT-1-ov-file#readme
GPT Engineer with (Local and Free) LLM
https://gpt-engineer.readthedocs.io/en/latest/open_models.html
- Leveraging an OpenAI-compatible API, like we saw with textgenui (see the sketch below):
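A minimal sketch, assuming a local server such as text-generation-webui or llama.cpp already exposes an OpenAI-compatible endpoint on port 8000. The variable names and flags follow the open-models guide linked above and may differ between gpt-engineer versions, and the model name is illustrative:
export OPENAI_API_BASE="http://localhost:8000/v1" # your local OpenAI-compatible server
export OPENAI_API_KEY="sk-dummy" # most local servers accept any key
export MODEL_NAME="TheBloke/CodeLlama-13B-GPTQ" # illustrative model name
export LOCAL_MODEL=true
gpte projects/my-local-project $MODEL_NAME --lite --temperature 0.1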
SelfHosting GPT Engineer
GPT Engineer with Docker
https://github.com/gpt-engineer-org/gpt-engineer/blob/main/docker/README.md
git clone https://github.com/gpt-engineer-org/gpt-engineer
cd gpt-engineer
With Venv:
python3 -m venv gpteng # create the venv (use python3 on Linux)
gpteng\Scripts\activate # activate the venv (Windows)
source gpteng/bin/activate # activate the venv (Linux)
With conda:
##conda --version
conda create -n gpteng python=3.11
conda activate gpteng
python -m pip install gpt-engineer # stable version
#python -m pip install -r requirements.txt #all at once
conda deactivate # (run this when you are done working in the env)
export OPENAI_API_KEY=[your api key sk-proj-...]
Run it with: https://gpt-engineer.readthedocs.io/en/latest/quickstart.html
gpte projects/my-new-project
You will see that a folder gets created under ./projects/my-new-project
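Depending on your gpt-engineer version, you may need to create the project folder and its plain-text prompt file yourself before running gpte; a minimal sketch (the file name prompt is what the quickstart uses, the spec text is illustrative):
mkdir -p projects/my-new-project
echo "A CLI pomodoro timer written in Python" > projects/my-new-project/prompt # the spec gpt-engineer reads
gpte projects/my-new-project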
You can iterate on the already created code:
gpte projects/my-old-project -i
GPT-Pilot
https://github.com/Pythagora-io/gpt-pilot
https://github.com/Pythagora-io/gpt-pilot?tab=MIT-1-ov-file#readme
CLI Version
With GPT API
GPT Pilot with Docker
https://github.com/Pythagora-io/gpt-pilot?tab=readme-ov-file#-how-to-start-gpt-pilot-in-docker
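A hedged sketch of the flow described in that README (command names and ports may change between releases, so verify against the linked doc):
git clone https://github.com/Pythagora-io/gpt-pilot.git && cd gpt-pilot
# set your API key / endpoint in the compose environment first (see the .env variables below)
docker compose build
docker compose up
# then open the web terminal the container exposes (port 7681 per the README) and run: python main.py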
Pre-Requisites - Get Docker!
An important step, and highly recommended for any SelfHosting project - get Docker installed.
On Linux, it is just one command:
sudo apt-get update && sudo apt-get upgrade && curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh && docker version
GPT Pilot With (FREE & Local) LLMs
- https://github.com/Pythagora-io/gpt-pilot/blob/main/pilot/.env.example
- https://github.com/Pythagora-io/gpt-pilot/wiki/Using-GPT%E2%80%90Pilot-with-Local-LLMs
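Both guides assume an OpenAI-compatible endpoint is already running locally. A minimal sketch using llama-cpp-python's bundled server (the model file and port are illustrative; text-generation-webui or Ollama expose a similar API):
python -m pip install "llama-cpp-python[server]"
python -m llama_cpp.server --model ./models/codellama-7b-instruct.Q4_K_M.gguf --port 8000
# GPT Pilot can then point at http://localhost:8000/v1/chat/completions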
We can use:
# OPENAI or AZURE or OPENROUTER
ENDPOINT=OPENAI
OPENAI_ENDPOINT=http://localhost:8000/v1/chat/completions # local server; default is https://api.openai.com/v1/chat/completions
OPENAI_API_KEY=
# In case of Azure/OpenRouter endpoint, change this to your deployed model name
MODEL_NAME=gpt-4-1106-preview
# MODEL_NAME=gpt-4
# MODEL_NAME=gpt-3.5-turbo-16k
MAX_TOKENS=8192
After you have Python and (optionally) PostgreSQL installed, follow these steps:
git clone https://github.com/Pythagora-io/gpt-pilot.git (clone the repo)
cd gpt-pilot
python -m venv pilot-env (create a virtual environment)
source pilot-env/bin/activate (or on Windows pilot-env\Scripts\activate) (activate the virtual environment)
pip install -r requirements.txt (install the dependencies)
cd pilot
mv .env.example .env (or on Windows copy .env.example .env) (create the .env file)
Add your environment to the .env file:
LLM Provider (OpenAI/Azure/Openrouter)
Your API key
database settings: SQLite/PostgreSQL (to change from SQLite to PostgreSQL, just set DATABASE_TYPE=postgres - see the sketch after these steps)
optionally set IGNORE_FOLDERS for folders that shouldn't be tracked by GPT Pilot in the workspace, useful to ignore folders created by compilers (e.g. IGNORE_FOLDERS=folder1,folder2,folder3)
python db_init.py (initialize the database)
python main.py (start GPT Pilot)
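If you go the PostgreSQL route, the .env also needs the connection details (set them before the db_init.py step); a hedged sketch - the variable names follow the repo's .env.example at the time of writing, so verify them against your copy:
DATABASE_TYPE=postgres
DB_NAME=gpt-pilot
DB_HOST=localhost
DB_PORT=5432
DB_USER=postgres
DB_PASSWORD=yourpassword # illustrative value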
After this, you can just follow the instructions in the terminal.
Smol-Dev
https://github.com/smol-ai/developer
https://github.com/smol-ai/developer?tab=MIT-1-ov-file#readme
The first library to let you embed a developer agent in your own app!
It scaffolds an entire codebase for you once you give it a product spec, and gives you basic building blocks to embed a smol developer inside your own app.
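A hedged quick-start sketch (the entry-point script names have changed across versions, so treat this as illustrative and check the README; the prompt is made up):
pip install smol_dev # library form, to embed in your own app
git clone https://github.com/smol-ai/developer && cd developer
python main_no_modal.py "a simple pomodoro timer web app" # CLI form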
https://twitter.com/SmolModels
- The Official Site
- The Source Code at Github
- License: MIT ❤️
Conclusions
https://anakin.ai/blog/gpt-pilot/
FAQ
How to get better with our prompts?
An Open-Source Framework for Prompt-Learning.
How to use GPT Engineer with F/OSS LLMs
Other F/OSS Alternatives to GPT-Engineer
- OpenDevin - https://github.com/OpenDevin/OpenDevin
docker pull ghcr.io/opendevin/opendevin:main
How to use OpenDevin With Docker
Use it with Docker:
WORKSPACE_BASE=$(pwd)/workspace
docker run -it \
--pull=always \
-e SANDBOX_USER_ID=$(id -u) \
-e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
-v $WORKSPACE_BASE:/opt/workspace_base \
-v /var/run/docker.sock:/var/run/docker.sock \
-p 3000:3000 \
--add-host host.docker.internal:host-gateway \
--name opendevin-app-$(date +%Y%m%d%H%M%S) \
ghcr.io/opendevin/opendevin:main
OpenDevin will be ready at localhost:3000
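If you prefer not to enter the model credentials in the web UI, they can also be passed to the container as environment variables; a hedged variant of the command above (the LLM_API_KEY and LLM_MODEL names follow the OpenDevin docs at the time of writing, and gpt-4o is just an illustrative model):
WORKSPACE_BASE=$(pwd)/workspace
docker run -it --pull=always \
  -e LLM_API_KEY=$OPENAI_API_KEY \
  -e LLM_MODEL="gpt-4o" \
  -e SANDBOX_USER_ID=$(id -u) \
  -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
  -v $WORKSPACE_BASE:/opt/workspace_base \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -p 3000:3000 \
  --add-host host.docker.internal:host-gateway \
  ghcr.io/opendevin/opendevin:main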
OpenDevin is an autonomous AI software engineer capable of executing complex engineering tasks and collaborating actively with users on software development projects
OpenDevin: Code Less, Make More
- Devika - https://github.com/stitionai/devika (MIT Licensed ❤️)
Supports Claude 3, GPT-4, Gemini, Mistral, Groq and local LLMs via Ollama. For optimal performance, use the Claude 3 family of models.
Devika is an Agentic AI Software Engineer that can understand high-level human instructions, break them down into steps, research relevant information, and write code to achieve the given objective. Devika aims to be a competitive open-source alternative to Devin by Cognition AI.
- SWE-Agent - https://github.com/princeton-nlp/SWE-agent
SWE-agent takes a GitHub issue and tries to automatically fix it, using GPT-4, or your LM of choice. It solves 12.47% of bugs in the SWE-bench evaluation set and takes just 1 minute to run.
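A hedged example of pointing SWE-agent at a GitHub issue (the run.py flags follow the project's README at the time of writing; the issue URL is a placeholder):
git clone https://github.com/princeton-nlp/SWE-agent && cd SWE-agent
# requires Docker running and your API key configured per the repo's setup guide
python run.py \
  --model_name gpt4 \
  --data_path https://github.com/<owner>/<repo>/issues/<number> \
  --config_file config/default_from_url.yaml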
Code with Multi-Agent Frameworks
The PraisonAI application combines AutoGen and CrewAI (or similar frameworks) into a low-code solution for building and managing multi-agent LLM systems, focusing on simplicity, customisation, and efficient human-agent collaboration. It can also chat with your ENTIRE codebase.
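A hedged sketch of getting started with PraisonAI (repo: https://github.com/MervinPraison/PraisonAI; package and subcommand names as documented at the time of writing, the task prompt is illustrative):
pip install praisonai
export OPENAI_API_KEY=sk-...
praisonai --init "create a simple todo web app" # auto-generates agents.yaml for the task
praisonai # run the generated multi-agent crew
# for the "chat with your entire codebase" feature:
pip install "praisonai[code]"
praisonai code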
Conclusions
F/OSS AI Coding Assistant
Coding Assistants
I personally like VSCodium - the F/OSS version of VSCode
Fully Open Source IDE
- VSCodium
- LAPCE: https://github.com/lapce/lapce - Lightning-fast and Powerful Code Editor written in Rust
- Lap Dev
- https://lap.dev/about/
- https://github.com/lapce/lapdev - Self-Hosted Remote Dev Environment
No Code F/OSS - Rivet IDE
https://www.youtube.com/watch?v=Zd5wjy4YPis
Project Source Code: https://github.com/ironclad/rivet
- License: MIT ❤️
The Open-Source Visual AI Programming Environment
https://www.youtube.com/watch?v=P1PhHWK6n9I
https://www.youtube.com/watch?v=a45y5bmLPY8
Overall, ClippyGPT is a promising concept that has the potential to make AI more accessible and useful for a wider range of people. However, it is important to be aware of the potential challenges associated with this technology and to use it responsibly.