Flowise and Gitea
September 5, 2025
TL;DR
I don't quite like that some use cases require the enterprise version of n8n.
So I am putting together a Flowise x Gitea stack that is fully F/OSS (Apache v2 and MIT).
Intro
Flowise AI
Most importantly, among similar low/no-code tools, FlowiseAI is open source, licensed under Apache v2.
git clone https://github.com/FlowiseAI/Flowise
cd ./Flowise/docker
cp .env.example .env
cat <<EOL >> .env
FLOWISE_USERNAME=teco
FLOWISE_PASSWORD=paco
EOL
sudo docker compose up -d
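Once the containers are up, you can poll Flowise until it answers. This is a minimal sketch assuming the default Flowise port 3000; adjust `FLOWISE_URL` if you changed the port in `.env`:

```shell
# Poll Flowise until it responds (assumes the default port 3000)
FLOWISE_URL="http://localhost:3000"
for i in $(seq 1 5); do
  if curl -fsS --max-time 2 "$FLOWISE_URL" > /dev/null 2>&1; then
    echo "Flowise is up at $FLOWISE_URL"
    break
  fi
  sleep 1
done
```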
Integrations: https://docs.flowiseai.com/integrations
Flowise API SDK Embed
Extend and integrate Flowise into your applications using its APIs, SDKs, and embedded chat:
- APIs: https://docs.flowiseai.com/api-reference
- Embedded Widget: https://docs.flowiseai.com/using-flowise/embed
- Typescript & Python SDK
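As a quick sketch of the API route, Flowise exposes each chatflow as a prediction endpoint. The chatflow ID below is a placeholder; copy the real one from the Flowise UI, and uncomment the `curl` line once the flow exists:

```shell
# Minimal sketch of calling the Flowise prediction API.
# CHATFLOW_ID is a placeholder - grab the real ID from the Flowise UI.
FLOWISE_URL="http://localhost:3000"
CHATFLOW_ID="your-chatflow-id"

PAYLOAD='{"question": "Hello, what can you do?"}'

# Uncomment once Flowise is running and the chatflow exists:
# curl -s -X POST "$FLOWISE_URL/api/v1/prediction/$CHATFLOW_ID" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
echo "$PAYLOAD"
```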
Connect to Flowise
If you want, plug your Ollama instance into Flowise:
cd ollama
sudo docker compose up -d
#sudo docker ps | grep ollama
docker network connect cloudflared_tunnel ollama #network -> container name
#docker inspect ollama --format '{{json .NetworkSettings.Networks}}' | jq
Or just use third-party LLM providers:
- Groq: https://console.groq.com/keys
- Gemini (Google): https://ai.google.dev/gemini-api/docs
- Mistral: https://docs.mistral.ai/api/
- Anthropic (Claude): https://www.anthropic.com/api
- OpenAI: https://platform.openai.com/api-keys
- Grok: https://x.ai/api
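Before wiring a provider key into Flowise, it helps to sanity-check it from the shell. A sketch for Groq, whose API is OpenAI-compatible; it only fires if `GROQ_API_KEY` is set in your environment:

```shell
# Quick sanity check of a Groq API key before adding it to Flowise.
# Skips silently if GROQ_API_KEY is not set.
if [ -n "${GROQ_API_KEY:-}" ]; then
  curl -fsS https://api.groq.com/openai/v1/models \
    -H "Authorization: Bearer $GROQ_API_KEY" | head -c 200
else
  echo "GROQ_API_KEY not set - skipping check"
fi
```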
cd ./Flowise/docker
sudo docker compose up -d
#sudo docker ps | grep flowise
docker network connect cloudflared_tunnel flowise #network -> container name
#docker inspect flowise --format '{{json .NetworkSettings.Networks}}' | jq