Flowise: build custom LLM applications with drag-and-drop simplicity

Drag-and-drop interface for creating LLM pipelines
Connect OpenAI, Llama2, Mistral, and local models
Launch faster with ready-made chatbot and RAG templates
Self-host on AWS/Azure/GCP with air-gapped security
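Once a pipeline (chatflow) is built, Flowise exposes it over a REST prediction endpoint, so any application can query it with a plain HTTP POST. A minimal sketch using only the Python standard library — the base URL assumes a self-hosted instance on Flowise's default port, and `CHATFLOW_ID` is a placeholder you would replace with an ID from your own instance:

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000"      # default port for a self-hosted Flowise
CHATFLOW_ID = "your-chatflow-id"        # placeholder -- copy the real ID from the Flowise UI

def build_prediction_request(question: str) -> urllib.request.Request:
    """Build a POST request for Flowise's prediction endpoint.

    The endpoint shape is /api/v1/prediction/{chatflowId} and the body
    is a JSON object with a "question" field.
    """
    url = f"{BASE_URL}/api/v1/prediction/{CHATFLOW_ID}"
    payload = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

req = build_prediction_request("What is retrieval-augmented generation?")
print(req.get_method(), req.full_url)
# Send with: urllib.request.urlopen(req) once a Flowise instance is running
```

The request is only constructed here, not sent, so the sketch runs without a live server; calling `urllib.request.urlopen(req)` against a running instance returns the chatflow's JSON response.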
Does Flowise support local LLMs? Yes, through Ollama, HuggingFace, and self-hosted models.
Is Flowise free to use? Yes, it offers open-source self-hosting and a 14-day free trial.
What does the Enterprise version add? Air-gapped deployments, role-based access control (RBAC), and audit logs.
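For self-hosting, a common route is the official Docker image. A minimal Compose sketch — the image name and data directory are taken from Flowise's Docker distribution, and the volume name is an arbitrary choice:

```yaml
services:
  flowise:
    image: flowiseai/flowise          # official image on Docker Hub
    restart: unless-stopped
    ports:
      - "3000:3000"                   # Flowise UI and API
    volumes:
      - flowise_data:/root/.flowise   # persist chatflows and credentials
volumes:
  flowise_data:
```

Running `docker compose up -d` with this file serves the builder UI at http://localhost:3000; for air-gapped deployments the same container runs without outbound network access as long as the models it calls are local.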