# 🤖 Agentic LLM Hub

Self-hosted AI agent platform with multi-provider LLM aggregation, reasoning engines (ReAct, Plan-and-Execute, Reflexion), MCP tools, and a web IDE.

## 🚀 Quick Start

```bash
# 1. Clone from your Gitea
git clone https://gitea.yourdomain.com/youruser/llm-hub.git
cd llm-hub

# 2. Configure
cp .env.example .env
nano .env  # Add your API keys

# 3. Deploy
./setup.sh && ./start.sh
```
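
A minimal `.env` sketch for step 2, assuming one API key per provider listed below; the exact variable names live in `.env.example` and may differ:

```bash
# Illustrative only; check .env.example for the real variable names
GROQ_API_KEY=gsk_...
MISTRAL_API_KEY=...
ANTHROPIC_API_KEY=sk-ant-...
DEEPSEEK_API_KEY=...
```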

## 📡 Access Points

| Service | URL | Description |
|---------|-----|-------------|
| VS Code IDE | http://your-ip:8443 | Full IDE with Continue.dev |
| Agent API | http://your-ip:8080/v1 | Main API endpoint |
| LiteLLM | http://your-ip:4000 | LLM gateway |
| MCP Tools | http://your-ip:8001/docs | Tool OpenAPI docs |
| ChromaDB | http://your-ip:8000 | Vector memory |
| Web UI | http://your-ip:3000 | Chat interface |
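
Once the stack is up, a quick way to sanity-check a few endpoints (a sketch; it assumes the Agent API exposes an OpenAI-compatible `/v1`, FastAPI's default `/openapi.json` behind the MCP `/docs` UI, and ChromaDB's v1 heartbeat route, all of which may differ in your deployment):

```bash
curl -s http://your-ip:8080/v1/models            # Agent API model list (assuming OpenAI-compatible /v1)
curl -s http://your-ip:8001/openapi.json | head  # MCP tool schema behind the /docs UI
curl -s http://your-ip:8000/api/v1/heartbeat     # ChromaDB heartbeat (newer v2 servers use /api/v2/heartbeat)
```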

## 🔧 Supported Providers

- Groq (Free tier, fast)
- Mistral (1B tokens/month free)
- Anthropic Claude (Trial credits)
- Moonshot Kimi ($5 signup bonus)
- OpenRouter (Free tier access)
- Cohere (1K calls/month)
- DeepSeek (Cheap reasoning)
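
All providers are routed through the LiteLLM gateway, so the models registered there can be listed from its OpenAI-compatible API. A sketch; `$LITELLM_MASTER_KEY` is a placeholder for whatever key this deployment configures:

```bash
# List models registered with the LiteLLM gateway on port 4000
curl -s http://your-ip:4000/v1/models \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY"
```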

## 🧠 Reasoning Modes

- `react` - Fast iterative reasoning
- `plan_execute` - Complex multi-step tasks
- `reflexion` - Self-correcting with verification
- `auto` - Automatic selection
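
A hypothetical request sketch for selecting a mode per call; the `reasoning_mode` field name and the `auto` model alias are assumptions about the Agent API schema, not confirmed parts of it:

```bash
# Hypothetical: field names below are assumptions about the Agent API request body
curl http://your-ip:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $AGENT_API_KEY" \
  -d '{
    "model": "auto",
    "reasoning_mode": "plan_execute",
    "messages": [{"role": "user", "content": "Research and summarize three sources on MCP."}]
  }'
```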

## 📚 Documentation

## 🔄 Updates

```bash
git pull origin main
docker-compose pull
docker-compose up -d
```