Multi-Agent AI Backend - Complete Setup
✅ What's Working
Backend (FastAPI + LangGraph)
- ✅ Weather Agent - Gets current weather and forecasts
- ✅ Document Agent - RAG with ChromaDB vector store (deterministic tool execution)
- ⚠️ Meeting Agent - Scheduling with weather checks (needs final fix)
- ✅ SQL Agent - Natural language to SQL queries
- ✅ File Upload - PDF/TXT/MD/DOCX processing
Frontend (React.js)
- ✅ Modern gradient UI design
- ✅ Real-time chat with typing indicators
- ✅ File upload with drag-and-drop
- ✅ Chat memory (full conversation history)
- ✅ Example query buttons
- ✅ Error handling
Quick Start
1. Backend Setup
# Ensure virtual environment
cd D:\python_workspace\multi-agent
# Start backend server
uv run uvicorn main:app --reload
Backend runs at: http://localhost:8000
API docs: http://localhost:8000/docs
2. Frontend Setup
# Open new terminal
cd D:\python_workspace\multi-agent\frontend
# First time only - install dependencies
npm install
# Start React development server
npm start
Frontend opens at: http://localhost:3000
Usage Examples
Via Frontend UI
- Open http://localhost:3000
- Click example buttons or type queries:
  - "What's the weather in Chennai?"
  - "Schedule team meeting tomorrow at 2pm"
  - "Show all meetings scheduled tomorrow"
- Upload documents via the upload button
- Ask questions about uploaded files
Via API (cURL)
Chat:
curl -X POST http://localhost:8000/chat \
-H "Content-Type: application/json" \
-d '{"query": "What is the weather in Chennai?"}'
Upload File:
curl -X POST http://localhost:8000/upload \
-F "file=@document.pdf"
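The same chat call can be made from Python with only the standard library. This is a minimal client sketch mirroring the cURL example above; it assumes the `/chat` endpoint returns JSON and that the backend runs on the default port.

```python
import json
import urllib.request

API_BASE = "http://localhost:8000"  # assumes the default backend port


def build_chat_request(query: str) -> urllib.request.Request:
    """Build the POST /chat request, mirroring the cURL example."""
    body = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def chat(query: str) -> dict:
    """Send a chat query and return the parsed JSON response."""
    with urllib.request.urlopen(build_chat_request(query)) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    print(chat("What is the weather in Chennai?"))
```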
Architecture
┌──────────────────────────────────────────────────┐
│          React Frontend (Port 3000)              │
│  • Chat UI with memory                           │
│  • File upload                                   │
│  • Example queries                               │
└────────────────┬─────────────────────────────────┘
                 │ HTTP (CORS enabled)
                 ▼
┌──────────────────────────────────────────────────┐
│          FastAPI Backend (Port 8000)             │
│  • /chat endpoint                                │
│  • /upload endpoint                              │
└────────────────┬─────────────────────────────────┘
                 │
        ┌────────┴────────────┐
        │ LangGraph Workflow  │
        │ (Router + 4 Agents) │
        └────────┬────────────┘
                 │
     ┌───────────┼───────────┬─────────────┐
     ▼           ▼           ▼             ▼
┌─────────┐ ┌──────────┐ ┌──────────┐ ┌─────────┐
│ Weather │ │ Document │ │ Meeting  │ │  SQL    │
│  Agent  │ │  +RAG    │ │  Agent   │ │  Agent  │
└─────────┘ └──────────┘ └──────────┘ └─────────┘
     │           │            │            │
     ▼           ▼            ▼            ▼
OpenWeather  ChromaDB    Schedule+      SQLite
   API       Vector DB    Weather         DB
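The router's decision can be sketched as a simple dispatch on the query. The real router in agents.py is a LangGraph node (and may use the LLM itself to classify); the keyword rules below are illustrative only.

```python
def route_query(query: str) -> str:
    """Pick which agent should handle a user query (simplified sketch)."""
    q = query.lower()
    if any(w in q for w in ("weather", "forecast", "temperature")):
        return "weather_agent"
    if any(w in q for w in ("meeting", "schedule", "calendar")):
        return "meeting_agent"
    if any(w in q for w in ("sql", "table", "database", "show all")):
        return "sql_agent"
    # Default: answer from uploaded documents via RAG
    return "document_agent"
```

In the LangGraph workflow this function would back a conditional edge out of the router node, with each returned name mapping to the corresponding agent node.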
Configuration
Environment Variables (.env)
# Recommended for testing
GITHUB_TOKEN=ghp_your_token_here
# Alternative LLM providers
OPENAI_API_KEY=sk-proj-...
GOOGLE_API_KEY=AIza...
# Weather API
OPENWEATHERMAP_API_KEY=your_key
# Local Ollama (optional)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=qwen2.5:7b
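Since any one of these keys is enough to run, the backend needs a priority order when several are set. The selection sketch below is an assumption about that order (GitHub Models first, matching the "recommended for testing" note); the actual logic lives in the backend code.

```python
import os


def pick_llm_provider() -> str:
    """Return which LLM provider to use, based on which key is configured.

    Illustrative only: mirrors the .env variables listed above, with
    GitHub Models preferred as the recommended testing provider.
    """
    if os.getenv("GITHUB_TOKEN"):
        return "github-models"
    if os.getenv("OPENAI_API_KEY"):
        return "openai"
    if os.getenv("GOOGLE_API_KEY"):
        return "google"
    if os.getenv("OLLAMA_BASE_URL"):
        return "ollama"
    raise RuntimeError("No LLM provider configured in .env")
```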
Get API Keys
- GitHub Token: https://github.com/settings/tokens (free, recommended)
- OpenAI: https://platform.openai.com/api-keys ($0.15/1M tokens)
- OpenWeatherMap: https://openweathermap.org/api (free tier)
Test Results
- ✅ Weather Agent: Working perfectly
- ⚠️ Meeting Agent: Needs weather tool fix
- ✅ SQL Agent: Working perfectly
- ✅ Document RAG: Working with deterministic execution
  - PDF ingestion: ~2-5 seconds
  - Similarity scores: 0.59-0.70
  - Correct answers from documents
  - Web fallback for low confidence (< 0.7)
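The low-confidence fallback above can be sketched as a threshold check on the best retrieval hit. `"web_fallback"` stands in for the real web-search tool, which is hypothetical here; the actual agent logic is in tools.py.

```python
# Similarity below this value triggers the web fallback (matches the
# "< 0.7" threshold noted in the test results above).
SIMILARITY_THRESHOLD = 0.7


def answer_with_fallback(hits: list[tuple[str, float]]) -> str:
    """hits: (chunk_text, similarity_score) pairs, best first.

    Returns a document-grounded answer when retrieval is confident,
    otherwise signals the web fallback path.
    """
    if hits and hits[0][1] >= SIMILARITY_THRESHOLD:
        return f"From documents: {hits[0][0]}"
    return "web_fallback"  # would call the web-search tool in the real agent
```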
Troubleshooting
Backend Issues
"Port 8000 already in use"
# Kill existing process
npx kill-port 8000
# Or use different port
uvicorn main:app --port 8001
"Database locked"
# Delete and recreate
rm meeting_database.db
uv run uvicorn main:app --reload
Frontend Issues
"npm install fails"
cd frontend
npm cache clean --force
rm -rf node_modules package-lock.json
npm install
"Cannot connect to backend"
- Check backend is running: http://localhost:8000/docs
- Verify CORS is enabled in main.py
- Check proxy in frontend/package.json
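If CORS turns out to be the problem, the fix is a middleware block in main.py. This is a typical FastAPI setup allowing the React dev server's origin; adjust `allow_origins` if the frontend runs elsewhere.

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Allow the React dev server (port 3000) to call this API.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```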
"Port 3000 already in use"
npx kill-port 3000
# Or use different port
set PORT=3001 && npm start
Project Structure
multi-agent/
├── agents.py            # LangGraph workflow
├── tools.py             # Tool implementations
├── main.py              # FastAPI server
├── database.py          # SQLite setup
├── vector_store.py      # ChromaDB manager
├── models.py            # Pydantic models
├── test_agents.py       # Test suite
├── .env                 # Configuration
├── pyproject.toml       # Python dependencies
├── FRONTEND_SETUP.md    # Frontend guide
└── frontend/            # React app
    ├── public/
    ├── src/
    │   ├── App.js       # Main component
    │   ├── App.css      # Styling
    │   └── index.js     # Entry point
    ├── package.json     # NPM dependencies
    └── README.md
Production Deployment
Backend (FastAPI)
# Install production server
uv add gunicorn
# Run with gunicorn
gunicorn main:app --workers 4 --worker-class uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000
Frontend (React)
cd frontend
# Build for production
npm run build
# Serve with any static server
npx serve -s build -p 3000
Docker Deployment
# Coming soon - Docker Compose setup
Features Completed
✅ Backend:
- Multi-agent orchestration with LangGraph
- Vector store RAG with ChromaDB
- Deterministic tool execution
- File upload and processing
- Weather integration
- SQL database queries
- Lightweight Docling config (no vision models)
✅ Frontend:
- Modern gradient UI
- Real-time chat
- File upload interface
- Chat memory
- Example queries
- Typing indicators
- Error handling
- Mobile responsive
Next Steps
- Fix Meeting Agent - Apply deterministic weather tool execution
- Add DuckDuckGo Search - Install package for web fallback
- Enhance UI - Add more features to frontend
- Deploy - Production deployment guide
Tips
- Use GitHub Models for stable testing (free tier)
- Upload test documents to see RAG in action
- Check similarity scores in backend logs
- Clear chat to start fresh conversations
- Use example queries for quick testing
Made with ❤️ using FastAPI, LangGraph, React, and ChromaDB