---
title: IntegraChat
emoji: 🤖
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: false
license: mit
---
# IntegraChat — Enterprise MCP Autonomous Agent Platform
**IntegraChat** is an enterprise-grade, multi-tenant AI platform that demonstrates the full capabilities of the **Model Context Protocol (MCP)** in a production-style environment.
## 🚀 Quick Start
This Hugging Face Space runs the complete IntegraChat stack:
- **Gradio UI** on port 7860 (main interface)
- **FastAPI Backend** on port 8000 (API endpoints)
- **Unified MCP Server** on port 8900 (RAG/Web/Admin tools)
## ✨ Features
- 🤖 **Autonomous Multi-Step MCP Agents** – An intelligent, tool-aware agent with conversation memory
- 📚 **Enhanced Knowledge Base Management** – Upload documents (PDF/DOCX/TXT/MD) with AI-generated metadata
- 🔍 **Optimized RAG Search** – Cross-encoder re-ranking for improved accuracy
- 🛡️ **Enterprise Admin Governance** – Regex-based red-flag detection with LLM enhancement
- 📊 **Comprehensive Analytics** – Real-time visualizations and tenant-level metrics
- 🌐 **Live Web Search** – Google Programmable Search integration
- 🏢 **Multi-Tenant Isolation** – Complete tenant isolation with role-based access control
## 📖 Usage
1. **Enter Tenant ID**: Set your tenant ID in the UI (top of the page)
2. **Select Role**: Choose your role (viewer, editor, admin, owner) from the dropdown
3. **Start Chatting**: Use the Chat tab to interact with the agent
4. **Ingest Documents**: Upload documents in the Document Ingestion tab (requires editor+ role)
5. **Manage Rules**: Add admin rules in the Admin Rules tab (requires admin+ role)
6. **View Analytics**: Check analytics in the Admin Analytics tab
## 🔧 Configuration
Set these environment variables in your Hugging Face Space settings:
### Required
- `POSTGRESQL_URL` - PostgreSQL connection string with pgvector extension
- `OLLAMA_URL` - Ollama server URL (required unless `LLM_BACKEND=groq`)
- `OLLAMA_MODEL` - Ollama model name (e.g., `llama3.1:latest`)
### Optional
- `SUPABASE_URL` - Supabase project URL (for production storage)
- `SUPABASE_SERVICE_KEY` - Supabase service role key
- `GOOGLE_SEARCH_API_KEY` - Google Custom Search API key
- `GOOGLE_SEARCH_CX_ID` - Google Custom Search Engine ID
- `GROQ_API_KEY` - Groq API key (alternative to Ollama)
- `LLM_BACKEND` - `ollama` or `groq` (default: `ollama`)
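As a minimal sketch, a Groq-backed setup might look like the following. Every value below is a placeholder; set the real values as Space secrets rather than committing them to the repository:

```shell
# Placeholder values only. Configure these under the Space's
# Settings > Variables and secrets, never in the repository.
export POSTGRESQL_URL="postgresql://user:password@db-host:5432/integrachat"
export LLM_BACKEND="groq"
export GROQ_API_KEY="gsk_xxxxxxxx"   # only used when LLM_BACKEND=groq

# Ollama alternative:
# export LLM_BACKEND="ollama"
# export OLLAMA_URL="http://localhost:11434"
# export OLLAMA_MODEL="llama3.1:latest"
```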
## 📚 API Endpoints
The FastAPI backend is available at `/api` (relative to the Space URL):
- `POST /api/agent/message` - Main chat endpoint
- `POST /api/rag/ingest-document` - Ingest documents
- `GET /api/rag/list` - List documents
- `POST /api/admin/rules` - Manage admin rules
- `GET /api/analytics/overview` - View analytics
Full interactive API docs are available at `/api/docs` while the backend is running.
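As an illustration, a chat request can be assembled with nothing but the Python standard library. The payload field names (`tenant_id`, `role`, `message`) are assumptions inferred from the UI flow, not a confirmed schema, and the Space URL is a placeholder:

```python
import json
import urllib.request

def build_chat_request(space_url: str, tenant_id: str, role: str, message: str):
    """Build a POST request for the /api/agent/message endpoint.

    The payload field names here are assumptions for illustration only.
    """
    payload = {"tenant_id": tenant_id, "role": role, "message": message}
    return urllib.request.Request(
        f"{space_url}/api/agent/message",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("https://example.hf.space", "acme", "viewer", "Hello!")
print(req.full_url)  # https://example.hf.space/api/agent/message
# urllib.request.urlopen(req)  # uncomment to actually send the request
```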
## πŸ—οΈ Architecture
```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Gradio UI β”‚ Port 7860
β”‚ (Main Entry) β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜
β”‚
β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ FastAPI Backendβ”‚ Port 8000
β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜
β”‚
β”œβ”€β”€β–Ί MCP Server (Port 8900)
β”‚ β”œβ”€β”€ RAG Tools
β”‚ β”œβ”€β”€ Web Tools
β”‚ └── Admin Tools
β”‚
β”œβ”€β”€β–Ί PostgreSQL (RAG)
β”œβ”€β”€β–Ί Supabase/SQLite (Rules & Analytics)
└──► LLM (Ollama/Groq)
```
## 🔐 Role-Based Access Control
- **viewer** - Basic chat access
- **editor** - Can ingest documents
- **admin** - Can manage rules and delete documents
- **owner** - Full system access
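The hierarchy above can be sketched as a simple rank comparison. This is an illustrative model of the "editor+" / "admin+" checks mentioned under Usage, not the backend's actual implementation:

```python
# Illustrative role hierarchy; the real backend logic may differ.
ROLE_RANK = {"viewer": 0, "editor": 1, "admin": 2, "owner": 3}

def has_permission(user_role: str, required_role: str) -> bool:
    """True when user_role is at least as privileged as required_role."""
    return ROLE_RANK[user_role] >= ROLE_RANK[required_role]

print(has_permission("editor", "editor"))  # True: editors can ingest documents
print(has_permission("editor", "admin"))   # False: managing rules needs admin+
```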
## 📝 License
MIT License - see [LICENSE](LICENSE) file for details.
## 🔗 Links
- [Model Context Protocol](https://modelcontextprotocol.io/)
- [Full Documentation](README.md)
- [Backend Documentation](backend/README.md)
---
**Made with ❤️ for the MCP Hackathon**