Quick Start - Integrated LlamaIndex with MCP & Gradio
Get up and running with the fully integrated EcoMCP system in 5 minutes.
Setup (1 minute)
# 1. Install dependencies
pip install -r requirements.txt
# 2. Set OpenAI API key
export OPENAI_API_KEY=sk-...
# 3. Verify the docs directory exists
ls -la ./docs
Running (2 minutes)
Terminal 1: Start MCP Server
python src/server/mcp_server.py
Expected output:
2025-11-27 ... EcoMCP Server started - listening for JSON-RPC messages
2025-11-27 ... Knowledge base initialized successfully
Terminal 2: Start Gradio UI
python src/ui/app.py
Expected output:
Running on http://0.0.0.0:7860
Testing (2 minutes)
Test 1: Gradio UI Knowledge Search
- Open http://localhost:7860 in a browser
- Click the "🔍 Knowledge Search" tab
- Enter query: deployment guide
- Select search type: Documentation
- Click "🔍 Search"
- Review the results with similarity scores
Test 2: MCP Server Tools (via Python)
import asyncio

from src.server.mcp_server import EcoMCPServer

async def test():
    server = EcoMCPServer()

    # Test knowledge_search
    result = await server.call_tool("knowledge_search", {
        "query": "product features",
        "search_type": "all",
        "top_k": 5
    })
    print(result)

    # Test product_query
    result = await server.call_tool("product_query", {
        "question": "What is the main feature?"
    })
    print(result)

asyncio.run(test())
Features Available
In Gradio UI (6 tabs)
- 📦 Analyze Product - Product analysis
- ⭐ Analyze Reviews - Review sentiment
- ✍️ Generate Listing - Product copy
- 💰 Price Recommendation - Pricing strategy
- 🔍 Knowledge Search ← NEW (LlamaIndex)
- ℹ️ About - Platform information
In MCP Server (7 tools)
- analyze_product - Product analysis
- analyze_reviews - Review analysis
- generate_listing - Copy generation
- price_recommendation - Pricing
- competitor_analysis - Competition
- knowledge_search ← NEW (LlamaIndex)
- product_query ← NEW (LlamaIndex)
Common Tasks
Search Products
results = kb.search_products("wireless headphones", top_k=5)
Search Documentation
results = kb.search_documentation("deployment", top_k=5)
Ask a Question
answer = kb.query("How to deploy this platform?")
Get Recommendations
recs = kb.get_recommendations("gaming laptop", limit=5)
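These helpers assume an already-initialized EcoMCPKnowledgeBase. Under the hood, vector search of this kind ranks documents by cosine similarity between embeddings; a minimal, self-contained sketch of that ranking step (all names and data here are hypothetical, not EcoMCP internals):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def top_k_search(query_vec, docs, top_k=5):
    """Rank (doc_id, embedding) pairs by similarity to the query vector."""
    scored = [(doc_id, cosine_similarity(query_vec, vec)) for doc_id, vec in docs]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Toy 2-dimensional "embeddings" for illustration only
docs = [("deploy.md", [1.0, 0.0]), ("pricing.md", [0.0, 1.0]), ("intro.md", [0.7, 0.7])]
results = top_k_search([1.0, 0.1], docs, top_k=2)
```

The similarity scores returned by the real `kb.search_*` helpers are produced the same way, just over LlamaIndex-managed embeddings rather than toy vectors.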
File Structure
ecomcp/
├── src/
│   ├── server/
│   │   └── mcp_server.py ← MCP with KB integration
│   ├── ui/
│   │   └── app.py ← Gradio with Knowledge tab
│   └── core/
│       ├── knowledge_base.py ← KB implementation
│       ├── document_loader.py ← Document loading
│       ├── vector_search.py ← Search algorithms
│       └── llama_integration.py ← Integration wrapper
├── docs/
│   ├── INTEGRATION_GUIDE.md ← Full integration guide
│   ├── INTEGRATION_SUMMARY.md ← Changes summary
│   ├── LLAMA_FRAMEWORK_REFINED.md ← KB framework details
│   └── *.md ← Indexed documentation
└── requirements.txt
Configuration
Knowledge Base
# In src/server/mcp_server.py
docs_path = "./docs" # Documentation directory
top_k = 5 # Default results
embedding_model = "text-embedding-3-small"
llm_model = "gpt-5"
UI Search
# In src/ui/app.py
search_results = 5 # Results per search
kb.initialize("./docs") # Index documents
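The settings above are scattered across two files; one way to keep them consistent is a single config object with the same defaults. A sketch (KBConfig is a hypothetical name, not the project's actual class):

```python
from dataclasses import dataclass

@dataclass
class KBConfig:
    """Hypothetical grouping of the knowledge-base settings shown above."""
    docs_path: str = "./docs"
    top_k: int = 5
    embedding_model: str = "text-embedding-3-small"
    llm_model: str = "gpt-5"

# Override only what differs from the defaults
config = KBConfig(top_k=3)
```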
Troubleshooting
"Knowledge base not initialized"
- Verify the ./docs directory exists
- Check server logs for initialization errors
- Ensure LlamaIndex is installed: pip list | grep llama
"No results found"
- Try simpler search query
- Check documents are indexed
- Verify OPENAI_API_KEY is set
Search is slow
- Reduce the top_k parameter
- Use a smaller embedding model
- Check disk I/O performance
Knowledge tab not appearing
- Verify LlamaIndex installed
- Check for errors in UI console
- Restart Gradio UI
Next Steps
Index Product Data
products = [{"name": "...", "description": "..."}]
kb.add_products(products)

Deploy to Production
# Using Modal
modal deploy src/server/mcp_server.py

# Using Docker
docker build -t ecomcp .
docker run -e OPENAI_API_KEY=... ecomcp

Scale Knowledge Base
config = IndexConfig(use_pinecone=True)
kb = EcoMCPKnowledgeBase(config=config)

Add Analytics
- Track search queries
- Monitor result quality
- Measure latency
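A minimal in-process sketch of what such analytics could look like (SearchAnalytics is hypothetical, not part of EcoMCP):

```python
import time

class SearchAnalytics:
    """Minimal in-memory tracker for query text, latency, and result counts."""
    def __init__(self):
        self.records = []

    def record(self, query, latency_s, n_results):
        self.records.append(
            {"query": query, "latency_s": latency_s, "n_results": n_results}
        )

    def avg_latency(self):
        """Mean latency in seconds over all recorded queries."""
        if not self.records:
            return 0.0
        return sum(r["latency_s"] for r in self.records) / len(self.records)

analytics = SearchAnalytics()
start = time.perf_counter()
# ... run a real kb.search_documentation(...) call here ...
analytics.record("deployment guide", time.perf_counter() - start, n_results=5)
```

In production you would likely ship these records to a logging or metrics backend instead of keeping them in memory.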
Documentation
- Full Integration Guide: docs/INTEGRATION_GUIDE.md
- Framework Details: docs/LLAMA_FRAMEWORK_REFINED.md
- KB Implementation: src/core/examples.py
- MCP Specification: src/server/mcp_server.py
Support
Check Logs
# Server logs
grep "Knowledge base" logs/*.log
# UI logs (browser console)
F12 → Console tab
Test API
# Test MCP server
echo '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}' | python src/server/mcp_server.py
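The same tools/list message can be built in Python instead of shell; a sketch using only the standard library (the server is assumed, as in the echo example above, to read one JSON-RPC message per line on stdin):

```python
import json

def make_jsonrpc_request(method, request_id=1, params=None):
    """Build a JSON-RPC 2.0 request message as a single line of JSON."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

request = make_jsonrpc_request("tools/list")
# The resulting line could be written to the server's stdin via subprocess.
```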
Verify Installation
python -c "from src.core import EcoMCPKnowledgeBase; print('✓ LlamaIndex installed')"
Tips & Tricks
Faster Searches
# Use smaller model
config = IndexConfig(
embedding_model="text-embedding-3-small",
similarity_top_k=3
)
Better Results
# Use larger model
config = IndexConfig(
embedding_model="text-embedding-3-large",
similarity_top_k=10
)
Save Indexed Data
kb.save("./kb_backup") # Save index
kb.load("./kb_backup") # Load index
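If your build lacks save/load, an equivalent fallback is to persist the embedding mapping yourself; a sketch using plain JSON (function names and file layout are hypothetical):

```python
import json
import os
import tempfile

def save_index(vectors, path):
    """Persist a {doc_id: embedding} mapping to disk as JSON."""
    with open(path, "w") as f:
        json.dump(vectors, f)

def load_index(path):
    """Load the mapping back from disk."""
    with open(path) as f:
        return json.load(f)

path = os.path.join(tempfile.gettempdir(), "kb_backup.json")
save_index({"intro.md": [0.1, 0.2]}, path)
restored = load_index(path)
```

JSON is fine for small indexes; for anything large, a vector store (see the Pinecone option above) avoids reloading everything at startup.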
Performance
| Operation | Latency |
|---|---|
| Index load | 1-2s |
| Search query | 0.1-0.5s |
| Q&A query | 0.5-2s |
| Startup | 2-5s |
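These numbers depend on hardware, index size, and model choice; to measure your own, wrap calls in a small timer (a sketch, shown here with a stand-in function rather than a real kb call):

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn(*args, **kwargs) and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Stand-in for e.g. timed(kb.search_documentation, "deployment", top_k=5)
result, elapsed = timed(sorted, [3, 1, 2])
```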
Integration Checklist
- OPENAI_API_KEY set
- dependencies installed
- ./docs directory exists
- MCP server starts (logs show KB initialized)
- Gradio UI starts (http://localhost:7860)
- Knowledge Search tab appears
- Search returns results
- Tests pass
Done! ✅
Your EcoMCP system is now fully integrated with LlamaIndex knowledge base.
Next: Try searching for "deployment" in the Knowledge Search tab!