

Quick Start - Integrated LlamaIndex with MCP & Gradio

Get up and running with the fully integrated EcoMCP system in 5 minutes.

Setup (1 minute)

# 1. Install dependencies
pip install -r requirements.txt

# 2. Set OpenAI API key
export OPENAI_API_KEY=sk-...

# Verify docs directory exists
ls -la ./docs
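
The two checks above can also be run from Python. The helper below is not part of EcoMCP, just a minimal sketch of the same sanity checks (API key present and well-formed, docs directory exists):

```python
import os
from pathlib import Path

def check_setup(docs_dir: str = "./docs") -> list:
    """Return a list of setup problems; an empty list means you are ready to run."""
    problems = []
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key.startswith("sk-"):
        problems.append("OPENAI_API_KEY is missing or does not look like an OpenAI key")
    if not Path(docs_dir).is_dir():
        problems.append("docs directory not found: %s" % docs_dir)
    return problems

if __name__ == "__main__":
    for problem in check_setup():
        print("!!", problem)
```

Run it once before starting the servers; no output means both checks passed.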

Running (2 minutes)

Terminal 1: Start MCP Server

python src/server/mcp_server.py

Expected output:

2025-11-27 ... EcoMCP Server started - listening for JSON-RPC messages
2025-11-27 ... Knowledge base initialized successfully

Terminal 2: Start Gradio UI

python src/ui/app.py

Expected output:

Running on http://0.0.0.0:7860

Testing (2 minutes)

Test 1: Gradio UI Knowledge Search

  1. Open http://localhost:7860 in browser
  2. Click "πŸ” Knowledge Search" tab
  3. Enter query: deployment guide
  4. Select search type: Documentation
  5. Click "πŸ” Search"
  6. See results with similarity scores

Test 2: MCP Server Tools (via Python)

import asyncio
from src.server.mcp_server import EcoMCPServer

async def test():
    server = EcoMCPServer()
    
    # Test knowledge_search
    result = await server.call_tool("knowledge_search", {
        "query": "product features",
        "search_type": "all",
        "top_k": 5
    })
    print(result)
    
    # Test product_query
    result = await server.call_tool("product_query", {
        "question": "What is the main feature?"
    })
    print(result)

asyncio.run(test())

Features Available

In Gradio UI (6 tabs)

  1. πŸ“¦ Analyze Product - Product analysis
  2. ⭐ Analyze Reviews - Review sentiment
  3. ✍️ Generate Listing - Product copy
  4. πŸ’° Price Recommendation - Pricing strategy
  5. πŸ” Knowledge Search ← NEW (LlamaIndex)
  6. ℹ️ About - Platform information

In MCP Server (7 tools)

  1. analyze_product - Product analysis
  2. analyze_reviews - Review analysis
  3. generate_listing - Copy generation
  4. price_recommendation - Pricing
  5. competitor_analysis - Competition
  6. knowledge_search ← NEW (LlamaIndex)
  7. product_query ← NEW (LlamaIndex)
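
Each of these tools can also be invoked over stdin as raw JSON-RPC. The helper below builds a `tools/call` request in the shape the MCP specification defines; whether this particular server accepts every field exactly as shown is an assumption, so treat it as a sketch:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request line for an MCP `tools/call` invocation."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Same knowledge_search call as Test 2 above, as a single request line
request = make_tool_call(1, "knowledge_search", {"query": "product features", "top_k": 5})
```

You can pipe the resulting line into the server the same way the `tools/list` example in the Support section does.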

Common Tasks

Search Products

results = kb.search_products("wireless headphones", top_k=5)

Search Documentation

results = kb.search_documentation("deployment", top_k=5)

Ask a Question

answer = kb.query("How to deploy this platform?")

Get Recommendations

recs = kb.get_recommendations("gaming laptop", limit=5)
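
Under the hood, all of these calls reduce to ranking embedding vectors by cosine similarity. The toy example below illustrates that ranking step with hand-made two-dimensional vectors; it is not the EcoMCP implementation, which embeds text with an OpenAI model first:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length vectors; 0.0 if either is a zero vector."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def rank(query_vec, docs, top_k=5):
    """Return the top_k (name, score) pairs, best match first."""
    scored = [(name, cosine_similarity(query_vec, vec)) for name, vec in docs]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:top_k]

docs = [("headphones", [1.0, 0.1]), ("laptop", [0.1, 1.0])]
rank([0.9, 0.2], docs, top_k=1)  # "headphones" ranks first
```

The similarity scores shown in the Knowledge Search tab are values from this same 0-to-1 scale.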

File Structure

ecomcp/
β”œβ”€β”€ src/
β”‚   β”œβ”€β”€ server/
β”‚   β”‚   └── mcp_server.py        ← MCP with KB integration
β”‚   β”œβ”€β”€ ui/
β”‚   β”‚   └── app.py               ← Gradio with Knowledge tab
β”‚   └── core/
β”‚       β”œβ”€β”€ knowledge_base.py     ← KB implementation
β”‚       β”œβ”€β”€ document_loader.py    ← Document loading
β”‚       β”œβ”€β”€ vector_search.py      ← Search algorithms
β”‚       └── llama_integration.py  ← Integration wrapper
β”œβ”€β”€ docs/
β”‚   β”œβ”€β”€ INTEGRATION_GUIDE.md      ← Full integration guide
β”‚   β”œβ”€β”€ INTEGRATION_SUMMARY.md    ← Changes summary
β”‚   β”œβ”€β”€ LLAMA_FRAMEWORK_REFINED.md ← KB framework details
β”‚   └── *.md                      ← Indexed documentation
└── requirements.txt

Configuration

Knowledge Base

# In src/server/mcp_server.py
docs_path = "./docs"              # Documentation directory
top_k = 5                         # Default results
embedding_model = "text-embedding-3-small"
llm_model = "gpt-5"

UI Search

# In src/ui/app.py
search_results = 5                # Results per search
kb.initialize("./docs")           # Index documents
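
For reference, the `IndexConfig` fields used throughout this guide can be modeled as a small dataclass. The stand-in below mirrors only the fields that appear in these snippets; the real definition lives in `src/core`:

```python
from dataclasses import dataclass

@dataclass
class IndexConfig:
    """Stand-in mirroring the config fields used in this guide's snippets."""
    embedding_model: str = "text-embedding-3-small"
    similarity_top_k: int = 5
    use_pinecone: bool = False

# Example: a config tuned for faster searches
fast = IndexConfig(similarity_top_k=3)
```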

Troubleshooting

"Knowledge base not initialized"

  • Verify ./docs directory exists
  • Check server logs for initialization errors
  • Ensure LlamaIndex is installed: pip list | grep llama

"No results found"

  • Try a simpler search query
  • Check that documents were indexed
  • Verify OPENAI_API_KEY is set

Search is slow

  • Reduce top_k parameter
  • Use smaller embedding model
  • Check disk I/O performance

Knowledge tab not appearing

  • Verify LlamaIndex is installed
  • Check the UI console for errors
  • Restart the Gradio UI
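
The "is it installed" checks above can be scripted. This helper (an illustration, not part of EcoMCP) reports whether a package is importable without actually importing it:

```python
import importlib.util

def package_available(name: str) -> bool:
    """True if the named top-level package can be imported in this environment."""
    return importlib.util.find_spec(name) is not None

# The Knowledge Search tab depends on llama-index being importable
print("llama-index available:", package_available("llama_index"))
```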

Next Steps

  1. Index Product Data

    products = [{"name": "...", "description": "..."}]
    kb.add_products(products)
    
  2. Deploy to Production

    # Using Modal
    modal deploy src/server/mcp_server.py
    
    # Using Docker
    docker build -t ecomcp .
    docker run -e OPENAI_API_KEY=... ecomcp
    
  3. Scale Knowledge Base

    config = IndexConfig(use_pinecone=True)
    kb = EcoMCPKnowledgeBase(config=config)
    
  4. Add Analytics

    • Track search queries
    • Monitor result quality
    • Measure latency

Documentation

  • Full Integration Guide: docs/INTEGRATION_GUIDE.md
  • Framework Details: docs/LLAMA_FRAMEWORK_REFINED.md
  • KB Implementation: src/core/examples.py
  • MCP Specification: src/server/mcp_server.py

Support

Check Logs

# Server logs
grep "Knowledge base" logs/*.log

# UI logs (browser console)
F12 β†’ Console tab

Test API

# Test MCP server
echo '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}' | python src/server/mcp_server.py

Verify Installation

python -c "from src.core import EcoMCPKnowledgeBase; print('βœ“ LlamaIndex installed')"

Tips & Tricks

Faster Searches

# Use smaller model
config = IndexConfig(
    embedding_model="text-embedding-3-small",
    similarity_top_k=3
)

Better Results

# Use larger model
config = IndexConfig(
    embedding_model="text-embedding-3-large",
    similarity_top_k=10
)

Save Indexed Data

kb.save("./kb_backup")          # Save index
kb.load("./kb_backup")          # Load index

Performance

| Operation    | Typical latency |
|--------------|-----------------|
| Index load   | 1-2 s           |
| Search query | 0.1-0.5 s       |
| Q&A query    | 0.5-2 s         |
| Startup      | 2-5 s           |
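
To verify these numbers on your own hardware, a small timing helper is enough. This is a generic sketch; wrap any of the `kb` calls from Common Tasks in a lambda and pass it in:

```python
import time

def time_call(fn, repeats: int = 3) -> float:
    """Best-of-N wall-clock latency of calling fn(), in seconds."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

# Example with a cheap stand-in workload; substitute a real search call, e.g.
# time_call(lambda: kb.search_documentation("deployment"))
latency = time_call(lambda: sum(range(1000)))
```

Best-of-N is used rather than a single run so that one-off warmup costs (index load, connection setup) do not skew the measurement.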

Integration Checklist

  • OPENAI_API_KEY set
  • Dependencies installed
  • ./docs directory exists
  • MCP server starts (logs show KB initialized)
  • Gradio UI starts (http://localhost:7860)
  • Knowledge Search tab appears
  • Search returns results
  • Tests pass

Done! βœ…

Your EcoMCP system is now fully integrated with the LlamaIndex knowledge base.

Next: Try searching for "deployment" in the Knowledge Search tab!