# MCP Integration for Wrdler
## Overview
Wrdler exposes AI word generation functionality as an **MCP (Model Context Protocol) tool** when running locally. This allows AI assistants and other MCP clients to generate vocabulary words for custom topics.
## What is MCP?
The Model Context Protocol (MCP) is a standard for integrating AI assistants with external tools and data sources. Gradio 5.0+ has built-in MCP server support, making it easy to expose functions as MCP tools.
**Reference:** [Building MCP Server with Gradio](https://www.gradio.app/guides/building-mcp-server-with-gradio)
## Available MCP Tools
### `generate_ai_words`
Generate 75 AI-selected words (25 each of lengths 4, 5, 6) related to a specific topic.
**Availability:** Only when running locally with `USE_HF_WORDS=false`
#### Input Parameters
```json
{
  "topic": "Ocean Life",          // Required: theme for word generation
  "model_name": null,             // Optional: override the default AI model
  "seed": null,                   // Optional: random seed for reproducibility
  "use_dictionary_filter": true,  // Optional: filter against dictionary (legacy parameter)
  "selected_file": null           // Optional: word-list file for dictionary context
}
```
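For illustration, the defaults above can be captured in a small helper that builds a tool-call payload. This is a sketch for client code; `make_request` is not part of the project's API, only the field names and defaults come from the schema above.

```python
# Hypothetical helper: build a generate_ai_words payload using the
# documented defaults. Only "topic" is required.
def make_request(topic, model_name=None, seed=None,
                 use_dictionary_filter=True, selected_file=None):
    return {
        "topic": topic,
        "model_name": model_name,
        "seed": seed,
        "use_dictionary_filter": use_dictionary_filter,
        "selected_file": selected_file,
    }
```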
#### Output Format
```json
{
  "words": [
    "WAVE", "TIDE", "FISH", ...   // 75 words total (25 × 4-letter, 25 × 5-letter, 25 × 6-letter)
  ],
  "difficulties": {
    "WAVE": 0.45,                 // Difficulty score for each word
    "TIDE": 0.32,
    ...
  },
  "metadata": {
    "model_used": "microsoft/Phi-3-mini-4k-instruct",
    "transformers_available": "True",
    "gradio_client_available": "True",
    "use_hf_words": "False",
    "raw_output_length": "2048",
    "raw_output_snippet": "...",
    "ai_initial_count": "75",
    "topic": "Ocean Life",
    "dictionary_filter": "True",
    "new_words_saved": "15"
  }
}
```
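Client code can sanity-check a response against the shape documented above. The following is a sketch (assuming the documented contract: 75 uppercase A-Z words, 25 each of lengths 4, 5, and 6, with difficulties keyed by word); `looks_valid` is not part of the project.

```python
from collections import Counter

# Sketch of a client-side check for the documented response shape.
# Counter equality implies 75 words total with 25 per length.
def looks_valid(resp: dict) -> bool:
    words = resp.get("words", [])
    counts = Counter(len(w) for w in words)
    return (
        counts == Counter({4: 25, 5: 25, 6: 25})
        and all(w.isalpha() and w.isupper() for w in words)
        and set(resp.get("difficulties", {})) <= set(words)
    )
```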
## Setup
### 1. Environment Configuration
Set the `USE_HF_WORDS` environment variable to enable local mode:
```bash
# Linux/Mac
export USE_HF_WORDS=false
# Windows (PowerShell)
$env:USE_HF_WORDS="false"
# .env file
USE_HF_WORDS=false
```
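A minimal sketch of how local mode could be detected from this variable (an assumption about the app's logic, not its actual code): anything other than `"true"`, case-insensitively, counts as local mode.

```python
import os

# Assumed detection logic: local mode unless USE_HF_WORDS is "true"
# (case-insensitive, surrounding whitespace ignored).
def is_local_mode() -> bool:
    return os.getenv("USE_HF_WORDS", "false").strip().lower() != "true"
```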
### 2. Run Gradio App
```bash
python gradio_app.py
```
You should see:
```
===========================================================================
🚀 MCP SERVER ENABLED (Local Mode)
===========================================================================
MCP tools available:
- generate_ai_words: Generate AI vocabulary words for topics
To use MCP tools, connect your MCP client to this Gradio app.
See: https://www.gradio.app/guides/building-mcp-server-with-gradio
===========================================================================
```
### 3. Connect MCP Client
Configure your MCP client to point at the Gradio app's MCP endpoint. Per the Gradio guide linked above, the server is exposed at `/gradio_api/mcp/sse`:
```json
{
  "mcpServers": {
    "wrdler": {
      "url": "http://localhost:7860/gradio_api/mcp/sse"
    }
  }
}
```
## Usage Examples
### Example 1: Basic Topic Generation
**Input:**
```json
{
  "topic": "Space Exploration"
}
```
**Output:**
```json
{
  "words": [
    "STAR", "MARS", "MOON", "SHIP", "ORBIT", "COMET", ...
  ],
  "difficulties": {
    "STAR": 0.25,
    "MARS": 0.30,
    ...
  },
  "metadata": {
    "model_used": "microsoft/Phi-3-mini-4k-instruct",
    "topic": "Space Exploration",
    "ai_initial_count": "75",
    ...
  }
}
```
### Example 2: With Custom Model and Seed
**Input:**
```json
{
  "topic": "Medieval History",
  "model_name": "meta-llama/Llama-3.1-8B-Instruct",
  "seed": 42
}
```
### Example 3: Using MCP via Claude Desktop
If you have Claude Desktop configured with MCP:
1. Add Wrdler to your MCP configuration
2. In Claude, use natural language:
```
Can you generate vocabulary words about Ancient Rome using the generate_ai_words tool?
```
Claude will automatically call the MCP tool and return the results.
## Technical Details
### Implementation
The MCP integration is implemented in `wrdler/word_loader_ai.py` using Gradio's `@gr.mcp_server_function` decorator:
```python
@gr.mcp_server_function(
    name="generate_ai_words",
    description="Generate 75 AI-selected words...",
    input_schema={...},
    output_schema={...},
)
def mcp_generate_ai_words(...) -> dict:
    # Wrapper for generate_ai_words()
    ...
```
The Gradio app (`gradio_app.py`) enables the MCP server by setting `mcp_server=True` in the launch configuration:
```python
demo.launch(
    server_name="0.0.0.0",
    server_port=7860,
    mcp_server=True,  # Enable MCP server
    ...
)
```
### Conditional Registration
The MCP function is **only registered when**:
- ✅ Gradio is available
- ✅ `USE_HF_WORDS=false` (local mode)
When deployed to Hugging Face Spaces (`USE_HF_WORDS=true`), the MCP function is **not registered** to avoid conflicts with the remote API.
### Word Generation Pipeline
1. **AI Generation**: Use local transformers models or HF Space API
2. **Validation**: Filter words to lengths 4, 5, 6 (uppercase A-Z only)
3. **Distribution**: Ensure 25 words per length
4. **Difficulty Scoring**: Calculate word difficulty metrics
5. **File Saving**: Save new words to topic-based files
6. **Return**: Provide words, difficulties, and metadata
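Steps 2 and 3 of the pipeline can be sketched as follows. This is assumed logic for illustration, not the project's actual implementation; `validate_and_distribute` is a hypothetical name.

```python
import re

# Step 2: keep candidates that are uppercase A-Z words of length 4-6
# (deduplicated). Step 3: cap each length bucket at 25 entries.
WORD_RE = re.compile(r"[A-Z]{4,6}")

def validate_and_distribute(candidates, per_length=25):
    buckets = {4: [], 5: [], 6: []}
    for word in candidates:
        w = word.strip().upper()
        if WORD_RE.fullmatch(w) and w not in buckets[len(w)]:
            buckets[len(w)].append(w)
    return {n: bucket[:per_length] for n, bucket in buckets.items()}
```

A real implementation would also need to top up under-filled buckets (e.g. by re-prompting the model) to reach exactly 25 words per length.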
## Troubleshooting
### MCP Function Not Appearing
**Check 1: Environment Variable**
```bash
echo $USE_HF_WORDS # Should be "false"
```
**Check 2: Gradio Logs**
```
✅ word_loader_ai module loaded (MCP functions may be registered)
✅ MCP server function 'generate_ai_words' registered (local mode)
```
**Check 3: Gradio Version**
```bash
pip show gradio # Should be >= 5.0.0
```
### Model Loading Issues
If you see warnings about model loading:
```
⚠️ Transformers not available; falling back to dictionary words.
```
Install transformers:
```bash
pip install transformers torch
```
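To check availability programmatically before launching, the same condition can be tested without importing the (heavy) package. This is a general-purpose check, not project code:

```python
import importlib.util

# Report whether the `transformers` package is importable in the current
# environment, without actually importing it.
def transformers_available() -> bool:
    return importlib.util.find_spec("transformers") is not None
```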
### Port Conflicts
If port 7860 is in use, modify `gradio_app.py`:
```python
launch_kwargs = {
    "server_port": 7861,  # Change port
    ...
}
```
## Remote vs Local Mode
| Feature | Local Mode (`USE_HF_WORDS=false`) | Remote Mode (`USE_HF_WORDS=true`) |
|---------|-----------------------------------|-----------------------------------|
| MCP Server | ✅ Enabled | ❌ Disabled |
| AI Models | Local transformers | HF Space API |
| Word Saving | ✅ Saves to files | ✅ Saves to files |
| Best For | Development, MCP clients | Production deployment |
## Security Notes
- MCP tools run with **full local file system access**
- Only enable MCP server in **trusted environments**
- Generated words are saved to `wrdler/words/` directory
- Model weights are cached in `TMPDIR/hf-cache/`
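One plausible reading of the cache location above, shown as a sketch (assumption: when `TMPDIR` is unset, fall back to the system temp directory; the project may resolve this differently):

```python
import os
import tempfile

# Resolve the model-weight cache directory: $TMPDIR/hf-cache, falling
# back to the system temp directory when TMPDIR is unset.
def hf_cache_dir() -> str:
    base = os.environ.get("TMPDIR", tempfile.gettempdir())
    return os.path.join(base, "hf-cache")
```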
## Further Reading
- [Gradio MCP Guide](https://www.gradio.app/guides/building-mcp-server-with-gradio)
- [MCP Specification](https://modelcontextprotocol.io/)
- [Wrdler Requirements](../specs/requirements.md)
---
**Last Updated:** 2025-11-30
**Version:** 0.1.5