---
title: MCP_Research_Server
app_file: research_server.py
sdk: gradio
sdk_version: 5.31.0
---

# 🧠 FastMCP SSE Server – Research Paper Agent

This project is a deployable **MCP-compatible remote server** built with the `FastMCP` framework. It exposes tools and resources for:

- Searching academic papers on **arXiv**
- Extracting information about saved papers
- Generating structured prompts for Claude or other LLM agents

It is designed to work with **Claude, GPT, or any MCP client** that supports the `SSE` transport.

---

## 🌐 Live Server

✅ **The MCP server is running here:**

**Tool URL (SSE):** [`https://mcp-server-vs1x.onrender.com/sse`](https://mcp-server-vs1x.onrender.com/sse)

To check that it is working, visit the link above; you should see a plain-text confirmation.

---

## 🚀 Features

- `search_papers(topic)`: Search arXiv for a topic and save the top papers
- `extract_info(paper_id)`: Retrieve details for a paper from the stored JSON
- `get_topic_papers(topic)`: Read summaries for all papers saved under a topic
- `get_available_folders()`: List all saved topic folders
- **Prompt template** for Claude to generate full topic reports

---

## 🧑‍💻 Project Structure

```bash
.
├── research_server.py   # Main FastMCP server
├── Dockerfile           # For deployment on Render
├── pyproject.toml       # Python project setup (required by uv)
├── uv.lock              # Dependency lock file (required by uv)
└── papers/              # Local storage for downloaded paper info
```

---

## 📦 Requirements

- Python 3.11+
- [uv](https://github.com/astral-sh/uv): a fast Python package manager
- [Render.com](https://render.com) (for deployment)

---

## 🛠️ Local Setup (Optional)

```bash
git clone https://github.com/YOUR_USERNAME/mcp-sse-server.git
cd mcp-sse-server

# Run with uv (you must have uv installed)
uv pip install --system .
uv run research_server.py
```

The server will run at `localhost:8001/sse`.

---

## ☁️ Deploy on Render.com (Docker)

1. Push this project to your GitHub
2. Create a new web service on Render
3. Use the following settings:
   - **Environment:** Docker
   - **Port:** 8001
   - **Start command:** (leave blank; handled in the Dockerfile)
4. Deploy 🚀

Render will give you a URL like:

```
https://your-app-name.onrender.com/sse
```

**To run locally in Docker:**

```bash
docker build -t mcp-research-server .
docker run -p 8001:8001 mcp-research-server
```

---

## 🧪 Test with MCP Inspector

Install and run:

```bash
npx @modelcontextprotocol/inspector
```

In the web UI:

- **Transport:** SSE
- **URL:** `https://mcp-server-vs1x.onrender.com/sse`

You can now call the tools and test them live using Claude or your own chatbot.

---

## 📚 Credits

Built as part of the DeepLearning.AI Claude Agent Systems course.
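The storage layout behind `extract_info` and `get_available_folders` is not reproduced above. The following is a minimal sketch, assuming each topic gets its own folder under `papers/` containing a `papers_info.json` file keyed by arXiv paper ID; the folder layout and file name are assumptions for illustration, not taken from `research_server.py`:

```python
import json
from pathlib import Path

# Assumed storage root; matches the papers/ folder in the project tree above.
PAPER_DIR = Path("papers")


def get_available_folders() -> list[str]:
    """List all saved topic folders under papers/."""
    if not PAPER_DIR.exists():
        return []
    return sorted(p.name for p in PAPER_DIR.iterdir() if p.is_dir())


def extract_info(paper_id: str) -> str:
    """Search every topic folder for the given paper ID and return its JSON record."""
    if not PAPER_DIR.exists():
        return f"No saved information found for paper {paper_id}."
    for topic_dir in PAPER_DIR.iterdir():
        info_file = topic_dir / "papers_info.json"  # assumed file name
        if not info_file.is_file():
            continue
        papers = json.loads(info_file.read_text())
        if paper_id in papers:
            return json.dumps(papers[paper_id], indent=2)
    return f"No saved information found for paper {paper_id}."
```

Because lookups scan every topic folder, `extract_info` works even when the caller does not know which topic a paper was saved under.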
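The prompt template listed under Features is likewise not shown above. A plausible sketch of such a generator is below; the function name and wording are assumptions for illustration, not the server's actual template:

```python
def generate_search_prompt(topic: str, num_papers: int = 5) -> str:
    """Build a prompt asking an LLM agent to research a topic via the MCP tools.

    The wording here is illustrative; the real template lives in research_server.py.
    """
    return (
        f"Search for {num_papers} academic papers about '{topic}' using the "
        f"search_papers tool, then call extract_info on each returned paper ID "
        f"and write a structured report covering the key findings, methods, "
        f"and open questions."
    )
```

An MCP client can surface this as a reusable prompt, so a user only supplies the topic and paper count while the tool-calling instructions stay consistent.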