---
title: MCP GitHub PR Opportunity Server
emoji: 🔍
colorFrom: blue
colorTo: indigo
sdk: gradio
sdk_version: 5.33.1
app_file: app.py
pinned: false
tags:
- mcp-server-track
- agent-demo-track
license: mit
---
**Video demo:** https://drive.google.com/file/d/1FaYPqtoYWmUBsjiSFexPNyPARvFsixsh/view?usp=sharing

# MCP GitHub PR Opportunity Server

A Gradio-based MCP server that searches GitHub repositories for PR opportunities (open issues labeled "good first issue" or "help wanted"). It supports search by keyword and topic, with optional GitHub token authentication.
## Features

- Search GitHub repositories by keyword and topic
- Find open issues labeled "good first issue" or "help wanted"
- GitHub authentication via personal access token
- Gradio MCP server for LLM integration
- Interactive web interface
- Compatible with n8n workflows
## Local Setup

1. **Clone the repo and install dependencies:**

   ```bash
   cd mcp_github_server
   pip install -r requirements.txt
   ```

2. **Configure a GitHub token:**
   - Set `GITHUB_TOKEN` in your environment variables, or
   - Enter the token in the web interface

3. **Run the server:**

   ```bash
   python app.py
   ```
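The token fallback described in step 2 can be sketched as follows. `resolve_token` is a hypothetical helper illustrating the lookup order (a token typed into the web interface wins, otherwise the `GITHUB_TOKEN` environment variable is used); it is not the actual code in `app.py`:

```python
import os


def resolve_token(ui_token: str = "") -> str:
    """Return the token entered in the web interface if present,
    otherwise fall back to the GITHUB_TOKEN environment variable,
    otherwise an empty string (unauthenticated requests)."""
    return ui_token.strip() or os.environ.get("GITHUB_TOKEN", "")
```

Unauthenticated GitHub API requests are heavily rate-limited, so in practice one of the two sources should always supply a token.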
## Hugging Face Spaces Deployment

1. **Create a new Space:**
   - Go to [Hugging Face Spaces](https://huggingface.co/spaces)
   - Click "Create new Space"
   - Choose "Gradio" as the SDK
   - Name your Space (e.g., "mcp-github-pr-server")

2. **Configure environment variables:**
   - In your Space settings, add `GITHUB_TOKEN` as a secret
   - Set your GitHub token as the value

3. **Deploy:**
   - Push your code to the Space repository
   - Hugging Face builds and deploys the application automatically
## Using with MCP Clients

Add this configuration to your MCP client (e.g., Claude Desktop, Cursor, or Cline):

```json
{
  "mcpServers": {
    "github-pr-finder": {
      "url": "https://akinyemiar-mcp-github-pr-server.hf.space/gradio_api/mcp/sse"
    }
  }
}
```
## Using with n8n

1. **Add an HTTP Request node:**
   - In your n8n workflow, add an "HTTP Request" node
   - Set the URL to your Hugging Face Space endpoint (e.g., `https://akinyemiar-mcp-github-pr-server.hf.space/api/predict`)
   - Set the method to POST
   - Add the header:

     ```
     Content-Type: application/json
     ```

   - Set the body to JSON:

     ```json
     {
       "data": [
         "fastapi",
         "python",
         5,
         1,
         "YOUR_GITHUB_TOKEN"
       ]
     }
     ```

2. **Process the response:**
   - The response contains `opportunities` and `total_count`
   - Use n8n's "Set" node to shape the response data
   - Add further nodes to handle the opportunities as needed
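The same request body can be built outside n8n, for example from a Python script. A minimal sketch, assuming the positional argument order (keyword, topic, per_page, page, token) shown in the example body above; `build_search_payload` is a hypothetical helper, not part of the server:

```python
import json


def build_search_payload(keyword="", topic="", per_page=5, page=1, token=""):
    """Build the Gradio-style positional payload for /api/predict.
    The argument order mirrors the example request body in this README
    and is an assumption about the Space's function signature."""
    return {"data": [keyword, topic, per_page, page, token]}


# Serialize it the same way n8n's HTTP Request node would send it.
body = json.dumps(build_search_payload("fastapi", "python", 5, 1, "YOUR_GITHUB_TOKEN"))
```

Any HTTP client can then POST `body` with a `Content-Type: application/json` header.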
## API Usage

The server provides both a web interface and an API endpoint at `/api/predict`. The API accepts POST requests with the following positional parameters:

```json
{
  "data": [
    "keyword",   // optional
    "topic",     // optional
    "per_page",  // default: 5
    "page",      // default: 1
    "token"      // GitHub token
  ]
}
```
Response:

```json
{
  "data": [
    {
      "opportunities": [
        {
          "repo_name": "tiangolo/fastapi",
          "repo_url": "https://github.com/tiangolo/fastapi",
          "issue_title": "Add more examples",
          "issue_url": "https://github.com/tiangolo/fastapi/issues/1234",
          "issue_labels": ["good first issue"],
          "issue_body": "Please add more examples to the docs..."
        }
      ],
      "total_count": 123
    }
  ]
}
```
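Unpacking that envelope in client code is a one-liner once you know the shape: the useful object sits at `response["data"][0]`. A minimal sketch; `extract_opportunities` is a hypothetical helper based on the response layout shown above:

```python
def extract_opportunities(response):
    """Unwrap the Gradio response envelope: the result object lives
    at response["data"][0] and holds the opportunity list plus the
    total number of matches GitHub reported."""
    result = response["data"][0]
    return result["opportunities"], result["total_count"]


# Example using the sample response from this README:
sample = {
    "data": [
        {
            "opportunities": [
                {
                    "repo_name": "tiangolo/fastapi",
                    "issue_url": "https://github.com/tiangolo/fastapi/issues/1234",
                    "issue_labels": ["good first issue"],
                }
            ],
            "total_count": 123,
        }
    ]
}
opportunities, total = extract_opportunities(sample)
```

Note that `total_count` counts all matches on GitHub, while `opportunities` holds only the current page (`per_page` items at most).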
## License

MIT