Humanlearning committed
Commit 2fb7e10 · 1 Parent(s): f3ab124

Merge remote changes

Files changed (2):
  1. .gitignore +5 -1
  2. README.md +0 -158
.gitignore CHANGED
@@ -1,5 +1,9 @@
+ <<<<<<< HEAD
  .env.*
  .venv/
  __pycache__/
  *.pyc
- .DS_Store
+ .DS_Store
+ =======
+ .env.*
+ >>>>>>> 66c1ee8c459604895e42be1a97a63f258b3cfabe
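Note that the merged `.gitignore` above was committed with its conflict markers (`<<<<<<<`, `=======`, `>>>>>>>`) still unresolved. A minimal sketch of catching this before committing — the sample file path is hypothetical, not part of this repo:

```shell
# Write a sample file containing the committed conflict markers,
# then scan it the way one might scan a working tree before committing.
printf '<<<<<<< HEAD\n.env.*\n=======\n.env.*\n>>>>>>> 66c1ee8\n' > /tmp/sample_gitignore
if grep -qE '^(<{7} |={7}$|>{7} )' /tmp/sample_gitignore; then
  echo "unresolved conflict markers found"
fi
```

`git diff --check` reports leftover conflict markers as well; resolving them (edit the file, then `git add`) before committing avoids shipping them.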
README.md CHANGED
@@ -1,158 +0,0 @@
- ---
- title: CredentialWatch MCP Server
- emoji: 🩺
- colorFrom: blue
- colorTo: purple
- sdk: gradio
- python_version: 3.10
- sdk_version: 6.0.0
- app_file: src/credentialwatch_agent/main.py
- short_description: "MCP-enabled Gradio Space for credential monitoring."
- tags:
- - mcp
- - gradio
- - healthcare
- - tools
- pinned: false
- ---
-
- # CredentialWatch MCP Server
-
- Agent-ready Gradio Space that exposes healthcare credential tools (lookups, expiry checks, risk scoring) over **Model Context Protocol (MCP)**.
-
- ## Hugging Face Space
-
- - **SDK**: Gradio
- - **Entry file**: `src/credentialwatch_agent/main.py`
- - **Python**: 3.10+
-
- Deploy this repo as a Gradio Space and it will automatically serve both the web UI and an MCP server.
-
- ## MCP Server
-
- This Space exposes its tools via Model Context Protocol (MCP) using Gradio.
-
- ### How MCP is enabled
- In `src/credentialwatch_agent/main.py` we:
- 1. Install Gradio with MCP support: `pip install "gradio[mcp]"`
- 2. Define typed Python functions with docstrings
- 3. Launch the app with MCP support:
- ```python
- demo.launch(mcp_server=True)
- ```
-
- Alternatively, you can set the environment variable:
- ```bash
- export GRADIO_MCP_SERVER=True
- ```
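Step 2 above matters because Gradio derives each MCP tool's schema from the function's type hints and docstring. A sketch of such a function (the name and behavior are illustrative, not taken from this repo):

```python
from datetime import date

def days_until_expiry(expiry: str) -> int:
    """Return the number of days until a credential expires.

    Args:
        expiry: Expiration date in ISO format (YYYY-MM-DD).
    """
    return (date.fromisoformat(expiry) - date.today()).days
```

When the app is launched with `mcp_server=True`, functions wired into the UI like this are exposed as MCP tools, with the parameter schema built from the annotations and the description taken from the docstring.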
-
- ### MCP endpoints
- When the Space is running, Gradio exposes:
- - **MCP SSE endpoint**: `https://<space-host>/gradio_api/mcp/sse`
- - **MCP schema**: `https://<space-host>/gradio_api/mcp/schema`
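Both URLs follow from the Space host by fixed paths; a small helper (hypothetical, not part of the repo) makes the derivation explicit:

```python
def mcp_endpoints(space_host: str) -> dict:
    """Build the MCP endpoint URLs Gradio exposes for a running Space."""
    base = space_host.rstrip("/")
    return {
        "sse": f"{base}/gradio_api/mcp/sse",
        "schema": f"{base}/gradio_api/mcp/schema",
    }
```

Fetching the `schema` URL in a browser is a quick way to confirm which tools the Space exports.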
-
- ## Using this Space from an MCP client
-
- ### Easiest: Hugging Face MCP Server (no manual config)
- 1. Go to your HF **MCP settings**: https://huggingface.co/settings/mcp
- 2. Add this Space under **Spaces Tools** (look for the MCP badge on the Space).
- 3. Restart your MCP client (VS Code, Cursor, Claude Code, etc.).
- 4. The tools from this Space will appear as MCP tools and can be called directly.
-
- ### Manual config (generic MCP client using mcp-remote)
- If your MCP client uses a JSON config, you can point it to the SSE endpoint via `mcp-remote`:
-
- ```jsonc
- {
-   "mcpServers": {
-     "credentialwatch": {
-       "command": "npx",
-       "args": [
-         "mcp-remote",
-         "https://<space-host>/gradio_api/mcp/sse"
-       ]
-     }
-   }
- }
- ```
- Replace `<space-host>` with the full URL of your Space.
-
- ## Local development
-
- ```bash
- # 1. Install deps
- uv sync
-
- # 2. Run locally
- uv run python src/credentialwatch_agent/main.py
- # or
- GRADIO_MCP_SERVER=True uv run python src/credentialwatch_agent/main.py
- ```
-
- The local server will be available at `http://127.0.0.1:7860`, and MCP at `http://127.0.0.1:7860/gradio_api/mcp/sse`.
-
- ## Deploying to Hugging Face Spaces
-
- 1. Create a new Space with SDK = **Gradio**.
- 2. Push this repo to the Space (Git or `huggingface_hub`).
- 3. Ensure the YAML header in `README.md` is present and correct.
- 4. Wait for the Space to build and start; it should show an **MCP badge** automatically.
-
- ## Troubleshooting
-
- - **README.md location**: Must be at the repo root and named `README.md` (all caps).
- - **YAML Header**: Must be the very first thing in the file, delimited by `---`.
- - **Configuration Error**: Check `sdk`, `app_file`, and `python_version` in the YAML.
- - **MCP Badge Missing**: Ensure `demo.launch(mcp_server=True)` is called or `GRADIO_MCP_SERVER=True` is set, and the Space is public.
-
- ## Features
-
- - **Interactive Query Agent**: Ask natural language questions about provider credentials.
- - **Expiry Sweep Agent**: Automated workflow to check for expiring credentials and generate alerts.
- - **MCP Integration**: Connects to NPI Registry, Credential Database, and Alerting systems via MCP.
- - **Gradio UI**: User-friendly interface for interaction.
-
- ## Prerequisites
-
- - Python 3.10 or higher (matching the `python_version` in the YAML header)
- - [uv](https://github.com/astral-sh/uv) (recommended)
-
- ## Installation
-
- 1. **Clone the repository:**
-    ```bash
-    git clone <repository_url>
-    cd credential_watch
-    ```
-
- 2. **Install dependencies:**
-    ```bash
-    uv sync
-    ```
-
- ## Configuration
-
- Create a `.env` file in the root directory (or copy `.env.local`) and configure your environment variables:
-
- ```env
- OPENAI_API_KEY=your_openai_api_key
- # MCP Server URLs (defaults shown)
- NPI_MCP_URL=http://localhost:8001/sse
- CRED_DB_MCP_URL=http://localhost:8002/sse
- ALERT_MCP_URL=http://localhost:8003/sse
- ```
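One way the application might consume these variables, falling back to the documented defaults when a variable is unset — the fallback pattern is an assumption, not code from `mcp_client.py`:

```python
import os

# Read MCP server URLs from the environment, falling back to the
# local defaults documented in the .env example.
NPI_MCP_URL = os.getenv("NPI_MCP_URL", "http://localhost:8001/sse")
CRED_DB_MCP_URL = os.getenv("CRED_DB_MCP_URL", "http://localhost:8002/sse")
ALERT_MCP_URL = os.getenv("ALERT_MCP_URL", "http://localhost:8003/sse")
```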
-
- ## Architecture
-
- - **`src/credentialwatch_agent/agents/`**: Contains LangGraph workflow definitions.
- - **`src/credentialwatch_agent/mcp_client.py`**: Handles connections to MCP servers.
- - **`src/credentialwatch_agent/main.py`**: Entry point and Gradio UI.
-
- ## MCP Servers
-
- The agent expects the following MCP servers to be running:
- 1. **NPI Server** (Port 8001)
- 2. **Credential DB Server** (Port 8002)
- 3. **Alert Server** (Port 8003)
-
- If these servers are not reachable, the client will fall back to using mock data for demonstration purposes.
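The fallback described above could be driven by a simple TCP probe; this sketch (the function name and timeout are assumptions, not repo code) checks whether a server is reachable before deciding to use mock data:

```python
import socket

def server_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ports from the list above; fall back to mocks if any server is down.
use_mock = not all(
    server_reachable("localhost", port) for port in (8001, 8002, 8003)
)
```

`socket.create_connection` raises `OSError` (including timeouts) when the server is down, which maps cleanly onto the mock-data fallback.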