# 🚀 Deploy MediGuard AI to Hugging Face Spaces

This guide walks you through deploying MediGuard AI to Hugging Face Spaces using Docker.

## Prerequisites

1. **Hugging Face Account** — [Sign up free](https://huggingface.co/join)
2. **Git** — Installed on your machine
3. **API Key** — Either:
   - **Groq** (recommended) — [Get free key](https://console.groq.com/keys)
   - **Google Gemini** — [Get free key](https://aistudio.google.com/app/apikey)

## Step 1: Create a New Space

1. Go to [huggingface.co/new-space](https://huggingface.co/new-space)
2. Fill in:
   - **Space name**: `mediguard-ai` (or your choice)
   - **License**: MIT
   - **SDK**: Select **Docker**
   - **Hardware**: **CPU Basic** (free tier works!)
3. Click **Create Space**

## Step 2: Clone Your Space

```bash
# Clone the empty space
git clone https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai
cd mediguard-ai
```

## Step 3: Copy Project Files

Copy all files from this repository to your space folder:

```bash
# Option A: If you have the RagBot repo locally
cp -r /path/to/RagBot/* .

# Option B: Clone fresh
git clone https://github.com/yourusername/ragbot temp
cp -r temp/* .
rm -rf temp
```

## Step 4: Set Up Dockerfile for Spaces

Hugging Face Spaces expects the Dockerfile in the root. Copy the HF-optimized Dockerfile:

```bash
# Copy the HF Spaces Dockerfile to root
cp huggingface/Dockerfile ./Dockerfile
```

**Or** update your root `Dockerfile` to match the HF Spaces version.
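
If you maintain the root `Dockerfile` yourself, a minimal Spaces-compatible version looks roughly like the sketch below. The paths and start command are assumptions based on the file layout in this guide; treat `huggingface/Dockerfile` as the source of truth:

```dockerfile
# Illustrative sketch of an HF-Spaces-friendly Dockerfile (not the shipped one)
FROM python:3.11-slim

# Spaces runs containers as a non-root user; UID 1000 is the documented convention
RUN useradd -m -u 1000 user
USER user
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH

WORKDIR /app
COPY --chown=user huggingface/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY --chown=user . .

# The app must listen on the port declared in the README metadata (7860 by default)
EXPOSE 7860
CMD ["python", "huggingface/app.py"]
```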

## Step 5: Set Up README (Important!)

The `README.md` must begin with the HF Spaces YAML metadata header. Copy the HF README:

```bash
# Backup original README
mv README.md README_original.md

# Use HF Spaces README
cp huggingface/README.md ./README.md
```
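
For reference, the metadata header is a YAML block at the very top of the README. The values below are illustrative; `sdk: docker` and an `app_port` matching the port your container exposes are the parts Spaces relies on:

```yaml
---
title: MediGuard AI
emoji: 🚀
colorFrom: blue
colorTo: green
sdk: docker
app_port: 7860
pinned: false
license: mit
---
```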

## Step 6: Add Your API Keys (Secrets)

1. Go to your Space: `https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai`
2. Click **Settings** tab
3. Scroll to **Repository Secrets**

### Required Secrets (pick one)

| Secret | Description | Get Free Key |
|--------|-------------|--------------|
| `GROQ_API_KEY` | Groq API key (recommended) | [console.groq.com/keys](https://console.groq.com/keys) |
| `GOOGLE_API_KEY` | Google Gemini API key | [aistudio.google.com](https://aistudio.google.com/app/apikey) |

### Optional Secrets

| Secret | Description | Default |
|--------|-------------|---------|
| `GROQ_MODEL` | Groq model to use | `llama-3.3-70b-versatile` |
| `GEMINI_MODEL` | Gemini model to use | `gemini-2.0-flash` |
| `EMBEDDING_PROVIDER` | Embedding provider: `jina`, `google`, `huggingface` | `huggingface` |
| `JINA_API_KEY` | Jina AI API key for high-quality embeddings | - |
| `LANGFUSE_ENABLED` | Enable Langfuse tracing (`true`/`false`) | `false` |
| `LANGFUSE_PUBLIC_KEY` | Langfuse public key | - |
| `LANGFUSE_SECRET_KEY` | Langfuse secret key | - |
| `LANGFUSE_HOST` | Langfuse host URL | - |

> **Tip**: See `huggingface/.env.huggingface` for a complete reference of all available secrets.
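
Inside the running container these secrets surface as ordinary environment variables. In shell terms, the defaults in the table behave like the parameter expansions below (a sketch of the behavior, not the app's actual startup code):

```bash
# Sketch: fall back to the documented defaults when a secret is unset.
: "${GROQ_MODEL:=llama-3.3-70b-versatile}"
: "${GEMINI_MODEL:=gemini-2.0-flash}"
: "${EMBEDDING_PROVIDER:=huggingface}"
: "${LANGFUSE_ENABLED:=false}"

echo "embeddings=$EMBEDDING_PROVIDER tracing=$LANGFUSE_ENABLED"
```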

## Step 7: Push to Deploy

```bash
# Add all files
git add .

# Commit
git commit -m "Deploy MediGuard AI"

# Push to Hugging Face
git push
```

## Step 8: Monitor Deployment

1. Go to your Space: `https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai`
2. Click the **Logs** tab to watch the build
3. Build takes ~5-10 minutes (first time)
4. Once the status shows **Running**, your app is live! 🎉

## 🔧 Troubleshooting

### "No LLM API key configured"

- Make sure you added `GROQ_API_KEY` or `GOOGLE_API_KEY` in Space Settings → Secrets
- Secret names are case-sensitive
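
You can reproduce the selection logic locally with a small shell check. `pick_llm_provider` is a hypothetical helper for illustration, mirroring the "Groq recommended" ordering above; it is not a function in the codebase:

```bash
# Hypothetical helper: which provider would be chosen, given the current env?
# Groq takes precedence when both keys are set, matching the recommendation above.
pick_llm_provider() {
  if [ -n "${GROQ_API_KEY:-}" ]; then
    echo "groq"
  elif [ -n "${GOOGLE_API_KEY:-}" ]; then
    echo "gemini"
  else
    echo "none"
  fi
}
```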

### Build fails with "no space left on device"

- Hugging Face free tier has limited disk space
- The FAISS vector store might be too large
- Solution: Upgrade to a paid tier or reduce vector store size

### "ModuleNotFoundError"

- Check that all dependencies are in `huggingface/requirements.txt`
- The Dockerfile should install from this file

### App crashes on startup

- Check the **Logs** tab for the actual error
- Common issue: missing environment variables
- Upgrade the Space hardware if you hit an out-of-memory (OOM) error

## πŸ“ File Structure for Deployment

Your Space should have this structure:

```
your-space/
├── Dockerfile              # HF Spaces Dockerfile (from huggingface/)
├── README.md               # HF Spaces README with metadata
├── huggingface/
│   ├── app.py              # Standalone Gradio app
│   ├── requirements.txt    # Minimal deps for HF
│   └── README.md           # Original HF README
├── src/                    # Core application code
│   ├── workflow.py
│   ├── state.py
│   ├── llm_config.py
│   ├── pdf_processor.py
│   ├── agents/
│   └── ...
├── data/
│   └── vector_stores/
│       ├── medical_knowledge.faiss
│       └── medical_knowledge.pkl
└── config/
    └── biomarker_references.json
```
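
Before pushing, a quick sanity check that the deployment-critical files from this tree are present can save a failed build. `check_layout` is a throwaway helper written for this guide, not part of the project:

```bash
# Throwaway helper: verify key files from the tree above exist under a directory.
check_layout() {
  dir="$1"
  missing=0
  for f in Dockerfile README.md huggingface/app.py huggingface/requirements.txt; do
    if [ ! -e "$dir/$f" ]; then
      echo "MISSING: $f"
      missing=1
    fi
  done
  [ "$missing" -eq 0 ] && echo "layout ok"
  return "$missing"
}
```

Run `check_layout .` from your Space root before `git push`.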

## 🔄 Updating Your Space

To update after making changes:

```bash
git add .
git commit -m "Update: description of changes"
git push
```

Hugging Face will automatically rebuild and redeploy.

## 💰 Hardware Options

| Tier | RAM | vCPU | Cost | Best For |
|------|-----|------|------|----------|
| CPU Basic | 2GB | 2 | Free | Demo/Testing |
| CPU Upgrade | 8GB | 4 | ~$0.03/hr | Production |
| T4 Small | 16GB | 4 | ~$0.06/hr | Heavy usage |

The free tier works for demos. Upgrade if you experience timeouts.

## 🎉 Your Space is Live!

Once deployed, share your Space URL:

```
https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai
```

Anyone can now use MediGuard AI without any setup!

---

## Quick Commands Reference

```bash
# Clone your space
git clone https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai

# Set up remote (if needed)
git remote add origin https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai

# Push changes
git push origin main

# Force rebuild (if stuck)
# Go to Settings → Factory Reset
```

## Need Help?

- [Hugging Face Spaces Docs](https://huggingface.co/docs/hub/spaces)
- [Docker on Spaces](https://huggingface.co/docs/hub/spaces-sdks-docker)
- [Spaces Secrets](https://huggingface.co/docs/hub/spaces-secrets)