# 🚀 Deploy MediGuard AI to Hugging Face Spaces
This guide walks you through deploying MediGuard AI to Hugging Face Spaces using Docker.
## Prerequisites

- **Hugging Face Account** (sign up free at huggingface.co)
- **Git** installed on your machine
- **API Key**, either:
  - **Groq** (recommended): free key at console.groq.com/keys
  - **Google Gemini**: free key at aistudio.google.com
## Step 1: Create a New Space

1. Go to [huggingface.co/new-space](https://huggingface.co/new-space)
2. Fill in:
   - **Space name**: `mediguard-ai` (or your choice)
   - **License**: MIT
   - **SDK**: select **Docker**
   - **Hardware**: CPU Basic (the free tier works!)
3. Click **Create Space**
## Step 2: Clone Your Space

```bash
# Clone the empty Space
git clone https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai
cd mediguard-ai
```
## Step 3: Copy Project Files

Copy all files from this repository into your Space folder:

```bash
# Option A: if you have the RagBot repo locally
cp -r /path/to/RagBot/* .

# Option B: clone fresh
git clone https://github.com/yourusername/ragbot temp
cp -r temp/* .
rm -rf temp
```
## Step 4: Set Up the Dockerfile for Spaces

Hugging Face Spaces expects the Dockerfile in the repository root. Copy the HF-optimized Dockerfile:

```bash
# Copy the HF Spaces Dockerfile to the root
cp huggingface/Dockerfile ./Dockerfile
```

Or update your root Dockerfile to match the HF Spaces version.
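If you maintain the root Dockerfile yourself, the essentials for a Docker Space look roughly like this. This is a minimal sketch, not the project's actual Dockerfile: the base image, file paths, and launch command are assumptions, but the non-root user and port 7860 (the default port Spaces routes traffic to) are Spaces conventions.

```dockerfile
FROM python:3.11-slim

# Spaces runs containers as a non-root user; creating UID 1000 avoids
# permission problems in the working directory.
RUN useradd -m -u 1000 user
USER user
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH
WORKDIR /home/user/app

COPY --chown=user huggingface/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY --chown=user . .

# Spaces sends traffic to port 7860 unless app_port in the README says otherwise
EXPOSE 7860
CMD ["python", "huggingface/app.py"]
```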
## Step 5: Set Up the README (Important!)

The `README.md` must start with the HF Spaces metadata header. Copy the HF README:

```bash
# Back up the original README
mv README.md README_original.md

# Use the HF Spaces README
cp huggingface/README.md ./README.md
```
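For reference, the metadata header is a YAML front-matter block at the very top of `README.md`. A minimal sketch for a Docker Space might look like this (the title, emoji, and colors are placeholders you can change):

```yaml
---
title: MediGuard AI
emoji: 🩺
colorFrom: blue
colorTo: green
sdk: docker
app_port: 7860
pinned: false
---
```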
## Step 6: Add Your API Keys (Secrets)

1. Go to your Space: `https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai`
2. Click the **Settings** tab
3. Scroll to **Repository Secrets**

### Required Secrets (pick one)

| Secret | Description | Get Free Key |
|---|---|---|
| `GROQ_API_KEY` | Groq API key (recommended) | console.groq.com/keys |
| `GOOGLE_API_KEY` | Google Gemini API key | aistudio.google.com |
### Optional Secrets

| Secret | Description | Default |
|---|---|---|
| `GROQ_MODEL` | Groq model to use | `llama-3.3-70b-versatile` |
| `GEMINI_MODEL` | Gemini model to use | `gemini-2.0-flash` |
| `EMBEDDING_PROVIDER` | Embedding provider: `jina`, `google`, `huggingface` | `huggingface` |
| `JINA_API_KEY` | Jina AI API key for high-quality embeddings | - |
| `LANGFUSE_ENABLED` | Enable Langfuse tracing (`true`/`false`) | `false` |
| `LANGFUSE_PUBLIC_KEY` | Langfuse public key | - |
| `LANGFUSE_SECRET_KEY` | Langfuse secret key | - |
| `LANGFUSE_HOST` | Langfuse host URL | - |

Tip: See `huggingface/.env.huggingface` for a complete reference of all available secrets.
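Spaces exposes repository secrets to the container as environment variables, so the optional values are presumably resolved with fallbacks at startup. A minimal sketch of that pattern (the function and `DEFAULTS` mapping are illustrative, not the project's actual code):

```python
import os

# Defaults mirror the "Optional Secrets" table above
DEFAULTS = {
    "GROQ_MODEL": "llama-3.3-70b-versatile",
    "GEMINI_MODEL": "gemini-2.0-flash",
    "EMBEDDING_PROVIDER": "huggingface",
    "LANGFUSE_ENABLED": "false",
}

def resolve_settings(env=None):
    """Merge environment variables over the documented defaults."""
    env = os.environ if env is None else env
    return {key: env.get(key, default) for key, default in DEFAULTS.items()}

# With no secrets set, every value falls back to its documented default
print(resolve_settings({}))
```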
## Step 7: Push to Deploy

```bash
# Add all files
git add .

# Commit
git commit -m "Deploy MediGuard AI"

# Push to Hugging Face
git push
```
## Step 8: Monitor Deployment

1. Go to your Space: `https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai`
2. Click the **Logs** tab to watch the build
3. The first build takes roughly 5-10 minutes
4. Once the status shows **Running**, your app is live! 🎉
## 🔧 Troubleshooting

### "No LLM API key configured"

- Make sure you added `GROQ_API_KEY` or `GOOGLE_API_KEY` in Space **Settings → Secrets**
- Secret names are case-sensitive
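A startup check along these lines makes the missing-key case fail fast with an actionable message instead of a confusing crash later. This is a sketch; `check_llm_key` is an illustrative name, not the project's actual code:

```python
import os

def check_llm_key(env=None):
    """Return the configured LLM provider, or raise with a clear message."""
    env = os.environ if env is None else env
    if env.get("GROQ_API_KEY"):
        return "groq"
    if env.get("GOOGLE_API_KEY"):
        return "google"
    raise RuntimeError(
        "No LLM API key configured: set GROQ_API_KEY or GOOGLE_API_KEY "
        "in Space Settings -> Secrets (names are case-sensitive)."
    )
```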
### Build fails with "No space disk"

- The Hugging Face free tier has limited disk space
- The FAISS vector store might be too large
- Solution: upgrade to a paid tier or reduce the vector store size

### "ModuleNotFoundError"

- Check that all dependencies are listed in `huggingface/requirements.txt`
- The Dockerfile should install from this file

### App crashes on startup

- Check the **Logs** tab for the actual error
- Common issue: missing environment variables
- Increase the Space hardware if you see an OOM error
## 📁 File Structure for Deployment

Your Space should have this structure:

```
your-space/
├── Dockerfile                # HF Spaces Dockerfile (from huggingface/)
├── README.md                 # HF Spaces README with metadata
├── huggingface/
│   ├── app.py                # Standalone Gradio app
│   ├── requirements.txt      # Minimal deps for HF
│   └── README.md             # Original HF README
├── src/                      # Core application code
│   ├── workflow.py
│   ├── state.py
│   ├── llm_config.py
│   ├── pdf_processor.py
│   ├── agents/
│   └── ...
├── data/
│   └── vector_stores/
│       ├── medical_knowledge.faiss
│       └── medical_knowledge.pkl
└── config/
    └── biomarker_references.json
```
## 🔄 Updating Your Space

To update after making changes:

```bash
git add .
git commit -m "Update: description of changes"
git push
```

Hugging Face will automatically rebuild and redeploy.
## 💰 Hardware Options
| Tier | RAM | vCPU | Cost | Best For |
|---|---|---|---|---|
| CPU Basic | 2GB | 2 | Free | Demo/Testing |
| CPU Upgrade | 8GB | 4 | ~$0.03/hr | Production |
| T4 Small | 16GB | 4 | ~$0.06/hr | Heavy usage |
The free tier works for demos. Upgrade if you experience timeouts.
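To put the hourly rates in perspective, here is a quick back-of-the-envelope estimate for an always-on Space, using the approximate prices from the table above (actual billing may differ, and Spaces can sleep when idle, which lowers the cost):

```python
HOURS_PER_MONTH = 24 * 30  # ~720 hours if the Space never sleeps

def monthly_cost(hourly_rate):
    """Approximate monthly cost for an always-on Space at a given $/hr rate."""
    return round(hourly_rate * HOURS_PER_MONTH, 2)

print(monthly_cost(0.03))  # CPU Upgrade: ~$21.6/month
print(monthly_cost(0.06))  # T4 Small:   ~$43.2/month
```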
## 🎉 Your Space is Live!

Once deployed, share your Space URL:

```
https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai
```

Anyone can now use MediGuard AI without any setup!
## Quick Commands Reference

```bash
# Clone your Space
git clone https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai

# Set up the remote (if needed)
git remote add origin https://huggingface.co/spaces/YOUR_USERNAME/mediguard-ai

# Push changes
git push origin main

# Force rebuild (if stuck): go to Settings → Factory Reset
```