Deployment Guide - Hugging Face Spaces
Quick Deployment to Hugging Face
Step 1: Prepare Files
Ensure you have these files:
your-repo/
├── app.py                   # FastAPI application
├── binary_segmentation.py   # Core segmentation module
├── requirements.txt         # Python dependencies
├── Dockerfile               # Docker configuration
├── README.md                # This becomes your Space README
├── static/
│   └── index.html           # Web interface
└── .model_cache/
    └── u2netp.pth           # Model weights (IMPORTANT!)
Step 2: Download U2NETP Weights
CRITICAL: You must download the U2NETP model weights:
- Visit: https://github.com/xuebinqin/U-2-Net/tree/master/saved_models
- Download: u2netp.pth (4.7 MB) and place it at .model_cache/u2netp.pth
OR use this direct link:
mkdir -p .model_cache
wget https://github.com/xuebinqin/U-2-Net/raw/master/saved_models/u2netp/u2netp.pth -O .model_cache/u2netp.pth
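If you prefer Python, the same download can be scripted with only the standard library. This is a sketch: `ensure_weights` is an illustrative helper name, and the URL is the one from the wget command above.

```python
# Sketch: fetch u2netp.pth into .model_cache/ if it isn't already there.
# Mirrors the wget command above using only the standard library.
import urllib.request
from pathlib import Path

WEIGHTS_URL = ("https://github.com/xuebinqin/U-2-Net/raw/master/"
               "saved_models/u2netp/u2netp.pth")

def ensure_weights(cache_dir=".model_cache"):
    """Download the U2NETP weights unless they are already cached."""
    dest = Path(cache_dir) / "u2netp.pth"
    if dest.is_file():
        return dest  # already downloaded, skip the network round-trip
    dest.parent.mkdir(parents=True, exist_ok=True)
    urllib.request.urlretrieve(WEIGHTS_URL, dest)
    return dest
```

Calling `ensure_weights()` at app startup also guards against a missing-weights crash if the file was not uploaded.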
Step 3: Create Hugging Face Space
Go to https://huggingface.co/new-space and fill in:
- Space name: background-removal (or your choice)
- License: Apache 2.0
- SDK: Docker
- Hardware: CPU Basic (free tier works!)
Click "Create Space"
Step 4: Upload Files
Option A: Using Git (Recommended)
# Clone your new space
git clone https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
cd YOUR_SPACE_NAME
# Copy all files
cp /path/to/app.py .
cp /path/to/binary_segmentation.py .
cp /path/to/requirements.txt .
cp /path/to/Dockerfile .
cp /path/to/README_HF.md ./README.md
cp -r /path/to/static .
cp -r /path/to/.model_cache .
# Commit and push
git add .
git commit -m "Initial commit"
git push
Option B: Using Web Interface
- Click "Files" → "Add file"
- Upload each file individually
- Important: Upload .model_cache/u2netp.pth (~4.7 MB)
Step 5: Wait for Build
- Space will build automatically (takes 3-5 minutes)
- Watch the "Logs" tab for build progress
- Once complete, your Space will be live!
Step 6: Test Your Space
Visit your Space URL and try:
- Upload an image
- Click "Process Image"
- Download the result
Configuration Options
Use Different Models
To enable BiRefNet or RMBG models, edit requirements.txt:
# Uncomment these lines:
transformers>=4.30.0
huggingface-hub>=0.16.0
Note: These models are larger and may require upgraded hardware (GPU).
Custom Port
Default port is 7860 (Hugging Face standard). To change:
In Dockerfile:
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
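If you'd rather not hard-code the port in the Dockerfile, a small sketch can read it from an environment variable instead (`PORT` and `get_port` are assumed names, not part of the app):

```python
# Sketch: resolve the port from an environment variable, falling back
# to Hugging Face's default of 7860 on a missing or malformed value.
import os

def get_port(default=7860):
    """Return the port to bind, preferring the PORT env var."""
    try:
        return int(os.environ.get("PORT", default))
    except ValueError:
        return default  # e.g. PORT was set to a non-numeric string

# Then start the server programmatically instead of via the CMD line:
# uvicorn.run("app:app", host="0.0.0.0", port=get_port())
```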
Environment Variables
Add secrets in Space Settings:
import os
# Secrets added in Space Settings are exposed as environment variables:
API_KEY = os.environ.get("API_KEY", "default")
Hardware Requirements
CPU Basic (Free)
- ✅ U2NETP model
- ✅ Small to medium images (<5MP)
- ⏱️ ~2-5 seconds per image
CPU Upgrade
- ✅ U2NETP model
- ✅ Large images
- ⏱️ ~1-3 seconds per image
GPU T4
- ✅ All models (U2NETP, BiRefNet, RMBG)
- ✅ Any image size
- ⏱️ <1 second per image
Troubleshooting
Build Fails
Issue: "No module named 'binary_segmentation'"
- Fix: Ensure binary_segmentation.py is in the root directory
Issue: "Model weights not found"
- Fix: Upload u2netp.pth to .model_cache/u2netp.pth
Issue: "OpenCV error"
- Fix: Check that the Dockerfile installs libgl1-mesa-glx
Runtime Errors
Issue: "Out of memory"
- Fix: Upgrade to GPU hardware OR reduce image size
Issue: "Slow processing"
- Fix: Use CPU Upgrade or GPU hardware
Issue: "Model not loading"
- Fix: Check logs, ensure model file is in correct location
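For the out-of-memory and slow-processing cases, reducing the image size before upload helps. A stdlib-only sketch (`fit_within` is a hypothetical helper) computes dimensions that keep an image under the ~5 MP limit mentioned above; apply the result with your image library's resize:

```python
# Sketch: shrink (width, height) so the pixel count stays under a cap,
# preserving aspect ratio. Pass the result to e.g. Pillow's Image.resize.
def fit_within(width, height, max_pixels=5_000_000):
    """Return (w, h) scaled down so w*h <= max_pixels."""
    if width * height <= max_pixels:
        return width, height  # already small enough
    scale = (max_pixels / (width * height)) ** 0.5
    return max(1, int(width * scale)), max(1, int(height * scale))
```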
API Not Working
Issue: 404 errors
- Fix: Check that FastAPI routes are correct
- Fix: Ensure app:app in CMD matches app = FastAPI() in the code
Issue: CORS errors
- Fix: CORS is enabled by default; check browser console
File Structure Verification
Before deploying, verify:
# Check all files exist
ls -la
# Should see:
# app.py
# binary_segmentation.py
# requirements.txt
# Dockerfile
# README.md
# static/index.html
# .model_cache/u2netp.pth
# Check model file size (should be ~4.7MB)
ls -lh .model_cache/u2netp.pth
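The shell checks above can also be wrapped in a small Python script (`check_files` and `REQUIRED` are illustrative names, not part of the repo):

```python
# Sketch of a pre-deploy check: confirms the required files exist and
# the model weights are roughly the expected size (~4.7 MB).
from pathlib import Path

REQUIRED = [
    "app.py",
    "binary_segmentation.py",
    "requirements.txt",
    "Dockerfile",
    "README.md",
    "static/index.html",
    ".model_cache/u2netp.pth",
]

def check_files(root="."):
    """Return a list of missing or suspect required files under root."""
    base = Path(root)
    problems = [f for f in REQUIRED if not (base / f).is_file()]
    weights = base / ".model_cache" / "u2netp.pth"
    if weights.is_file() and weights.stat().st_size < 4_000_000:
        problems.append("u2netp.pth looks too small - re-download it")
    return problems

if __name__ == "__main__":
    issues = check_files()
    print("OK" if not issues else f"Problems: {issues}")
```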
Alternative: Deploy Without Docker
If you prefer not to use Docker, set the SDK in the YAML front matter at the top of README.md:
---
sdk: gradio
sdk_version: 4.0.0
---
Then modify the app to use Gradio instead of FastAPI. Docker remains the recommended route for FastAPI.
Post-Deployment
Monitor Usage
- Check "Analytics" tab for usage stats
- Monitor "Logs" for errors
Update Your Space
git pull
# Make changes
git add .
git commit -m "Update"
git push
Share Your Space
- Get shareable link from Space page
- Embed in website using iframe
- Use API endpoint in your apps
Example API Usage from External Apps
Once deployed, use your Space API:
import requests

# Use the direct Space host, not the huggingface.co/spaces page URL:
SPACE_URL = "https://YOUR_USERNAME-YOUR_SPACE_NAME.hf.space"

with open('image.jpg', 'rb') as f:
    response = requests.post(
        f"{SPACE_URL}/segment",
        files={'file': f},
        data={'model': 'u2netp', 'threshold': 0.5}
    )

with open('result.png', 'wb') as out:
    out.write(response.content)
Need Help?
- Hugging Face Docs: https://huggingface.co/docs/hub/spaces
- Community Forum: https://discuss.huggingface.co/
- Discord: https://discord.gg/hugging-face
Pro Tip: Start with CPU Basic (free), test your Space, then upgrade to GPU if needed!