Troubleshooting: Why It Works on Hugging Face Spaces But Not Locally
Common Issues and Solutions
1. Missing Dependencies ⚠️ (Most Common)
Problem: The required Python packages are not installed locally.
Solution: Install all dependencies:
cd /home/tahereh/engram/users/Tahereh/Codes/Public_Codes/Generative_Inference_Faces
pip install -r requirements.txt
Required packages:
- `torch` and `torchvision` (PyTorch)
- `gradio` (for the web interface)
- `numpy`, `pillow` (PIL), `matplotlib`
- `requests`, `tqdm`, `huggingface_hub`
2. GPU Decorator ✅ (Fixed)
Problem: The @GPU decorator from Hugging Face Spaces is not available locally.
Solution: The code now automatically handles this:
- On Hugging Face Spaces: Uses the `spaces.GPU` decorator
- Locally: Uses a no-op decorator (GPU detection is automatic via PyTorch)
Status: ✅ Fixed in the code
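The conditional-decorator pattern described above can be sketched as follows. This is a minimal illustration, not the app's exact code; `generate` is a placeholder function name:

```python
# Sketch of the conditional GPU decorator pattern.
# On Spaces, `spaces` is importable; locally it is not.
try:
    import spaces  # available only on Hugging Face Spaces
    gpu_decorator = spaces.GPU
except ImportError:
    def gpu_decorator(func):
        # Locally, fall back to a no-op decorator; PyTorch
        # selects the device automatically at runtime.
        return func

@gpu_decorator
def generate(prompt):
    # Placeholder for the app's actual inference function.
    return f"result for {prompt}"
```

Because the fallback decorator simply returns the function unchanged, the same code path runs in both environments.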
3. Port Configuration ✅ (Fixed)
Problem: Port configuration was inconsistent between local and Spaces environments.
Solution: The code now:
- Uses port 7860 by default (same as Spaces)
- Allows a custom port via the `--port` argument
- Automatically detects the Hugging Face Spaces environment
Status: ✅ Fixed in the code
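The port and environment handling above can be sketched with the standard library alone. The function name `get_config` is illustrative, not the app's actual API:

```python
import argparse
import os

def get_config(argv=None):
    """Resolve the serving port and detect the Spaces environment."""
    parser = argparse.ArgumentParser()
    # Default mirrors the port Hugging Face Spaces uses (7860);
    # Spaces may also set it via the PORT environment variable.
    parser.add_argument("--port", type=int,
                        default=int(os.environ.get("PORT", 7860)))
    args = parser.parse_args(argv)
    # SPACE_ID is set on Hugging Face Spaces but not locally.
    on_spaces = "SPACE_ID" in os.environ
    return args.port, on_spaces
```

For example, `get_config(["--port", "8080"])` returns `(8080, False)` when run outside of Spaces.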
4. Model Files Not Downloaded
Problem: Model checkpoint files may not be downloaded yet.
Solution: The code will automatically download models on first run, but you can verify:
ls models/
Expected files:
- `resnet50_robust.pt`
- `standard_resnet50.pt` (optional)
- `resnet50_robust_face_100_checkpoint.pt` (optional)
5. Missing Stimuli Images
Problem: Example images may be missing.
Solution: Verify stimuli directory exists:
ls stimuli/
All example images should be present for the demo to work fully.
6. CUDA/GPU Issues
Problem: GPU may not be available or configured correctly.
Solution: The code automatically detects available hardware:
- CUDA (NVIDIA GPUs)
- MPS (Apple Silicon)
- CPU (fallback)
Check your setup:
import torch
print("CUDA available:", torch.cuda.is_available())
print("MPS available:", torch.backends.mps.is_available())
device = ("cuda" if torch.cuda.is_available()
          else "mps" if torch.backends.mps.is_available()
          else "cpu")
print("Device:", device)
7. Python Version
Problem: Incompatible Python version.
Solution: Use Python 3.8+ (tested with 3.11.5):
python --version
Quick Start Guide
1. Install dependencies:
   ```
   pip install -r requirements.txt
   ```
2. Run the app:
   ```
   python app.py
   ```
   Or with a custom port:
   ```
   python app.py --port 8080
   ```
3. Access the web interface: open your browser to `http://localhost:7860` (or the port you specified).
Differences Between Hugging Face Spaces and Local
| Feature | Hugging Face Spaces | Local |
|---|---|---|
| GPU Decorator | `@spaces.GPU` available | No-op decorator (automatic GPU) |
| Port | Set via `PORT` env var | Default 7860, or `--port` arg |
| Dependencies | Pre-installed | Must install manually |
| Environment | `SPACE_ID` env var set | Not set |
| Model Storage | Persistent storage | Local `models/` directory |
Testing the Fixes
After applying the fixes, test with:
# Check imports work
python -c "import gradio, torch, numpy, PIL; print('All imports OK')"
# Run the app
python app.py --port 7860
Still Having Issues?
- Check error messages: Look for specific import errors or file not found errors
- Verify Python environment: Make sure you're using the correct virtual environment
- Check file permissions: Ensure the `models/` and `stimuli/` directories are writable
- Review logs: Check the `logs/` directory for model loading issues