# 🚀 Hugging Face Spaces Deployment Guide

## Quick Deploy Steps
### 1. Create Your Space

- Go to [huggingface.co/new-space](https://huggingface.co/new-space)
- **Name:** `content-classifier` (or your preferred name)
- **SDK:** Docker
- **Visibility:** Public/Private (your choice)
- Click **Create Space**
### 2. Upload Files

Upload these files to your Space:

**Required Files:**
- `contextClassifier.onnx` (your model file)
- `app.py`
- `requirements.txt`
- `Dockerfile`
- `README.md`

**Optional Files:**
- `test_api.py` (for testing)
### 3. Model File

⚠️ **Important:** Make sure your `contextClassifier.onnx` file is in the same directory as these files before uploading.
### 4. Git Method (Recommended)

```
# Clone your space
git clone https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
cd YOUR_SPACE_NAME

# Copy your model file (Windows; use `cp` on macOS/Linux)
copy path\to\your\contextClassifier.onnx .

# Copy all project files
copy app.py .
copy requirements.txt .
copy Dockerfile .
copy README.md .

# Add and commit
git add .
git commit -m "🚀 Add content classifier ONNX model"
git push
```
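Note: the Hugging Face Hub stores large binary files via Git LFS, and files over 10 MB must be tracked before the first commit or the push will be rejected. A typical one-time setup inside the cloned Space (assumes Git LFS is installed locally):

```
git lfs install
git lfs track "*.onnx"
git add .gitattributes
```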
### 5. Monitor Deployment

- **Check Build Logs:** Go to your Space → **Logs** tab
- **Wait for Build:** Usually takes 2-3 minutes
- **Check Status:** Space will show "Building" → "Running"
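The monitoring steps above can also be scripted. A minimal stdlib-only sketch (it assumes the `/health` endpoint from this guide returns `{"status": "healthy"}` as JSON once the Space is up):

```python
import json
import time
import urllib.request


def is_healthy(body: bytes) -> bool:
    """True if a /health response body reports a healthy status."""
    try:
        return json.loads(body).get("status") == "healthy"
    except (ValueError, AttributeError):
        return False


def wait_until_running(base_url: str, timeout: float = 300, interval: float = 10) -> bool:
    """Poll BASE_URL/health until it answers healthy or TIMEOUT seconds pass."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(f"{base_url}/health", timeout=10) as resp:
                if is_healthy(resp.read()):
                    return True
        except OSError:
            pass  # still building, or a transient network error
        time.sleep(interval)
    return False
```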
### 6. Test Your Space

Once deployed, your API will be available at:
`https://YOUR_USERNAME-YOUR_SPACE_NAME.hf.space`
**API Endpoints:**
- `/docs` - Interactive documentation
- `/predict` - Main prediction endpoint
- `/health` - Health check
- `/model-info` - Model information
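These endpoint paths can be combined with the Space URL pattern above. A small helper sketch (the exact slug normalization, e.g. lowercasing or `_` → `-`, may vary, so verify against the URL your Space actually shows):

```python
def space_url(username: str, space_name: str) -> str:
    """Public URL for a Docker Space, per the pattern in this guide."""
    return f"https://{username}-{space_name}.hf.space"


def endpoint(username: str, space_name: str, path: str) -> str:
    """Full URL for one of the API endpoints listed above."""
    return space_url(username, space_name) + path
```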
### 7. Example Usage

```python
import requests

# Replace with your actual Space URL
api_url = "https://YOUR_USERNAME-YOUR_SPACE_NAME.hf.space"

response = requests.post(
    f"{api_url}/predict",
    json={"text": "This is a test message for classification"},
)
print(response.json())
```
## Troubleshooting

### Common Issues:

**Build Fails:**
- Check the **Logs** tab for error details
- Verify all required files are uploaded
- Ensure `contextClassifier.onnx` is present

**Model Not Found:**
- Verify `contextClassifier.onnx` is in the root directory
- Check the file name matches exactly (case-sensitive)

**API Not Responding:**
- Check if the Space is "Running" (not "Building")
- Try accessing the `/health` endpoint first
- Check Logs for runtime errors

**Memory Issues:**
- ONNX model might be too large
- Consider model optimization
- Check Space hardware limits
### Success Indicators:

- ✅ Space shows "Running" status
- ✅ `/health` endpoint returns `{"status": "healthy"}`
- ✅ `/docs` shows interactive API documentation
- ✅ `/predict` accepts POST requests and returns the expected format
## Next Steps

- Test thoroughly with various text inputs
- Share your Space with the community
- Monitor usage in Space analytics
- Update the model by pushing a new `contextClassifier.onnx`
Your Content Classifier is now live and ready to use! 🎉