HuggingFace Spaces Deployment Guide
Quick Deploy
Method 1: Using Spaces UI (Recommended)
Create New Space
- Go to HuggingFace Spaces
- Click "Create new Space"
- Choose a name: code-interpreter-sandbox
- Select a license: mit
- Select hardware: t4-small (free tier available)
- Select SDK: docker
- Click "Create Space"
Upload Files
# Clone your space repository
git clone https://huggingface.co/spaces/your-username/code-interpreter-sandbox
cd code-interpreter-sandbox

# Copy all files from the code_interpreter directory
cp /path/to/code_interpreter/* .

# Add, commit and push
git add .
git commit -m "Initial commit: Advanced Code Interpreter"
git push
Wait for Build
- Spaces will automatically build your Docker image
- Check the "Logs" tab for progress
- The build typically takes 5-10 minutes
Access Your Space
- Once built, your space will be available at:
https://your-username-code-interpreter-sandbox.hf.space
Method 2: Using GitHub Integration
Create GitHub Repository
# Create a new repo on GitHub
# Upload all files to the repository
Connect to HuggingFace
- Go to HuggingFace Spaces
- Click "Create new Space"
- Choose "Create from GitHub repo"
- Select your repository
- Follow the same steps as Method 1
Configuration
Hardware Requirements
- Free Tier: t4-small or cpu-basic
- GPU: t4-medium or better for ML workloads
- Memory: 16GB+ recommended for large datasets
Environment Variables
No special environment variables required. All configuration is in app.py.
Port Configuration
- Port: 7860
- Host: 0.0.0.0
- Protocol: HTTP
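In a Gradio app, these values are passed to the launch call. A minimal sketch, assuming a Gradio Blocks app (the placeholder content is illustrative, not the actual app.py):

```python
import gradio as gr

with gr.Blocks() as demo:
    gr.Markdown("Code Interpreter Sandbox")  # placeholder content

# Spaces routes traffic to port 7860; binding 0.0.0.0 makes the app
# reachable from outside the container.
demo.launch(server_name="0.0.0.0", server_port=7860)
```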
File Structure
code-interpreter-sandbox/
├── app.py                        # Main application
├── requirements.txt              # Python dependencies
├── Dockerfile                    # Docker configuration
├── .huggingface/spaces_metadata  # Space metadata
├── README.md                     # Documentation
├── examples/
│   ├── data_analysis_example.py  # Data analysis demo
│   ├── ml_example.py             # Machine learning demo
│   └── visualization_example.py  # Visualization demo
└── config/
    ├── packages.json             # Pre-configured packages
    └── settings.json             # App settings
Features Overview
Implemented Features
Code Execution Engine
- Secure Python execution
- Timeout protection
- Error handling
- Output capture (stdout/stderr)
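The combination above — timeout protection plus stdout/stderr capture — can be sketched with the standard library alone. `run_code` here is a hypothetical helper, not the app's actual implementation:

```python
import subprocess
import sys

def run_code(source: str, timeout: int = 30) -> dict:
    """Execute Python source in a subprocess, capturing stdout/stderr.

    Running in a separate interpreter means a crash or hang cannot take
    down the host app; the timeout kills runaway scripts.
    """
    try:
        proc = subprocess.run(
            [sys.executable, "-c", source],
            capture_output=True, text=True, timeout=timeout,
        )
        return {"stdout": proc.stdout, "stderr": proc.stderr,
                "ok": proc.returncode == 0}
    except subprocess.TimeoutExpired:
        return {"stdout": "", "stderr": f"Timed out after {timeout}s",
                "ok": False}
```

A real sandbox would layer resource limits and import restrictions on top, but subprocess isolation plus a timeout is the core of the pattern.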
File Management
- Upload files
- Download results
- File browser
- Multi-file support
- Session isolation
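Session isolation for files usually means giving each session its own scratch directory. A minimal sketch (the `SessionFiles` class is an assumption for illustration, not the app's code):

```python
import shutil
import tempfile
from pathlib import Path

class SessionFiles:
    """Give each session its own scratch directory so uploads never mix."""

    def __init__(self):
        self.root = Path(tempfile.mkdtemp(prefix="sandbox_"))

    def save(self, name: str, data: bytes) -> Path:
        # Path(name).name strips directory components, so an upload
        # named "../evil.txt" cannot escape the session directory.
        path = self.root / Path(name).name
        path.write_bytes(data)
        return path

    def listing(self):
        """Names of all files in this session, for the file browser."""
        return sorted(p.name for p in self.root.iterdir())

    def cleanup(self):
        shutil.rmtree(self.root, ignore_errors=True)
```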
Package Manager
- pip installation
- Popular packages pre-installed
- Batch installation
- Package tracking
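Runtime pip installation is typically a subprocess call against the same interpreter that runs user code. A hedged sketch (`install_packages` is a hypothetical helper):

```python
import subprocess
import sys

def install_packages(names):
    """Install one or more PyPI packages into the running environment.

    `python -m pip` targets the interpreter executing user code, so newly
    installed packages are importable immediately. Passing several names
    at once gives batch installation in a single pip resolve.
    """
    cmd = [sys.executable, "-m", "pip", "install", *names]
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return proc.returncode == 0, proc.stdout + proc.stderr
```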
Visualization Support
- Matplotlib integration
- Plotly support
- Seaborn compatibility
- Bokeh and Altair ready
Session Management
- State persistence
- Uptime tracking
- File history
- Package history
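The state listed above fits in a small per-session object. A sketch, assuming a hypothetical `SessionState` (the real app may structure this differently):

```python
import time

class SessionState:
    """Track uptime plus file and package history for one session."""

    def __init__(self):
        # monotonic clock is immune to system clock changes
        self.started = time.monotonic()
        self.files = []      # files uploaded this session
        self.packages = []   # packages installed this session

    def record_file(self, name):
        self.files.append(name)

    def record_package(self, name):
        self.packages.append(name)

    def uptime_seconds(self):
        return time.monotonic() - self.started
```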
User Interface
- Gradio-based UI
- Tabbed interface
- Syntax highlighting
- Dark theme
- Responsive design
Pre-installed Packages
Essential packages included in requirements.txt:
# Data Science
numpy>=1.24.0 # Numerical computing
pandas>=2.0.0 # Data manipulation
matplotlib>=3.7.0 # Plotting
plotly>=5.15.0 # Interactive plots
seaborn>=0.12.0 # Statistical visualization
scipy>=1.10.0 # Scientific computing
scikit-learn>=1.3.0 # Machine learning
# Image Processing
Pillow>=10.0.0 # Image handling
# Web & APIs
requests>=2.31.0 # HTTP client
beautifulsoup4>=4.12.0 # Web scraping
# NLP
nltk>=3.8.0 # Natural language processing
spacy>=3.6.0 # Advanced NLP
# Graphs
networkx>=3.1 # Graph library
# Math
sympy>=1.12 # Symbolic math
# Visualization
bokeh>=3.2.0 # Interactive plots
altair>=5.0.0 # Declarative visualization
Customization
Adding Pre-installed Packages
Edit requirements.txt to add more packages:
# Add your packages here
your-package>=1.0.0
another-package>=2.0.0
Customizing Timeout/Resource Limits
Edit the CodeExecutor class in app.py:
class CodeExecutor:
def __init__(self, timeout=30, memory_limit=1024): # Increase limits
self.timeout = timeout
self.memory_limit = memory_limit
Adding New Tabs/Features
Add new tabs in the Gradio interface:
with gr.Tab("Your New Tab"):
# Your custom interface
pass
Custom CSS Styling
Edit the CUSTOM_CSS variable in app.py:
CUSTOM_CSS = """
.gradio-container {
max-width: 1600px !important; /* Wider interface */
}
"""
Troubleshooting
Build Fails
- Check Dockerfile syntax
- Verify requirements.txt format
- Review build logs
- Ensure all imports are correct
App Not Loading
- Check port configuration
- Verify environment variables
- Review application logs
- Test locally first
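A quick way to verify the port configuration when testing locally is a TCP reachability check (`port_reachable` is a small helper written for this guide):

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP server is accepting connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused or timed out: nothing is listening.
        return False
```

Running `port_reachable("127.0.0.1", 7860)` after starting app.py locally tells you immediately whether the app bound the expected port.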
Package Installation Issues
- Use correct package names (PyPI names)
- Check version compatibility
- Some packages may require system dependencies
- Review pip output for errors
Memory/Timeout Issues
- Adjust the timeout parameter
- Use smaller datasets
- Process data in chunks
- Consider upgrading hardware tier
Performance Optimization
For Free Tier (t4-small)
- Use efficient algorithms
- Avoid large data loading
- Clear variables between runs
- Use generators/iterators
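The generator advice amounts to streaming data instead of materializing it. A small example of the pattern (names are illustrative):

```python
def running_mean(values):
    """Streaming mean: O(1) memory regardless of input length."""
    total = 0.0
    count = 0
    for v in values:
        total += v
        count += 1
    return total / count if count else 0.0

# Feeding a generator expression keeps only one value in RAM at a time,
# which matters on the free tier's limited memory:
# running_mean(x * x for x in range(10_000_000))
```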
For GPU Tier
- Enable GPU acceleration
- Install CUDA packages
- Use libraries like TensorFlow/PyTorch
- Optimize for parallel processing
Security
Built-in Protections
- Execution timeouts
- Memory limits
- Isolated file system
- Session-based isolation
Best Practices
- Don't execute untrusted code
- Monitor resource usage
- Clear sensitive data
- Use secure package sources
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
- Check HuggingFace Spaces Docs
- Review Gradio Documentation
- Open an Issue for bugs
- Join our Community
Credits
- Gradio: Amazing UI framework
- HuggingFace: Excellent hosting platform
- Python: Core language
- Community: Users and contributors
Happy Coding!