Deployment Guide
This guide covers different deployment options for the hf-eda-mcp server.
Table of Contents
- Local Development
- Docker Deployment
- HuggingFace Spaces
- Production Considerations
- Deployment Checklist
- Troubleshooting
- Next Steps
Local Development
Prerequisites
- Python 3.13+
- PDM (Python package manager)
- HuggingFace account (optional, for private datasets)
Setup
- Clone the repository:
git clone https://github.com/your-username/hf-eda-mcp.git
cd hf-eda-mcp
- Install dependencies:
pdm install
- Configure environment variables:
cp config.example.env .env
# Edit .env and add your HF_TOKEN if needed
- Run the server:
pdm run hf-eda-mcp
The server will start on http://localhost:7860 with MCP enabled.
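Once the server is up, it helps to confirm it is reachable before pointing MCP clients at it. A minimal readiness-check sketch (the `wait_for_server` and `mcp_sse_url` helpers are illustrative, not part of hf-eda-mcp), assuming the default host and port above:

```python
import time
import urllib.error
import urllib.request


def mcp_sse_url(base="http://localhost:7860"):
    # MCP SSE endpoint path exposed by the Gradio server (see "Accessing the Space" below)
    return base.rstrip("/") + "/gradio_api/mcp/sse"


def wait_for_server(base="http://localhost:7860", timeout=30.0):
    """Poll the server root until it responds with 200 or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(base, timeout=5) as resp:
                return resp.status == 200
        except (urllib.error.URLError, OSError):
            time.sleep(1)  # server not up yet; retry
    return False
```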
Docker Deployment
Build the Image
docker build -t hf-eda-mcp:latest .
Run with Docker
docker run -d \
--name hf-eda-mcp-server \
-p 7860:7860 \
-e HF_TOKEN=your_token_here \
-v hf-cache:/app/cache \
hf-eda-mcp:latest
Run with Docker Compose
- Create a .env file with your configuration:
HF_TOKEN=your_token_here
- Start the service:
docker-compose up -d
- View logs:
docker-compose logs -f
- Stop the service:
docker-compose down
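A docker-compose.yml consistent with the `docker run` flags above might look like the following sketch (illustrative; your repository's compose file may differ):

```yaml
services:
  hf-eda-mcp:
    build: .
    container_name: hf-eda-mcp-server
    ports:
      - "7860:7860"
    env_file: .env          # provides HF_TOKEN and other settings
    volumes:
      - hf-cache:/app/cache # persistent HuggingFace cache

volumes:
  hf-cache:
```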
Docker Configuration Options
Environment variables you can set:
- HF_TOKEN: HuggingFace API token
- GRADIO_SERVER_NAME: Server host (default: 0.0.0.0)
- GRADIO_SERVER_PORT: Server port (default: 7860)
- HF_HOME: Cache directory for HuggingFace
- MCP_SERVER_ENABLED: Enable MCP server (default: true)
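The defaults above can be sketched as a small config loader; `load_config` is illustrative (hf-eda-mcp reads these variables internally), but it shows how the defaults interact:

```python
import os


def load_config(env=None):
    """Read the server settings documented above, falling back to the defaults."""
    if env is None:
        env = os.environ
    return {
        "hf_token": env.get("HF_TOKEN"),  # no default; None means anonymous access
        "host": env.get("GRADIO_SERVER_NAME", "0.0.0.0"),
        "port": int(env.get("GRADIO_SERVER_PORT", "7860")),
        "hf_home": env.get("HF_HOME"),  # cache directory, optional
        "mcp_enabled": env.get("MCP_SERVER_ENABLED", "true").lower() == "true",
    }
```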
HuggingFace Spaces
Deployment Steps
Create a new Space:
- Go to https://huggingface.co/spaces
- Click "Create new Space"
- Choose "Gradio" as the SDK
- Select SDK version 5.49.1 or higher
Upload files:
# Copy files to Spaces directory
cp -r src/ spaces/
cp README.md LICENSE spaces/
# Initialize git in spaces directory
cd spaces
git init
git remote add origin https://huggingface.co/spaces/YOUR-USERNAME/hf-eda-mcp
Configure the Space:
- Copy spaces/README.md as the Space's README
- Ensure spaces/app.py is set as the app file
- Add spaces/requirements.txt for dependencies
Set secrets (for private datasets):
- Go to Space settings
- Add HF_TOKEN as a secret
Deploy:
git add .
git commit -m "Initial deployment"
git push origin main
Space Configuration
The Space will automatically:
- Install dependencies from requirements.txt
- Run app.py as the entry point
- Expose the MCP server at /gradio_api/mcp/sse
Accessing the Space
Your MCP server will be available at:
https://YOUR-USERNAME-hf-eda-mcp.hf.space/gradio_api/mcp/sse
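Spaces serve apps from a subdomain derived from the owner and Space name. Assuming the usual lowercase/hyphen normalization (the exact rules are HuggingFace's; `space_mcp_url` is an illustrative helper), a client can build the endpoint like this:

```python
def space_mcp_url(owner, space):
    """Build the MCP SSE URL for a deployed Space.

    Assumes hostnames are lowercased with underscores and dots mapped to
    hyphens; verify against your Space's actual URL.
    """
    host = f"{owner}-{space}".lower().replace("_", "-").replace(".", "-")
    return f"https://{host}.hf.space/gradio_api/mcp/sse"
```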
Production Considerations
Security
Authentication:
- Use environment variables for sensitive data
- Never commit tokens to version control
- Rotate tokens regularly
Access Control:
- Consider implementing rate limiting
- Use HTTPS for all connections
- Validate all input parameters
Secrets Management:
- Use Docker secrets or environment files
- For Spaces, use the built-in secrets feature
- Consider using a secrets manager (AWS Secrets Manager, HashiCorp Vault)
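One common pattern for the "Docker secrets or environment files" point is to resolve the token from a secret file before falling back to the environment. The `/run/secrets/hf_token` path and the HF_TOKEN_FILE variable below are illustrative conventions, not something hf-eda-mcp defines:

```python
import os


def read_hf_token():
    """Resolve HF_TOKEN from, in order: a Docker secret file,
    a file named by HF_TOKEN_FILE, then the environment."""
    secret_file = "/run/secrets/hf_token"  # conventional Docker secrets mount (illustrative)
    if os.path.exists(secret_file):
        with open(secret_file) as f:
            return f.read().strip()
    path = os.environ.get("HF_TOKEN_FILE")  # hypothetical indirection variable
    if path and os.path.exists(path):
        with open(path) as f:
            return f.read().strip()
    return os.environ.get("HF_TOKEN")
```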
Performance
Caching:
- Enable persistent cache volumes
- Configure appropriate cache sizes
- Monitor cache hit rates
Resource Limits:
- Set memory limits in Docker
- Configure appropriate timeouts
- Monitor CPU and memory usage
Scaling:
- Use load balancers for multiple instances
- Consider horizontal scaling for high traffic
- Monitor response times and adjust resources
Monitoring
Logging:
- Configure structured logging
- Use log aggregation tools (ELK, Splunk)
- Monitor error rates
Metrics:
- Track request counts and latencies
- Monitor cache performance
- Set up alerts for errors
Health Checks:
- Implement health check endpoints
- Configure container health checks
- Set up uptime monitoring
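A health check endpoint can be as small as a stdlib HTTP handler returning JSON. This is a sketch of the pattern, not hf-eda-mcp's actual endpoint:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging


def run(host="0.0.0.0", port=8081):
    """Serve the health endpoint (call from a sidecar thread or process)."""
    HTTPServer((host, port), HealthHandler).serve_forever()
```

A container health check can then poll the endpoint, e.g. with Docker's HEALTHCHECK instruction or an orchestrator probe.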
Backup and Recovery
Data Backup:
- Backup cache volumes regularly
- Document configuration settings
- Version control all code
Disaster Recovery:
- Document deployment procedures
- Test recovery processes
- Maintain rollback capabilities
Deployment Checklist
Pre-Deployment
- All tests passing
- Dependencies up to date
- Security scan completed
- Documentation updated
- Environment variables configured
- Secrets properly managed
Deployment
- Build successful
- Health checks passing
- MCP endpoints accessible
- Tools functioning correctly
- Logs being collected
- Monitoring configured
Post-Deployment
- Verify all tools work
- Check performance metrics
- Monitor error rates
- Test with MCP clients
- Document any issues
- Update runbooks
Troubleshooting
Common Issues
Server won't start:
- Check Python version (3.13+ required)
- Verify all dependencies installed
- Check port availability
- Review logs for errors
MCP connection fails:
- Verify server is running
- Check firewall settings
- Confirm correct URL/port
- Test with curl or browser
Dataset access errors:
- Verify HF_TOKEN is set
- Check token permissions
- Confirm dataset exists
- Test with public dataset first
Performance issues:
- Check cache configuration
- Monitor resource usage
- Reduce sample sizes
- Enable caching
Getting Help
- Check logs: docker logs hf-eda-mcp-server
- Review documentation: See MCP_USAGE.md
- Open an issue: GitHub repository
- Community support: HuggingFace forums
Next Steps
After deployment:
- Configure MCP clients (see deployment/mcp-client-examples.md)
- Test all tools with various datasets
- Set up monitoring and alerts
- Document any custom configurations
- Share your Space with the community!