# Deployment Guide
Complete guide for deploying the Land Redistribution Algorithm to production.
## Table of Contents
- [Prerequisites](#prerequisites)
- [Backend Deployment (Hugging Face Spaces)](#backend-deployment-hugging-face-spaces)
- [Frontend Deployment (Streamlit Cloud)](#frontend-deployment-streamlit-cloud)
- [Local Docker Testing](#local-docker-testing)
- [Environment Variables](#environment-variables)
- [Troubleshooting](#troubleshooting)
## Prerequisites
### For All Deployments
- Git installed on your machine
- GitHub account (for Streamlit Cloud)
- Hugging Face account (for backend deployment)
### For Local Testing
- Docker and Docker Compose installed
- Python 3.11+ (for non-Docker development)
- Make (optional, for convenience commands)
## Backend Deployment (Hugging Face Spaces)
Hugging Face Spaces provides free hosting for ML applications with Docker support.
### Step 1: Create a New Space
1. Go to [Hugging Face Spaces](https://huggingface.co/spaces)
2. Click **"Create new Space"**
3. Configure:
- **Space name**: `land-redistribution-api` (or your choice)
- **License**: MIT
- **Select the Space SDK**: Docker
- **Visibility**: Public or Private
### Step 2: Prepare Backend Files
The backend directory is already configured with:
- ✅ `Dockerfile` - Multi-stage production build
- ✅ `README_HF.md` - Hugging Face metadata
- ✅ `requirements.txt` - Python dependencies
- ✅ `.dockerignore` - Build optimization
### Step 3: Deploy to Hugging Face
#### Option A: Git Push (Recommended)
```bash
# Navigate to backend directory
cd /Volumes/WorkSpace/Project/REMB/algorithms/backend
# Initialize git (if not already)
git init
# Add Hugging Face remote using your space name
git remote add hf https://huggingface.co/spaces/<YOUR_USERNAME>/<SPACE_NAME>
# Rename README for Hugging Face
cp README_HF.md README.md
# Add and commit files
git add .
git commit -m "Initial deployment"
# Push to Hugging Face
git push hf main
```
#### Option B: Web Upload
1. In your Space, click **"Files and versions"**
2. Upload all files from `backend/` directory
3. Ensure `README_HF.md` is renamed to `README.md`
### Step 4: Wait for Build
- Hugging Face will automatically build your Docker image
- Build time: ~5-10 minutes
- Monitor progress in the "Logs" tab
### Step 5: Test Your Backend API
Once deployed, your API will be available at:
```
https://<YOUR_USERNAME>-<SPACE_NAME>.hf.space
```
Test endpoints:
```bash
# Health check
curl https://<YOUR_USERNAME>-<SPACE_NAME>.hf.space/health
# API documentation
open https://<YOUR_USERNAME>-<SPACE_NAME>.hf.space/docs
```
## Frontend Deployment (Streamlit Cloud)
### Step 1: Push Frontend to GitHub
```bash
cd /Volumes/WorkSpace/Project/REMB/algorithms/frontend
# Initialize git repository (if not already)
git init
# Add GitHub remote
git remote add origin https://github.com/<YOUR_USERNAME>/land-redistribution-ui.git
# Add all files
git add .
git commit -m "Initial commit"
# Push to GitHub
git branch -M main
git push -u origin main
```
### Step 2: Deploy on Streamlit Cloud
1. Go to [Streamlit Cloud](https://streamlit.io/cloud)
2. Sign in with GitHub
3. Click **"New app"**
4. Configure:
- **Repository**: Select your frontend repository
- **Branch**: `main`
- **Main file path**: `app.py`
### Step 3: Configure Environment Variables
In Streamlit Cloud, add secrets:
1. Go to your app settings
2. Click **"Secrets"**
3. Add:
```toml
API_URL = "https://<YOUR_HF_USERNAME>-<SPACE_NAME>.hf.space"
```
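To keep the same code working both locally and on Streamlit Cloud, the frontend can fall back to a default when no `API_URL` is configured. The sketch below uses a hypothetical `resolve_api_url` helper reading an environment variable; the real `app.py` may read `st.secrets["API_URL"]` directly instead.

```python
import os

def resolve_api_url(default: str = "http://localhost:8000") -> str:
    """Return the backend base URL, preferring a configured API_URL.

    Stripping the trailing slash avoids double slashes when endpoint
    paths like "/health" are appended later.
    """
    url = os.environ.get("API_URL", default)
    return url.rstrip("/")

print(resolve_api_url())
```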
### Step 4: Deploy
- Click **"Deploy"**
- Streamlit Cloud will install dependencies and launch your app
- Your app will be available at: `https://<APP_NAME>.streamlit.app`
## Local Docker Testing
Before deploying to production, test locally with Docker Compose.
### Quick Start
```bash
# Navigate to algorithms directory
cd /Volumes/WorkSpace/Project/REMB/algorithms
# Build and start services
make build
make up
# View logs
make logs
# Test services
make health
```
### Manual Testing
```bash
# Build backend
docker-compose build backend
# Start all services
docker-compose up -d
# Check status
docker-compose ps
# View logs
docker-compose logs -f
# Test backend
curl http://localhost:8000/health
# Access frontend
open http://localhost:8501
# Stop services
docker-compose down
```
### Testing the Backend Container Only
```bash
cd backend
# Build image
docker build -t land-redistribution-api .
# Run container
docker run -p 7860:7860 land-redistribution-api
# Test in another terminal
curl http://localhost:7860/health
```
## Environment Variables
### Backend (.env or Hugging Face Secrets)
```bash
API_HOST=0.0.0.0
API_PORT=7860
CORS_ORIGINS=*
LOG_LEVEL=INFO
```
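How the backend consumes `CORS_ORIGINS` depends on `main.py`; as a rough sketch (the `cors_origins` helper below is hypothetical, not necessarily how the backend is wired), a comma-separated value can be parsed into the list that FastAPI's `CORSMiddleware` expects:

```python
import os

def cors_origins() -> list[str]:
    """Parse a comma-separated CORS_ORIGINS value; '*' allows any origin."""
    raw = os.environ.get("CORS_ORIGINS", "*")
    return [origin.strip() for origin in raw.split(",") if origin.strip()]

# In production you would set a specific value, e.g.
# CORS_ORIGINS=https://<APP_NAME>.streamlit.app
print(cors_origins())
```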
### Frontend (.env or Streamlit Secrets)
```bash
# Development
API_URL=http://localhost:8000
# Production (use your actual Hugging Face Space URL)
API_URL=https://<YOUR_HF_USERNAME>-<SPACE_NAME>.hf.space
```
## Troubleshooting
### Backend Issues
#### Build Fails on Hugging Face
**Problem**: Docker build fails with dependency errors
**Solution**:
1. Check Dockerfile syntax
2. Verify requirements.txt has pinned versions
3. Check build logs in Hugging Face Space
4. Test locally first: `docker build -t test ./backend`
#### API Returns 500 Error
**Problem**: Backend starts but API endpoints fail
**Solution**:
1. Check logs in Hugging Face Space
2. Verify all imports work: Test locally with Docker
3. Check CORS settings in `main.py`
#### Slow Performance
**Problem**: API is slow or times out
**Solution**:
- Reduce optimization parameters (population_size, generations)
- Consider upgrading to Hugging Face paid tier for more resources
- Add caching for common requests
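As a minimal sketch of the caching suggestion, identical parameter sets can be memoized with `functools.lru_cache`; here `run_optimization` is a stand-in for the real CPU-heavy algorithm, not the project's actual function:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def run_optimization(population_size: int, generations: int) -> int:
    # Placeholder for the expensive genetic-algorithm run.
    return population_size * generations

run_optimization(100, 50)  # computed
run_optimization(100, 50)  # served from the cache
```

Note that an in-process cache is lost whenever the Space restarts; persisting results across restarts would require external storage.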
### Frontend Issues
#### Cannot Connect to Backend
**Problem**: Frontend shows "Cannot connect to API"
**Solution**:
1. Verify `API_URL` environment variable is set correctly in Streamlit Secrets
2. Check backend is running: Visit backend URL directly
3. Check CORS settings on backend
4. Verify no typos in API_URL (should include https://)
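A small sanity check in the frontend can surface a malformed `API_URL` before any request is made. This is a hypothetical helper, not part of the existing codebase:

```python
def check_api_url(url: str) -> str:
    """Validate and normalize the backend URL; raise on obvious typos."""
    url = url.strip().rstrip("/")
    if not url.startswith(("http://", "https://")):
        raise ValueError(f"API_URL must include a scheme (https://...): {url!r}")
    return url

check_api_url("https://user-space.hf.space/")
```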
#### Streamlit Cloud Build Fails
**Problem**: Deployment fails on Streamlit Cloud
**Solution**:
1. Check `requirements.txt` for incompatible versions
2. Verify `app.py` has no syntax errors
3. Check Streamlit Cloud build logs
4. Test locally: `streamlit run app.py`
### Docker Compose Issues
#### Port Already in Use
**Problem**: `Error: port is already allocated`
**Solution**:
```bash
# Find process using port
lsof -i :8000
lsof -i :8501
# Kill process
kill -9 <PID>
# Or change ports in docker-compose.yml
```
#### Container Crashes on Startup
**Problem**: Service exits immediately
**Solution**:
```bash
# Check logs
docker-compose logs backend
docker-compose logs frontend
# Run container interactively
docker run -it land-redistribution-api /bin/bash
# Check health
docker-compose ps
```
## Performance Optimization
### Backend
1. **Reduce CPU-intensive operations**:
- Lower default `population_size` and `generations`
- Add request timeouts
- Implement result caching
2. **Optimize Docker image**:
- Use multi-stage builds (already implemented)
- Minimize layers
- Remove unnecessary dependencies
### Frontend
1. **Optimize Streamlit**:
- Use `@st.cache_data` for expensive computations
- Lazy load visualizations
- Reduce re-renders with `st.session_state`
2. **Reduce API calls**:
- Cache results in session state
- Batch multiple requests
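The session-state caching idea can be sketched as follows. A plain dict stands in for `st.session_state` so the example runs without Streamlit, and `fetch_result` is a hypothetical wrapper around the real API call:

```python
calls = 0  # counts simulated backend requests

def fetch_result(cache: dict, population_size: int, generations: int) -> dict:
    """Return a cached response when the same parameters were already sent."""
    global calls
    key = (population_size, generations)
    if key not in cache:
        calls += 1  # stand-in for the real POST to the backend
        cache[key] = {"population_size": population_size,
                      "generations": generations}
    return cache[key]

state: dict = {}  # in app.py this would be st.session_state
fetch_result(state, 100, 50)
fetch_result(state, 100, 50)  # no second API call
```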
## Monitoring
### Hugging Face Spaces
- View logs: Space → Logs tab
- Check metrics: Space → Settings → Usage
- Restart: Space → Settings → Factory reboot
### Streamlit Cloud
- View logs: App → Manage app → Logs
- Check analytics: App → Analytics
- Restart: App → Manage app → Reboot app
## Security Considerations
1. **Environment Variables**: Never commit `.env` files with secrets
2. **CORS**: In production, replace `CORS_ORIGINS=*` with specific domains
3. **Rate Limiting**: Consider adding rate limiting for public APIs
4. **Input Validation**: Backend validates all inputs (already implemented)
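For illustration only (the actual limits and field names in the backend may differ), a bounds check on the optimization parameters might look like:

```python
def validate_params(population_size: int, generations: int) -> None:
    """Reject parameter values outside sane bounds (illustrative limits)."""
    if not 2 <= population_size <= 1000:
        raise ValueError("population_size must be between 2 and 1000")
    if not 1 <= generations <= 500:
        raise ValueError("generations must be between 1 and 500")

validate_params(100, 50)  # passes silently
```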
## Next Steps
1. ✅ Test locally with Docker Compose
2. ✅ Deploy backend to Hugging Face Spaces
3. ✅ Deploy frontend to Streamlit Cloud
4. ✅ Configure environment variables
5. ✅ Test end-to-end flow
6. Monitor performance and logs
7. Share with users!
## Support
For issues or questions:
- Backend API: Check Hugging Face Space discussions
- Frontend: Check Streamlit Community forum
- General: Open an issue on GitHub
## License
MIT