# Simulation Server - HuggingFace Spaces Deployment

## Overview

This project is a Django web application for running numerical simulations with asynchronous processing via Celery and Redis.
## HuggingFace Spaces Configuration

### Option 1: Using Gradio Interface (Recommended)

Create a new file `simulations/gradio_app.py`:

```python
import json
import gradio as gr
import requests

def run_simulation(method, params_json):
    """Forward a simulation request to the local Django API.

    Gradio passes inputs positionally, so parameters arrive as a JSON string.
    """
    response = requests.post(
        "http://localhost:8000/api/runs/",
        json={"method": method, "parameters": json.loads(params_json)},
    )
    response.raise_for_status()
    return response.json()

demo = gr.Interface(
    fn=run_simulation,
    inputs=[gr.Textbox(label="Method"), gr.Textbox(label="Parameters (JSON)")],
    outputs=gr.JSON(),
)

if __name__ == "__main__":
    demo.launch(server_name="0.0.0.0", server_port=7860)
```
### Option 2: Using Docker (Full Features)

1. **Create a new Space on HuggingFace**
   - Go to https://huggingface.co/new-space
   - Choose "Docker" as the SDK
   - Set hardware to "CPU" or "GPU" as needed

2. **Push your code to HuggingFace**

   ```bash
   git add .
   git commit -m "Add HuggingFace deployment"
   git remote add hf https://huggingface.co/username/space-name
   git push hf main
   ```
3. **Create a `Dockerfile`** in your repository root (the Docker SDK on Spaces looks for this exact filename):
   ```dockerfile
   FROM python:3.11-slim

   ENV PYTHONDONTWRITEBYTECODE=1
   ENV PYTHONUNBUFFERED=1

   WORKDIR /app

   # Install the C compiler needed for native wheels, then drop the apt cache in the same layer
   RUN apt-get update \
       && apt-get install -y --no-install-recommends gcc \
       && rm -rf /var/lib/apt/lists/*

   # requirements.txt is copied into the working directory, so install from there
   COPY simulationserver/requirements.txt .
   RUN pip install --no-cache-dir -r requirements.txt

   COPY . .

   EXPOSE 7860

   CMD ["python", "manage.py", "runserver", "0.0.0.0:7860"]
   ```
4. **Update `simulationserver/settings.py`** for HuggingFace:

   ```python
   import os

   # Read secrets and flags from environment variables
   SECRET_KEY = os.environ.get('DJANGO_SECRET_KEY', 'your-secret-key')
   DEBUG = os.environ.get('DEBUG', 'False').lower() in ('true', '1')
   ALLOWED_HOSTS = ['*']  # Spaces proxies requests under its own domain
   ```
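Since several settings are parsed from environment strings, a small helper keeps the boolean parsing in one place. A sketch — the `env_bool` name is ours, not part of the project:

```python
import os

def env_bool(name, default=False):
    """Parse a boolean-ish environment variable: 'true'/'1' (any case) -> True."""
    return os.environ.get(name, str(default)).lower() in ('true', '1')

# In settings.py this would replace the inline expression:
# DEBUG = env_bool('DEBUG')
```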
## Important Notes for HuggingFace Spaces

### Limitations

- **No persistent Redis**: Celery workers won't work with persistent queues
- **No background tasks**: Jobs are limited to the container lifetime
- **Ephemeral storage**: Files are lost when the space restarts
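One way to act on these limitations is to detect the Spaces environment at startup: Spaces injects a `SPACE_ID` variable into the container, so settings can fall back to synchronous mode automatically. A sketch — the helper name is ours:

```python
import os

def running_on_spaces(environ=None):
    """True when executing inside a HuggingFace Space (SPACE_ID is injected there)."""
    environ = os.environ if environ is None else environ
    return 'SPACE_ID' in environ

# e.g. in settings.py: USE_CELERY = not running_on_spaces()
```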
### Recommended: Use Synchronous Mode

For HuggingFace Spaces, modify `simulations/tasks.py` to run synchronously:

```python
def run_simulation_sync(method_slug, params):
    """Run a simulation in-process (no Celery), yielding progress updates."""
    method_func = SIMULATION_METHODS.get(method_slug)
    if not method_func:
        raise ValueError(f"Unknown method: {method_slug}")
    for progress in method_func(params):
        yield progress
        if progress >= 100:
            break
```
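A view (or the Gradio wrapper) can drain such a generator and report progress as it arrives. A self-contained sketch with a stub method registry — `_demo_method` and its progress values are invented for illustration:

```python
def _demo_method(params):
    """Stub simulation: yields progress percentages as it 'computes'."""
    for pct in (25, 50, 75, 100):
        yield pct

SIMULATION_METHODS = {'demo': _demo_method}  # stand-in for the real registry

def run_simulation_sync(method_slug, params):
    """In-process variant of the Celery task: yields progress as it runs."""
    method_func = SIMULATION_METHODS.get(method_slug)
    if not method_func:
        raise ValueError(f"Unknown method: {method_slug}")
    for progress in method_func(params):
        yield progress
        if progress >= 100:
            break

progress_log = list(run_simulation_sync('demo', {}))  # [25, 50, 75, 100]
```

Note that because `run_simulation_sync` is a generator, the `ValueError` for an unknown method is only raised when the caller starts iterating, not at call time.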
### Alternative: Use External Redis

For full Celery support, use an external Redis service:

```python
CELERY_BROKER_URL = os.environ.get('REDIS_URL', 'redis://localhost:6379/0')
```
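To keep the broker and result backend pointing at the same Redis instance, both can be derived from the one variable. A sketch — the `celery_settings` helper is ours, not part of the project:

```python
def celery_settings(environ):
    """Derive Celery connection settings from an environment mapping."""
    url = environ.get('REDIS_URL', 'redis://localhost:6379/0')
    return {
        'CELERY_BROKER_URL': url,
        'CELERY_RESULT_BACKEND': url,  # reuse the same Redis for task results
    }
```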
## Local Development with Docker

```bash
# Build and run (the image serves on port 7860)
docker build -t simulation-server .
docker run -p 7860:7860 simulation-server

# With docker-compose
docker-compose up -d
```
## Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `DJANGO_SECRET_KEY` | Django secret key | `django-insecure-...` |
| `DEBUG` | Debug mode | `False` |
| `CELERY_BROKER_URL` | Redis broker URL | `redis://localhost:6379/0` |
## API Endpoints

- `GET /` - Homepage
- `GET /methods/` - List simulation methods
- `GET /run/create/` - Create new simulation
- `GET /runs/` - List simulation history
- `GET /run/{id}/` - View simulation results
- `POST /api/runs/` - Create simulation via API
- `GET /api/runs/{id}/` - Get simulation status via API
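A minimal client for the two API endpoints, using only the standard library. The payload field names mirror the Gradio wrapper above; the base URL assumes a locally running server:

```python
import json
import urllib.request

BASE = "http://localhost:8000"  # assumption: server running locally

def create_run(method, parameters):
    """POST /api/runs/ and return the decoded JSON response."""
    body = json.dumps({"method": method, "parameters": parameters}).encode()
    req = urllib.request.Request(
        f"{BASE}/api/runs/",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def status_url(run_id):
    """URL for polling GET /api/runs/{id}/."""
    return f"{BASE}/api/runs/{run_id}/"
```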