# 🚀 Deployment Guide

## Deploy to Hugging Face Spaces

### Prerequisites

1. Install the Hugging Face CLI:

```bash
pip install huggingface_hub
```

2. Log in to Hugging Face:

```bash
huggingface-cli login
```

### Create and Deploy the Space

1. **Create a new Space on the Hugging Face Hub:**

```bash
huggingface-cli repo create --type space --space_sdk gradio your-username/one-pager-generator
```

2. **Clone and set up the repository:**

```bash
git clone https://huggingface.co/spaces/your-username/one-pager-generator
cd one-pager-generator
```

3. **Copy the project files into the Space repository:**

```bash
cp ../one-pager/* .
```

4. **Add, commit, and push:**

```bash
git add .
git commit -m "Initial commit: AI One-Pager Generator"
git push
```

### Alternative: Direct CLI Upload

You can also upload the files directly with the HF CLI:

```bash
huggingface-cli upload your-username/one-pager-generator . --repo-type=space
```

### Files Required for Deployment

- `app.py` - Main application file
- `requirements.txt` - Python dependencies
- `config.yaml` - Space configuration
- `README.md` - Documentation
- `.gitignore` - Git ignore patterns

### Configuration Notes

- The app uses the `distilgpt2` model for better compatibility
- CPU-only inference for free-tier compatibility
- A fallback template system ensures reliable output
- The Gradio interface is optimized for Spaces

### Post-Deployment

After deployment, your Space will be available at:

`https://huggingface.co/spaces/your-username/one-pager-generator`

The app will automatically:

1. Install dependencies from `requirements.txt`
2. Load the AI model
3. Launch the Gradio interface
4. Become accessible via the web

### Troubleshooting

- **Model loading issues**: the app falls back to structured templates
- **Memory issues**: the smaller DistilGPT2 model keeps memory usage low
- **Timeout issues**: CPU inference is slower but more reliable
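The fallback behaviour described in the configuration and troubleshooting notes can be sketched roughly as follows. This is a minimal illustration, not the actual `app.py`: the function names (`generate_one_pager`, `fallback_template`) and the template text are hypothetical, assuming the app wraps model inference in a try/except and reverts to a structured template when loading or generation fails.

```python
def fallback_template(topic: str) -> str:
    """Hypothetical structured template used when the model is unavailable."""
    return (
        f"# {topic}\n\n"
        "## Overview\n"
        f"A one-page summary of {topic}.\n\n"
        "## Key Points\n"
        "- Point 1\n- Point 2\n- Point 3\n"
    )

def generate_one_pager(topic: str) -> str:
    """Try CPU-only model inference; fall back to the template on any failure."""
    try:
        # Heavyweight import kept inside the function so a missing/broken
        # install of transformers triggers the fallback instead of crashing.
        from transformers import pipeline

        # device=-1 forces CPU inference (free-tier compatible).
        generator = pipeline("text-generation", model="distilgpt2", device=-1)
        result = generator(f"One-pager about {topic}:", max_new_tokens=200)
        return result[0]["generated_text"]
    except Exception:
        # Model loading/inference failed: fall back to the structured template.
        return fallback_template(topic)
```

Either path returns a plain string, so the Gradio interface can display the output without knowing which branch produced it.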