# Hugging Face Spaces Deployment Guide
This guide will walk you through deploying your FastAPI Search Engine to Hugging Face Spaces.
## Prerequisites
1. A Hugging Face account (sign up at https://huggingface.co/join)
2. Git installed on your computer
3. Your application files ready
## Deployment Steps
### Option 1: Deploy via Web Interface (Easiest)
#### Step 1: Create a New Space
1. Go to https://huggingface.co/spaces
2. Click on **"Create new Space"** button
3. Fill in the details:
   - **Space name**: `simple-search-engine` (or your preferred name)
   - **License**: choose an appropriate license (e.g., MIT, Apache 2.0)
   - **Select the SDK**: choose **Docker**
   - **Space hardware**: start with **CPU basic** (free tier)
   - **Visibility**: Public or Private (your choice)
4. Click **"Create Space"**
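For a Docker Space, Hugging Face reads the Space configuration from a YAML front-matter block at the top of `README.md`. A minimal example is shown below; the values are illustrative, so adjust the title and emoji to taste and keep `sdk: docker` and the port your app listens on:

```yaml
---
title: Simple Search Engine
emoji: 🔍
sdk: docker
app_port: 7860
pinned: false
---
```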
#### Step 2: Upload Files
After creating the Space, you'll see options to upload files:
1. **Upload the following files:**
   - `README.md` (the one provided)
   - `Dockerfile` (the one provided)
   - `requirements.txt` (your existing file)
   - `main.py` (your existing file)
   - `.gitignore` (optional, the one provided)
2. Click **"Commit to main"**. Uploading everything in a single commit is preferable: each commit triggers a rebuild, so committing files one at a time produces failed builds on a half-uploaded Space
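If you are writing the Dockerfile yourself rather than using a provided one, a minimal sketch for a FastAPI app served by uvicorn might look like the following. It assumes `main.py` defines an `app` object and that `uvicorn` is listed in `requirements.txt`:

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Hugging Face Spaces routes traffic to port 7860 by default
EXPOSE 7860
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "7860"]
```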
#### Step 3: Wait for Build
1. After uploading all files, Hugging Face will automatically start building your Space
2. You'll see build logs in the interface
3. The build process typically takes 3-5 minutes
4. Once complete, your app will be live at: `https://huggingface.co/spaces/YOUR-USERNAME/simple-search-engine`
---
### Option 2: Deploy via Git (Recommended for developers)
#### Step 1: Create Space on Hugging Face
1. Go to https://huggingface.co/spaces
2. Click **"Create new Space"**
3. Configure as described in Option 1, Step 1
4. After creation, copy the Git clone URL (shown on the Space page)
#### Step 2: Clone the Repository
Open your terminal and run:
```bash
# Clone the Space repository
git clone https://huggingface.co/spaces/YOUR-USERNAME/simple-search-engine
cd simple-search-engine
```
#### Step 3: Add Your Files
```bash
# Copy your files to the cloned directory
# Assuming you're in the simple-search-engine directory
# Copy files
cp /path/to/your/main.py .
cp /path/to/your/requirements.txt .
cp /path/to/provided/Dockerfile .
cp /path/to/provided/README.md .
cp /path/to/provided/.gitignore .
```
#### Step 4: Git Add, Commit, and Push
```bash
# Configure git LFS (if not already done)
git lfs install
# Add all files
git add .
# Commit changes
git commit -m "Initial deployment of search engine"
# Push to Hugging Face
git push
```
#### Step 5: Monitor Deployment
1. Go to your Space URL: `https://huggingface.co/spaces/YOUR-USERNAME/simple-search-engine`
2. Watch the build logs
3. Once built, your app will be live!
---
## Post-Deployment
### Testing Your Deployed App
1. Navigate to your Space URL
2. You should see the purple gradient search interface
3. Try sample queries like:
   - "machine learning AI"
   - "cloud services AWS"
   - "financial revenue"
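Beyond the browser, you can smoke-test the API programmatically. The sketch below assumes a hypothetical `/search` endpoint taking `q` and `top_k` query parameters (adjust to whatever routes `main.py` actually defines). Note that the direct app URL follows the `YOUR-USERNAME-simple-search-engine.hf.space` pattern, which is distinct from the Space *page* URL:

```python
from urllib.parse import urlencode, urljoin

# Direct app URL for a Space (not the Space page URL)
BASE_URL = "https://YOUR-USERNAME-simple-search-engine.hf.space/"

def search_url(query: str, top_k: int = 5) -> str:
    """Build the request URL for a hypothetical /search endpoint."""
    params = urlencode({"q": query, "top_k": top_k})
    return urljoin(BASE_URL, "search") + "?" + params

print(search_url("machine learning AI"))
# -> https://YOUR-USERNAME-simple-search-engine.hf.space/search?q=machine+learning+AI&top_k=5

# Fetch it with, e.g.:
#   import urllib.request
#   body = urllib.request.urlopen(search_url("machine learning AI")).read()
```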
### Monitoring
- Check the **Logs** tab in your Space to see application logs
- Monitor build status and errors
- View usage statistics
### Updating Your App
When you need to update:
**Via Web Interface:**
1. Go to your Space → Files tab
2. Click on the file you want to edit
3. Make changes
4. Commit changes
5. Space will rebuild automatically
**Via Git:**
```bash
# Make changes to your files
git add .
git commit -m "Description of changes"
git push
```
---
## Troubleshooting
### Build Fails
**Issue**: Docker build fails
- **Solution**: Check the build logs for specific errors
- Common causes:
  - Missing dependencies in `requirements.txt`
  - Syntax errors in code
  - Port configuration issues
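As a sanity check on the first cause: for the stack this guide assumes (FastAPI + SBERT + NLTK), `requirements.txt` likely needs at least the packages below. The list is illustrative, so match it to what `main.py` actually imports, and consider pinning versions to keep builds reproducible:

```text
fastapi
uvicorn[standard]
sentence-transformers
nltk
```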
### App Not Loading
**Issue**: Space built successfully but app doesn't load
- **Solution**:
  - Ensure the app listens on `0.0.0.0` and port 7860 (the default port HF Spaces expects; a different port must be declared as `app_port` in the README front matter)
  - Check application logs for runtime errors
  - Verify NLTK downloads completed successfully
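One way to avoid runtime NLTK failures is to bake the data into the image. Assuming your code uses the `punkt` and `stopwords` resources (adjust to whatever `main.py` actually downloads), you could add a line like this to the Dockerfile after the `pip install` step:

```dockerfile
# Pre-download NLTK data at build time so the container starts ready
RUN python -c "import nltk; nltk.download('punkt'); nltk.download('stopwords')"
```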
### Slow Model Loading
**Issue**: First request takes a long time
- **Solution**: This is normal - the SBERT model loads on first request
- The model stays in memory for subsequent requests
- Consider upgrading to better hardware for faster cold starts
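If cold starts are a recurring annoyance, the usual mitigation is to make sure the model is loaded exactly once per process and shared across all requests. A minimal sketch of that load-once pattern follows; the `_load_model` body is a placeholder, and in the real app it would construct the SentenceTransformer:

```python
from functools import lru_cache

def _load_model():
    # Placeholder for the real load, e.g.:
    #   from sentence_transformers import SentenceTransformer
    #   return SentenceTransformer("all-MiniLM-L6-v2")
    return object()

@lru_cache(maxsize=1)
def get_model():
    """Load the model on first call; later calls return the cached instance."""
    return _load_model()

# The first call pays the load cost; subsequent calls are effectively free
assert get_model() is get_model()
```

Calling `get_model()` once at application startup (for example in a FastAPI startup handler) shifts the load cost from the first user request to the deploy itself.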
### Out of Memory
**Issue**: App crashes with memory errors
- **Solution**:
  - Upgrade to a better hardware tier (CPU upgrade or a small GPU)
  - Go to **Settings** → **"Change hardware"**
  - Note: GPU tiers are paid
---
## Upgrading Hardware
If you need better performance:
1. Go to your Space → **Settings**
2. Scroll to **"Space hardware"**
3. Choose from available options:
   - **CPU basic** (Free)
   - **CPU upgrade** (Paid)
   - **T4 small** (GPU, Paid)
   - **T4 medium** (GPU, Paid)
4. Click **"Save"**
---
## Making Your Space Private
1. Go to Space → **Settings**
2. Find **"Visibility"** section
3. Change from "Public" to "Private"
4. When the Space is private, only you and collaborators you add can access it
---
## Additional Resources
- [Hugging Face Spaces Documentation](https://huggingface.co/docs/hub/spaces)
- [Docker Spaces Guide](https://huggingface.co/docs/hub/spaces-sdks-docker)
- [Sentence Transformers Documentation](https://www.sbert.net/)
- [FastAPI Documentation](https://fastapi.tiangolo.com/)
---
## Quick Reference: File Structure
Your final file structure should look like:
```
simple-search-engine/
β”œβ”€β”€ .gitignore
β”œβ”€β”€ Dockerfile
β”œβ”€β”€ README.md
β”œβ”€β”€ main.py
└── requirements.txt
```
---
## Support
If you encounter issues:
1. Check the build logs in your Space
2. Review the troubleshooting section above
3. Ask in Hugging Face Discord or Forums
4. Open an issue on Hugging Face Spaces GitHub
Good luck with your deployment! 🚀