# Hugging Face Spaces Deployment - Ready to Deploy! ✅
## Files Created & Verified

### 1. Dockerfile
Status: Ready ✅

Key features:
- ✅ Base image: `python:3.11-slim`
- ✅ Installs `curl` and `build-essential`
- ✅ SurrealDB installed via the official script
- ✅ `HF_HOME=/tmp` set for Hugging Face Spaces compatibility
- ✅ `TRANSFORMERS_CACHE=/tmp` for the model cache
- ✅ `SENTENCE_TRANSFORMERS_HOME=/tmp` for embeddings
- ✅ Pre-downloads the `all-MiniLM-L6-v2` model during build
- ✅ Copies all required directories: `api/`, `open_notebook/`, `commands/`, `migrations/`, `prompts/`
- ✅ Exposes port 7860 (required for HF Spaces)
- ✅ Environment variables configured for SurrealDB and the API
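Putting the features above together, a Dockerfile along these lines would satisfy the checklist. This is a sketch, not the repository's actual file: the SurrealDB install URL, the model pre-download command, and the exact `COPY` layout are assumptions.

```dockerfile
FROM python:3.11-slim

# curl for the SurrealDB install script, build-essential for native wheels
RUN apt-get update && apt-get install -y curl build-essential \
    && rm -rf /var/lib/apt/lists/*

# SurrealDB via the official install script
RUN curl -sSf https://install.surrealdb.com | sh

# Writable cache locations for HF Spaces
ENV HF_HOME=/tmp \
    TRANSFORMERS_CACHE=/tmp \
    SENTENCE_TRANSFORMERS_HOME=/tmp

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Pre-download the embedding model so first startup doesn't block on it
RUN python -c "from sentence_transformers import SentenceTransformer; SentenceTransformer('all-MiniLM-L6-v2')"

COPY api/ api/
COPY open_notebook/ open_notebook/
COPY commands/ commands/
COPY migrations/ migrations/
COPY prompts/ prompts/
COPY run_api.py start.sh ./

EXPOSE 7860
CMD ["bash", "start.sh"]
```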
### 2. start.sh
Status: Ready ✅

Startup sequence:
- ✅ Starts SurrealDB with `surreal start --log debug --user root --pass root memory &`
- ✅ Waits 5 seconds for initialization: `sleep 5`
- ✅ Validates that SurrealDB is running
- ✅ Sets environment variables for the API (port 7860, host 0.0.0.0, reload false)
- ✅ Launches the FastAPI app with `python run_api.py`
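A start.sh matching that sequence could look like the sketch below. The environment variable names (`API_HOST`, `API_PORT`, `API_RELOAD`) and the health-check approach are assumptions, not copied from the repository.

```bash
#!/bin/bash
set -e

# 1. Start SurrealDB in the background (in-memory storage)
surreal start --log debug --user root --pass root memory &

# 2. Give the database time to initialize
sleep 5

# 3. Basic liveness check before launching the API
if ! curl -s http://localhost:8000/health > /dev/null; then
    echo "WARNING: SurrealDB health check failed" >&2
fi

# 4. Configure and launch the FastAPI app on the HF Spaces port
export API_HOST=0.0.0.0
export API_PORT=7860
export API_RELOAD=false
python run_api.py
```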
### 3. open_notebook/database/connection.py
Status: Ready ✅

Features:
- ✅ 5 retry attempts with exponential backoff
- ✅ Connects to `ws://localhost:8000/rpc`
- ✅ Proper error handling and logging
- ✅ Works with containerized SurrealDB
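The retry behaviour described above can be sketched generically. The helper below is a hypothetical illustration of "5 attempts with exponential backoff", not the actual `connection.py` code:

```python
import time

def connect_with_retry(connect, attempts=5, base_delay=1.0):
    """Call connect() up to `attempts` times, doubling the delay after each failure."""
    for attempt in range(attempts):
        try:
            return connect()
        except ConnectionError as exc:
            if attempt == attempts - 1:
                raise  # out of retries: propagate the last error
            delay = base_delay * (2 ** attempt)  # 1s, 2s, 4s, 8s, ...
            print(f"Attempt {attempt + 1} failed ({exc}); retrying in {delay:.2f}s")
            time.sleep(delay)

# Example: a connection that fails twice before succeeding
state = {"calls": 0}
def flaky_connect():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("ws://localhost:8000/rpc not ready")
    return "connected"

print(connect_with_retry(flaky_connect, base_delay=0.01))  # → connected
```

In the real module the `connect` callable would open the WebSocket to SurrealDB; the backoff keeps the API from crashing while the database container is still starting.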
### 4. requirements.txt
Status: Ready ✅

All dependencies extracted from `pyproject.toml`.
## Deployment Checklist

### Pre-Deployment
- [x] Dockerfile created with `HF_HOME=/tmp`
- [x] start.sh updated with the correct startup sequence
- [x] Connection retry logic implemented
- [x] All files committed to the git repository
## Deploy to Hugging Face Spaces

### Step 1: Create Space
1. Go to https://huggingface.co/new-space
2. Fill in:
   - Name: `open-notebook` (or your choice)
   - License: MIT
   - SDK: Docker ⚠️ Important!
   - Hardware: CPU Basic (free tier)
### Step 2: Push Code

```bash
# Add the Hugging Face remote
git remote add hf https://huggingface.co/spaces/YOUR_USERNAME/open-notebook

# Push to Hugging Face (this triggers the build)
git push hf main
```
### Step 3: Configure Secrets

In Space Settings → Repository Secrets, add:

Required (at least one):

```
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=AIza...
```

Optional:

```
GROQ_API_KEY=gsk_...
MISTRAL_API_KEY=...
AUTH_PASSWORD=your_secure_password
```
### Step 4: Wait for the Build
- The build takes roughly 10-15 minutes
- Check the build logs for errors
- Once the status shows "Running", your Space is live!
## Accessing Your Deployed App

Once deployed, the app is available at:

```
https://YOUR_USERNAME-open-notebook.hf.space
```

### API Endpoints

Documentation:

```
https://YOUR_USERNAME-open-notebook.hf.space/docs
```

Health check:

```bash
curl https://YOUR_USERNAME-open-notebook.hf.space/health
```

Create a notebook:

```bash
curl -X POST https://YOUR_USERNAME-open-notebook.hf.space/api/notebooks \
  -H "Content-Type: application/json" \
  -d '{"name": "My Research", "description": "First notebook"}'
```
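For scripting against the API, the notebook-creation call above translates into a few lines of standard-library Python. The endpoint path and payload follow the curl example; the shape of the JSON response is an assumption, so the function simply returns whatever the server sends back:

```python
import json
import urllib.request

def create_notebook(base_url, name, description):
    """POST a new notebook to the API and return the parsed JSON reply."""
    payload = json.dumps({"name": name, "description": description}).encode()
    req = urllib.request.Request(
        f"{base_url}/api/notebooks",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage against your deployed Space:
# create_notebook("https://YOUR_USERNAME-open-notebook.hf.space",
#                 "My Research", "First notebook")
```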
## Important Notes

### 🔴 Data Persistence
- In-memory storage: all data is lost whenever the Space restarts
- Why: the container starts the database with `surreal start ... memory &`
- Solution: for persistent data, point `SURREAL_URL` at an external SurrealDB instance
### 💡 Performance Tips

Use free or low-cost AI models:
- Google Gemini (60 requests/min on the free tier)
- OpenAI GPT-3.5 (cheapest paid option)

Optimize requests:
- Use smaller chunk sizes for embeddings
- Cache frequent queries
- Limit concurrent uploads
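The chunk-size tip can be sketched as a simple splitter. This word-based function is an illustration only, not the app's actual chunking code:

```python
def chunk_words(text, max_words=50):
    """Split text into chunks of at most `max_words` words before embedding."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

print(chunk_words("one two three four five", max_words=2))
# → ['one two', 'three four', 'five']
```

Smaller `max_words` values mean more, cheaper embedding calls per document; tune the size against your rate limits.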
Monitor usage:
- Check the HF Spaces logs
- Watch API rate limits
- Track token usage
### ⚠️ Troubleshooting

Build fails?
- Check the logs in the HF Spaces dashboard
- Verify that all COPY paths exist in the repo
- Ensure pyproject.toml is valid

"Connection refused" error?
- SurrealDB didn't start; check the logs
- Increase the sleep time in start.sh to 10 seconds
- Verify that port 8000 is reachable inside the container
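Instead of raising the fixed sleep, start.sh could actively poll the database port. This is a sketch for the startup script, assuming bash is available in the container (the `/dev/tcp` redirection is a bash feature, not POSIX sh):

```bash
#!/bin/bash
# Poll a TCP port until it accepts connections, or fail after a timeout.
wait_for_port() {
  local host=$1 port=$2 timeout=${3:-30}
  local deadline=$((SECONDS + timeout))
  # (exec ...) runs in a subshell, so the probe fd closes automatically
  until (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; do
    if (( SECONDS >= deadline )); then
      echo "ERROR: ${host}:${port} not reachable after ${timeout}s" >&2
      return 1
    fi
    sleep 1
  done
}

# In start.sh, replace "sleep 5" with:
# wait_for_port localhost 8000 30 || exit 1
```

This starts the API as soon as SurrealDB is up instead of always waiting the full fixed delay, and fails loudly (visible in the Space logs) if the database never comes up.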
Model download fails?
- Ensure HF_HOME=/tmp is set
- Check internet access during the build
- Verify the sentence-transformers version

API not responding?
- Check that port 7860 is exposed
- Verify that run_api.py reads the correct port
- Check the FastAPI logs in the Space
## Testing Locally

Before deploying, test the image locally:

```bash
# Build the Docker image
docker build -t open-notebook-test .

# Run the container
docker run -p 7860:7860 \
  -e OPENAI_API_KEY=your_key \
  open-notebook-test

# Test in the browser
open http://localhost:7860/docs
```
## Next Steps After Deployment

### 1. Create the Frontend (Optional)
Deploy the Next.js frontend separately:
- Create a new Space with the Node.js SDK
- Set `NEXT_PUBLIC_API_URL=https://YOUR-backend-SPACE.hf.space`
- Push the frontend code

### 2. Set Up Persistent Storage
For production use, deploy SurrealDB on Railway/Fly.io and update the Space secrets:

```
SURREAL_URL=wss://your-db-instance.com/rpc
SURREAL_USER=your_user
SURREAL_PASS=your_password
```

### 3. Enable Authentication
Set a password in the Space secrets:

```
AUTH_PASSWORD=secure_password_here
```

### 4. Monitor and Scale
- Watch the HF Spaces metrics
- Upgrade to HF Pro ($9/month) for:
  - No cold starts
  - Persistent storage
  - More CPU/RAM
  - Custom domains
Files Summary
c:\Bavesh\Sem6\SE\
βββ Dockerfile β
Ready (port 7860, HF_HOME=/tmp)
βββ start.sh β
Ready (SurrealDB + sleep 5 + run_api.py)
βββ requirements.txt β
Ready (all dependencies)
βββ run_api.py β
Existing (launches on port from env)
βββ open_notebook/
β βββ database/
β βββ connection.py β
Ready (retry logic)
βββ api/ β
Existing
βββ commands/ β
Existing
βββ migrations/ β
Existing
βββ prompts/ β
Existing
βββ pyproject.toml β
Existing
## Support
- Documentation: HUGGINGFACE_DEPLOYMENT.md
- Issues: https://github.com/baveshraam/software-eng-proj/issues
- Discord: https://discord.gg/37XJPXfz2w
## 🎉 You're Ready to Deploy!

Your files are configured correctly. Just push to Hugging Face and watch it build:

```bash
git add .
git commit -m "Configure for Hugging Face Spaces deployment"
git push hf main
```

Good luck! 🚀