# Hugging Face Spaces Deployment Guide

## Steps to Deploy on Hugging Face Spaces

### 1. Create a New Space
1. Go to [Hugging Face Spaces](https://huggingface.co/new-space)
2. Choose a name for your space (e.g., `content-classifier`)
3. Select **Docker** as the SDK
4. Set the space to **Public** or **Private** as needed
5. Click **Create Space**

### 2. Upload Files to Your Space

You need to upload these files to your Space repository:

```
contextClassifier.onnx   # Your ONNX model
app.py                   # FastAPI application
requirements.txt         # Python dependencies
Dockerfile               # Docker configuration
README.md                # This will become the Space's README
```
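
If you don't already have a `Dockerfile`, a minimal sketch for a FastAPI app might look like the following. The assumption here is that `app.py` exposes a FastAPI instance named `app`; adjust the file names and Python version to match your project:

```dockerfile
# Minimal sketch, assuming app.py defines a FastAPI instance named `app`
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first to take advantage of Docker layer caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and model into the image
COPY app.py contextClassifier.onnx ./

# Spaces route traffic to port 7860 by default
EXPOSE 7860
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
```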

### 3. Required Files Content

**For the Space's README.md header, add this at the top:**
```yaml
---
title: Content Classifier
emoji: 🔍
colorFrom: blue
colorTo: purple
sdk: docker
pinned: false
license: mit
app_port: 7860
---
```

### 4. Deployment Process

1. **Via Git (Recommended):**
   ```bash
   git clone https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
   cd YOUR_SPACE_NAME

   # Copy your files (use `copy` instead of `cp` on Windows)
   cp contextClassifier.onnx .
   cp app.py .
   cp requirements.txt .
   cp Dockerfile .

   # Commit and push
   git add .
   git commit -m "Add content classifier API"
   git push
   ```

2. **Via Web Interface:**
   - Use the **Files** tab in your Space
   - Upload each file individually
   - Or drag and drop all files at once
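
Note that Hugging Face requires Git LFS for files larger than 10 MB, which most ONNX models exceed. Before pushing, you would typically track the model like this (a sketch, to be run inside the cloned Space repository):

```shell
# One-time setup: enable Git LFS and track ONNX files
git lfs install
git lfs track "*.onnx"

# Commit the tracking rule together with the model
git add .gitattributes contextClassifier.onnx
git commit -m "Track ONNX model with Git LFS"
git push
```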

### 5. Monitor Deployment

1. Go to your Space URL: `https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME`
2. Check the **Logs** tab to monitor the build process
3. The Space will show "Building" status during deployment
4. Once ready, you'll see the API documentation interface

### 6. Access Your API

Once deployed, your API will be available at:
- **Swagger UI:** `https://YOUR_USERNAME-YOUR_SPACE_NAME.hf.space/docs`
- **API Endpoints:**
  - `POST /predict` - Main prediction endpoint
  - `GET /health` - Health check
  - `GET /model-info` - Model information
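
The endpoints above can also be exercised from the command line; for example (the URL is a placeholder, substitute your actual Space):

```shell
# Health check
curl https://YOUR_USERNAME-YOUR_SPACE_NAME.hf.space/health

# Model metadata
curl https://YOUR_USERNAME-YOUR_SPACE_NAME.hf.space/model-info

# Prediction request
curl -X POST https://YOUR_USERNAME-YOUR_SPACE_NAME.hf.space/predict \
  -H "Content-Type: application/json" \
  -d '{"text": "This is a test message"}'
```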

### 7. Example Usage

```python
import requests

# Replace with your actual Space URL
api_url = "https://YOUR_USERNAME-YOUR_SPACE_NAME.hf.space"

# Make a prediction
response = requests.post(
    f"{api_url}/predict",
    json={"text": "This is a test message"}
)

print(response.json())
```

### 8. Important Notes

- **Model Size:** Hugging Face requires Git LFS for files over 10 MB, so `contextClassifier.onnx` must be tracked with LFS and fit within the Space's storage limits
- **Cold Start:** The first request might take longer as the Space wakes up
- **Logs:** Monitor the logs for any runtime errors
- **Updates:** Any push to the repository will trigger a rebuild
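
Because of cold starts, a client may want to retry failed requests with a generous timeout. A minimal sketch (the endpoint path and payload shape follow the usage example in section 7; the retry counts and delays are arbitrary assumptions):

```python
import time

import requests


def backoff_delays(retries, base=5.0):
    """Linearly increasing delays between attempts: base, 2*base, ..."""
    return [base * (i + 1) for i in range(retries - 1)]


def predict_with_retry(api_url, text, retries=3, base=5.0):
    """POST to /predict, retrying to ride out a Space cold start."""
    delays = backoff_delays(retries, base)
    for attempt in range(retries):
        try:
            response = requests.post(
                f"{api_url}/predict",
                json={"text": text},
                timeout=60,  # cold starts can take tens of seconds
            )
            response.raise_for_status()
            return response.json()
        except requests.RequestException:
            if attempt == retries - 1:
                raise  # out of attempts; surface the error
            time.sleep(delays[attempt])
```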

### 9. Troubleshooting

**Common Issues:**
- **Build Fails:** Check logs for dependency issues
- **Model Not Found:** Ensure `contextClassifier.onnx` is in the root directory
- **Port Issues:** Make sure the app uses port 7860
- **Memory Issues:** Large models might exceed memory limits

**Solutions:**
- Review requirements.txt for compatible versions
- Check model file path in app.py
- Verify Dockerfile exposes port 7860
- Consider model optimization for deployment