# ⚑ Quick Start - Deploy MedSAM to HuggingFace Space

## 🎯 Goal
Deploy your MedSAM model as an API that you can call from your backend.

## πŸ“¦ What's in This Folder

```
huggingface_space/
β”œβ”€β”€ app.py                   # Gradio app (upload to HF Space)
β”œβ”€β”€ requirements.txt         # Dependencies (upload to HF Space)
β”œβ”€β”€ README.md                # Space description (upload to HF Space)
β”œβ”€β”€ .gitattributes           # Git LFS config (upload to HF Space)
β”œβ”€β”€ DEPLOYMENT_GUIDE.md      # Detailed deployment steps
β”œβ”€β”€ integration_example.py   # How to use in your backend
β”œβ”€β”€ test_space.py            # Test script after deployment
└── QUICKSTART.md            # This file
```

## πŸš€ Deploy in 5 Steps

### Step 1: Create Space (2 min)

1. Go to: https://huggingface.co/new-space
2. Fill in:
   - Space name: `medsam-inference`
   - SDK: **Gradio**
   - Hardware: **CPU basic** (free) or **T4 small** (GPU, $0.60/hr)
3. Click **Create Space**

### Step 2: Upload Files (3 min)

**Option A: Via Web (Easiest)**

1. In your Space, click **Files** β†’ **Add file** β†’ **Upload files**
2. Upload these 4 files:
   - `app.py`
   - `requirements.txt`
   - `README.md`
   - `.gitattributes`

**Option B: Via Git**

```bash
# Clone your Space
git clone https://huggingface.co/spaces/YOUR_USERNAME/medsam-inference
cd medsam-inference

# Copy the files from this folder into the clone
# (adjust the path if huggingface_space/ lives elsewhere)
cp ../app.py ../requirements.txt ../README.md ../.gitattributes .

# Commit
git add .
git commit -m "Initial commit"
git push
```

### Step 3: Upload Model (2 min)

**Download your model:**

Go to: https://huggingface.co/Aniketg6/Fine-Tuned-MedSAM

Download: `medsam_vit_b.pth` (375 MB)

**Upload to Space:**

- Via web: **Files** β†’ **Add file** β†’ **Upload file** β†’ Upload `medsam_vit_b.pth`
- Via git: 
  ```bash
  # Make sure Git LFS is installed
  git lfs install
  git lfs track "*.pth"
  
  # Copy your model
  cp /path/to/medsam_vit_b.pth .
  
  # Commit (will use LFS for large file)
  git add .gitattributes medsam_vit_b.pth
  git commit -m "Add MedSAM model"
  git push
  ```

### Step 4: Wait for Build (3-5 min)

- HuggingFace will build your Space automatically
- Check **Logs** tab to see progress
- When done, you'll see "Running" status βœ…

### Step 5: Test It! (1 min)

1. Visit your Space: `https://huggingface.co/spaces/YOUR_USERNAME/medsam-inference`
2. Click **Simple Interface** tab
3. Upload a test image
4. Enter X, Y coordinates (e.g., 200, 150)
5. Click **Segment**
6. You should see a mask! πŸŽ‰

## βœ… Your API is Ready!

**Endpoint:** `https://YOUR_USERNAME-medsam-inference.hf.space/api/predict`
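The `/api/predict` contract used by this (older-style) Gradio endpoint takes a JSON body with a `data` list: a base64 data-URL image followed by a JSON string of click points. A minimal sketch of building that payload (the helper name `build_payload` is illustrative, not part of the files in this folder):

```python
import base64
import json
from io import BytesIO

import numpy as np
from PIL import Image

def build_payload(image_array, coords, labels, multimask_output=True):
    """Build the request body: a data-URL image plus a JSON string of points."""
    buf = BytesIO()
    Image.fromarray(image_array).save(buf, format="PNG")
    img_b64 = base64.b64encode(buf.getvalue()).decode()
    points_json = json.dumps({
        "coords": coords,              # e.g. [[200, 150]]
        "labels": labels,              # 1 = foreground click, 0 = background
        "multimask_output": multimask_output,
    })
    return {"data": [f"data:image/png;base64,{img_b64}", points_json]}
```

This is the same shape the client in the next section sends with `requests.post(..., json=payload)`.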

---

## πŸ”— Use in Your Backend

### Quick Integration

1. **Create client file:**

```bash
cd backend
nano medsam_space_client.py
```

2. **Add this code:**

```python
import requests
import json
import base64
from io import BytesIO
from PIL import Image
import numpy as np

SPACE_URL = "https://YOUR_USERNAME-medsam-inference.hf.space/api/predict"

class MedSAMSpacePredictor:
    """Drop-in stand-in for SamPredictor that calls the HF Space API."""

    def __init__(self, space_url):
        self.space_url = space_url
        self.image_array = None
    
    def set_image(self, image):
        self.image_array = image
    
    def predict(self, point_coords, point_labels, multimask_output=True, **kwargs):
        # Convert to base64
        img = Image.fromarray(self.image_array)
        buf = BytesIO()
        img.save(buf, format="PNG")
        img_b64 = base64.b64encode(buf.getvalue()).decode()
        
        # Call API
        points_json = json.dumps({
            "coords": point_coords.tolist(),
            "labels": point_labels.tolist(),
            "multimask_output": multimask_output
        })
        
        resp = requests.post(
            self.space_url,
            json={"data": [f"data:image/png;base64,{img_b64}", points_json]},
            timeout=120
        )
        resp.raise_for_status()  # fail fast on HTTP errors
        
        result = json.loads(resp.json()["data"][0])
        masks = np.array([np.array(m["mask_data"], dtype=bool) for m in result["masks"]])
        scores = np.array(result["scores"])
        
        return masks, scores, None
```

3. **Update app.py:**

```python
# Add import
from medsam_space_client import MedSAMSpacePredictor

# Replace this:
# sam_predictor = SamPredictor(sam)

# With this:
sam_predictor = MedSAMSpacePredictor(
    "https://YOUR_USERNAME-medsam-inference.hf.space/api/predict"
)

# Everything else stays the same!
# sam_predictor.set_image(image_array)
# masks, scores, _ = sam_predictor.predict(...)
```

4. **Done!** Your backend now uses the HF Space API βœ…
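If your backend needs to pass the masks on to a frontend, the boolean arrays returned by `predict` can be serialized, for example, as base64 PNGs. A sketch of one such helper (not part of the files in this folder):

```python
import base64
from io import BytesIO

import numpy as np
from PIL import Image

def mask_to_png_b64(mask: np.ndarray) -> str:
    """Encode a boolean mask as a base64 PNG (white = selected region)."""
    img = Image.fromarray(mask.astype(np.uint8) * 255, mode="L")
    buf = BytesIO()
    img.save(buf, format="PNG")
    return base64.b64encode(buf.getvalue()).decode()
```

The resulting string can go straight into a JSON response or a `data:image/png;base64,...` URL.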

---

## πŸ§ͺ Test Your Integration

```bash
cd backend/huggingface_space

# Update SPACE_URL in test_space.py first
nano test_space.py

# Run test
python test_space.py path/to/test/image.jpg 200 150
```

You should see:
```
βœ… TEST PASSED! Your Space is working correctly!
```

---

## πŸ’° Cost

**Free Tier (CPU Basic):**
- βœ… Free!
- ⚠️ Slower (~5-10 seconds per image)
- ⚠️ Sleeps after 48h inactivity

**Paid Tier (T4 Small GPU):**
- πŸ’° $0.60/hour
- βœ… Fast (~1-2 seconds)
- βœ… Always on

**Upgrade:** Space Settings β†’ Hardware β†’ T4 small

---

## πŸ› Troubleshooting

**"Application startup failed"**
β†’ Check the Logs tab and make sure `medsam_vit_b.pth` was uploaded

**"Space is sleeping"**
β†’ First request wakes it (takes 10-20s)

**API timeout**
β†’ Space might be sleeping or overloaded, retry
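One way to soften sleep/timeout hiccups is a small retry wrapper around the POST. A sketch (the `post` parameter is injectable purely so the logic can be exercised without the network; `delay` and `attempts` are illustrative defaults):

```python
import time

import requests

def post_with_retry(url, payload, attempts=3, delay=5, timeout=120,
                    post=requests.post):
    """POST to the Space, retrying; the first call after sleep can take 10-20 s."""
    for attempt in range(attempts):
        try:
            resp = post(url, json=payload, timeout=timeout)
            resp.raise_for_status()
            return resp
        except requests.RequestException:
            if attempt == attempts - 1:
                raise  # out of retries, surface the error
            time.sleep(delay * (attempt + 1))  # simple linear backoff
```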

**CORS error**
β†’ CORS only affects browser-side calls; route requests through your backend, or update its CORS settings

---

## πŸ“š More Info

- **Detailed guide:** `DEPLOYMENT_GUIDE.md`
- **Integration examples:** `integration_example.py`
- **Test script:** `test_space.py`

---

## ✨ Summary

1. βœ… Create Space on HuggingFace (2 min)
2. βœ… Upload 4 files + model (5 min)
3. βœ… Wait for build (3-5 min)
4. βœ… Test via UI (1 min)
5. βœ… Integrate with backend (5 min)
6. πŸŽ‰ **Total: ~15 minutes!**

**Your MedSAM model is now a cloud API!** πŸš€

---

**Questions? Check:** `DEPLOYMENT_GUIDE.md`