Update README.md
@@ -192,6 +192,24 @@ for bag_features, label in loader: # bag_features: (num_patches, embed_dim)
    optimizer.zero_grad()

```

---

## Offline usage (HPC clusters without internet)

If your cluster doesn’t allow internet access on compute nodes, **pre-download the model on the front-end/login node** (which has internet), so it’s cached locally, then run jobs offline:

```bash
# On the front-end/login node (with internet):
python -c "from transformers import AutoModel; AutoModel.from_pretrained('sofieneb/histaug-conch', trust_remote_code=True)"

# On your compute job (no internet):
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
```

This prevents unnecessary network calls and ensures `transformers` loads HistAug from the local cache.
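To make the two-step workflow concrete, here is a minimal sketch of a Slurm batch script for the offline compute job. The job name, resource directives, and the `extract_features.py` entry point are placeholders for illustration, not part of this repository:

```shell
#!/bin/bash
#SBATCH --job-name=histaug-offline   # hypothetical job name
#SBATCH --ntasks=1

# Assumes the model was already cached on the login node as shown above.
# Both flags must be set before Python imports `transformers`.
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1

python extract_features.py   # placeholder for your own training/inference script
```

With these flags set, `from_pretrained` resolves `sofieneb/histaug-conch` from the local cache and errors out immediately if it is missing, instead of hanging on network timeouts.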

---

## Paper