Add new SentenceTransformer model
README.md CHANGED

@@ -7,7 +7,7 @@ tags:
 - generated_from_trainer
 - dataset_size:90000
 - loss:MultipleNegativesRankingLoss
-base_model: sentence-transformers/all-MiniLM-
+base_model: sentence-transformers/all-MiniLM-L12-v2
 widget:
 - source_sentence: what is the maximum i can contribute to a traditional ira
   sentences:

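The `loss:MultipleNegativesRankingLoss` tag above names the in-batch-negatives ranking loss from sentence-transformers that this model was trained with. A minimal construction sketch, assuming only the public sentence-transformers API (the actual 90,000 training pairs are not shown in this diff):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Base model named in the metadata above.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")

# MultipleNegativesRankingLoss expects (anchor, positive) pairs such as the
# widget's question/passage example; within a batch, every other positive
# serves as a negative for a given anchor.
loss = MultipleNegativesRankingLoss(model)
```
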
@@ -121,7 +121,7 @@ metrics:
 - cosine_mrr@10
 - cosine_map@100
 model-index:
-- name: SentenceTransformer based on sentence-transformers/all-MiniLM-
+- name: SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2
   results:
   - task:
       type: information-retrieval

@@ -131,10 +131,10 @@ model-index:
       type: NanoMSMARCO
     metrics:
     - type: cosine_accuracy@1
-      value: 0.
+      value: 0.32
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
-      value: 0.
+      value: 0.52
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
       value: 0.62

@@ -143,10 +143,10 @@ model-index:
       value: 0.72
       name: Cosine Accuracy@10
     - type: cosine_precision@1
-      value: 0.
+      value: 0.32
       name: Cosine Precision@1
     - type: cosine_precision@3
-      value: 0.
+      value: 0.1733333333333333
       name: Cosine Precision@3
     - type: cosine_precision@5
       value: 0.124

@@ -155,10 +155,10 @@ model-index:
       value: 0.07200000000000001
       name: Cosine Precision@10
     - type: cosine_recall@1
-      value: 0.
+      value: 0.32
       name: Cosine Recall@1
     - type: cosine_recall@3
-      value: 0.
+      value: 0.52
       name: Cosine Recall@3
     - type: cosine_recall@5
       value: 0.62

@@ -167,13 +167,13 @@ model-index:
       value: 0.72
       name: Cosine Recall@10
     - type: cosine_ndcg@10
-      value: 0.
+      value: 0.5182449787606596
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
-      value: 0.
+      value: 0.45404761904761903
       name: Cosine Mrr@10
     - type: cosine_map@100
-      value: 0.
+      value: 0.4681213273999474
       name: Cosine Map@100
   - task:
       type: information-retrieval

@@ -183,49 +183,49 @@ model-index:
       type: NanoNQ
     metrics:
     - type: cosine_accuracy@1
-      value: 0.
+      value: 0.38
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
-      value: 0.
+      value: 0.52
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
-      value: 0.
+      value: 0.58
       name: Cosine Accuracy@5
     - type: cosine_accuracy@10
-      value: 0.
+      value: 0.7
       name: Cosine Accuracy@10
     - type: cosine_precision@1
-      value: 0.
+      value: 0.38
       name: Cosine Precision@1
     - type: cosine_precision@3
-      value: 0.
+      value: 0.18666666666666665
       name: Cosine Precision@3
     - type: cosine_precision@5
-      value: 0.
+      value: 0.128
       name: Cosine Precision@5
     - type: cosine_precision@10
-      value: 0.
+      value: 0.076
       name: Cosine Precision@10
     - type: cosine_recall@1
-      value: 0.
+      value: 0.36
       name: Cosine Recall@1
     - type: cosine_recall@3
-      value: 0.
+      value: 0.5
       name: Cosine Recall@3
     - type: cosine_recall@5
-      value: 0.
+      value: 0.57
       name: Cosine Recall@5
     - type: cosine_recall@10
-      value: 0.
+      value: 0.67
       name: Cosine Recall@10
     - type: cosine_ndcg@10
-      value: 0.
+      value: 0.5134978713592498
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
-      value: 0.
+      value: 0.47138888888888886
       name: Cosine Mrr@10
     - type: cosine_map@100
-      value: 0.
+      value: 0.4692659575514759
       name: Cosine Map@100
   - task:
       type: nano-beir

@@ -235,61 +235,61 @@ model-index:
       type: NanoBEIR_mean
     metrics:
     - type: cosine_accuracy@1
-      value: 0.
+      value: 0.35
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
-      value: 0.
+      value: 0.52
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
-      value: 0.
+      value: 0.6
       name: Cosine Accuracy@5
     - type: cosine_accuracy@10
-      value: 0.
+      value: 0.71
       name: Cosine Accuracy@10
     - type: cosine_precision@1
-      value: 0.
+      value: 0.35
       name: Cosine Precision@1
     - type: cosine_precision@3
-      value: 0.
+      value: 0.18
       name: Cosine Precision@3
     - type: cosine_precision@5
-      value: 0.
+      value: 0.126
       name: Cosine Precision@5
     - type: cosine_precision@10
-      value: 0.
+      value: 0.07400000000000001
       name: Cosine Precision@10
     - type: cosine_recall@1
-      value: 0.
+      value: 0.33999999999999997
       name: Cosine Recall@1
     - type: cosine_recall@3
-      value: 0.
+      value: 0.51
       name: Cosine Recall@3
     - type: cosine_recall@5
-      value: 0.
+      value: 0.595
       name: Cosine Recall@5
     - type: cosine_recall@10
-      value: 0.
+      value: 0.6950000000000001
       name: Cosine Recall@10
     - type: cosine_ndcg@10
-      value: 0.
+      value: 0.5158714250599548
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
-      value: 0.
+      value: 0.46271825396825395
       name: Cosine Mrr@10
     - type: cosine_map@100
-      value: 0.
+      value: 0.46869364247571166
       name: Cosine Map@100
 ---
 
-# SentenceTransformer based on sentence-transformers/all-MiniLM-
+# SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2
 
-This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-
+This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
 
 ## Model Details
 
 ### Model Description
 - **Model Type:** Sentence Transformer
-- **Base model:** [sentence-transformers/all-MiniLM-
+- **Base model:** [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) <!-- at revision 936af83a2ecce5fe87a09109ff5cbcefe073173a -->
 - **Maximum Sequence Length:** 128 tokens
 - **Output Dimensionality:** 384 dimensions
 - **Similarity Function:** Cosine Similarity

@@ -342,9 +342,9 @@ print(embeddings.shape)
 # Get the similarity scores for the embeddings
 similarities = model.similarity(embeddings, embeddings)
 print(similarities)
-# tensor([[
-# [0.
-# [0.
+# tensor([[0.9999, 0.5493, 0.3900],
+#         [0.5493, 1.0000, 0.1239],
+#         [0.3900, 0.1239, 1.0001]])
 ```
 
 <!--

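The hunk above is the tail of the card's usage example, where `model.similarity` turns the sentence embeddings into a cosine-similarity matrix. A self-contained sketch of that flow, using placeholder sentences and a placeholder repository id (neither is given in this diff):

```python
from sentence_transformers import SentenceTransformer

# Placeholder repository id; substitute the model this card belongs to.
model = SentenceTransformer("your-username/your-finetuned-model")

sentences = [
    "what is the maximum i can contribute to a traditional ira",   # widget query from the metadata
    "Contribution limits for a traditional IRA depend on age and tax year.",  # illustrative passage
    "The weather forecast calls for rain tomorrow.",               # unrelated sentence
]
embeddings = model.encode(sentences)
print(embeddings.shape)  # (3, 384) -- the card lists 384 output dimensions

# Cosine similarity matrix, analogous to the tensor printed in the hunk above.
similarities = model.similarity(embeddings, embeddings)
print(similarities)
```
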
@@ -382,21 +382,21 @@ You can finetune this model on your own dataset.
 
 | Metric | NanoMSMARCO | NanoNQ |
 |:--------------------|:------------|:-----------|
-| cosine_accuracy@1 | 0.
-| cosine_accuracy@3 | 0.
-| cosine_accuracy@5 | 0.62 | 0.
-| cosine_accuracy@10 | 0.72 | 0.
-| cosine_precision@1 | 0.
-| cosine_precision@3 | 0.
-| cosine_precision@5 | 0.124 | 0.
-| cosine_precision@10 | 0.072 | 0.
-| cosine_recall@1 | 0.
-| cosine_recall@3 | 0.
-| cosine_recall@5 | 0.62 | 0.
-| cosine_recall@10 | 0.72 | 0.
-| **cosine_ndcg@10** | **0.
-| cosine_mrr@10 | 0.
-| cosine_map@100 | 0.
+| cosine_accuracy@1 | 0.32 | 0.38 |
+| cosine_accuracy@3 | 0.52 | 0.52 |
+| cosine_accuracy@5 | 0.62 | 0.58 |
+| cosine_accuracy@10 | 0.72 | 0.7 |
+| cosine_precision@1 | 0.32 | 0.38 |
+| cosine_precision@3 | 0.1733 | 0.1867 |
+| cosine_precision@5 | 0.124 | 0.128 |
+| cosine_precision@10 | 0.072 | 0.076 |
+| cosine_recall@1 | 0.32 | 0.36 |
+| cosine_recall@3 | 0.52 | 0.5 |
+| cosine_recall@5 | 0.62 | 0.57 |
+| cosine_recall@10 | 0.72 | 0.67 |
+| **cosine_ndcg@10** | **0.5182** | **0.5135** |
+| cosine_mrr@10 | 0.454 | 0.4714 |
+| cosine_map@100 | 0.4681 | 0.4693 |
 
 #### Nano BEIR
 

@@ -414,21 +414,21 @@ You can finetune this model on your own dataset.
 
 | Metric | Value |
 |:--------------------|:-----------|
-| cosine_accuracy@1 | 0.
-| cosine_accuracy@3 | 0.
-| cosine_accuracy@5 | 0.
-| cosine_accuracy@10 | 0.
-| cosine_precision@1 | 0.
-| cosine_precision@3 | 0.
-| cosine_precision@5 | 0.
-| cosine_precision@10 | 0.
-| cosine_recall@1 | 0.
-| cosine_recall@3 | 0.
-| cosine_recall@5 | 0.
-| cosine_recall@10 | 0.
-| **cosine_ndcg@10** | **0.
-| cosine_mrr@10 | 0.
-| cosine_map@100 | 0.
+| cosine_accuracy@1 | 0.35 |
+| cosine_accuracy@3 | 0.52 |
+| cosine_accuracy@5 | 0.6 |
+| cosine_accuracy@10 | 0.71 |
+| cosine_precision@1 | 0.35 |
+| cosine_precision@3 | 0.18 |
+| cosine_precision@5 | 0.126 |
+| cosine_precision@10 | 0.074 |
+| cosine_recall@1 | 0.34 |
+| cosine_recall@3 | 0.51 |
+| cosine_recall@5 | 0.595 |
+| cosine_recall@10 | 0.695 |
+| **cosine_ndcg@10** | **0.5159** |
+| cosine_mrr@10 | 0.4627 |
+| cosine_map@100 | 0.4687 |
 
 <!--
 ## Bias, Risks and Limitations

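The two tables above report per-dataset retrieval metrics (NanoMSMARCO, NanoNQ) and their NanoBEIR mean. A sketch of how such numbers can be reproduced, assuming a sentence-transformers release that includes `NanoBEIREvaluator` and using a placeholder repository id:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

# Placeholder repository id; substitute the model this card belongs to.
model = SentenceTransformer("your-username/your-finetuned-model")

# NanoMSMARCO and NanoNQ are the two subsets reported in the tables above.
evaluator = NanoBEIREvaluator(dataset_names=["msmarco", "nq"])

# Returns a dict of per-dataset metrics plus NanoBEIR_mean aggregates,
# matching the metric names used in this card (cosine_ndcg@10, cosine_mrr@10, ...).
results = evaluator(model)
print(results)
```
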
@@ -502,9 +502,9 @@ You can finetune this model on your own dataset.
 - `eval_strategy`: steps
 - `per_device_train_batch_size`: 128
 - `per_device_eval_batch_size`: 128
-- `learning_rate`:
+- `learning_rate`: 8e-05
 - `weight_decay`: 0.005
-- `max_steps`:
+- `max_steps`: 500
 - `warmup_ratio`: 0.1
 - `fp16`: True
 - `dataloader_drop_last`: True

@@ -531,14 +531,14 @@ You can finetune this model on your own dataset.
 - `gradient_accumulation_steps`: 1
 - `eval_accumulation_steps`: None
 - `torch_empty_cache_steps`: None
-- `learning_rate`:
+- `learning_rate`: 8e-05
 - `weight_decay`: 0.005
 - `adam_beta1`: 0.9
 - `adam_beta2`: 0.999
 - `adam_epsilon`: 1e-08
 - `max_grad_norm`: 1.0
 - `num_train_epochs`: 3.0
-- `max_steps`:
+- `max_steps`: 500
 - `lr_scheduler_type`: linear
 - `lr_scheduler_kwargs`: {}
 - `warmup_ratio`: 0.1

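The two hunks above fill in the non-default training hyperparameters (`learning_rate: 8e-05`, `max_steps: 500`, batch size 128, warmup ratio 0.1, fp16). A sketch of a matching trainer setup, assuming the sentence-transformers v3 training API; the dataset shown is a placeholder, since the actual 90,000-pair training set is not named in this diff:

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")

# Placeholder (anchor, positive) pair dataset; swap in the real training pairs.
train_dataset = load_dataset("sentence-transformers/natural-questions", split="train")

args = SentenceTransformerTrainingArguments(
    output_dir="outputs",
    per_device_train_batch_size=128,
    learning_rate=8e-05,
    weight_decay=0.005,
    max_steps=500,
    warmup_ratio=0.1,
    fp16=True,                 # assumes a CUDA GPU
    dataloader_drop_last=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()
```
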
@@ -643,12 +643,13 @@ You can finetune this model on your own dataset.
 </details>
 
 ### Training Logs
-| Epoch
-
-| 0
-| 0.3556
-| 0.7112 | 500
+| Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
+|:----------:|:-------:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
+| 0 | 0 | - | 1.2073 | 0.5887 | 0.5786 | 0.5836 |
+| 0.3556 | 250 | 1.0774 | 0.8959 | 0.5097 | 0.5021 | 0.5059 |
+| **0.7112** | **500** | **1.0113** | **0.8709** | **0.5182** | **0.5135** | **0.5159** |
 
+* The bold row denotes the saved checkpoint.
 
 ### Framework Versions
 - Python: 3.10.18