radoslavralev committed (verified)
Commit 9e26ee3 · 1 Parent(s): d077b9c

Add new SentenceTransformer model

Files changed (1): README.md (+91 -94)

README.md CHANGED
@@ -7,7 +7,7 @@ tags:
  - generated_from_trainer
  - dataset_size:111470
  - loss:MultipleNegativesRankingLoss
- base_model: sentence-transformers/all-MiniLM-L6-v2
+ base_model: sentence-transformers/all-MiniLM-L12-v2
  widget:
  - source_sentence: why are some rocks radioactive
  sentences:
@@ -106,7 +106,7 @@ metrics:
  - cosine_mrr@10
  - cosine_map@100
  model-index:
- - name: SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
+ - name: SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2
  results:
  - task:
  type: information-retrieval
@@ -116,49 +116,49 @@ model-index:
  type: NanoMSMARCO
  metrics:
  - type: cosine_accuracy@1
- value: 0.32
+ value: 0.34
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
- value: 0.5
+ value: 0.54
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
- value: 0.56
+ value: 0.64
  name: Cosine Accuracy@5
  - type: cosine_accuracy@10
  value: 0.72
  name: Cosine Accuracy@10
  - type: cosine_precision@1
- value: 0.32
+ value: 0.34
  name: Cosine Precision@1
  - type: cosine_precision@3
- value: 0.16666666666666669
+ value: 0.18
  name: Cosine Precision@3
  - type: cosine_precision@5
- value: 0.11200000000000002
+ value: 0.12800000000000003
  name: Cosine Precision@5
  - type: cosine_precision@10
  value: 0.07200000000000001
  name: Cosine Precision@10
  - type: cosine_recall@1
- value: 0.32
+ value: 0.34
  name: Cosine Recall@1
  - type: cosine_recall@3
- value: 0.5
+ value: 0.54
  name: Cosine Recall@3
  - type: cosine_recall@5
- value: 0.56
+ value: 0.64
  name: Cosine Recall@5
  - type: cosine_recall@10
  value: 0.72
  name: Cosine Recall@10
  - type: cosine_ndcg@10
- value: 0.5076844979819899
+ value: 0.5240037339745247
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
- value: 0.4414682539682539
+ value: 0.4620555555555555
  name: Cosine Mrr@10
  - type: cosine_map@100
- value: 0.4541878759718346
+ value: 0.47561895719519554
  name: Cosine Map@100
  - task:
  type: information-retrieval
@@ -168,49 +168,49 @@ model-index:
  type: NanoNQ
  metrics:
  - type: cosine_accuracy@1
- value: 0.32
+ value: 0.42
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
- value: 0.48
+ value: 0.6
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
- value: 0.52
+ value: 0.6
  name: Cosine Accuracy@5
  - type: cosine_accuracy@10
- value: 0.54
+ value: 0.68
  name: Cosine Accuracy@10
  - type: cosine_precision@1
- value: 0.32
+ value: 0.42
  name: Cosine Precision@1
  - type: cosine_precision@3
- value: 0.16666666666666663
+ value: 0.21333333333333332
  name: Cosine Precision@3
  - type: cosine_precision@5
- value: 0.10800000000000001
+ value: 0.128
  name: Cosine Precision@5
  - type: cosine_precision@10
- value: 0.05800000000000001
+ value: 0.07200000000000001
  name: Cosine Precision@10
  - type: cosine_recall@1
- value: 0.29
+ value: 0.38
  name: Cosine Recall@1
  - type: cosine_recall@3
- value: 0.45
+ value: 0.58
  name: Cosine Recall@3
  - type: cosine_recall@5
- value: 0.49
+ value: 0.58
  name: Cosine Recall@5
  - type: cosine_recall@10
- value: 0.52
+ value: 0.65
  name: Cosine Recall@10
  - type: cosine_ndcg@10
- value: 0.4230776979752646
+ value: 0.5323118123166091
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
- value: 0.40852380952380957
+ value: 0.5109126984126984
  name: Cosine Mrr@10
  - type: cosine_map@100
- value: 0.4024677771121777
+ value: 0.49965941289619986
  name: Cosine Map@100
  - task:
  type: nano-beir
@@ -220,61 +220,61 @@ model-index:
  type: NanoBEIR_mean
  metrics:
  - type: cosine_accuracy@1
- value: 0.32
+ value: 0.38
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
- value: 0.49
+ value: 0.5700000000000001
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
- value: 0.54
+ value: 0.62
  name: Cosine Accuracy@5
  - type: cosine_accuracy@10
- value: 0.63
+ value: 0.7
  name: Cosine Accuracy@10
  - type: cosine_precision@1
- value: 0.32
+ value: 0.38
  name: Cosine Precision@1
  - type: cosine_precision@3
- value: 0.16666666666666666
+ value: 0.19666666666666666
  name: Cosine Precision@3
  - type: cosine_precision@5
- value: 0.11000000000000001
+ value: 0.128
  name: Cosine Precision@5
  - type: cosine_precision@10
- value: 0.065
+ value: 0.07200000000000001
  name: Cosine Precision@10
  - type: cosine_recall@1
- value: 0.305
+ value: 0.36
  name: Cosine Recall@1
  - type: cosine_recall@3
- value: 0.475
+ value: 0.56
  name: Cosine Recall@3
  - type: cosine_recall@5
- value: 0.525
+ value: 0.61
  name: Cosine Recall@5
  - type: cosine_recall@10
- value: 0.62
+ value: 0.685
  name: Cosine Recall@10
  - type: cosine_ndcg@10
- value: 0.46538109797862726
+ value: 0.5281577731455669
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
- value: 0.42499603174603173
+ value: 0.4864841269841269
  name: Cosine Mrr@10
  - type: cosine_map@100
- value: 0.42832782654200613
+ value: 0.4876391850456977
  name: Cosine Map@100
  ---

- # SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
+ # SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2

- This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

  ## Model Details

  ### Model Description
  - **Model Type:** Sentence Transformer
- - **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision c9745ed1d9f207416be6d2e6f8de32d1f16199bf -->
+ - **Base model:** [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) <!-- at revision 936af83a2ecce5fe87a09109ff5cbcefe073173a -->
  - **Maximum Sequence Length:** 128 tokens
  - **Output Dimensionality:** 384 dimensions
  - **Similarity Function:** Cosine Similarity
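The description updated in this hunk is the card's operative claim: text in, a 384-dimensional vector out, compared by cosine similarity. As a concrete illustration of the "semantic search" use it lists, here is a minimal sketch; the model id is a placeholder for this repository, and the corpus sentences are invented:

```python
# Minimal semantic-search sketch. Assumptions: "your-username/your-model" is a
# placeholder for this repository's id, and the corpus is toy data; only the
# public sentence-transformers API (encode, util.semantic_search) is used.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("your-username/your-model")  # placeholder id

corpus = [
    "Some rocks are radioactive because they contain uranium or thorium.",
    "The capital of France is Paris.",
    "MiniLM is a distilled transformer encoder.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)  # shape (3, 384)

# The query string is the widget example from the card's YAML header.
query_embedding = model.encode("why are some rocks radioactive", convert_to_tensor=True)
for hit in util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]:
    print(corpus[hit["corpus_id"]], hit["score"])
```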
@@ -327,9 +327,9 @@ print(embeddings.shape)
  # Get the similarity scores for the embeddings
  similarities = model.similarity(embeddings, embeddings)
  print(similarities)
- # tensor([[1.0000, 1.0000, 0.9805],
- #         [1.0000, 1.0000, 0.9805],
- #         [0.9805, 0.9805, 1.0000]])
+ # tensor([[1.0000, 1.0000, 0.9589],
+ #         [1.0000, 1.0000, 0.9589],
+ #         [0.9589, 0.9589, 1.0001]])
  ```

  <!--
@@ -367,21 +367,21 @@ You can finetune this model on your own dataset.

  | Metric              | NanoMSMARCO | NanoNQ     |
  |:--------------------|:------------|:-----------|
- | cosine_accuracy@1   | 0.32        | 0.32       |
- | cosine_accuracy@3   | 0.5         | 0.48       |
- | cosine_accuracy@5   | 0.56        | 0.52       |
- | cosine_accuracy@10  | 0.72        | 0.54       |
- | cosine_precision@1  | 0.32        | 0.32       |
- | cosine_precision@3  | 0.1667      | 0.1667     |
- | cosine_precision@5  | 0.112       | 0.108      |
- | cosine_precision@10 | 0.072       | 0.058      |
- | cosine_recall@1     | 0.32        | 0.29       |
- | cosine_recall@3     | 0.5         | 0.45       |
- | cosine_recall@5     | 0.56        | 0.49       |
- | cosine_recall@10    | 0.72        | 0.52       |
- | **cosine_ndcg@10**  | **0.5077**  | **0.4231** |
- | cosine_mrr@10       | 0.4415      | 0.4085     |
- | cosine_map@100      | 0.4542      | 0.4025     |
+ | cosine_accuracy@1   | 0.34        | 0.42       |
+ | cosine_accuracy@3   | 0.54        | 0.6        |
+ | cosine_accuracy@5   | 0.64        | 0.6        |
+ | cosine_accuracy@10  | 0.72        | 0.68       |
+ | cosine_precision@1  | 0.34        | 0.42       |
+ | cosine_precision@3  | 0.18        | 0.2133     |
+ | cosine_precision@5  | 0.128       | 0.128      |
+ | cosine_precision@10 | 0.072       | 0.072      |
+ | cosine_recall@1     | 0.34        | 0.38       |
+ | cosine_recall@3     | 0.54        | 0.58       |
+ | cosine_recall@5     | 0.64        | 0.58       |
+ | cosine_recall@10    | 0.72        | 0.65       |
+ | **cosine_ndcg@10**  | **0.524**   | **0.5323** |
+ | cosine_mrr@10       | 0.4621      | 0.5109     |
+ | cosine_map@100      | 0.4756      | 0.4997     |

  #### Nano BEIR

@@ -399,21 +399,21 @@ You can finetune this model on your own dataset.

  | Metric              | Value      |
  |:--------------------|:-----------|
- | cosine_accuracy@1   | 0.32       |
- | cosine_accuracy@3   | 0.49       |
- | cosine_accuracy@5   | 0.54       |
- | cosine_accuracy@10  | 0.63       |
- | cosine_precision@1  | 0.32       |
- | cosine_precision@3  | 0.1667     |
- | cosine_precision@5  | 0.11       |
- | cosine_precision@10 | 0.065      |
- | cosine_recall@1     | 0.305      |
- | cosine_recall@3     | 0.475      |
- | cosine_recall@5     | 0.525      |
- | cosine_recall@10    | 0.62       |
- | **cosine_ndcg@10**  | **0.4654** |
- | cosine_mrr@10       | 0.425      |
- | cosine_map@100      | 0.4283     |
+ | cosine_accuracy@1   | 0.38       |
+ | cosine_accuracy@3   | 0.57       |
+ | cosine_accuracy@5   | 0.62       |
+ | cosine_accuracy@10  | 0.7        |
+ | cosine_precision@1  | 0.38       |
+ | cosine_precision@3  | 0.1967     |
+ | cosine_precision@5  | 0.128      |
+ | cosine_precision@10 | 0.072      |
+ | cosine_recall@1     | 0.36       |
+ | cosine_recall@3     | 0.56       |
+ | cosine_recall@5     | 0.61       |
+ | cosine_recall@10    | 0.685      |
+ | **cosine_ndcg@10**  | **0.5282** |
+ | cosine_mrr@10       | 0.4865     |
+ | cosine_map@100      | 0.4876     |

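For orientation when reading both tables: the bolded headline metric is standard NDCG@10 over binary relevance labels, with $\mathrm{rel}_i \in \{0, 1\}$ the relevance of the document at rank $i$:

$$
\mathrm{DCG@10} = \sum_{i=1}^{10} \frac{\mathrm{rel}_i}{\log_2(i + 1)}, \qquad
\mathrm{NDCG@10} = \frac{\mathrm{DCG@10}}{\mathrm{IDCG@10}},
$$

and MRR@10 is the mean over queries of the reciprocal rank of the first relevant document (0 if none is retrieved in the top 10). The Nano BEIR rows are the plain average of the two per-dataset tables, e.g. (0.5240 + 0.5323) / 2 ≈ 0.5282 for NDCG@10. A sketch of how such numbers are typically produced, assuming the stock sentence-transformers evaluator; the toy queries, corpus, and qrels here are invented stand-ins for the NanoMSMARCO/NanoNQ data:

```python
# Hedged sketch: InformationRetrievalEvaluator reports accuracy@k, precision@k,
# recall@k, MRR@10, NDCG@10 and MAP@100, matching the tables above. The tiny
# queries/corpus/qrels are invented stand-ins for the NanoBEIR datasets.
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("your-username/your-model")  # placeholder id

queries = {"q1": "why are some rocks radioactive"}
corpus = {
    "d1": "Rocks are radioactive when they contain unstable isotopes such as uranium.",
    "d2": "Paris is the capital of France.",
}
relevant_docs = {"q1": {"d1"}}  # qrels: relevant corpus ids per query

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="toy")
print(evaluator(model))  # keys like "toy_cosine_ndcg@10", "toy_cosine_mrr@10"
```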
  <!--
  ## Bias, Risks and Limitations
@@ -487,9 +487,9 @@ You can finetune this model on your own dataset.
  - `eval_strategy`: steps
  - `per_device_train_batch_size`: 128
  - `per_device_eval_batch_size`: 128
- - `learning_rate`: 0.0001
+ - `learning_rate`: 8e-05
  - `weight_decay`: 0.005
- - `max_steps`: 2250
+ - `max_steps`: 1687
  - `warmup_ratio`: 0.1
  - `fp16`: True
  - `dataloader_drop_last`: True
@@ -516,14 +516,14 @@ You can finetune this model on your own dataset.
  - `gradient_accumulation_steps`: 1
  - `eval_accumulation_steps`: None
  - `torch_empty_cache_steps`: None
- - `learning_rate`: 0.0001
+ - `learning_rate`: 8e-05
  - `weight_decay`: 0.005
  - `adam_beta1`: 0.9
  - `adam_beta2`: 0.999
  - `adam_epsilon`: 1e-08
  - `max_grad_norm`: 1.0
  - `num_train_epochs`: 3.0
- - `max_steps`: 2250
+ - `max_steps`: 1687
  - `lr_scheduler_type`: linear
  - `lr_scheduler_kwargs`: {}
  - `warmup_ratio`: 0.1
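The two hunks above carry the substantive training change in this commit: the learning rate drops from 0.0001 to 8e-05 and the step budget from 2250 to 1687. A minimal sketch of a run matching the new values, assuming the sentence-transformers v3 trainer API; the output directory and the one-row dataset are placeholders, since the actual 111,470-pair training set is not named in this commit:

```python
# Hedged sketch of a training run matching the card's new hyperparameters.
# Placeholders: output_dir and train_dataset (the real dataset has 111,470
# anchor/positive pairs); everything else mirrors the arguments listed above.
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")
train_dataset = Dataset.from_dict({  # stand-in for the real pairs
    "anchor": ["why are some rocks radioactive"],
    "positive": ["Rocks are radioactive when they contain uranium or thorium."],
})

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L12-v2-finetuned",  # placeholder
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=8e-5,
    weight_decay=0.005,
    max_steps=1687,  # takes precedence over num_train_epochs=3.0
    warmup_ratio=0.1,
    fp16=True,
    dataloader_drop_last=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),  # in-batch negatives, per the card's tags
)
trainer.train()
```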
@@ -630,16 +630,13 @@ You can finetune this model on your own dataset.
  ### Training Logs
  | Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
  |:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
- | 0 | 0 | - | 1.1445 | 0.5540 | 0.5931 | 0.5735 |
- | 0.2874 | 250 | 1.0772 | 0.8647 | 0.4705 | 0.5045 | 0.4875 |
- | 0.5747 | 500 | 0.9853 | 0.8353 | 0.4884 | 0.4658 | 0.4771 |
- | 0.8621 | 750 | 0.9571 | 0.8153 | 0.5181 | 0.4371 | 0.4776 |
- | 1.1494 | 1000 | 0.8923 | 0.8060 | 0.4848 | 0.4274 | 0.4561 |
- | 1.4368 | 1250 | 0.8458 | 0.8020 | 0.5415 | 0.4466 | 0.4941 |
- | 1.7241 | 1500 | 0.8394 | 0.7929 | 0.4928 | 0.4569 | 0.4748 |
- | 2.0115 | 1750 | 0.8308 | 0.7917 | 0.5332 | 0.4285 | 0.4808 |
- | 2.2989 | 2000 | 0.7709 | 0.7921 | 0.4952 | 0.4088 | 0.4520 |
- | 2.5862 | 2250 | 0.7685 | 0.7914 | 0.5077 | 0.4231 | 0.4654 |
+ | 0 | 0 | - | 1.1142 | 0.5887 | 0.5786 | 0.5836 |
+ | 0.2874 | 250 | 1.0151 | 0.8262 | 0.5547 | 0.5323 | 0.5435 |
+ | 0.5747 | 500 | 0.9318 | 0.8010 | 0.5094 | 0.5342 | 0.5218 |
+ | 0.8621 | 750 | 0.9087 | 0.7803 | 0.5242 | 0.5461 | 0.5352 |
+ | 1.1494 | 1000 | 0.8448 | 0.7753 | 0.5291 | 0.5333 | 0.5312 |
+ | 1.4368 | 1250 | 0.798 | 0.7693 | 0.5249 | 0.5104 | 0.5176 |
+ | 1.7241 | 1500 | 0.7883 | 0.7618 | 0.5240 | 0.5323 | 0.5282 |


  ### Framework Versions
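The Epoch column is consistent with the card's dataset and batch settings: with `dataset_size:111470`, batch size 128, and `dataloader_drop_last: True`, an epoch is ⌊111470 / 128⌋ = 870 steps, so 250 steps ≈ 250/870 ≈ 0.2874 epochs, as logged. The new `max_steps` of 1687 stops training just short of two epochs (1687 / 870 ≈ 1.94), which is why the last 250-step evaluation in the new log is step 1500, versus step 2250 (≈ 2.59 epochs) before.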
 