radoslavralev committed on
Commit 4e21d51 · verified · 1 Parent(s): 93b85b5

Add new SentenceTransformer model

Files changed (1): README.md (+86 -86)
README.md CHANGED
@@ -7,7 +7,7 @@ tags:
7
  - generated_from_trainer
8
  - dataset_size:111470
9
  - loss:MultipleNegativesRankingLoss
10
- base_model: sentence-transformers/all-MiniLM-L6-v2
11
  widget:
12
  - source_sentence: why are some rocks radioactive
13
  sentences:
@@ -106,7 +106,7 @@ metrics:
106
  - cosine_mrr@10
107
  - cosine_map@100
108
  model-index:
109
- - name: SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
110
  results:
111
  - task:
112
  type: information-retrieval
@@ -116,49 +116,49 @@ model-index:
116
  type: NanoMSMARCO
117
  metrics:
118
  - type: cosine_accuracy@1
119
- value: 0.34
120
  name: Cosine Accuracy@1
121
  - type: cosine_accuracy@3
122
- value: 0.5
123
  name: Cosine Accuracy@3
124
  - type: cosine_accuracy@5
125
- value: 0.6
126
  name: Cosine Accuracy@5
127
  - type: cosine_accuracy@10
128
- value: 0.7
129
  name: Cosine Accuracy@10
130
  - type: cosine_precision@1
131
- value: 0.34
132
  name: Cosine Precision@1
133
  - type: cosine_precision@3
134
- value: 0.16666666666666669
135
  name: Cosine Precision@3
136
  - type: cosine_precision@5
137
- value: 0.12000000000000002
138
  name: Cosine Precision@5
139
  - type: cosine_precision@10
140
- value: 0.07
141
  name: Cosine Precision@10
142
  - type: cosine_recall@1
143
- value: 0.34
144
  name: Cosine Recall@1
145
  - type: cosine_recall@3
146
- value: 0.5
147
  name: Cosine Recall@3
148
  - type: cosine_recall@5
149
- value: 0.6
150
  name: Cosine Recall@5
151
  - type: cosine_recall@10
152
- value: 0.7
153
  name: Cosine Recall@10
154
  - type: cosine_ndcg@10
155
- value: 0.5086414353894502
156
  name: Cosine Ndcg@10
157
  - type: cosine_mrr@10
158
- value: 0.4489126984126983
159
  name: Cosine Mrr@10
160
  - type: cosine_map@100
161
- value: 0.4629036994571799
162
  name: Cosine Map@100
163
  - task:
164
  type: information-retrieval
@@ -168,49 +168,49 @@ model-index:
168
  type: NanoNQ
169
  metrics:
170
  - type: cosine_accuracy@1
171
- value: 0.4
172
  name: Cosine Accuracy@1
173
  - type: cosine_accuracy@3
174
- value: 0.54
175
  name: Cosine Accuracy@3
176
  - type: cosine_accuracy@5
177
- value: 0.56
178
  name: Cosine Accuracy@5
179
  - type: cosine_accuracy@10
180
- value: 0.6
181
  name: Cosine Accuracy@10
182
  - type: cosine_precision@1
183
- value: 0.4
184
  name: Cosine Precision@1
185
  - type: cosine_precision@3
186
- value: 0.19333333333333333
187
  name: Cosine Precision@3
188
  - type: cosine_precision@5
189
- value: 0.12400000000000003
190
  name: Cosine Precision@5
191
  - type: cosine_precision@10
192
- value: 0.066
193
  name: Cosine Precision@10
194
  - type: cosine_recall@1
195
- value: 0.36
196
  name: Cosine Recall@1
197
  - type: cosine_recall@3
198
- value: 0.52
199
  name: Cosine Recall@3
200
  - type: cosine_recall@5
201
- value: 0.55
202
  name: Cosine Recall@5
203
  - type: cosine_recall@10
204
- value: 0.59
205
  name: Cosine Recall@10
206
  - type: cosine_ndcg@10
207
- value: 0.49563614338260625
208
  name: Cosine Ndcg@10
209
  - type: cosine_mrr@10
210
- value: 0.4761111111111111
211
  name: Cosine Mrr@10
212
  - type: cosine_map@100
213
- value: 0.47880687529822674
214
  name: Cosine Map@100
215
  - task:
216
  type: nano-beir
@@ -223,58 +223,58 @@ model-index:
223
  value: 0.37
224
  name: Cosine Accuracy@1
225
  - type: cosine_accuracy@3
226
- value: 0.52
227
  name: Cosine Accuracy@3
228
  - type: cosine_accuracy@5
229
- value: 0.5800000000000001
230
  name: Cosine Accuracy@5
231
  - type: cosine_accuracy@10
232
- value: 0.6499999999999999
233
  name: Cosine Accuracy@10
234
  - type: cosine_precision@1
235
  value: 0.37
236
  name: Cosine Precision@1
237
  - type: cosine_precision@3
238
- value: 0.18
239
  name: Cosine Precision@3
240
  - type: cosine_precision@5
241
- value: 0.12200000000000003
242
  name: Cosine Precision@5
243
  - type: cosine_precision@10
244
- value: 0.068
245
  name: Cosine Precision@10
246
  - type: cosine_recall@1
247
- value: 0.35
248
  name: Cosine Recall@1
249
  - type: cosine_recall@3
250
- value: 0.51
251
  name: Cosine Recall@3
252
  - type: cosine_recall@5
253
- value: 0.575
254
  name: Cosine Recall@5
255
  - type: cosine_recall@10
256
- value: 0.645
257
  name: Cosine Recall@10
258
  - type: cosine_ndcg@10
259
- value: 0.5021387893860282
260
  name: Cosine Ndcg@10
261
  - type: cosine_mrr@10
262
- value: 0.46251190476190474
263
  name: Cosine Mrr@10
264
  - type: cosine_map@100
265
- value: 0.4708552873777033
266
  name: Cosine Map@100
267
  ---
268
 
269
- # SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
270
 
271
- This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
272
 
273
  ## Model Details
274
 
275
  ### Model Description
276
  - **Model Type:** Sentence Transformer
277
- - **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision c9745ed1d9f207416be6d2e6f8de32d1f16199bf -->
278
  - **Maximum Sequence Length:** 128 tokens
279
  - **Output Dimensionality:** 384 dimensions
280
  - **Similarity Function:** Cosine Similarity
@@ -327,9 +327,9 @@ print(embeddings.shape)
327
  # Get the similarity scores for the embeddings
328
  similarities = model.similarity(embeddings, embeddings)
329
  print(similarities)
330
- # tensor([[1.0000, 1.0000, 0.9770],
331
- # [1.0000, 1.0000, 0.9770],
332
- # [0.9770, 0.9770, 1.0000]])
333
  ```
334
 
335
  <!--
@@ -367,21 +367,21 @@ You can finetune this model on your own dataset.
367
 
368
  | Metric | NanoMSMARCO | NanoNQ |
369
  |:--------------------|:------------|:-----------|
370
- | cosine_accuracy@1 | 0.34 | 0.4 |
371
- | cosine_accuracy@3 | 0.5 | 0.54 |
372
- | cosine_accuracy@5 | 0.6 | 0.56 |
373
- | cosine_accuracy@10 | 0.7 | 0.6 |
374
- | cosine_precision@1 | 0.34 | 0.4 |
375
- | cosine_precision@3 | 0.1667 | 0.1933 |
376
- | cosine_precision@5 | 0.12 | 0.124 |
377
- | cosine_precision@10 | 0.07 | 0.066 |
378
- | cosine_recall@1 | 0.34 | 0.36 |
379
- | cosine_recall@3 | 0.5 | 0.52 |
380
- | cosine_recall@5 | 0.6 | 0.55 |
381
- | cosine_recall@10 | 0.7 | 0.59 |
382
- | **cosine_ndcg@10** | **0.5086** | **0.4956** |
383
- | cosine_mrr@10 | 0.4489 | 0.4761 |
384
- | cosine_map@100 | 0.4629 | 0.4788 |
385
 
386
  #### Nano BEIR
387
 
@@ -400,20 +400,20 @@ You can finetune this model on your own dataset.
400
  | Metric | Value |
401
  |:--------------------|:-----------|
402
  | cosine_accuracy@1 | 0.37 |
403
- | cosine_accuracy@3 | 0.52 |
404
- | cosine_accuracy@5 | 0.58 |
405
- | cosine_accuracy@10 | 0.65 |
406
  | cosine_precision@1 | 0.37 |
407
- | cosine_precision@3 | 0.18 |
408
- | cosine_precision@5 | 0.122 |
409
- | cosine_precision@10 | 0.068 |
410
- | cosine_recall@1 | 0.35 |
411
- | cosine_recall@3 | 0.51 |
412
- | cosine_recall@5 | 0.575 |
413
- | cosine_recall@10 | 0.645 |
414
- | **cosine_ndcg@10** | **0.5021** |
415
- | cosine_mrr@10 | 0.4625 |
416
- | cosine_map@100 | 0.4709 |
417
 
418
  <!--
419
  ## Bias, Risks and Limitations
@@ -487,9 +487,9 @@ You can finetune this model on your own dataset.
487
  - `eval_strategy`: steps
488
  - `per_device_train_batch_size`: 128
489
  - `per_device_eval_batch_size`: 128
490
- - `learning_rate`: 0.0001
491
  - `weight_decay`: 0.005
492
- - `max_steps`: 562
493
  - `warmup_ratio`: 0.1
494
  - `fp16`: True
495
  - `dataloader_drop_last`: True
@@ -516,14 +516,14 @@ You can finetune this model on your own dataset.
516
  - `gradient_accumulation_steps`: 1
517
  - `eval_accumulation_steps`: None
518
  - `torch_empty_cache_steps`: None
519
- - `learning_rate`: 0.0001
520
  - `weight_decay`: 0.005
521
  - `adam_beta1`: 0.9
522
  - `adam_beta2`: 0.999
523
  - `adam_epsilon`: 1e-08
524
  - `max_grad_norm`: 1.0
525
  - `num_train_epochs`: 3.0
526
- - `max_steps`: 562
527
  - `lr_scheduler_type`: linear
528
  - `lr_scheduler_kwargs`: {}
529
  - `warmup_ratio`: 0.1
@@ -630,9 +630,9 @@ You can finetune this model on your own dataset.
630
  ### Training Logs
631
  | Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
632
  |:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
633
- | 0 | 0 | - | 1.1445 | 0.5540 | 0.5931 | 0.5735 |
634
- | 0.2874 | 250 | 1.0487 | 0.8460 | 0.5005 | 0.5178 | 0.5092 |
635
- | 0.5747 | 500 | 0.9685 | 0.8172 | 0.5086 | 0.4956 | 0.5021 |
636
 
637
 
638
  ### Framework Versions
 
7
  - generated_from_trainer
8
  - dataset_size:111470
9
  - loss:MultipleNegativesRankingLoss
10
+ base_model: sentence-transformers/all-MiniLM-L12-v2
11
  widget:
12
  - source_sentence: why are some rocks radioactive
13
  sentences:
 
106
  - cosine_mrr@10
107
  - cosine_map@100
108
  model-index:
109
+ - name: SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2
110
  results:
111
  - task:
112
  type: information-retrieval
 
116
  type: NanoMSMARCO
117
  metrics:
118
  - type: cosine_accuracy@1
119
+ value: 0.36
120
  name: Cosine Accuracy@1
121
  - type: cosine_accuracy@3
122
+ value: 0.54
123
  name: Cosine Accuracy@3
124
  - type: cosine_accuracy@5
125
+ value: 0.62
126
  name: Cosine Accuracy@5
127
  - type: cosine_accuracy@10
128
+ value: 0.74
129
  name: Cosine Accuracy@10
130
  - type: cosine_precision@1
131
+ value: 0.36
132
  name: Cosine Precision@1
133
  - type: cosine_precision@3
134
+ value: 0.18
135
  name: Cosine Precision@3
136
  - type: cosine_precision@5
137
+ value: 0.124
138
  name: Cosine Precision@5
139
  - type: cosine_precision@10
140
+ value: 0.07400000000000001
141
  name: Cosine Precision@10
142
  - type: cosine_recall@1
143
+ value: 0.36
144
  name: Cosine Recall@1
145
  - type: cosine_recall@3
146
+ value: 0.54
147
  name: Cosine Recall@3
148
  - type: cosine_recall@5
149
+ value: 0.62
150
  name: Cosine Recall@5
151
  - type: cosine_recall@10
152
+ value: 0.74
153
  name: Cosine Recall@10
154
  - type: cosine_ndcg@10
155
+ value: 0.5388909110035328
156
  name: Cosine Ndcg@10
157
  - type: cosine_mrr@10
158
+ value: 0.4758571428571428
159
  name: Cosine Mrr@10
160
  - type: cosine_map@100
161
+ value: 0.4857996035129028
162
  name: Cosine Map@100
163
  - task:
164
  type: information-retrieval
 
168
  type: NanoNQ
169
  metrics:
170
  - type: cosine_accuracy@1
171
+ value: 0.38
172
  name: Cosine Accuracy@1
173
  - type: cosine_accuracy@3
174
+ value: 0.56
175
  name: Cosine Accuracy@3
176
  - type: cosine_accuracy@5
177
+ value: 0.6
178
  name: Cosine Accuracy@5
179
  - type: cosine_accuracy@10
180
+ value: 0.66
181
  name: Cosine Accuracy@10
182
  - type: cosine_precision@1
183
+ value: 0.38
184
  name: Cosine Precision@1
185
  - type: cosine_precision@3
186
+ value: 0.2
187
  name: Cosine Precision@3
188
  - type: cosine_precision@5
189
+ value: 0.128
190
  name: Cosine Precision@5
191
  - type: cosine_precision@10
192
+ value: 0.07
193
  name: Cosine Precision@10
194
  - type: cosine_recall@1
195
+ value: 0.35
196
  name: Cosine Recall@1
197
  - type: cosine_recall@3
198
+ value: 0.54
199
  name: Cosine Recall@3
200
  - type: cosine_recall@5
201
+ value: 0.58
202
  name: Cosine Recall@5
203
  - type: cosine_recall@10
204
+ value: 0.64
205
  name: Cosine Recall@10
206
  - type: cosine_ndcg@10
207
+ value: 0.5086947283606115
208
  name: Cosine Ndcg@10
209
  - type: cosine_mrr@10
210
+ value: 0.4757142857142857
211
  name: Cosine Mrr@10
212
  - type: cosine_map@100
213
+ value: 0.4762724913936289
214
  name: Cosine Map@100
215
  - task:
216
  type: nano-beir
 
223
  value: 0.37
224
  name: Cosine Accuracy@1
225
  - type: cosine_accuracy@3
226
+ value: 0.55
227
  name: Cosine Accuracy@3
228
  - type: cosine_accuracy@5
229
+ value: 0.61
230
  name: Cosine Accuracy@5
231
  - type: cosine_accuracy@10
232
+ value: 0.7
233
  name: Cosine Accuracy@10
234
  - type: cosine_precision@1
235
  value: 0.37
236
  name: Cosine Precision@1
237
  - type: cosine_precision@3
238
+ value: 0.19
239
  name: Cosine Precision@3
240
  - type: cosine_precision@5
241
+ value: 0.126
242
  name: Cosine Precision@5
243
  - type: cosine_precision@10
244
+ value: 0.07200000000000001
245
  name: Cosine Precision@10
246
  - type: cosine_recall@1
247
+ value: 0.355
248
  name: Cosine Recall@1
249
  - type: cosine_recall@3
250
+ value: 0.54
251
  name: Cosine Recall@3
252
  - type: cosine_recall@5
253
+ value: 0.6
254
  name: Cosine Recall@5
255
  - type: cosine_recall@10
256
+ value: 0.69
257
  name: Cosine Recall@10
258
  - type: cosine_ndcg@10
259
+ value: 0.5237928196820721
260
  name: Cosine Ndcg@10
261
  - type: cosine_mrr@10
262
+ value: 0.47578571428571426
263
  name: Cosine Mrr@10
264
  - type: cosine_map@100
265
+ value: 0.4810360474532659
266
  name: Cosine Map@100
267
  ---
268
 
269
+ # SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2
270
 
271
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
272
 
273
  ## Model Details
274
 
275
  ### Model Description
276
  - **Model Type:** Sentence Transformer
277
+ - **Base model:** [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) <!-- at revision 936af83a2ecce5fe87a09109ff5cbcefe073173a -->
278
  - **Maximum Sequence Length:** 128 tokens
279
  - **Output Dimensionality:** 384 dimensions
280
  - **Similarity Function:** Cosine Similarity
 
327
  # Get the similarity scores for the embeddings
328
  similarities = model.similarity(embeddings, embeddings)
329
  print(similarities)
330
+ # tensor([[1.0001, 1.0001, 0.9818],
331
+ # [1.0001, 1.0001, 0.9818],
332
+ # [0.9818, 0.9818, 0.9999]])
333
  ```
334
 
335
  <!--
 
367
 
368
  | Metric | NanoMSMARCO | NanoNQ |
369
  |:--------------------|:------------|:-----------|
370
+ | cosine_accuracy@1 | 0.36 | 0.38 |
371
+ | cosine_accuracy@3 | 0.54 | 0.56 |
372
+ | cosine_accuracy@5 | 0.62 | 0.6 |
373
+ | cosine_accuracy@10 | 0.74 | 0.66 |
374
+ | cosine_precision@1 | 0.36 | 0.38 |
375
+ | cosine_precision@3 | 0.18 | 0.2 |
376
+ | cosine_precision@5 | 0.124 | 0.128 |
377
+ | cosine_precision@10 | 0.074 | 0.07 |
378
+ | cosine_recall@1 | 0.36 | 0.35 |
379
+ | cosine_recall@3 | 0.54 | 0.54 |
380
+ | cosine_recall@5 | 0.62 | 0.58 |
381
+ | cosine_recall@10 | 0.74 | 0.64 |
382
+ | **cosine_ndcg@10** | **0.5389** | **0.5087** |
383
+ | cosine_mrr@10 | 0.4759 | 0.4757 |
384
+ | cosine_map@100 | 0.4858 | 0.4763 |
385
 
386
  #### Nano BEIR
387
 
 
400
  | Metric | Value |
401
  |:--------------------|:-----------|
402
  | cosine_accuracy@1 | 0.37 |
403
+ | cosine_accuracy@3 | 0.55 |
404
+ | cosine_accuracy@5 | 0.61 |
405
+ | cosine_accuracy@10 | 0.7 |
406
  | cosine_precision@1 | 0.37 |
407
+ | cosine_precision@3 | 0.19 |
408
+ | cosine_precision@5 | 0.126 |
409
+ | cosine_precision@10 | 0.072 |
410
+ | cosine_recall@1 | 0.355 |
411
+ | cosine_recall@3 | 0.54 |
412
+ | cosine_recall@5 | 0.6 |
413
+ | cosine_recall@10 | 0.69 |
414
+ | **cosine_ndcg@10** | **0.5238** |
415
+ | cosine_mrr@10 | 0.4758 |
416
+ | cosine_map@100 | 0.481 |
417
 
418
  <!--
419
  ## Bias, Risks and Limitations
 
487
  - `eval_strategy`: steps
488
  - `per_device_train_batch_size`: 128
489
  - `per_device_eval_batch_size`: 128
490
+ - `learning_rate`: 8e-05
491
  - `weight_decay`: 0.005
492
+ - `max_steps`: 500
493
  - `warmup_ratio`: 0.1
494
  - `fp16`: True
495
  - `dataloader_drop_last`: True
 
516
  - `gradient_accumulation_steps`: 1
517
  - `eval_accumulation_steps`: None
518
  - `torch_empty_cache_steps`: None
519
+ - `learning_rate`: 8e-05
520
  - `weight_decay`: 0.005
521
  - `adam_beta1`: 0.9
522
  - `adam_beta2`: 0.999
523
  - `adam_epsilon`: 1e-08
524
  - `max_grad_norm`: 1.0
525
  - `num_train_epochs`: 3.0
526
+ - `max_steps`: 500
527
  - `lr_scheduler_type`: linear
528
  - `lr_scheduler_kwargs`: {}
529
  - `warmup_ratio`: 0.1
 
630
  ### Training Logs
631
  | Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
632
  |:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
633
+ | 0 | 0 | - | 1.1142 | 0.5887 | 0.5786 | 0.5836 |
634
+ | 0.2874 | 250 | 0.9921 | 0.8098 | 0.5442 | 0.5263 | 0.5353 |
635
+ | 0.5747 | 500 | 0.915 | 0.7854 | 0.5389 | 0.5087 | 0.5238 |
636
 
637
 
638
  ### Framework Versions
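
For context, below is a minimal, self-contained sketch of the README usage example whose expected similarity output this commit updates. The repository id is a placeholder (the commit page does not show the repo path), and the second and third sentences are illustrative inputs rather than text taken from the model card; only the first sentence comes from the widget example in the diff.

```python
from sentence_transformers import SentenceTransformer

# Placeholder repository id -- replace with the actual repo this commit belongs to.
model = SentenceTransformer("radoslavralev/<model-repo>")

sentences = [
    "why are some rocks radioactive",   # widget example from the model card
    "Why are some rocks radioactive?",  # illustrative paraphrase
    "How does photosynthesis work?",    # illustrative unrelated query
]

# Encode into 384-dimensional embeddings (all-MiniLM-L12-v2 backbone).
embeddings = model.encode(sentences)
print(embeddings.shape)  # (3, 384)

# Cosine similarity matrix, as printed in the README snippet touched by this commit.
similarities = model.similarity(embeddings, embeddings)
print(similarities)
```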