radoslavralev committed · verified · Commit 7d44926 · Parent(s): 2f57d0f

Add new SentenceTransformer model

Files changed (1):
  1. README.md +101 -108
README.md CHANGED
@@ -7,7 +7,7 @@ tags:
  - generated_from_trainer
  - dataset_size:111470
  - loss:MultipleNegativesRankingLoss
- base_model: sentence-transformers/all-MiniLM-L6-v2
+ base_model: sentence-transformers/all-MiniLM-L12-v2
  widget:
  - source_sentence: why are some rocks radioactive
  sentences:
@@ -106,7 +106,7 @@ metrics:
  - cosine_mrr@10
  - cosine_map@100
  model-index:
- - name: SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
+ - name: SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2
  results:
  - task:
  type: information-retrieval
@@ -119,46 +119,46 @@ model-index:
  value: 0.32
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
- value: 0.5
+ value: 0.48
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
- value: 0.56
+ value: 0.58
  name: Cosine Accuracy@5
  - type: cosine_accuracy@10
- value: 0.7
+ value: 0.66
  name: Cosine Accuracy@10
  - type: cosine_precision@1
  value: 0.32
  name: Cosine Precision@1
  - type: cosine_precision@3
- value: 0.16666666666666669
+ value: 0.15999999999999998
  name: Cosine Precision@3
  - type: cosine_precision@5
- value: 0.11200000000000002
+ value: 0.11600000000000002
  name: Cosine Precision@5
  - type: cosine_precision@10
- value: 0.07
+ value: 0.06600000000000002
  name: Cosine Precision@10
  - type: cosine_recall@1
  value: 0.32
  name: Cosine Recall@1
  - type: cosine_recall@3
- value: 0.5
+ value: 0.48
  name: Cosine Recall@3
  - type: cosine_recall@5
- value: 0.56
+ value: 0.58
  name: Cosine Recall@5
  - type: cosine_recall@10
- value: 0.7
+ value: 0.66
  name: Cosine Recall@10
  - type: cosine_ndcg@10
- value: 0.4962486706422321
+ value: 0.48429243159695967
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
- value: 0.43346031746031743
+ value: 0.42899999999999994
  name: Cosine Mrr@10
  - type: cosine_map@100
- value: 0.44415856354878636
+ value: 0.4402189973816575
  name: Cosine Map@100
  - task:
  type: information-retrieval
@@ -168,49 +168,49 @@ model-index:
  type: NanoNQ
  metrics:
  - type: cosine_accuracy@1
- value: 0.16
+ value: 0.18
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
- value: 0.26
+ value: 0.32
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
- value: 0.32
+ value: 0.38
  name: Cosine Accuracy@5
  - type: cosine_accuracy@10
- value: 0.46
+ value: 0.44
  name: Cosine Accuracy@10
  - type: cosine_precision@1
- value: 0.16
+ value: 0.18
  name: Cosine Precision@1
  - type: cosine_precision@3
- value: 0.08666666666666666
+ value: 0.10666666666666666
  name: Cosine Precision@3
  - type: cosine_precision@5
- value: 0.068
+ value: 0.08
  name: Cosine Precision@5
  - type: cosine_precision@10
  value: 0.04800000000000001
  name: Cosine Precision@10
  - type: cosine_recall@1
- value: 0.15
+ value: 0.16
  name: Cosine Recall@1
  - type: cosine_recall@3
- value: 0.23
+ value: 0.27
  name: Cosine Recall@3
  - type: cosine_recall@5
- value: 0.3
+ value: 0.34
  name: Cosine Recall@5
  - type: cosine_recall@10
- value: 0.43
+ value: 0.41
  name: Cosine Recall@10
  - type: cosine_ndcg@10
- value: 0.27247558178705156
+ value: 0.28963575004380426
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
- value: 0.23207936507936502
+ value: 0.2683333333333333
  name: Cosine Mrr@10
  - type: cosine_map@100
- value: 0.234397839045408
+ value: 0.2588793322285594
  name: Cosine Map@100
  - task:
  type: nano-beir
@@ -220,61 +220,61 @@ model-index:
  type: NanoBEIR_mean
  metrics:
  - type: cosine_accuracy@1
- value: 0.24
+ value: 0.25
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
- value: 0.38
+ value: 0.4
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
- value: 0.44000000000000006
+ value: 0.48
  name: Cosine Accuracy@5
  - type: cosine_accuracy@10
- value: 0.58
+ value: 0.55
  name: Cosine Accuracy@10
  - type: cosine_precision@1
- value: 0.24
+ value: 0.25
  name: Cosine Precision@1
  - type: cosine_precision@3
- value: 0.12666666666666668
+ value: 0.1333333333333333
  name: Cosine Precision@3
  - type: cosine_precision@5
- value: 0.09000000000000001
+ value: 0.098
  name: Cosine Precision@5
  - type: cosine_precision@10
- value: 0.05900000000000001
+ value: 0.05700000000000001
  name: Cosine Precision@10
  - type: cosine_recall@1
- value: 0.235
+ value: 0.24
  name: Cosine Recall@1
  - type: cosine_recall@3
- value: 0.365
+ value: 0.375
  name: Cosine Recall@3
  - type: cosine_recall@5
- value: 0.43000000000000005
+ value: 0.45999999999999996
  name: Cosine Recall@5
  - type: cosine_recall@10
- value: 0.565
+ value: 0.535
  name: Cosine Recall@10
  - type: cosine_ndcg@10
- value: 0.38436212621464183
+ value: 0.38696409082038197
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
- value: 0.3327698412698412
+ value: 0.3486666666666666
  name: Cosine Mrr@10
  - type: cosine_map@100
- value: 0.3392782012970972
+ value: 0.34954916480510845
  name: Cosine Map@100
  ---

- # SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
+ # SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2

- This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

  ## Model Details

  ### Model Description
  - **Model Type:** Sentence Transformer
- - **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision c9745ed1d9f207416be6d2e6f8de32d1f16199bf -->
+ - **Base model:** [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) <!-- at revision 936af83a2ecce5fe87a09109ff5cbcefe073173a -->
  - **Maximum Sequence Length:** 128 tokens
  - **Output Dimensionality:** 384 dimensions
  - **Similarity Function:** Cosine Similarity
@@ -327,9 +327,9 @@ print(embeddings.shape)
  # Get the similarity scores for the embeddings
  similarities = model.similarity(embeddings, embeddings)
  print(similarities)
- # tensor([[1.0000, 1.0000, 0.9955],
- #         [1.0000, 1.0000, 0.9955],
- #         [0.9955, 0.9955, 1.0000]])
+ # tensor([[1.0000, 1.0000, 0.9835],
+ #         [1.0000, 1.0000, 0.9835],
+ #         [0.9835, 0.9835, 1.0000]])
  ```

  <!--
@@ -367,21 +367,21 @@ You can finetune this model on your own dataset.

  | Metric | NanoMSMARCO | NanoNQ |
  |:--------------------|:------------|:-----------|
- | cosine_accuracy@1 | 0.32 | 0.16 |
- | cosine_accuracy@3 | 0.5 | 0.26 |
- | cosine_accuracy@5 | 0.56 | 0.32 |
- | cosine_accuracy@10 | 0.7 | 0.46 |
- | cosine_precision@1 | 0.32 | 0.16 |
- | cosine_precision@3 | 0.1667 | 0.0867 |
- | cosine_precision@5 | 0.112 | 0.068 |
- | cosine_precision@10 | 0.07 | 0.048 |
- | cosine_recall@1 | 0.32 | 0.15 |
- | cosine_recall@3 | 0.5 | 0.23 |
- | cosine_recall@5 | 0.56 | 0.3 |
- | cosine_recall@10 | 0.7 | 0.43 |
- | **cosine_ndcg@10** | **0.4962** | **0.2725** |
- | cosine_mrr@10 | 0.4335 | 0.2321 |
- | cosine_map@100 | 0.4442 | 0.2344 |
+ | cosine_accuracy@1 | 0.32 | 0.18 |
+ | cosine_accuracy@3 | 0.48 | 0.32 |
+ | cosine_accuracy@5 | 0.58 | 0.38 |
+ | cosine_accuracy@10 | 0.66 | 0.44 |
+ | cosine_precision@1 | 0.32 | 0.18 |
+ | cosine_precision@3 | 0.16 | 0.1067 |
+ | cosine_precision@5 | 0.116 | 0.08 |
+ | cosine_precision@10 | 0.066 | 0.048 |
+ | cosine_recall@1 | 0.32 | 0.16 |
+ | cosine_recall@3 | 0.48 | 0.27 |
+ | cosine_recall@5 | 0.58 | 0.34 |
+ | cosine_recall@10 | 0.66 | 0.41 |
+ | **cosine_ndcg@10** | **0.4843** | **0.2896** |
+ | cosine_mrr@10 | 0.429 | 0.2683 |
+ | cosine_map@100 | 0.4402 | 0.2589 |

  #### Nano BEIR

@@ -397,23 +397,23 @@ You can finetune this model on your own dataset.
  }
  ```

- | Metric | Value |
- |:--------------------|:-----------|
- | cosine_accuracy@1 | 0.24 |
- | cosine_accuracy@3 | 0.38 |
- | cosine_accuracy@5 | 0.44 |
- | cosine_accuracy@10 | 0.58 |
- | cosine_precision@1 | 0.24 |
- | cosine_precision@3 | 0.1267 |
- | cosine_precision@5 | 0.09 |
- | cosine_precision@10 | 0.059 |
- | cosine_recall@1 | 0.235 |
- | cosine_recall@3 | 0.365 |
- | cosine_recall@5 | 0.43 |
- | cosine_recall@10 | 0.565 |
- | **cosine_ndcg@10** | **0.3844** |
- | cosine_mrr@10 | 0.3328 |
- | cosine_map@100 | 0.3393 |
+ | Metric | Value |
+ |:--------------------|:----------|
+ | cosine_accuracy@1 | 0.25 |
+ | cosine_accuracy@3 | 0.4 |
+ | cosine_accuracy@5 | 0.48 |
+ | cosine_accuracy@10 | 0.55 |
+ | cosine_precision@1 | 0.25 |
+ | cosine_precision@3 | 0.1333 |
+ | cosine_precision@5 | 0.098 |
+ | cosine_precision@10 | 0.057 |
+ | cosine_recall@1 | 0.24 |
+ | cosine_recall@3 | 0.375 |
+ | cosine_recall@5 | 0.46 |
+ | cosine_recall@10 | 0.535 |
+ | **cosine_ndcg@10** | **0.387** |
+ | cosine_mrr@10 | 0.3487 |
+ | cosine_map@100 | 0.3495 |

  <!--
  ## Bias, Risks and Limitations
@@ -487,9 +487,9 @@ You can finetune this model on your own dataset.
  - `eval_strategy`: steps
  - `per_device_train_batch_size`: 128
  - `per_device_eval_batch_size`: 128
- - `learning_rate`: 0.0001
- - `weight_decay`: 0.001
- - `max_steps`: 5062
+ - `learning_rate`: 8e-05
+ - `weight_decay`: 0.005
+ - `max_steps`: 3375
  - `warmup_ratio`: 0.1
  - `fp16`: True
  - `dataloader_drop_last`: True
@@ -516,14 +516,14 @@ You can finetune this model on your own dataset.
  - `gradient_accumulation_steps`: 1
  - `eval_accumulation_steps`: None
  - `torch_empty_cache_steps`: None
- - `learning_rate`: 0.0001
- - `weight_decay`: 0.001
+ - `learning_rate`: 8e-05
+ - `weight_decay`: 0.005
  - `adam_beta1`: 0.9
  - `adam_beta2`: 0.999
  - `adam_epsilon`: 1e-08
  - `max_grad_norm`: 1.0
  - `num_train_epochs`: 3.0
- - `max_steps`: 5062
+ - `max_steps`: 3375
  - `lr_scheduler_type`: linear
  - `lr_scheduler_kwargs`: {}
  - `warmup_ratio`: 0.1
@@ -630,27 +630,20 @@ You can finetune this model on your own dataset.
  ### Training Logs
  | Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
  |:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
- | 0 | 0 | - | 3.3212 | 0.5540 | 0.5931 | 0.5735 |
- | 0.2874 | 250 | 3.2509 | 3.0429 | 0.4590 | 0.4189 | 0.4389 |
- | 0.5747 | 500 | 3.1458 | 3.0222 | 0.4855 | 0.3752 | 0.4303 |
- | 0.8621 | 750 | 3.1119 | 3.0053 | 0.4708 | 0.3715 | 0.4211 |
- | 1.1494 | 1000 | 3.0646 | 2.9901 | 0.4632 | 0.3600 | 0.4116 |
- | 1.4368 | 1250 | 3.0381 | 2.9852 | 0.5014 | 0.3426 | 0.4220 |
- | 1.7241 | 1500 | 3.0301 | 2.9781 | 0.4967 | 0.3029 | 0.3998 |
- | 2.0115 | 1750 | 3.0238 | 2.9768 | 0.4706 | 0.2717 | 0.3712 |
- | 2.2989 | 2000 | 2.9739 | 2.9735 | 0.4828 | 0.2734 | 0.3781 |
- | 2.5862 | 2250 | 2.9709 | 2.9696 | 0.4896 | 0.2257 | 0.3576 |
- | 2.8736 | 2500 | 2.9652 | 2.9693 | 0.4816 | 0.2553 | 0.3684 |
- | 3.1609 | 2750 | 2.9475 | 2.9720 | 0.4815 | 0.2618 | 0.3717 |
- | 3.4483 | 3000 | 2.9313 | 2.9715 | 0.5048 | 0.2831 | 0.3939 |
- | 3.7356 | 3250 | 2.9309 | 2.9705 | 0.4606 | 0.2879 | 0.3743 |
- | 4.0230 | 3500 | 2.9264 | 2.9712 | 0.5049 | 0.2774 | 0.3911 |
- | 4.3103 | 3750 | 2.9056 | 2.9722 | 0.4758 | 0.2532 | 0.3645 |
- | 4.5977 | 4000 | 2.9056 | 2.9708 | 0.5004 | 0.2724 | 0.3864 |
- | 4.8851 | 4250 | 2.9038 | 2.9705 | 0.5066 | 0.2675 | 0.3870 |
- | 5.1724 | 4500 | 2.8932 | 2.9729 | 0.4890 | 0.2627 | 0.3759 |
- | 5.4598 | 4750 | 2.8884 | 2.9710 | 0.5016 | 0.2822 | 0.3919 |
- | 5.7471 | 5000 | 2.8876 | 2.9712 | 0.4962 | 0.2725 | 0.3844 |
+ | 0 | 0 | - | 3.3069 | 0.5887 | 0.5786 | 0.5836 |
+ | 0.2874 | 250 | 3.1957 | 3.0122 | 0.5103 | 0.4380 | 0.4741 |
+ | 0.5747 | 500 | 3.0979 | 2.9933 | 0.4427 | 0.3722 | 0.4074 |
+ | 0.8621 | 750 | 3.068 | 2.9763 | 0.4850 | 0.3193 | 0.4021 |
+ | 1.1494 | 1000 | 3.024 | 2.9673 | 0.4492 | 0.3337 | 0.3915 |
+ | 1.4368 | 1250 | 2.9981 | 2.9615 | 0.4489 | 0.3182 | 0.3836 |
+ | 1.7241 | 1500 | 2.9922 | 2.9556 | 0.4839 | 0.2935 | 0.3887 |
+ | 2.0115 | 1750 | 2.9861 | 2.9544 | 0.4597 | 0.2967 | 0.3782 |
+ | 2.2989 | 2000 | 2.9396 | 2.9517 | 0.4700 | 0.2864 | 0.3782 |
+ | 2.5862 | 2250 | 2.9377 | 2.9487 | 0.4905 | 0.2720 | 0.3813 |
+ | 2.8736 | 2500 | 2.9313 | 2.9480 | 0.4835 | 0.3000 | 0.3917 |
+ | 3.1609 | 2750 | 2.9187 | 2.9466 | 0.4588 | 0.3055 | 0.3821 |
+ | 3.4483 | 3000 | 2.9038 | 2.9472 | 0.4937 | 0.2990 | 0.3963 |
+ | 3.7356 | 3250 | 2.9034 | 2.9453 | 0.4843 | 0.2896 | 0.3870 |


  ### Framework Versions
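For readers of the evaluation tables in the diff above: the accuracy@k, precision@k, recall@k, MRR@10, and NDCG@10 columns follow the standard binary-relevance definitions. A minimal self-contained sketch of those definitions (the ranking and document ids below are hypothetical toy data, not output from this model):

```python
import math

def ir_metrics(ranked_ids, relevant_ids, k=10):
    """Standard retrieval metrics for a single query, binary relevance."""
    top_k = ranked_ids[:k]
    hits = [1 if d in relevant_ids else 0 for d in top_k]
    accuracy = 1.0 if any(hits) else 0.0      # accuracy@k: any relevant doc in top k
    precision = sum(hits) / k                  # precision@k
    recall = sum(hits) / len(relevant_ids)     # recall@k
    # MRR@k: reciprocal rank of the first relevant hit (0 if none in top k)
    mrr = 0.0
    for rank, h in enumerate(hits, start=1):
        if h:
            mrr = 1.0 / rank
            break
    # NDCG@k: discounted gain normalized by the ideal ranking's gain
    dcg = sum(h / math.log2(rank + 1) for rank, h in enumerate(hits, start=1))
    ideal_hits = min(len(relevant_ids), k)
    idcg = sum(1.0 / math.log2(r + 1) for r in range(1, ideal_hits + 1))
    ndcg = dcg / idcg if idcg else 0.0
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "mrr": mrr, "ndcg": ndcg}

# Hypothetical query: the only relevant document appears at rank 3.
m = ir_metrics(["d7", "d2", "d5", "d9"], {"d5"}, k=10)
print(m["precision"], m["recall"], m["mrr"])  # 0.1 1.0 0.3333333333333333
```

With one relevant document per query, accuracy@k equals recall@k and precision@k equals recall@k divided by k, which matches the pattern visible in the NanoMSMARCO column (e.g. 0.48 at k=3 with precision 0.16).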
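The card trains with `loss:MultipleNegativesRankingLoss`, which uses in-batch negatives: each query is scored against every positive in the batch, and cross-entropy pushes the matched pair to win. A pure-Python sketch of that objective (the 2-d embeddings are made up for illustration; the scale factor of 20 is assumed here, mirroring the sentence-transformers default):

```python
import math

def cos(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def mnr_loss(queries, positives, scale=20.0):
    """In-batch-negatives ranking loss: for query i, positives[i] is the
    target class among all positives in the batch; cross-entropy is taken
    over the scaled cosine similarities."""
    total = 0.0
    for i, q in enumerate(queries):
        logits = [scale * cos(q, p) for p in positives]
        log_z = math.log(sum(math.exp(l) for l in logits))
        total += log_z - logits[i]  # -log softmax probability of the true pair
    return total / len(queries)

# Hypothetical batch of two pairs; each query is closest to its own positive,
# so the loss is near zero.
queries = [[1.0, 0.0], [0.0, 1.0]]
positives = [[0.9, 0.1], [0.1, 0.9]]
print(mnr_loss(queries, positives) < 0.01)  # True
```

This is why `dataloader_drop_last: True` and a large batch size (128 here) matter for this loss: every extra in-batch positive is one more negative per query.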
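The hyperparameters combine `lr_scheduler_type: linear` with `warmup_ratio: 0.1`, `learning_rate: 8e-05`, and `max_steps: 3375`. A sketch of the resulting learning-rate curve (an approximation of the linear warmup/decay schedule, not the exact library implementation):

```python
def linear_schedule_lr(step, base_lr=8e-5, max_steps=3375, warmup_ratio=0.1):
    """Linear warmup to base_lr over warmup_ratio * max_steps steps,
    then linear decay to zero at max_steps."""
    warmup_steps = int(max_steps * warmup_ratio)  # 337 steps with these values
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (max_steps - step) / (max_steps - warmup_steps))

print(linear_schedule_lr(0))    # 0.0
print(linear_schedule_lr(337))  # 8e-05 (peak, end of warmup)
```

Note that with `per_device_train_batch_size: 128` and `max_steps` set, training stops at step 3375 regardless of `num_train_epochs: 3.0`; the training log above accordingly ends at step 3250, the last logged checkpoint before the cap.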