KhaledReda committed on
Commit 0fd3eb4 · verified · 1 Parent(s): d41a59f

Upload folder using huggingface_hub

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 384,
+   "pooling_mode_cls_token": false,
+   "pooling_mode_mean_tokens": true,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
README.md ADDED
@@ -0,0 +1,1119 @@
+ ---
+ language:
+ - en
+ license: apache-2.0
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - dense
+ - generated_from_trainer
+ - dataset_size:9229520
+ - loss:CoSENTLoss
+ base_model: sentence-transformers/all-MiniLM-L6-v2
+ widget:
+ - source_sentence: cookies cupcake
+   sentences:
+   - beard brush
+   - smoky bottoms
+   - rings curtain
+ - source_sentence: skin vitamins serum
+   sentences:
+   - vitamin e brow pencil
+   - platinum plated necklace
+   - centra collection glassware air bubble base drinkware
+ - source_sentence: salmon and mozzarella pizza
+   sentences:
+   - almond pizza
+   - sea breeze scented candle
+   - ribbed collar tshirt
+ - source_sentence: ushaped back swimsuit
+   sentences:
+   - golden coffee beans
+   - jolieva
+   - beach
+ - source_sentence: kite
+   sentences:
+   - gastreg ampoules
+   - ramadan kaftan clutch
+   - side pocket boardshorts
+ datasets:
+ - KhaledReda/pairs_three_scores_v13_synonyms_added
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ ---
+
+ # all-MiniLM-L6-v17-pair_score
+
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) on the [pairs_three_scores_v13_synonyms_added](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v13_synonyms_added) dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision c9745ed1d9f207416be6d2e6f8de32d1f16199bf -->
+ - **Maximum Sequence Length:** 256 tokens
+ - **Output Dimensionality:** 384 dimensions
+ - **Similarity Function:** Cosine Similarity
+ - **Training Dataset:**
+     - [pairs_three_scores_v13_synonyms_added](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v13_synonyms_added)
+ - **Language:** en
+ - **License:** apache-2.0
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
+   (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+   (2): Normalize()
+ )
+ ```
+
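+ The stack above amounts to mean pooling over token embeddings followed by L2 normalization. As a rough sketch of what the three modules compute (written against the plain `transformers` API and the base checkpoint; an illustration under those assumptions, not code shipped in this repository):
+
+ ```python
+ import torch
+ import torch.nn.functional as F
+ from transformers import AutoTokenizer, AutoModel
+
+ tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
+ bert = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
+
+ def embed(sentences):
+     # (0) Transformer: tokenize and encode, truncated at 256 tokens
+     batch = tokenizer(sentences, padding=True, truncation=True,
+                       max_length=256, return_tensors="pt")
+     with torch.no_grad():
+         token_embeddings = bert(**batch).last_hidden_state  # (batch, seq, 384)
+     # (1) Pooling: mean over non-padding tokens (pooling_mode_mean_tokens)
+     mask = batch["attention_mask"].unsqueeze(-1).float()
+     pooled = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
+     # (2) Normalize: unit length, so cosine similarity reduces to a dot product
+     return F.normalize(pooled, p=2, dim=1)
+ ```
+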
+ ## Usage
+
+ ### Direct Usage (Sentence Transformers)
+
+ First install the Sentence Transformers library:
+
+ ```bash
+ pip install -U sentence-transformers
+ ```
+
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("sentence_transformers_model_id")
+ # Run inference
+ sentences = [
+     'kite',
+     'ramadan kaftan clutch',
+     'side pocket boardshorts',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 384]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities)
+ # tensor([[1.0000, 0.6999, 0.6212],
+ #         [0.6999, 1.0000, 0.7124],
+ #         [0.6212, 0.7124, 1.0000]])
+ ```
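+
+ Because the final module L2-normalizes every embedding, `model.similarity` (cosine) can be used directly to rank a small corpus against a query. A minimal sketch, reusing the widget phrases above as an illustrative corpus:
+
+ ```python
+ query_embedding = model.encode(["vitamin serum for skin"])
+ corpus = ["skin vitamins serum", "ribbed collar tshirt", "golden coffee beans"]
+ corpus_embeddings = model.encode(corpus)
+
+ # Rank the corpus by similarity to the query, highest first
+ scores = model.similarity(query_embedding, corpus_embeddings)[0]
+ for idx in scores.argsort(descending=True).tolist():
+     print(f"{scores[idx]:.4f}  {corpus[idx]}")
+ ```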
+
+ <!--
+ ### Direct Usage (Transformers)
+
+ <details><summary>Click to see the direct usage in Transformers</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+
+ You can finetune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### pairs_three_scores_v13_synonyms_added
+
+ * Dataset: [pairs_three_scores_v13_synonyms_added](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v13_synonyms_added) at [10e49f8](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v13_synonyms_added/tree/10e49f8482fdfce9deef1fe41dadb4ae0320a17b)
+ * Size: 9,229,520 training samples
+ * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | sentence1 | sentence2 | score |
+   |:--------|:----------|:----------|:------|
+   | type    | string    | string    | float |
+   | details | <ul><li>min: 3 tokens</li><li>mean: 5.65 tokens</li><li>max: 18 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.69 tokens</li><li>max: 16 tokens</li></ul> | <ul><li>min: 0.15</li><li>mean: 0.42</li><li>max: 1.0</li></ul> |
+ * Samples:
+   | sentence1 | sentence2 | score |
+   |:----------|:----------|:------|
+   | <code>kettlebell</code> | <code>bag</code> | <code>0.22</code> |
+   | <code>mixed berry milk shake</code> | <code>elasticized waistband shorts</code> | <code>0.21</code> |
+   | <code>raw linden honey</code> | <code>refresher sponge</code> | <code>0.22</code> |
+ * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "pairwise_cos_sim"
+   }
+   ```
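+
+ A minimal sketch of how such a setup is typically assembled in Sentence Transformers (the column names match the dataset above, and `scale=20.0` matches the listed loss parameters; the actual training script is not part of this repository):
+
+ ```python
+ from datasets import load_dataset
+ from sentence_transformers import SentenceTransformer
+ from sentence_transformers.losses import CoSENTLoss
+
+ model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
+ train_dataset = load_dataset("KhaledReda/pairs_three_scores_v13_synonyms_added", split="train")
+
+ # CoSENTLoss consumes (sentence1, sentence2) pairs with a float score label
+ loss = CoSENTLoss(model, scale=20.0)
+ ```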
+
+ ### Evaluation Dataset
+
+ #### pairs_three_scores_v13_synonyms_added
+
+ * Dataset: [pairs_three_scores_v13_synonyms_added](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v13_synonyms_added) at [10e49f8](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v13_synonyms_added/tree/10e49f8482fdfce9deef1fe41dadb4ae0320a17b)
+ * Size: 46,380 evaluation samples
+ * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | sentence1 | sentence2 | score |
+   |:--------|:----------|:----------|:------|
+   | type    | string    | string    | float |
+   | details | <ul><li>min: 3 tokens</li><li>mean: 5.69 tokens</li><li>max: 115 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.77 tokens</li><li>max: 115 tokens</li></ul> | <ul><li>min: 0.15</li><li>mean: 0.42</li><li>max: 1.0</li></ul> |
+ * Samples:
+   | sentence1 | sentence2 | score |
+   |:----------|:----------|:------|
+   | <code>bag</code> | <code>nude rocks</code> | <code>0.24</code> |
+   | <code>semi natural necklace</code> | <code>21 kt plated necklace</code> | <code>1.0</code> |
+   | <code>eco friendly coasters</code> | <code>measuring cup</code> | <code>0.23</code> |
+ * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "pairwise_cos_sim"
+   }
+   ```
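+
+ Validation quality on pairs like these is usually tracked with an embedding-similarity evaluator. A sketch under the assumption that `eval_dataset` holds the evaluation split loaded with `load_dataset` as above (the split name and evaluator name are illustrative):
+
+ ```python
+ from sentence_transformers import SimilarityFunction
+ from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator
+
+ # Correlation between predicted cosine similarity and the gold scores
+ evaluator = EmbeddingSimilarityEvaluator(
+     sentences1=eval_dataset["sentence1"],
+     sentences2=eval_dataset["sentence2"],
+     scores=eval_dataset["score"],
+     main_similarity=SimilarityFunction.COSINE,
+     name="pairs-dev",
+ )
+ print(evaluator(model))
+ ```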
+
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 128
+ - `per_device_eval_batch_size`: 128
+ - `learning_rate`: 2e-05
+ - `num_train_epochs`: 1
+ - `warmup_ratio`: 0.1
+ - `fp16`: True
+
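+ Expressed as code, the non-default values above correspond to a trainer configuration along these lines (a sketch only; `output_dir` is a placeholder):
+
+ ```python
+ from sentence_transformers import SentenceTransformerTrainingArguments
+
+ args = SentenceTransformerTrainingArguments(
+     output_dir="all-MiniLM-L6-v17-pair_score",  # placeholder path
+     eval_strategy="steps",
+     per_device_train_batch_size=128,
+     per_device_eval_batch_size=128,
+     learning_rate=2e-5,
+     num_train_epochs=1,
+     warmup_ratio=0.1,
+     fp16=True,
+ )
+ ```
+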
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: steps
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 128
+ - `per_device_eval_batch_size`: 128
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 1
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 2e-05
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 1
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.1
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: False
+ - `fp16`: True
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: False
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `hub_revision`: None
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`: 
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `liger_kernel_config`: None
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: False
+ - `prompts`: None
+ - `batch_sampler`: batch_sampler
+ - `multi_dataset_batch_sampler`: proportional
+ - `router_mapping`: {}
+ - `learning_rate_mapping`: {}
+
+ </details>
+
+ ### Training Logs
+ <details><summary>Click to expand</summary>
+
+ | Epoch | Step | Training Loss | Validation Loss |
+ |:------:|:-----:|:-------------:|:---------------:|
+ | 0.0014 | 100 | 11.7561 | - |
+ | 0.0028 | 200 | 11.739 | - |
+ | 0.0042 | 300 | 11.2175 | - |
+ | 0.0055 | 400 | 11.0759 | - |
+ | 0.0069 | 500 | 10.7749 | 10.9497 |
+ | 0.0083 | 600 | 10.4026 | - |
+ | 0.0097 | 700 | 10.2194 | - |
+ | 0.0111 | 800 | 9.834 | - |
+ | 0.0125 | 900 | 9.6126 | - |
+ | 0.0139 | 1000 | 9.3563 | 9.3834 |
+ | 0.0153 | 1100 | 9.0716 | - |
+ | 0.0166 | 1200 | 8.9245 | - |
+ | 0.0180 | 1300 | 8.7384 | - |
+ | 0.0194 | 1400 | 8.6381 | - |
+ | 0.0208 | 1500 | 8.6089 | 8.5228 |
+ | 0.0222 | 1600 | 8.5817 | - |
+ | 0.0236 | 1700 | 8.5418 | - |
+ | 0.0250 | 1800 | 8.532 | - |
+ | 0.0264 | 1900 | 8.5107 | - |
+ | 0.0277 | 2000 | 8.4917 | 8.4366 |
+ | 0.0291 | 2100 | 8.485 | - |
+ | 0.0305 | 2200 | 8.4826 | - |
+ | 0.0319 | 2300 | 8.4512 | - |
+ | 0.0333 | 2400 | 8.4694 | - |
+ | 0.0347 | 2500 | 8.4485 | 8.3778 |
+ | 0.0361 | 2600 | 8.4293 | - |
+ | 0.0374 | 2700 | 8.4222 | - |
+ | 0.0388 | 2800 | 8.4031 | - |
+ | 0.0402 | 2900 | 8.3947 | - |
+ | 0.0416 | 3000 | 8.3912 | 8.3335 |
+ | 0.0430 | 3100 | 8.3913 | - |
+ | 0.0444 | 3200 | 8.3822 | - |
+ | 0.0458 | 3300 | 8.3552 | - |
+ | 0.0472 | 3400 | 8.3759 | - |
+ | 0.0485 | 3500 | 8.3632 | 8.2942 |
+ | 0.0499 | 3600 | 8.3495 | - |
+ | 0.0513 | 3700 | 8.3385 | - |
+ | 0.0527 | 3800 | 8.3346 | - |
+ | 0.0541 | 3900 | 8.3249 | - |
+ | 0.0555 | 4000 | 8.3033 | 8.2534 |
+ | 0.0569 | 4100 | 8.3141 | - |
+ | 0.0582 | 4200 | 8.3015 | - |
+ | 0.0596 | 4300 | 8.2982 | - |
+ | 0.0610 | 4400 | 8.3006 | - |
+ | 0.0624 | 4500 | 8.2972 | 8.2175 |
+ | 0.0638 | 4600 | 8.2757 | - |
+ | 0.0652 | 4700 | 8.2765 | - |
+ | 0.0666 | 4800 | 8.2668 | - |
+ | 0.0680 | 4900 | 8.2472 | - |
+ | 0.0693 | 5000 | 8.2605 | 8.1990 |
+ | 0.0707 | 5100 | 8.2481 | - |
+ | 0.0721 | 5200 | 8.2598 | - |
+ | 0.0735 | 5300 | 8.2403 | - |
+ | 0.0749 | 5400 | 8.2388 | - |
+ | 0.0763 | 5500 | 8.2074 | 8.1497 |
+ | 0.0777 | 5600 | 8.2236 | - |
+ | 0.0791 | 5700 | 8.2204 | - |
+ | 0.0804 | 5800 | 8.2086 | - |
+ | 0.0818 | 5900 | 8.208 | - |
+ | 0.0832 | 6000 | 8.1991 | 8.1357 |
+ | 0.0846 | 6100 | 8.2064 | - |
+ | 0.0860 | 6200 | 8.1969 | - |
+ | 0.0874 | 6300 | 8.1795 | - |
+ | 0.0888 | 6400 | 8.1846 | - |
+ | 0.0901 | 6500 | 8.188 | 8.1128 |
+ | 0.0915 | 6600 | 8.1902 | - |
+ | 0.0929 | 6700 | 8.1624 | - |
+ | 0.0943 | 6800 | 8.1527 | - |
+ | 0.0957 | 6900 | 8.1589 | - |
+ | 0.0971 | 7000 | 8.1624 | 8.0843 |
+ | 0.0985 | 7100 | 8.1705 | - |
+ | 0.0999 | 7200 | 8.1362 | - |
+ | 0.1012 | 7300 | 8.1419 | - |
+ | 0.1026 | 7400 | 8.1564 | - |
+ | 0.1040 | 7500 | 8.1422 | 8.0581 |
+ | 0.1054 | 7600 | 8.1214 | - |
+ | 0.1068 | 7700 | 8.1369 | - |
+ | 0.1082 | 7800 | 8.1024 | - |
+ | 0.1096 | 7900 | 8.0974 | - |
+ | 0.1109 | 8000 | 8.1316 | 8.0378 |
+ | 0.1123 | 8100 | 8.1185 | - |
+ | 0.1137 | 8200 | 8.1148 | - |
+ | 0.1151 | 8300 | 8.1015 | - |
+ | 0.1165 | 8400 | 8.0851 | - |
+ | 0.1179 | 8500 | 8.0881 | 8.0091 |
+ | 0.1193 | 8600 | 8.0734 | - |
+ | 0.1207 | 8700 | 8.0644 | - |
+ | 0.1220 | 8800 | 8.0802 | - |
+ | 0.1234 | 8900 | 8.0827 | - |
+ | 0.1248 | 9000 | 8.0934 | 8.0049 |
+ | 0.1262 | 9100 | 8.0544 | - |
+ | 0.1276 | 9200 | 8.0828 | - |
+ | 0.1290 | 9300 | 8.0844 | - |
+ | 0.1304 | 9400 | 8.0598 | - |
+ | 0.1318 | 9500 | 8.0575 | 7.9784 |
+ | 0.1331 | 9600 | 8.0476 | - |
+ | 0.1345 | 9700 | 8.0617 | - |
+ | 0.1359 | 9800 | 8.0632 | - |
+ | 0.1373 | 9900 | 8.0398 | - |
+ | 0.1387 | 10000 | 8.0455 | 7.9625 |
+ | 0.1401 | 10100 | 8.0441 | - |
+ | 0.1415 | 10200 | 8.0462 | - |
+ | 0.1428 | 10300 | 8.0429 | - |
+ | 0.1442 | 10400 | 8.0332 | - |
+ | 0.1456 | 10500 | 8.0087 | 7.9579 |
+ | 0.1470 | 10600 | 8.0374 | - |
+ | 0.1484 | 10700 | 8.0243 | - |
+ | 0.1498 | 10800 | 8.0445 | - |
+ | 0.1512 | 10900 | 8.0155 | - |
+ | 0.1526 | 11000 | 8.0161 | 7.9321 |
+ | 0.1539 | 11100 | 8.0092 | - |
+ | 0.1553 | 11200 | 8.0041 | - |
+ | 0.1567 | 11300 | 8.0165 | - |
+ | 0.1581 | 11400 | 8.005 | - |
+ | 0.1595 | 11500 | 7.9992 | 7.9243 |
+ | 0.1609 | 11600 | 8.0109 | - |
+ | 0.1623 | 11700 | 8.0096 | - |
+ | 0.1636 | 11800 | 8.0176 | - |
+ | 0.1650 | 11900 | 7.9965 | - |
+ | 0.1664 | 12000 | 8.0159 | 7.9092 |
+ | 0.1678 | 12100 | 7.9865 | - |
+ | 0.1692 | 12200 | 7.9742 | - |
+ | 0.1706 | 12300 | 7.9757 | - |
+ | 0.1720 | 12400 | 7.9852 | - |
+ | 0.1734 | 12500 | 8.0068 | 7.8931 |
+ | 0.1747 | 12600 | 7.9616 | - |
+ | 0.1761 | 12700 | 7.9889 | - |
+ | 0.1775 | 12800 | 7.9795 | - |
+ | 0.1789 | 12900 | 7.9657 | - |
+ | 0.1803 | 13000 | 7.952 | 7.8785 |
+ | 0.1817 | 13100 | 7.9534 | - |
+ | 0.1831 | 13200 | 7.9212 | - |
+ | 0.1845 | 13300 | 7.9479 | - |
+ | 0.1858 | 13400 | 7.9433 | - |
+ | 0.1872 | 13500 | 7.9599 | 7.8757 |
+ | 0.1886 | 13600 | 7.9751 | - |
+ | 0.1900 | 13700 | 7.9564 | - |
+ | 0.1914 | 13800 | 7.9642 | - |
+ | 0.1928 | 13900 | 7.9511 | - |
+ | 0.1942 | 14000 | 7.9458 | 7.8580 |
+ | 0.1955 | 14100 | 7.9625 | - |
+ | 0.1969 | 14200 | 7.9728 | - |
+ | 0.1983 | 14300 | 7.9235 | - |
+ | 0.1997 | 14400 | 7.9658 | - |
+ | 0.2011 | 14500 | 7.9567 | 7.8480 |
+ | 0.2025 | 14600 | 7.9214 | - |
+ | 0.2039 | 14700 | 7.8983 | - |
+ | 0.2053 | 14800 | 7.9334 | - |
+ | 0.2066 | 14900 | 7.9345 | - |
+ | 0.2080 | 15000 | 7.9245 | 7.8334 |
+ | 0.2094 | 15100 | 7.9144 | - |
+ | 0.2108 | 15200 | 7.9375 | - |
+ | 0.2122 | 15300 | 7.9058 | - |
+ | 0.2136 | 15400 | 7.9365 | - |
+ | 0.2150 | 15500 | 7.9101 | 7.8291 |
+ | 0.2163 | 15600 | 7.9001 | - |
+ | 0.2177 | 15700 | 7.8906 | - |
+ | 0.2191 | 15800 | 7.9103 | - |
+ | 0.2205 | 15900 | 7.8899 | - |
+ | 0.2219 | 16000 | 7.8874 | 7.8157 |
+ | 0.2233 | 16100 | 7.9011 | - |
+ | 0.2247 | 16200 | 7.92 | - |
+ | 0.2261 | 16300 | 7.8933 | - |
+ | 0.2274 | 16400 | 7.886 | - |
+ | 0.2288 | 16500 | 7.8959 | 7.8097 |
+ | 0.2302 | 16600 | 7.8623 | - |
+ | 0.2316 | 16700 | 7.8703 | - |
+ | 0.2330 | 16800 | 7.8934 | - |
+ | 0.2344 | 16900 | 7.8651 | - |
+ | 0.2358 | 17000 | 7.91 | 7.8047 |
+ | 0.2372 | 17100 | 7.8794 | - |
+ | 0.2385 | 17200 | 7.8794 | - |
+ | 0.2399 | 17300 | 7.8723 | - |
+ | 0.2413 | 17400 | 7.9007 | - |
+ | 0.2427 | 17500 | 7.8568 | 7.7977 |
+ | 0.2441 | 17600 | 7.8855 | - |
+ | 0.2455 | 17700 | 7.8687 | - |
+ | 0.2469 | 17800 | 7.8708 | - |
+ | 0.2482 | 17900 | 7.8533 | - |
+ | 0.2496 | 18000 | 7.87 | 7.8019 |
+ | 0.2510 | 18100 | 7.8364 | - |
+ | 0.2524 | 18200 | 7.8901 | - |
+ | 0.2538 | 18300 | 7.8782 | - |
+ | 0.2552 | 18400 | 7.8817 | - |
+ | 0.2566 | 18500 | 7.8794 | 7.7774 |
+ | 0.2580 | 18600 | 7.9031 | - |
+ | 0.2593 | 18700 | 7.8897 | - |
+ | 0.2607 | 18800 | 7.8741 | - |
+ | 0.2621 | 18900 | 7.8774 | - |
+ | 0.2635 | 19000 | 7.8771 | 7.7696 |
+ | 0.2649 | 19100 | 7.839 | - |
+ | 0.2663 | 19200 | 7.8786 | - |
+ | 0.2677 | 19300 | 7.866 | - |
+ | 0.2690 | 19400 | 7.868 | - |
+ | 0.2704 | 19500 | 7.8804 | 7.7672 |
+ | 0.2718 | 19600 | 7.8398 | - |
+ | 0.2732 | 19700 | 7.8662 | - |
+ | 0.2746 | 19800 | 7.8341 | - |
+ | 0.2760 | 19900 | 7.86 | - |
+ | 0.2774 | 20000 | 7.8325 | 7.7518 |
+ | 0.2788 | 20100 | 7.7957 | - |
+ | 0.2801 | 20200 | 7.8478 | - |
+ | 0.2815 | 20300 | 7.8601 | - |
+ | 0.2829 | 20400 | 7.8395 | - |
+ | 0.2843 | 20500 | 7.8414 | 7.7452 |
+ | 0.2857 | 20600 | 7.8332 | - |
+ | 0.2871 | 20700 | 7.862 | - |
+ | 0.2885 | 20800 | 7.8007 | - |
+ | 0.2899 | 20900 | 7.8249 | - |
+ | 0.2912 | 21000 | 7.8237 | 7.7456 |
+ | 0.2926 | 21100 | 7.8616 | - |
+ | 0.2940 | 21200 | 7.865 | - |
+ | 0.2954 | 21300 | 7.8226 | - |
+ | 0.2968 | 21400 | 7.8245 | - |
+ | 0.2982 | 21500 | 7.809 | 7.7333 |
+ | 0.2996 | 21600 | 7.8026 | - |
+ | 0.3009 | 21700 | 7.8169 | - |
+ | 0.3023 | 21800 | 7.8201 | - |
+ | 0.3037 | 21900 | 7.8057 | - |
+ | 0.3051 | 22000 | 7.8237 | 7.7258 |
+ | 0.3065 | 22100 | 7.82 | - |
+ | 0.3079 | 22200 | 7.819 | - |
+ | 0.3093 | 22300 | 7.7972 | - |
+ | 0.3107 | 22400 | 7.8141 | - |
+ | 0.3120 | 22500 | 7.8135 | 7.7245 |
+ | 0.3134 | 22600 | 7.7951 | - |
+ | 0.3148 | 22700 | 7.8051 | - |
+ | 0.3162 | 22800 | 7.7968 | - |
+ | 0.3176 | 22900 | 7.8256 | - |
+ | 0.3190 | 23000 | 7.8407 | 7.7135 |
+ | 0.3204 | 23100 | 7.8241 | - |
+ | 0.3217 | 23200 | 7.8195 | - |
+ | 0.3231 | 23300 | 7.7964 | - |
+ | 0.3245 | 23400 | 7.8166 | - |
+ | 0.3259 | 23500 | 7.821 | 7.6989 |
+ | 0.3273 | 23600 | 7.8125 | - |
+ | 0.3287 | 23700 | 7.7913 | - |
+ | 0.3301 | 23800 | 7.7958 | - |
+ | 0.3315 | 23900 | 7.7988 | - |
+ | 0.3328 | 24000 | 7.8148 | 7.7022 |
+ | 0.3342 | 24100 | 7.7964 | - |
+ | 0.3356 | 24200 | 7.7924 | - |
+ | 0.3370 | 24300 | 7.7783 | - |
+ | 0.3384 | 24400 | 7.8008 | - |
+ | 0.3398 | 24500 | 7.7745 | 7.6911 |
+ | 0.3412 | 24600 | 7.8002 | - |
+ | 0.3426 | 24700 | 7.7984 | - |
+ | 0.3439 | 24800 | 7.8212 | - |
+ | 0.3453 | 24900 | 7.7789 | - |
+ | 0.3467 | 25000 | 7.7609 | 7.6880 |
+ | 0.3481 | 25100 | 7.792 | - |
+ | 0.3495 | 25200 | 7.8064 | - |
+ | 0.3509 | 25300 | 7.7851 | - |
+ | 0.3523 | 25400 | 7.784 | - |
+ | 0.3536 | 25500 | 7.7905 | 7.6772 |
+ | 0.3550 | 25600 | 7.8252 | - |
+ | 0.3564 | 25700 | 7.766 | - |
+ | 0.3578 | 25800 | 7.7424 | - |
+ | 0.3592 | 25900 | 7.779 | - |
+ | 0.3606 | 26000 | 7.7701 | 7.6759 |
+ | 0.3620 | 26100 | 7.774 | - |
+ | 0.3634 | 26200 | 7.7752 | - |
+ | 0.3647 | 26300 | 7.7928 | - |
+ | 0.3661 | 26400 | 7.7525 | - |
+ | 0.3675 | 26500 | 7.7783 | 7.6744 |
+ | 0.3689 | 26600 | 7.7618 | - |
+ | 0.3703 | 26700 | 7.8067 | - |
+ | 0.3717 | 26800 | 7.7771 | - |
+ | 0.3731 | 26900 | 7.7936 | - |
+ | 0.3744 | 27000 | 7.7499 | 7.6710 |
+ | 0.3758 | 27100 | 7.7629 | - |
+ | 0.3772 | 27200 | 7.7843 | - |
+ | 0.3786 | 27300 | 7.7735 | - |
+ | 0.3800 | 27400 | 7.7662 | - |
+ | 0.3814 | 27500 | 7.7453 | 7.6658 |
+ | 0.3828 | 27600 | 7.7417 | - |
+ | 0.3842 | 27700 | 7.7793 | - |
+ | 0.3855 | 27800 | 7.7535 | - |
+ | 0.3869 | 27900 | 7.7695 | - |
+ | 0.3883 | 28000 | 7.758 | 7.6481 |
+ | 0.3897 | 28100 | 7.7391 | - |
+ | 0.3911 | 28200 | 7.7447 | - |
+ | 0.3925 | 28300 | 7.7691 | - |
+ | 0.3939 | 28400 | 7.7555 | - |
+ | 0.3953 | 28500 | 7.752 | 7.6460 |
+ | 0.3966 | 28600 | 7.7272 | - |
+ | 0.3980 | 28700 | 7.7464 | - |
+ | 0.3994 | 28800 | 7.7415 | - |
+ | 0.4008 | 28900 | 7.7616 | - |
+ | 0.4022 | 29000 | 7.7661 | 7.6477 |
+ | 0.4036 | 29100 | 7.7352 | - |
+ | 0.4050 | 29200 | 7.7438 | - |
+ | 0.4063 | 29300 | 7.7468 | - |
+ | 0.4077 | 29400 | 7.768 | - |
+ | 0.4091 | 29500 | 7.7581 | 7.6392 |
+ | 0.4105 | 29600 | 7.7374 | - |
+ | 0.4119 | 29700 | 7.7307 | - |
+ | 0.4133 | 29800 | 7.7292 | - |
+ | 0.4147 | 29900 | 7.7543 | - |
+ | 0.4161 | 30000 | 7.7435 | 7.6337 |
+ | 0.4174 | 30100 | 7.751 | - |
+ | 0.4188 | 30200 | 7.7264 | - |
+ | 0.4202 | 30300 | 7.7366 | - |
+ | 0.4216 | 30400 | 7.7137 | - |
+ | 0.4230 | 30500 | 7.7625 | 7.6239 |
+ | 0.4244 | 30600 | 7.7006 | - |
+ | 0.4258 | 30700 | 7.7571 | - |
+ | 0.4271 | 30800 | 7.722 | - |
+ | 0.4285 | 30900 | 7.7209 | - |
+ | 0.4299 | 31000 | 7.7159 | 7.6189 |
+ | 0.4313 | 31100 | 7.7058 | - |
+ | 0.4327 | 31200 | 7.7407 | - |
+ | 0.4341 | 31300 | 7.7093 | - |
+ | 0.4355 | 31400 | 7.7172 | - |
+ | 0.4369 | 31500 | 7.7532 | 7.6187 |
+ | 0.4382 | 31600 | 7.7254 | - |
+ | 0.4396 | 31700 | 7.716 | - |
+ | 0.4410 | 31800 | 7.7231 | - |
+ | 0.4424 | 31900 | 7.7272 | - |
+ | 0.4438 | 32000 | 7.7214 | 7.6153 |
+ | 0.4452 | 32100 | 7.7325 | - |
+ | 0.4466 | 32200 | 7.7268 | - |
+ | 0.4480 | 32300 | 7.6801 | - |
+ | 0.4493 | 32400 | 7.7209 | - |
+ | 0.4507 | 32500 | 7.6958 | 7.6057 |
+ | 0.4521 | 32600 | 7.6903 | - |
+ | 0.4535 | 32700 | 7.7379 | - |
+ | 0.4549 | 32800 | 7.7245 | - |
+ | 0.4563 | 32900 | 7.7506 | - |
+ | 0.4577 | 33000 | 7.7095 | 7.6051 |
+ | 0.4590 | 33100 | 7.7148 | - |
+ | 0.4604 | 33200 | 7.7182 | - |
+ | 0.4618 | 33300 | 7.7307 | - |
+ | 0.4632 | 33400 | 7.7381 | - |
+ | 0.4646 | 33500 | 7.7214 | 7.6028 |
+ | 0.4660 | 33600 | 7.6882 | - |
+ | 0.4674 | 33700 | 7.6864 | - |
+ | 0.4688 | 33800 | 7.6718 | - |
+ | 0.4701 | 33900 | 7.7201 | - |
+ | 0.4715 | 34000 | 7.7173 | 7.6092 |
+ | 0.4729 | 34100 | 7.6805 | - |
+ | 0.4743 | 34200 | 7.7264 | - |
+ | 0.4757 | 34300 | 7.7013 | - |
+ | 0.4771 | 34400 | 7.7074 | - |
+ | 0.4785 | 34500 | 7.7044 | 7.6044 |
+ | 0.4798 | 34600 | 7.742 | - |
+ | 0.4812 | 34700 | 7.7104 | - |
+ | 0.4826 | 34800 | 7.7004 | - |
+ | 0.4840 | 34900 | 7.7175 | - |
+ | 0.4854 | 35000 | 7.687 | 7.5947 |
+ | 0.4868 | 35100 | 7.7024 | - |
+ | 0.4882 | 35200 | 7.6666 | - |
+ | 0.4896 | 35300 | 7.6869 | - |
+ | 0.4909 | 35400 | 7.7147 | - |
+ | 0.4923 | 35500 | 7.7281 | 7.5804 |
+ | 0.4937 | 35600 | 7.6852 | - |
+ | 0.4951 | 35700 | 7.6735 | - |
+ | 0.4965 | 35800 | 7.7043 | - |
+ | 0.4979 | 35900 | 7.6884 | - |
+ | 0.4993 | 36000 | 7.7233 | 7.5851 |
+ | 0.5007 | 36100 | 7.6914 | - |
+ | 0.5020 | 36200 | 7.7083 | - |
+ | 0.5034 | 36300 | 7.6876 | - |
+ | 0.5048 | 36400 | 7.6909 | - |
+ | 0.5062 | 36500 | 7.679 | 7.5862 |
+ | 0.5076 | 36600 | 7.6884 | - |
+ | 0.5090 | 36700 | 7.6697 | - |
+ | 0.5104 | 36800 | 7.6625 | - |
+ | 0.5117 | 36900 | 7.6881 | - |
+ | 0.5131 | 37000 | 7.6859 | 7.5844 |
+ | 0.5145 | 37100 | 7.6624 | - |
+ | 0.5159 | 37200 | 7.6932 | - |
+ | 0.5173 | 37300 | 7.6851 | - |
+ | 0.5187 | 37400 | 7.6941 | - |
+ | 0.5201 | 37500 | 7.6473 | 7.5810 |
+ | 0.5215 | 37600 | 7.6619 | - |
+ | 0.5228 | 37700 | 7.6789 | - |
+ | 0.5242 | 37800 | 7.6842 | - |
+ | 0.5256 | 37900 | 7.6686 | - |
+ | 0.5270 | 38000 | 7.6677 | 7.5784 |
+ | 0.5284 | 38100 | 7.7113 | - |
+ | 0.5298 | 38200 | 7.6863 | - |
+ | 0.5312 | 38300 | 7.664 | - |
+ | 0.5325 | 38400 | 7.6928 | - |
+ | 0.5339 | 38500 | 7.685 | 7.5819 |
+ | 0.5353 | 38600 | 7.6507 | - |
+ | 0.5367 | 38700 | 7.6848 | - |
+ | 0.5381 | 38800 | 7.6435 | - |
+ | 0.5395 | 38900 | 7.6421 | - |
+ | 0.5409 | 39000 | 7.6883 | 7.5664 |
+ | 0.5423 | 39100 | 7.6907 | - |
+ | 0.5436 | 39200 | 7.6919 | - |
+ | 0.5450 | 39300 | 7.6956 | - |
+ | 0.5464 | 39400 | 7.6592 | - |
+ | 0.5478 | 39500 | 7.6488 | 7.5738 |
+ | 0.5492 | 39600 | 7.6918 | - |
+ | 0.5506 | 39700 | 7.6725 | - |
+ | 0.5520 | 39800 | 7.6804 | - |
+ | 0.5534 | 39900 | 7.6598 | - |
+ | 0.5547 | 40000 | 7.6888 | 7.5581 |
+ | 0.5561 | 40100 | 7.6732 | - |
+ | 0.5575 | 40200 | 7.7042 | - |
+ | 0.5589 | 40300 | 7.6626 | - |
+ | 0.5603 | 40400 | 7.7271 | - |
+ | 0.5617 | 40500 | 7.6753 | 7.5562 |
+ | 0.5631 | 40600 | 7.6521 | - |
+ | 0.5644 | 40700 | 7.667 | - |
+ | 0.5658 | 40800 | 7.6823 | - |
+ | 0.5672 | 40900 | 7.6635 | - |
+ | 0.5686 | 41000 | 7.6609 | 7.5553 |
+ | 0.5700 | 41100 | 7.6609 | - |
+ | 0.5714 | 41200 | 7.6712 | - |
+ | 0.5728 | 41300 | 7.6687 | - |
+ | 0.5742 | 41400 | 7.7182 | - |
+ | 0.5755 | 41500 | 7.6335 | 7.5660 |
+ | 0.5769 | 41600 | 7.6791 | - |
+ | 0.5783 | 41700 | 7.6509 | - |
+ | 0.5797 | 41800 | 7.6497 | - |
+ | 0.5811 | 41900 | 7.6514 | - |
+ | 0.5825 | 42000 | 7.6288 | 7.5552 |
+ | 0.5839 | 42100 | 7.6699 | - |
+ | 0.5852 | 42200 | 7.6824 | - |
+ | 0.5866 | 42300 | 7.68 | - |
+ | 0.5880 | 42400 | 7.661 | - |
+ | 0.5894 | 42500 | 7.6573 | 7.5487 |
+ | 0.5908 | 42600 | 7.6702 | - |
+ | 0.5922 | 42700 | 7.6573 | - |
+ | 0.5936 | 42800 | 7.6546 | - |
+ | 0.5950 | 42900 | 7.6424 | - |
+ | 0.5963 | 43000 | 7.6721 | 7.5504 |
+ | 0.5977 | 43100 | 7.6713 | - |
+ | 0.5991 | 43200 | 7.6695 | - |
+ | 0.6005 | 43300 | 7.6817 | - |
+ | 0.6019 | 43400 | 7.6484 | - |
+ | 0.6033 | 43500 | 7.6062 | 7.5481 |
+ | 0.6047 | 43600 | 7.6397 | - |
+ | 0.6061 | 43700 | 7.6555 | - |
+ | 0.6074 | 43800 | 7.6546 | - |
+ | 0.6088 | 43900 | 7.6781 | - |
+ | 0.6102 | 44000 | 7.6284 | 7.5399 |
+ | 0.6116 | 44100 | 7.666 | - |
+ | 0.6130 | 44200 | 7.6597 | - |
+ | 0.6144 | 44300 | 7.6651 | - |
+ | 0.6158 | 44400 | 7.6475 | - |
+ | 0.6171 | 44500 | 7.6565 | 7.5369 |
+ | 0.6185 | 44600 | 7.6336 | - |
+ | 0.6199 | 44700 | 7.6421 | - |
+ | 0.6213 | 44800 | 7.646 | - |
+ | 0.6227 | 44900 | 7.6319 | - |
+ | 0.6241 | 45000 | 7.664 | 7.5368 |
+ | 0.6255 | 45100 | 7.6515 | - |
+ | 0.6269 | 45200 | 7.6525 | - |
+ | 0.6282 | 45300 | 7.6534 | - |
+ | 0.6296 | 45400 | 7.655 | - |
+ | 0.6310 | 45500 | 7.6712 | 7.5278 |
+ | 0.6324 | 45600 | 7.6342 | - |
+ | 0.6338 | 45700 | 7.6077 | - |
+ | 0.6352 | 45800 | 7.6476 | - |
+ | 0.6366 | 45900 | 7.6412 | - |
+ | 0.6379 | 46000 | 7.6546 | 7.5331 |
+ | 0.6393 | 46100 | 7.6378 | - |
+ | 0.6407 | 46200 | 7.6572 | - |
+ | 0.6421 | 46300 | 7.6284 | - |
+ | 0.6435 | 46400 | 7.625 | - |
+ | 0.6449 | 46500 | 7.6526 | 7.5338 |
+ | 0.6463 | 46600 | 7.6172 | - |
+ | 0.6477 | 46700 | 7.6136 | - |
+ | 0.6490 | 46800 | 7.6428 | - |
+ | 0.6504 | 46900 | 7.6277 | - |
+ | 0.6518 | 47000 | 7.6903 | 7.5272 |
+ | 0.6532 | 47100 | 7.6313 | - |
+ | 0.6546 | 47200 | 7.6214 | - |
+ | 0.6560 | 47300 | 7.6044 | - |
+ | 0.6574 | 47400 | 7.6098 | - |
+ | 0.6588 | 47500 | 7.6477 | 7.5203 |
+ | 0.6601 | 47600 | 7.6454 | - |
+ | 0.6615 | 47700 | 7.6199 | - |
+ | 0.6629 | 47800 | 7.6119 | - |
+ | 0.6643 | 47900 | 7.6241 | - |
+ | 0.6657 | 48000 | 7.6414 | 7.5189 |
+ | 0.6671 | 48100 | 7.6629 | - |
+ | 0.6685 | 48200 | 7.6777 | - |
+ | 0.6698 | 48300 | 7.6217 | - |
+ | 0.6712 | 48400 | 7.6097 | - |
+ | 0.6726 | 48500 | 7.6449 | 7.5183 |
+ | 0.6740 | 48600 | 7.6131 | - |
+ | 0.6754 | 48700 | 7.622 | - |
+ | 0.6768 | 48800 | 7.6373 | - |
+ | 0.6782 | 48900 | 7.6193 | - |
+ | 0.6796 | 49000 | 7.6119 | 7.5209 |
+ | 0.6809 | 49100 | 7.6261 | - |
+ | 0.6823 | 49200 | 7.626 | - |
+ | 0.6837 | 49300 | 7.6232 | - |
+ | 0.6851 | 49400 | 7.5951 | - |
+ | 0.6865 | 49500 | 7.6368 | 7.5136 |
+ | 0.6879 | 49600 | 7.6641 | - |
+ | 0.6893 | 49700 | 7.6046 | - |
+ | 0.6906 | 49800 | 7.5923 | - |
+ | 0.6920 | 49900 | 7.6119 | - |
+ | 0.6934 | 50000 | 7.6301 | 7.5130 |
+ | 0.6948 | 50100 | 7.6288 | - |
+ | 0.6962 | 50200 | 7.6338 | - |
+ | 0.6976 | 50300 | 7.6137 | - |
+ | 0.6990 | 50400 | 7.6473 | - |
+ | 0.7004 | 50500 | 7.589 | 7.5153 |
+ | 0.7017 | 50600 | 7.6076 | - |
+ | 0.7031 | 50700 | 7.5906 | - |
+ | 0.7045 | 50800 | 7.6102 | - |
+ | 0.7059 | 50900 | 7.6463 | - |
+ | 0.7073 | 51000 | 7.6695 | 7.5098 |
+ | 0.7087 | 51100 | 7.5947 | - |
+ | 0.7101 | 51200 | 7.6097 | - |
+ | 0.7115 | 51300 | 7.6397 | - |
+ | 0.7128 | 51400 | 7.6072 | - |
+ | 0.7142 | 51500 | 7.6112 | 7.5103 |
+ | 0.7156 | 51600 | 7.639 | - |
+ | 0.7170 | 51700 | 7.6188 | - |
+ | 0.7184 | 51800 | 7.6198 | - |
+ | 0.7198 | 51900 | 7.6229 | - |
+ | 0.7212 | 52000 | 7.6323 | 7.5050 |
+ | 0.7225 | 52100 | 7.6275 | - |
+ | 0.7239 | 52200 | 7.6012 | - |
+ | 0.7253 | 52300 | 7.6187 | - |
+ | 0.7267 | 52400 | 7.6191 | - |
+ | 0.7281 | 52500 | 7.6232 | 7.5109 |
+ | 0.7295 | 52600 | 7.6199 | - |
+ | 0.7309 | 52700 | 7.5819 | - |
+ | 0.7323 | 52800 | 7.6474 | - |
+ | 0.7336 | 52900 | 7.6124 | - |
+ | 0.7350 | 53000 | 7.622 | 7.5000 |
+ | 0.7364 | 53100 | 7.6184 | - |
+ | 0.7378 | 53200 | 7.5761 | - |
+ | 0.7392 | 53300 | 7.5943 | - |
+ | 0.7406 | 53400 | 7.6209 | - |
+ | 0.7420 | 53500 | 7.6065 | 7.5055 |
+ | 0.7434 | 53600 | 7.6065 | - |
+ | 0.7447 | 53700 | 7.6285 | - |
+ | 0.7461 | 53800 | 7.641 | - |
+ | 0.7475 | 53900 | 7.633 | - |
+ | 0.7489 | 54000 | 7.6184 | 7.4995 |
+ | 0.7503 | 54100 | 7.6198 | - |
+ | 0.7517 | 54200 | 7.6239 | - |
+ | 0.7531 | 54300 | 7.6087 | - |
+ | 0.7544 | 54400 | 7.6112 | - |
+ | 0.7558 | 54500 | 7.6372 | 7.4957 |
+ | 0.7572 | 54600 | 7.5938 | - |
+ | 0.7586 | 54700 | 7.6091 | - |
+ | 0.7600 | 54800 | 7.622 | - |
+ | 0.7614 | 54900 | 7.6052 | - |
+ | 0.7628 | 55000 | 7.5775 | 7.4967 |
+ | 0.7642 | 55100 | 7.6484 | - |
+ | 0.7655 | 55200 | 7.5911 | - |
+ | 0.7669 | 55300 | 7.5966 | - |
+ | 0.7683 | 55400 | 7.5708 | - |
+ | 0.7697 | 55500 | 7.5905 | 7.4959 |
+ | 0.7711 | 55600 | 7.5858 | - |
+ | 0.7725 | 55700 | 7.6255 | - |
+ | 0.7739 | 55800 | 7.6169 | - |
+ | 0.7752 | 55900 | 7.6159 | - |
+ | 0.7766 | 56000 | 7.584 | 7.4929 |
+ | 0.7780 | 56100 | 7.6364 | - |
+ | 0.7794 | 56200 | 7.558 | - |
+ | 0.7808 | 56300 | 7.6095 | - |
+ | 0.7822 | 56400 | 7.6049 | - |
+ | 0.7836 | 56500 | 7.6079 | 7.4934 |
+ | 0.7850 | 56600 | 7.584 | - |
+ | 0.7863 | 56700 | 7.5543 | - |
+ | 0.7877 | 56800 | 7.5971 | - |
+ | 0.7891 | 56900 | 7.6395 | - |
+ | 0.7905 | 57000 | 7.6006 | 7.4900 |
+ | 0.7919 | 57100 | 7.6199 | - |
+ | 0.7933 | 57200 | 7.5938 | - |
+ | 0.7947 | 57300 | 7.602 | - |
+ | 0.7961 | 57400 | 7.6317 | - |
+ | 0.7974 | 57500 | 7.6125 | 7.4891 |
+ | 0.7988 | 57600 | 7.6031 | - |
+ | 0.8002 | 57700 | 7.6153 | - |
+ | 0.8016 | 57800 | 7.6141 | - |
+ | 0.8030 | 57900 | 7.5877 | - |
+ | 0.8044 | 58000 | 7.6051 | 7.4896 |
+ | 0.8058 | 58100 | 7.6065 | - |
+ | 0.8071 | 58200 | 7.5677 | - |
+ | 0.8085 | 58300 | 7.6035 | - |
+ | 0.8099 | 58400 | 7.6071 | - |
+ | 0.8113 | 58500 | 7.6214 | 7.4800 |
+ | 0.8127 | 58600 | 7.5914 | - |
+ | 0.8141 | 58700 | 7.6038 | - |
+ | 0.8155 | 58800 | 7.6206 | - |
+ | 0.8169 | 58900 | 7.6222 | - |
+ | 0.8182 | 59000 | 7.6128 | 7.4801 |
+ | 0.8196 | 59100 | 7.6109 | - |
+ | 0.8210 | 59200 | 7.5591 | - |
+ | 0.8224 | 59300 | 7.5794 | - |
+ | 0.8238 | 59400 | 7.6161 | - |
+ | 0.8252 | 59500 | 7.5689 | 7.4824 |
+ | 0.8266 | 59600 | 7.6009 | - |
+ | 0.8279 | 59700 | 7.6121 | - |
+ | 0.8293 | 59800 | 7.5872 | - |
+ | 0.8307 | 59900 | 7.6111 | - |
+ | 0.8321 | 60000 | 7.5339 | 7.4813 |
+ | 0.8335 | 60100 | 7.5739 | - |
+ | 0.8349 | 60200 | 7.5565 | - |
+ | 0.8363 | 60300 | 7.5637 | - |
+ | 0.8377 | 60400 | 7.5997 | - |
+ | 0.8390 | 60500 | 7.592 | 7.4829 |
+ | 0.8404 | 60600 | 7.6004 | - |
+ | 0.8418 | 60700 | 7.6007 | - |
+ | 0.8432 | 60800 | 7.602 | - |
+ | 0.8446 | 60900 | 7.5755 | - |
+ | 0.8460 | 61000 | 7.5771 | 7.4795 |
+ | 0.8474 | 61100 | 7.6143 | - |
+ | 0.8488 | 61200 | 7.6088 | - |
+ | 0.8501 | 61300 | 7.5555 | - |
+ | 0.8515 | 61400 | 7.5841 | - |
+ | 0.8529 | 61500 | 7.5979 | 7.4762 |
+ | 0.8543 | 61600 | 7.6403 | - |
+ | 0.8557 | 61700 | 7.5607 | - |
+ | 0.8571 | 61800 | 7.6151 | - |
+ | 0.8585 | 61900 | 7.6179 | - |
+ | 0.8598 | 62000 | 7.6152 | 7.4767 |
+ | 0.8612 | 62100 | 7.598 | - |
+ | 0.8626 | 62200 | 7.6013 | - |
+ | 0.8640 | 62300 | 7.5577 | - |
+ | 0.8654 | 62400 | 7.6108 | - |
+ | 0.8668 | 62500 | 7.5869 | 7.4716 |
+ | 0.8682 | 62600 | 7.559 | - |
+ | 0.8696 | 62700 | 7.5963 | - |
+ | 0.8709 | 62800 | 7.5884 | - |
+ | 0.8723 | 62900 | 7.5922 | - |
+ | 0.8737 | 63000 | 7.5915 | 7.4683 |
+ | 0.8751 | 63100 | 7.5473 | - |
+ | 0.8765 | 63200 | 7.5829 | - |
+ | 0.8779 | 63300 | 7.6122 | - |
+ | 0.8793 | 63400 | 7.5863 | - |
+ | 0.8806 | 63500 | 7.5764 | 7.4707 |
+ | 0.8820 | 63600 | 7.6258 | - |
+ | 0.8834 | 63700 | 7.5862 | - |
+ | 0.8848 | 63800 | 7.5977 | - |
+ | 0.8862 | 63900 | 7.5708 | - |
+ | 0.8876 | 64000 | 7.6024 | 7.4675 |
+ | 0.8890 | 64100 | 7.5625 | - |
+ | 0.8904 | 64200 | 7.5474 | - |
+ | 0.8917 | 64300 | 7.5978 | - |
+ | 0.8931 | 64400 | 7.5505 | - |
+ | 0.8945 | 64500 | 7.5741 | 7.4678 |
+ | 0.8959 | 64600 | 7.5763 | - |
+ | 0.8973 | 64700 | 7.5528 | - |
+ | 0.8987 | 64800 | 7.5787 | - |
+ | 0.9001 | 64900 | 7.5631 | - |
+ | 0.9015 | 65000 | 7.582 | 7.4724 |
+ | 0.9028 | 65100 | 7.5931 | - |
+ | 0.9042 | 65200 | 7.5977 | - |
+ | 0.9056 | 65300 | 7.572 | - |
+ | 0.9070 | 65400 | 7.6331 | - |
+ | 0.9084 | 65500 | 7.5503 | 7.4660 |
+ | 0.9098 | 65600 | 7.5987 | - |
+ | 0.9112 | 65700 | 7.611 | - |
+ | 0.9125 | 65800 | 7.563 | - |
+ | 0.9139 | 65900 | 7.5699 | - |
+ | 0.9153 | 66000 | 7.5942 | 7.4677 |
+ | 0.9167 | 66100 | 7.6119 | - |
+ | 0.9181 | 66200 | 7.5873 | - |
+ | 0.9195 | 66300 | 7.6036 | - |
+ | 0.9209 | 66400 | 7.5827 | - |
+ | 0.9223 | 66500 | 7.6103 | 7.4649 |
+ | 0.9236 | 66600 | 7.604 | - |
+ | 0.9250 | 66700 | 7.6129 | - |
+ | 0.9264 | 66800 | 7.5668 | - |
+ | 0.9278 | 66900 | 7.5699 | - |
+ | 0.9292 | 67000 | 7.6045 | 7.4626 |
+ | 0.9306 | 67100 | 7.5973 | - |
+ | 0.9320 | 67200 | 7.5951 | - |
+ | 0.9333 | 67300 | 7.5635 | - |
+ | 0.9347 | 67400 | 7.5915 | - |
+ | 0.9361 | 67500 | 7.5577 | 7.4619 |
+ | 0.9375 | 67600 | 7.5921 | - |
+ | 0.9389 | 67700 | 7.5888 | - |
+ | 0.9403 | 67800 | 7.5838 | - |
+ | 0.9417 | 67900 | 7.5648 | - |
+ | 0.9431 | 68000 | 7.5537 | 7.4616 |
+ | 0.9444 | 68100 | 7.5809 | - |
+ | 0.9458 | 68200 | 7.5882 | - |
+ | 0.9472 | 68300 | 7.5372 | - |
+ | 0.9486 | 68400 | 7.584 | - |
+ | 0.9500 | 68500 | 7.5821 | 7.4607 |
+ | 0.9514 | 68600 | 7.5663 | - |
+ | 0.9528 | 68700 | 7.5734 | - |
+ | 0.9542 | 68800 | 7.6026 | - |
+ | 0.9555 | 68900 | 7.5928 | - |
+ | 0.9569 | 69000 | 7.5415 | 7.4615 |
+ | 0.9583 | 69100 | 7.5785 | - |
+ | 0.9597 | 69200 | 7.5925 | - |
+ | 0.9611 | 69300 | 7.5922 | - |
+ | 0.9625 | 69400 | 7.5559 | - |
+ | 0.9639 | 69500 | 7.5759 | 7.4594 |
+ | 0.9652 | 69600 | 7.5753 | - |
+ | 0.9666 | 69700 | 7.6039 | - |
+ | 0.9680 | 69800 | 7.5791 | - |
+ | 0.9694 | 69900 | 7.5905 | - |
+ | 0.9708 | 70000 | 7.57 | 7.4592 |
+ | 0.9722 | 70100 | 7.5804 | - |
+ | 0.9736 | 70200 | 7.5709 | - |
+ | 0.9750 | 70300 | 7.582 | - |
+ | 0.9763 | 70400 | 7.6233 | - |
+ | 0.9777 | 70500 | 7.556 | 7.4582 |
+ | 0.9791 | 70600 | 7.6028 | - |
+ | 0.9805 | 70700 | 7.6149 | - |
+ | 0.9819 | 70800 | 7.5763 | - |
+ | 0.9833 | 70900 | 7.5904 | - |
+ | 0.9847 | 71000 | 7.5607 | 7.4590 |
+ | 0.9860 | 71100 | 7.5826 | - |
+ | 0.9874 | 71200 | 7.5704 | - |
+ | 0.9888 | 71300 | 7.5656 | - |
+ | 0.9902 | 71400 | 7.5879 | - |
+ | 0.9916 | 71500 | 7.5943 | 7.4583 |
+ | 0.9930 | 71600 | 7.5359 | - |
+ | 0.9944 | 71700 | 7.6152 | - |
+ | 0.9958 | 71800 | 7.5791 | - |
+ | 0.9971 | 71900 | 7.5845 | - |
+ | 0.9985 | 72000 | 7.5487 | 7.4580 |
+ | 0.9999 | 72100 | 7.6124 | - |
+
+ </details>
+
+ ### Framework Versions
+ - Python: 3.12.3
+ - Sentence Transformers: 5.1.0
+ - Transformers: 4.55.4
+ - PyTorch: 2.5.1+cu121
+ - Accelerate: 1.10.1
+ - Datasets: 4.0.0
+ - Tokenizers: 0.21.4
+
+ ## Citation
+
+ ### BibTeX
+
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+
+ #### CoSENTLoss
+ ```bibtex
+ @online{kexuefm-8847,
+     title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
+     author={Su Jianlin},
+     year={2022},
+     month={Jan},
+     url={https://kexue.fm/archives/8847},
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config.json ADDED
@@ -0,0 +1,25 @@
+ {
+   "architectures": [
+     "BertModel"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 384,
+   "initializer_range": 0.02,
+   "intermediate_size": 1536,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 6,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.55.4",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 30522
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "__version__": {
+     "sentence_transformers": "5.1.0",
+     "transformers": "4.55.4",
+     "pytorch": "2.5.1+cu121"
+   },
+   "model_type": "SentenceTransformer",
+   "prompts": {
+     "query": "",
+     "document": ""
+   },
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:502661ee77213b11edc2bed06b80addfa64f58600d0ff24cbf6455041d272128
+ size 90864192
modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   },
+   {
+     "idx": 2,
+     "name": "2",
+     "path": "2_Normalize",
+     "type": "sentence_transformers.models.Normalize"
+   }
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 256,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
+ {
+   "cls_token": {
+     "content": "[CLS]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "[MASK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "[PAD]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "[SEP]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "[UNK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,65 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "[PAD]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "101": {
+       "content": "[CLS]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "102": {
+       "content": "[SEP]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "103": {
+       "content": "[MASK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "clean_up_tokenization_spaces": false,
+   "cls_token": "[CLS]",
+   "do_basic_tokenize": true,
+   "do_lower_case": true,
+   "extra_special_tokens": {},
+   "mask_token": "[MASK]",
+   "max_length": 128,
+   "model_max_length": 256,
+   "never_split": null,
+   "pad_to_multiple_of": null,
+   "pad_token": "[PAD]",
+   "pad_token_type_id": 0,
+   "padding_side": "right",
+   "sep_token": "[SEP]",
+   "stride": 0,
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "truncation_side": "right",
+   "truncation_strategy": "longest_first",
+   "unk_token": "[UNK]"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff