KhaledReda committed on
Commit f7e4afe · verified · 1 Parent(s): 7213d5e

Upload folder using huggingface_hub

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
    "word_embedding_dimension": 384,
    "pooling_mode_cls_token": false,
    "pooling_mode_mean_tokens": true,
    "pooling_mode_max_tokens": false,
    "pooling_mode_mean_sqrt_len_tokens": false,
    "pooling_mode_weightedmean_tokens": false,
    "pooling_mode_lasttoken": false,
    "include_prompt": true
}
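
This configuration enables mean pooling only (CLS, max, weighted-mean, and last-token pooling are all disabled). As a minimal sketch of what that setting computes, the snippet below mean-pools the base model's token embeddings using the attention mask; the model name follows the card, and the input strings are purely illustrative.

```python
# Minimal sketch of masked mean pooling, the mode enabled in the config above.
# Assumes the base model named in the card; the example strings are illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
model = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")

batch = tokenizer(["sienna cardigan", "pajama pant"], padding=True, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state          # (batch, seq_len, 384)

mask = batch["attention_mask"].unsqueeze(-1).float()             # (batch, seq_len, 1)
mean_pooled = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)  # (batch, 384)
print(mean_pooled.shape)                                          # torch.Size([2, 384])
```
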
README.md ADDED
@@ -0,0 +1,1658 @@
---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:16131988
- loss:CoSENTLoss
base_model: sentence-transformers/all-MiniLM-L6-v2
widget:
- source_sentence: sienna cardigan
  sentences:
  - parabens free lipstick
  - pajama pant
  - appetizer plate
- source_sentence: peach cover up
  sentences:
  - teddy stuffed doll
  - necklace
  - fabric tablecloth
- source_sentence: plain shawl
  sentences:
  - daily combing hair brush
  - flawlessly blush
  - microwave safe dining set
- source_sentence: assorted candy
  sentences:
  - backless blouse
  - viscose scourer
  - papaya body splash
- source_sentence: knitwear
  sentences:
  - unisex trousers
  - daily exfoliating scrub
  - stussys ball destiny rug
datasets:
- KhaledReda/pairs_three_scores_v10_synonyms
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---

# all-MiniLM-L6-v12-pair_score

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) on the [pairs_three_scores_v10_synonyms](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v10_synonyms) dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision c9745ed1d9f207416be6d2e6f8de32d1f16199bf -->
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - [pairs_three_scores_v10_synonyms](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v10_synonyms)
- **Language:** en
- **License:** apache-2.0

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

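The same three-module stack (Transformer → Pooling → Normalize) can also be assembled by hand with the sentence-transformers modules API. The sketch below is only an equivalent construction for illustration, not a required way to load this repository.

```python
# Minimal sketch: rebuild the Transformer -> Pooling -> Normalize stack shown above.
from sentence_transformers import SentenceTransformer, models

word_embedding_model = models.Transformer(
    "sentence-transformers/all-MiniLM-L6-v2", max_seq_length=256
)
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),  # 384
    pooling_mode="mean",
)
model = SentenceTransformer(
    modules=[word_embedding_model, pooling_model, models.Normalize()]
)
print(model)
```
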
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'knitwear',
    'unisex trousers',
    'daily exfoliating scrub',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.7775, 0.4984],
#         [0.7775, 1.0000, 0.4737],
#         [0.4984, 0.4737, 1.0000]])
```

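Because the training pairs are short retail-style queries and product phrases, a natural downstream use is ranking catalogue items against a query. The sketch below keeps the placeholder model id from the snippet above; the catalogue strings and query are illustrative, not taken from the dataset.

```python
# Hedged sketch: rank a small product catalogue against a query with this model.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence_transformers_model_id")  # replace with the uploaded repo id

catalogue = ["peach cover up", "fabric tablecloth", "daily exfoliating scrub", "necklace"]
query = "beach cover up"

doc_embeddings = model.encode(catalogue)
query_embedding = model.encode([query])

scores = model.similarity(query_embedding, doc_embeddings)[0]  # one cosine score per item
for item, score in sorted(zip(catalogue, scores.tolist()), key=lambda pair: -pair[1]):
    print(f"{score:.3f}  {item}")
```
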
<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### pairs_three_scores_v10_synonyms

* Dataset: [pairs_three_scores_v10_synonyms](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v10_synonyms) at [7a03b60](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v10_synonyms/tree/7a03b606bff087ad13af0b23ffaf2a3d3252f42c)
* Size: 16,131,988 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1 | sentence2 | score |
  |:--------|:----------|:----------|:------|
  | type    | string | string | float |
  | details | <ul><li>min: 3 tokens</li><li>mean: 5.63 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.75 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 0.15</li><li>mean: 0.36</li><li>max: 1.0</li></ul> |
* Samples:
  | sentence1 | sentence2 | score |
  |:----------|:----------|:------|
  | <code>kango gloves</code> | <code>balance board</code> | <code>0.27</code> |
  | <code>nylon fanny pack</code> | <code>internal pockets bag</code> | <code>0.33</code> |
  | <code>marshmallow scent perfume</code> | <code>handmade bowl</code> | <code>0.21</code> |
* Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "pairwise_cos_sim"
  }
  ```

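Each training sample is a (sentence1, sentence2, score) triple, which is the format CoSENTLoss consumes. The sketch below shows how such a dataset is typically wired to CoSENTLoss with the sentence-transformers trainer; the `train` split name is an assumption about the dataset layout, and no training arguments from this card are implied here.

```python
# Hedged sketch: pairing a (sentence1, sentence2, score) dataset with CoSENTLoss.
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
train_dataset = load_dataset("KhaledReda/pairs_three_scores_v10_synonyms", split="train")  # split name assumed

loss = CoSENTLoss(model, scale=20.0)  # scale matches the parameters listed above

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_dataset,  # columns: sentence1, sentence2, score
    loss=loss,
)
trainer.train()
```
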
### Evaluation Dataset

#### pairs_three_scores_v10_synonyms

* Dataset: [pairs_three_scores_v10_synonyms](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v10_synonyms) at [7a03b60](https://huggingface.co/datasets/KhaledReda/pairs_three_scores_v10_synonyms/tree/7a03b606bff087ad13af0b23ffaf2a3d3252f42c)
* Size: 81,066 evaluation samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1 | sentence2 | score |
  |:--------|:----------|:----------|:------|
  | type    | string | string | float |
  | details | <ul><li>min: 3 tokens</li><li>mean: 5.56 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.71 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 0.16</li><li>mean: 0.34</li><li>max: 1.0</li></ul> |
* Samples:
  | sentence1 | sentence2 | score |
  |:----------|:----------|:------|
  | <code>tahini brownies</code> | <code>laylo tee</code> | <code>0.25</code> |
  | <code>vinyl sticker candy</code> | <code>buttercream cupcake</code> | <code>0.3</code> |
  | <code>relax sofa</code> | <code>stoneware plate</code> | <code>0.23</code> |
* Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "pairwise_cos_sim"
  }
  ```

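A common way to score a model on held-out (sentence1, sentence2, score) pairs like these is an embedding-similarity evaluator. The sketch below is one possible setup; the `test` split name and the evaluator name are assumptions, not taken from this card.

```python
# Hedged sketch: evaluate on held-out scored pairs with EmbeddingSimilarityEvaluator.
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("sentence_transformers_model_id")
eval_dataset = load_dataset("KhaledReda/pairs_three_scores_v10_synonyms", split="test")  # split name assumed

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=eval_dataset["sentence1"],
    sentences2=eval_dataset["sentence2"],
    scores=eval_dataset["score"],
    name="pairs-eval",
)
print(evaluator(model))  # correlation between model similarities and gold scores
```
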
### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `learning_rate`: 2e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True

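Expressed as trainer arguments, the non-default values above map roughly to the following; the output directory is illustrative, everything not listed stays at its default, and a recent sentence-transformers/transformers version is assumed.

```python
# Hedged sketch: the non-default hyperparameters above as training arguments.
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v12-pair_score",  # illustrative output path
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
)
```
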
214
+ #### All Hyperparameters
215
+ <details><summary>Click to expand</summary>
216
+
217
+ - `overwrite_output_dir`: False
218
+ - `do_predict`: False
219
+ - `eval_strategy`: steps
220
+ - `prediction_loss_only`: True
221
+ - `per_device_train_batch_size`: 128
222
+ - `per_device_eval_batch_size`: 128
223
+ - `per_gpu_train_batch_size`: None
224
+ - `per_gpu_eval_batch_size`: None
225
+ - `gradient_accumulation_steps`: 1
226
+ - `eval_accumulation_steps`: None
227
+ - `torch_empty_cache_steps`: None
228
+ - `learning_rate`: 2e-05
229
+ - `weight_decay`: 0.0
230
+ - `adam_beta1`: 0.9
231
+ - `adam_beta2`: 0.999
232
+ - `adam_epsilon`: 1e-08
233
+ - `max_grad_norm`: 1.0
234
+ - `num_train_epochs`: 1
235
+ - `max_steps`: -1
236
+ - `lr_scheduler_type`: linear
237
+ - `lr_scheduler_kwargs`: {}
238
+ - `warmup_ratio`: 0.1
239
+ - `warmup_steps`: 0
240
+ - `log_level`: passive
241
+ - `log_level_replica`: warning
242
+ - `log_on_each_node`: True
243
+ - `logging_nan_inf_filter`: True
244
+ - `save_safetensors`: True
245
+ - `save_on_each_node`: False
246
+ - `save_only_model`: False
247
+ - `restore_callback_states_from_checkpoint`: False
248
+ - `no_cuda`: False
249
+ - `use_cpu`: False
250
+ - `use_mps_device`: False
251
+ - `seed`: 42
252
+ - `data_seed`: None
253
+ - `jit_mode_eval`: False
254
+ - `use_ipex`: False
255
+ - `bf16`: False
256
+ - `fp16`: True
257
+ - `fp16_opt_level`: O1
258
+ - `half_precision_backend`: auto
259
+ - `bf16_full_eval`: False
260
+ - `fp16_full_eval`: False
261
+ - `tf32`: None
262
+ - `local_rank`: 0
263
+ - `ddp_backend`: None
264
+ - `tpu_num_cores`: None
265
+ - `tpu_metrics_debug`: False
266
+ - `debug`: []
267
+ - `dataloader_drop_last`: False
268
+ - `dataloader_num_workers`: 0
269
+ - `dataloader_prefetch_factor`: None
270
+ - `past_index`: -1
271
+ - `disable_tqdm`: False
272
+ - `remove_unused_columns`: True
273
+ - `label_names`: None
274
+ - `load_best_model_at_end`: False
275
+ - `ignore_data_skip`: False
276
+ - `fsdp`: []
277
+ - `fsdp_min_num_params`: 0
278
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
279
+ - `fsdp_transformer_layer_cls_to_wrap`: None
280
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
281
+ - `deepspeed`: None
282
+ - `label_smoothing_factor`: 0.0
283
+ - `optim`: adamw_torch
284
+ - `optim_args`: None
285
+ - `adafactor`: False
286
+ - `group_by_length`: False
287
+ - `length_column_name`: length
288
+ - `ddp_find_unused_parameters`: None
289
+ - `ddp_bucket_cap_mb`: None
290
+ - `ddp_broadcast_buffers`: False
291
+ - `dataloader_pin_memory`: True
292
+ - `dataloader_persistent_workers`: False
293
+ - `skip_memory_metrics`: True
294
+ - `use_legacy_prediction_loop`: False
295
+ - `push_to_hub`: False
296
+ - `resume_from_checkpoint`: None
297
+ - `hub_model_id`: None
298
+ - `hub_strategy`: every_save
299
+ - `hub_private_repo`: None
300
+ - `hub_always_push`: False
301
+ - `hub_revision`: None
302
+ - `gradient_checkpointing`: False
303
+ - `gradient_checkpointing_kwargs`: None
304
+ - `include_inputs_for_metrics`: False
305
+ - `include_for_metrics`: []
306
+ - `eval_do_concat_batches`: True
307
+ - `fp16_backend`: auto
308
+ - `push_to_hub_model_id`: None
309
+ - `push_to_hub_organization`: None
310
+ - `mp_parameters`:
311
+ - `auto_find_batch_size`: False
312
+ - `full_determinism`: False
313
+ - `torchdynamo`: None
314
+ - `ray_scope`: last
315
+ - `ddp_timeout`: 1800
316
+ - `torch_compile`: False
317
+ - `torch_compile_backend`: None
318
+ - `torch_compile_mode`: None
319
+ - `include_tokens_per_second`: False
320
+ - `include_num_input_tokens_seen`: False
321
+ - `neftune_noise_alpha`: None
322
+ - `optim_target_modules`: None
323
+ - `batch_eval_metrics`: False
324
+ - `eval_on_start`: False
325
+ - `use_liger_kernel`: False
326
+ - `liger_kernel_config`: None
327
+ - `eval_use_gather_object`: False
328
+ - `average_tokens_across_devices`: False
329
+ - `prompts`: None
330
+ - `batch_sampler`: batch_sampler
331
+ - `multi_dataset_batch_sampler`: proportional
332
+ - `router_mapping`: {}
333
+ - `learning_rate_mapping`: {}
334
+
335
+ </details>
336
+
337
+ ### Training Logs
338
+ <details><summary>Click to expand</summary>
339
+
340
+ | Epoch | Step | Training Loss |
341
+ |:------:|:------:|:-------------:|
342
+ | 0.0008 | 100 | 12.0412 |
343
+ | 0.0016 | 200 | 11.9152 |
344
+ | 0.0024 | 300 | 11.912 |
345
+ | 0.0032 | 400 | 11.5195 |
346
+ | 0.0040 | 500 | 11.2838 |
347
+ | 0.0048 | 600 | 11.1474 |
348
+ | 0.0056 | 700 | 10.7648 |
349
+ | 0.0063 | 800 | 10.4413 |
350
+ | 0.0071 | 900 | 10.2946 |
351
+ | 0.0079 | 1000 | 10.0453 |
352
+ | 0.0087 | 1100 | 9.7748 |
353
+ | 0.0095 | 1200 | 9.6514 |
354
+ | 0.0103 | 1300 | 9.4917 |
355
+ | 0.0111 | 1400 | 9.2272 |
356
+ | 0.0119 | 1500 | 9.0495 |
357
+ | 0.0127 | 1600 | 8.9256 |
358
+ | 0.0135 | 1700 | 8.8294 |
359
+ | 0.0143 | 1800 | 8.7685 |
360
+ | 0.0151 | 1900 | 8.7427 |
361
+ | 0.0159 | 2000 | 8.7067 |
362
+ | 0.0167 | 2100 | 8.7053 |
363
+ | 0.0175 | 2200 | 8.6753 |
364
+ | 0.0182 | 2300 | 8.6717 |
365
+ | 0.0190 | 2400 | 8.6575 |
366
+ | 0.0198 | 2500 | 8.6501 |
367
+ | 0.0206 | 2600 | 8.6424 |
368
+ | 0.0214 | 2700 | 8.6203 |
369
+ | 0.0222 | 2800 | 8.6342 |
370
+ | 0.0230 | 2900 | 8.6031 |
371
+ | 0.0238 | 3000 | 8.6121 |
372
+ | 0.0246 | 3100 | 8.5977 |
373
+ | 0.0254 | 3200 | 8.5898 |
374
+ | 0.0262 | 3300 | 8.5821 |
375
+ | 0.0270 | 3400 | 8.5881 |
376
+ | 0.0278 | 3500 | 8.5784 |
377
+ | 0.0286 | 3600 | 8.5604 |
378
+ | 0.0294 | 3700 | 8.5606 |
379
+ | 0.0302 | 3800 | 8.5429 |
380
+ | 0.0309 | 3900 | 8.5491 |
381
+ | 0.0317 | 4000 | 8.5483 |
382
+ | 0.0325 | 4100 | 8.5394 |
383
+ | 0.0333 | 4200 | 8.5343 |
384
+ | 0.0341 | 4300 | 8.5251 |
385
+ | 0.0349 | 4400 | 8.5183 |
386
+ | 0.0357 | 4500 | 8.5069 |
387
+ | 0.0365 | 4600 | 8.5296 |
388
+ | 0.0373 | 4700 | 8.501 |
389
+ | 0.0381 | 4800 | 8.5133 |
390
+ | 0.0389 | 4900 | 8.4931 |
391
+ | 0.0397 | 5000 | 8.4872 |
392
+ | 0.0405 | 5100 | 8.5026 |
393
+ | 0.0413 | 5200 | 8.4917 |
394
+ | 0.0421 | 5300 | 8.4758 |
395
+ | 0.0428 | 5400 | 8.4826 |
396
+ | 0.0436 | 5500 | 8.4639 |
397
+ | 0.0444 | 5600 | 8.4732 |
398
+ | 0.0452 | 5700 | 8.4576 |
399
+ | 0.0460 | 5800 | 8.4761 |
400
+ | 0.0468 | 5900 | 8.4504 |
401
+ | 0.0476 | 6000 | 8.458 |
402
+ | 0.0484 | 6100 | 8.4411 |
403
+ | 0.0492 | 6200 | 8.4336 |
404
+ | 0.0500 | 6300 | 8.4425 |
405
+ | 0.0508 | 6400 | 8.4439 |
406
+ | 0.0516 | 6500 | 8.4314 |
407
+ | 0.0524 | 6600 | 8.426 |
408
+ | 0.0532 | 6700 | 8.4333 |
409
+ | 0.0540 | 6800 | 8.4237 |
410
+ | 0.0547 | 6900 | 8.4138 |
411
+ | 0.0555 | 7000 | 8.404 |
412
+ | 0.0563 | 7100 | 8.4076 |
413
+ | 0.0571 | 7200 | 8.4042 |
414
+ | 0.0579 | 7300 | 8.4052 |
415
+ | 0.0587 | 7400 | 8.4037 |
416
+ | 0.0595 | 7500 | 8.4017 |
417
+ | 0.0603 | 7600 | 8.4103 |
418
+ | 0.0611 | 7700 | 8.3919 |
419
+ | 0.0619 | 7800 | 8.3827 |
420
+ | 0.0627 | 7900 | 8.371 |
421
+ | 0.0635 | 8000 | 8.3792 |
422
+ | 0.0643 | 8100 | 8.3737 |
423
+ | 0.0651 | 8200 | 8.376 |
424
+ | 0.0659 | 8300 | 8.3804 |
425
+ | 0.0666 | 8400 | 8.3679 |
426
+ | 0.0674 | 8500 | 8.3731 |
427
+ | 0.0682 | 8600 | 8.3716 |
428
+ | 0.0690 | 8700 | 8.372 |
429
+ | 0.0698 | 8800 | 8.3806 |
430
+ | 0.0706 | 8900 | 8.3491 |
431
+ | 0.0714 | 9000 | 8.3285 |
432
+ | 0.0722 | 9100 | 8.3454 |
433
+ | 0.0730 | 9200 | 8.3536 |
434
+ | 0.0738 | 9300 | 8.3538 |
435
+ | 0.0746 | 9400 | 8.3495 |
436
+ | 0.0754 | 9500 | 8.3395 |
437
+ | 0.0762 | 9600 | 8.3342 |
438
+ | 0.0770 | 9700 | 8.3125 |
439
+ | 0.0778 | 9800 | 8.3323 |
440
+ | 0.0786 | 9900 | 8.3057 |
441
+ | 0.0793 | 10000 | 8.3212 |
442
+ | 0.0801 | 10100 | 8.3136 |
443
+ | 0.0809 | 10200 | 8.3163 |
444
+ | 0.0817 | 10300 | 8.326 |
445
+ | 0.0825 | 10400 | 8.3099 |
446
+ | 0.0833 | 10500 | 8.3121 |
447
+ | 0.0841 | 10600 | 8.3165 |
448
+ | 0.0849 | 10700 | 8.3076 |
449
+ | 0.0857 | 10800 | 8.298 |
450
+ | 0.0865 | 10900 | 8.2938 |
451
+ | 0.0873 | 11000 | 8.289 |
452
+ | 0.0881 | 11100 | 8.2997 |
453
+ | 0.0889 | 11200 | 8.2932 |
454
+ | 0.0897 | 11300 | 8.2841 |
455
+ | 0.0905 | 11400 | 8.2825 |
456
+ | 0.0912 | 11500 | 8.2785 |
457
+ | 0.0920 | 11600 | 8.3002 |
458
+ | 0.0928 | 11700 | 8.2711 |
459
+ | 0.0936 | 11800 | 8.2854 |
460
+ | 0.0944 | 11900 | 8.2745 |
461
+ | 0.0952 | 12000 | 8.2641 |
462
+ | 0.0960 | 12100 | 8.2712 |
463
+ | 0.0968 | 12200 | 8.2613 |
464
+ | 0.0976 | 12300 | 8.2779 |
465
+ | 0.0984 | 12400 | 8.2499 |
466
+ | 0.0992 | 12500 | 8.2666 |
467
+ | 0.1000 | 12600 | 8.2553 |
468
+ | 0.1008 | 12700 | 8.2421 |
469
+ | 0.1016 | 12800 | 8.2562 |
470
+ | 0.1024 | 12900 | 8.2483 |
471
+ | 0.1031 | 13000 | 8.2657 |
472
+ | 0.1039 | 13100 | 8.2454 |
473
+ | 0.1047 | 13200 | 8.2381 |
474
+ | 0.1055 | 13300 | 8.2406 |
475
+ | 0.1063 | 13400 | 8.229 |
476
+ | 0.1071 | 13500 | 8.2139 |
477
+ | 0.1079 | 13600 | 8.2308 |
478
+ | 0.1087 | 13700 | 8.2442 |
479
+ | 0.1095 | 13800 | 8.2102 |
480
+ | 0.1103 | 13900 | 8.2337 |
481
+ | 0.1111 | 14000 | 8.234 |
482
+ | 0.1119 | 14100 | 8.2024 |
483
+ | 0.1127 | 14200 | 8.2114 |
484
+ | 0.1135 | 14300 | 8.2167 |
485
+ | 0.1143 | 14400 | 8.2168 |
486
+ | 0.1151 | 14500 | 8.2264 |
487
+ | 0.1158 | 14600 | 8.2055 |
488
+ | 0.1166 | 14700 | 8.2172 |
489
+ | 0.1174 | 14800 | 8.1961 |
490
+ | 0.1182 | 14900 | 8.1905 |
491
+ | 0.1190 | 15000 | 8.1855 |
492
+ | 0.1198 | 15100 | 8.1984 |
493
+ | 0.1206 | 15200 | 8.1847 |
494
+ | 0.1214 | 15300 | 8.1916 |
495
+ | 0.1222 | 15400 | 8.1725 |
496
+ | 0.1230 | 15500 | 8.2086 |
497
+ | 0.1238 | 15600 | 8.177 |
498
+ | 0.1246 | 15700 | 8.1659 |
499
+ | 0.1254 | 15800 | 8.1787 |
500
+ | 0.1262 | 15900 | 8.1683 |
501
+ | 0.1270 | 16000 | 8.1752 |
502
+ | 0.1277 | 16100 | 8.1832 |
503
+ | 0.1285 | 16200 | 8.1832 |
504
+ | 0.1293 | 16300 | 8.1786 |
505
+ | 0.1301 | 16400 | 8.1713 |
506
+ | 0.1309 | 16500 | 8.1666 |
507
+ | 0.1317 | 16600 | 8.1555 |
508
+ | 0.1325 | 16700 | 8.1553 |
509
+ | 0.1333 | 16800 | 8.1702 |
510
+ | 0.1341 | 16900 | 8.1586 |
511
+ | 0.1349 | 17000 | 8.1578 |
512
+ | 0.1357 | 17100 | 8.1531 |
513
+ | 0.1365 | 17200 | 8.1681 |
514
+ | 0.1373 | 17300 | 8.1509 |
515
+ | 0.1381 | 17400 | 8.147 |
516
+ | 0.1389 | 17500 | 8.1465 |
517
+ | 0.1396 | 17600 | 8.1658 |
518
+ | 0.1404 | 17700 | 8.1514 |
519
+ | 0.1412 | 17800 | 8.1463 |
520
+ | 0.1420 | 17900 | 8.1372 |
521
+ | 0.1428 | 18000 | 8.1369 |
522
+ | 0.1436 | 18100 | 8.1435 |
523
+ | 0.1444 | 18200 | 8.147 |
524
+ | 0.1452 | 18300 | 8.1396 |
525
+ | 0.1460 | 18400 | 8.1538 |
526
+ | 0.1468 | 18500 | 8.1308 |
527
+ | 0.1476 | 18600 | 8.1696 |
528
+ | 0.1484 | 18700 | 8.1229 |
529
+ | 0.1492 | 18800 | 8.1333 |
530
+ | 0.1500 | 18900 | 8.1217 |
531
+ | 0.1508 | 19000 | 8.1189 |
532
+ | 0.1515 | 19100 | 8.1132 |
533
+ | 0.1523 | 19200 | 8.1085 |
534
+ | 0.1531 | 19300 | 8.141 |
535
+ | 0.1539 | 19400 | 8.1169 |
536
+ | 0.1547 | 19500 | 8.1234 |
537
+ | 0.1555 | 19600 | 8.1328 |
538
+ | 0.1563 | 19700 | 8.1204 |
539
+ | 0.1571 | 19800 | 8.1107 |
540
+ | 0.1579 | 19900 | 8.1383 |
541
+ | 0.1587 | 20000 | 8.1167 |
542
+ | 0.1595 | 20100 | 8.1088 |
543
+ | 0.1603 | 20200 | 8.0967 |
544
+ | 0.1611 | 20300 | 8.1275 |
545
+ | 0.1619 | 20400 | 8.103 |
546
+ | 0.1627 | 20500 | 8.0989 |
547
+ | 0.1635 | 20600 | 8.1116 |
548
+ | 0.1642 | 20700 | 8.0952 |
549
+ | 0.1650 | 20800 | 8.1064 |
550
+ | 0.1658 | 20900 | 8.0833 |
551
+ | 0.1666 | 21000 | 8.0924 |
552
+ | 0.1674 | 21100 | 8.083 |
553
+ | 0.1682 | 21200 | 8.1075 |
554
+ | 0.1690 | 21300 | 8.07 |
555
+ | 0.1698 | 21400 | 8.0769 |
556
+ | 0.1706 | 21500 | 8.1305 |
557
+ | 0.1714 | 21600 | 8.0656 |
558
+ | 0.1722 | 21700 | 8.0887 |
559
+ | 0.1730 | 21800 | 8.0884 |
560
+ | 0.1738 | 21900 | 8.0961 |
561
+ | 0.1746 | 22000 | 8.0807 |
562
+ | 0.1754 | 22100 | 8.0795 |
563
+ | 0.1761 | 22200 | 8.0833 |
564
+ | 0.1769 | 22300 | 8.1031 |
565
+ | 0.1777 | 22400 | 8.0857 |
566
+ | 0.1785 | 22500 | 8.0878 |
567
+ | 0.1793 | 22600 | 8.07 |
568
+ | 0.1801 | 22700 | 8.0943 |
569
+ | 0.1809 | 22800 | 8.0835 |
570
+ | 0.1817 | 22900 | 8.0973 |
571
+ | 0.1825 | 23000 | 8.081 |
572
+ | 0.1833 | 23100 | 8.0924 |
573
+ | 0.1841 | 23200 | 8.0438 |
574
+ | 0.1849 | 23300 | 8.0755 |
575
+ | 0.1857 | 23400 | 8.0749 |
576
+ | 0.1865 | 23500 | 8.0786 |
577
+ | 0.1873 | 23600 | 8.0558 |
578
+ | 0.1880 | 23700 | 8.0627 |
579
+ | 0.1888 | 23800 | 8.0876 |
580
+ | 0.1896 | 23900 | 8.0635 |
581
+ | 0.1904 | 24000 | 8.041 |
582
+ | 0.1912 | 24100 | 8.0657 |
583
+ | 0.1920 | 24200 | 8.0608 |
584
+ | 0.1928 | 24300 | 8.0688 |
585
+ | 0.1936 | 24400 | 8.0401 |
586
+ | 0.1944 | 24500 | 8.0585 |
587
+ | 0.1952 | 24600 | 8.0621 |
588
+ | 0.1960 | 24700 | 8.0194 |
589
+ | 0.1968 | 24800 | 8.0729 |
590
+ | 0.1976 | 24900 | 8.0449 |
591
+ | 0.1984 | 25000 | 8.041 |
592
+ | 0.1992 | 25100 | 8.074 |
593
+ | 0.1999 | 25200 | 8.0483 |
594
+ | 0.2007 | 25300 | 8.0656 |
595
+ | 0.2015 | 25400 | 8.0639 |
596
+ | 0.2023 | 25500 | 8.0359 |
597
+ | 0.2031 | 25600 | 8.019 |
598
+ | 0.2039 | 25700 | 8.0337 |
599
+ | 0.2047 | 25800 | 8.036 |
600
+ | 0.2055 | 25900 | 8.0224 |
601
+ | 0.2063 | 26000 | 8.0444 |
602
+ | 0.2071 | 26100 | 8.0227 |
603
+ | 0.2079 | 26200 | 8.0208 |
604
+ | 0.2087 | 26300 | 8.05 |
605
+ | 0.2095 | 26400 | 8.0272 |
606
+ | 0.2103 | 26500 | 8.022 |
607
+ | 0.2111 | 26600 | 8.0318 |
608
+ | 0.2119 | 26700 | 8.0332 |
609
+ | 0.2126 | 26800 | 8.0434 |
610
+ | 0.2134 | 26900 | 8.0407 |
611
+ | 0.2142 | 27000 | 8.0326 |
612
+ | 0.2150 | 27100 | 8.028 |
613
+ | 0.2158 | 27200 | 8.0233 |
614
+ | 0.2166 | 27300 | 8.0384 |
615
+ | 0.2174 | 27400 | 8.0513 |
616
+ | 0.2182 | 27500 | 8.0096 |
617
+ | 0.2190 | 27600 | 8.0334 |
618
+ | 0.2198 | 27700 | 8.0335 |
619
+ | 0.2206 | 27800 | 8.0297 |
620
+ | 0.2214 | 27900 | 8.0124 |
621
+ | 0.2222 | 28000 | 8.0294 |
622
+ | 0.2230 | 28100 | 8.0197 |
623
+ | 0.2238 | 28200 | 7.9973 |
624
+ | 0.2245 | 28300 | 8.0122 |
625
+ | 0.2253 | 28400 | 8.0034 |
626
+ | 0.2261 | 28500 | 8.0284 |
627
+ | 0.2269 | 28600 | 8.0158 |
628
+ | 0.2277 | 28700 | 8.0077 |
629
+ | 0.2285 | 28800 | 8.0155 |
630
+ | 0.2293 | 28900 | 8.0216 |
631
+ | 0.2301 | 29000 | 8.0141 |
632
+ | 0.2309 | 29100 | 7.9963 |
633
+ | 0.2317 | 29200 | 8.0045 |
634
+ | 0.2325 | 29300 | 8.0118 |
635
+ | 0.2333 | 29400 | 8.0192 |
636
+ | 0.2341 | 29500 | 7.9981 |
637
+ | 0.2349 | 29600 | 7.9893 |
638
+ | 0.2357 | 29700 | 8.0174 |
639
+ | 0.2364 | 29800 | 7.9907 |
640
+ | 0.2372 | 29900 | 8.0144 |
641
+ | 0.2380 | 30000 | 8.0101 |
642
+ | 0.2388 | 30100 | 7.9858 |
643
+ | 0.2396 | 30200 | 8.0121 |
644
+ | 0.2404 | 30300 | 8.0037 |
645
+ | 0.2412 | 30400 | 8.0033 |
646
+ | 0.2420 | 30500 | 7.966 |
647
+ | 0.2428 | 30600 | 7.9766 |
648
+ | 0.2436 | 30700 | 7.9915 |
649
+ | 0.2444 | 30800 | 8.0029 |
650
+ | 0.2452 | 30900 | 8.0012 |
651
+ | 0.2460 | 31000 | 7.9844 |
652
+ | 0.2468 | 31100 | 7.9884 |
653
+ | 0.2476 | 31200 | 7.9929 |
654
+ | 0.2483 | 31300 | 7.9936 |
655
+ | 0.2491 | 31400 | 7.9997 |
656
+ | 0.2499 | 31500 | 7.9811 |
657
+ | 0.2507 | 31600 | 8.0012 |
658
+ | 0.2515 | 31700 | 7.9789 |
659
+ | 0.2523 | 31800 | 8.0087 |
660
+ | 0.2531 | 31900 | 8.0072 |
661
+ | 0.2539 | 32000 | 7.9996 |
662
+ | 0.2547 | 32100 | 7.9918 |
663
+ | 0.2555 | 32200 | 8.0013 |
664
+ | 0.2563 | 32300 | 7.9866 |
665
+ | 0.2571 | 32400 | 7.9679 |
666
+ | 0.2579 | 32500 | 8.0188 |
667
+ | 0.2587 | 32600 | 7.9661 |
668
+ | 0.2595 | 32700 | 7.9891 |
669
+ | 0.2603 | 32800 | 7.9697 |
670
+ | 0.2610 | 32900 | 7.969 |
671
+ | 0.2618 | 33000 | 7.9749 |
672
+ | 0.2626 | 33100 | 7.9636 |
673
+ | 0.2634 | 33200 | 7.9802 |
674
+ | 0.2642 | 33300 | 7.9643 |
675
+ | 0.2650 | 33400 | 7.9989 |
676
+ | 0.2658 | 33500 | 7.9458 |
677
+ | 0.2666 | 33600 | 7.9944 |
678
+ | 0.2674 | 33700 | 7.9794 |
679
+ | 0.2682 | 33800 | 7.9824 |
680
+ | 0.2690 | 33900 | 7.9922 |
681
+ | 0.2698 | 34000 | 7.9699 |
682
+ | 0.2706 | 34100 | 7.9711 |
683
+ | 0.2714 | 34200 | 7.9582 |
684
+ | 0.2722 | 34300 | 7.9877 |
685
+ | 0.2729 | 34400 | 7.9604 |
686
+ | 0.2737 | 34500 | 7.9874 |
687
+ | 0.2745 | 34600 | 7.9601 |
688
+ | 0.2753 | 34700 | 7.9355 |
689
+ | 0.2761 | 34800 | 7.9501 |
690
+ | 0.2769 | 34900 | 7.9548 |
691
+ | 0.2777 | 35000 | 7.9632 |
692
+ | 0.2785 | 35100 | 7.981 |
693
+ | 0.2793 | 35200 | 7.9614 |
694
+ | 0.2801 | 35300 | 7.9746 |
695
+ | 0.2809 | 35400 | 7.938 |
696
+ | 0.2817 | 35500 | 7.9592 |
697
+ | 0.2825 | 35600 | 7.9508 |
698
+ | 0.2833 | 35700 | 7.9587 |
699
+ | 0.2841 | 35800 | 7.9414 |
700
+ | 0.2848 | 35900 | 7.9619 |
701
+ | 0.2856 | 36000 | 7.9494 |
702
+ | 0.2864 | 36100 | 7.9448 |
703
+ | 0.2872 | 36200 | 7.9596 |
704
+ | 0.2880 | 36300 | 7.9503 |
705
+ | 0.2888 | 36400 | 7.9502 |
706
+ | 0.2896 | 36500 | 7.943 |
707
+ | 0.2904 | 36600 | 7.935 |
708
+ | 0.2912 | 36700 | 7.965 |
709
+ | 0.2920 | 36800 | 7.9485 |
710
+ | 0.2928 | 36900 | 7.9289 |
711
+ | 0.2936 | 37000 | 7.9595 |
712
+ | 0.2944 | 37100 | 7.9497 |
713
+ | 0.2952 | 37200 | 7.9302 |
714
+ | 0.2960 | 37300 | 7.9177 |
715
+ | 0.2968 | 37400 | 7.9516 |
716
+ | 0.2975 | 37500 | 7.9546 |
717
+ | 0.2983 | 37600 | 7.9561 |
718
+ | 0.2991 | 37700 | 7.9394 |
719
+ | 0.2999 | 37800 | 7.9266 |
720
+ | 0.3007 | 37900 | 7.9413 |
721
+ | 0.3015 | 38000 | 7.9485 |
722
+ | 0.3023 | 38100 | 7.9366 |
723
+ | 0.3031 | 38200 | 7.9355 |
724
+ | 0.3039 | 38300 | 7.9164 |
725
+ | 0.3047 | 38400 | 7.9553 |
726
+ | 0.3055 | 38500 | 7.958 |
727
+ | 0.3063 | 38600 | 7.9331 |
728
+ | 0.3071 | 38700 | 7.9183 |
729
+ | 0.3079 | 38800 | 7.9185 |
730
+ | 0.3087 | 38900 | 7.9438 |
731
+ | 0.3094 | 39000 | 7.9204 |
732
+ | 0.3102 | 39100 | 7.9175 |
733
+ | 0.3110 | 39200 | 7.9323 |
734
+ | 0.3118 | 39300 | 7.9108 |
735
+ | 0.3126 | 39400 | 7.9405 |
736
+ | 0.3134 | 39500 | 7.9238 |
737
+ | 0.3142 | 39600 | 7.9306 |
738
+ | 0.3150 | 39700 | 7.9372 |
739
+ | 0.3158 | 39800 | 7.948 |
740
+ | 0.3166 | 39900 | 7.9174 |
741
+ | 0.3174 | 40000 | 7.916 |
742
+ | 0.3182 | 40100 | 7.9212 |
743
+ | 0.3190 | 40200 | 7.9407 |
744
+ | 0.3198 | 40300 | 7.9181 |
745
+ | 0.3206 | 40400 | 7.9181 |
746
+ | 0.3213 | 40500 | 7.9285 |
747
+ | 0.3221 | 40600 | 7.9136 |
748
+ | 0.3229 | 40700 | 7.9239 |
749
+ | 0.3237 | 40800 | 7.9119 |
750
+ | 0.3245 | 40900 | 7.9201 |
751
+ | 0.3253 | 41000 | 7.9126 |
752
+ | 0.3261 | 41100 | 7.9269 |
753
+ | 0.3269 | 41200 | 7.8949 |
754
+ | 0.3277 | 41300 | 7.8836 |
755
+ | 0.3285 | 41400 | 7.9099 |
756
+ | 0.3293 | 41500 | 7.9219 |
757
+ | 0.3301 | 41600 | 7.9426 |
758
+ | 0.3309 | 41700 | 7.9163 |
759
+ | 0.3317 | 41800 | 7.9249 |
760
+ | 0.3325 | 41900 | 7.9062 |
761
+ | 0.3332 | 42000 | 7.8736 |
762
+ | 0.3340 | 42100 | 7.9214 |
763
+ | 0.3348 | 42200 | 7.9024 |
764
+ | 0.3356 | 42300 | 7.9086 |
765
+ | 0.3364 | 42400 | 7.9015 |
766
+ | 0.3372 | 42500 | 7.9202 |
767
+ | 0.3380 | 42600 | 7.9097 |
768
+ | 0.3388 | 42700 | 7.9373 |
769
+ | 0.3396 | 42800 | 7.8987 |
770
+ | 0.3404 | 42900 | 7.8992 |
771
+ | 0.3412 | 43000 | 7.8831 |
772
+ | 0.3420 | 43100 | 7.9166 |
773
+ | 0.3428 | 43200 | 7.9293 |
774
+ | 0.3436 | 43300 | 7.9095 |
775
+ | 0.3444 | 43400 | 7.903 |
776
+ | 0.3452 | 43500 | 7.9295 |
777
+ | 0.3459 | 43600 | 7.908 |
778
+ | 0.3467 | 43700 | 7.887 |
779
+ | 0.3475 | 43800 | 7.8854 |
780
+ | 0.3483 | 43900 | 7.9023 |
781
+ | 0.3491 | 44000 | 7.9025 |
782
+ | 0.3499 | 44100 | 7.9328 |
783
+ | 0.3507 | 44200 | 7.8859 |
784
+ | 0.3515 | 44300 | 7.891 |
785
+ | 0.3523 | 44400 | 7.9165 |
786
+ | 0.3531 | 44500 | 7.8875 |
787
+ | 0.3539 | 44600 | 7.8752 |
788
+ | 0.3547 | 44700 | 7.8807 |
789
+ | 0.3555 | 44800 | 7.8818 |
790
+ | 0.3563 | 44900 | 7.8955 |
791
+ | 0.3571 | 45000 | 7.8975 |
792
+ | 0.3578 | 45100 | 7.8736 |
793
+ | 0.3586 | 45200 | 7.8815 |
794
+ | 0.3594 | 45300 | 7.9161 |
795
+ | 0.3602 | 45400 | 7.8515 |
796
+ | 0.3610 | 45500 | 7.9015 |
797
+ | 0.3618 | 45600 | 7.9023 |
798
+ | 0.3626 | 45700 | 7.859 |
799
+ | 0.3634 | 45800 | 7.9132 |
800
+ | 0.3642 | 45900 | 7.9016 |
801
+ | 0.3650 | 46000 | 7.9213 |
802
+ | 0.3658 | 46100 | 7.8946 |
803
+ | 0.3666 | 46200 | 7.8778 |
804
+ | 0.3674 | 46300 | 7.8708 |
805
+ | 0.3682 | 46400 | 7.8756 |
806
+ | 0.3690 | 46500 | 7.9123 |
807
+ | 0.3697 | 46600 | 7.8953 |
808
+ | 0.3705 | 46700 | 7.8767 |
809
+ | 0.3713 | 46800 | 7.8677 |
810
+ | 0.3721 | 46900 | 7.8689 |
811
+ | 0.3729 | 47000 | 7.9171 |
812
+ | 0.3737 | 47100 | 7.91 |
813
+ | 0.3745 | 47200 | 7.88 |
814
+ | 0.3753 | 47300 | 7.891 |
815
+ | 0.3761 | 47400 | 7.8574 |
816
+ | 0.3769 | 47500 | 7.8915 |
817
+ | 0.3777 | 47600 | 7.8857 |
818
+ | 0.3785 | 47700 | 7.9033 |
819
+ | 0.3793 | 47800 | 7.877 |
820
+ | 0.3801 | 47900 | 7.8959 |
821
+ | 0.3809 | 48000 | 7.8763 |
822
+ | 0.3816 | 48100 | 7.8653 |
823
+ | 0.3824 | 48200 | 7.8716 |
824
+ | 0.3832 | 48300 | 7.8871 |
825
+ | 0.3840 | 48400 | 7.8883 |
826
+ | 0.3848 | 48500 | 7.8753 |
827
+ | 0.3856 | 48600 | 7.9032 |
828
+ | 0.3864 | 48700 | 7.8551 |
829
+ | 0.3872 | 48800 | 7.8779 |
830
+ | 0.3880 | 48900 | 7.8767 |
831
+ | 0.3888 | 49000 | 7.8497 |
832
+ | 0.3896 | 49100 | 7.8794 |
833
+ | 0.3904 | 49200 | 7.8867 |
834
+ | 0.3912 | 49300 | 7.8679 |
835
+ | 0.3920 | 49400 | 7.8564 |
836
+ | 0.3928 | 49500 | 7.874 |
837
+ | 0.3936 | 49600 | 7.8734 |
838
+ | 0.3943 | 49700 | 7.8795 |
839
+ | 0.3951 | 49800 | 7.8588 |
840
+ | 0.3959 | 49900 | 7.8713 |
841
+ | 0.3967 | 50000 | 7.8615 |
842
+ | 0.3975 | 50100 | 7.8803 |
843
+ | 0.3983 | 50200 | 7.8909 |
844
+ | 0.3991 | 50300 | 7.8863 |
845
+ | 0.3999 | 50400 | 7.8841 |
846
+ | 0.4007 | 50500 | 7.8682 |
847
+ | 0.4015 | 50600 | 7.8797 |
848
+ | 0.4023 | 50700 | 7.8572 |
849
+ | 0.4031 | 50800 | 7.8712 |
850
+ | 0.4039 | 50900 | 7.8674 |
851
+ | 0.4047 | 51000 | 7.8506 |
852
+ | 0.4055 | 51100 | 7.8768 |
853
+ | 0.4062 | 51200 | 7.8719 |
854
+ | 0.4070 | 51300 | 7.8455 |
855
+ | 0.4078 | 51400 | 7.8829 |
856
+ | 0.4086 | 51500 | 7.8543 |
857
+ | 0.4094 | 51600 | 7.8743 |
858
+ | 0.4102 | 51700 | 7.8779 |
859
+ | 0.4110 | 51800 | 7.8645 |
860
+ | 0.4118 | 51900 | 7.8401 |
861
+ | 0.4126 | 52000 | 7.8621 |
862
+ | 0.4134 | 52100 | 7.8753 |
863
+ | 0.4142 | 52200 | 7.8547 |
864
+ | 0.4150 | 52300 | 7.8518 |
865
+ | 0.4158 | 52400 | 7.8615 |
866
+ | 0.4166 | 52500 | 7.8499 |
867
+ | 0.4174 | 52600 | 7.8632 |
868
+ | 0.4181 | 52700 | 7.8644 |
869
+ | 0.4189 | 52800 | 7.8802 |
870
+ | 0.4197 | 52900 | 7.8653 |
871
+ | 0.4205 | 53000 | 7.8436 |
872
+ | 0.4213 | 53100 | 7.8619 |
873
+ | 0.4221 | 53200 | 7.8601 |
874
+ | 0.4229 | 53300 | 7.8635 |
875
+ | 0.4237 | 53400 | 7.8675 |
876
+ | 0.4245 | 53500 | 7.8669 |
877
+ | 0.4253 | 53600 | 7.8496 |
878
+ | 0.4261 | 53700 | 7.8601 |
879
+ | 0.4269 | 53800 | 7.8567 |
880
+ | 0.4277 | 53900 | 7.829 |
881
+ | 0.4285 | 54000 | 7.828 |
882
+ | 0.4293 | 54100 | 7.8727 |
883
+ | 0.4300 | 54200 | 7.8735 |
884
+ | 0.4308 | 54300 | 7.8582 |
885
+ | 0.4316 | 54400 | 7.8406 |
886
+ | 0.4324 | 54500 | 7.8351 |
887
+ | 0.4332 | 54600 | 7.8549 |
888
+ | 0.4340 | 54700 | 7.8363 |
889
+ | 0.4348 | 54800 | 7.8635 |
890
+ | 0.4356 | 54900 | 7.8572 |
891
+ | 0.4364 | 55000 | 7.8587 |
892
+ | 0.4372 | 55100 | 7.8167 |
893
+ | 0.4380 | 55200 | 7.8116 |
894
+ | 0.4388 | 55300 | 7.8655 |
895
+ | 0.4396 | 55400 | 7.8535 |
896
+ | 0.4404 | 55500 | 7.8363 |
897
+ | 0.4412 | 55600 | 7.8742 |
898
+ | 0.4420 | 55700 | 7.855 |
899
+ | 0.4427 | 55800 | 7.8514 |
900
+ | 0.4435 | 55900 | 7.8316 |
901
+ | 0.4443 | 56000 | 7.8481 |
902
+ | 0.4451 | 56100 | 7.8476 |
903
+ | 0.4459 | 56200 | 7.8584 |
904
+ | 0.4467 | 56300 | 7.8213 |
905
+ | 0.4475 | 56400 | 7.8209 |
906
+ | 0.4483 | 56500 | 7.8181 |
907
+ | 0.4491 | 56600 | 7.8297 |
908
+ | 0.4499 | 56700 | 7.8515 |
909
+ | 0.4507 | 56800 | 7.8609 |
910
+ | 0.4515 | 56900 | 7.8297 |
911
+ | 0.4523 | 57000 | 7.8383 |
912
+ | 0.4531 | 57100 | 7.8139 |
913
+ | 0.4539 | 57200 | 7.856 |
914
+ | 0.4546 | 57300 | 7.8191 |
915
+ | 0.4554 | 57400 | 7.8386 |
916
+ | 0.4562 | 57500 | 7.8752 |
917
+ | 0.4570 | 57600 | 7.8101 |
918
+ | 0.4578 | 57700 | 7.8346 |
919
+ | 0.4586 | 57800 | 7.8217 |
920
+ | 0.4594 | 57900 | 7.8416 |
921
+ | 0.4602 | 58000 | 7.8171 |
922
+ | 0.4610 | 58100 | 7.8451 |
923
+ | 0.4618 | 58200 | 7.8454 |
924
+ | 0.4626 | 58300 | 7.8095 |
925
+ | 0.4634 | 58400 | 7.8377 |
926
+ | 0.4642 | 58500 | 7.8376 |
927
+ | 0.4650 | 58600 | 7.823 |
928
+ | 0.4658 | 58700 | 7.8331 |
929
+ | 0.4665 | 58800 | 7.8253 |
930
+ | 0.4673 | 58900 | 7.8325 |
931
+ | 0.4681 | 59000 | 7.8509 |
932
+ | 0.4689 | 59100 | 7.8452 |
933
+ | 0.4697 | 59200 | 7.8332 |
934
+ | 0.4705 | 59300 | 7.8238 |
935
+ | 0.4713 | 59400 | 7.8026 |
936
+ | 0.4721 | 59500 | 7.8639 |
937
+ | 0.4729 | 59600 | 7.8262 |
938
+ | 0.4737 | 59700 | 7.8494 |
939
+ | 0.4745 | 59800 | 7.8479 |
940
+ | 0.4753 | 59900 | 7.831 |
941
+ | 0.4761 | 60000 | 7.8333 |
942
+ | 0.4769 | 60100 | 7.8325 |
943
+ | 0.4777 | 60200 | 7.8248 |
944
+ | 0.4784 | 60300 | 7.8784 |
945
+ | 0.4792 | 60400 | 7.8093 |
946
+ | 0.4800 | 60500 | 7.8351 |
947
+ | 0.4808 | 60600 | 7.8217 |
948
+ | 0.4816 | 60700 | 7.8132 |
949
+ | 0.4824 | 60800 | 7.8225 |
950
+ | 0.4832 | 60900 | 7.834 |
951
+ | 0.4840 | 61000 | 7.8317 |
952
+ | 0.4848 | 61100 | 7.8747 |
953
+ | 0.4856 | 61200 | 7.8552 |
954
+ | 0.4864 | 61300 | 7.818 |
955
+ | 0.4872 | 61400 | 7.8246 |
956
+ | 0.4880 | 61500 | 7.822 |
957
+ | 0.4888 | 61600 | 7.8202 |
958
+ | 0.4896 | 61700 | 7.8203 |
959
+ | 0.4904 | 61800 | 7.808 |
960
+ | 0.4911 | 61900 | 7.8176 |
961
+ | 0.4919 | 62000 | 7.7943 |
962
+ | 0.4927 | 62100 | 7.8168 |
963
+ | 0.4935 | 62200 | 7.7912 |
964
+ | 0.4943 | 62300 | 7.8415 |
965
+ | 0.4951 | 62400 | 7.826 |
966
+ | 0.4959 | 62500 | 7.8168 |
967
+ | 0.4967 | 62600 | 7.8436 |
968
+ | 0.4975 | 62700 | 7.7826 |
969
+ | 0.4983 | 62800 | 7.8321 |
970
+ | 0.4991 | 62900 | 7.8269 |
971
+ | 0.4999 | 63000 | 7.8432 |
972
+ | 0.5007 | 63100 | 7.8307 |
973
+ | 0.5015 | 63200 | 7.8151 |
974
+ | 0.5023 | 63300 | 7.7998 |
975
+ | 0.5030 | 63400 | 7.8185 |
976
+ | 0.5038 | 63500 | 7.7965 |
977
+ | 0.5046 | 63600 | 7.8193 |
978
+ | 0.5054 | 63700 | 7.8319 |
979
+ | 0.5062 | 63800 | 7.8101 |
980
+ | 0.5070 | 63900 | 7.7838 |
981
+ | 0.5078 | 64000 | 7.828 |
982
+ | 0.5086 | 64100 | 7.8331 |
983
+ | 0.5094 | 64200 | 7.8342 |
984
+ | 0.5102 | 64300 | 7.8022 |
985
+ | 0.5110 | 64400 | 7.7986 |
986
+ | 0.5118 | 64500 | 7.8048 |
987
+ | 0.5126 | 64600 | 7.8123 |
988
+ | 0.5134 | 64700 | 7.799 |
989
+ | 0.5142 | 64800 | 7.8118 |
990
+ | 0.5149 | 64900 | 7.8234 |
991
+ | 0.5157 | 65000 | 7.8083 |
992
+ | 0.5165 | 65100 | 7.8056 |
993
+ | 0.5173 | 65200 | 7.8051 |
994
+ | 0.5181 | 65300 | 7.8097 |
995
+ | 0.5189 | 65400 | 7.8143 |
996
+ | 0.5197 | 65500 | 7.8116 |
997
+ | 0.5205 | 65600 | 7.7758 |
998
+ | 0.5213 | 65700 | 7.7913 |
999
+ | 0.5221 | 65800 | 7.7804 |
1000
+ | 0.5229 | 65900 | 7.7906 |
1001
+ | 0.5237 | 66000 | 7.7676 |
1002
+ | 0.5245 | 66100 | 7.8007 |
1003
+ | 0.5253 | 66200 | 7.8194 |
1004
+ | 0.5261 | 66300 | 7.8149 |
1005
+ | 0.5269 | 66400 | 7.7842 |
1006
+ | 0.5276 | 66500 | 7.8083 |
1007
+ | 0.5284 | 66600 | 7.8145 |
1008
+ | 0.5292 | 66700 | 7.8174 |
1009
+ | 0.5300 | 66800 | 7.7821 |
1010
+ | 0.5308 | 66900 | 7.795 |
1011
+ | 0.5316 | 67000 | 7.8241 |
1012
+ | 0.5324 | 67100 | 7.832 |
1013
+ | 0.5332 | 67200 | 7.7978 |
1014
+ | 0.5340 | 67300 | 7.8049 |
1015
+ | 0.5348 | 67400 | 7.8231 |
1016
+ | 0.5356 | 67500 | 7.8171 |
1017
+ | 0.5364 | 67600 | 7.8007 |
1018
+ | 0.5372 | 67700 | 7.8477 |
1019
+ | 0.5380 | 67800 | 7.8003 |
1020
+ | 0.5388 | 67900 | 7.8196 |
1021
+ | 0.5395 | 68000 | 7.8105 |
1022
+ | 0.5403 | 68100 | 7.7716 |
1023
+ | 0.5411 | 68200 | 7.8121 |
1024
+ | 0.5419 | 68300 | 7.7958 |
1025
+ | 0.5427 | 68400 | 7.8067 |
1026
+ | 0.5435 | 68500 | 7.8335 |
1027
+ | 0.5443 | 68600 | 7.8158 |
1028
+ | 0.5451 | 68700 | 7.8133 |
1029
+ | 0.5459 | 68800 | 7.8423 |
1030
+ | 0.5467 | 68900 | 7.8051 |
1031
+ | 0.5475 | 69000 | 7.8195 |
1032
+ | 0.5483 | 69100 | 7.8095 |
1033
+ | 0.5491 | 69200 | 7.7677 |
1034
+ | 0.5499 | 69300 | 7.7847 |
1035
+ | 0.5507 | 69400 | 7.7981 |
1036
+ | 0.5514 | 69500 | 7.7737 |
1037
+ | 0.5522 | 69600 | 7.7971 |
1038
+ | 0.5530 | 69700 | 7.8085 |
1039
+ | 0.5538 | 69800 | 7.7996 |
1040
+ | 0.5546 | 69900 | 7.7833 |
1041
+ | 0.5554 | 70000 | 7.8204 |
1042
+ | 0.5562 | 70100 | 7.7875 |
1043
+ | 0.5570 | 70200 | 7.8002 |
1044
+ | 0.5578 | 70300 | 7.8105 |
1045
+ | 0.5586 | 70400 | 7.8107 |
1046
+ | 0.5594 | 70500 | 7.7909 |
1047
+ | 0.5602 | 70600 | 7.8094 |
1048
+ | 0.5610 | 70700 | 7.7725 |
1049
+ | 0.5618 | 70800 | 7.8224 |
1050
+ | 0.5626 | 70900 | 7.766 |
1051
+ | 0.5633 | 71000 | 7.7958 |
1052
+ | 0.5641 | 71100 | 7.7841 |
1053
+ | 0.5649 | 71200 | 7.7846 |
1054
+ | 0.5657 | 71300 | 7.795 |
1055
+ | 0.5665 | 71400 | 7.7668 |
1056
+ | 0.5673 | 71500 | 7.7875 |
1057
+ | 0.5681 | 71600 | 7.8003 |
1058
+ | 0.5689 | 71700 | 7.7688 |
1059
+ | 0.5697 | 71800 | 7.751 |
1060
+ | 0.5705 | 71900 | 7.773 |
1061
+ | 0.5713 | 72000 | 7.7892 |
1062
+ | 0.5721 | 72100 | 7.7687 |
1063
+ | 0.5729 | 72200 | 7.7745 |
1064
+ | 0.5737 | 72300 | 7.7657 |
1065
+ | 0.5745 | 72400 | 7.7373 |
1066
+ | 0.5753 | 72500 | 7.765 |
1067
+ | 0.5760 | 72600 | 7.7766 |
1068
+ | 0.5768 | 72700 | 7.785 |
1069
+ | 0.5776 | 72800 | 7.7955 |
1070
+ | 0.5784 | 72900 | 7.8414 |
1071
+ | 0.5792 | 73000 | 7.7838 |
1072
+ | 0.5800 | 73100 | 7.8033 |
1073
+ | 0.5808 | 73200 | 7.7758 |
1074
+ | 0.5816 | 73300 | 7.8067 |
1075
+ | 0.5824 | 73400 | 7.7902 |
1076
+ | 0.5832 | 73500 | 7.7921 |
1077
+ | 0.5840 | 73600 | 7.7989 |
1078
+ | 0.5848 | 73700 | 7.8024 |
1079
+ | 0.5856 | 73800 | 7.8233 |
1080
+ | 0.5864 | 73900 | 7.7966 |
1081
+ | 0.5872 | 74000 | 7.7693 |
1082
+ | 0.5879 | 74100 | 7.7697 |
1083
+ | 0.5887 | 74200 | 7.7514 |
1084
+ | 0.5895 | 74300 | 7.8101 |
1085
+ | 0.5903 | 74400 | 7.768 |
1086
+ | 0.5911 | 74500 | 7.8202 |
1087
+ | 0.5919 | 74600 | 7.7555 |
1088
+ | 0.5927 | 74700 | 7.8208 |
1089
+ | 0.5935 | 74800 | 7.7757 |
1090
+ | 0.5943 | 74900 | 7.7755 |
1091
+ | 0.5951 | 75000 | 7.7854 |
1092
+ | 0.5959 | 75100 | 7.8051 |
1093
+ | 0.5967 | 75200 | 7.7653 |
1094
+ | 0.5975 | 75300 | 7.8007 |
1095
+ | 0.5983 | 75400 | 7.7729 |
1096
+ | 0.5991 | 75500 | 7.7658 |
1097
+ | 0.5998 | 75600 | 7.7667 |
1098
+ | 0.6006 | 75700 | 7.808 |
1099
+ | 0.6014 | 75800 | 7.8034 |
1100
+ | 0.6022 | 75900 | 7.8004 |
1101
+ | 0.6030 | 76000 | 7.7595 |
1102
+ | 0.6038 | 76100 | 7.764 |
1103
+ | 0.6046 | 76200 | 7.7663 |
1104
+ | 0.6054 | 76300 | 7.8129 |
1105
+ | 0.6062 | 76400 | 7.788 |
1106
+ | 0.6070 | 76500 | 7.7721 |
1107
+ | 0.6078 | 76600 | 7.7951 |
1108
+ | 0.6086 | 76700 | 7.7497 |
1109
+ | 0.6094 | 76800 | 7.7769 |
1110
+ | 0.6102 | 76900 | 7.8075 |
1111
+ | 0.6110 | 77000 | 7.7576 |
1112
+ | 0.6117 | 77100 | 7.7834 |
1113
+ | 0.6125 | 77200 | 7.7573 |
1114
+ | 0.6133 | 77300 | 7.7652 |
1115
+ | 0.6141 | 77400 | 7.7348 |
1116
+ | 0.6149 | 77500 | 7.8105 |
1117
+ | 0.6157 | 77600 | 7.786 |
1118
+ | 0.6165 | 77700 | 7.7999 |
1119
+ | 0.6173 | 77800 | 7.7863 |
1120
+ | 0.6181 | 77900 | 7.7732 |
1121
+ | 0.6189 | 78000 | 7.7531 |
1122
+ | 0.6197 | 78100 | 7.7992 |
1123
+ | 0.6205 | 78200 | 7.8139 |
1124
+ | 0.6213 | 78300 | 7.7678 |
1125
+ | 0.6221 | 78400 | 7.7512 |
1126
+ | 0.6229 | 78500 | 7.7512 |
1127
+ | 0.6237 | 78600 | 7.7516 |
1128
+ | 0.6244 | 78700 | 7.7591 |
1129
+ | 0.6252 | 78800 | 7.7853 |
1130
+ | 0.6260 | 78900 | 7.7598 |
1131
+ | 0.6268 | 79000 | 7.7489 |
1132
+ | 0.6276 | 79100 | 7.7585 |
1133
+ | 0.6284 | 79200 | 7.79 |
1134
+ | 0.6292 | 79300 | 7.7485 |
1135
+ | 0.6300 | 79400 | 7.8023 |
1136
+ | 0.6308 | 79500 | 7.7705 |
1137
+ | 0.6316 | 79600 | 7.7609 |
1138
+ | 0.6324 | 79700 | 7.7404 |
1139
+ | 0.6332 | 79800 | 7.748 |
1140
+ | 0.6340 | 79900 | 7.7489 |
1141
+ | 0.6348 | 80000 | 7.7646 |
1142
+ | 0.6356 | 80100 | 7.7661 |
1143
+ | 0.6363 | 80200 | 7.7978 |
1144
+ | 0.6371 | 80300 | 7.79 |
1145
+ | 0.6379 | 80400 | 7.771 |
1146
+ | 0.6387 | 80500 | 7.759 |
1147
+ | 0.6395 | 80600 | 7.7393 |
1148
+ | 0.6403 | 80700 | 7.7748 |
1149
+ | 0.6411 | 80800 | 7.766 |
1150
+ | 0.6419 | 80900 | 7.7782 |
1151
+ | 0.6427 | 81000 | 7.7686 |
1152
+ | 0.6435 | 81100 | 7.7724 |
1153
+ | 0.6443 | 81200 | 7.7512 |
1154
+ | 0.6451 | 81300 | 7.7616 |
1155
+ | 0.6459 | 81400 | 7.7614 |
1156
+ | 0.6467 | 81500 | 7.7511 |
1157
+ | 0.6475 | 81600 | 7.764 |
1158
+ | 0.6482 | 81700 | 7.7391 |
1159
+ | 0.6490 | 81800 | 7.7428 |
1160
+ | 0.6498 | 81900 | 7.7868 |
1161
+ | 0.6506 | 82000 | 7.7309 |
1162
+ | 0.6514 | 82100 | 7.7582 |
1163
+ | 0.6522 | 82200 | 7.7754 |
1164
+ | 0.6530 | 82300 | 7.7282 |
1165
+ | 0.6538 | 82400 | 7.7693 |
1166
+ | 0.6546 | 82500 | 7.7742 |
1167
+ | 0.6554 | 82600 | 7.7859 |
1168
+ | 0.6562 | 82700 | 7.7177 |
1169
+ | 0.6570 | 82800 | 7.7538 |
1170
+ | 0.6578 | 82900 | 7.7574 |
1171
+ | 0.6586 | 83000 | 7.7423 |
1172
+ | 0.6594 | 83100 | 7.7485 |
1173
+ | 0.6601 | 83200 | 7.7748 |
1174
+ | 0.6609 | 83300 | 7.7371 |
1175
+ | 0.6617 | 83400 | 7.769 |
1176
+ | 0.6625 | 83500 | 7.7637 |
1177
+ | 0.6633 | 83600 | 7.765 |
1178
+ | 0.6641 | 83700 | 7.7525 |
1179
+ | 0.6649 | 83800 | 7.738 |
1180
+ | 0.6657 | 83900 | 7.783 |
1181
+ | 0.6665 | 84000 | 7.7489 |
1182
+ | 0.6673 | 84100 | 7.7615 |
1183
+ | 0.6681 | 84200 | 7.7318 |
1184
+ | 0.6689 | 84300 | 7.7583 |
1185
+ | 0.6697 | 84400 | 7.7781 |
1186
+ | 0.6705 | 84500 | 7.7194 |
1187
+ | 0.6713 | 84600 | 7.7482 |
1188
+ | 0.6721 | 84700 | 7.7483 |
1189
+ | 0.6728 | 84800 | 7.7635 |
1190
+ | 0.6736 | 84900 | 7.739 |
1191
+ | 0.6744 | 85000 | 7.7449 |
1192
+ | 0.6752 | 85100 | 7.6926 |
1193
+ | 0.6760 | 85200 | 7.758 |
1194
+ | 0.6768 | 85300 | 7.739 |
1195
+ | 0.6776 | 85400 | 7.7419 |
1196
+ | 0.6784 | 85500 | 7.7793 |
1197
+ | 0.6792 | 85600 | 7.7731 |
1198
+ | 0.6800 | 85700 | 7.7676 |
1199
+ | 0.6808 | 85800 | 7.741 |
1200
+ | 0.6816 | 85900 | 7.7646 |
1201
+ | 0.6824 | 86000 | 7.7848 |
1202
+ | 0.6832 | 86100 | 7.7405 |
1203
+ | 0.6840 | 86200 | 7.749 |
1204
+ | 0.6847 | 86300 | 7.7295 |
1205
+ | 0.6855 | 86400 | 7.7499 |
1206
+ | 0.6863 | 86500 | 7.7653 |
1207
+ | 0.6871 | 86600 | 7.7719 |
1208
+ | 0.6879 | 86700 | 7.8107 |
1209
+ | 0.6887 | 86800 | 7.7414 |
1210
+ | 0.6895 | 86900 | 7.7397 |
1211
+ | 0.6903 | 87000 | 7.7213 |
1212
+ | 0.6911 | 87100 | 7.7619 |
1213
+ | 0.6919 | 87200 | 7.7267 |
1214
+ | 0.6927 | 87300 | 7.7408 |
1215
+ | 0.6935 | 87400 | 7.7421 |
1216
+ | 0.6943 | 87500 | 7.7754 |
1217
+ | 0.6951 | 87600 | 7.7386 |
1218
+ | 0.6959 | 87700 | 7.721 |
1219
+ | 0.6966 | 87800 | 7.7403 |
1220
+ | 0.6974 | 87900 | 7.771 |
1221
+ | 0.6982 | 88000 | 7.7491 |
1222
+ | 0.6990 | 88100 | 7.744 |
1223
+ | 0.6998 | 88200 | 7.7231 |
1224
+ | 0.7006 | 88300 | 7.7498 |
1225
+ | 0.7014 | 88400 | 7.7619 |
1226
+ | 0.7022 | 88500 | 7.7374 |
1227
+ | 0.7030 | 88600 | 7.7649 |
1228
+ | 0.7038 | 88700 | 7.7399 |
1229
+ | 0.7046 | 88800 | 7.7882 |
1230
+ | 0.7054 | 88900 | 7.7816 |
1231
+ | 0.7062 | 89000 | 7.7451 |
1232
+ | 0.7070 | 89100 | 7.7813 |
1233
+ | 0.7078 | 89200 | 7.7419 |
1234
+ | 0.7086 | 89300 | 7.7456 |
1235
+ | 0.7093 | 89400 | 7.7203 |
1236
+ | 0.7101 | 89500 | 7.7378 |
1237
+ | 0.7109 | 89600 | 7.7689 |
1238
+ | 0.7117 | 89700 | 7.7392 |
1239
+ | 0.7125 | 89800 | 7.7071 |
1240
+ | 0.7133 | 89900 | 7.7081 |
1241
+ | 0.7141 | 90000 | 7.7365 |
1242
+ | 0.7149 | 90100 | 7.7483 |
1243
+ | 0.7157 | 90200 | 7.7614 |
1244
+ | 0.7165 | 90300 | 7.7732 |
1245
+ | 0.7173 | 90400 | 7.7589 |
1246
+ | 0.7181 | 90500 | 7.7484 |
1247
+ | 0.7189 | 90600 | 7.708 |
1248
+ | 0.7197 | 90700 | 7.7244 |
1249
+ | 0.7205 | 90800 | 7.7543 |
1250
+ | 0.7212 | 90900 | 7.7054 |
1251
+ | 0.7220 | 91000 | 7.7162 |
1252
+ | 0.7228 | 91100 | 7.7454 |
1253
+ | 0.7236 | 91200 | 7.7878 |
1254
+ | 0.7244 | 91300 | 7.7339 |
1255
+ | 0.7252 | 91400 | 7.7792 |
1256
+ | 0.7260 | 91500 | 7.7707 |
1257
+ | 0.7268 | 91600 | 7.716 |
1258
+ | 0.7276 | 91700 | 7.7608 |
1259
+ | 0.7284 | 91800 | 7.7055 |
1260
+ | 0.7292 | 91900 | 7.7477 |
1261
+ | 0.7300 | 92000 | 7.7109 |
1262
+ | 0.7308 | 92100 | 7.7544 |
1263
+ | 0.7316 | 92200 | 7.7419 |
1264
+ | 0.7324 | 92300 | 7.7431 |
1265
+ | 0.7331 | 92400 | 7.6993 |
1266
+ | 0.7339 | 92500 | 7.7143 |
1267
+ | 0.7347 | 92600 | 7.7545 |
1268
+ | 0.7355 | 92700 | 7.7326 |
1269
+ | 0.7363 | 92800 | 7.741 |
1270
+ | 0.7371 | 92900 | 7.7614 |
1271
+ | 0.7379 | 93000 | 7.76 |
1272
+ | 0.7387 | 93100 | 7.7515 |
1273
+ | 0.7395 | 93200 | 7.6917 |
1274
+ | 0.7403 | 93300 | 7.7208 |
1275
+ | 0.7411 | 93400 | 7.7186 |
1276
+ | 0.7419 | 93500 | 7.7356 |
1277
+ | 0.7427 | 93600 | 7.7412 |
1278
+ | 0.7435 | 93700 | 7.7369 |
1279
+ | 0.7443 | 93800 | 7.761 |
1280
+ | 0.7450 | 93900 | 7.7277 |
1281
+ | 0.7458 | 94000 | 7.7195 |
1282
+ | 0.7466 | 94100 | 7.6837 |
1283
+ | 0.7474 | 94200 | 7.6995 |
1284
+ | 0.7482 | 94300 | 7.7319 |
1285
+ | 0.7490 | 94400 | 7.7739 |
1286
+ | 0.7498 | 94500 | 7.7449 |
1287
+ | 0.7506 | 94600 | 7.7263 |
1288
+ | 0.7514 | 94700 | 7.6986 |
1289
+ | 0.7522 | 94800 | 7.728 |
1290
+ | 0.7530 | 94900 | 7.7952 |
1291
+ | 0.7538 | 95000 | 7.7436 |
1292
+ | 0.7546 | 95100 | 7.7355 |
1293
+ | 0.7554 | 95200 | 7.7532 |
1294
+ | 0.7562 | 95300 | 7.7724 |
1295
+ | 0.7570 | 95400 | 7.7634 |
1296
+ | 0.7577 | 95500 | 7.7302 |
1297
+ | 0.7585 | 95600 | 7.7296 |
1298
+ | 0.7593 | 95700 | 7.7449 |
1299
+ | 0.7601 | 95800 | 7.7524 |
1300
+ | 0.7609 | 95900 | 7.7283 |
1301
+ | 0.7617 | 96000 | 7.7244 |
1302
+ | 0.7625 | 96100 | 7.747 |
1303
+ | 0.7633 | 96200 | 7.7524 |
1304
+ | 0.7641 | 96300 | 7.7394 |
1305
+ | 0.7649 | 96400 | 7.7361 |
1306
+ | 0.7657 | 96500 | 7.7341 |
1307
+ | 0.7665 | 96600 | 7.7108 |
1308
+ | 0.7673 | 96700 | 7.7478 |
1309
+ | 0.7681 | 96800 | 7.7656 |
1310
+ | 0.7689 | 96900 | 7.7456 |
1311
+ | 0.7696 | 97000 | 7.7063 |
1312
+ | 0.7704 | 97100 | 7.7138 |
1313
+ | 0.7712 | 97200 | 7.7314 |
1314
+ | 0.7720 | 97300 | 7.726 |
1315
+ | 0.7728 | 97400 | 7.7465 |
1316
+ | 0.7736 | 97500 | 7.7501 |
1317
+ | 0.7744 | 97600 | 7.7457 |
1318
+ | 0.7752 | 97700 | 7.7299 |
1319
+ | 0.7760 | 97800 | 7.7463 |
1320
+ | 0.7768 | 97900 | 7.7252 |
1321
+ | 0.7776 | 98000 | 7.7436 |
1322
+ | 0.7784 | 98100 | 7.7624 |
1323
+ | 0.7792 | 98200 | 7.7469 |
1324
+ | 0.7800 | 98300 | 7.7272 |
1325
+ | 0.7808 | 98400 | 7.729 |
1326
+ | 0.7815 | 98500 | 7.7571 |
1327
+ | 0.7823 | 98600 | 7.7204 |
1328
+ | 0.7831 | 98700 | 7.7173 |
1329
+ | 0.7839 | 98800 | 7.7226 |
1330
+ | 0.7847 | 98900 | 7.6997 |
1331
+ | 0.7855 | 99000 | 7.717 |
1332
+ | 0.7863 | 99100 | 7.7023 |
1333
+ | 0.7871 | 99200 | 7.7006 |
1334
+ | 0.7879 | 99300 | 7.7622 |
1335
+ | 0.7887 | 99400 | 7.7258 |
1336
+ | 0.7895 | 99500 | 7.7504 |
1337
+ | 0.7903 | 99600 | 7.7327 |
1338
+ | 0.7911 | 99700 | 7.7826 |
1339
+ | 0.7919 | 99800 | 7.7066 |
1340
+ | 0.7927 | 99900 | 7.7174 |
1341
+ | 0.7934 | 100000 | 7.7106 |
1342
+ | 0.7942 | 100100 | 7.7423 |
1343
+ | 0.7950 | 100200 | 7.7595 |
1344
+ | 0.7958 | 100300 | 7.7035 |
1345
+ | 0.7966 | 100400 | 7.7308 |
1346
+ | 0.7974 | 100500 | 7.7353 |
1347
+ | 0.7982 | 100600 | 7.7145 |
1348
+ | 0.7990 | 100700 | 7.7311 |
1349
+ | 0.7998 | 100800 | 7.7048 |
1350
+ | 0.8006 | 100900 | 7.7036 |
1351
+ | 0.8014 | 101000 | 7.7387 |
1352
+ | 0.8022 | 101100 | 7.7198 |
1353
+ | 0.8030 | 101200 | 7.7429 |
1354
+ | 0.8038 | 101300 | 7.6884 |
1355
+ | 0.8046 | 101400 | 7.7252 |
1356
+ | 0.8054 | 101500 | 7.7067 |
1357
+ | 0.8061 | 101600 | 7.73 |
1358
+ | 0.8069 | 101700 | 7.7027 |
1359
+ | 0.8077 | 101800 | 7.7318 |
1360
+ | 0.8085 | 101900 | 7.7057 |
1361
+ | 0.8093 | 102000 | 7.754 |
1362
+ | 0.8101 | 102100 | 7.7112 |
1363
+ | 0.8109 | 102200 | 7.7117 |
1364
+ | 0.8117 | 102300 | 7.7651 |
1365
+ | 0.8125 | 102400 | 7.7167 |
1366
+ | 0.8133 | 102500 | 7.67 |
1367
+ | 0.8141 | 102600 | 7.7433 |
1368
+ | 0.8149 | 102700 | 7.7096 |
1369
+ | 0.8157 | 102800 | 7.7173 |
1370
+ | 0.8165 | 102900 | 7.7329 |
1371
+ | 0.8173 | 103000 | 7.7081 |
1372
+ | 0.8180 | 103100 | 7.7031 |
1373
+ | 0.8188 | 103200 | 7.7116 |
1374
+ | 0.8196 | 103300 | 7.7244 |
1375
+ | 0.8204 | 103400 | 7.7479 |
1376
+ | 0.8212 | 103500 | 7.7482 |
1377
+ | 0.8220 | 103600 | 7.7226 |
1378
+ | 0.8228 | 103700 | 7.7198 |
1379
+ | 0.8236 | 103800 | 7.7418 |
1380
+ | 0.8244 | 103900 | 7.7206 |
1381
+ | 0.8252 | 104000 | 7.6748 |
1382
+ | 0.8260 | 104100 | 7.726 |
1383
+ | 0.8268 | 104200 | 7.7106 |
1384
+ | 0.8276 | 104300 | 7.6973 |
1385
+ | 0.8284 | 104400 | 7.7458 |
1386
+ | 0.8292 | 104500 | 7.7373 |
1387
+ | 0.8299 | 104600 | 7.7182 |
1388
+ | 0.8307 | 104700 | 7.707 |
1389
+ | 0.8315 | 104800 | 7.71 |
1390
+ | 0.8323 | 104900 | 7.6899 |
1391
+ | 0.8331 | 105000 | 7.6667 |
1392
+ | 0.8339 | 105100 | 7.71 |
1393
+ | 0.8347 | 105200 | 7.7444 |
1394
+ | 0.8355 | 105300 | 7.6934 |
1395
+ | 0.8363 | 105400 | 7.7217 |
1396
+ | 0.8371 | 105500 | 7.7162 |
1397
+ | 0.8379 | 105600 | 7.673 |
1398
+ | 0.8387 | 105700 | 7.7248 |
1399
+ | 0.8395 | 105800 | 7.7479 |
1400
+ | 0.8403 | 105900 | 7.6815 |
1401
+ | 0.8411 | 106000 | 7.6878 |
1402
+ | 0.8418 | 106100 | 7.7098 |
1403
+ | 0.8426 | 106200 | 7.7118 |
1404
+ | 0.8434 | 106300 | 7.7066 |
1405
+ | 0.8442 | 106400 | 7.6979 |
1406
+ | 0.8450 | 106500 | 7.7493 |
1407
+ | 0.8458 | 106600 | 7.7283 |
1408
+ | 0.8466 | 106700 | 7.746 |
1409
+ | 0.8474 | 106800 | 7.6793 |
1410
+ | 0.8482 | 106900 | 7.6892 |
1411
+ | 0.8490 | 107000 | 7.6969 |
1412
+ | 0.8498 | 107100 | 7.7307 |
1413
+ | 0.8506 | 107200 | 7.7291 |
1414
+ | 0.8514 | 107300 | 7.7283 |
1415
+ | 0.8522 | 107400 | 7.7321 |
1416
+ | 0.8530 | 107500 | 7.7268 |
1417
+ | 0.8538 | 107600 | 7.7407 |
1418
+ | 0.8545 | 107700 | 7.7133 |
1419
+ | 0.8553 | 107800 | 7.7541 |
1420
+ | 0.8561 | 107900 | 7.7013 |
1421
+ | 0.8569 | 108000 | 7.7121 |
1422
+ | 0.8577 | 108100 | 7.7004 |
1423
+ | 0.8585 | 108200 | 7.734 |
1424
+ | 0.8593 | 108300 | 7.6979 |
1425
+ | 0.8601 | 108400 | 7.675 |
1426
+ | 0.8609 | 108500 | 7.718 |
1427
+ | 0.8617 | 108600 | 7.7178 |
1428
+ | 0.8625 | 108700 | 7.7362 |
1429
+ | 0.8633 | 108800 | 7.7027 |
1430
+ | 0.8641 | 108900 | 7.7332 |
1431
+ | 0.8649 | 109000 | 7.6945 |
1432
+ | 0.8657 | 109100 | 7.6932 |
1433
+ | 0.8664 | 109200 | 7.7061 |
1434
+ | 0.8672 | 109300 | 7.7138 |
1435
+ | 0.8680 | 109400 | 7.7057 |
1436
+ | 0.8688 | 109500 | 7.7152 |
1437
+ | 0.8696 | 109600 | 7.7505 |
1438
+ | 0.8704 | 109700 | 7.7545 |
1439
+ | 0.8712 | 109800 | 7.7368 |
1440
+ | 0.8720 | 109900 | 7.7147 |
1441
+ | 0.8728 | 110000 | 7.7154 |
1442
+ | 0.8736 | 110100 | 7.6988 |
1443
+ | 0.8744 | 110200 | 7.7057 |
1444
+ | 0.8752 | 110300 | 7.6855 |
1445
+ | 0.8760 | 110400 | 7.6839 |
1446
+ | 0.8768 | 110500 | 7.7192 |
1447
+ | 0.8776 | 110600 | 7.7253 |
1448
+ | 0.8783 | 110700 | 7.7051 |
1449
+ | 0.8791 | 110800 | 7.6837 |
1450
+ | 0.8799 | 110900 | 7.7374 |
1451
+ | 0.8807 | 111000 | 7.7311 |
1452
+ | 0.8815 | 111100 | 7.7378 |
1453
+ | 0.8823 | 111200 | 7.6895 |
1454
+ | 0.8831 | 111300 | 7.7247 |
1455
+ | 0.8839 | 111400 | 7.7215 |
1456
+ | 0.8847 | 111500 | 7.6748 |
1457
+ | 0.8855 | 111600 | 7.708 |
1458
+ | 0.8863 | 111700 | 7.7195 |
1459
+ | 0.8871 | 111800 | 7.6679 |
1460
+ | 0.8879 | 111900 | 7.7038 |
1461
+ | 0.8887 | 112000 | 7.6776 |
1462
+ | 0.8895 | 112100 | 7.7374 |
1463
+ | 0.8903 | 112200 | 7.7286 |
1464
+ | 0.8910 | 112300 | 7.712 |
1465
+ | 0.8918 | 112400 | 7.6966 |
1466
+ | 0.8926 | 112500 | 7.7214 |
1467
+ | 0.8934 | 112600 | 7.7212 |
1468
+ | 0.8942 | 112700 | 7.7122 |
1469
+ | 0.8950 | 112800 | 7.7271 |
1470
+ | 0.8958 | 112900 | 7.691 |
1471
+ | 0.8966 | 113000 | 7.7237 |
1472
+ | 0.8974 | 113100 | 7.6735 |
1473
+ | 0.8982 | 113200 | 7.7033 |
1474
+ | 0.8990 | 113300 | 7.7326 |
1475
+ | 0.8998 | 113400 | 7.6882 |
1476
+ | 0.9006 | 113500 | 7.7046 |
1477
+ | 0.9014 | 113600 | 7.7118 |
1478
+ | 0.9022 | 113700 | 7.7158 |
1479
+ | 0.9029 | 113800 | 7.7286 |
1480
+ | 0.9037 | 113900 | 7.6703 |
1481
+ | 0.9045 | 114000 | 7.7134 |
1482
+ | 0.9053 | 114100 | 7.7285 |
1483
+ | 0.9061 | 114200 | 7.7386 |
1484
+ | 0.9069 | 114300 | 7.7152 |
1485
+ | 0.9077 | 114400 | 7.6893 |
1486
+ | 0.9085 | 114500 | 7.7343 |
1487
+ | 0.9093 | 114600 | 7.7187 |
1488
+ | 0.9101 | 114700 | 7.7263 |
1489
+ | 0.9109 | 114800 | 7.6965 |
1490
+ | 0.9117 | 114900 | 7.7039 |
1491
+ | 0.9125 | 115000 | 7.6704 |
1492
+ | 0.9133 | 115100 | 7.6937 |
1493
+ | 0.9141 | 115200 | 7.6965 |
1494
+ | 0.9148 | 115300 | 7.7149 |
1495
+ | 0.9156 | 115400 | 7.6896 |
1496
+ | 0.9164 | 115500 | 7.7276 |
1497
+ | 0.9172 | 115600 | 7.7305 |
1498
+ | 0.9180 | 115700 | 7.6934 |
1499
+ | 0.9188 | 115800 | 7.7237 |
1500
+ | 0.9196 | 115900 | 7.6887 |
1501
+ | 0.9204 | 116000 | 7.7331 |
1502
+ | 0.9212 | 116100 | 7.6533 |
1503
+ | 0.9220 | 116200 | 7.7115 |
1504
+ | 0.9228 | 116300 | 7.7417 |
1505
+ | 0.9236 | 116400 | 7.7217 |
1506
+ | 0.9244 | 116500 | 7.7225 |
1507
+ | 0.9252 | 116600 | 7.6883 |
1508
+ | 0.9260 | 116700 | 7.7073 |
1509
+ | 0.9267 | 116800 | 7.698 |
1510
+ | 0.9275 | 116900 | 7.7024 |
1511
+ | 0.9283 | 117000 | 7.6976 |
1512
+ | 0.9291 | 117100 | 7.6934 |
1513
+ | 0.9299 | 117200 | 7.6732 |
1514
+ | 0.9307 | 117300 | 7.6981 |
1515
+ | 0.9315 | 117400 | 7.7606 |
1516
+ | 0.9323 | 117500 | 7.7274 |
1517
+ | 0.9331 | 117600 | 7.7134 |
1518
+ | 0.9339 | 117700 | 7.6873 |
1519
+ | 0.9347 | 117800 | 7.7084 |
1520
+ | 0.9355 | 117900 | 7.7014 |
1521
+ | 0.9363 | 118000 | 7.7204 |
1522
+ | 0.9371 | 118100 | 7.6777 |
1523
+ | 0.9379 | 118200 | 7.6858 |
1524
+ | 0.9387 | 118300 | 7.6906 |
1525
+ | 0.9394 | 118400 | 7.6936 |
1526
+ | 0.9402 | 118500 | 7.6827 |
1527
+ | 0.9410 | 118600 | 7.6914 |
1528
+ | 0.9418 | 118700 | 7.6868 |
1529
+ | 0.9426 | 118800 | 7.6998 |
1530
+ | 0.9434 | 118900 | 7.6739 |
1531
+ | 0.9442 | 119000 | 7.7324 |
1532
+ | 0.9450 | 119100 | 7.7352 |
1533
+ | 0.9458 | 119200 | 7.6911 |
1534
+ | 0.9466 | 119300 | 7.7013 |
1535
+ | 0.9474 | 119400 | 7.7063 |
1536
+ | 0.9482 | 119500 | 7.7036 |
1537
+ | 0.9490 | 119600 | 7.7127 |
1538
+ | 0.9498 | 119700 | 7.7237 |
1539
+ | 0.9506 | 119800 | 7.6743 |
1540
+ | 0.9513 | 119900 | 7.7109 |
1541
+ | 0.9521 | 120000 | 7.7011 |
1542
+ | 0.9529 | 120100 | 7.75 |
1543
+ | 0.9537 | 120200 | 7.7269 |
1544
+ | 0.9545 | 120300 | 7.7101 |
1545
+ | 0.9553 | 120400 | 7.7317 |
1546
+ | 0.9561 | 120500 | 7.7198 |
1547
+ | 0.9569 | 120600 | 7.6811 |
1548
+ | 0.9577 | 120700 | 7.7046 |
1549
+ | 0.9585 | 120800 | 7.6942 |
1550
+ | 0.9593 | 120900 | 7.7118 |
1551
+ | 0.9601 | 121000 | 7.6967 |
1552
+ | 0.9609 | 121100 | 7.7331 |
1553
+ | 0.9617 | 121200 | 7.7024 |
1554
+ | 0.9625 | 121300 | 7.6961 |
1555
+ | 0.9632 | 121400 | 7.6887 |
1556
+ | 0.9640 | 121500 | 7.7019 |
1557
+ | 0.9648 | 121600 | 7.6886 |
1558
+ | 0.9656 | 121700 | 7.7069 |
1559
+ | 0.9664 | 121800 | 7.6981 |
1560
+ | 0.9672 | 121900 | 7.6676 |
1561
+ | 0.9680 | 122000 | 7.6765 |
1562
+ | 0.9688 | 122100 | 7.6878 |
1563
+ | 0.9696 | 122200 | 7.6992 |
1564
+ | 0.9704 | 122300 | 7.6754 |
1565
+ | 0.9712 | 122400 | 7.6856 |
1566
+ | 0.9720 | 122500 | 7.7118 |
1567
+ | 0.9728 | 122600 | 7.7434 |
1568
+ | 0.9736 | 122700 | 7.6584 |
1569
+ | 0.9744 | 122800 | 7.716 |
1570
+ | 0.9751 | 122900 | 7.6941 |
1571
+ | 0.9759 | 123000 | 7.7396 |
1572
+ | 0.9767 | 123100 | 7.7097 |
1573
+ | 0.9775 | 123200 | 7.6739 |
1574
+ | 0.9783 | 123300 | 7.6959 |
1575
+ | 0.9791 | 123400 | 7.704 |
1576
+ | 0.9799 | 123500 | 7.6598 |
1577
+ | 0.9807 | 123600 | 7.6704 |
1578
+ | 0.9815 | 123700 | 7.7039 |
1579
+ | 0.9823 | 123800 | 7.672 |
1580
+ | 0.9831 | 123900 | 7.6978 |
1581
+ | 0.9839 | 124000 | 7.6927 |
1582
+ | 0.9847 | 124100 | 7.6665 |
1583
+ | 0.9855 | 124200 | 7.7026 |
1584
+ | 0.9863 | 124300 | 7.6866 |
1585
+ | 0.9871 | 124400 | 7.6796 |
1586
+ | 0.9878 | 124500 | 7.7315 |
1587
+ | 0.9886 | 124600 | 7.7352 |
1588
+ | 0.9894 | 124700 | 7.7342 |
1589
+ | 0.9902 | 124800 | 7.7109 |
1590
+ | 0.9910 | 124900 | 7.6762 |
1591
+ | 0.9918 | 125000 | 7.6896 |
1592
+ | 0.9926 | 125100 | 7.6893 |
1593
+ | 0.9934 | 125200 | 7.6874 |
1594
+ | 0.9942 | 125300 | 7.7031 |
1595
+ | 0.9950 | 125400 | 7.7119 |
1596
+ | 0.9958 | 125500 | 7.6758 |
1597
+ | 0.9966 | 125600 | 7.7484 |
1598
+ | 0.9974 | 125700 | 7.6668 |
1599
+ | 0.9982 | 125800 | 7.672 |
1600
+ | 0.9990 | 125900 | 7.7099 |
1601
+ | 0.9997 | 126000 | 7.7058 |
1602
+
1603
+ </details>
1604
+
1605
+ ### Framework Versions
1606
+ - Python: 3.12.3
1607
+ - Sentence Transformers: 5.1.0
1608
+ - Transformers: 4.55.4
1609
+ - PyTorch: 2.5.1+cu121
1610
+ - Accelerate: 1.10.1
1611
+ - Datasets: 4.0.0
1612
+ - Tokenizers: 0.21.4
1613
+
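+ The list above records the environment this model was trained and saved with. As a minimal sketch (assuming all of the listed packages are installed locally), the snippet below prints the versions of the current environment so they can be compared against this list:
+
+ ```python
+ # Minimal sketch: print the locally installed versions of the components listed above.
+ import platform
+
+ import accelerate
+ import datasets
+ import sentence_transformers
+ import tokenizers
+ import torch
+ import transformers
+
+ print(f"Python: {platform.python_version()}")
+ print(f"Sentence Transformers: {sentence_transformers.__version__}")
+ print(f"Transformers: {transformers.__version__}")
+ print(f"PyTorch: {torch.__version__}")
+ print(f"Accelerate: {accelerate.__version__}")
+ print(f"Datasets: {datasets.__version__}")
+ print(f"Tokenizers: {tokenizers.__version__}")
+ ```
+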
1614
+ ## Citation
1615
+
1616
+ ### BibTeX
1617
+
1618
+ #### Sentence Transformers
1619
+ ```bibtex
1620
+ @inproceedings{reimers-2019-sentence-bert,
1621
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1622
+ author = "Reimers, Nils and Gurevych, Iryna",
1623
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1624
+ month = "11",
1625
+ year = "2019",
1626
+ publisher = "Association for Computational Linguistics",
1627
+ url = "https://arxiv.org/abs/1908.10084",
1628
+ }
1629
+ ```
1630
+
1631
+ #### CoSENTLoss
1632
+ ```bibtex
1633
+ @online{kexuefm-8847,
1634
+ title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
1635
+ author={Su Jianlin},
1636
+ year={2022},
1637
+ month={Jan},
1638
+ url={https://kexue.fm/archives/8847},
1639
+ }
1640
+ ```
1641
+
1642
+ <!--
1643
+ ## Glossary
1644
+
1645
+ *Clearly define terms in order to be accessible across audiences.*
1646
+ -->
1647
+
1648
+ <!--
1649
+ ## Model Card Authors
1650
+
1651
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1652
+ -->
1653
+
1654
+ <!--
1655
+ ## Model Card Contact
1656
+
1657
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1658
+ -->
config.json ADDED
@@ -0,0 +1,25 @@
1
+ {
2
+ "architectures": [
3
+ "BertModel"
4
+ ],
5
+ "attention_probs_dropout_prob": 0.1,
6
+ "classifier_dropout": null,
7
+ "gradient_checkpointing": false,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.1,
10
+ "hidden_size": 384,
11
+ "initializer_range": 0.02,
12
+ "intermediate_size": 1536,
13
+ "layer_norm_eps": 1e-12,
14
+ "max_position_embeddings": 512,
15
+ "model_type": "bert",
16
+ "num_attention_heads": 12,
17
+ "num_hidden_layers": 6,
18
+ "pad_token_id": 0,
19
+ "position_embedding_type": "absolute",
20
+ "torch_dtype": "float32",
21
+ "transformers_version": "4.55.4",
22
+ "type_vocab_size": 2,
23
+ "use_cache": true,
24
+ "vocab_size": 30522
25
+ }
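This `config.json` describes the backbone: a compact BERT encoder with 6 hidden layers, a hidden size of 384, 12 attention heads, and a 30,522-token vocabulary, which at float32 is consistent with the ~90.9 MB `model.safetensors` file (roughly 22.7 M parameters). A quick, hedged sketch for reading these figures back with `transformers` — the repository path below is a placeholder, not the actual model id:

```python
# Sketch only: "path/to/this-model" is a placeholder for the real checkpoint location.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("path/to/this-model")
print(config.num_hidden_layers)        # 6
print(config.hidden_size)              # 384
print(config.num_attention_heads)      # 12
print(config.max_position_embeddings)  # 512
```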
config_sentence_transformers.json ADDED
@@ -0,0 +1,14 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "5.1.0",
4
+ "transformers": "4.55.4",
5
+ "pytorch": "2.5.1+cu121"
6
+ },
7
+ "model_type": "SentenceTransformer",
8
+ "prompts": {
9
+ "query": "",
10
+ "document": ""
11
+ },
12
+ "default_prompt_name": null,
13
+ "similarity_fn_name": "cosine"
14
+ }
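`config_sentence_transformers.json` sets `similarity_fn_name` to `cosine` and leaves the `query`/`document` prompts empty, so no prefix is prepended to inputs and `SentenceTransformer.similarity` scores embedding pairs with cosine similarity. A minimal usage sketch, with a placeholder model path:

```python
# Sketch only: "path/to/this-model" is a placeholder for the real checkpoint location.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("path/to/this-model")
embeddings = model.encode(["first sentence", "a second, different sentence"])

# similarity() applies the configured similarity_fn_name, i.e. cosine similarity here.
print(model.similarity(embeddings, embeddings))  # 2x2 matrix with 1.0 on the diagonal
```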
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:471d133c0b0b4af8229ecd4bb186cb074f88369cb3ee202a77583341870d1b72
3
+ size 90864192
modules.json ADDED
@@ -0,0 +1,20 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ },
14
+ {
15
+ "idx": 2,
16
+ "name": "2",
17
+ "path": "2_Normalize",
18
+ "type": "sentence_transformers.models.Normalize"
19
+ }
20
+ ]
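`modules.json` records the inference pipeline as three stacked modules: a Transformer encoder, a pooling layer, and a final `Normalize` step that L2-normalizes the pooled sentence embedding (so cosine similarity and dot product coincide). A rough sketch of assembling an equivalent stack by hand — the backbone path is hypothetical and mean pooling is an assumption, since this file only names the module types:

```python
# Sketch of the three-module stack listed in modules.json.
# "./backbone" is a hypothetical path; pooling_mode="mean" is an assumption.
from sentence_transformers import SentenceTransformer, models

transformer = models.Transformer("./backbone", max_seq_length=256)  # see sentence_bert_config.json
pooling = models.Pooling(transformer.get_word_embedding_dimension(), pooling_mode="mean")
normalize = models.Normalize()  # emits unit-length (L2-normalized) sentence embeddings

model = SentenceTransformer(modules=[transformer, pooling, normalize])
```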
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
1
+ {
2
+ "max_seq_length": 256,
3
+ "do_lower_case": false
4
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
1
+ {
2
+ "cls_token": {
3
+ "content": "[CLS]",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "mask_token": {
10
+ "content": "[MASK]",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "[PAD]",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "sep_token": {
24
+ "content": "[SEP]",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "unk_token": {
31
+ "content": "[UNK]",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ }
37
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,65 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "[PAD]",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "100": {
12
+ "content": "[UNK]",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "101": {
20
+ "content": "[CLS]",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "102": {
28
+ "content": "[SEP]",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "103": {
36
+ "content": "[MASK]",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ }
43
+ },
44
+ "clean_up_tokenization_spaces": false,
45
+ "cls_token": "[CLS]",
46
+ "do_basic_tokenize": true,
47
+ "do_lower_case": true,
48
+ "extra_special_tokens": {},
49
+ "mask_token": "[MASK]",
50
+ "max_length": 128,
51
+ "model_max_length": 256,
52
+ "never_split": null,
53
+ "pad_to_multiple_of": null,
54
+ "pad_token": "[PAD]",
55
+ "pad_token_type_id": 0,
56
+ "padding_side": "right",
57
+ "sep_token": "[SEP]",
58
+ "stride": 0,
59
+ "strip_accents": null,
60
+ "tokenize_chinese_chars": true,
61
+ "tokenizer_class": "BertTokenizer",
62
+ "truncation_side": "right",
63
+ "truncation_strategy": "longest_first",
64
+ "unk_token": "[UNK]"
65
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff