Add new CrossEncoder model

- README.md: +588 −0
- model.safetensors: +1 −1

README.md CHANGED
@@ -19,6 +19,45 @@ datasets:
 - redis/langcache-sentencepairs-v1
 pipeline_tag: text-ranking
 library_name: sentence-transformers
 ---

 # Redis fine-tuned CrossEncoder model for semantic caching on LangCache
@@ -110,6 +149,25 @@ You can finetune this model on your own dataset.
 *List how the model may foreseeably be misused and address what users ought not to do with the model.*
 -->

 <!--
 ## Bias, Risks and Limitations
@@ -178,6 +236,536 @@ You can finetune this model on your own dataset.
 }
 ```

 ### Framework Versions
 - Python: 3.12.3
 - Sentence Transformers: 5.1.0
 - redis/langcache-sentencepairs-v1
 pipeline_tag: text-ranking
 library_name: sentence-transformers
+metrics:
+- accuracy
+- accuracy_threshold
+- f1
+- f1_threshold
+- precision
+- recall
+- average_precision
+model-index:
+- name: Redis fine-tuned CrossEncoder model for semantic caching on LangCache
+  results:
+  - task:
+      type: cross-encoder-classification
+      name: Cross Encoder Classification
+    dataset:
+      name: test cls
+      type: test_cls
+    metrics:
+    - type: accuracy
+      value: 0.82683284693894
+      name: Accuracy
+    - type: accuracy_threshold
+      value: -0.0703125
+      name: Accuracy Threshold
+    - type: f1
+      value: 0.8097739600622851
+      name: F1
+    - type: f1_threshold
+      value: -0.27734375
+      name: F1 Threshold
+    - type: precision
+      value: 0.7490107942135419
+      name: Precision
+    - type: recall
+      value: 0.881266294227188
+      name: Recall
+    - type: average_precision
+      value: 0.8720293693345842
+      name: Average Precision
 ---

 # Redis fine-tuned CrossEncoder model for semantic caching on LangCache
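The model-index above reports an `accuracy_threshold` of -0.0703125 on the raw CrossEncoder score. In a semantic cache, that threshold is what turns a score for a (new query, cached query) pair into a hit/miss decision. A minimal sketch of that decision step; `is_cache_hit` is a hypothetical helper for illustration, not part of this repository, and the example scores are made up:

```python
# Illustrative only: turn a raw CrossEncoder logit for (new_query, cached_query)
# into a cache hit/miss decision. The cut-off -0.0703125 is the
# accuracy-maximizing threshold reported in the model-index above.

ACCURACY_THRESHOLD = -0.0703125

def is_cache_hit(score: float, threshold: float = ACCURACY_THRESHOLD) -> bool:
    """Treat the pair as semantically equivalent when the score clears the threshold."""
    return score >= threshold

# Hypothetical scores: a close paraphrase vs. an unrelated query.
print(is_cache_hit(2.31))   # True  -> serve the cached answer
print(is_cache_hit(-4.87))  # False -> miss; fall through to the LLM
```

Callers would obtain `score` from the model's `predict` on the sentence pair; the threshold can be tuned per deployment (e.g. using `f1_threshold` instead when recall matters more).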
 *List how the model may foreseeably be misused and address what users ought not to do with the model.*
 -->

+## Evaluation
+
+### Metrics
+
+#### Cross Encoder Classification
+
+* Dataset: `test_cls`
+* Evaluated with [<code>CrossEncoderClassificationEvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderClassificationEvaluator)
+
+| Metric                | Value     |
+|:----------------------|:----------|
+| accuracy              | 0.8268    |
+| accuracy_threshold    | -0.0703   |
+| f1                    | 0.8098    |
+| f1_threshold          | -0.2773   |
+| precision             | 0.749     |
+| recall                | 0.8813    |
+| **average_precision** | **0.872** |
+
 <!--
 ## Bias, Risks and Limitations

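The metrics added in this hunk are threshold-dependent: accuracy is measured at the best-found score cut-off (`accuracy_threshold`) and F1 at a separate one (`f1_threshold`). A small pure-Python sketch of how such metrics fall out of raw scores plus binary labels; the scores and labels below are made up for illustration (the real numbers come from `CrossEncoderClassificationEvaluator`, which also performs the threshold search):

```python
# Given raw CrossEncoder scores and 0/1 labels, a pair is predicted positive
# when its score clears the threshold; accuracy/precision/recall/F1 follow.

def classification_metrics(scores, labels, threshold):
    preds = [1 if s >= threshold else 0 for s in scores]
    tp = sum(p == 1 and y == 1 for p, y in zip(preds, labels))
    fp = sum(p == 1 and y == 0 for p, y in zip(preds, labels))
    fn = sum(p == 0 and y == 1 for p, y in zip(preds, labels))
    accuracy = sum(p == y for p, y in zip(preds, labels)) / len(labels)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Hypothetical logits for four sentence pairs, at the card's accuracy threshold.
print(classification_metrics([1.9, 0.4, -0.6, -2.3], [1, 1, 0, 0], -0.0703125))
# -> (1.0, 1.0, 1.0, 1.0)
```

Note that a negative threshold is perfectly normal here: the scores are unnormalized logits, not probabilities.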
 }
 ```

+### Training Hyperparameters
+#### Non-Default Hyperparameters
+
+- `eval_strategy`: steps
+- `per_device_train_batch_size`: 48
+- `per_device_eval_batch_size`: 48
+- `learning_rate`: 0.0002
+- `weight_decay`: 0.001
+- `num_train_epochs`: 50
+- `warmup_ratio`: 0.1
+- `load_best_model_at_end`: True
+- `optim`: adamw_torch
+- `ddp_find_unused_parameters`: False
+- `push_to_hub`: True
+- `hub_model_id`: redis/langcache-reranker-v1-miniL6-softmnrl-triplet
+- `eval_on_start`: True
+- `batch_sampler`: no_duplicates
+
+#### All Hyperparameters
+<details><summary>Click to expand</summary>
+
+- `overwrite_output_dir`: False
+- `do_predict`: False
+- `eval_strategy`: steps
+- `prediction_loss_only`: True
+- `per_device_train_batch_size`: 48
+- `per_device_eval_batch_size`: 48
+- `per_gpu_train_batch_size`: None
+- `per_gpu_eval_batch_size`: None
+- `gradient_accumulation_steps`: 1
+- `eval_accumulation_steps`: None
+- `torch_empty_cache_steps`: None
+- `learning_rate`: 0.0002
+- `weight_decay`: 0.001
+- `adam_beta1`: 0.9
+- `adam_beta2`: 0.999
+- `adam_epsilon`: 1e-08
+- `max_grad_norm`: 1.0
+- `num_train_epochs`: 50
+- `max_steps`: -1
+- `lr_scheduler_type`: linear
+- `lr_scheduler_kwargs`: {}
+- `warmup_ratio`: 0.1
+- `warmup_steps`: 0
+- `log_level`: passive
+- `log_level_replica`: warning
+- `log_on_each_node`: True
+- `logging_nan_inf_filter`: True
+- `save_safetensors`: True
+- `save_on_each_node`: False
+- `save_only_model`: False
+- `restore_callback_states_from_checkpoint`: False
+- `no_cuda`: False
+- `use_cpu`: False
+- `use_mps_device`: False
+- `seed`: 42
+- `data_seed`: None
+- `jit_mode_eval`: False
+- `use_ipex`: False
+- `bf16`: False
+- `fp16`: False
+- `fp16_opt_level`: O1
+- `half_precision_backend`: auto
+- `bf16_full_eval`: False
+- `fp16_full_eval`: False
+- `tf32`: None
+- `local_rank`: 2
+- `ddp_backend`: None
+- `tpu_num_cores`: None
+- `tpu_metrics_debug`: False
+- `debug`: []
+- `dataloader_drop_last`: True
+- `dataloader_num_workers`: 0
+- `dataloader_prefetch_factor`: None
+- `past_index`: -1
+- `disable_tqdm`: False
+- `remove_unused_columns`: True
+- `label_names`: None
+- `load_best_model_at_end`: True
+- `ignore_data_skip`: False
+- `fsdp`: []
+- `fsdp_min_num_params`: 0
+- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+- `fsdp_transformer_layer_cls_to_wrap`: None
+- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+- `parallelism_config`: None
+- `deepspeed`: None
+- `label_smoothing_factor`: 0.0
+- `optim`: adamw_torch
+- `optim_args`: None
+- `adafactor`: False
+- `group_by_length`: False
+- `length_column_name`: length
+- `ddp_find_unused_parameters`: False
+- `ddp_bucket_cap_mb`: None
+- `ddp_broadcast_buffers`: False
+- `dataloader_pin_memory`: True
+- `dataloader_persistent_workers`: False
+- `skip_memory_metrics`: True
+- `use_legacy_prediction_loop`: False
+- `push_to_hub`: True
+- `resume_from_checkpoint`: None
+- `hub_model_id`: redis/langcache-reranker-v1-miniL6-softmnrl-triplet
+- `hub_strategy`: every_save
+- `hub_private_repo`: None
+- `hub_always_push`: False
+- `hub_revision`: None
+- `gradient_checkpointing`: False
+- `gradient_checkpointing_kwargs`: None
+- `include_inputs_for_metrics`: False
+- `include_for_metrics`: []
+- `eval_do_concat_batches`: True
+- `fp16_backend`: auto
+- `push_to_hub_model_id`: None
+- `push_to_hub_organization`: None
+- `mp_parameters`:
+- `auto_find_batch_size`: False
+- `full_determinism`: False
+- `torchdynamo`: None
+- `ray_scope`: last
+- `ddp_timeout`: 1800
+- `torch_compile`: False
+- `torch_compile_backend`: None
+- `torch_compile_mode`: None
+- `include_tokens_per_second`: False
+- `include_num_input_tokens_seen`: False
+- `neftune_noise_alpha`: None
+- `optim_target_modules`: None
+- `batch_eval_metrics`: False
+- `eval_on_start`: True
+- `use_liger_kernel`: False
+- `liger_kernel_config`: None
+- `eval_use_gather_object`: False
+- `average_tokens_across_devices`: True
+- `prompts`: None
+- `batch_sampler`: no_duplicates
+- `multi_dataset_batch_sampler`: proportional
+- `router_mapping`: {}
+- `learning_rate_mapping`: {}
+
+</details>
+
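The non-default hyperparameters added in this hunk can be collected into a plain mapping, which is roughly the shape you would pass as keyword arguments to the trainer's training-arguments class. This is an illustrative reconstruction from the list in this card, not a config file shipped with the model:

```python
# Non-default training hyperparameters from this card, as a keyword-argument
# dict (illustrative; values transcribed from the README hunk above).
training_args = {
    "eval_strategy": "steps",
    "per_device_train_batch_size": 48,
    "per_device_eval_batch_size": 48,
    "learning_rate": 2e-4,
    "weight_decay": 1e-3,
    "num_train_epochs": 50,
    "warmup_ratio": 0.1,
    "load_best_model_at_end": True,
    "optim": "adamw_torch",
    "ddp_find_unused_parameters": False,
    "push_to_hub": True,
    "hub_model_id": "redis/langcache-reranker-v1-miniL6-softmnrl-triplet",
    "eval_on_start": True,
    "batch_sampler": "no_duplicates",
}

print(training_args["learning_rate"])  # 0.0002
```

With `load_best_model_at_end: True` and evaluation every 1000 steps, the checkpoint that is saved is the one with the best validation metric, not the final step (the bolded row in the training log below the hyperparameters).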
+### Training Logs
+<details><summary>Click to expand</summary>
+
+| Epoch | Step | Training Loss | Validation Loss | test_cls_average_precision |
+|:----------:|:---------:|:-------------:|:---------------:|:--------------------------:|
+| 0 | 0 | - | 0.3223 | 0.5734 |
+| 0.1322 | 1000 | 0.4286 | 0.3215 | 0.5735 |
+| 0.2644 | 2000 | 0.4241 | 0.3151 | 0.5743 |
+| 0.3966 | 3000 | 0.4182 | 0.3038 | 0.5759 |
+| 0.5288 | 4000 | 0.4036 | 0.2876 | 0.5782 |
+| 0.6609 | 5000 | 0.3919 | 0.2619 | 0.5830 |
+| 0.7931 | 6000 | 0.3694 | 0.2290 | 0.5914 |
+| 0.9253 | 7000 | 0.3481 | 0.1966 | 0.6029 |
+| 1.0575 | 8000 | 0.3109 | 0.1650 | 0.6253 |
+| 1.1897 | 9000 | 0.2665 | 0.1384 | 0.6591 |
+| 1.3219 | 10000 | 0.2281 | 0.1154 | 0.6921 |
+| 1.4541 | 11000 | 0.1984 | 0.0928 | 0.7189 |
+| 1.5863 | 12000 | 0.1794 | 0.0814 | 0.7287 |
+| 1.7184 | 13000 | 0.1619 | 0.0698 | 0.7398 |
+| 1.8506 | 14000 | 0.1498 | 0.0619 | 0.7506 |
+| 1.9828 | 15000 | 0.1409 | 0.0581 | 0.7653 |
+| 2.1150 | 16000 | 0.1315 | 0.0537 | 0.7760 |
+| 2.2472 | 17000 | 0.1239 | 0.0495 | 0.7809 |
+| 2.3794 | 18000 | 0.1157 | 0.0471 | 0.7804 |
+| 2.5116 | 19000 | 0.1093 | 0.0415 | 0.7912 |
+| 2.6438 | 20000 | 0.1026 | 0.0428 | 0.8006 |
+| 2.7759 | 21000 | 0.0958 | 0.0393 | 0.8013 |
+| 2.9081 | 22000 | 0.0922 | 0.0387 | 0.8152 |
+| 3.0403 | 23000 | 0.0873 | 0.0415 | 0.8117 |
+| 3.1725 | 24000 | 0.0823 | 0.0382 | 0.8130 |
+| 3.3047 | 25000 | 0.0807 | 0.0369 | 0.8141 |
+| 3.4369 | 26000 | 0.0772 | 0.0370 | 0.8275 |
+| 3.5691 | 27000 | 0.0734 | 0.0348 | 0.8197 |
+| 3.7013 | 28000 | 0.0709 | 0.0335 | 0.8242 |
+| 3.8334 | 29000 | 0.067 | 0.0363 | 0.8309 |
+| **3.9656** | **30000** | **0.0675** | **0.0359** | **0.8327** |
+| 4.0978 | 31000 | 0.0629 | 0.0337 | 0.8413 |
+| 4.2300 | 32000 | 0.0611 | 0.0350 | 0.8418 |
+| 4.3622 | 33000 | 0.0618 | 0.0372 | 0.8415 |
+| 4.4944 | 34000 | 0.0585 | 0.0341 | 0.8437 |
+| 4.6266 | 35000 | 0.0569 | 0.0364 | 0.8473 |
+| 4.7588 | 36000 | 0.055 | 0.0355 | 0.8430 |
+| 4.8909 | 37000 | 0.0529 | 0.0316 | 0.8468 |
+| 5.0231 | 38000 | 0.0522 | 0.0346 | 0.8454 |
+| 5.1553 | 39000 | 0.0501 | 0.0384 | 0.8493 |
+| 5.2875 | 40000 | 0.0503 | 0.0345 | 0.8527 |
+| 5.4197 | 41000 | 0.0487 | 0.0321 | 0.8542 |
+| 5.5519 | 42000 | 0.0465 | 0.0321 | 0.8475 |
+| 5.6841 | 43000 | 0.0453 | 0.0316 | 0.8487 |
+| 5.8163 | 44000 | 0.0426 | 0.0355 | 0.8554 |
+| 5.9484 | 45000 | 0.043 | 0.0329 | 0.8564 |
+| 6.0806 | 46000 | 0.0405 | 0.0358 | 0.8513 |
+| 6.2128 | 47000 | 0.0398 | 0.0345 | 0.8578 |
+| 6.3450 | 48000 | 0.0406 | 0.0336 | 0.8605 |
+| 6.4772 | 49000 | 0.0381 | 0.0324 | 0.8535 |
+| 6.6094 | 50000 | 0.0377 | 0.0322 | 0.8522 |
+| 6.7416 | 51000 | 0.0357 | 0.0321 | 0.8541 |
+| 6.8738 | 52000 | 0.035 | 0.0338 | 0.8633 |
+| 7.0059 | 53000 | 0.035 | 0.0348 | 0.8627 |
+| 7.1381 | 54000 | 0.033 | 0.0341 | 0.8650 |
+| 7.2703 | 55000 | 0.0347 | 0.0341 | 0.8621 |
+| 7.4025 | 56000 | 0.0339 | 0.0327 | 0.8629 |
+| 7.5347 | 57000 | 0.0325 | 0.0315 | 0.8559 |
+| 7.6669 | 58000 | 0.0313 | 0.0353 | 0.8616 |
+| 7.7991 | 59000 | 0.0305 | 0.0353 | 0.8622 |
+| 7.9313 | 60000 | 0.0296 | 0.0358 | 0.8613 |
+| 8.0635 | 61000 | 0.0292 | 0.0348 | 0.8652 |
+| 8.1956 | 62000 | 0.0301 | 0.0366 | 0.8660 |
+| 8.3278 | 63000 | 0.03 | 0.0336 | 0.8617 |
+| 8.4600 | 64000 | 0.0287 | 0.0336 | 0.8649 |
+| 8.5922 | 65000 | 0.0279 | 0.0315 | 0.8628 |
+| 8.7244 | 66000 | 0.027 | 0.0322 | 0.8559 |
+| 8.8566 | 67000 | 0.026 | 0.0336 | 0.8623 |
+| 8.9888 | 68000 | 0.0268 | 0.0369 | 0.8628 |
+| 9.1210 | 69000 | 0.0259 | 0.0333 | 0.8651 |
+| 9.2531 | 70000 | 0.0261 | 0.0350 | 0.8682 |
+| 9.3853 | 71000 | 0.0261 | 0.0332 | 0.8692 |
+| 9.5175 | 72000 | 0.0253 | 0.0336 | 0.8659 |
+| 9.6497 | 73000 | 0.0252 | 0.0342 | 0.8673 |
+| 9.7819 | 74000 | 0.0243 | 0.0348 | 0.8613 |
+| 9.9141 | 75000 | 0.0244 | 0.0338 | 0.8647 |
+| 10.0463 | 76000 | 0.0238 | 0.0349 | 0.8672 |
+| 10.1785 | 77000 | 0.0239 | 0.0359 | 0.8674 |
+| 10.3106 | 78000 | 0.0241 | 0.0337 | 0.8696 |
+| 10.4428 | 79000 | 0.0236 | 0.0349 | 0.8687 |
+| 10.5750 | 80000 | 0.0234 | 0.0348 | 0.8648 |
+| 10.7072 | 81000 | 0.0225 | 0.0345 | 0.8677 |
+| 10.8394 | 82000 | 0.0217 | 0.0354 | 0.8695 |
+| 10.9716 | 83000 | 0.0226 | 0.0339 | 0.8702 |
+| 11.1038 | 84000 | 0.0215 | 0.0354 | 0.8717 |
+| 11.2360 | 85000 | 0.022 | 0.0364 | 0.8687 |
+| 11.3681 | 86000 | 0.022 | 0.0348 | 0.8740 |
+| 11.5003 | 87000 | 0.0217 | 0.0353 | 0.8675 |
+| 11.6325 | 88000 | 0.0221 | 0.0338 | 0.8678 |
+| 11.7647 | 89000 | 0.0213 | 0.0324 | 0.8697 |
+| 11.8969 | 90000 | 0.021 | 0.0336 | 0.8668 |
+| 12.0291 | 91000 | 0.0206 | 0.0352 | 0.8675 |
+| 12.1613 | 92000 | 0.0203 | 0.0344 | 0.8710 |
+| 12.2935 | 93000 | 0.0207 | 0.0349 | 0.8675 |
+| 12.4256 | 94000 | 0.0206 | 0.0339 | 0.8676 |
+| 12.5578 | 95000 | 0.0199 | 0.0342 | 0.8732 |
+| 12.6900 | 96000 | 0.0202 | 0.0323 | 0.8664 |
+| 12.8222 | 97000 | 0.0192 | 0.0357 | 0.8688 |
+| 12.9544 | 98000 | 0.0196 | 0.0359 | 0.8713 |
+| 13.0866 | 99000 | 0.0196 | 0.0357 | 0.8687 |
+| 13.2188 | 100000 | 0.0195 | 0.0347 | 0.8659 |
+| 13.3510 | 101000 | 0.0198 | 0.0343 | 0.8702 |
+| 13.4831 | 102000 | 0.0192 | 0.0329 | 0.8689 |
+| 13.6153 | 103000 | 0.0191 | 0.0336 | 0.8679 |
+| 13.7475 | 104000 | 0.0186 | 0.0326 | 0.8674 |
+| 13.8797 | 105000 | 0.0183 | 0.0338 | 0.8687 |
+| 14.0119 | 106000 | 0.0186 | 0.0346 | 0.8689 |
+| 14.1441 | 107000 | 0.0177 | 0.0357 | 0.8717 |
+| 14.2763 | 108000 | 0.0193 | 0.0344 | 0.8733 |
+| 14.4085 | 109000 | 0.0186 | 0.0323 | 0.8742 |
+| 14.5406 | 110000 | 0.018 | 0.0336 | 0.8722 |
+| 14.6728 | 111000 | 0.0177 | 0.0353 | 0.8705 |
+| 14.8050 | 112000 | 0.0176 | 0.0338 | 0.8678 |
+| 14.9372 | 113000 | 0.0178 | 0.0348 | 0.8698 |
+| 15.0694 | 114000 | 0.017 | 0.0353 | 0.8702 |
+| 15.2016 | 115000 | 0.0181 | 0.0349 | 0.8721 |
+| 15.3338 | 116000 | 0.0182 | 0.0341 | 0.8705 |
+| 15.4660 | 117000 | 0.0171 | 0.0343 | 0.8715 |
+| 15.5981 | 118000 | 0.0176 | 0.0341 | 0.8696 |
+| 15.7303 | 119000 | 0.0173 | 0.0336 | 0.8706 |
+| 15.8625 | 120000 | 0.0161 | 0.0342 | 0.8715 |
+| 15.9947 | 121000 | 0.0174 | 0.0349 | 0.8701 |
+| 16.1269 | 122000 | 0.0171 | 0.0341 | 0.8715 |
+| 16.2591 | 123000 | 0.0171 | 0.0342 | 0.8720 |
+| 16.3913 | 124000 | 0.0174 | 0.0336 | 0.8726 |
+| 16.5235 | 125000 | 0.0167 | 0.0339 | 0.8694 |
+| 16.6557 | 126000 | 0.0169 | 0.0344 | 0.8671 |
+| 16.7878 | 127000 | 0.016 | 0.0341 | 0.8666 |
+| 16.9200 | 128000 | 0.0163 | 0.0342 | 0.8696 |
+| 17.0522 | 129000 | 0.0163 | 0.0342 | 0.8687 |
+| 17.1844 | 130000 | 0.0163 | 0.0347 | 0.8709 |
+| 17.3166 | 131000 | 0.017 | 0.0335 | 0.8719 |
+| 17.4488 | 132000 | 0.0166 | 0.0337 | 0.8699 |
+| 17.5810 | 133000 | 0.0165 | 0.0334 | 0.8706 |
+| 17.7132 | 134000 | 0.0157 | 0.0334 | 0.8709 |
+| 17.8453 | 135000 | 0.0154 | 0.0345 | 0.8718 |
+| 17.9775 | 136000 | 0.0159 | 0.0340 | 0.8719 |
+| 18.1097 | 137000 | 0.0156 | 0.0338 | 0.8697 |
+| 18.2419 | 138000 | 0.0162 | 0.0333 | 0.8696 |
+| 18.3741 | 139000 | 0.0161 | 0.0337 | 0.8712 |
+| 18.5063 | 140000 | 0.0161 | 0.0345 | 0.8681 |
+| 18.6385 | 141000 | 0.0163 | 0.0331 | 0.8708 |
+| 18.7707 | 142000 | 0.015 | 0.0336 | 0.8711 |
+| 18.9028 | 143000 | 0.0153 | 0.0350 | 0.8697 |
+| 19.0350 | 144000 | 0.0152 | 0.0355 | 0.8690 |
+| 19.1672 | 145000 | 0.0158 | 0.0354 | 0.8692 |
+| 19.2994 | 146000 | 0.0158 | 0.0345 | 0.8709 |
+| 19.4316 | 147000 | 0.0161 | 0.0327 | 0.8721 |
+| 19.5638 | 148000 | 0.0155 | 0.0335 | 0.8701 |
+| 19.6960 | 149000 | 0.015 | 0.0330 | 0.8694 |
+| 19.8282 | 150000 | 0.0143 | 0.0339 | 0.8693 |
+| 19.9603 | 151000 | 0.0156 | 0.0340 | 0.8694 |
+| 20.0925 | 152000 | 0.0149 | 0.0337 | 0.8700 |
+| 20.2247 | 153000 | 0.0154 | 0.0334 | 0.8703 |
+| 20.3569 | 154000 | 0.0155 | 0.0337 | 0.8709 |
+| 20.4891 | 155000 | 0.0156 | 0.0335 | 0.8706 |
+| 20.6213 | 156000 | 0.0153 | 0.0337 | 0.8696 |
+| 20.7535 | 157000 | 0.0149 | 0.0328 | 0.8687 |
+| 20.8857 | 158000 | 0.0144 | 0.0331 | 0.8700 |
+| 21.0178 | 159000 | 0.0148 | 0.0339 | 0.8698 |
+| 21.1500 | 160000 | 0.0152 | 0.0331 | 0.8705 |
+| 21.2822 | 161000 | 0.0156 | 0.0333 | 0.8706 |
+| 21.4144 | 162000 | 0.0147 | 0.0328 | 0.8715 |
+| 21.5466 | 163000 | 0.0148 | 0.0335 | 0.8717 |
+| 21.6788 | 164000 | 0.0145 | 0.0342 | 0.8715 |
+| 21.8110 | 165000 | 0.0142 | 0.0336 | 0.8711 |
+| 21.9432 | 166000 | 0.0141 | 0.0346 | 0.8708 |
+| 22.0753 | 167000 | 0.0148 | 0.0344 | 0.8705 |
+| 22.2075 | 168000 | 0.0151 | 0.0335 | 0.8696 |
+| 22.3397 | 169000 | 0.0147 | 0.0344 | 0.8708 |
+| 22.4719 | 170000 | 0.0145 | 0.0343 | 0.8715 |
+| 22.6041 | 171000 | 0.0144 | 0.0331 | 0.8717 |
+| 22.7363 | 172000 | 0.014 | 0.0333 | 0.8712 |
+| 22.8685 | 173000 | 0.0142 | 0.0341 | 0.8718 |
+| 23.0007 | 174000 | 0.015 | 0.0344 | 0.8710 |
+| 23.1328 | 175000 | 0.0141 | 0.0337 | 0.8701 |
+| 23.2650 | 176000 | 0.0146 | 0.0336 | 0.8718 |
+| 23.3972 | 177000 | 0.0143 | 0.0338 | 0.8725 |
+| 23.5294 | 178000 | 0.0147 | 0.0330 | 0.8725 |
+| 23.6616 | 179000 | 0.0141 | 0.0334 | 0.8719 |
+| 23.7938 | 180000 | 0.0142 | 0.0329 | 0.8714 |
+| 23.9260 | 181000 | 0.014 | 0.0338 | 0.8712 |
+| 24.0582 | 182000 | 0.0141 | 0.0334 | 0.8719 |
+| 24.1904 | 183000 | 0.0143 | 0.0350 | 0.8718 |
+| 24.3225 | 184000 | 0.0144 | 0.0340 | 0.8723 |
+| 24.4547 | 185000 | 0.015 | 0.0330 | 0.8727 |
+| 24.5869 | 186000 | 0.0144 | 0.0341 | 0.8723 |
+| 24.7191 | 187000 | 0.0143 | 0.0332 | 0.8722 |
+| 24.8513 | 188000 | 0.014 | 0.0345 | 0.8722 |
+| 24.9835 | 189000 | 0.0141 | 0.0353 | 0.8709 |
+| 25.1157 | 190000 | 0.0137 | 0.0349 | 0.8719 |
+| 25.2479 | 191000 | 0.0142 | 0.0345 | 0.8711 |
+| 25.3800 | 192000 | 0.0143 | 0.0334 | 0.8716 |
+| 25.5122 | 193000 | 0.0137 | 0.0332 | 0.8717 |
+| 25.6444 | 194000 | 0.0143 | 0.0339 | 0.8720 |
+| 25.7766 | 195000 | 0.0136 | 0.0338 | 0.8703 |
+| 25.9088 | 196000 | 0.0134 | 0.0333 | 0.8710 |
+| 26.0410 | 197000 | 0.0136 | 0.0350 | 0.8708 |
+| 26.1732 | 198000 | 0.0136 | 0.0345 | 0.8709 |
+| 26.3054 | 199000 | 0.0142 | 0.0340 | 0.8714 |
+| 26.4375 | 200000 | 0.0141 | 0.0335 | 0.8722 |
+| 26.5697 | 201000 | 0.0146 | 0.0343 | 0.8717 |
+| 26.7019 | 202000 | 0.0136 | 0.0341 | 0.8713 |
+| 26.8341 | 203000 | 0.0131 | 0.0348 | 0.8715 |
+| 26.9663 | 204000 | 0.014 | 0.0345 | 0.8706 |
+| 27.0985 | 205000 | 0.0135 | 0.0349 | 0.8715 |
+| 27.2307 | 206000 | 0.0135 | 0.0337 | 0.8713 |
+| 27.3629 | 207000 | 0.0146 | 0.0334 | 0.8717 |
+| 27.4950 | 208000 | 0.0138 | 0.0337 | 0.8717 |
+| 27.6272 | 209000 | 0.0136 | 0.0331 | 0.8723 |
+| 27.7594 | 210000 | 0.0133 | 0.0343 | 0.8717 |
+| 27.8916 | 211000 | 0.0137 | 0.0341 | 0.8722 |
+| 28.0238 | 212000 | 0.0132 | 0.0340 | 0.8718 |
+| 28.1560 | 213000 | 0.0136 | 0.0344 | 0.8720 |
+| 28.2882 | 214000 | 0.0143 | 0.0337 | 0.8720 |
+| 28.4204 | 215000 | 0.0136 | 0.0340 | 0.8729 |
+| 28.5525 | 216000 | 0.014 | 0.0334 | 0.8721 |
+| 28.6847 | 217000 | 0.0131 | 0.0338 | 0.8726 |
+| 28.8169 | 218000 | 0.0131 | 0.0337 | 0.8726 |
+| 28.9491 | 219000 | 0.0136 | 0.0346 | 0.8726 |
+| 29.0813 | 220000 | 0.0132 | 0.0347 | 0.8721 |
+| 29.2135 | 221000 | 0.0136 | 0.0344 | 0.8719 |
+| 29.3457 | 222000 | 0.0137 | 0.0345 | 0.8724 |
+| 29.4779 | 223000 | 0.0138 | 0.0337 | 0.8723 |
+| 29.6100 | 224000 | 0.013 | 0.0337 | 0.8724 |
+| 29.7422 | 225000 | 0.0134 | 0.0343 | 0.8724 |
+| 29.8744 | 226000 | 0.0132 | 0.0338 | 0.8720 |
+| 30.0066 | 227000 | 0.0133 | 0.0335 | 0.8721 |
+| 30.1388 | 228000 | 0.013 | 0.0340 | 0.8715 |
+| 30.2710 | 229000 | 0.0144 | 0.0332 | 0.8726 |
+| 30.4032 | 230000 | 0.014 | 0.0346 | 0.8726 |
+| 30.5354 | 231000 | 0.0137 | 0.0330 | 0.8726 |
+| 30.6675 | 232000 | 0.0131 | 0.0342 | 0.8724 |
+| 30.7997 | 233000 | 0.0128 | 0.0337 | 0.8718 |
+| 30.9319 | 234000 | 0.0135 | 0.0342 | 0.8723 |
+| 31.0641 | 235000 | 0.0138 | 0.0346 | 0.8722 |
+| 31.1963 | 236000 | 0.0133 | 0.0347 | 0.8721 |
+| 31.3285 | 237000 | 0.0137 | 0.0335 | 0.8723 |
+| 31.4607 | 238000 | 0.0137 | 0.0337 | 0.8725 |
+| 31.5929 | 239000 | 0.0131 | 0.0340 | 0.8723 |
+| 31.7250 | 240000 | 0.0129 | 0.0334 | 0.8725 |
+| 31.8572 | 241000 | 0.0133 | 0.0336 | 0.8731 |
+| 31.9894 | 242000 | 0.0137 | 0.0343 | 0.8726 |
+| 32.1216 | 243000 | 0.0132 | 0.0329 | 0.8722 |
+| 32.2538 | 244000 | 0.0135 | 0.0338 | 0.8724 |
+| 32.3860 | 245000 | 0.0129 | 0.0344 | 0.8724 |
+| 32.5182 | 246000 | 0.0136 | 0.0342 | 0.8724 |
+| 32.6504 | 247000 | 0.0133 | 0.0331 | 0.8720 |
+| 32.7826 | 248000 | 0.0128 | 0.0337 | 0.8718 |
+| 32.9147 | 249000 | 0.0127 | 0.0338 | 0.8717 |
+| 33.0469 | 250000 | 0.013 | 0.0328 | 0.8717 |
+| 33.1791 | 251000 | 0.0135 | 0.0337 | 0.8723 |
+| 33.3113 | 252000 | 0.0131 | 0.0334 | 0.8722 |
+| 33.4435 | 253000 | 0.0134 | 0.0339 | 0.8723 |
+| 33.5757 | 254000 | 0.0135 | 0.0338 | 0.8724 |
+| 33.7079 | 255000 | 0.013 | 0.0341 | 0.8722 |
+| 33.8401 | 256000 | 0.0126 | 0.0334 | 0.8720 |
+| 33.9722 | 257000 | 0.0136 | 0.0338 | 0.8719 |
+| 34.1044 | 258000 | 0.0123 | 0.0338 | 0.8722 |
+| 34.2366 | 259000 | 0.0135 | 0.0336 | 0.8719 |
+| 34.3688 | 260000 | 0.0136 | 0.0343 | 0.8724 |
+| 34.5010 | 261000 | 0.0134 | 0.0341 | 0.8722 |
+| 34.6332 | 262000 | 0.0136 | 0.0343 | 0.8722 |
+| 34.7654 | 263000 | 0.0131 | 0.0344 | 0.8719 |
+| 34.8976 | 264000 | 0.0128 | 0.0343 | 0.8719 |
+| 35.0297 | 265000 | 0.0129 | 0.0336 | 0.8718 |
+| 35.1619 | 266000 | 0.0128 | 0.0334 | 0.8720 |
+| 35.2941 | 267000 | 0.013 | 0.0340 | 0.8719 |
+| 35.4263 | 268000 | 0.0133 | 0.0341 | 0.8723 |
+| 35.5585 | 269000 | 0.0132 | 0.0331 | 0.8723 |
+| 35.6907 | 270000 | 0.0127 | 0.0335 | 0.8721 |
+| 35.8229 | 271000 | 0.0123 | 0.0334 | 0.8721 |
+| 35.9551 | 272000 | 0.0135 | 0.0343 | 0.8721 |
+| 36.0872 | 273000 | 0.0125 | 0.0345 | 0.8721 |
+| 36.2194 | 274000 | 0.0134 | 0.0336 | 0.8719 |
+| 36.3516 | 275000 | 0.0132 | 0.0338 | 0.8720 |
+| 36.4838 | 276000 | 0.0136 | 0.0331 | 0.8722 |
+| 36.6160 | 277000 | 0.0133 | 0.0335 | 0.8726 |
+| 36.7482 | 278000 | 0.0125 | 0.0336 | 0.8726 |
+| 36.8804 | 279000 | 0.0122 | 0.0344 | 0.8724 |
+| 37.0126 | 280000 | 0.013 | 0.0336 | 0.8722 |
+| 37.1447 | 281000 | 0.0132 | 0.0333 | 0.8724 |
+| 37.2769 | 282000 | 0.0137 | 0.0333 | 0.8722 |
+| 37.4091 | 283000 | 0.0133 | 0.0339 | 0.8723 |
+| 37.5413 | 284000 | 0.013 | 0.0335 | 0.8723 |
+| 37.6735 | 285000 | 0.0129 | 0.0329 | 0.8723 |
+| 37.8057 | 286000 | 0.013 | 0.0327 | 0.8722 |
+| 37.9379 | 287000 | 0.0124 | 0.0338 | 0.8722 |
+| 38.0701 | 288000 | 0.0131 | 0.0338 | 0.8721 |
+| 38.2022 | 289000 | 0.0129 | 0.0342 | 0.8723 |
+| 38.3344 | 290000 | 0.013 | 0.0336 | 0.8720 |
+| 38.4666 | 291000 | 0.0134 | 0.0335 | 0.8721 |
+| 38.5988 | 292000 | 0.0129 | 0.0338 | 0.8722 |
+| 38.7310 | 293000 | 0.0122 | 0.0337 | 0.8722 |
+| 38.8632 | 294000 | 0.0123 | 0.0338 | 0.8722 |
+| 38.9954 | 295000 | 0.0132 | 0.0335 | 0.8722 |
+| 39.1276 | 296000 | 0.0128 | 0.0333 | 0.8720 |
+| 39.2597 | 297000 | 0.0135 | 0.0336 | 0.8720 |
+| 39.3919 | 298000 | 0.0132 | 0.0342 | 0.8721 |
+| 39.5241 | 299000 | 0.0136 | 0.0328 | 0.8719 |
+| 39.6563 | 300000 | 0.0125 | 0.0339 | 0.8720 |
+| 39.7885 | 301000 | 0.0125 | 0.0343 | 0.8720 |
+| 39.9207 | 302000 | 0.0126 | 0.0339 | 0.8721 |
+| 40.0529 | 303000 | 0.0129 | 0.0338 | 0.8720 |
+| 40.1851 | 304000 | 0.0133 | 0.0334 | 0.8719 |
+| 40.3173 | 305000 | 0.0134 | 0.0336 | 0.8719 |
+| 40.4494 | 306000 | 0.0127 | 0.0336 | 0.8720 |
+| 40.5816 | 307000 | 0.0126 | 0.0342 | 0.8721 |
+| 40.7138 | 308000 | 0.013 | 0.0340 | 0.8721 |
+| 40.8460 | 309000 | 0.013 | 0.0332 | 0.8721 |
+| 40.9782 | 310000 | 0.0129 | 0.0337 | 0.8720 |
+| 41.1104 | 311000 | 0.0123 | 0.0328 | 0.8721 |
+| 41.2426 | 312000 | 0.013 | 0.0336 | 0.8721 |
+| 41.3748 | 313000 | 0.0132 | 0.0337 | 0.8721 |
+| 41.5069 | 314000 | 0.0132 | 0.0335 | 0.8722 |
+| 41.6391 | 315000 | 0.0131 | 0.0343 | 0.8721 |
+| 41.7713 | 316000 | 0.0122 | 0.0339 | 0.8720 |
+| 41.9035 | 317000 | 0.0125 | 0.0340 | 0.8720 |
+| 42.0357 | 318000 | 0.0122 | 0.0342 | 0.8720 |
+| 42.1679 | 319000 | 0.0129 | 0.0337 | 0.8720 |
+| 42.3001 | 320000 | 0.013 | 0.0330 | 0.8720 |
+| 42.4323 | 321000 | 0.013 | 0.0332 | 0.8720 |
+| 42.5644 | 322000 | 0.0141 | 0.0349 | 0.8721 |
+| 42.6966 | 323000 | 0.013 | 0.0334 | 0.8721 |
+| 42.8288 | 324000 | 0.0125 | 0.0339 | 0.8720 |
+| 42.9610 | 325000 | 0.0126 | 0.0342 | 0.8720 |
+| 43.0932 | 326000 | 0.0127 | 0.0339 | 0.8720 |
+| 43.2254 | 327000 | 0.0126 | 0.0330 | 0.8720 |
+| 43.3576 | 328000 | 0.013 | 0.0343 | 0.8720 |
+| 43.4898 | 329000 | 0.0135 | 0.0334 | 0.8720 |
+| 43.6219 | 330000 | 0.0131 | 0.0327 | 0.8721 |
+| 43.7541 | 331000 | 0.0124 | 0.0334 | 0.8721 |
+| 43.8863 | 332000 | 0.0126 | 0.0344 | 0.8721 |
+| 44.0185 | 333000 | 0.0131 | 0.0338 | 0.8721 |
+| 44.1507 | 334000 | 0.0121 | 0.0340 | 0.8721 |
+| 44.2829 | 335000 | 0.0131 | 0.0336 | 0.8721 |
+| 44.4151 | 336000 | 0.0135 | 0.0340 | 0.8721 |
+| 44.5473 | 337000 | 0.0131 | 0.0335 | 0.8721 |
+| 44.6794 | 338000 | 0.0132 | 0.0340 | 0.8721 |
+| 44.8116 | 339000 | 0.0128 | 0.0333 | 0.8721 |
+| 44.9438 | 340000 | 0.0124 | 0.0333 | 0.8721 |
+| 45.0760 | 341000 | 0.0131 | 0.0337 | 0.8720 |
+| 45.2082 | 342000 | 0.0129 | 0.0341 | 0.8721 |
+| 45.3404 | 343000 | 0.0133 | 0.0335 | 0.8721 |
+| 45.4726 | 344000 | 0.0133 | 0.0341 | 0.8721 |
+| 45.6048 | 345000 | 0.013 | 0.0334 | 0.8721 |
+| 45.7369 | 346000 | 0.0129 | 0.0343 | 0.8721 |
+| 45.8691 | 347000 | 0.0125 | 0.0335 | 0.8721 |
+| 46.0013 | 348000 | 0.0133 | 0.0344 | 0.8721 |
+| 46.1335 | 349000 | 0.013 | 0.0332 | 0.8720 |
+| 46.2657 | 350000 | 0.0128 | 0.0337 | 0.8721 |
+| 46.3979 | 351000 | 0.0132 | 0.0334 | 0.8721 |
+| 46.5301 | 352000 | 0.0127 | 0.0343 | 0.8721 |
+| 46.6623 | 353000 | 0.0127 | 0.0334 | 0.8721 |
+| 46.7944 | 354000 | 0.0126 | 0.0332 | 0.8720 |
+| 46.9266 | 355000 | 0.013 | 0.0339 | 0.8721 |
+| 47.0588 | 356000 | 0.0126 | 0.0340 | 0.8721 |
+| 47.1910 | 357000 | 0.0132 | 0.0336 | 0.8721 |
+| 47.3232 | 358000 | 0.0138 | 0.0334 | 0.8721 |
+| 47.4554 | 359000 | 0.0133 | 0.0336 | 0.8720 |
+| 47.5876 | 360000 | 0.0135 | 0.0340 | 0.8720 |
+| 47.7198 | 361000 | 0.0129 | 0.0341 | 0.8721 |
+| 47.8519 | 362000 | 0.0123 | 0.0334 | 0.8721 |
+| 47.9841 | 363000 | 0.0126 | 0.0334 | 0.8721 |
+| 48.1163 | 364000 | 0.0121 | 0.0337 | 0.8721 |
+| 48.2485 | 365000 | 0.0127 | 0.0342 | 0.8720 |
+| 48.3807 | 366000 | 0.0124 | 0.0336 | 0.8721 |
+| 48.5129 | 367000 | 0.0125 | 0.0338 | 0.8721 |
+| 48.6451 | 368000 | 0.0125 | 0.0341 | 0.8720 |
+| 48.7773 | 369000 | 0.0122 | 0.0333 | 0.8721 |
+| 48.9095 | 370000 | 0.0123 | 0.0336 | 0.8721 |
+| 49.0416 | 371000 | 0.0124 | 0.0341 | 0.8720 |
+| 49.1738 | 372000 | 0.0132 | 0.0330 | 0.8720 |
+| 49.3060 | 373000 | 0.0128 | 0.0342 | 0.8720 |
+| 49.4382 | 374000 | 0.0132 | 0.0341 | 0.8720 |
+| 49.5704 | 375000 | 0.013 | 0.0334 | 0.8721 |
+| 49.7026 | 376000 | 0.0126 | 0.0340 | 0.8720 |
+| 49.8348 | 377000 | 0.0126 | 0.0337 | 0.8720 |
+| 49.9670 | 378000 | 0.0131 | 0.0337 | 0.8720 |
+
+* The bold row denotes the saved checkpoint.
+</details>
+

 ### Framework Versions
 - Python: 3.12.3
 - Sentence Transformers: 5.1.0
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:5c0a5cc2b8c443ebf2a4eecb674858aa7635fb8bce8a55bae03fc3c84855617b
 size 45439314