Aditeya Baral committed on
Commit 47e0e27 · verified · 1 Parent(s): ee21ae0

Add new CrossEncoder model

Files changed (2):
  1. README.md +433 -17
  2. model.safetensors +1 -1
README.md CHANGED
@@ -66,25 +66,25 @@ model-index:
  type: test
  metrics:
  - type: accuracy
- value: 0.7230292965285952
  name: Accuracy
  - type: accuracy_threshold
- value: 0.9352303147315979
  name: Accuracy Threshold
  - type: f1
- value: 0.7144263194410831
  name: F1
  - type: f1_threshold
- value: 0.9142870903015137
  name: F1 Threshold
  - type: precision
- value: 0.6302559284880577
  name: Precision
  - type: recall
- value: 0.8245437616387337
  name: Recall
  - type: average_precision
- value: 0.6906882331078481
  name: Average Precision
  ---

@@ -188,13 +188,13 @@ You can finetune this model on your own dataset.

  | Metric | val | test |
  |:----------------------|:-----------|:-----------|
- | accuracy | 0.7718 | 0.723 |
- | accuracy_threshold | 0.8927 | 0.9352 |
- | f1 | 0.6934 | 0.7144 |
- | f1_threshold | 0.8759 | 0.9143 |
- | precision | 0.6788 | 0.6303 |
- | recall | 0.7086 | 0.8245 |
- | **average_precision** | **0.7676** | **0.6907** |

  <!--
  ## Bias, Risks and Limitations
@@ -262,11 +262,427 @@ You can finetune this model on your own dataset.
  }
  ```
  ### Training Logs
- | Epoch | Step | val_average_precision | test_average_precision |
- |:-----:|:----:|:---------------------:|:----------------------:|
- | -1 | -1 | 0.7676 | 0.6907 |

  ### Framework Versions
  - Python: 3.12.3
 
  type: test
  metrics:
  - type: accuracy
+ value: 0.8962125731607037
  name: Accuracy
  - type: accuracy_threshold
+ value: 0.7501106262207031
  name: Accuracy Threshold
  - type: f1
+ value: 0.8816524206227297
  name: F1
  - type: f1_threshold
+ value: 0.0708337128162384
  name: F1 Threshold
  - type: precision
+ value: 0.8563204268613753
  name: Precision
  - type: recall
+ value: 0.9085288640595903
  name: Recall
  - type: average_precision
+ value: 0.9302702993716566
  name: Average Precision
  ---
 

  | Metric | val | test |
  |:----------------------|:-----------|:-----------|
+ | accuracy | 0.7718 | 0.8962 |
+ | accuracy_threshold | 0.8927 | 0.7501 |
+ | f1 | 0.6934 | 0.8817 |
+ | f1_threshold | 0.8759 | 0.0708 |
+ | precision | 0.6788 | 0.8563 |
+ | recall | 0.7086 | 0.9085 |
+ | **average_precision** | **0.7676** | **0.9303** |
 
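The `accuracy_threshold` and `f1_threshold` rows are, per the sentence-transformers binary-classification evaluation convention, the score cutoffs that maximize the corresponding metric on the labeled pairs. A minimal sketch of that search, over toy scores rather than the real evaluation set:

```python
# Sketch: recovering a *_threshold metric (e.g. accuracy_threshold) from raw
# reranker scores on labeled pairs. Toy data, not the actual eval split.
def best_threshold(scores, labels, metric="accuracy"):
    """Scan every distinct score as a cutoff; return (best value, cutoff)."""
    best_value, best_cut = -1.0, 0.0
    for cut in sorted(set(scores)):
        preds = [1 if s >= cut else 0 for s in scores]
        tp = sum(p and l for p, l in zip(preds, labels))
        fp = sum(p and not l for p, l in zip(preds, labels))
        fn = sum((not p) and l for p, l in zip(preds, labels))
        if metric == "accuracy":
            value = sum(p == l for p, l in zip(preds, labels)) / len(labels)
        else:  # f1
            value = 2 * tp / (2 * tp + fp + fn) if tp else 0.0
        if value > best_value:
            best_value, best_cut = value, cut
    return best_value, best_cut

acc, cut = best_threshold([0.95, 0.91, 0.40, 0.07, 0.88, 0.10],
                          [1, 1, 0, 0, 1, 0])
```

This is why the test-set `f1_threshold` (0.0708) can legitimately sit far below the `accuracy_threshold` (0.7501): each metric picks its own optimal cutoff.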
  <!--
  ## Bias, Risks and Limitations
 
  }
  ```

+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 48
+ - `per_device_eval_batch_size`: 48
+ - `learning_rate`: 0.0002
+ - `num_train_epochs`: 50
+ - `warmup_steps`: 100
+ - `load_best_model_at_end`: True
+ - `optim`: adamw_torch
+ - `push_to_hub`: True
+ - `hub_model_id`: aditeyabaral-redis/langcache-reranker-v1-test2
+
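A quick sanity check on what these settings imply: the training log ends at step 275,000 after 50 epochs, so each epoch is 5,500 optimizer steps, which at a per-device batch size of 48 puts the training set at roughly 264k pairs (assuming a single device; `dataloader_drop_last: True` discards the final partial batch, so the true count may be slightly higher):

```python
# Rough arithmetic implied by the hyperparameters and the training log.
# Single-device assumption; drop_last means the real dataset may be a bit larger.
total_steps = 275_000   # last step in the training log (epoch 50.0)
epochs = 50             # num_train_epochs
batch_size = 48         # per_device_train_batch_size

steps_per_epoch = total_steps // epochs
approx_train_pairs = steps_per_epoch * batch_size
```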
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: steps
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 48
+ - `per_device_eval_batch_size`: 48
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 1
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 0.0002
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 50
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.0
+ - `warmup_steps`: 100
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: False
+ - `fp16`: False
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 1
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: True
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: True
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: True
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: aditeyabaral-redis/langcache-reranker-v1-test2
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `hub_revision`: None
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`:
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `liger_kernel_config`: None
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: True
+ - `prompts`: None
+ - `batch_sampler`: batch_sampler
+ - `multi_dataset_batch_sampler`: proportional
+ - `router_mapping`: {}
+ - `learning_rate_mapping`: {}
+
+ </details>
+
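The scheduler settings above (`lr_scheduler_type: linear`, `warmup_steps: 100`, peak `learning_rate: 0.0002`) imply a ramp-then-decay learning-rate shape. A minimal sketch of that schedule, taking the 275,000 total steps from the training log:

```python
# Sketch of the linear warmup + linear decay schedule implied by the
# hyperparameters (total_steps taken from the training log's final step).
def linear_schedule_lr(step, base_lr=2e-4, warmup_steps=100, total_steps=275_000):
    """Linear warmup from 0 to base_lr, then linear decay back to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

With warmup this short relative to the run, the schedule is effectively a linear decay from 2e-4 over 50 epochs.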
  ### Training Logs
+ <details><summary>Click to expand</summary>
+
+ | Epoch | Step | Training Loss | Validation Loss | val_average_precision | test_average_precision |
+ |:-----------:|:----------:|:-------------:|:---------------:|:---------------------:|:----------------------:|
+ | -1 | -1 | - | - | 0.7676 | 0.6907 |
+ | 0.1818 | 1000 | 0.3058 | 0.3991 | - | 0.8791 |
+ | 0.3636 | 2000 | 0.2417 | 0.3729 | - | 0.8978 |
+ | 0.5455 | 3000 | 0.2287 | 0.3356 | - | 0.9178 |
+ | 0.7273 | 4000 | 0.2183 | 0.3729 | - | 0.9164 |
+ | 0.9091 | 5000 | 0.212 | 0.3404 | - | 0.9229 |
+ | 1.0909 | 6000 | 0.1999 | 0.3171 | - | 0.9189 |
+ | 1.2727 | 7000 | 0.1944 | 0.3131 | - | 0.9243 |
+ | 1.4545 | 8000 | 0.1933 | 0.3116 | - | 0.9230 |
+ | 1.6364 | 9000 | 0.193 | 0.3211 | - | 0.9238 |
+ | 1.8182 | 10000 | 0.1879 | 0.2984 | - | 0.9311 |
+ | 2.0 | 11000 | 0.1869 | 0.2983 | - | 0.9331 |
+ | 2.1818 | 12000 | 0.1726 | 0.3009 | - | 0.9311 |
+ | 2.3636 | 13000 | 0.1725 | 0.3058 | - | 0.9293 |
+ | 2.5455 | 14000 | 0.1743 | 0.2991 | - | 0.9303 |
+ | 2.7273 | 15000 | 0.1724 | 0.2875 | - | 0.9332 |
+ | 2.9091 | 16000 | 0.173 | 0.2872 | - | 0.9342 |
+ | 3.0909 | 17000 | 0.1652 | 0.2840 | - | 0.9372 |
+ | 3.2727 | 18000 | 0.1597 | 0.2825 | - | 0.9335 |
+ | 3.4545 | 19000 | 0.1613 | 0.2988 | - | 0.9373 |
+ | 3.6364 | 20000 | 0.1587 | 0.2868 | - | 0.9333 |
+ | 3.8182 | 21000 | 0.16 | 0.2863 | - | 0.9379 |
+ | 4.0 | 22000 | 0.1599 | 0.2853 | - | 0.9379 |
+ | 4.1818 | 23000 | 0.146 | 0.2876 | - | 0.9346 |
+ | 4.3636 | 24000 | 0.1481 | 0.2813 | - | 0.9382 |
+ | 4.5455 | 25000 | 0.1477 | 0.2831 | - | 0.9386 |
+ | 4.7273 | 26000 | 0.148 | 0.2769 | - | 0.9362 |
+ | 4.9091 | 27000 | 0.1485 | 0.2758 | - | 0.9423 |
+ | 5.0909 | 28000 | 0.142 | 0.2771 | - | 0.9362 |
+ | 5.2727 | 29000 | 0.1359 | 0.2878 | - | 0.9387 |
+ | 5.4545 | 30000 | 0.1368 | 0.2777 | - | 0.9399 |
+ | 5.6364 | 31000 | 0.1397 | 0.2797 | - | 0.9412 |
+ | 5.8182 | 32000 | 0.1395 | 0.2771 | - | 0.9412 |
+ | 6.0 | 33000 | 0.1393 | 0.2883 | - | 0.9436 |
+ | 6.1818 | 34000 | 0.1249 | 0.2822 | - | 0.9412 |
+ | 6.3636 | 35000 | 0.127 | 0.2739 | - | 0.9406 |
+ | 6.5455 | 36000 | 0.13 | 0.2797 | - | 0.9425 |
+ | 6.7273 | 37000 | 0.1304 | 0.2796 | - | 0.9410 |
+ | 6.9091 | 38000 | 0.1306 | 0.2750 | - | 0.9418 |
+ | 7.0909 | 39000 | 0.1236 | 0.2820 | - | 0.9421 |
+ | 7.2727 | 40000 | 0.1185 | 0.2898 | - | 0.9404 |
+ | 7.4545 | 41000 | 0.119 | 0.2863 | - | 0.9402 |
+ | 7.6364 | 42000 | 0.1206 | 0.2761 | - | 0.9397 |
+ | 7.8182 | 43000 | 0.1216 | 0.2702 | - | 0.9418 |
+ | 8.0 | 44000 | 0.1221 | 0.2771 | - | 0.9451 |
+ | 8.1818 | 45000 | 0.1083 | 0.2836 | - | 0.9368 |
+ | 8.3636 | 46000 | 0.1124 | 0.2889 | - | 0.9391 |
+ | 8.5455 | 47000 | 0.1106 | 0.2755 | - | 0.9422 |
+ | 8.7273 | 48000 | 0.1134 | 0.2853 | - | 0.9397 |
+ | 8.9091 | 49000 | 0.114 | 0.2845 | - | 0.9405 |
+ | 9.0909 | 50000 | 0.1095 | 0.2852 | - | 0.9421 |
+ | 9.2727 | 51000 | 0.1028 | 0.2880 | - | 0.9425 |
+ | 9.4545 | 52000 | 0.103 | 0.2796 | - | 0.9447 |
+ | 9.6364 | 53000 | 0.1048 | 0.2794 | - | 0.9468 |
+ | 9.8182 | 54000 | 0.105 | 0.2838 | - | 0.9447 |
+ | 10.0 | 55000 | 0.1086 | 0.2866 | - | 0.9423 |
+ | 10.1818 | 56000 | 0.0938 | 0.2809 | - | 0.9432 |
+ | 10.3636 | 57000 | 0.0969 | 0.3018 | - | 0.9442 |
+ | 10.5455 | 58000 | 0.0975 | 0.2823 | - | 0.9440 |
+ | 10.7273 | 59000 | 0.0971 | 0.2943 | - | 0.9445 |
+ | 10.9091 | 60000 | 0.1002 | 0.2915 | - | 0.9439 |
+ | 11.0909 | 61000 | 0.0939 | 0.2980 | - | 0.9451 |
+ | 11.2727 | 62000 | 0.0869 | 0.2932 | - | 0.9447 |
+ | 11.4545 | 63000 | 0.0888 | 0.2885 | - | 0.9433 |
+ | 11.6364 | 64000 | 0.0915 | 0.2844 | - | 0.9468 |
+ | 11.8182 | 65000 | 0.092 | 0.3085 | - | 0.9458 |
+ | 12.0 | 66000 | 0.0933 | 0.2833 | - | 0.9463 |
+ | 12.1818 | 67000 | 0.0796 | 0.3031 | - | 0.9446 |
+ | 12.3636 | 68000 | 0.082 | 0.2934 | - | 0.9428 |
+ | 12.5455 | 69000 | 0.0865 | 0.3030 | - | 0.9443 |
+ | 12.7273 | 70000 | 0.0858 | 0.3028 | - | 0.9452 |
+ | 12.9091 | 71000 | 0.0877 | 0.2930 | - | 0.9471 |
+ | 13.0909 | 72000 | 0.0791 | 0.3105 | - | 0.9388 |
+ | 13.2727 | 73000 | 0.0754 | 0.3189 | - | 0.9439 |
+ | 13.4545 | 74000 | 0.0777 | 0.2985 | - | 0.9434 |
+ | 13.6364 | 75000 | 0.0796 | 0.3170 | - | 0.9451 |
+ | 13.8182 | 76000 | 0.0799 | 0.2851 | - | 0.9458 |
+ | 14.0 | 77000 | 0.0816 | 0.3022 | - | 0.9461 |
+ | 14.1818 | 78000 | 0.0706 | 0.3169 | - | 0.9426 |
+ | 14.3636 | 79000 | 0.0727 | 0.3114 | - | 0.9402 |
+ | 14.5455 | 80000 | 0.073 | 0.3059 | - | 0.9439 |
+ | 14.7273 | 81000 | 0.0745 | 0.3108 | - | 0.9457 |
+ | 14.9091 | 82000 | 0.0741 | 0.3149 | - | 0.9442 |
+ | 15.0909 | 83000 | 0.0704 | 0.3213 | - | 0.9432 |
+ | 15.2727 | 84000 | 0.0649 | 0.3245 | - | 0.9449 |
+ | 15.4545 | 85000 | 0.0684 | 0.3180 | - | 0.9446 |
+ | 15.6364 | 86000 | 0.0694 | 0.3320 | - | 0.9425 |
+ | 15.8182 | 87000 | 0.0681 | 0.3138 | - | 0.9455 |
+ | 16.0 | 88000 | 0.0691 | 0.3158 | - | 0.9452 |
+ | 16.1818 | 89000 | 0.0608 | 0.3317 | - | 0.9449 |
+ | 16.3636 | 90000 | 0.0609 | 0.3253 | - | 0.9452 |
+ | 16.5455 | 91000 | 0.0621 | 0.3298 | - | 0.9447 |
+ | 16.7273 | 92000 | 0.0648 | 0.3246 | - | 0.9353 |
+ | 16.9091 | 93000 | 0.0657 | 0.3229 | - | 0.9411 |
+ | 17.0909 | 94000 | 0.0596 | 0.3327 | - | 0.9376 |
+ | 17.2727 | 95000 | 0.0579 | 0.3186 | - | 0.9420 |
+ | 17.4545 | 96000 | 0.0581 | 0.3272 | - | 0.9437 |
+ | 17.6364 | 97000 | 0.0592 | 0.3344 | - | 0.9450 |
+ | 17.8182 | 98000 | 0.06 | 0.3446 | - | 0.9444 |
+ | 18.0 | 99000 | 0.0598 | 0.3280 | - | 0.9414 |
+ | 18.1818 | 100000 | 0.0515 | 0.3577 | - | 0.9466 |
+ | 18.3636 | 101000 | 0.0539 | 0.3418 | - | 0.9413 |
+ | 18.5455 | 102000 | 0.0544 | 0.3365 | - | 0.9417 |
+ | 18.7273 | 103000 | 0.054 | 0.3294 | - | 0.9466 |
+ | 18.9091 | 104000 | 0.0568 | 0.3420 | - | 0.9461 |
+ | 19.0909 | 105000 | 0.0516 | 0.3650 | - | 0.9461 |
+ | 19.2727 | 106000 | 0.0482 | 0.3546 | - | 0.9397 |
+ | 19.4545 | 107000 | 0.0497 | 0.3338 | - | 0.9412 |
+ | 19.6364 | 108000 | 0.0495 | 0.3524 | - | 0.9441 |
+ | 19.8182 | 109000 | 0.051 | 0.3556 | - | 0.9449 |
+ | 20.0 | 110000 | 0.0512 | 0.3323 | - | 0.9439 |
+ | 20.1818 | 111000 | 0.0433 | 0.3572 | - | 0.9458 |
+ | 20.3636 | 112000 | 0.045 | 0.3678 | - | 0.9451 |
+ | 20.5455 | 113000 | 0.0466 | 0.3450 | - | 0.9429 |
+ | 20.7273 | 114000 | 0.047 | 0.3616 | - | 0.9411 |
+ | 20.9091 | 115000 | 0.048 | 0.3538 | - | 0.9461 |
+ | 21.0909 | 116000 | 0.044 | 0.3638 | - | 0.9418 |
+ | 21.2727 | 117000 | 0.0417 | 0.3767 | - | 0.9432 |
+ | 21.4545 | 118000 | 0.0428 | 0.3773 | - | 0.9445 |
+ | 21.6364 | 119000 | 0.0421 | 0.3613 | - | 0.9433 |
+ | 21.8182 | 120000 | 0.0442 | 0.3795 | - | 0.9459 |
+ | 22.0 | 121000 | 0.0453 | 0.3758 | - | 0.9406 |
+ | 22.1818 | 122000 | 0.0379 | 0.3819 | - | 0.9437 |
+ | 22.3636 | 123000 | 0.0397 | 0.3665 | - | 0.9444 |
+ | 22.5455 | 124000 | 0.039 | 0.3871 | - | 0.9431 |
+ | 22.7273 | 125000 | 0.0398 | 0.3752 | - | 0.9438 |
+ | 22.9091 | 126000 | 0.0408 | 0.3755 | - | 0.9441 |
+ | 23.0909 | 127000 | 0.0388 | 0.3698 | - | 0.9448 |
+ | 23.2727 | 128000 | 0.0348 | 0.3828 | - | 0.9425 |
+ | 23.4545 | 129000 | 0.0353 | 0.3814 | - | 0.9444 |
+ | 23.6364 | 130000 | 0.0375 | 0.3907 | - | 0.9427 |
+ | 23.8182 | 131000 | 0.0366 | 0.4085 | - | 0.9379 |
+ | 24.0 | 132000 | 0.0388 | 0.3734 | - | 0.9431 |
+ | 24.1818 | 133000 | 0.0321 | 0.4105 | - | 0.9409 |
+ | 24.3636 | 134000 | 0.0329 | 0.4038 | - | 0.9430 |
+ | 24.5455 | 135000 | 0.0335 | 0.4123 | - | 0.9452 |
+ | 24.7273 | 136000 | 0.0351 | 0.3945 | - | 0.9421 |
+ | 24.9091 | 137000 | 0.0347 | 0.3995 | - | 0.9412 |
+ | 25.0909 | 138000 | 0.0322 | 0.4154 | - | 0.9393 |
+ | 25.2727 | 139000 | 0.0312 | 0.3900 | - | 0.9447 |
+ | 25.4545 | 140000 | 0.0301 | 0.4083 | - | 0.9458 |
+ | 25.6364 | 141000 | 0.0318 | 0.4146 | - | 0.9448 |
+ | 25.8182 | 142000 | 0.0321 | 0.4198 | - | 0.9438 |
+ | 26.0 | 143000 | 0.032 | 0.4168 | - | 0.9438 |
+ | 26.1818 | 144000 | 0.0266 | 0.4293 | - | 0.9426 |
+ | 26.3636 | 145000 | 0.0277 | 0.4234 | - | 0.9427 |
+ | 26.5455 | 146000 | 0.0288 | 0.4309 | - | 0.9416 |
+ | 26.7273 | 147000 | 0.0292 | 0.4215 | - | 0.9413 |
+ | 26.9091 | 148000 | 0.0294 | 0.4020 | - | 0.9406 |
+ | 27.0909 | 149000 | 0.0272 | 0.4342 | - | 0.9387 |
+ | 27.2727 | 150000 | 0.025 | 0.4434 | - | 0.9366 |
+ | 27.4545 | 151000 | 0.027 | 0.4178 | - | 0.9386 |
+ | 27.6364 | 152000 | 0.0257 | 0.4396 | - | 0.9437 |
+ | 27.8182 | 153000 | 0.028 | 0.4099 | - | 0.9441 |
+ | 28.0 | 154000 | 0.0275 | 0.4185 | - | 0.9425 |
+ | 28.1818 | 155000 | 0.0236 | 0.4375 | - | 0.9411 |
+ | 28.3636 | 156000 | 0.0237 | 0.4232 | - | 0.9439 |
+ | 28.5455 | 157000 | 0.0237 | 0.4642 | - | 0.9408 |
+ | 28.7273 | 158000 | 0.0249 | 0.4374 | - | 0.9411 |
+ | 28.9091 | 159000 | 0.0258 | 0.4329 | - | 0.9410 |
+ | 29.0909 | 160000 | 0.0219 | 0.4867 | - | 0.9376 |
+ | 29.2727 | 161000 | 0.0216 | 0.4737 | - | 0.9436 |
+ | 29.4545 | 162000 | 0.0218 | 0.4577 | - | 0.9402 |
+ | 29.6364 | 163000 | 0.0223 | 0.4589 | - | 0.9418 |
+ | 29.8182 | 164000 | 0.0223 | 0.4410 | - | 0.9438 |
+ | 30.0 | 165000 | 0.0236 | 0.4477 | - | 0.9405 |
+ | 30.1818 | 166000 | 0.0203 | 0.4798 | - | 0.9414 |
+ | 30.3636 | 167000 | 0.02 | 0.4600 | - | 0.9456 |
+ | 30.5455 | 168000 | 0.0206 | 0.4492 | - | 0.9471 |
+ | 30.7273 | 169000 | 0.0203 | 0.4839 | - | 0.9426 |
+ | 30.9091 | 170000 | 0.0212 | 0.4731 | - | 0.9417 |
+ | 31.0909 | 171000 | 0.0196 | 0.4621 | - | 0.9412 |
+ | 31.2727 | 172000 | 0.0178 | 0.4986 | - | 0.9447 |
+ | 31.4545 | 173000 | 0.0177 | 0.4871 | - | 0.9463 |
+ | 31.6364 | 174000 | 0.0201 | 0.4520 | - | 0.9447 |
+ | 31.8182 | 175000 | 0.0191 | 0.4571 | - | 0.9458 |
+ | 32.0 | 176000 | 0.0201 | 0.4871 | - | 0.9472 |
+ | 32.1818 | 177000 | 0.0156 | 0.5061 | - | 0.9433 |
+ | 32.3636 | 178000 | 0.0174 | 0.4704 | - | 0.9433 |
+ | 32.5455 | 179000 | 0.0175 | 0.4900 | - | 0.9398 |
+ | 32.7273 | 180000 | 0.0175 | 0.4861 | - | 0.9456 |
+ | 32.9091 | 181000 | 0.0178 | 0.5005 | - | 0.9424 |
+ | 33.0909 | 182000 | 0.0163 | 0.4934 | - | 0.9418 |
+ | 33.2727 | 183000 | 0.0149 | 0.5065 | - | 0.9434 |
+ | 33.4545 | 184000 | 0.0157 | 0.4973 | - | 0.9430 |
+ | 33.6364 | 185000 | 0.0164 | 0.4993 | - | 0.9424 |
+ | 33.8182 | 186000 | 0.0164 | 0.4904 | - | 0.9442 |
+ | 34.0 | 187000 | 0.0167 | 0.5096 | - | 0.9397 |
+ | 34.1818 | 188000 | 0.0136 | 0.4960 | - | 0.9452 |
+ | 34.3636 | 189000 | 0.0142 | 0.5188 | - | 0.9427 |
+ | 34.5455 | 190000 | 0.0144 | 0.5139 | - | 0.9415 |
+ | 34.7273 | 191000 | 0.0149 | 0.4919 | - | 0.9443 |
+ | 34.9091 | 192000 | 0.0149 | 0.4775 | - | 0.9438 |
+ | 35.0909 | 193000 | 0.0127 | 0.5546 | - | 0.9418 |
+ | 35.2727 | 194000 | 0.0124 | 0.5440 | - | 0.9429 |
+ | 35.4545 | 195000 | 0.0126 | 0.5571 | - | 0.9412 |
+ | 35.6364 | 196000 | 0.0131 | 0.5127 | - | 0.9433 |
+ | 35.8182 | 197000 | 0.0134 | 0.5167 | - | 0.9434 |
+ | 36.0 | 198000 | 0.0134 | 0.4939 | - | 0.9452 |
+ | 36.1818 | 199000 | 0.011 | 0.5279 | - | 0.9390 |
+ | 36.3636 | 200000 | 0.0115 | 0.5336 | - | 0.9429 |
+ | 36.5455 | 201000 | 0.0113 | 0.5626 | - | 0.9399 |
+ | 36.7273 | 202000 | 0.012 | 0.5316 | - | 0.9446 |
+ | 36.9091 | 203000 | 0.0121 | 0.5222 | - | 0.9447 |
+ | 37.0909 | 204000 | 0.0107 | 0.5618 | - | 0.9426 |
+ | 37.2727 | 205000 | 0.0107 | 0.5508 | - | 0.9409 |
+ | 37.4545 | 206000 | 0.0106 | 0.5414 | - | 0.9433 |
+ | 37.6364 | 207000 | 0.0106 | 0.5522 | - | 0.9421 |
+ | 37.8182 | 208000 | 0.0111 | 0.5524 | - | 0.9428 |
+ | 38.0 | 209000 | 0.012 | 0.5176 | - | 0.9455 |
+ | 38.1818 | 210000 | 0.0087 | 0.5742 | - | 0.9444 |
+ | **38.3636** | **211000** | **0.0092** | **0.5686** | **-** | **0.9437** |
+ | 38.5455 | 212000 | 0.0099 | 0.5699 | - | 0.9436 |
+ | 38.7273 | 213000 | 0.0094 | 0.5733 | - | 0.9417 |
+ | 38.9091 | 214000 | 0.0097 | 0.5516 | - | 0.9450 |
+ | 39.0909 | 215000 | 0.009 | 0.5923 | - | 0.9444 |
+ | 39.2727 | 216000 | 0.0078 | 0.5925 | - | 0.9425 |
+ | 39.4545 | 217000 | 0.0086 | 0.5703 | - | 0.9410 |
+ | 39.6364 | 218000 | 0.0087 | 0.5921 | - | 0.9079 |
+ | 39.8182 | 219000 | 0.0085 | 0.5859 | - | 0.9237 |
+ | 40.0 | 220000 | 0.0091 | 0.5577 | - | 0.9359 |
+ | 40.1818 | 221000 | 0.0077 | 0.5844 | - | 0.9409 |
+ | 40.3636 | 222000 | 0.0077 | 0.5691 | - | 0.9432 |
+ | 40.5455 | 223000 | 0.008 | 0.5794 | - | 0.9445 |
+ | 40.7273 | 224000 | 0.0073 | 0.6036 | - | 0.9433 |
+ | 40.9091 | 225000 | 0.0083 | 0.5754 | - | 0.9428 |
+ | 41.0909 | 226000 | 0.0078 | 0.6141 | - | 0.9377 |
+ | 41.2727 | 227000 | 0.0069 | 0.6332 | - | 0.9416 |
+ | 41.4545 | 228000 | 0.007 | 0.6220 | - | 0.9340 |
+ | 41.6364 | 229000 | 0.0075 | 0.6110 | - | 0.9372 |
+ | 41.8182 | 230000 | 0.007 | 0.6248 | - | 0.9394 |
+ | 42.0 | 231000 | 0.0072 | 0.5950 | - | 0.9394 |
+ | 42.1818 | 232000 | 0.0059 | 0.6428 | - | 0.9415 |
+ | 42.3636 | 233000 | 0.0065 | 0.6298 | - | 0.9379 |
+ | 42.5455 | 234000 | 0.0065 | 0.6166 | - | 0.9392 |
+ | 42.7273 | 235000 | 0.0066 | 0.5990 | - | 0.9394 |
+ | 42.9091 | 236000 | 0.0065 | 0.6297 | - | 0.9348 |
+ | 43.0909 | 237000 | 0.0057 | 0.6483 | - | 0.9367 |
+ | 43.2727 | 238000 | 0.0058 | 0.6077 | - | 0.9376 |
+ | 43.4545 | 239000 | 0.0056 | 0.6420 | - | 0.9386 |
+ | 43.6364 | 240000 | 0.0059 | 0.6574 | - | 0.9333 |
+ | 43.8182 | 241000 | 0.0051 | 0.6819 | - | 0.9383 |
+ | 44.0 | 242000 | 0.0055 | 0.6567 | - | 0.9390 |
+ | 44.1818 | 243000 | 0.0051 | 0.6697 | - | 0.9398 |
+ | 44.3636 | 244000 | 0.005 | 0.6459 | - | 0.9395 |
+ | 44.5455 | 245000 | 0.0047 | 0.6693 | - | 0.9426 |
+ | 44.7273 | 246000 | 0.0054 | 0.6589 | - | 0.9397 |
+ | 44.9091 | 247000 | 0.0051 | 0.6886 | - | 0.9410 |
+ | 45.0909 | 248000 | 0.0047 | 0.6886 | - | 0.9370 |
+ | 45.2727 | 249000 | 0.0045 | 0.6959 | - | 0.9358 |
+ | 45.4545 | 250000 | 0.0045 | 0.6827 | - | 0.9381 |
+ | 45.6364 | 251000 | 0.0042 | 0.6706 | - | 0.9320 |
+ | 45.8182 | 252000 | 0.0043 | 0.6858 | - | 0.9310 |
+ | 46.0 | 253000 | 0.0045 | 0.6926 | - | 0.9361 |
+ | 46.1818 | 254000 | 0.0041 | 0.7091 | - | 0.9323 |
+ | 46.3636 | 255000 | 0.0038 | 0.7110 | - | 0.9287 |
+ | 46.5455 | 256000 | 0.0039 | 0.6991 | - | 0.9307 |
+ | 46.7273 | 257000 | 0.0038 | 0.7068 | - | 0.9326 |
+ | 46.9091 | 258000 | 0.0043 | 0.7026 | - | 0.9361 |
+ | 47.0909 | 259000 | 0.0041 | 0.7031 | - | 0.9287 |
+ | 47.2727 | 260000 | 0.0037 | 0.6905 | - | 0.9343 |
+ | 47.4545 | 261000 | 0.0035 | 0.7044 | - | 0.9287 |
+ | 47.6364 | 262000 | 0.0038 | 0.7005 | - | 0.9310 |
+ | 47.8182 | 263000 | 0.0039 | 0.7064 | - | 0.9302 |
+ | 48.0 | 264000 | 0.0034 | 0.7112 | - | 0.9273 |
+ | 48.1818 | 265000 | 0.0034 | 0.7092 | - | 0.9326 |
+ | 48.3636 | 266000 | 0.0029 | 0.7286 | - | 0.9303 |
+ | 48.5455 | 267000 | 0.0034 | 0.7270 | - | 0.9276 |
+ | 48.7273 | 268000 | 0.0035 | 0.7101 | - | 0.9320 |
+ | 48.9091 | 269000 | 0.0035 | 0.7127 | - | 0.9333 |
+ | 49.0909 | 270000 | 0.0031 | 0.7228 | - | 0.9310 |
+ | 49.2727 | 271000 | 0.0029 | 0.7289 | - | 0.9295 |
+ | 49.4545 | 272000 | 0.003 | 0.7309 | - | 0.9303 |
+ | 49.6364 | 273000 | 0.003 | 0.7308 | - | 0.9308 |
+ | 49.8182 | 274000 | 0.0033 | 0.7293 | - | 0.9303 |
+ | 50.0 | 275000 | 0.003 | 0.7302 | - | 0.9303 |
+
+ * The bold row denotes the saved checkpoint.
+ </details>
686
 
687
  ### Framework Versions
688
  - Python: 3.12.3
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:881608d71230209c9e322549e369bb956aef19264321808c9d67962082f5249b
+oid sha256:65db824fbfcaba61a653c59822fb36f3282478c99094e445c3e1347fd3b4787f
 size 598436708