exceptions_exp2_swap_0.3_cost_to_push_2128

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.5810
  • Accuracy: 0.3661
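
If the reported loss is a token-level cross-entropy (as is typical for causal language modeling, which this card does not confirm), it corresponds to a perplexity of exp(3.5810) ≈ 35.9.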

Model description

More information needed

Intended uses & limitations

More information needed
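
Pending more details, a minimal loading sketch follows. It assumes the checkpoint is a causal language model in the standard Transformers layout; the repo id and the AutoModelForCausalLM class are assumptions, not confirmed by this card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id and model class; adjust both to the actual checkpoint.
repo_id = "exceptions_exp2_swap_0.3_cost_to_push_2128"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Generate a short continuation to sanity-check the checkpoint.
inputs = tokenizer("Example prompt", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```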

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 0.0006
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 2128
  • gradient_accumulation_steps: 5
  • total_train_batch_size: 80
  • optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.98), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 50.0
  • mixed_precision_training: Native AMP
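
A minimal sketch of a Hugging Face TrainingArguments object mirroring the hyperparameters above, assuming the standard Trainer API was used; the output_dir value and the fp16 flag (inferred from "Native AMP") are assumptions:

```python
from transformers import TrainingArguments

# Sketch only: reproduces the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="exceptions_exp2_swap_0.3_cost_to_push_2128",  # assumed output path
    learning_rate=6e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=2128,
    gradient_accumulation_steps=5,  # 16 per device x 5 steps = 80 effective
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=50.0,
    fp16=True,  # assumed from "Native AMP" mixed precision
)
```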

Training results

| Training Loss | Epoch   | Step   | Validation Loss | Accuracy |
|:-------------:|:-------:|:------:|:---------------:|:--------:|
| 4.8475 | 0.2915 | 1000 | 4.7750 | 0.2502 |
| 4.3456 | 0.5831 | 2000 | 4.2943 | 0.2978 |
| 4.1456 | 0.8746 | 3000 | 4.1059 | 0.3138 |
| 3.9944 | 1.1662 | 4000 | 3.9968 | 0.3234 |
| 3.9491 | 1.4577 | 5000 | 3.9227 | 0.3306 |
| 3.8924 | 1.7493 | 6000 | 3.8664 | 0.3360 |
| 3.7617 | 2.0408 | 7000 | 3.8208 | 0.3401 |
| 3.7602 | 2.3324 | 8000 | 3.7899 | 0.3432 |
| 3.7334 | 2.6239 | 9000 | 3.7620 | 0.3459 |
| 3.7346 | 2.9155 | 10000 | 3.7338 | 0.3483 |
| 3.6444 | 3.2070 | 11000 | 3.7238 | 0.3499 |
| 3.659 | 3.4985 | 12000 | 3.7059 | 0.3517 |
| 3.6575 | 3.7901 | 13000 | 3.6863 | 0.3535 |
| 3.5507 | 4.0816 | 14000 | 3.6783 | 0.3551 |
| 3.5711 | 4.3732 | 15000 | 3.6675 | 0.3562 |
| 3.5827 | 4.6647 | 16000 | 3.6550 | 0.3573 |
| 3.5824 | 4.9563 | 17000 | 3.6408 | 0.3583 |
| 3.5035 | 5.2478 | 18000 | 3.6454 | 0.3591 |
| 3.5303 | 5.5394 | 19000 | 3.6323 | 0.3601 |
| 3.5382 | 5.8309 | 20000 | 3.6206 | 0.3610 |
| 3.4441 | 6.1224 | 21000 | 3.6226 | 0.3609 |
| 3.4783 | 6.4140 | 22000 | 3.6176 | 0.3619 |
| 3.5042 | 6.7055 | 23000 | 3.6065 | 0.3628 |
| 3.5019 | 6.9971 | 24000 | 3.5965 | 0.3634 |
| 3.4391 | 7.2886 | 25000 | 3.6066 | 0.3635 |
| 3.4671 | 7.5802 | 26000 | 3.5964 | 0.3643 |
| 3.4646 | 7.8717 | 27000 | 3.5879 | 0.3648 |
| 3.3788 | 8.1633 | 28000 | 3.5949 | 0.3645 |
| 3.42 | 8.4548 | 29000 | 3.5912 | 0.3657 |
| 3.4206 | 8.7464 | 30000 | 3.5810 | 0.3661 |
| 3.3393 | 9.0379 | 31000 | 3.5860 | 0.3663 |
| 3.3852 | 9.3294 | 32000 | 3.5854 | 0.3662 |
| 3.4032 | 9.6210 | 33000 | 3.5760 | 0.3670 |
| 3.4131 | 9.9125 | 34000 | 3.5672 | 0.3674 |
| 3.3514 | 10.2041 | 35000 | 3.5798 | 0.3672 |
| 3.3637 | 10.4956 | 36000 | 3.5698 | 0.3677 |
| 3.3826 | 10.7872 | 37000 | 3.5626 | 0.3685 |
| 3.3083 | 11.0787 | 38000 | 3.5737 | 0.3679 |
| 3.3538 | 11.3703 | 39000 | 3.5722 | 0.3681 |
| 3.363 | 11.6618 | 40000 | 3.5646 | 0.3687 |
| 3.3675 | 11.9534 | 41000 | 3.5520 | 0.3693 |
| 3.3242 | 12.2449 | 42000 | 3.5694 | 0.3687 |
| 3.3425 | 12.5364 | 43000 | 3.5615 | 0.3693 |
| 3.3599 | 12.8280 | 44000 | 3.5545 | 0.3693 |
| 3.2748 | 13.1195 | 45000 | 3.5677 | 0.3692 |
| 3.308 | 13.4111 | 46000 | 3.5632 | 0.3696 |
| 3.3306 | 13.7026 | 47000 | 3.5567 | 0.3703 |
| 3.3501 | 13.9942 | 48000 | 3.5477 | 0.3704 |
| 3.303 | 14.2857 | 49000 | 3.5616 | 0.3700 |
| 3.3112 | 14.5773 | 50000 | 3.5578 | 0.3704 |
| 3.3277 | 14.8688 | 51000 | 3.5486 | 0.3709 |
| 3.2649 | 15.1603 | 52000 | 3.5625 | 0.3701 |
| 3.2859 | 15.4519 | 53000 | 3.5579 | 0.3706 |
| 3.3246 | 15.7434 | 54000 | 3.5483 | 0.3712 |
| 3.2136 | 16.0350 | 55000 | 3.5616 | 0.3705 |
| 3.2559 | 16.3265 | 56000 | 3.5585 | 0.3707 |
| 3.2834 | 16.6181 | 57000 | 3.5516 | 0.3713 |
| 3.2976 | 16.9096 | 58000 | 3.5446 | 0.3715 |
| 3.2242 | 17.2012 | 59000 | 3.5605 | 0.3707 |
| 3.2621 | 17.4927 | 60000 | 3.5490 | 0.3716 |
| 3.2843 | 17.7843 | 61000 | 3.5449 | 0.3718 |
| 3.1983 | 18.0758 | 62000 | 3.5611 | 0.3714 |
| 3.2407 | 18.3673 | 63000 | 3.5544 | 0.3718 |
| 3.2768 | 18.6589 | 64000 | 3.5447 | 0.3720 |
| 3.2854 | 18.9504 | 65000 | 3.5428 | 0.3721 |
| 3.2234 | 19.2420 | 66000 | 3.5554 | 0.3717 |
| 3.2568 | 19.5335 | 67000 | 3.5503 | 0.3724 |
| 3.2694 | 19.8251 | 68000 | 3.5420 | 0.3728 |
| 3.1784 | 20.1166 | 69000 | 3.5584 | 0.3718 |
| 3.2217 | 20.4082 | 70000 | 3.5522 | 0.3721 |
| 3.2423 | 20.6997 | 71000 | 3.5455 | 0.3725 |
| 3.2513 | 20.9913 | 72000 | 3.5366 | 0.3730 |
| 3.2074 | 21.2828 | 73000 | 3.5550 | 0.3723 |
| 3.2366 | 21.5743 | 74000 | 3.5475 | 0.3726 |
| 3.2257 | 21.8659 | 75000 | 3.5367 | 0.3732 |
| 3.1729 | 22.1574 | 76000 | 3.5582 | 0.3723 |
| 3.2132 | 22.4490 | 77000 | 3.5498 | 0.3726 |
| 3.2481 | 22.7405 | 78000 | 3.5411 | 0.3731 |
| 3.1369 | 23.0321 | 79000 | 3.5574 | 0.3725 |
| 3.1759 | 23.3236 | 80000 | 3.5522 | 0.3729 |
| 3.2125 | 23.6152 | 81000 | 3.5454 | 0.3733 |
| 3.2274 | 23.9067 | 82000 | 3.5361 | 0.3737 |
| 3.1613 | 24.1983 | 83000 | 3.5553 | 0.3729 |
| 3.1876 | 24.4898 | 84000 | 3.5515 | 0.3730 |
| 3.2129 | 24.7813 | 85000 | 3.5437 | 0.3735 |
| 3.1389 | 25.0729 | 86000 | 3.5526 | 0.3732 |
| 3.1698 | 25.3644 | 87000 | 3.5525 | 0.3731 |
| 3.1955 | 25.6560 | 88000 | 3.5473 | 0.3735 |
| 3.1939 | 25.9475 | 89000 | 3.5392 | 0.3737 |
| 3.1502 | 26.2391 | 90000 | 3.5545 | 0.3731 |
| 3.1742 | 26.5306 | 91000 | 3.5458 | 0.3737 |
| 3.1846 | 26.8222 | 92000 | 3.5407 | 0.3741 |
| 3.1308 | 27.1137 | 93000 | 3.5584 | 0.3731 |
| 3.1635 | 27.4052 | 94000 | 3.5564 | 0.3734 |
| 3.192 | 27.6968 | 95000 | 3.5428 | 0.3742 |
| 3.186 | 27.9883 | 96000 | 3.5344 | 0.3744 |
| 3.1337 | 28.2799 | 97000 | 3.5569 | 0.3733 |
| 3.1666 | 28.5714 | 98000 | 3.5477 | 0.3739 |
| 3.1893 | 28.8630 | 99000 | 3.5428 | 0.3742 |
| 3.115 | 29.1545 | 100000 | 3.5589 | 0.3734 |
| 3.1486 | 29.4461 | 101000 | 3.5559 | 0.3735 |
| 3.1697 | 29.7376 | 102000 | 3.5468 | 0.3742 |
| 3.0804 | 30.0292 | 103000 | 3.5555 | 0.3737 |
| 3.12 | 30.3207 | 104000 | 3.5551 | 0.3737 |
| 3.1463 | 30.6122 | 105000 | 3.5494 | 0.3742 |
| 3.1608 | 30.9038 | 106000 | 3.5439 | 0.3744 |
| 3.1076 | 31.1953 | 107000 | 3.5588 | 0.3735 |
| 3.1425 | 31.4869 | 108000 | 3.5515 | 0.3741 |
| 3.1472 | 31.7784 | 109000 | 3.5468 | 0.3745 |
| 3.0671 | 32.0700 | 110000 | 3.5595 | 0.3737 |
| 3.1161 | 32.3615 | 111000 | 3.5552 | 0.3741 |
| 3.1468 | 32.6531 | 112000 | 3.5482 | 0.3746 |
| 3.145 | 32.9446 | 113000 | 3.5437 | 0.3746 |
| 3.0995 | 33.2362 | 114000 | 3.5615 | 0.3737 |
| 3.1161 | 33.5277 | 115000 | 3.5526 | 0.3743 |
| 3.1275 | 33.8192 | 116000 | 3.5474 | 0.3747 |
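
Validation loss reaches its logged minimum (3.5344) at step 96000, around epoch 28, while accuracy plateaus near 0.374 over the final epochs. The log ends at epoch ~33.8 of the configured 50 epochs, which may indicate training was stopped early.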

Framework versions

  • Transformers 4.55.2
  • Pytorch 2.8.0+cu128
  • Datasets 4.0.0
  • Tokenizers 0.21.4
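
A small sketch for checking that a local environment matches the versions above; the +cu128 suffix on the PyTorch build is covered by the prefix match:

```python
import datasets
import tokenizers
import torch
import transformers

# Compare installed versions against those listed in this card.
expected = {
    transformers: "4.55.2",
    torch: "2.8.0",  # card lists 2.8.0+cu128; prefix match covers the CUDA tag
    datasets: "4.0.0",
    tokenizers: "0.21.4",
}
for module, version in expected.items():
    assert module.__version__.startswith(version), (
        f"{module.__name__}: found {module.__version__}, expected {version}"
    )
```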
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32