---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: pretrain
  results: []
---

# pretrain

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5260

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1024
- eval_batch_size: 1024
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.95) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 156250
- num_epochs: 25
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch   | Step  | Validation Loss |
|:-------------:|:-------:|:-----:|:---------------:|
| 0.4587        | 0.3774  | 500   | 1.7455          |
| 0.3676        | 0.7547  | 1000  | 1.3984          |
| 0.3343        | 1.1321  | 1500  | 1.2729          |
| 0.3118        | 1.5094  | 2000  | 1.1772          |
| 0.2953        | 1.8868  | 2500  | 1.0904          |
| 0.2771        | 2.2642  | 3000  | 1.0169          |
| 0.2605        | 2.6415  | 3500  | 0.9581          |
| 0.2501        | 3.0189  | 4000  | 0.8991          |
| 0.2351        | 3.3962  | 4500  | 0.8535          |
| 0.2245        | 3.7736  | 5000  | 0.8164          |
| 0.2168        | 4.1509  | 5500  | 0.7843          |
| 0.2121        | 4.5283  | 6000  | 0.7684          |
| 0.205         | 4.9057  | 6500  | 0.7447          |
| 0.1999        | 5.2830  | 7000  | 0.7284          |
| 0.196         | 5.6604  | 7500  | 0.7089          |
| 0.1894        | 6.0377  | 8000  | 0.7045          |
| 0.188         | 6.4151  | 8500  | 0.6867          |
| 0.1826        | 6.7925  | 9000  | 0.6750          |
| 0.1821        | 7.1698  | 9500  | 0.6672          |
| 0.1753        | 7.5472  | 10000 | 0.6650          |
| 0.1746        | 7.9245  | 10500 | 0.6485          |
| 0.1714        | 8.3019  | 11000 | 0.6420          |
| 0.1726        | 8.6792  | 11500 | 0.6365          |
| 0.169         | 9.0566  | 12000 | 0.6300          |
| 0.1659        | 9.4340  | 12500 | 0.6244          |
| 0.1653        | 9.8113  | 13000 | 0.6164          |
| 0.1646        | 10.1887 | 13500 | 0.6122          |
| 0.1623        | 10.5660 | 14000 | 0.6070          |
| 0.1629        | 10.9434 | 14500 | 0.6045          |
| 0.1603        | 11.3208 | 15000 | 0.5999          |
| 0.16          | 11.6981 | 15500 | 0.5948          |
| 0.1582        | 12.0755 | 16000 | 0.5898          |
| 0.1565        | 12.4528 | 16500 | 0.5868          |
| 0.1541        | 12.8302 | 17000 | 0.5844          |
| 0.1553        | 13.2075 | 17500 | 0.5798          |
| 0.152         | 13.5849 | 18000 | 0.5791          |
| 0.1536        | 13.9623 | 18500 | 0.5745          |
| 0.1525        | 14.3396 | 19000 | 0.5722          |
| 0.1516        | 14.7170 | 19500 | 0.5718          |
| 0.151         | 15.0943 | 20000 | 0.5675          |
| 0.1502        | 15.4717 | 20500 | 0.5672          |
| 0.1505        | 15.8491 | 21000 | 0.5639          |
| 0.1497        | 16.2264 | 21500 | 0.5607          |
| 0.1495        | 16.6038 | 22000 | 0.5583          |
| 0.1463        | 16.9811 | 22500 | 0.5547          |
| 0.1478        | 17.3585 | 23000 | 0.5556          |
| 0.1468        | 17.7358 | 23500 | 0.5534          |
| 0.1468        | 18.1132 | 24000 | 0.5509          |
| 0.1447        | 18.4906 | 24500 | 0.5480          |
| 0.1451        | 18.8679 | 25000 | 0.5479          |
| 0.1449        | 19.2453 | 25500 | 0.5453          |
| 0.1433        | 19.6226 | 26000 | 0.5449          |
| 0.1434        | 20.0    | 26500 | 0.5423          |
| 0.1434        | 20.3774 | 27000 | 0.5404          |
| 0.1428        | 20.7547 | 27500 | 0.5393          |
| 0.1435        | 21.1321 | 28000 | 0.5391          |
| 0.142         | 21.5094 | 28500 | 0.5371          |
| 0.142         | 21.8868 | 29000 | 0.5342          |
| 0.1418        | 22.2642 | 29500 | 0.5340          |
| 0.1417        | 22.6415 | 30000 | 0.5322          |
| 0.1405        | 23.0189 | 30500 | 0.5309          |
| 0.1412        | 23.3962 | 31000 | 0.5300          |
| 0.1395        | 23.7736 | 31500 | 0.5295          |
| 0.1383        | 24.1509 | 32000 | 0.5289          |
| 0.1373        | 24.5283 | 32500 | 0.5272          |
| 0.139         | 24.9057 | 33000 | 0.5260          |

### Framework versions

- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
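
### Example training configuration (illustrative)

The hyperparameters above map roughly onto the following `transformers.TrainingArguments` setup. This is an illustrative sketch only: the base model, tokenizer, dataset, and output directory are not documented in this card, the batch sizes are assumed to be per-device values, and "Native AMP" is assumed to mean fp16 rather than bf16.

```python
from transformers import TrainingArguments

# Illustrative sketch of the configuration listed under "Training hyperparameters".
# The model, tokenizer, and dataset behind this card are not documented, so only
# the training arguments themselves are reconstructed here.
training_args = TrainingArguments(
    output_dir="pretrain",             # assumed; matches the model name
    learning_rate=5e-05,
    per_device_train_batch_size=1024,  # assumed per-device; card lists train_batch_size: 1024
    per_device_eval_batch_size=1024,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    warmup_steps=156250,
    num_train_epochs=25,
    fp16=True,                         # "Native AMP" mixed precision; bf16 also possible
    eval_strategy="steps",
    eval_steps=500,                    # evaluation logged every 500 steps in the table above
    logging_steps=500,
)
```

Passing these arguments to a `Trainer` along with the (undocumented) model and datasets would reproduce the schedule described above.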