---
library_name: transformers
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: 100M_low_2000_495
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# 100M_low_2000_495

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.2975
- Accuracy: 0.3950
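The loss above is the mean token-level cross-entropy (in nats) as reported by the Trainer; assuming a standard language-modeling objective, the corresponding evaluation perplexity can be computed directly:

```python
import math

# Eval loss from this card; assumed to be mean cross-entropy in nats,
# as the Hugging Face Trainer reports for language modeling.
eval_loss = 3.2975

# Perplexity is the exponential of the cross-entropy loss.
perplexity = math.exp(eval_loss)
print(f"perplexity = {perplexity:.1f}")
```

Under that assumption, the evaluation perplexity works out to roughly 27.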
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0006
- train_batch_size: 32
- eval_batch_size: 16
- seed: 495
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:--------:|
| 5.0913 | 0.1078 | 1000 | 5.0183 | 0.2279 |
| 4.5778 | 0.2156 | 2000 | 4.5154 | 0.2701 |
| 4.3308 | 0.3235 | 3000 | 4.2365 | 0.2983 |
| 4.1661 | 0.4313 | 4000 | 4.0956 | 0.3125 |
| 4.0541 | 0.5391 | 5000 | 3.9986 | 0.3213 |
| 3.9993 | 0.6469 | 6000 | 3.9231 | 0.3275 |
| 3.9313 | 0.7547 | 7000 | 3.8664 | 0.3328 |
| 3.8759 | 0.8625 | 8000 | 3.8210 | 0.3373 |
| 3.8469 | 0.9704 | 9000 | 3.7821 | 0.3409 |
| 3.7655 | 1.0782 | 10000 | 3.7513 | 0.3442 |
| 3.7771 | 1.1860 | 11000 | 3.7270 | 0.3465 |
| 3.74 | 1.2938 | 12000 | 3.7007 | 0.3490 |
| 3.7117 | 1.4016 | 13000 | 3.6756 | 0.3517 |
| 3.6977 | 1.5094 | 14000 | 3.6564 | 0.3538 |
| 3.6874 | 1.6173 | 15000 | 3.6377 | 0.3555 |
| 3.6624 | 1.7251 | 16000 | 3.6195 | 0.3575 |
| 3.663 | 1.8329 | 17000 | 3.6076 | 0.3586 |
| 3.6385 | 1.9407 | 18000 | 3.5908 | 0.3603 |
| 3.5829 | 2.0485 | 19000 | 3.5801 | 0.3616 |
| 3.5598 | 2.1563 | 20000 | 3.5735 | 0.3626 |
| 3.5482 | 2.2642 | 21000 | 3.5643 | 0.3635 |
| 3.5485 | 2.3720 | 22000 | 3.5526 | 0.3651 |
| 3.5444 | 2.4798 | 23000 | 3.5410 | 0.3657 |
| 3.5361 | 2.5876 | 24000 | 3.5321 | 0.3668 |
| 3.5474 | 2.6954 | 25000 | 3.5204 | 0.3677 |
| 3.5513 | 2.8032 | 26000 | 3.5141 | 0.3685 |
| 3.5354 | 2.9111 | 27000 | 3.5045 | 0.3696 |
| 3.4341 | 3.0189 | 28000 | 3.4976 | 0.3703 |
| 3.4518 | 3.1267 | 29000 | 3.4946 | 0.3710 |
| 3.468 | 3.2345 | 30000 | 3.4878 | 0.3720 |
| 3.4562 | 3.3423 | 31000 | 3.4839 | 0.3727 |
| 3.4647 | 3.4501 | 32000 | 3.4753 | 0.3730 |
| 3.4527 | 3.5580 | 33000 | 3.4714 | 0.3736 |
| 3.4759 | 3.6658 | 34000 | 3.4632 | 0.3746 |
| 3.4521 | 3.7736 | 35000 | 3.4576 | 0.3750 |
| 3.4381 | 3.8814 | 36000 | 3.4504 | 0.3758 |
| 3.4585 | 3.9892 | 37000 | 3.4456 | 0.3763 |
| 3.3733 | 4.0970 | 38000 | 3.4492 | 0.3770 |
| 3.3841 | 4.2049 | 39000 | 3.4424 | 0.3771 |
| 3.3871 | 4.3127 | 40000 | 3.4392 | 0.3777 |
| 3.4032 | 4.4205 | 41000 | 3.4325 | 0.3782 |
| 3.4043 | 4.5283 | 42000 | 3.4281 | 0.3787 |
| 3.3929 | 4.6361 | 43000 | 3.4217 | 0.3791 |
| 3.3988 | 4.7439 | 44000 | 3.4197 | 0.3797 |
| 3.4005 | 4.8518 | 45000 | 3.4115 | 0.3801 |
| 3.383 | 4.9596 | 46000 | 3.4090 | 0.3807 |
| 3.309 | 5.0674 | 47000 | 3.4149 | 0.3808 |
| 3.3143 | 5.1752 | 48000 | 3.4082 | 0.3815 |
| 3.3358 | 5.2830 | 49000 | 3.4052 | 0.3817 |
| 3.3396 | 5.3908 | 50000 | 3.4026 | 0.3821 |
| 3.3419 | 5.4987 | 51000 | 3.3963 | 0.3826 |
| 3.3504 | 5.6065 | 52000 | 3.3913 | 0.3832 |
| 3.3407 | 5.7143 | 53000 | 3.3872 | 0.3834 |
| 3.329 | 5.8221 | 54000 | 3.3830 | 0.3839 |
| 3.3504 | 5.9299 | 55000 | 3.3787 | 0.3845 |
| 3.2491 | 6.0377 | 56000 | 3.3818 | 0.3846 |
| 3.2676 | 6.1456 | 57000 | 3.3806 | 0.3844 |
| 3.2833 | 6.2534 | 58000 | 3.3783 | 0.3852 |
| 3.2682 | 6.3612 | 59000 | 3.3740 | 0.3855 |
| 3.2848 | 6.4690 | 60000 | 3.3714 | 0.3854 |
| 3.2766 | 6.5768 | 61000 | 3.3665 | 0.3862 |
| 3.2975 | 6.6846 | 62000 | 3.3614 | 0.3866 |
| 3.273 | 6.7925 | 63000 | 3.3590 | 0.3868 |
| 3.277 | 6.9003 | 64000 | 3.3543 | 0.3875 |
| 3.2059 | 7.0081 | 65000 | 3.3568 | 0.3877 |
| 3.2005 | 7.1159 | 66000 | 3.3586 | 0.3875 |
| 3.2256 | 7.2237 | 67000 | 3.3558 | 0.3881 |
| 3.225 | 7.3315 | 68000 | 3.3514 | 0.3883 |
| 3.2217 | 7.4394 | 69000 | 3.3491 | 0.3883 |
| 3.2322 | 7.5472 | 70000 | 3.3449 | 0.3889 |
| 3.2266 | 7.6550 | 71000 | 3.3397 | 0.3893 |
| 3.231 | 7.7628 | 72000 | 3.3377 | 0.3899 |
| 3.2406 | 7.8706 | 73000 | 3.3335 | 0.3901 |
| 3.2518 | 7.9784 | 74000 | 3.3297 | 0.3906 |
| 3.1634 | 8.0863 | 75000 | 3.3361 | 0.3905 |
| 3.1847 | 8.1941 | 76000 | 3.3341 | 0.3906 |
| 3.1844 | 8.3019 | 77000 | 3.3304 | 0.3910 |
| 3.1753 | 8.4097 | 78000 | 3.3276 | 0.3912 |
| 3.1751 | 8.5175 | 79000 | 3.3245 | 0.3918 |
| 3.2094 | 8.6253 | 80000 | 3.3210 | 0.3920 |
| 3.1635 | 8.7332 | 81000 | 3.3182 | 0.3923 |
| 3.1793 | 8.8410 | 82000 | 3.3148 | 0.3926 |
| 3.1924 | 8.9488 | 83000 | 3.3111 | 0.3931 |
| 3.1295 | 9.0566 | 84000 | 3.3149 | 0.3930 |
| 3.1237 | 9.1644 | 85000 | 3.3130 | 0.3932 |
| 3.1247 | 9.2722 | 86000 | 3.3104 | 0.3936 |
| 3.1265 | 9.3801 | 87000 | 3.3088 | 0.3936 |
| 3.1223 | 9.4879 | 88000 | 3.3064 | 0.3940 |
| 3.1268 | 9.5957 | 89000 | 3.3029 | 0.3943 |
| 3.1331 | 9.7035 | 90000 | 3.3005 | 0.3946 |
| 3.1425 | 9.8113 | 91000 | 3.2986 | 0.3948 |
| 3.1427 | 9.9191 | 92000 | 3.2975 | 0.3950 |

|
### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.20.1