---
library_name: transformers
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: 100M_low_2000_8397
  results: []
---

# 100M_low_2000_8397

This model is a fine-tuned version of an unspecified base model on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.3026
- Accuracy: 0.3942

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0006
- train_batch_size: 32
- eval_batch_size: 16
- seed: 8397
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 10
- mixed_precision_training: Native AMP
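For reference, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows. This is a reconstruction, not the original training script: `output_dir` and the evaluation cadence are assumptions (the results table suggests evaluation every 1000 steps), and the batch sizes are interpreted as per-device values.

```python
from transformers import TrainingArguments

# A sketch of the hyperparameters above as TrainingArguments.
# output_dir, eval_strategy, and eval_steps are assumptions, not taken
# from this card; batch sizes are interpreted as per-device.
training_args = TrainingArguments(
    output_dir="100M_low_2000_8397",  # assumed
    learning_rate=6e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=8397,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=10,
    fp16=True,              # "Native AMP" mixed-precision training
    eval_strategy="steps",  # assumed from the eval-every-1000-steps table
    eval_steps=1000,
)
```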
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:--------:|
| 5.1027 | 0.1078 | 1000 | 5.0253 | 0.2270 |
| 4.5966 | 0.2156 | 2000 | 4.5191 | 0.2694 |
| 4.3193 | 0.3235 | 3000 | 4.2507 | 0.2971 |
| 4.1641 | 0.4313 | 4000 | 4.0930 | 0.3124 |
| 4.0541 | 0.5391 | 5000 | 3.9960 | 0.3209 |
| 4.0006 | 0.6469 | 6000 | 3.9242 | 0.3277 |
| 3.9272 | 0.7547 | 7000 | 3.8659 | 0.3329 |
| 3.8604 | 0.8625 | 8000 | 3.8231 | 0.3370 |
| 3.8606 | 0.9704 | 9000 | 3.7847 | 0.3405 |
| 3.7669 | 1.0782 | 10000 | 3.7536 | 0.3440 |
| 3.7755 | 1.1860 | 11000 | 3.7266 | 0.3468 |
| 3.7364 | 1.2938 | 12000 | 3.6996 | 0.3489 |
| 3.7135 | 1.4016 | 13000 | 3.6761 | 0.3513 |
| 3.7085 | 1.5094 | 14000 | 3.6597 | 0.3533 |
| 3.6698 | 1.6173 | 15000 | 3.6416 | 0.3552 |
| 3.6694 | 1.7251 | 16000 | 3.6238 | 0.3571 |
| 3.6432 | 1.8329 | 17000 | 3.6079 | 0.3585 |
| 3.6354 | 1.9407 | 18000 | 3.5928 | 0.3595 |
| 3.5732 | 2.0485 | 19000 | 3.5855 | 0.3613 |
| 3.5788 | 2.1563 | 20000 | 3.5758 | 0.3621 |
| 3.5674 | 2.2642 | 21000 | 3.5648 | 0.3634 |
| 3.5759 | 2.3720 | 22000 | 3.5541 | 0.3642 |
| 3.5439 | 2.4798 | 23000 | 3.5441 | 0.3654 |
| 3.5368 | 2.5876 | 24000 | 3.5353 | 0.3664 |
| 3.5327 | 2.6954 | 25000 | 3.5247 | 0.3671 |
| 3.5385 | 2.8032 | 26000 | 3.5150 | 0.3682 |
| 3.5497 | 2.9111 | 27000 | 3.5089 | 0.3690 |
| 3.4502 | 3.0189 | 28000 | 3.5044 | 0.3698 |
| 3.4347 | 3.1267 | 29000 | 3.5016 | 0.3704 |
| 3.4554 | 3.2345 | 30000 | 3.4930 | 0.3712 |
| 3.4623 | 3.3423 | 31000 | 3.4872 | 0.3717 |
| 3.4633 | 3.4501 | 32000 | 3.4791 | 0.3728 |
| 3.4665 | 3.5580 | 33000 | 3.4755 | 0.3732 |
| 3.4633 | 3.6658 | 34000 | 3.4683 | 0.3734 |
| 3.4656 | 3.7736 | 35000 | 3.4613 | 0.3744 |
| 3.4539 | 3.8814 | 36000 | 3.4559 | 0.3748 |
| 3.4528 | 3.9892 | 37000 | 3.4474 | 0.3758 |
| 3.3674 | 4.0970 | 38000 | 3.4528 | 0.3760 |
| 3.369 | 4.2049 | 39000 | 3.4489 | 0.3769 |
| 3.392 | 4.3127 | 40000 | 3.4412 | 0.3772 |
| 3.3994 | 4.4205 | 41000 | 3.4375 | 0.3774 |
| 3.3909 | 4.5283 | 42000 | 3.4333 | 0.3779 |
| 3.4005 | 4.6361 | 43000 | 3.4265 | 0.3783 |
| 3.3928 | 4.7439 | 44000 | 3.4222 | 0.3788 |
| 3.4106 | 4.8518 | 45000 | 3.4176 | 0.3796 |
| 3.3782 | 4.9596 | 46000 | 3.4130 | 0.3801 |
| 3.3204 | 5.0674 | 47000 | 3.4155 | 0.3804 |
| 3.3185 | 5.1752 | 48000 | 3.4171 | 0.3807 |
| 3.3484 | 5.2830 | 49000 | 3.4122 | 0.3810 |
| 3.3482 | 5.3908 | 50000 | 3.4060 | 0.3813 |
| 3.3511 | 5.4987 | 51000 | 3.4023 | 0.3817 |
| 3.3221 | 5.6065 | 52000 | 3.3966 | 0.3822 |
| 3.339 | 5.7143 | 53000 | 3.3923 | 0.3826 |
| 3.3373 | 5.8221 | 54000 | 3.3882 | 0.3830 |
| 3.3391 | 5.9299 | 55000 | 3.3852 | 0.3835 |
| 3.2419 | 6.0377 | 56000 | 3.3884 | 0.3835 |
| 3.2621 | 6.1456 | 57000 | 3.3875 | 0.3836 |
| 3.2821 | 6.2534 | 58000 | 3.3831 | 0.3844 |
| 3.2945 | 6.3612 | 59000 | 3.3804 | 0.3844 |
| 3.2938 | 6.4690 | 60000 | 3.3760 | 0.3850 |
| 3.2862 | 6.5768 | 61000 | 3.3713 | 0.3856 |
| 3.3055 | 6.6846 | 62000 | 3.3657 | 0.3858 |
| 3.2823 | 6.7925 | 63000 | 3.3631 | 0.3859 |
| 3.2918 | 6.9003 | 64000 | 3.3592 | 0.3866 |
| 3.187 | 7.0081 | 65000 | 3.3601 | 0.3866 |
| 3.2306 | 7.1159 | 66000 | 3.3626 | 0.3869 |
| 3.2341 | 7.2237 | 67000 | 3.3604 | 0.3871 |
| 3.2319 | 7.3315 | 68000 | 3.3570 | 0.3874 |
| 3.224 | 7.4394 | 69000 | 3.3514 | 0.3878 |
| 3.2291 | 7.5472 | 70000 | 3.3487 | 0.3879 |
| 3.2551 | 7.6550 | 71000 | 3.3450 | 0.3887 |
| 3.2557 | 7.7628 | 72000 | 3.3420 | 0.3890 |
| 3.2361 | 7.8706 | 73000 | 3.3377 | 0.3893 |
| 3.2561 | 7.9784 | 74000 | 3.3330 | 0.3899 |
| 3.1611 | 8.0863 | 75000 | 3.3395 | 0.3898 |
| 3.1624 | 8.1941 | 76000 | 3.3378 | 0.3899 |
| 3.1808 | 8.3019 | 77000 | 3.3341 | 0.3901 |
| 3.1735 | 8.4097 | 78000 | 3.3307 | 0.3905 |
| 3.185 | 8.5175 | 79000 | 3.3277 | 0.3908 |
| 3.201 | 8.6253 | 80000 | 3.3242 | 0.3914 |
| 3.1937 | 8.7332 | 81000 | 3.3214 | 0.3917 |
| 3.1875 | 8.8410 | 82000 | 3.3186 | 0.3917 |
| 3.171 | 8.9488 | 83000 | 3.3150 | 0.3924 |
| 3.133 | 9.0566 | 84000 | 3.3170 | 0.3924 |
| 3.1247 | 9.1644 | 85000 | 3.3170 | 0.3924 |
| 3.1504 | 9.2722 | 86000 | 3.3144 | 0.3928 |
| 3.1282 | 9.3801 | 87000 | 3.3134 | 0.3930 |
| 3.1361 | 9.4879 | 88000 | 3.3104 | 0.3932 |
| 3.1146 | 9.5957 | 89000 | 3.3085 | 0.3936 |
| 3.1429 | 9.7035 | 90000 | 3.3053 | 0.3940 |
| 3.1332 | 9.8113 | 91000 | 3.3040 | 0.3941 |
| 3.1257 | 9.9191 | 92000 | 3.3026 | 0.3942 |

### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.20.1
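The reported validation loss can also be read as a perplexity, assuming it is mean token-level cross-entropy in nats, as is typical for transformers language-modeling runs; the card itself does not state the loss function. A minimal sketch under that assumption:

```python
import math

# Convert the final validation loss to perplexity.
# Assumption: the loss is mean cross-entropy in nats (the usual setup
# for transformers language modeling); the card does not state this.
final_eval_loss = 3.3026
perplexity = math.exp(final_eval_loss)
print(f"final eval perplexity ≈ {perplexity:.1f}")  # ≈ 27.2
```

Under the same assumption, the reported accuracy of 0.3942 would be next-token prediction accuracy on the evaluation set.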