# 100M_high_10_495
This model is a fine-tuned version of an unspecified base model, trained on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 3.3025
- Accuracy: 0.3945
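If the reported loss is mean token-level cross-entropy in nats (typical for `Trainer` language-modeling runs, though the card does not state this explicitly), the final validation perplexity can be derived directly from it:

```python
import math

# Final validation loss from the results table; perplexity = exp(loss),
# assuming the loss is cross-entropy measured in nats.
eval_loss = 3.3025
print(f"perplexity ~ {math.exp(eval_loss):.1f}")  # ~ 27.2
```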
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0006
- train_batch_size: 32
- eval_batch_size: 16
- seed: 495
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 10
- mixed_precision_training: Native AMP
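For reproduction with the Hugging Face `Trainer`, the settings above map onto `TrainingArguments` roughly as follows. This is a sketch, not the authors' script: `output_dir` is a placeholder, and `fp16=True` is an assumption based on the "Native AMP" note.

```python
from transformers import TrainingArguments

# Minimal sketch of the reported training configuration.
training_args = TrainingArguments(
    output_dir="100M_high_10_495",   # placeholder; actual value not given
    learning_rate=6e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=495,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=10,
    fp16=True,  # assumed from "Native AMP" mixed-precision training
)
```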
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 5.134 | 0.1078 | 1000 | 5.0580 | 0.2236 |
| 4.6368 | 0.2156 | 2000 | 4.5903 | 0.2611 |
| 4.3722 | 0.3235 | 3000 | 4.2728 | 0.2935 |
| 4.1956 | 0.4313 | 4000 | 4.1237 | 0.3089 |
| 4.076 | 0.5391 | 5000 | 4.0201 | 0.3186 |
| 4.0178 | 0.6469 | 6000 | 3.9412 | 0.3250 |
| 3.9497 | 0.7547 | 7000 | 3.8834 | 0.3313 |
| 3.8934 | 0.8625 | 8000 | 3.8358 | 0.3355 |
| 3.8623 | 0.9704 | 9000 | 3.7968 | 0.3398 |
| 3.7815 | 1.0782 | 10000 | 3.7646 | 0.3427 |
| 3.792 | 1.1860 | 11000 | 3.7366 | 0.3450 |
| 3.7548 | 1.2938 | 12000 | 3.7139 | 0.3474 |
| 3.7231 | 1.4016 | 13000 | 3.6877 | 0.3503 |
| 3.7126 | 1.5094 | 14000 | 3.6673 | 0.3523 |
| 3.6994 | 1.6173 | 15000 | 3.6493 | 0.3544 |
| 3.6737 | 1.7251 | 16000 | 3.6299 | 0.3563 |
| 3.6732 | 1.8329 | 17000 | 3.6185 | 0.3575 |
| 3.65 | 1.9407 | 18000 | 3.6006 | 0.3591 |
| 3.5935 | 2.0485 | 19000 | 3.5908 | 0.3605 |
| 3.5703 | 2.1563 | 20000 | 3.5840 | 0.3615 |
| 3.5583 | 2.2642 | 21000 | 3.5720 | 0.3625 |
| 3.5614 | 2.3720 | 22000 | 3.5617 | 0.3641 |
| 3.5545 | 2.4798 | 23000 | 3.5481 | 0.3650 |
| 3.5468 | 2.5876 | 24000 | 3.5416 | 0.3659 |
| 3.5558 | 2.6954 | 25000 | 3.5304 | 0.3667 |
| 3.5606 | 2.8032 | 26000 | 3.5206 | 0.3677 |
| 3.5451 | 2.9111 | 27000 | 3.5127 | 0.3688 |
| 3.4453 | 3.0189 | 28000 | 3.5084 | 0.3693 |
| 3.4629 | 3.1267 | 29000 | 3.5049 | 0.3699 |
| 3.4768 | 3.2345 | 30000 | 3.4970 | 0.3707 |
| 3.465 | 3.3423 | 31000 | 3.4925 | 0.3718 |
| 3.473 | 3.4501 | 32000 | 3.4848 | 0.3721 |
| 3.4612 | 3.5580 | 33000 | 3.4789 | 0.3727 |
| 3.4835 | 3.6658 | 34000 | 3.4723 | 0.3734 |
| 3.4624 | 3.7736 | 35000 | 3.4668 | 0.3740 |
| 3.4481 | 3.8814 | 36000 | 3.4592 | 0.3747 |
| 3.4675 | 3.9892 | 37000 | 3.4562 | 0.3753 |
| 3.3835 | 4.0970 | 38000 | 3.4573 | 0.3759 |
| 3.3926 | 4.2049 | 39000 | 3.4502 | 0.3762 |
| 3.397 | 4.3127 | 40000 | 3.4472 | 0.3767 |
| 3.4106 | 4.4205 | 41000 | 3.4399 | 0.3773 |
| 3.4111 | 4.5283 | 42000 | 3.4362 | 0.3778 |
| 3.4011 | 4.6361 | 43000 | 3.4296 | 0.3784 |
| 3.4046 | 4.7439 | 44000 | 3.4268 | 0.3792 |
| 3.4087 | 4.8518 | 45000 | 3.4196 | 0.3793 |
| 3.3922 | 4.9596 | 46000 | 3.4159 | 0.3799 |
| 3.3178 | 5.0674 | 47000 | 3.4214 | 0.3799 |
| 3.3207 | 5.1752 | 48000 | 3.4158 | 0.3806 |
| 3.3454 | 5.2830 | 49000 | 3.4116 | 0.3809 |
| 3.3477 | 5.3908 | 50000 | 3.4100 | 0.3813 |
| 3.3489 | 5.4987 | 51000 | 3.4046 | 0.3819 |
| 3.3582 | 5.6065 | 52000 | 3.4000 | 0.3823 |
| 3.3495 | 5.7143 | 53000 | 3.3953 | 0.3825 |
| 3.3373 | 5.8221 | 54000 | 3.3897 | 0.3829 |
| 3.358 | 5.9299 | 55000 | 3.3864 | 0.3838 |
| 3.2566 | 6.0377 | 56000 | 3.3888 | 0.3837 |
| 3.2756 | 6.1456 | 57000 | 3.3863 | 0.3838 |
| 3.2921 | 6.2534 | 58000 | 3.3858 | 0.3843 |
| 3.2778 | 6.3612 | 59000 | 3.3799 | 0.3848 |
| 3.2944 | 6.4690 | 60000 | 3.3771 | 0.3847 |
| 3.2839 | 6.5768 | 61000 | 3.3747 | 0.3853 |
| 3.304 | 6.6846 | 62000 | 3.3685 | 0.3860 |
| 3.2808 | 6.7925 | 63000 | 3.3651 | 0.3861 |
| 3.2843 | 6.9003 | 64000 | 3.3602 | 0.3868 |
| 3.2143 | 7.0081 | 65000 | 3.3621 | 0.3870 |
| 3.2068 | 7.1159 | 66000 | 3.3643 | 0.3868 |
| 3.2336 | 7.2237 | 67000 | 3.3611 | 0.3877 |
| 3.2329 | 7.3315 | 68000 | 3.3571 | 0.3876 |
| 3.23 | 7.4394 | 69000 | 3.3532 | 0.3876 |
| 3.2397 | 7.5472 | 70000 | 3.3483 | 0.3883 |
| 3.234 | 7.6550 | 71000 | 3.3463 | 0.3887 |
| 3.2382 | 7.7628 | 72000 | 3.3419 | 0.3892 |
| 3.2463 | 7.8706 | 73000 | 3.3385 | 0.3894 |
| 3.2581 | 7.9784 | 74000 | 3.3345 | 0.3900 |
| 3.1713 | 8.0863 | 75000 | 3.3392 | 0.3899 |
| 3.1919 | 8.1941 | 76000 | 3.3388 | 0.3901 |
| 3.1911 | 8.3019 | 77000 | 3.3363 | 0.3903 |
| 3.1846 | 8.4097 | 78000 | 3.3330 | 0.3906 |
| 3.1829 | 8.5175 | 79000 | 3.3299 | 0.3911 |
| 3.2168 | 8.6253 | 80000 | 3.3248 | 0.3915 |
| 3.1719 | 8.7332 | 81000 | 3.3222 | 0.3916 |
| 3.1861 | 8.8410 | 82000 | 3.3203 | 0.3920 |
| 3.1994 | 8.9488 | 83000 | 3.3157 | 0.3926 |
| 3.1367 | 9.0566 | 84000 | 3.3206 | 0.3924 |
| 3.1305 | 9.1644 | 85000 | 3.3179 | 0.3925 |
| 3.1315 | 9.2722 | 86000 | 3.3150 | 0.3930 |
| 3.1359 | 9.3801 | 87000 | 3.3135 | 0.3933 |
| 3.1313 | 9.4879 | 88000 | 3.3108 | 0.3935 |
| 3.1359 | 9.5957 | 89000 | 3.3081 | 0.3937 |
| 3.1414 | 9.7035 | 90000 | 3.3063 | 0.3940 |
| 3.1497 | 9.8113 | 91000 | 3.3036 | 0.3943 |
| 3.15 | 9.9191 | 92000 | 3.3025 | 0.3945 |
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.20.1
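A minimal loading sketch, assuming the checkpoint is a causal language model (the card does not name the architecture) and using a hypothetical repo id, since the full hub path is not given:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id; replace with the real hub path for this model.
repo_id = "your-namespace/100M_high_10_495"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```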