# pretrained_dl_05_17
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5137
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 512
- eval_batch_size: 512
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.95) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 78125
- num_epochs: 9
- mixed_precision_training: Native AMP
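
For reference, the configuration above maps onto the `transformers` `TrainingArguments` API roughly as follows. This is a minimal sketch, not the original training script: the output directory is a placeholder, and the card does not say whether the batch size of 512 is global or per device.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments mirroring the hyperparameters listed above.
# output_dir is a placeholder; adjust batch sizes for multi-GPU setups.
training_args = TrainingArguments(
    output_dir="pretrained_dl_05_17",
    learning_rate=5e-05,
    per_device_train_batch_size=512,
    per_device_eval_batch_size=512,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    warmup_steps=78125,
    num_train_epochs=9,
    fp16=True,  # "Native AMP" mixed-precision training (could also be bf16)
)
```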
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.5933 | 0.1888 | 500 | 1.5166 |
| 1.3224 | 0.3775 | 1000 | 1.2561 |
| 1.1864 | 0.5663 | 1500 | 1.1098 |
| 1.0794 | 0.7550 | 2000 | 0.9943 |
| 0.9888 | 0.9438 | 2500 | 0.8991 |
| 0.9136 | 1.1325 | 3000 | 0.8287 |
| 0.8615 | 1.3213 | 3500 | 0.7879 |
| 0.8248 | 1.5100 | 4000 | 0.7525 |
| 0.7865 | 1.6988 | 4500 | 0.7258 |
| 0.7581 | 1.8875 | 5000 | 0.6998 |
| 0.7368 | 2.0763 | 5500 | 0.6856 |
| 0.7152 | 2.2650 | 6000 | 0.6644 |
| 0.701 | 2.4538 | 6500 | 0.6521 |
| 0.6883 | 2.6425 | 7000 | 0.6419 |
| 0.6732 | 2.8313 | 7500 | 0.6280 |
| 0.6645 | 3.0200 | 8000 | 0.6186 |
| 0.6551 | 3.2088 | 8500 | 0.6079 |
| 0.6472 | 3.3975 | 9000 | 0.6030 |
| 0.6407 | 3.5863 | 9500 | 0.5939 |
| 0.6289 | 3.7750 | 10000 | 0.5869 |
| 0.6185 | 3.9638 | 10500 | 0.5830 |
| 0.6219 | 4.1525 | 11000 | 0.5780 |
| 0.6102 | 4.3413 | 11500 | 0.5731 |
| 0.6115 | 4.5300 | 12000 | 0.5675 |
| 0.6023 | 4.7188 | 12500 | 0.5638 |
| 0.5937 | 4.9075 | 13000 | 0.5596 |
| 0.5928 | 5.0963 | 13500 | 0.5565 |
| 0.5874 | 5.2850 | 14000 | 0.5533 |
| 0.5822 | 5.4738 | 14500 | 0.5492 |
| 0.5782 | 5.6625 | 15000 | 0.5481 |
| 0.577 | 5.8513 | 15500 | 0.5445 |
| 0.5741 | 6.0400 | 16000 | 0.5413 |
| 0.5715 | 6.2288 | 16500 | 0.5380 |
| 0.5674 | 6.4175 | 17000 | 0.5365 |
| 0.5708 | 6.6063 | 17500 | 0.5344 |
| 0.5634 | 6.7950 | 18000 | 0.5323 |
| 0.5616 | 6.9838 | 18500 | 0.5313 |
| 0.5625 | 7.1725 | 19000 | 0.5285 |
| 0.5547 | 7.3613 | 19500 | 0.5270 |
| 0.5497 | 7.5500 | 20000 | 0.5257 |
| 0.5551 | 7.7388 | 20500 | 0.5234 |
| 0.5494 | 7.9275 | 21000 | 0.5211 |
| 0.5444 | 8.1163 | 21500 | 0.5192 |
| 0.547 | 8.3050 | 22000 | 0.5179 |
| 0.5415 | 8.4938 | 22500 | 0.5163 |
| 0.5414 | 8.6825 | 23000 | 0.5147 |
| 0.5397 | 8.8713 | 23500 | 0.5137 |
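
If the validation loss is a mean per-token cross-entropy (the card does not state the training objective), the final value of 0.5137 corresponds to a perplexity of roughly exp(0.5137) ≈ 1.67:

```python
import math

# Assumes the reported loss is a mean per-token cross-entropy,
# which the card does not confirm.
final_val_loss = 0.5137
print(math.exp(final_val_loss))  # ≈ 1.671
```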
### Framework versions
- Transformers 4.51.1
- PyTorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
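
A quick way to check that a local environment matches these versions (the import names below are the standard package modules; the PyTorch build string also encodes the CUDA version):

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions from the card: 4.51.1, 2.6.0+cu124, 3.5.0, 0.21.1
for name, module in [("Transformers", transformers), ("PyTorch", torch),
                     ("Datasets", datasets), ("Tokenizers", tokenizers)]:
    print(f"{name}: {module.__version__}")
```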