# pretrain_gpt2___
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5927
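Since the card does not yet document usage, here is a minimal generation sketch, assuming the checkpoint is a standard GPT-2-style causal language model; the repo id below is an assumption and should be replaced with the actual Hub id or local path:

```python
# Minimal sketch: load the checkpoint as a causal LM and generate text.
# "pretrain_gpt2___" is an assumed repo id, not confirmed by the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("pretrain_gpt2___")
model = AutoModelForCausalLM.from_pretrained("pretrain_gpt2___")

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```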
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training; a configuration sketch follows the list:
- learning_rate: 5e-05
- train_batch_size: 512
- eval_batch_size: 512
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.95) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 312500
- num_epochs: 10
- mixed_precision_training: Native AMP
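As a minimal sketch, assuming training used the Hugging Face `Trainer`, the hyperparameters above map onto `TrainingArguments` as follows. The `per_device_*` batch sizes assume single-device training (the card only reports totals), and the output directory is a placeholder:

```python
# Sketch of a Trainer configuration matching the reported hyperparameters.
# Not the author's verified training script; model and datasets are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="pretrain_gpt2___",   # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=512,  # assumes a single device
    per_device_eval_batch_size=512,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=312500,
    num_train_epochs=10,
    fp16=True,  # Native AMP mixed-precision training
)
# These args would then be passed to transformers.Trainer together with
# the model, tokenizer, and train/eval datasets.
```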
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.408 | 0.3775 | 1000 | 1.5666 |
| 0.3475 | 0.7550 | 2000 | 1.3169 |
| 0.3182 | 1.1325 | 3000 | 1.1971 |
| 0.293 | 1.5100 | 4000 | 1.0951 |
| 0.2728 | 1.8875 | 5000 | 1.0073 |
| 0.2552 | 2.2650 | 6000 | 0.9333 |
| 0.2417 | 2.6425 | 7000 | 0.8703 |
| 0.227 | 3.0200 | 8000 | 0.8206 |
| 0.2185 | 3.3975 | 9000 | 0.7925 |
| 0.2103 | 3.7750 | 10000 | 0.7635 |
| 0.2034 | 4.1525 | 11000 | 0.7346 |
| 0.1967 | 4.5300 | 12000 | 0.7210 |
| 0.1895 | 4.9075 | 13000 | 0.7079 |
| 0.1875 | 5.2850 | 14000 | 0.6897 |
| 0.1844 | 5.6625 | 15000 | 0.6761 |
| 0.1782 | 6.0400 | 16000 | 0.6619 |
| 0.1758 | 6.4175 | 17000 | 0.6534 |
| 0.1744 | 6.7950 | 18000 | 0.6451 |
| 0.1732 | 7.1725 | 19000 | 0.6344 |
| 0.1672 | 7.5500 | 20000 | 0.6276 |
| 0.168 | 7.9275 | 21000 | 0.6197 |
| 0.1661 | 8.3050 | 22000 | 0.6124 |
| 0.1627 | 8.6825 | 23000 | 0.6080 |
| 0.1593 | 9.0600 | 24000 | 0.6042 |
| 0.1586 | 9.4375 | 25000 | 0.5978 |
| 0.1589 | 9.8150 | 26000 | 0.5927 |
### Framework versions
- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
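A quick sketch for confirming that a local environment matches these versions (package names are the standard PyPI distributions):

```python
# Check installed versions against those used for training.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # trained with 4.51.1
print("PyTorch:", torch.__version__)              # trained with 2.6.0+cu124
print("Datasets:", datasets.__version__)          # trained with 3.5.0
print("Tokenizers:", tokenizers.__version__)      # trained with 0.21.1
```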