# dense_g
This model was fine-tuned on the arrow dataset. It achieves the following results on the evaluation set:
- Loss: 4.1873
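
Assuming this is the usual token-level cross-entropy loss (in nats), it corresponds to an evaluation perplexity of roughly exp(4.1873) ≈ 65.8.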
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-06; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 48952
- training_steps: 489524
- mixed_precision_training: Native AMP
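
The values above can be expressed as a minimal `transformers.TrainingArguments` sketch, assuming the standard 🤗 Trainer was used and that "Native AMP" maps to `fp16=True`; the output directory and logging cadence are hypothetical, not taken from this card:

```python
from transformers import TrainingArguments

# Hedged sketch: mirrors the listed hyperparameters; output_dir is a hypothetical path.
training_args = TrainingArguments(
    output_dir="dense_g",               # hypothetical
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,      # effective train batch size: 8 * 4 = 32
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-6,
    lr_scheduler_type="linear",
    warmup_steps=48_952,
    max_steps=489_524,
    fp16=True,                          # Native AMP mixed precision
)
```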
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| No log | 0 | 0 | 10.9825 |
| 7.7212 | 0.2043 | 10000 | 7.6909 |
| 6.3374 | 0.4086 | 20000 | 6.2922 |
| 5.5216 | 0.6128 | 30000 | 5.4995 |
| 5.2003 | 0.8171 | 40000 | 5.1776 |
| 4.9271 | 1.0214 | 50000 | 4.9817 |
| 4.8086 | 1.2257 | 60000 | 4.8141 |
| 4.7051 | 1.4299 | 70000 | 4.6936 |
| 4.6072 | 1.6342 | 80000 | 4.6053 |
| 4.5483 | 1.8385 | 90000 | 4.5332 |
| 4.35 | 2.0428 | 100000 | 4.4781 |
| 4.3469 | 2.2471 | 110000 | 4.4411 |
| 4.329 | 2.4513 | 120000 | 4.4050 |
| 4.3141 | 2.6556 | 130000 | 4.3695 |
| 4.2936 | 2.8599 | 140000 | 4.3380 |
| 4.1027 | 3.0642 | 150000 | 4.3209 |
| 4.1292 | 3.2684 | 160000 | 4.3032 |
| 4.1466 | 3.4727 | 170000 | 4.2814 |
| 4.1399 | 3.6770 | 180000 | 4.2613 |
| 4.1276 | 3.8813 | 190000 | 4.2408 |
| 3.947 | 4.0856 | 200000 | 4.2432 |
| 3.9995 | 4.2898 | 210000 | 4.2317 |
| 3.9753 | 4.4941 | 220000 | 4.2176 |
| 3.9981 | 4.6984 | 230000 | 4.2013 |
| 4.0043 | 4.9027 | 240000 | 4.1867 |
| 3.807 | 5.1069 | 250000 | 4.2044 |
| 3.8409 | 5.3112 | 260000 | 4.1961 |
| 3.8578 | 5.5155 | 270000 | 4.1831 |
| 3.8771 | 5.7198 | 280000 | 4.1700 |
| 3.8906 | 5.9241 | 290000 | 4.1563 |
| 3.6785 | 6.1283 | 300000 | 4.1879 |
| 3.7006 | 6.3326 | 310000 | 4.1814 |
| 3.7424 | 6.5369 | 320000 | 4.1690 |
| 3.7664 | 6.7412 | 330000 | 4.1567 |
| 3.753 | 6.9454 | 340000 | 4.1439 |
| 3.5871 | 7.1497 | 350000 | 4.1852 |
| 3.6093 | 7.3540 | 360000 | 4.1786 |
| 3.6389 | 7.5583 | 370000 | 4.1669 |
| 3.6415 | 7.7626 | 380000 | 4.1574 |
| 3.6364 | 7.9668 | 390000 | 4.1467 |
| 3.4652 | 8.1711 | 400000 | 4.1893 |
| 3.4976 | 8.3754 | 410000 | 4.1848 |
| 3.4929 | 8.5797 | 420000 | 4.1783 |
| 3.5109 | 8.7839 | 430000 | 4.1717 |
| 3.5206 | 8.9882 | 440000 | 4.1624 |
| 3.3863 | 9.1925 | 450000 | 4.1990 |
| 3.3933 | 9.3968 | 460000 | 4.1967 |
| 3.3898 | 9.6011 | 470000 | 4.1919 |
| 3.3996 | 9.8053 | 480000 | 4.1888 |
### Framework versions
- Transformers 4.51.0
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
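
For reproducibility, a small sketch that compares the locally installed versions against the ones listed above (assuming these packages are installed):

```python
# Prints installed versions next to the versions this model was trained with.
import transformers, torch, datasets, tokenizers

trained_with = {
    "transformers": "4.51.0",
    "torch": "2.7.0+cu126",
    "datasets": "3.6.0",
    "tokenizers": "0.21.1",
}
for lib in (transformers, torch, datasets, tokenizers):
    print(f"{lib.__name__}: installed {lib.__version__}, trained with {trained_with[lib.__name__]}")
```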