# moe_g_topk_1
This model is a fine-tuned version of an unspecified base model on the arrow dataset. It achieves the following results on the evaluation set:
- Loss: 4.2292
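
If the reported loss is the usual mean token-level cross-entropy in nats for a language model (an assumption; the auto-generated card does not say), it corresponds to an evaluation perplexity of roughly exp(4.2292) ≈ 68.7:

```python
import math

# Perplexity from mean cross-entropy loss (assumes the loss is in nats).
print(math.exp(4.2292))  # ≈ 68.66
```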
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-06; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 48952
- training_steps: 489524
- mixed_precision_training: Native AMP
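
As referenced above, these settings map onto the standard `transformers.TrainingArguments` API roughly as follows. This is a hedged sketch, not the original training script: the `output_dir` is an assumption, and "Native AMP" is rendered here as `fp16=True`. Note that the total train batch size of 32 is the per-device batch size (8) times the gradient accumulation steps (4), and the warmup (48,952 steps) is about 10% of the total training steps (489,524).

```python
from transformers import TrainingArguments

# A minimal reconstruction of the reported settings (Transformers 4.51).
training_args = TrainingArguments(
    output_dir="moe_g_topk_1",        # assumed; the card does not name it
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,    # effective train batch size: 8 * 4 = 32
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-6,
    lr_scheduler_type="linear",
    warmup_steps=48_952,              # ~10% of max_steps
    max_steps=489_524,
    fp16=True,                        # "Native AMP"; fp16 (not bf16) is assumed
)
```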
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| No log | 0 | 0 | 10.9707 |
| 7.8926 | 0.2043 | 10000 | 7.8519 |
| 6.5381 | 0.4086 | 20000 | 6.4901 |
| 5.7257 | 0.6128 | 30000 | 5.6892 |
| 5.3423 | 0.8171 | 40000 | 5.3096 |
| 5.0446 | 1.0214 | 50000 | 5.0817 |
| 4.9015 | 1.2257 | 60000 | 4.8898 |
| 4.7889 | 1.4299 | 70000 | 4.7631 |
| 4.6863 | 1.6342 | 80000 | 4.6713 |
| 4.6214 | 1.8385 | 90000 | 4.5971 |
| 4.4259 | 2.0428 | 100000 | 4.5438 |
| 4.4244 | 2.2471 | 110000 | 4.5033 |
| 4.4051 | 2.4513 | 120000 | 4.4638 |
| 4.3872 | 2.6556 | 130000 | 4.4259 |
| 4.3639 | 2.8599 | 140000 | 4.3928 |
| 4.1714 | 3.0642 | 150000 | 4.3797 |
| 4.1964 | 3.2684 | 160000 | 4.3594 |
| 4.2105 | 3.4727 | 170000 | 4.3367 |
| 4.2028 | 3.6770 | 180000 | 4.3129 |
| 4.1899 | 3.8813 | 190000 | 4.2912 |
| 4.0088 | 4.0856 | 200000 | 4.2995 |
| 4.0585 | 4.2898 | 210000 | 4.2853 |
| 4.0317 | 4.4941 | 220000 | 4.2693 |
| 4.0556 | 4.6984 | 230000 | 4.2503 |
| 4.0647 | 4.9027 | 240000 | 4.2322 |
| 4.0351 | 5.0 | 244765 | 4.2236 |
| 3.888 | 5.1070 | 250000 | 4.2534 |
| 3.8887 | 5.3113 | 260000 | 4.2466 |
| 3.886 | 5.5156 | 270000 | 4.2317 |
| 3.9241 | 5.7199 | 280000 | 4.2154 |
| 3.9185 | 5.9242 | 290000 | 4.2018 |
| 3.7245 | 6.1285 | 300000 | 4.2359 |
| 3.7472 | 6.3327 | 310000 | 4.2270 |
| 3.7867 | 6.5370 | 320000 | 4.2154 |
| 3.8089 | 6.7413 | 330000 | 4.2018 |
| 3.7919 | 6.9456 | 340000 | 4.1873 |
| 3.6269 | 7.1498 | 350000 | 4.2293 |
| 3.6411 | 7.3541 | 360000 | 4.2234 |
| 3.6791 | 7.5584 | 370000 | 4.2120 |
| 3.6751 | 7.7627 | 380000 | 4.2034 |
| 3.671 | 7.9670 | 390000 | 4.1918 |
| 3.498 | 8.1712 | 400000 | 4.2328 |
| 3.5276 | 8.3755 | 410000 | 4.2280 |
| 3.5155 | 8.5798 | 420000 | 4.2226 |
| 3.5276 | 8.7841 | 430000 | 4.2143 |
| 3.5387 | 8.9883 | 440000 | 4.2068 |
| 3.4148 | 9.1926 | 450000 | 4.2388 |
| 3.4151 | 9.3969 | 460000 | 4.2367 |
| 3.4164 | 9.6012 | 470000 | 4.2339 |
| 3.4251 | 9.8055 | 480000 | 4.2306 |
### Framework versions
- Transformers 4.51.0
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1