# moe_het_100m_dist
This model is a fine-tuned version of an unspecified base model on the arrow dataset. It achieves the following results on the evaluation set:
- Loss: 4.3575
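Since the card does not name the base model, the snippet below is only a minimal loading sketch: it assumes a causal language model and uses the title `moe_het_100m_dist` as a placeholder repository id (a namespace prefix will likely be needed).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository id taken from the card title; prepend the
# owning namespace (e.g. "user/moe_het_100m_dist") before use.
model_id = "moe_het_100m_dist"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Assumes a causal LM head; the card does not state the architecture.
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```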
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- total_eval_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-06; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 48952
- training_steps: 489524
- mixed_precision_training: Native AMP
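As a rough translation into code, the following `TrainingArguments` sketch mirrors the values above; `output_dir` is a placeholder, and fp16 is assumed for the "Native AMP" setting (bf16 is also possible).

```python
from transformers import TrainingArguments

# A sketch of the reported configuration; output_dir is a placeholder.
# With 4 GPUs, per-device batch size 2 and 4 accumulation steps give an
# effective train batch size of 2 * 4 * 4 = 32, matching the card.
training_args = TrainingArguments(
    output_dir="moe_het_100m_dist",
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-6,
    lr_scheduler_type="linear",
    warmup_steps=48_952,
    max_steps=489_524,
    fp16=True,  # "Native AMP"; bf16=True is an alternative
)
```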
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 7.7351 | 0.2043 | 10000 | 7.6727 |
| 6.3271 | 0.4086 | 20000 | 6.2540 |
| 5.5336 | 0.6128 | 30000 | 5.4857 |
| 5.1857 | 0.8171 | 40000 | 5.1390 |
| 4.8938 | 1.0214 | 50000 | 4.9301 |
| 4.7662 | 1.2257 | 60000 | 4.7527 |
| 4.6548 | 1.4299 | 70000 | 4.6254 |
| 4.5533 | 1.6342 | 80000 | 4.5319 |
| 4.491 | 1.8385 | 90000 | 4.4567 |
| 4.2623 | 2.0428 | 100000 | 4.3990 |
| 4.2597 | 2.2471 | 110000 | 4.3614 |
| 4.25 | 2.4513 | 120000 | 4.3254 |
| 4.2357 | 2.6556 | 130000 | 4.2882 |
| 4.2135 | 2.8599 | 140000 | 4.2564 |
| 3.9788 | 3.0642 | 150000 | 4.2441 |
| 4.0113 | 3.2684 | 160000 | 4.2275 |
| 4.0336 | 3.4727 | 170000 | 4.2071 |
| 4.0307 | 3.6770 | 180000 | 4.1868 |
| 4.0187 | 3.8813 | 190000 | 4.1665 |
| 3.7841 | 4.0856 | 200000 | 4.1848 |
| 3.8453 | 4.2898 | 210000 | 4.1761 |
| 3.8253 | 4.4941 | 220000 | 4.1622 |
| 3.8533 | 4.6984 | 230000 | 4.1445 |
| 3.8685 | 4.9027 | 240000 | 4.1282 |
| 3.5935 | 5.1069 | 250000 | 4.1764 |
| 3.6434 | 5.3112 | 260000 | 4.1689 |
| 3.6699 | 5.5155 | 270000 | 4.1565 |
| 3.6989 | 5.7198 | 280000 | 4.1415 |
| 3.717 | 5.9241 | 290000 | 4.1273 |
| 3.4139 | 6.1283 | 300000 | 4.1985 |
| 3.4533 | 6.3326 | 310000 | 4.1953 |
| 3.5066 | 6.5369 | 320000 | 4.1815 |
| 3.5352 | 6.7412 | 330000 | 4.1680 |
| 3.524 | 6.9454 | 340000 | 4.1530 |
| 3.2713 | 7.1497 | 350000 | 4.2425 |
| 3.3095 | 7.3540 | 360000 | 4.2411 |
| 3.3473 | 7.5583 | 370000 | 4.2315 |
| 3.3566 | 7.7626 | 380000 | 4.2210 |
| 3.3533 | 7.9668 | 390000 | 4.2091 |
| 3.102 | 8.1711 | 400000 | 4.2994 |
| 3.1389 | 8.3754 | 410000 | 4.3031 |
| 3.146 | 8.5797 | 420000 | 4.2976 |
| 3.1606 | 8.7839 | 430000 | 4.2925 |
| 3.1767 | 8.9882 | 440000 | 4.2834 |
| 2.9702 | 9.1925 | 450000 | 4.3574 |
| 2.9808 | 9.3968 | 460000 | 4.3613 |
| 2.9823 | 9.6011 | 470000 | 4.3604 |
| 2.9932 | 9.8053 | 480000 | 4.3578 |
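Validation loss bottoms out at roughly 4.128 around step 240,000 (epoch ~4.9) and rises in later epochs while training loss keeps falling, which is consistent with overfitting. If the loss is the usual per-token cross-entropy in nats (an assumption; the card does not say), it converts to perplexity as exp(loss):

```python
import math

# Assumes per-token cross-entropy in nats (not stated on the card).
for step, val_loss in [(240_000, 4.1282), (480_000, 4.3578)]:
    print(f"step {step:>7}: perplexity ≈ {math.exp(val_loss):.1f}")
# step  240000: perplexity ≈ 62.1
# step  480000: perplexity ≈ 78.1
```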
### Framework versions
- Transformers 4.57.1
- PyTorch 2.7.1+cu118
- Datasets 3.6.0
- Tokenizers 0.22.1