# f108a1b2c1c40ccfff0ff9f1bbe49536
This model is a fine-tuned version of facebook/mbart-large-50-one-to-many-mmt on the Helsinki-NLP/opus_books [fi-no] dataset. It achieves the following results on the evaluation set:
- Loss: 4.3941
- Data Size: 1.0
- Epoch Runtime: 26.0804
- Bleu: 5.8488
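
A minimal inference sketch for this fine-tuned checkpoint is shown below. The example sentence and generation settings are illustrative, and the handling of the Norwegian target-language token is an assumption: the base mBART-50 checkpoint does not publish a Norwegian code, so inspect `tokenizer.lang_code_to_id` on this model's tokenizer before relying on it.

```python
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

model_id = "contemmcm/f108a1b2c1c40ccfff0ff9f1bbe49536"
tokenizer = MBart50TokenizerFast.from_pretrained(model_id)
model = MBartForConditionalGeneration.from_pretrained(model_id)

tokenizer.src_lang = "fi_FI"  # mBART-50 language code for Finnish
inputs = tokenizer("Hyvää huomenta, maailma!", return_tensors="pt")

# mBART-50 normally forces the target-language token as the first generated
# token. Norwegian is not among the base checkpoint's published codes, so the
# correct target code here is an assumption; check tokenizer.lang_code_to_id
# on this fine-tuned tokenizer and pass forced_bos_token_id accordingly.
generated = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```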
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
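
A hedged sketch of how these hyperparameters could map onto `Seq2SeqTrainingArguments` is shown below; the `output_dir` and the surrounding training-script structure are assumptions, not taken from the original training code.

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the per-device values listed above; the effective batch size of 32
# comes from 8 per device x 4 GPUs under distributed (multi-GPU) training.
training_args = Seq2SeqTrainingArguments(
    output_dir="mbart50-fi-no",        # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=50,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    predict_with_generate=True,        # needed so BLEU can be computed at eval time
)
```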
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Bleu |
|---|---|---|---|---|---|---|
| No log | 0 | 0 | 9.5660 | 0 | 2.3212 | 0.2197 |
| No log | 1 | 85 | 8.4854 | 0.0078 | 3.0738 | 0.1704 |
| No log | 2 | 170 | 7.5612 | 0.0156 | 3.6157 | 0.3846 |
| No log | 3 | 255 | 6.9790 | 0.0312 | 4.7630 | 0.6933 |
| No log | 4 | 340 | 6.3720 | 0.0625 | 6.5844 | 0.8409 |
| 0.4266 | 5 | 425 | 5.7273 | 0.125 | 8.4043 | 1.1656 |
| 0.4266 | 6 | 510 | 4.9046 | 0.25 | 10.9274 | 1.3784 |
| 1.486 | 7 | 595 | 4.3110 | 0.5 | 15.1796 | 2.9648 |
| 3.8665 | 8.0 | 680 | 3.9113 | 1.0 | 28.1134 | 5.0586 |
| 3.111 | 9.0 | 765 | 3.7840 | 1.0 | 27.1383 | 5.1805 |
| 2.6286 | 10.0 | 850 | 3.7717 | 1.0 | 25.5010 | 6.0196 |
| 2.1677 | 11.0 | 935 | 3.8344 | 1.0 | 25.7779 | 6.2014 |
| 1.7239 | 12.0 | 1020 | 3.9871 | 1.0 | 27.3611 | 6.6675 |
| 1.3949 | 13.0 | 1105 | 4.2133 | 1.0 | 26.2326 | 5.5582 |
| 1.1029 | 14.0 | 1190 | 4.3941 | 1.0 | 26.0804 | 5.8488 |
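
The BLEU column above is most plausibly computed with sacreBLEU over generated translations of the validation split; a minimal sketch using the `evaluate` library follows, with toy predictions and references standing in for real model outputs.

```python
import evaluate

bleu = evaluate.load("sacrebleu")

# Toy example: in practice, predictions are the model's generated Norwegian
# translations and references come from the opus_books fi-no validation split.
predictions = ["God morgen, verden"]        # hypothetical model output
references = [["God morgen, verden!"]]      # sacreBLEU expects a list of reference lists
print(bleu.compute(predictions=predictions, references=references)["score"])
```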
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1