# 85e7f5e9fd9f27aaa06ad576c111ec0e
This model is a fine-tuned version of facebook/mbart-large-50-many-to-many-mmt on the Helsinki-NLP/opus_books [fi-pl] dataset. It achieves the following results on the evaluation set:
- Loss: 3.4320
- Data Size: 1.0
- Epoch Runtime: 21.8610
- Bleu: 2.7968
## Model description
More information needed
## Intended uses & limitations
More information needed
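In the absence of documented usage instructions, the model can be driven through the standard mBART-50 interface it inherits from the base model. The snippet below is only a minimal sketch: the repository id is assumed to match this card's identifier (contemmcm/85e7f5e9fd9f27aaa06ad576c111ec0e), and the `fi_FI` source / `pl_PL` target language codes are inferred from the fi-pl dataset pair rather than stated elsewhere in the card.

```python
# Minimal inference sketch. Assumptions: the repository id matches this card's
# identifier, and fi_FI -> pl_PL language codes follow from the fi-pl dataset pair.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_id = "contemmcm/85e7f5e9fd9f27aaa06ad576c111ec0e"
tokenizer = MBart50TokenizerFast.from_pretrained(model_id, src_lang="fi_FI", tgt_lang="pl_PL")
model = MBartForConditionalGeneration.from_pretrained(model_id)

text = "Hyvää huomenta."  # Finnish source sentence
inputs = tokenizer(text, return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("pl_PL"),  # force Polish as the target language
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```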
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
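The values above map directly onto the standard `Seq2SeqTrainingArguments` fields. The sketch below is a hypothetical reconstruction under the assumption that the Hugging Face `Seq2SeqTrainer` was used; the actual training script and output directory name are not part of this card.

```python
# Hypothetical reconstruction of the hyperparameters listed above; the real
# training script is not included in this card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mbart50-fi-pl",        # placeholder name, not taken from the card
    learning_rate=5e-05,
    per_device_train_batch_size=8,     # 4 GPUs -> total train batch size 32
    per_device_eval_batch_size=8,      # 4 GPUs -> total eval batch size 32
    seed=42,
    optim="adamw_torch",               # betas=(0.9, 0.999) and epsilon=1e-08 are the defaults
    lr_scheduler_type="constant",
    num_train_epochs=50,
    predict_with_generate=True,        # generated outputs are needed to score BLEU at evaluation time
)
```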
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Bleu |
|---|---|---|---|---|---|---|
| No log | 0 | 0 | 8.8652 | 0 | 2.1078 | 0.3527 |
| No log | 1 | 70 | 5.6598 | 0.0078 | 2.6090 | 0.4955 |
| No log | 2 | 140 | 4.2025 | 0.0156 | 4.3660 | 1.3683 |
| No log | 3 | 210 | 3.4679 | 0.0312 | 6.4246 | 1.5424 |
| No log | 4 | 280 | 3.3166 | 0.0625 | 8.0515 | 2.3667 |
| No log | 5 | 350 | 3.1599 | 0.125 | 9.2250 | 2.5596 |
| No log | 6 | 420 | 3.0531 | 0.25 | 10.5494 | 3.4340 |
| 0.4761 | 7 | 490 | 2.9127 | 0.5 | 13.1420 | 2.3470 |
| 2.4158 | 8 | 560 | 2.8439 | 1.0 | 22.0134 | 2.5924 |
| 1.9131 | 9 | 630 | 2.9053 | 1.0 | 22.5572 | 2.5951 |
| 1.3774 | 10 | 700 | 3.0882 | 1.0 | 20.3426 | 2.8000 |
| 0.9683 | 11 | 770 | 3.2552 | 1.0 | 21.1639 | 2.5319 |
| 0.8228 | 12 | 840 | 3.4320 | 1.0 | 21.8610 | 2.7968 |
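BLEU scores like those above are commonly produced by running sacreBLEU over the generated validation translations; the exact metric implementation used for this card is not documented, so the snippet below is only an illustrative reproduction path using the `evaluate` library.

```python
# Illustrative corpus-BLEU computation with the `evaluate` library; whether this
# exact setup produced the scores above is an assumption, not stated in the card.
import evaluate

sacrebleu = evaluate.load("sacrebleu")
predictions = ["Dzień dobry."]              # decoded model outputs (Polish)
references = [["Dzień dobry, świecie."]]    # one list of reference translations per prediction
result = sacrebleu.compute(predictions=predictions, references=references)
print(round(result["score"], 4))            # corpus-level BLEU score
```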
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1