# a6ace61febf24ad62e27a3dd33dbfa4a
This model is a fine-tuned version of google/mt5-large on the Helsinki-NLP/opus_books [en-fi] dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 2.1269
- Data Size: 1.0 (fraction of the training set)
- Epoch Runtime: 44.3674 seconds
- Bleu: 5.0572
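The card does not include a usage example; below is a minimal sketch, assuming the checkpoint is hosted under the repository id shown on the page and that plain (unprefixed) English input is expected, since no task prefix is documented.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Repository id as shown on the card; swap in a local path if needed.
model_id = "contemmcm/a6ace61febf24ad62e27a3dd33dbfa4a"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# No task prefix is documented for this fine-tune, so plain English
# input is assumed here.
inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```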
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
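Although this section is otherwise undocumented, the summary names the dataset. A minimal sketch of loading it with 🤗 Datasets follows; note that opus_books ships only a `train` split, so the validation holdout shown here is a hypothetical choice, not the split actually used for this model.

```python
from datasets import load_dataset

# Dataset named in the card summary. opus_books ships only a "train"
# split, so the 10% validation holdout below is a hypothetical choice.
books = load_dataset("Helsinki-NLP/opus_books", "en-fi")
splits = books["train"].train_test_split(test_size=0.1, seed=42)
print(splits["train"][0])  # {'id': ..., 'translation': {'en': ..., 'fi': ...}}
```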
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a code reconstruction follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: constant
- num_epochs: 50
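A hedged reconstruction of these settings as `Seq2SeqTrainingArguments` (the original training script is not part of the card; `output_dir` and `predict_with_generate` are assumptions):

```python
from transformers import Seq2SeqTrainingArguments

# Reconstruction of the hyperparameters listed above; the actual training
# script is not included in this card. Per-device batch size 8 across
# 4 GPUs gives the total batch size of 32 reported above.
training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-large-opus-books-en-fi",  # hypothetical name
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=50,
    predict_with_generate=True,  # needed to compute BLEU during evaluation
)
```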
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size (fraction) | Epoch Runtime (s) | Bleu |
|---|---|---|---|---|---|---|
| No log | 0 | 0 | 20.8328 | 0 | 3.6571 | 0.0136 |
| No log | 1 | 91 | 20.5809 | 0.0078 | 4.7164 | 0.0176 |
| No log | 2 | 182 | 19.0958 | 0.0156 | 5.7024 | 0.0111 |
| No log | 3 | 273 | 18.4367 | 0.0312 | 7.8785 | 0.0133 |
| No log | 4 | 364 | 17.6781 | 0.0625 | 10.5731 | 0.0110 |
| No log | 5 | 455 | 14.6516 | 0.125 | 13.8143 | 0.0139 |
| No log | 6 | 546 | 6.4806 | 0.25 | 18.6018 | 0.0296 |
| 1.5695 | 7 | 637 | 3.2936 | 0.5 | 28.5055 | 0.2850 |
| 3.8773 | 8 | 728 | 2.4982 | 1.0 | 45.8832 | 1.3749 |
| 3.0837 | 9 | 819 | 2.3093 | 1.0 | 43.1954 | 3.6330 |
| 2.789 | 10 | 910 | 2.2261 | 1.0 | 43.5214 | 4.0019 |
| 2.5661 | 11 | 1001 | 2.1749 | 1.0 | 46.5535 | 4.2296 |
| 2.4472 | 12 | 1092 | 2.1430 | 1.0 | 44.4746 | 4.3623 |
| 2.3362 | 13 | 1183 | 2.1247 | 1.0 | 43.9003 | 4.4396 |
| 2.2216 | 14 | 1274 | 2.1092 | 1.0 | 45.1125 | 4.5845 |
| 2.1228 | 15 | 1365 | 2.1020 | 1.0 | 44.5265 | 4.5838 |
| 2.0104 | 16 | 1456 | 2.0972 | 1.0 | 44.4803 | 4.6869 |
| 1.9408 | 17 | 1547 | 2.0954 | 1.0 | 45.7054 | 4.7448 |
| 1.8473 | 18 | 1638 | 2.0965 | 1.0 | 44.7305 | 4.8875 |
| 1.7498 | 19 | 1729 | 2.1176 | 1.0 | 43.8897 | 4.9420 |
| 1.6988 | 20 | 1820 | 2.1216 | 1.0 | 46.3713 | 5.0510 |
| 1.6431 | 21 | 1911 | 2.1269 | 1.0 | 44.3674 | 5.0572 |
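The Bleu column tracks generation quality on the evaluation set. The exact scoring setup is not documented in the card; a typical way to compute it is via the 🤗 Evaluate `sacrebleu` metric, sketched below with hypothetical strings.

```python
import evaluate

sacrebleu = evaluate.load("sacrebleu")

# Hypothetical strings; in practice predictions come from model.generate
# over the evaluation set, decoded with the tokenizer.
predictions = ["Kissa istui matolla."]
references = [["Kissa istui maton päällä."]]
result = sacrebleu.compute(predictions=predictions, references=references)
print(result["score"])
```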
### Framework versions
- Transformers 4.57.0
- PyTorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1