---
library_name: transformers
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-tc-bible-big-deu_eng_fra_por_spa-mul
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: 78e45edfab2e76d0d369e52b61e2100a
    results: []
---

# 78e45edfab2e76d0d369e52b61e2100a

This model is a fine-tuned version of Helsinki-NLP/opus-mt-tc-bible-big-deu_eng_fra_por_spa-mul on the Helsinki-NLP/opus_books [de-ru] dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4578
  • Data Size: 1.0
  • Epoch Runtime: 27.7194
  • Bleu: 13.1403

## Model description

This model translates German to Russian. It was produced by fine-tuning Helsinki-NLP/opus-mt-tc-bible-big-deu_eng_fra_por_spa-mul, a multilingual Marian (transformer-big) translation model, on the de-ru portion of the opus_books corpus.
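The multilingual-target ("mul") OPUS-MT base models select the output language with a `>>xxx<<` token prefixed to the source sentence. A minimal sketch of that convention, assuming this fine-tune inherits it from the base model (the helper name is illustrative, not part of the model's API):

```python
def with_target_token(text: str, target_lang: str = "rus") -> str:
    """Prefix a source sentence with a Marian target-language token.

    Assumption: like its multilingual base model, this fine-tune expects
    a ``>>xxx<<`` token selecting the target language (``rus`` = Russian).
    """
    return f">>{target_lang}<< {text}"

print(with_target_token("Guten Morgen."))  # → ">>rus<< Guten Morgen."
```

Since the fine-tuning data is de-ru only, the prefix may be redundant here, but passing it keeps the input format consistent with the base model.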

## Intended uses & limitations

More information needed

## Training and evaluation data

The model was fine-tuned and evaluated on the de-ru (German-Russian) subset of the Helsinki-NLP/opus_books dataset, a parallel corpus of book translations. The BLEU and validation-loss figures above were computed on the evaluation split of this data.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 4
  • total_train_batch_size: 32
  • total_eval_batch_size: 32
  • optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: constant
  • num_epochs: 50
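The total batch sizes follow from the per-device values and the device count. A quick sketch of that arithmetic, with the values copied from the list above:

```python
# Values from the hyperparameter list above.
train_batch_size = 8   # per device
num_devices = 4        # multi-GPU data parallelism

# With data parallelism, each optimizer step processes the per-device
# batch on every device, so the effective batch size is their product.
total_train_batch_size = train_batch_size * num_devices
print(total_train_batch_size)  # → 32, matching total_train_batch_size above
```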

### Training results

| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Bleu |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:-------------:|:-------:|
| No log        | 0     | 0    | 7.4763          | 0         | 2.5179        | 0.0572  |
| No log        | 1     | 434  | 2.5525          | 0.0078    | 3.9581        | 0.0049  |
| No log        | 2     | 868  | 1.9816          | 0.0156    | 3.2139        | 0.0009  |
| No log        | 3     | 1302 | 1.6718          | 0.0312    | 3.8519        | 0.0244  |
| No log        | 4     | 1736 | 1.3498          | 0.0625    | 4.5576        | 0.3289  |
| 0.0714        | 5     | 2170 | 1.0614          | 0.125     | 6.0921        | 3.2350  |
| 0.9337        | 6     | 2604 | 0.7786          | 0.25      | 9.1924        | 5.3485  |
| 0.6619        | 7     | 3038 | 0.5972          | 0.5       | 15.4139       | 7.9549  |
| 0.5168        | 8     | 3472 | 0.4797          | 1.0       | 28.9889       | 10.9862 |
| 0.407         | 9     | 3906 | 0.4377          | 1.0       | 28.0562       | 11.5697 |
| 0.3527        | 10    | 4340 | 0.4187          | 1.0       | 27.0718       | 12.5934 |
| 0.3028        | 11    | 4774 | 0.4128          | 1.0       | 27.8363       | 12.7454 |
| 0.2623        | 12    | 5208 | 0.4204          | 1.0       | 27.2489       | 13.2087 |
| 0.2243        | 13    | 5642 | 0.4303          | 1.0       | 27.9318       | 13.0685 |
| 0.201         | 14    | 6076 | 0.4414          | 1.0       | 28.1184       | 13.3063 |
| 0.1808        | 15    | 6510 | 0.4578          | 1.0       | 27.7194       | 13.1403 |
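Note that validation loss bottoms out at epoch 11 and BLEU peaks at epoch 14, so the final (epoch-15) checkpoint reported at the top is not the best by either metric. A small sketch of reading that off the full-data-size rows (values copied from the table above):

```python
# (epoch, validation_loss, bleu) rows copied from the table above
# (full-data-size epochs only, i.e. Data Size = 1.0).
results = [
    (8, 0.4797, 10.9862),
    (9, 0.4377, 11.5697),
    (10, 0.4187, 12.5934),
    (11, 0.4128, 12.7454),
    (12, 0.4204, 13.2087),
    (13, 0.4303, 13.0685),
    (14, 0.4414, 13.3063),
    (15, 0.4578, 13.1403),
]

best_loss = min(results, key=lambda r: r[1])  # lowest validation loss
best_bleu = max(results, key=lambda r: r[2])  # highest BLEU
print(best_loss[0], best_bleu[0])  # → 11 14
```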

### Framework versions

  • Transformers 4.57.0
  • PyTorch 2.8.0+cu128
  • Datasets 4.2.0
  • Tokenizers 0.22.1