irish-whisper-combined-aug

This model is a fine-tuned version of openai/whisper-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6851
  • Wer: 30.7302 (word error rate, reported as a percentage: ≈ 30.73% of reference words misrecognized)
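For readers unfamiliar with the metric, WER is the word-level Levenshtein (edit) distance divided by the number of reference words. A minimal, dependency-free sketch follows; `wer` here is an illustrative helper, not the evaluation script that produced the number above.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # prev[j] holds the edit distance between the first i-1 reference words
    # and the first j hypothesis words (standard dynamic-programming rows).
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, start=1):
            cost = 0 if r == h else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[-1] / len(ref)

# One substitution + one deletion over 4 reference words -> WER 0.5
print(wer("dia duit ar maidin", "dia dhuit maidin"))  # -> 0.5
```

A reported Wer of 30.7302 therefore corresponds to a ratio of roughly 0.307 under this definition.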

Model description

More information needed

Intended uses & limitations

More information needed
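While the card does not yet document usage, a hedged sketch of transcribing Irish audio with the transformers ASR pipeline is shown below. The audio filename is a placeholder, and the helper name `transcribe` is illustrative, not part of this repository.

```python
MODEL_ID = "Eimhin03/irish-whisper-combined-aug"

def transcribe(audio_path: str) -> str:
    # Lazy import so the helper can be defined without transformers installed.
    from transformers import pipeline
    asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
    return asr(audio_path)["text"]

if __name__ == "__main__":
    # "sample.wav" is a placeholder path to a 16 kHz mono recording.
    print(transcribe("sample.wav"))
```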

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • training_steps: 20000
  • mixed_precision_training: Native AMP
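For anyone reproducing the run, the list above plausibly maps onto transformers' Seq2SeqTrainingArguments. This is a sketch under that assumption (the card does not state which trainer was used), and `output_dir` is a placeholder.

```python
def make_training_args(output_dir: str = "irish-whisper-combined-aug"):
    # Lazy import: keeps this sketch importable without transformers installed.
    from transformers import Seq2SeqTrainingArguments
    return Seq2SeqTrainingArguments(
        output_dir=output_dir,
        learning_rate=5e-5,
        per_device_train_batch_size=4,
        per_device_eval_batch_size=8,
        seed=42,
        gradient_accumulation_steps=4,  # effective train batch size: 4 * 4 = 16
        lr_scheduler_type="linear",
        warmup_steps=1000,
        max_steps=20000,
        fp16=True,                      # "Native AMP" mixed precision
    )
```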

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer     |
|:-------------:|:-------:|:-----:|:---------------:|:-------:|
| 0.4196        | 7.0672  | 2000  | 0.7531          | 46.2383 |
| 0.1137        | 14.1344 | 4000  | 0.7528          | 38.8904 |
| 0.0563        | 21.2016 | 6000  | 0.7506          | 38.1335 |
| 0.0474        | 28.2688 | 8000  | 0.7406          | 33.9795 |
| 0.0451        | 35.3360 | 10000 | 0.7460          | 35.1242 |
| 0.0228        | 42.4032 | 12000 | 0.7321          | 33.2041 |
| 0.0265        | 49.4704 | 14000 | 0.7366          | 33.6287 |
| 0.0343        | 56.5376 | 16000 | 0.7105          | 31.3579 |
| 0.0164        | 63.6048 | 18000 | 0.6946          | 30.7763 |
| 0.0109        | 70.6720 | 20000 | 0.6851          | 30.7302 |
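A quick sanity check one can run on the table: the first row (2000 optimizer steps ≈ 7.0672 epochs) combined with the effective batch size of 16 gives a rough estimate of the training-set size. This is an inference from the numbers above, not a figure stated in the card.

```python
steps, epochs, effective_batch = 2000, 7.0672, 16
steps_per_epoch = steps / epochs                        # ~283 optimizer steps per epoch
approx_train_examples = steps_per_epoch * effective_batch
print(round(steps_per_epoch), round(approx_train_examples))  # -> 283 4528
```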

Framework versions

  • Transformers 5.3.0.dev0
  • Pytorch 2.9.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.22.2
Model size: 72.6M parameters (F32 safetensors)