52Hz Small Fr - IMT Atlantique X 52 Hertz

This model is a fine-tuned version of openai/whisper-small on the Transcriptions IMTx52Hz v2 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2991
  • WER: 10.0993

Model description

More information needed

Intended uses & limitations

More information needed
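Although the card gives no official usage snippet, a checkpoint fine-tuned from openai/whisper-small can typically be loaded through the Transformers automatic-speech-recognition pipeline. The sketch below is an assumption about usage, not code from the model authors; the `transcribe` helper name is hypothetical.

```python
# Hypothetical usage sketch, assuming the standard Transformers ASR pipeline
# works for this Whisper fine-tune. Requires `pip install transformers torch`.
MODEL_ID = "hip94/52Hz-small-fr-v2"

def transcribe(audio_path: str) -> str:
    """Transcribe a French audio file with the fine-tuned checkpoint."""
    from transformers import pipeline  # imported lazily to keep the sketch light

    asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
    return asr(audio_path)["text"]
```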

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 15
  • mixed_precision_training: Native AMP
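A few quantities follow directly from the hyperparameters above: the effective batch size is train_batch_size × gradient_accumulation_steps, and with 315 total optimizer steps (from the results table) a warmup ratio of 0.1 gives about 31 warmup steps. The sketch below assumes the common "linear warmup then cosine decay to zero" schedule shape; it is an illustration, not the trainer's actual implementation.

```python
import math

# Values taken from the hyperparameter list above.
train_batch_size = 4
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 16

total_steps = 315       # 21 optimizer steps/epoch x 15 epochs (results table)
warmup_ratio = 0.1
warmup_steps = int(total_steps * warmup_ratio)  # 31 (rounding is an assumption)

def cosine_lr(step: int, base_lr: float = 2e-5) -> float:
    """Linear warmup to base_lr, then cosine decay to 0 over the remaining steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```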

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|---------------|-------|------|-----------------|---------|
| 1.6508        | 1.0   | 21   | 0.7795          | 28.8079 |
| 0.6567        | 2.0   | 42   | 0.3693          | 18.0464 |
| 0.3608        | 3.0   | 63   | 0.2880          | 13.5762 |
| 0.2693        | 4.0   | 84   | 0.2782          | 13.2450 |
| 0.1341        | 5.0   | 105  | 0.2645          | 12.4172 |
| 0.1078        | 6.0   | 126  | 0.2908          | 13.4106 |
| 0.0887        | 7.0   | 147  | 0.2834          | 12.9139 |
| 0.0626        | 8.0   | 168  | 0.2862          | 10.5960 |
| 0.0461        | 9.0   | 189  | 0.2833          | 10.9272 |
| 0.0283        | 10.0  | 210  | 0.2834          | 10.4305 |
| 0.0333        | 11.0  | 231  | 0.2935          | 9.4371  |
| 0.0219        | 12.0  | 252  | 0.2925          | 10.4305 |
| 0.0265        | 13.0  | 273  | 0.3023          | 10.2649 |
| 0.0274        | 14.0  | 294  | 0.2992          | 10.0993 |
| 0.0216        | 15.0  | 315  | 0.2991          | 10.0993 |
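The WER column above is the word error rate: the word-level edit distance (substitutions, insertions, deletions) between hypothesis and reference, divided by the number of reference words, expressed as a percentage. The sketch below is a minimal reference implementation of the metric, not the evaluation code actually used for this model.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length, in %."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # i deletions to reach an empty hypothesis
    for j in range(len(hyp) + 1):
        d[0][j] = j  # j insertions to build the hypothesis from nothing
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return 100.0 * d[len(ref)][len(hyp)] / max(1, len(ref))
```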

Framework versions

  • Transformers 4.57.3
  • PyTorch 2.9.1+cu130
  • Datasets 4.4.2
  • Tokenizers 0.22.2