---
library_name: transformers
language:
  - fr
license: apache-2.0
base_model: openai/whisper-small
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: 52Hz Small Fr - IMT Atlantique X 52 Hertz
    results: []
---

# 52Hz Small Fr - IMT Atlantique X 52 Hertz

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Transcriptions IMTx52Hz v3 dataset. It achieves the following results on the evaluation set:

- Loss: 0.4085
- WER: 17.6471
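As with other Whisper fine-tunes, the checkpoint can be loaded through the `transformers` automatic-speech-recognition pipeline. A minimal sketch, assuming the `transformers` and `torch` packages are installed; the repository id is taken from this page's name (adjust `MODEL_ID` if it differs), and `"audio.wav"` is a placeholder path:

```python
# Minimal inference sketch for this fine-tuned Whisper checkpoint.
# MODEL_ID is assumed from the repository name shown on this page.
from transformers import pipeline

MODEL_ID = "hip94/52Hz-small-fr-v3"


def build_transcriber():
    # The ASR pipeline wraps the Whisper processor and model;
    # chunk_length_s lets it handle audio longer than 30 seconds.
    return pipeline(
        "automatic-speech-recognition",
        model=MODEL_ID,
        chunk_length_s=30,
    )


def transcribe(path: str) -> str:
    transcriber = build_transcriber()
    # generate_kwargs forces French transcription (not translation).
    result = transcriber(
        path,
        generate_kwargs={"language": "french", "task": "transcribe"},
    )
    return result["text"]


# Example (downloads the checkpoint on first use):
# print(transcribe("audio.wav"))
```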

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
- mixed_precision_training: Native AMP
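The derived quantities follow directly from these values: each optimizer update accumulates gradients over 4 micro-batches of 4 examples, and the cosine schedule warms up over the first 10% of all optimizer steps. A small arithmetic sketch (standard library only; the 22 steps per epoch is read from the training results table):

```python
# Deriving quantities implied by the hyperparameters above.
train_batch_size = 4
gradient_accumulation_steps = 4
num_epochs = 15
steps_per_epoch = 22           # from the training results table
warmup_ratio = 0.1

# One optimizer update consumes batch_size * accumulation examples.
total_train_batch_size = train_batch_size * gradient_accumulation_steps

# Warmup covers the first 10% of all optimizer steps.
total_steps = steps_per_epoch * num_epochs
warmup_steps = int(warmup_ratio * total_steps)

print(total_train_batch_size)  # 16, matching the value reported above
print(total_steps)             # 330
print(warmup_steps)            # 33
```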

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 1.461         | 1.0   | 22   | 0.8945          | 41.0131 |
| 0.6454        | 2.0   | 44   | 0.5209          | 25.4902 |
| 0.3068        | 3.0   | 66   | 0.4752          | 21.0784 |
| 0.2361        | 4.0   | 88   | 0.4253          | 19.9346 |
| 0.1448        | 5.0   | 110  | 0.4112          | 19.7712 |
| 0.1367        | 6.0   | 132  | 0.4118          | 17.9739 |
| 0.0945        | 7.0   | 154  | 0.4220          | 19.6078 |
| 0.0575        | 8.0   | 176  | 0.4065          | 16.9935 |
| 0.0421        | 9.0   | 198  | 0.4356          | 17.6471 |
| 0.0414        | 10.0  | 220  | 0.4186          | 17.8105 |
| 0.027         | 11.0  | 242  | 0.4105          | 17.3203 |
| 0.0249        | 12.0  | 264  | 0.4037          | 16.9935 |
| 0.0314        | 13.0  | 286  | 0.4077          | 17.3203 |
| 0.0214        | 14.0  | 308  | 0.4085          | 17.6471 |
| 0.0148        | 15.0  | 330  | 0.4085          | 17.6471 |
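The WER column is the word error rate as a percentage: the word-level edit distance (substitutions, insertions, deletions) between the reference and the hypothesis, divided by the number of reference words. A self-contained sketch in pure Python (the example strings are invented for illustration, not drawn from the evaluation set):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level edit distance / reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # (len(ref)+1) x (len(hyp)+1) edit-distance table.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution (or match)
            )
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)


# Two substitutions over six reference words -> 33.33% WER.
print(round(wer("le chat dort sur le canapé", "le chat dors sur la canapé"), 2))
```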

### Framework versions

- Transformers 4.57.3
- Pytorch 2.9.1+cu130
- Datasets 4.4.2
- Tokenizers 0.22.2