---
library_name: transformers
license: apache-2.0
base_model: sulaimank/whisper-small-CV-Fleurs-lg-300
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: whisper-small-lug-4src
    results: []
---

# whisper-small-lug-4src

This model is a fine-tuned version of [sulaimank/whisper-small-CV-Fleurs-lg-300](https://huggingface.co/sulaimank/whisper-small-CV-Fleurs-lg-300) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.6772
- Wer: 0.2145
- Cer: 0.1136
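The Wer and Cer values above are word and character error rates expressed as fractions (so 0.2145 is roughly a 21.5% word error rate). Trainer-based setups typically compute these with the `evaluate` or `jiwer` libraries; purely as an illustration of what the metric measures, here is a minimal plain-Python sketch:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (two-row DP)."""
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            substitute = prev[j - 1] + (ref[i - 1] != hyp[j - 1])
            curr[j] = min(prev[j] + 1, curr[j - 1] + 1, substitute)
        prev = curr
    return prev[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance over reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: the same computation at character level."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

For example, `wer("a b c", "a x c")` is 1/3 (one substitution out of three reference words).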

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
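The `linear` scheduler with 500 warmup steps ramps the learning rate from zero up to 1e-05 and then decays it linearly back toward zero over the remaining optimizer steps (about 26,000 in total, read off the training log below). A sketch of that shape, mirroring `transformers`' linear-warmup schedule; the total-step count here is an approximation, not a logged value:

```python
def linear_warmup_lr(step, base_lr=1e-5, warmup_steps=500, total_steps=26000):
    """Linear warmup from 0 to base_lr, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Effective (total) train batch size: per-device batch * accumulation steps.
effective_batch = 8 * 2  # train_batch_size * gradient_accumulation_steps = 16
```

This is also where `total_train_batch_size: 16` comes from: 8 samples per step, with gradients accumulated over 2 steps before each optimizer update.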

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 0.6768        | 1.1508  | 1000  | 0.2882          | 0.4067 | 0.3768 |
| 0.3794        | 2.3017  | 2000  | 0.2152          | 0.3882 | 0.3322 |
| 0.3018        | 3.4525  | 3000  | 0.4216          | 0.2716 | 0.1417 |
| 0.2216        | 4.6033  | 4000  | 0.4807          | 0.2945 | 0.1581 |
| 0.1689        | 5.7542  | 5000  | 0.5392          | 0.2358 | 0.1309 |
| 0.0748        | 6.9050  | 6000  | 0.5808          | 0.2331 | 0.1226 |
| 0.0453        | 8.0553  | 7000  | 0.6295          | 0.2382 | 0.1290 |
| 0.0276        | 9.2061  | 8000  | 0.6217          | 0.2413 | 0.1291 |
| 0.0166        | 10.3569 | 9000  | 0.6351          | 0.2300 | 0.1183 |
| 0.0115        | 11.5078 | 10000 | 0.6329          | 0.2366 | 0.1237 |
| 0.0096        | 12.6586 | 11000 | 0.6458          | 0.2428 | 0.1259 |
| 0.0059        | 13.8094 | 12000 | 0.6414          | 0.2323 | 0.1182 |
| 0.0045        | 14.9603 | 13000 | 0.6418          | 0.2638 | 0.1463 |
| 0.0043        | 16.1105 | 14000 | 0.6474          | 0.2362 | 0.1269 |
| 0.0026        | 17.2614 | 15000 | 0.6542          | 0.2284 | 0.1259 |
| 0.0026        | 18.4122 | 16000 | 0.6498          | 0.2253 | 0.1164 |
| 0.0013        | 19.5630 | 17000 | 0.6504          | 0.2758 | 0.1572 |
| 0.0015        | 20.7139 | 18000 | 0.6608          | 0.2187 | 0.1161 |
| 0.0015        | 21.8647 | 19000 | 0.6602          | 0.2234 | 0.1173 |
| 0.0015        | 23.0150 | 20000 | 0.6661          | 0.2455 | 0.1296 |
| 0.0008        | 24.1658 | 21000 | 0.6625          | 0.2265 | 0.1214 |
| 0.0003        | 25.3166 | 22000 | 0.6698          | 0.2498 | 0.1384 |
| 0.0003        | 26.4675 | 23000 | 0.6725          | 0.2347 | 0.1234 |
| 0.0001        | 27.6183 | 24000 | 0.6738          | 0.2187 | 0.1194 |
| 0.0001        | 28.7691 | 25000 | 0.6766          | 0.2133 | 0.1144 |
| 0.0001        | 29.9200 | 26000 | 0.6772          | 0.2145 | 0.1136 |
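Note that the final checkpoint is not the best one by WER: step 25000 logs 0.2133 versus 0.2145 at step 26000, while validation loss rises steadily as training loss approaches zero (a typical overfitting pattern). If intermediate checkpoints were saved, selecting the best one from logged metrics is simple bookkeeping; a sketch over a subset of the rows above (the tuples are hand-copied here, not a Trainer API):

```python
# (step, val_loss, wer, cer) for a few checkpoints from the results table.
logs = [
    (1000,  0.2882, 0.4067, 0.3768),
    (18000, 0.6608, 0.2187, 0.1161),
    (25000, 0.6766, 0.2133, 0.1144),
    (26000, 0.6772, 0.2145, 0.1136),
]

# Pick the checkpoint with the lowest word error rate.
best = min(logs, key=lambda row: row[2])
print(best)  # step 25000 has the lowest WER in this subset
```

In the `transformers` Trainer this selection is what `load_best_model_at_end` together with `metric_for_best_model="wer"` and `greater_is_better=False` automates.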

### Framework versions

- Transformers 5.3.0
- PyTorch 2.10.0+cu130
- Datasets 4.0.0
- Tokenizers 0.22.2