whisper-small-taiwanese-full

This model is a fine-tuned version of openai/whisper-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4080
  • Cer: 29.8836 (character error rate, in percent; a computation sketch follows)
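
CER here is a character-level edit distance divided by the number of reference characters, reported as a percentage. A minimal sketch of how such a score is typically computed with the evaluate library; the example strings are illustrative only, not from the card's evaluation set:

```python
import evaluate

# CER = character-level edit distance / number of reference characters.
cer_metric = evaluate.load("cer")

# Illustrative strings only; the reported 29.8836 was computed on the
# card's (unspecified) evaluation set.
predictions = ["li ho"]
references = ["li ho bo"]

# compute() returns a fraction; multiply by 100 for the percentage form.
print(100 * cer_metric.compute(predictions=predictions, references=references))
```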

Model description

More information needed

Intended uses & limitations

More information needed
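
Usage is not documented, but as a Whisper fine-tune the checkpoint should load with the standard transformers ASR pipeline. A minimal sketch, assuming the repo id alexachang/whisper-small-taiwanese-full from this card; the audio path is a placeholder:

```python
import torch
from transformers import pipeline

# Load the fine-tuned checkpoint as a standard Whisper ASR pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="alexachang/whisper-small-taiwanese-full",
    device=0 if torch.cuda.is_available() else -1,
)

# "sample.wav" is a placeholder path; the pipeline decodes and resamples
# common audio formats via ffmpeg.
result = asr("sample.wav")
print(result["text"])
```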

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 300
  • training_steps: 4500
  • mixed_precision_training: Native AMP
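
These values map directly onto transformers Seq2SeqTrainingArguments. A hedged reconstruction of the configuration: output_dir is an assumption (not stated in the card), and fp16=True stands in for "Native AMP":

```python
from transformers import Seq2SeqTrainingArguments

# Reconstruction of the hyperparameters listed above; any field not in
# that list is an assumption.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-taiwanese-full",  # assumed, not from the card
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=300,
    max_steps=4500,
    fp16=True,  # "Native AMP" mixed-precision training
)
```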

Training results

| Training Loss | Epoch  | Step | Validation Loss | Cer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.7008        | 0.5282 | 750  | 0.6961          | 43.8228 |
| 0.3366        | 1.0563 | 1500 | 0.5263          | 36.2712 |
| 0.3311        | 1.5845 | 2250 | 0.4640          | 32.0063 |
| 0.1384        | 2.1127 | 3000 | 0.4337          | 31.4976 |
| 0.1616        | 2.6408 | 3750 | 0.4143          | 30.0890 |
| 0.0742        | 3.1690 | 4500 | 0.4080          | 29.8836 |

Framework versions

  • Transformers 4.52.4
  • Pytorch 2.1.0+cu118
  • Datasets 3.6.0
  • Tokenizers 0.21.1
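
A quick runtime check that an environment matches these pins (the PyTorch build above is the CUDA 11.8 one, so torch.__version__ should read 2.1.0+cu118):

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions per this card; mismatches are usually fine for
# inference but may matter when reproducing training.
print("Transformers:", transformers.__version__)  # 4.52.4
print("PyTorch:     ", torch.__version__)         # 2.1.0+cu118
print("Datasets:    ", datasets.__version__)      # 3.6.0
print("Tokenizers:  ", tokenizers.__version__)    # 0.21.1
```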