Whisper Base Bn (10k steps) - by BanglaBridge
This model is a fine-tuned version of openai/whisper-base on the Common Voice 17.0 dataset.
It was produced by merging this fine-tuned PEFT LoRA adapter into the base model: Da4ThEdge/base-bn-lora-adapter-cp10k
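Merging a LoRA adapter folds its low-rank update into the base weights, so the merged checkpoint needs no PEFT at inference time. In practice this is done with peft's `merge_and_unload()`; the arithmetic it performs per weight matrix is `W_merged = W + (alpha / r) * (B @ A)`. A minimal pure-Python sketch of that arithmetic (the matrix names and sizes here are illustrative, not taken from this checkpoint):

```python
# LoRA merge arithmetic: W_merged = W + (alpha / r) * (B @ A)
# W: d_out x d_in base weight; B: d_out x r; A: r x d_in low-rank factors.
# This is an illustrative sketch; real merging uses peft's merge_and_unload().

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def merge_lora(W, A, B, alpha, r):
    """Fold the scaled low-rank update B @ A into the base weight W."""
    delta = matmul(B, A)                      # d_out x d_in update
    s = alpha / r                             # LoRA scaling factor
    return [[w + s * d for w, d in zip(w_row, d_row)]
            for w_row, d_row in zip(W, delta)]

# Tiny example: rank-1 update on a 2x2 zero weight, alpha=2, r=1.
W = [[0.0, 0.0], [0.0, 0.0]]
B = [[1.0], [0.0]]                            # d_out x r
A = [[1.0, 2.0]]                              # r x d_in
print(merge_lora(W, A, B, alpha=2, r=1))      # → [[2.0, 4.0], [0.0, 0.0]]
```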
After 10k steps it achieves the following results on the test set:
- WER: 46.25395
- Normalized WER: 23.31617
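WER (word error rate) is the word-level edit distance between the reference transcript and the hypothesis, divided by the number of reference words; the normalized figure is computed after text normalization (lowercasing, punctuation stripping, etc.), which is why it is much lower here. A minimal sketch of the metric (the actual evaluation likely uses a library implementation such as `evaluate`/`jiwer` and Whisper's text normalizer, which handle more cases than this):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Rolling one-row dynamic program for edit distance over words.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(d[j] + 1,                       # deletion
                       d[j - 1] + 1,                   # insertion
                       prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return d[-1] / len(ref)

# One substitution out of three reference words -> WER of 1/3.
print(wer("the cat sat", "the dog sat"))  # → 0.3333333333333333
```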
For more details on the fine-tuning, refer to the fully trained 20k-step adapter repository: banglabridge/base-bn-lora-adapter
Framework versions
- Transformers 4.40.2
- PyTorch 2.6.0+cu124
- Tokenizers 0.19.1
- PEFT 0.10.0