# Whisper Base Bn (10k steps) - by BanglaBridge

This model is a fine-tuned version of openai/whisper-base on the Common Voice 17.0 dataset.

It was produced by merging the fine-tuned PEFT LoRA adapter Da4ThEdge/base-bn-lora-adapter-cp10k into the base model.

After 10k steps it achieves the following results on the test set:

- WER: 46.25395
- Normalized WER: 23.31617
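WER (word error rate) is the word-level edit distance divided by the number of reference words; the normalized variant applies a text normalizer to both strings before scoring, which is why it is so much lower here. A minimal self-contained sketch of the difference (the `normalize` below is a toy lowercase-and-strip-punctuation stand-in, not the exact normalizer behind these numbers):

```python
import re

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over word tokens.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

def normalize(text: str) -> str:
    """Toy normalizer: lowercase and strip punctuation before scoring."""
    return re.sub(r"[^\w\s]", "", text.lower())

print(wer("Hello, World!", "hello world"))                        # 1.0 (casing/punctuation count as errors)
print(wer(normalize("Hello, World!"), normalize("hello world")))  # 0.0
```

The same transcription can thus score very differently raw vs. normalized, which matches the large gap between the two numbers above.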

For more details on the fine-tuning, refer to the fully trained 20k-step adapter repository: banglabridge/base-bn-lora-adapter
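Since this is a merged (adapter-free) Whisper checkpoint, the standard Transformers ASR pipeline should be able to run it directly. A sketch, not an author-provided snippet; the audio filename is a placeholder:

```python
# Sketch: transcribing Bangla audio with the merged checkpoint.
# Assumes `transformers` (and an audio backend such as ffmpeg) is installed.
from transformers import pipeline

MODEL_ID = "Da4ThEdge/base-bn-cp10k"

def transcribe(audio_path: str) -> str:
    # The ASR pipeline handles feature extraction, decoding, and tokenization.
    asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
    return asr(audio_path)["text"]

# Example (requires a local audio file):
# print(transcribe("sample_bn.wav"))
```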

## Framework versions

- Transformers 4.40.2
- PyTorch 2.6.0+cu124
- Tokenizers 0.19.1
- PEFT 0.10.0

The merged model has 72.6M parameters, stored as F32 tensors in safetensors format.
