Surt Small v1 — Training Checkpoints

This repo contains the best training checkpoint for the Surt Small v1 Gurbani ASR model.

Current Checkpoint

| Parameter | Value       |
|-----------|-------------|
| Step      | 3400 / 5000 |
| WER       | 14.88%      |
| CER       | 4.30%       |
| Epoch     | ~3.4        |
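The WER and CER figures above are edit-distance metrics: word-level and character-level Levenshtein distance divided by reference length. A minimal sketch of how they are computed (the `wer` and `cer` helpers below are for illustration and are not part of this repo's evaluation code):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences."""
    # prev[j] holds the distance between ref[:i-1] and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(
                prev[j] + 1,              # deletion
                curr[j - 1] + 1,          # insertion
                prev[j - 1] + (r != h),   # substitution
            ))
        prev = curr
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: the same computation at character level."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)

# One dropped word out of six reference words -> WER of 1/6
print(wer("the cat sat on the mat", "the cat sat on mat"))
```

In practice, evaluation frameworks normalize text (casing, punctuation) before scoring, so reported numbers depend on the normalization used.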

This is the best-performing checkpoint from the full training run. The final model (step 5000, WER 15.39%) is available at surindersinghssj/surt-small-v1.

Usage

from transformers import WhisperProcessor, WhisperForConditionalGeneration

# The processor (tokenizer + feature extractor) is the stock Whisper Small one;
# only the model weights are fine-tuned in this checkpoint.
processor = WhisperProcessor.from_pretrained("openai/whisper-small")
model = WhisperForConditionalGeneration.from_pretrained("surindersinghssj/surt-small-v1-training")

For full details on training, dataset, and evaluation, see the main model card.

