How to use Sagicc/whisper-small-sr-fleurs with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("automatic-speech-recognition", model="Sagicc/whisper-small-sr-fleurs")
```

Or load the processor and model directly:

```python
from transformers import AutoProcessor, AutoModelForSpeechSeq2Seq

processor = AutoProcessor.from_pretrained("Sagicc/whisper-small-sr-fleurs")
model = AutoModelForSpeechSeq2Seq.from_pretrained("Sagicc/whisper-small-sr-fleurs")
```

This model is a fine-tuned version of openai/whisper-small on the Google FLEURS dataset. It achieves the following results on the evaluation set (final row of the training results table below):

- Loss: 0.4134
- Wer Ortho: 28.9292
- Wer: 25.6021
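Whisper checkpoints expect 16 kHz mono input, so audio recorded at another rate should be resampled before it is passed to the pipeline. A self-contained sketch using linear interpolation (NumPy only; this is an illustrative approximation, not the card's own preprocessing — for production use a proper resampler such as torchaudio or librosa):

```python
import numpy as np

def resample_linear(audio: np.ndarray, orig_sr: int, target_sr: int = 16000) -> np.ndarray:
    """Resample mono audio to target_sr via linear interpolation (rough sketch)."""
    n_out = int(round(len(audio) * target_sr / orig_sr))
    x_old = np.linspace(0.0, 1.0, num=len(audio), endpoint=False)
    x_new = np.linspace(0.0, 1.0, num=n_out, endpoint=False)
    return np.interp(x_new, x_old, audio).astype(np.float32)

# One second of a 440 Hz tone at 44.1 kHz, downsampled to Whisper's 16 kHz input rate
sr = 44100
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr).astype(np.float32)
audio_16k = resample_linear(tone, sr)
print(audio_16k.shape)  # → (16000,)
```

The resampled array can then be fed to the pipeline as `pipe({"raw": audio_16k, "sampling_rate": 16000})`.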
Model description: More information needed.

Intended uses & limitations: More information needed.

Training and evaluation data: More information needed.
The following results were logged during training:
| Training Loss | Epoch | Step | Validation Loss | Wer Ortho (%) | Wer (%) |
|---|---|---|---|---|---|
| 0.0649 | 2.49 | 500 | 0.3685 | 30.6352 | 27.1489 |
| 0.0181 | 4.98 | 1000 | 0.4134 | 28.9292 | 25.6021 |
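The Wer columns report word error rate: word-level edit distance between the model transcript and the reference, divided by the number of reference words (Wer Ortho is computed on the raw orthographic text, Wer after normalization). A minimal pure-Python sketch of the metric; the card itself presumably used a standard implementation such as `evaluate`/`jiwer`:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("dobar dan svima", "dobar dan"))  # 1 deletion / 3 words → 0.3333333333333333
```

Multiply by 100 to match the percentage figures in the table above.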
Base model: openai/whisper-small