---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-tiny
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: outout_model_EUBookShop_data_only
    results: []
---

# outout_model_EUBookShop_data_only

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny), trained only on the EUBookShop dataset for 20000 steps. It achieves the following results on the evaluation set:

- Loss: 0.1714
- Wer: 12.0282
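The checkpoint can be used for speech-to-text inference with the `transformers` ASR pipeline. A minimal sketch; the Hub repo id below is an assumption based on this card's title, so substitute the real id or a local checkpoint path:

```python
def transcribe(audio_path: str) -> str:
    """Transcribe an audio file with the fine-tuned Whisper checkpoint.

    The repo id below is hypothetical (inferred from this card's title);
    replace it with the actual Hub id or a local checkpoint directory.
    """
    # Imported lazily so defining this function triggers no model download.
    from transformers import pipeline

    asr = pipeline(
        "automatic-speech-recognition",
        model="Eimhin03/outout_model_EUBookShop_data_only",  # hypothetical repo id
    )
    return asr(audio_path)["text"]
```

The first call downloads and caches the model; later calls reuse the cache.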

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- training_steps: 20000
- mixed_precision_training: Native AMP
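The linear scheduler with 200 warmup steps ramps the learning rate from 0 to 0.0001 over the first 200 steps, then decays it linearly back to 0 by step 20000. A pure-Python sketch of the values this schedule produces (mirroring, as an assumption, the semantics of `transformers`' linear schedule with warmup):

```python
def linear_lr(step: int, base_lr: float = 1e-4,
              warmup_steps: int = 200, total_steps: int = 20000) -> float:
    """Learning rate at a given optimizer step under linear warmup + decay."""
    if step < warmup_steps:
        # Warmup phase: ramp linearly from 0 up to base_lr.
        return base_lr * step / warmup_steps
    # Decay phase: fall linearly from base_lr to 0 at the final step.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, the rate is half of `base_lr` at step 100 and reaches 0 exactly at step 20000.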

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer     |
|:-------------:|:------:|:-----:|:---------------:|:-------:|
| 0.3552        | 0.1652 | 5000  | 0.4031          | 23.3443 |
| 0.3810        | 0.3303 | 10000 | 0.2821          | 18.6306 |
| 0.2647        | 0.4955 | 15000 | 0.2088          | 13.6450 |
| 0.1948        | 0.6607 | 20000 | 0.1714          | 12.0282 |
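The Wer column is the word error rate in percent: the word-level edit distance (substitutions, insertions, deletions) between hypothesis and reference, divided by the number of reference words. A minimal stdlib-only sketch of the metric (not the exact evaluation code used here):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent, via word-level Levenshtein distance."""
    r, h = reference.split(), hypothesis.split()
    # One-row dynamic-programming edit distance over words.
    d = list(range(len(h) + 1))
    for i in range(1, len(r) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(h) + 1):
            cur = d[j]
            d[j] = min(d[j] + 1,            # deletion
                       d[j - 1] + 1,        # insertion
                       prev + (r[i - 1] != h[j - 1]))  # substitution/match
            prev = cur
    return 100.0 * d[len(h)] / len(r)
```

For instance, one wrong word out of three in a reference gives a WER of about 33.33.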

### Framework versions

- Transformers 5.0.1.dev0
- Pytorch 2.9.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.2