whisper_input_decoder_equal_labels_no_force__0010

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results on the training and evaluation sets:

  • Train Loss: 0.0004
  • Train Accuracy: 0.0362
  • Train Wermet: 16.8461
  • Validation Loss: 0.0004
  • Validation Accuracy: 0.0266
  • Validation Wermet: 34.1220
  • Epoch: 9

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
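The optimizer above is AdamW-style Adam with decoupled weight decay. As an illustration only (not the Keras `AdamWeightDecay` class used in training), a single parameter update with the card's hyperparameters can be sketched in plain Python:

```python
def adamw_step(param, grad, m, v, t,
               lr=1e-05, beta_1=0.9, beta_2=0.999,
               eps=1e-07, weight_decay_rate=0.01):
    """One AdamW update for a scalar parameter.

    m, v are the running first/second moment estimates; t is the
    1-based step count used for bias correction. Hyperparameter
    defaults match the training_hyperparameters section above.
    """
    m = beta_1 * m + (1 - beta_1) * grad
    v = beta_2 * v + (1 - beta_2) * grad * grad
    m_hat = m / (1 - beta_1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta_2 ** t)   # bias-corrected second moment
    # Decoupled weight decay: shrink the parameter directly,
    # rather than adding the decay term to the gradient.
    param = param - lr * (m_hat / (v_hat ** 0.5 + eps)
                          + weight_decay_rate * param)
    return param, m, v
```

For example, a positive gradient on a positive parameter moves it slightly toward zero, with the decay term contributing even when the gradient is zero.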

Training results

Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch
0.7688     | 0.0332         | 32.0424      | 0.0164          | 0.0265              | 57.8712           | 0
0.0114     | 0.0362         | 31.1067      | 0.0062          | 0.0266              | 54.8371           | 1
0.0048     | 0.0362         | 27.0610      | 0.0030          | 0.0266              | 50.1428           | 2
0.0027     | 0.0362         | 24.9672      | 0.0018          | 0.0266              | 48.0741           | 3
0.0018     | 0.0362         | 23.1500      | 0.0013          | 0.0266              | 44.9304           | 4
0.0013     | 0.0362         | 21.5445      | 0.0010          | 0.0266              | 42.3508           | 5
0.0009     | 0.0362         | 20.2775      | 0.0008          | 0.0266              | 40.4061           | 6
0.0007     | 0.0362         | 19.5082      | 0.0006          | 0.0266              | 38.2329           | 7
0.0005     | 0.0362         | 17.9967      | 0.0005          | 0.0266              | 35.9761           | 8
0.0004     | 0.0362         | 16.8461      | 0.0004          | 0.0266              | 34.1220           | 9
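The "Wermet" columns appear to track a word-error-rate-style metric (the card does not define it, so this is an assumption). A standard word error rate is the word-level Levenshtein distance between reference and hypothesis, normalized by reference length; a minimal sketch:

```python
def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / reference length.

    This is the generic WER definition, shown for illustration;
    it is not necessarily how the "Wermet" values above were computed.
    """
    r, h = reference.split(), hypothesis.split()
    # dp[i][j] = minimum edits turning r[:i] into h[:j]
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i  # deletions
    for j in range(len(h) + 1):
        dp[0][j] = j  # insertions
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1  # substitution cost
            dp[i][j] = min(dp[i - 1][j] + 1,       # deletion
                           dp[i][j - 1] + 1,       # insertion
                           dp[i - 1][j - 1] + cost)
    return dp[len(r)][len(h)] / max(len(r), 1)
```

For example, `wer("the cat sat", "the cat sat")` is 0.0, while one substituted word in a three-word reference yields 1/3.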

Framework versions

  • Transformers 4.33.0.dev0
  • TensorFlow 2.13.0
  • Tokenizers 0.13.3