
whisper-audeering-classifier

This model is a fine-tuned version of openai/whisper-large-v2 on the JASMIN-CGN dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9865
  • Wer: 35.3575

Model description

More information needed

Intended uses & limitations

More information needed
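
The card does not document usage yet. Since the framework versions below include PEFT, the repository presumably ships a parameter-efficient adapter for openai/whisper-large-v2 rather than full model weights. A minimal, untested inference sketch follows; the audio file path and decoding settings are placeholders, not taken from this card:

```python
# Hedged sketch: load the PEFT adapter on top of the base Whisper checkpoint
# and transcribe one audio file. "sample.wav" is a placeholder path.
import librosa
import torch
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

BASE = "openai/whisper-large-v2"
ADAPTER = "greenw0lf/whisper-audeering-classifier"

processor = WhisperProcessor.from_pretrained(BASE)
model = WhisperForConditionalGeneration.from_pretrained(BASE)
model = PeftModel.from_pretrained(model, ADAPTER)  # attach the adapter weights
model.eval()

# Whisper expects 16 kHz mono audio.
audio, _ = librosa.load("sample.wav", sr=16000)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    ids = model.generate(input_features=inputs.input_features)
print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```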

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 48
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 18
  • num_epochs: 3.0
  • mixed_precision_training: Native AMP
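
For reference, here is a hedged sketch of how the values above might map onto transformers.Seq2SeqTrainingArguments. The output directory, the per-device interpretation of the batch sizes, and any field not listed above are assumptions:

```python
# Hedged sketch only: reconstructs the listed hyperparameters as a Trainer
# configuration. Fields marked "assumption" are not stated in the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-audeering-classifier",  # assumption
    learning_rate=1e-5,
    per_device_train_batch_size=48,  # card lists train_batch_size: 48
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",             # AdamW, betas=(0.9, 0.999), eps=1e-8 (defaults)
    lr_scheduler_type="linear",
    warmup_steps=18,
    num_train_epochs=3.0,
    fp16=True,                       # "Native AMP" mixed precision
)
```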

Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 1.07          | 0.1613 | 10   | 1.2246          | 37.8233 |
| 1.0062        | 0.3226 | 20   | 1.2108          | 37.7193 |
| 1.023         | 0.4839 | 30   | 1.1929          | 37.3268 |
| 1.0902        | 0.6452 | 40   | 1.1739          | 37.3872 |
| 1.0716        | 0.8065 | 50   | 1.1540          | 37.1255 |
| 0.9588        | 0.9677 | 60   | 1.1339          | 36.7732 |
| 1.0063        | 1.1290 | 70   | 1.1152          | 36.3673 |
| 0.9921        | 1.2903 | 80   | 1.0970          | 36.0553 |
| 0.9091        | 1.4516 | 90   | 1.0795          | 35.9211 |
| 0.9646        | 1.6129 | 100  | 1.0622          | 35.7601 |
| 0.8876        | 1.7742 | 110  | 1.0463          | 35.6594 |
| 0.8532        | 1.9355 | 120  | 1.0320          | 35.4447 |
| 0.8725        | 2.0968 | 130  | 1.0196          | 35.2870 |
| 0.8831        | 2.2581 | 140  | 1.0095          | 35.3675 |
| 0.8586        | 2.4194 | 150  | 1.0007          | 35.3776 |
| 0.8097        | 2.5806 | 160  | 0.9938          | 35.3004 |
| 0.8906        | 2.7419 | 170  | 0.9890          | 35.6561 |
| 0.8542        | 2.9032 | 180  | 0.9865          | 35.3575 |
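
WER values like those in the table are conventionally computed as a percentage over normalized transcripts, typically with the evaluate library (backed by jiwer). A small illustrative sketch; the strings below are placeholders, not data from this card:

```python
# Hedged sketch: compute word error rate the way Trainer eval loops commonly do.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["de kat zat op de mat"]           # model transcripts (placeholder)
references = ["de kat zat op de mat vandaag"]    # reference transcripts (placeholder)

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # reported as a percentage, as in the table above
```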

Framework versions

  • PEFT 0.16.0
  • Transformers 4.57.3
  • PyTorch 2.7.1+cu126
  • Datasets 3.6.0
  • Tokenizers 0.22.2