# wav2vec2-urdu-multitask-teacher
This model is a fine-tuned version of abidanoaman/wav2vec2-urdu-finetuned-ASR on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.8960
- WER: 0.4799
- Emotion F1: 0.4457
- Gender Accuracy: 0.976
- Combined Score: 0.6345
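The WER above is the word error rate: the word-level edit distance between the reference and hypothesis transcripts, divided by the number of reference words (so 0.4799 means roughly 48 word errors per 100 reference words). A minimal pure-Python sketch of the metric, not the exact evaluation code used here:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat", "the cat sat"))  # 0.0
print(wer("the cat sat", "the bat sat"))  # one substitution in 3 words: 1/3
```

Libraries such as `jiwer` or the `evaluate` package are typically used for this in practice; the sketch only illustrates what the number means.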
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 40
- mixed_precision_training: Native AMP
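The effective batch size and the learning-rate schedule follow directly from these settings. A small sketch of both (the total step count of 2900 is taken from the final row of the results table; the scheduler mirrors a linear warmup/decay schedule, which is what `lr_scheduler_type: linear` denotes):

```python
# Effective batch size: per-device batch * gradient accumulation steps.
train_batch_size = 8
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32

def linear_schedule_lr(step: int, base_lr: float = 5e-5,
                       warmup_steps: int = 100, total_steps: int = 2900) -> float:
    """Linear warmup to base_lr over warmup_steps, then linear decay to 0.
    total_steps is inferred from the results table, not stated explicitly."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_schedule_lr(50))    # mid-warmup: 2.5e-05
print(linear_schedule_lr(100))   # peak: 5e-05
print(linear_schedule_lr(1500))  # halfway through decay: 2.5e-05
```

In the Transformers `Trainer` this schedule is produced by `get_linear_schedule_with_warmup`; the function above only reproduces its shape.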
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | Emotion F1 | Gender Accuracy | Combined Score |
|---|---|---|---|---|---|---|---|
| 3.0143 | 0.6849 | 50 | 2.5861 | 0.6759 | 0.0579 | 0.0 | 0.1470 |
| 2.8679 | 1.3699 | 100 | 2.5833 | 0.6569 | 0.0740 | 0.0 | 0.1594 |
| 2.8197 | 2.0548 | 150 | 2.5138 | 0.6460 | 0.1111 | 0.132 | 0.2145 |
| 2.6281 | 2.7397 | 200 | 2.3898 | 0.6089 | 0.1193 | 0.548 | 0.3566 |
| 2.51 | 3.4247 | 250 | 2.2804 | 0.5986 | 0.1349 | 0.674 | 0.4032 |
| 2.3366 | 4.1096 | 300 | 2.2034 | 0.5949 | 0.1134 | 0.74 | 0.4181 |
| 2.3195 | 4.7945 | 350 | 2.0848 | 0.5777 | 0.1867 | 0.794 | 0.4631 |
| 2.2542 | 5.4795 | 400 | 2.0140 | 0.5851 | 0.1676 | 0.892 | 0.4838 |
| 2.1615 | 6.1644 | 450 | 1.9212 | 0.5709 | 0.2389 | 0.91 | 0.5163 |
| 2.1168 | 6.8493 | 500 | 1.8401 | 0.5526 | 0.2426 | 0.944 | 0.5349 |
| 2.1297 | 7.5342 | 550 | 1.8193 | 0.5589 | 0.2636 | 0.914 | 0.5297 |
| 1.9481 | 8.2192 | 600 | 1.7621 | 0.5474 | 0.2876 | 0.946 | 0.5511 |
| 1.9068 | 8.9041 | 650 | 1.7353 | 0.5458 | 0.3476 | 0.964 | 0.5751 |
| 1.9135 | 9.5890 | 700 | 1.6984 | 0.5410 | 0.3363 | 0.988 | 0.5809 |
| 1.9114 | 10.2740 | 750 | 1.7621 | 0.5282 | 0.3521 | 0.948 | 0.5788 |
| 1.8333 | 10.9589 | 800 | 1.7001 | 0.5321 | 0.3749 | 0.96 | 0.5876 |
| 1.8406 | 11.6438 | 850 | 1.6963 | 0.5319 | 0.3657 | 0.984 | 0.5922 |
| 1.7488 | 12.3288 | 900 | 1.6972 | 0.5203 | 0.3938 | 0.974 | 0.6022 |
| 1.8025 | 13.0137 | 950 | 1.7500 | 0.5258 | 0.3553 | 0.952 | 0.5819 |
| 1.7811 | 13.6986 | 1000 | 1.7270 | 0.5196 | 0.4060 | 0.938 | 0.5953 |
| 1.6412 | 14.3836 | 1050 | 1.7763 | 0.5192 | 0.3617 | 0.95 | 0.5858 |
| 1.7068 | 15.0685 | 1100 | 1.6392 | 0.5196 | 0.4197 | 0.996 | 0.6169 |
| 1.6609 | 15.7534 | 1150 | 1.6699 | 0.5026 | 0.4418 | 0.986 | 0.6273 |
| 1.5145 | 16.4384 | 1200 | 1.6642 | 0.5103 | 0.4449 | 0.974 | 0.6216 |
| 1.5172 | 17.1233 | 1250 | 1.6888 | 0.5087 | 0.4730 | 0.958 | 0.6258 |
| 1.6041 | 17.8082 | 1300 | 1.7920 | 0.5002 | 0.4468 | 0.902 | 0.6045 |
| 1.5741 | 18.4932 | 1350 | 1.7422 | 0.5059 | 0.4607 | 0.928 | 0.6143 |
| 1.5694 | 19.1781 | 1400 | 1.7018 | 0.5007 | 0.4153 | 0.996 | 0.6231 |
| 1.4951 | 19.8630 | 1450 | 1.7225 | 0.5031 | 0.4469 | 0.966 | 0.6226 |
| 1.4128 | 20.5479 | 1500 | 1.7504 | 0.5111 | 0.4497 | 0.94 | 0.6125 |
| 1.3993 | 21.2329 | 1550 | 1.7292 | 0.4987 | 0.4536 | 0.982 | 0.6312 |
| 1.3689 | 21.9178 | 1600 | 1.7107 | 0.4952 | 0.4585 | 0.98 | 0.6335 |
| 1.3883 | 22.6027 | 1650 | 1.8204 | 0.5031 | 0.4295 | 0.97 | 0.6186 |
| 1.388 | 23.2877 | 1700 | 1.7624 | 0.4943 | 0.4713 | 0.966 | 0.6335 |
| 1.39 | 23.9726 | 1750 | 1.7373 | 0.5061 | 0.4780 | 0.988 | 0.6373 |
| 1.3144 | 24.6575 | 1800 | 1.7601 | 0.4941 | 0.4641 | 0.974 | 0.6338 |
| 1.308 | 25.3425 | 1850 | 1.8468 | 0.4906 | 0.4780 | 0.924 | 0.6243 |
| 1.1624 | 26.0274 | 1900 | 1.8270 | 0.4998 | 0.4624 | 0.98 | 0.6328 |
| 1.2308 | 26.7123 | 1950 | 1.8317 | 0.4921 | 0.4510 | 0.98 | 0.6325 |
| 1.2833 | 27.3973 | 2000 | 1.8082 | 0.4967 | 0.4462 | 0.984 | 0.6304 |
| 1.1986 | 28.0822 | 2050 | 1.7801 | 0.4854 | 0.4778 | 0.982 | 0.6438 |
| 1.2073 | 28.7671 | 2100 | 1.8085 | 0.4887 | 0.4900 | 0.978 | 0.6449 |
| 1.2382 | 29.4521 | 2150 | 1.8145 | 0.4884 | 0.4820 | 0.98 | 0.6432 |
| 1.2236 | 30.1370 | 2200 | 1.8277 | 0.4895 | 0.4563 | 0.98 | 0.6351 |
| 1.2668 | 30.8219 | 2250 | 1.8910 | 0.4965 | 0.4216 | 0.978 | 0.6213 |
| 1.0703 | 31.5068 | 2300 | 1.8507 | 0.4876 | 0.4637 | 0.958 | 0.6315 |
| 1.1168 | 32.1918 | 2350 | 1.8766 | 0.4867 | 0.4572 | 0.952 | 0.6281 |
| 1.2443 | 32.8767 | 2400 | 1.8841 | 0.4869 | 0.4609 | 0.962 | 0.6321 |
| 1.1923 | 33.5616 | 2450 | 1.9608 | 0.4856 | 0.4247 | 0.966 | 0.6230 |
| 1.155 | 34.2466 | 2500 | 1.9239 | 0.4830 | 0.4256 | 0.96 | 0.6225 |
| 1.153 | 34.9315 | 2550 | 1.8888 | 0.4838 | 0.4558 | 0.97 | 0.6342 |
| 1.1316 | 35.6164 | 2600 | 1.9003 | 0.4819 | 0.4482 | 0.978 | 0.6351 |
| 1.193 | 36.3014 | 2650 | 1.9069 | 0.4806 | 0.4527 | 0.982 | 0.6382 |
| 1.1364 | 36.9863 | 2700 | 1.8753 | 0.4808 | 0.4662 | 0.98 | 0.6415 |
| 1.142 | 37.6712 | 2750 | 1.8751 | 0.4819 | 0.4643 | 0.98 | 0.6405 |
| 1.0806 | 38.3562 | 2800 | 1.8894 | 0.4795 | 0.4568 | 0.982 | 0.6398 |
| 1.2624 | 39.0411 | 2850 | 1.8954 | 0.4797 | 0.4433 | 0.976 | 0.6339 |
| 1.0891 | 39.7260 | 2900 | 1.8960 | 0.4799 | 0.4457 | 0.976 | 0.6345 |
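The Emotion F1 column is presumably a macro-averaged F1 across the emotion classes (the exact averaging method is not stated in the card). A minimal sketch of macro F1 over integer class labels, for reference:

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: per-class F1 scores averaged with equal class weight."""
    classes = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)

# Hypothetical predictions over 3 emotion classes:
print(macro_f1([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0]))
```

`sklearn.metrics.f1_score(..., average="macro")` computes the same quantity; the sketch only shows what the column measures.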
### Framework versions
- Transformers 4.57.1
- Pytorch 2.8.0+cu126
- Datasets 4.4.2
- Tokenizers 0.22.1
## Model lineage
- Base model: facebook/wav2vec2-xls-r-300m
- Fine-tuned from: abidanoaman/wav2vec2-urdu-finetuned-ASR