Helldivers2ASR_V5

This model is a fine-tuned version of facebook/wav2vec2-large-960h on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 124.4297
  • WER: 0.0946
  • CER: 0.0385
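WER (word error rate) and CER (character error rate) are Levenshtein edit distances normalized by the reference length, computed at the word and character level respectively. A minimal pure-Python sketch of the metrics (not the evaluation code used for this card):

```python
def edit_distance(ref, hyp):
    # Classic two-row dynamic-programming Levenshtein distance
    # between two sequences (lists of words or strings of chars).
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def wer(reference, hypothesis):
    # Word-level edit distance / number of reference words
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    # Character-level edit distance / number of reference characters
    return edit_distance(reference, hypothesis) / len(reference)
```

A WER of 0.0946 means roughly one word-level error per eleven reference words.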

Model description

More information needed

Intended uses & limitations

More information needed
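The card does not yet document usage. A minimal transcription sketch for a standard wav2vec2 CTC checkpoint, assuming the repo id shown on this page, 16 kHz mono audio, and a hypothetical input file `clip.wav`:

```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Repo id taken from this model card; "clip.wav" is a placeholder file name.
model_id = "8688chris/Helldivers2ASR_V5"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

waveform, sample_rate = torchaudio.load("clip.wav")
if sample_rate != 16_000:
    # The wav2vec2-large-960h base model expects 16 kHz input
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```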

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 50
  • mixed_precision_training: Native AMP
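Two of the listed values are derived rather than set directly: the total train batch size is the per-device batch size times the gradient-accumulation steps, and the warmup length follows from the warmup ratio and the 700 total optimizer steps reported in the results table. A quick arithmetic check:

```python
# Derived training quantities; 700 is the final optimizer step in the
# training-results table of this card.
train_batch_size = 32
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128

warmup_ratio = 0.05
total_optimizer_steps = 700
warmup_steps = round(warmup_ratio * total_optimizer_steps)  # 35
```

So the cosine schedule warms up for about 35 of the 700 steps, i.e. roughly the first 2.5 epochs.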

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 2695.3033 | 0.9655 | 14 | 1193.2163 | 0.5150 | 0.2548 |
| 1690.871 | 2.0 | 29 | 1009.7814 | 0.4355 | 0.2077 |
| 1449.3326 | 2.9655 | 43 | 866.1747 | 0.3603 | 0.1693 |
| 1157.594 | 4.0 | 58 | 744.7312 | 0.3274 | 0.1547 |
| 1123.9697 | 4.9655 | 72 | 656.5463 | 0.3073 | 0.1384 |
| 967.6211 | 6.0 | 87 | 561.6840 | 0.2915 | 0.1309 |
| 926.493 | 6.9655 | 101 | 552.4729 | 0.2851 | 0.1334 |
| 810.3585 | 8.0 | 116 | 489.7581 | 0.2328 | 0.1091 |
| 802.5302 | 8.9655 | 130 | 477.9092 | 0.2235 | 0.0998 |
| 675.7667 | 10.0 | 145 | 429.9390 | 0.2034 | 0.0897 |
| 688.4148 | 10.9655 | 159 | 412.7467 | 0.2350 | 0.1008 |
| 593.3937 | 12.0 | 174 | 355.4245 | 0.1991 | 0.0865 |
| 594.2962 | 12.9655 | 188 | 329.0578 | 0.1819 | 0.0774 |
| 508.8992 | 14.0 | 203 | 350.7224 | 0.1891 | 0.0800 |
| 503.765 | 14.9655 | 217 | 300.9240 | 0.1705 | 0.0714 |
| 452.241 | 16.0 | 232 | 326.3430 | 0.1605 | 0.0698 |
| 474.4419 | 16.9655 | 246 | 292.7792 | 0.1691 | 0.0723 |
| 404.4718 | 18.0 | 261 | 284.8404 | 0.1633 | 0.0706 |
| 403.7702 | 18.9655 | 275 | 261.7959 | 0.1476 | 0.0650 |
| 381.9254 | 20.0 | 290 | 224.7344 | 0.1490 | 0.0630 |
| 367.5338 | 20.9655 | 304 | 280.7520 | 0.1540 | 0.0646 |
| 336.0608 | 22.0 | 319 | 226.6506 | 0.1554 | 0.0650 |
| 343.7045 | 22.9655 | 333 | 231.6033 | 0.1590 | 0.0697 |
| 279.5685 | 24.0 | 348 | 246.9725 | 0.1590 | 0.0686 |
| 311.3864 | 24.9655 | 362 | 204.4507 | 0.1511 | 0.0635 |
| 255.0848 | 26.0 | 377 | 194.7101 | 0.1289 | 0.0565 |
| 271.3048 | 26.9655 | 391 | 190.2275 | 0.1254 | 0.0533 |
| 258.5749 | 28.0 | 406 | 193.0812 | 0.1189 | 0.0483 |
| 252.6549 | 28.9655 | 420 | 182.3211 | 0.1232 | 0.0473 |
| 198.5987 | 30.0 | 435 | 180.7149 | 0.1153 | 0.0480 |
| 193.8807 | 30.9655 | 449 | 169.1754 | 0.1196 | 0.0480 |
| 184.148 | 32.0 | 464 | 155.5871 | 0.1139 | 0.0474 |
| 185.8568 | 32.9655 | 478 | 160.6432 | 0.1103 | 0.0474 |
| 182.5008 | 34.0 | 493 | 153.4942 | 0.1125 | 0.0486 |
| 182.8014 | 34.9655 | 507 | 148.8764 | 0.1096 | 0.0466 |
| 152.4978 | 36.0 | 522 | 141.8686 | 0.1125 | 0.0484 |
| 159.9153 | 36.9655 | 536 | 130.8442 | 0.1117 | 0.0463 |
| 153.8542 | 38.0 | 551 | 142.0020 | 0.1053 | 0.0435 |
| 194.8882 | 38.9655 | 565 | 128.6881 | 0.1032 | 0.0410 |
| 159.338 | 40.0 | 580 | 125.0831 | 0.0981 | 0.0408 |
| 168.303 | 40.9655 | 594 | 131.1471 | 0.0953 | 0.0391 |
| 149.8229 | 42.0 | 609 | 127.8085 | 0.0946 | 0.0394 |
| 161.5126 | 42.9655 | 623 | 124.2411 | 0.0960 | 0.0393 |
| 163.1556 | 44.0 | 638 | 123.0042 | 0.0967 | 0.0395 |
| 158.075 | 44.9655 | 652 | 124.5911 | 0.0938 | 0.0387 |
| 151.277 | 46.0 | 667 | 124.5658 | 0.0938 | 0.0382 |
| 145.2503 | 46.9655 | 681 | 124.3450 | 0.0938 | 0.0387 |
| 142.6398 | 48.0 | 696 | 124.3733 | 0.0938 | 0.0385 |
| 145.2239 | 48.2759 | 700 | 124.4297 | 0.0946 | 0.0385 |

Framework versions

  • Transformers 4.44.0
  • PyTorch 2.5.1+cu121
  • Datasets 3.6.0
  • Tokenizers 0.19.1
Model size: 0.3B params (F32, Safetensors)

Model tree for 8688chris/Helldivers2ASR_V5
