---

library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-base-960h
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Helldivers2ASR
  results: []
---



# Helldivers2ASR

This model is a fine-tuned version of [facebook/wav2vec2-base-960h](https://huggingface.co/facebook/wav2vec2-base-960h) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 45.7976
- Wer: 0.0486
- Cer: 0.0203

## Model description

Helldivers2ASR is [facebook/wav2vec2-base-960h](https://huggingface.co/facebook/wav2vec2-base-960h) fine-tuned for automatic speech recognition with a CTC head. After 50 epochs of fine-tuning it reaches a word error rate (WER) of 0.0486 and a character error rate (CER) of 0.0203 on the evaluation set.
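
As a wav2vec2-derived CTC model, it can be used with the standard `transformers` inference pattern. A minimal sketch (the Hub repo id below is a placeholder, since the full path is not stated in this card):

```python
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

def transcribe(model, processor, waveform, sampling_rate=16_000):
    """Greedy CTC decoding of a mono waveform sampled at 16 kHz."""
    inputs = processor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits  # (batch, time, vocab)
    pred_ids = torch.argmax(logits, dim=-1)         # greedy per-frame tokens
    return processor.batch_decode(pred_ids)[0]      # collapse repeats/blanks

# Usage (replace the placeholder repo id with the actual Hub path):
# processor = Wav2Vec2Processor.from_pretrained("<user>/Helldivers2ASR")
# model = Wav2Vec2ForCTC.from_pretrained("<user>/Helldivers2ASR")
# text = transcribe(model, processor, waveform)  # waveform: 1-D float array @ 16 kHz
```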

## Intended uses & limitations

More information needed. Note that, like its wav2vec2 base model, this checkpoint expects 16 kHz single-channel audio as input.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
- mixed_precision_training: Native AMP
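
The derived quantities above are consistent with the training log below; a quick check of the arithmetic (the per-epoch step count of 111 is taken from the results table):

```python
# Effective batch size: per-device batch times gradient accumulation.
train_batch_size = 16
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 32

# Total optimizer steps: 111 steps/epoch (from the log) over 50 epochs.
steps_per_epoch = 111
num_epochs = 50
total_steps = steps_per_epoch * num_epochs  # 5550, matching the final row

# Linear schedule with warmup_ratio 0.1 warms up over the first 10% of steps.
warmup_ratio = 0.1
warmup_steps = int(warmup_ratio * total_steps)  # 555

print(total_train_batch_size, total_steps, warmup_steps)
```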

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 1069.0904     | 1.0   | 111  | 690.8810        | 0.5789 | 0.3204 |
| 812.3311      | 2.0   | 222  | 524.6868        | 0.4769 | 0.2529 |
| 677.891       | 3.0   | 333  | 423.8824        | 0.4040 | 0.2043 |
| 584.6176      | 4.0   | 444  | 338.5651        | 0.3429 | 0.1709 |
| 510.1064      | 5.0   | 555  | 284.0197        | 0.3016 | 0.1467 |
| 462.5595      | 6.0   | 666  | 245.5092        | 0.2615 | 0.1233 |
| 398.9107      | 7.0   | 777  | 205.1819        | 0.2198 | 0.1036 |
| 375.3196      | 8.0   | 888  | 183.1224        | 0.2049 | 0.0943 |
| 338.0957      | 9.0   | 999  | 163.4431        | 0.1834 | 0.0832 |
| 305.4381      | 10.0  | 1110 | 158.5278        | 0.1636 | 0.0759 |
| 297.0845      | 11.0  | 1221 | 140.0732        | 0.1603 | 0.0715 |
| 280.9225      | 12.0  | 1332 | 128.1153        | 0.1453 | 0.0633 |
| 274.2178      | 13.0  | 1443 | 115.9890        | 0.1283 | 0.0581 |
| 238.5611      | 14.0  | 1554 | 112.8672        | 0.1271 | 0.0572 |
| 233.9152      | 15.0  | 1665 | 105.2606        | 0.1178 | 0.0517 |
| 222.8375      | 16.0  | 1776 | 105.7286        | 0.1093 | 0.0522 |
| 218.3437      | 17.0  | 1887 | 100.8668        | 0.1089 | 0.0480 |
| 208.0329      | 18.0  | 1998 | 96.2257         | 0.1020 | 0.0457 |
| 199.382       | 19.0  | 2109 | 85.2498         | 0.0939 | 0.0433 |
| 198.6175      | 20.0  | 2220 | 82.4100         | 0.0927 | 0.0425 |
| 173.409       | 21.0  | 2331 | 78.6151         | 0.0842 | 0.0377 |
| 168.3968      | 22.0  | 2442 | 79.7964         | 0.0830 | 0.0382 |
| 171.8005      | 23.0  | 2553 | 70.0593         | 0.0773 | 0.0336 |
| 157.5571      | 24.0  | 2664 | 67.0374         | 0.0713 | 0.0308 |
| 157.863       | 25.0  | 2775 | 61.6305         | 0.0684 | 0.0299 |
| 153.1922      | 26.0  | 2886 | 64.7148         | 0.0676 | 0.0305 |
| 158.7052      | 27.0  | 2997 | 59.4825         | 0.0676 | 0.0286 |
| 149.8696      | 28.0  | 3108 | 59.3228         | 0.0615 | 0.0290 |
| 145.0604      | 29.0  | 3219 | 57.5999         | 0.0595 | 0.0273 |
| 138.6555      | 30.0  | 3330 | 56.0322         | 0.0587 | 0.0273 |
| 145.2049      | 31.0  | 3441 | 51.0521         | 0.0559 | 0.0262 |
| 137.6945      | 32.0  | 3552 | 52.0388         | 0.0510 | 0.0236 |
| 130.1284      | 33.0  | 3663 | 51.2634         | 0.0587 | 0.0256 |
| 127.2604      | 34.0  | 3774 | 47.7200         | 0.0510 | 0.0239 |
| 124.2158      | 35.0  | 3885 | 47.9998         | 0.0530 | 0.0233 |
| 116.4625      | 36.0  | 3996 | 50.8454         | 0.0547 | 0.0236 |
| 125.1983      | 37.0  | 4107 | 47.3148         | 0.0490 | 0.0212 |
| 110.8833      | 38.0  | 4218 | 47.1674         | 0.0522 | 0.0229 |
| 109.9017      | 39.0  | 4329 | 46.9405         | 0.0506 | 0.0224 |
| 116.4361      | 40.0  | 4440 | 49.4927         | 0.0482 | 0.0221 |
| 117.1769      | 41.0  | 4551 | 46.7733         | 0.0474 | 0.0212 |
| 108.7644      | 42.0  | 4662 | 45.9081         | 0.0490 | 0.0209 |
| 117.5977      | 43.0  | 4773 | 43.4770         | 0.0462 | 0.0202 |
| 125.1272      | 44.0  | 4884 | 41.2925         | 0.0482 | 0.0206 |
| 110.6224      | 45.0  | 4995 | 47.2408         | 0.0498 | 0.0211 |
| 110.3162      | 46.0  | 5106 | 46.2208         | 0.0437 | 0.0196 |
| 107.2695      | 47.0  | 5217 | 44.8825         | 0.0433 | 0.0199 |
| 111.1748      | 48.0  | 5328 | 45.3059         | 0.0425 | 0.0191 |
| 108.1643      | 49.0  | 5439 | 45.5355         | 0.0449 | 0.0199 |
| 107.1299      | 50.0  | 5550 | 45.7976         | 0.0486 | 0.0203 |
### Framework versions

- Transformers 4.57.3
- Pytorch 2.5.1+cu121
- Datasets 3.6.0
- Tokenizers 0.22.1