---

license: apache-2.0
base_model: facebook/wav2vec2-large-960h
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Helldivers2ASR_V4
  results: []
---



# Helldivers2ASR_V4



This model is a fine-tuned version of [facebook/wav2vec2-large-960h](https://huggingface.co/facebook/wav2vec2-large-960h) on an unknown dataset.

It achieves the following results on the evaluation set:

- Loss: 23.9309

- Wer: 0.0418
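The Wer figure above is word error rate: word-level edit distance (substitutions, insertions, deletions) divided by the number of reference words. A minimal pure-Python sketch of the metric (a hypothetical helper for illustration, not the evaluation code used for this model):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[j] holds the edit distance between ref[:i] and hyp[:j] for the current row i.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i  # prev tracks the diagonal cell d[i-1][j-1]
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(
                d[j] + 1,                          # deletion
                d[j - 1] + 1,                      # insertion
                prev + (ref[i - 1] != hyp[j - 1]),  # substitution (or match)
            )
            prev = cur
    return d[-1] / max(len(ref), 1)

print(word_error_rate("a b c d", "a x c d"))  # one substitution in four words -> 0.25
```

A WER of 0.0418 therefore corresponds to roughly one word-level error per 24 reference words.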



## Model description



More information needed



## Intended uses & limitations



More information needed
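Since this is a CTC fine-tune of wav2vec2-large-960h, it can presumably be loaded with the standard `transformers` wav2vec2 classes. A hedged usage sketch; the repo id is assumed from the model name, and the silent waveform is a placeholder for real 16 kHz mono audio:

```python
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Repo id below is an assumption based on the model name; substitute the
# actual Hub path of this checkpoint.
processor = Wav2Vec2Processor.from_pretrained("Helldivers2ASR_V4")
model = Wav2Vec2ForCTC.from_pretrained("Helldivers2ASR_V4")
model.eval()

# Placeholder input: one second of 16 kHz silence. Replace with a real mono
# waveform (e.g. loaded via torchaudio or librosa) resampled to 16 kHz.
speech = torch.zeros(16000)

inputs = processor(speech.numpy(), sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
transcript = processor.batch_decode(pred_ids)[0]
print(transcript)
```

As with the base model, input audio should be 16 kHz; greedy argmax decoding is shown here, and a language-model-backed decoder could be swapped in if one exists for this checkpoint.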



## Training and evaluation data



More information needed



## Training procedure



### Training hyperparameters



The following hyperparameters were used during training:

- learning_rate: 8e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 60
- mixed_precision_training: Native AMP
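The hyperparameters above map onto `transformers` `TrainingArguments` roughly as follows (a hedged reconstruction, not the actual training script; `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Helldivers2ASR_V4",          # placeholder path
    learning_rate=8e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="constant",
    warmup_ratio=0.1,                         # listed alongside the constant schedule
    num_train_epochs=60,
    fp16=True,                                # "Native AMP" mixed precision
)
```

Note that a `constant` scheduler ordinarily ignores the warmup ratio; the two are reproduced here exactly as listed above.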

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 860.6827      | 1.0   | 505   | 518.6855        | 0.4561 |
| 646.7485      | 2.0   | 1010  | 404.5416        | 0.3735 |
| 550.9087      | 3.0   | 1515  | 347.0684        | 0.3078 |
| 479.678       | 4.0   | 2020  | 319.9583        | 0.2773 |
| 425.8765      | 5.0   | 2525  | 278.4693        | 0.2440 |
| 384.8481      | 6.0   | 3030  | 255.8700        | 0.2202 |
| 347.8814      | 7.0   | 3535  | 194.1332        | 0.1985 |
| 316.2373      | 8.0   | 4040  | 186.0174        | 0.1867 |
| 285.9741      | 9.0   | 4545  | 162.6776        | 0.1665 |
| 265.6298      | 10.0  | 5050  | 136.8190        | 0.1377 |
| 246.6871      | 11.0  | 5555  | 115.5616        | 0.1296 |
| 220.2408      | 12.0  | 6060  | 119.0393        | 0.1281 |
| 206.048       | 13.0  | 6565  | 105.3089        | 0.1142 |
| 187.9322      | 14.0  | 7070  | 104.7907        | 0.1088 |
| 179.1001      | 15.0  | 7575  | 84.0504         | 0.0909 |
| 166.2595      | 16.0  | 8080  | 77.4637         | 0.1009 |
| 160.3703      | 17.0  | 8585  | 70.2956         | 0.0825 |
| 146.5175      | 18.0  | 9090  | 65.9802         | 0.0793 |
| 135.1917      | 19.0  | 9595  | 69.4915         | 0.0824 |
| 129.9824      | 20.0  | 10100 | 60.3350         | 0.0807 |
| 128.6042      | 21.0  | 10605 | 56.4799         | 0.0743 |
| 119.3818      | 22.0  | 11110 | 58.4327         | 0.0745 |
| 111.8126      | 23.0  | 11615 | 49.4417         | 0.0733 |
| 104.4044      | 24.0  | 12120 | 50.4322         | 0.0726 |
| 105.2895      | 25.0  | 12625 | 48.5079         | 0.0738 |
| 98.1706       | 26.0  | 13130 | 39.8806         | 0.0627 |
| 96.8815       | 27.0  | 13635 | 43.9033         | 0.0683 |
| 94.4319       | 28.0  | 14140 | 40.7528         | 0.0644 |
| 88.7985       | 29.0  | 14645 | 38.5910         | 0.0841 |
| 89.5256       | 30.0  | 15150 | 45.4595         | 0.0807 |
| 86.8755       | 31.0  | 15655 | 43.6640         | 0.0709 |
| 83.8043       | 32.0  | 16160 | 41.5719         | 0.0579 |
| 80.2981       | 33.0  | 16665 | 32.2402         | 0.0632 |
| 80.228        | 34.0  | 17170 | 37.0875         | 0.0680 |
| 75.8638       | 35.0  | 17675 | 34.1005         | 0.0565 |
| 73.154        | 36.0  | 18180 | 33.2692         | 0.0637 |
| 68.464        | 37.0  | 18685 | 31.7566         | 0.0545 |
| 67.3167       | 38.0  | 19190 | 31.6638         | 0.0514 |
| 63.2984       | 39.0  | 19695 | 28.8354         | 0.0512 |
| 62.8066       | 40.0  | 20200 | 28.3509         | 0.0551 |
| 63.414        | 41.0  | 20705 | 24.4108         | 0.0504 |
| 64.3067       | 42.0  | 21210 | 26.4849         | 0.0569 |
| 57.9233       | 43.0  | 21715 | 24.2229         | 0.0587 |
| 59.5782       | 44.0  | 22220 | 22.2866         | 0.0617 |
| 56.6089       | 45.0  | 22725 | 26.2410         | 0.0444 |
| 59.7503       | 46.0  | 23230 | 30.5061         | 0.0584 |
| 54.7057       | 47.0  | 23735 | 27.0451         | 0.0512 |
| 57.7382       | 48.0  | 24240 | 21.9726         | 0.0521 |
| 50.2926       | 49.0  | 24745 | 26.2113         | 0.0534 |
| 52.8126       | 50.0  | 25250 | 28.5752         | 0.0515 |
| 50.02         | 51.0  | 25755 | 25.0730         | 0.0512 |
| 50.0685       | 52.0  | 26260 | 23.9120         | 0.0519 |
| 49.0512       | 53.0  | 26765 | 27.2960         | 0.0603 |
| 54.5931       | 54.0  | 27270 | 23.2472         | 0.0546 |
| 46.7353       | 55.0  | 27775 | 25.7537         | 0.0546 |
| 48.7158       | 56.0  | 28280 | 28.0895         | 0.0447 |
| 47.141        | 57.0  | 28785 | 23.7680         | 0.0462 |
| 46.1992       | 58.0  | 29290 | 21.9848         | 0.0413 |
| 44.0671       | 59.0  | 29795 | 26.6304         | 0.0507 |
| 44.9467       | 60.0  | 30300 | 23.9309         | 0.0418 |


### Framework versions

- Transformers 4.44.0
- Pytorch 2.5.1+cu121
- Datasets 3.6.0
- Tokenizers 0.19.1