deepdml committed
Commit 18cd23a · 1 Parent(s): 38ac88b

End of training

Files changed (1):
  README.md (+50 −81)
README.md CHANGED
@@ -6,10 +6,10 @@ base_model: openai/whisper-tiny
 tags:
 - generated_from_trainer
 datasets:
-- google/fleurs
 - dsfsi-anv/multilingual-nchlt-dataset
-- voice-biomarkers/openslr-32-hq-SA-languages-Afrikaans
+- google/fleurs
 - andreoosthuizen/afrikaans-30s
+- voice-biomarkers/openslr-32-hq-SA-languages-Afrikaans
 metrics:
 - wer
 model-index:
@@ -20,15 +20,16 @@ model-index:
       type: automatic-speech-recognition
     dataset:
       name: Common Voice 17.0
-      type: google/fleurs
+      type: dsfsi-anv/multilingual-nchlt-dataset
       config: af_za
       split: test
       args: af_za
     metrics:
     - name: Wer
       type: wer
-      value: 45.17581846526936
+      value: 44.257751602286504
 ---
+
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
@@ -36,9 +37,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the Common Voice 17.0 dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.2813
-- Wer: 45.1758
-- Cer: 18.4153
+- Loss: 1.2213
+- Wer: 44.2578
+- Cer: 17.8026
 
 ## Model description
 
@@ -64,72 +65,53 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.04
-- training_steps: 6000
+- training_steps: 4100
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
 |:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|
-| 1.9201 | 0.0167 | 100 | 1.9214 | 75.4201 | 31.6653 |
-| 1.0858 | 0.0333 | 200 | 1.3963 | 56.0714 | 22.9766 |
-| 0.6925 | 0.05 | 300 | 1.2245 | 50.4417 | 19.7579 |
-| 0.5015 | 0.0667 | 400 | 1.1657 | 48.4150 | 19.2009 |
-| 0.3938 | 0.0833 | 500 | 1.1385 | 46.9773 | 18.5355 |
-| 0.2988 | 0.1 | 600 | 1.1282 | 47.6529 | 20.0862 |
-| 0.2636 | 0.1167 | 700 | 1.1273 | 47.9993 | 20.5523 |
-| 0.2127 | 0.1333 | 800 | 1.1218 | 47.5489 | 19.8868 |
-| 0.1874 | 0.15 | 900 | 1.1289 | 46.9080 | 20.8542 |
-| 0.1619 | 0.1667 | 1000 | 1.1330 | 49.1945 | 21.9476 |
-| 0.1337 | 0.1833 | 1100 | 1.1491 | 47.5489 | 20.1302 |
-| 0.1131 | 0.2 | 1200 | 1.1537 | 48.6575 | 20.8630 |
-| 0.1051 | 0.2167 | 1300 | 1.1685 | 50.3897 | 21.0037 |
-| 0.0973 | 0.2333 | 1400 | 1.1724 | 45.0719 | 18.5941 |
-| 0.0939 | 0.25 | 1500 | 1.1687 | 44.3963 | 17.9023 |
-| 0.0729 | 0.2667 | 1600 | 1.1577 | 48.2418 | 20.3764 |
-| 0.0798 | 0.2833 | 1700 | 1.1849 | 48.9174 | 20.8894 |
-| 0.0622 | 0.3 | 1800 | 1.1850 | 43.3397 | 17.7938 |
-| 0.0608 | 0.3167 | 1900 | 1.1882 | 44.0499 | 18.2540 |
-| 0.0567 | 0.3333 | 2000 | 1.1892 | 43.4609 | 17.9140 |
-| 0.0515 | 0.35 | 2100 | 1.1929 | 46.2151 | 18.8051 |
-| 0.0542 | 0.3667 | 2200 | 1.2082 | 44.3963 | 18.3918 |
-| 0.0503 | 0.3833 | 2300 | 1.1946 | 47.9820 | 19.6553 |
-| 0.0497 | 0.4 | 2400 | 1.1912 | 49.7662 | 21.5841 |
-| 0.0495 | 0.4167 | 2500 | 1.2044 | 47.2718 | 18.8491 |
-| 0.0338 | 0.4333 | 2600 | 1.2134 | 43.8940 | 17.4303 |
-| 0.0497 | 0.45 | 2700 | 1.2063 | 46.1632 | 18.7553 |
-| 0.0403 | 0.4667 | 2800 | 1.2163 | 46.7868 | 18.9605 |
-| 0.0363 | 0.4833 | 2900 | 1.2167 | 43.0972 | 17.1782 |
-| 0.0361 | 0.5 | 3000 | 1.2261 | 46.4403 | 18.8169 |
-| 0.0365 | 0.5167 | 3100 | 1.2220 | 42.6641 | 18.1280 |
-| 0.0277 | 0.5333 | 3200 | 1.2331 | 46.3191 | 18.8227 |
-| 0.0333 | 0.55 | 3300 | 1.2272 | 43.5822 | 17.4215 |
-| 0.0315 | 0.5667 | 3400 | 1.2376 | 46.4750 | 18.9928 |
-| 0.0285 | 0.5833 | 3500 | 1.2420 | 43.4263 | 17.3482 |
-| 0.0375 | 0.6 | 3600 | 1.2388 | 46.6309 | 18.5530 |
-| 0.0229 | 0.6167 | 3700 | 1.2376 | 42.7854 | 17.0287 |
-| 0.0197 | 0.6333 | 3800 | 1.2449 | 43.4263 | 17.2251 |
-| 0.0186 | 1.0068 | 3900 | 1.2528 | 46.5789 | 18.6791 |
-| 0.0245 | 1.0235 | 4000 | 1.2527 | 49.3851 | 21.7219 |
-| 0.0146 | 1.0402 | 4100 | 1.2579 | 41.9886 | 17.1518 |
-| 0.0137 | 1.0568 | 4200 | 1.2673 | 43.2877 | 17.0932 |
-| 0.0244 | 1.0735 | 4300 | 1.2768 | 46.5443 | 18.9957 |
-| 0.0271 | 1.0902 | 4400 | 1.2655 | 46.2325 | 18.7319 |
-| 0.0237 | 1.1068 | 4500 | 1.2803 | 46.5616 | 18.5237 |
-| 0.0212 | 1.1235 | 4600 | 1.2725 | 48.3804 | 20.5171 |
-| 0.0366 | 1.1402 | 4700 | 1.2623 | 46.6482 | 19.9777 |
-| 0.0278 | 1.1568 | 4800 | 1.2632 | 45.4010 | 18.4211 |
-| 0.0232 | 1.1735 | 4900 | 1.2652 | 45.5223 | 18.5501 |
-| 0.0264 | 1.1902 | 5000 | 1.2720 | 43.0799 | 17.3043 |
-| 0.0244 | 1.2068 | 5100 | 1.2714 | 44.2058 | 18.3625 |
-| 0.0162 | 1.2235 | 5200 | 1.2697 | 45.3144 | 18.4827 |
-| 0.0203 | 1.2402 | 5300 | 1.2812 | 43.2358 | 17.0727 |
-| 0.0214 | 1.2568 | 5400 | 1.2804 | 42.7681 | 16.8939 |
-| 0.0163 | 1.2735 | 5500 | 1.2834 | 45.8341 | 18.4416 |
-| 0.0149 | 1.2902 | 5600 | 1.2833 | 45.5049 | 18.4915 |
-| 0.0196 | 1.3068 | 5700 | 1.2822 | 46.0592 | 18.8491 |
-| 0.0134 | 1.3235 | 5800 | 1.2807 | 46.0073 | 18.8462 |
-| 0.0137 | 1.3402 | 5900 | 1.2803 | 42.5775 | 17.1694 |
-| 0.0174 | 1.3568 | 6000 | 1.2813 | 45.1758 | 18.4153 |
+| 1.7169 | 0.0244 | 100 | 1.7637 | 68.4393 | 26.7904 |
+| 0.9216 | 0.0488 | 200 | 1.3055 | 52.9014 | 21.5255 |
+| 0.6082 | 0.0732 | 300 | 1.1946 | 49.3158 | 19.3768 |
+| 0.4534 | 0.0976 | 400 | 1.1545 | 47.5143 | 18.1954 |
+| 0.3675 | 0.1220 | 500 | 1.1354 | 46.8387 | 18.5267 |
+| 0.282 | 0.1463 | 600 | 1.1251 | 46.0939 | 19.8751 |
+| 0.254 | 0.1707 | 700 | 1.1269 | 45.4876 | 18.8345 |
+| 0.2055 | 0.1951 | 800 | 1.1248 | 48.9347 | 20.0803 |
+| 0.1837 | 0.2195 | 900 | 1.1323 | 45.0199 | 19.4325 |
+| 0.1606 | 0.2439 | 1000 | 1.1317 | 49.2118 | 21.8832 |
+| 0.1337 | 0.2683 | 1100 | 1.1491 | 44.7601 | 18.6498 |
+| 0.1149 | 0.2927 | 1200 | 1.1535 | 45.4530 | 19.5761 |
+| 0.1072 | 0.3171 | 1300 | 1.1685 | 48.6056 | 20.2328 |
+| 0.0998 | 0.3415 | 1400 | 1.1738 | 44.5695 | 18.5501 |
+| 0.097 | 0.3659 | 1500 | 1.1702 | 44.4656 | 18.3068 |
+| 0.0769 | 0.3902 | 1600 | 1.1601 | 47.1159 | 19.3709 |
+| 0.084 | 0.4146 | 1700 | 1.1815 | 47.5663 | 19.6347 |
+| 0.0664 | 0.4390 | 1800 | 1.1821 | 44.3097 | 18.7582 |
+| 0.0652 | 0.4634 | 1900 | 1.1854 | 43.2184 | 18.4123 |
+| 0.0609 | 0.4878 | 2000 | 1.1830 | 43.1145 | 17.4508 |
+| 0.0565 | 0.5122 | 2100 | 1.1897 | 47.1505 | 19.0514 |
+| 0.0589 | 0.5366 | 2200 | 1.2024 | 45.4010 | 18.6996 |
+| 0.0552 | 0.5610 | 2300 | 1.1956 | 48.6402 | 20.3764 |
+| 0.0551 | 0.5854 | 2400 | 1.1930 | 45.3837 | 18.6527 |
+| 0.0551 | 0.6098 | 2500 | 1.1984 | 47.1159 | 18.6996 |
+| 0.04 | 0.6341 | 2600 | 1.2092 | 47.4796 | 19.7725 |
+| 0.0548 | 0.6585 | 2700 | 1.1981 | 42.7681 | 17.5915 |
+| 0.0466 | 0.6829 | 2800 | 1.2144 | 48.1379 | 20.3588 |
+| 0.0425 | 0.7073 | 2900 | 1.2051 | 46.0766 | 18.7670 |
+| 0.0431 | 0.7317 | 3000 | 1.2157 | 44.3963 | 17.4596 |
+| 0.0427 | 0.7561 | 3100 | 1.2178 | 48.1032 | 19.8517 |
+| 0.0346 | 0.7805 | 3200 | 1.2177 | 47.4970 | 19.5644 |
+| 0.0395 | 0.8049 | 3300 | 1.2199 | 47.1159 | 18.9312 |
+| 0.039 | 0.8293 | 3400 | 1.2219 | 45.7474 | 19.4090 |
+| 0.0359 | 0.8537 | 3500 | 1.2191 | 46.4057 | 18.7846 |
+| 0.0461 | 0.8780 | 3600 | 1.2172 | 51.2039 | 21.9476 |
+| 0.0299 | 0.9024 | 3700 | 1.2202 | 47.5316 | 19.1335 |
+| 0.028 | 0.9268 | 3800 | 1.2216 | 47.1505 | 19.4999 |
+| 0.0305 | 1.01 | 3900 | 1.2241 | 46.5443 | 18.7231 |
+| 0.038 | 1.0344 | 4000 | 1.2218 | 44.2924 | 17.6619 |
+| 0.0249 | 1.0588 | 4100 | 1.2213 | 44.2578 | 17.8026 |
 
 
 ### Framework versions
@@ -138,16 +120,3 @@ The following hyperparameters were used during training:
 - Pytorch 2.3.0+cu121
 - Datasets 2.19.1
 - Tokenizers 0.19.1
-
-## Citation
-
-Please cite the model using the following BibTeX entry:
-
-```bibtex
-@misc{deepdml/whisper-tiny-af-mix-norm,
-  title={Fine-tuned Whisper tiny ASR model for speech recognition in Afrikaans},
-  author={Jimenez, David},
-  howpublished={\url{https://huggingface.co/deepdml/whisper-tiny-af-mix-norm}},
-  year={2026}
-}
-```
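
For reference, the Wer and Cer values in the diff above are word and character error rates: Levenshtein edit distance between the reference transcript and the model hypothesis, normalised by reference length and scaled to percent. The training script presumably computed them with a metrics library; the sketch below is only an illustrative, self-contained reimplementation, not the code this commit ran.

```python
def edit_distance(ref, hyp):
    """Minimum number of substitutions, insertions and deletions
    turning the token sequence `ref` into `hyp` (Levenshtein DP)."""
    d = list(range(len(hyp) + 1))  # d[j] = distance(ref[:i], hyp[:j])
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i  # prev holds distance(ref[:i-1], hyp[:j-1])
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[j] = min(d[j] + 1,      # delete a reference token
                       d[j - 1] + 1,  # insert a hypothesis token
                       prev + cost)   # substitute (or match)
            prev = cur
    return d[-1]

def wer(reference, hypothesis):
    """Word error rate in percent, edit distance over whitespace tokens."""
    ref = reference.split()
    return 100.0 * edit_distance(ref, hypothesis.split()) / len(ref)

def cer(reference, hypothesis):
    """Character error rate in percent, edit distance over characters."""
    return 100.0 * edit_distance(list(reference), list(hypothesis)) / len(reference)
```

A WER of 44.26, as in the final row of the new table, means roughly 44 word edits per 100 reference words.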
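
The hyperparameter hunk names a linear scheduler with `lr_scheduler_warmup_ratio: 0.04` over `training_steps: 4100`. A minimal sketch of that schedule shape, matching the usual linear-warmup-then-linear-decay behaviour: the base learning rate is not visible in this hunk, so `base_lr` below is a placeholder, not the value used in training.

```python
def linear_schedule_with_warmup(step, base_lr=1e-5,
                                total_steps=4100, warmup_ratio=0.04):
    """Learning rate at `step`: linear warmup from 0 to base_lr over
    warmup_ratio * total_steps, then linear decay to 0 at total_steps.
    base_lr is a placeholder; the card's hunk does not show the real value."""
    warmup_steps = int(total_steps * warmup_ratio)  # 164 steps here
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    # linear decay from base_lr at the end of warmup down to 0
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

With these settings the peak rate is reached at step 164 and decays to zero by step 4100, the final checkpoint in the new training table.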