---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: test-model
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# test-model

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0161
- Wer: 0.0141
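
Wer above is the word error rate: the word-level edit distance between the model's transcript and the reference, normalized by the reference length. As a rough illustration of how the metric is computed (a minimal sketch, not the Trainer's actual implementation):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[-1][-1] / len(ref)
```

So an eval Wer of 0.0141 means roughly 1.4 word edits per 100 reference words.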

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
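
The linear scheduler ramps the learning rate from 0 up to 0.0003 over the first 500 steps, then decays it linearly back to 0. A minimal sketch of that shape, with `total_steps` set to 41600 (the last step in the results table) purely for illustration:

```python
def linear_lr(step: int, base_lr: float = 3e-4,
              warmup_steps: int = 500, total_steps: int = 41600) -> float:
    """Linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps          # linear ramp up
    # linear decay from base_lr at the end of warmup to 0 at total_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup_steps)
```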

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.8062        | 0.29  | 400   | 2.0576          | 1.0    |
| 0.9633        | 0.57  | 800   | 0.5862          | 0.6023 |
| 0.6079        | 0.86  | 1200  | 0.4897          | 0.4824 |
| 0.4993        | 1.14  | 1600  | 0.3823          | 0.3989 |
| 0.4269        | 1.43  | 2000  | 0.3749          | 0.3761 |
| 0.4049        | 1.72  | 2400  | 0.3501          | 0.3536 |
| 0.3998        | 2.0   | 2800  | 0.3527          | 0.3381 |
| 0.3172        | 2.29  | 3200  | 0.3188          | 0.3257 |
| 0.3161        | 2.57  | 3600  | 0.3217          | 0.3185 |
| 0.3213        | 2.86  | 4000  | 0.2988          | 0.3007 |
| 0.3035        | 3.15  | 4400  | 0.3036          | 0.3288 |
| 0.261         | 3.43  | 4800  | 0.3095          | 0.2947 |
| 0.2639        | 3.72  | 5200  | 0.2818          | 0.2767 |
| 0.2771        | 4.0   | 5600  | 0.2739          | 0.2812 |
| 0.2343        | 4.29  | 6000  | 0.2820          | 0.2700 |
| 0.2452        | 4.57  | 6400  | 0.2663          | 0.2697 |
| 0.2344        | 4.86  | 6800  | 0.2679          | 0.2666 |
| 0.2215        | 5.15  | 7200  | 0.2687          | 0.2571 |
| 0.2032        | 5.43  | 7600  | 0.2791          | 0.2624 |
| 0.2092        | 5.72  | 8000  | 0.2682          | 0.2616 |
| 0.2122        | 6.0   | 8400  | 0.2770          | 0.2591 |
| 0.1878        | 6.29  | 8800  | 0.2760          | 0.2584 |
| 0.1884        | 6.58  | 9200  | 0.2641          | 0.2515 |
| 0.194         | 6.86  | 9600  | 0.2500          | 0.2415 |
| 0.175         | 7.15  | 10000 | 0.2635          | 0.2532 |
| 0.1658        | 7.43  | 10400 | 0.2588          | 0.2371 |
| 0.177         | 7.72  | 10800 | 0.2813          | 0.2493 |
| 0.1786        | 8.01  | 11200 | 0.2628          | 0.2437 |
| 0.1509        | 8.29  | 11600 | 0.2592          | 0.2453 |
| 0.1597        | 8.58  | 12000 | 0.2737          | 0.2523 |
| 0.1646        | 8.86  | 12400 | 0.2556          | 0.2436 |
| 0.1587        | 9.15  | 12800 | 0.2669          | 0.2453 |
| 0.1489        | 9.44  | 13200 | 0.2596          | 0.2353 |
| 0.1468        | 9.72  | 13600 | 0.2620          | 0.2419 |
| 0.1482        | 10.01 | 14000 | 0.2622          | 0.2334 |
| 0.1285        | 10.29 | 14400 | 0.2531          | 0.2258 |
| 0.1335        | 10.58 | 14800 | 0.2512          | 0.2273 |
| 0.1335        | 10.86 | 15200 | 0.2475          | 0.2246 |
| 0.132         | 11.15 | 15600 | 0.2575          | 0.2275 |
| 0.1249        | 11.44 | 16000 | 0.2503          | 0.2223 |
| 0.1229        | 11.72 | 16400 | 0.2817          | 0.2297 |
| 0.1274        | 12.01 | 16800 | 0.2707          | 0.2211 |
| 0.1115        | 12.29 | 17200 | 0.2647          | 0.2175 |
| 0.117         | 12.58 | 17600 | 0.2501          | 0.2178 |
| 0.1164        | 12.87 | 18000 | 0.2579          | 0.2216 |
| 0.1085        | 13.15 | 18400 | 0.2636          | 0.2130 |
| 0.1033        | 13.44 | 18800 | 0.2643          | 0.2184 |
| 0.1066        | 13.72 | 19200 | 0.2519          | 0.2158 |
| 0.1032        | 14.01 | 19600 | 0.2322          | 0.2082 |
| 0.0981        | 14.3  | 20000 | 0.2613          | 0.2125 |
| 0.1009        | 14.58 | 20400 | 0.2479          | 0.2076 |
| 0.1           | 14.87 | 20800 | 0.2464          | 0.2058 |
| 0.0886        | 15.15 | 21200 | 0.2595          | 0.2014 |
| 0.0888        | 15.44 | 21600 | 0.2565          | 0.2048 |
| 0.0916        | 15.73 | 22000 | 0.2470          | 0.2000 |
| 0.095         | 16.01 | 22400 | 0.2539          | 0.1997 |
| 0.0875        | 16.3  | 22800 | 0.2576          | 0.1995 |
| 0.0833        | 16.58 | 23200 | 0.2514          | 0.1990 |
| 0.0813        | 16.87 | 23600 | 0.2522          | 0.2020 |
| 0.0845        | 17.16 | 24000 | 0.2522          | 0.2045 |
| 0.0879        | 17.44 | 24400 | 0.2629          | 0.2183 |
| 0.0854        | 17.73 | 24800 | 0.2464          | 0.2000 |
| 0.0795        | 18.01 | 25200 | 0.2526          | 0.2078 |
| 0.075         | 18.3  | 25600 | 0.2519          | 0.1971 |
| 0.0724        | 18.58 | 26000 | 0.2551          | 0.1965 |
| 0.0735        | 18.87 | 26400 | 0.2536          | 0.1934 |
| 0.0735        | 19.16 | 26800 | 0.2504          | 0.1916 |
| 0.0676        | 19.44 | 27200 | 0.2532          | 0.1884 |
| 0.0687        | 19.73 | 27600 | 0.2498          | 0.1849 |
| 0.0652        | 20.01 | 28000 | 0.2490          | 0.1847 |
| 0.0617        | 20.3  | 28400 | 0.2547          | 0.1899 |
| 0.0627        | 20.59 | 28800 | 0.2509          | 0.1834 |
| 0.0639        | 20.87 | 29200 | 0.2472          | 0.1812 |
| 0.0611        | 21.16 | 29600 | 0.2486          | 0.1827 |
| 0.0559        | 21.44 | 30000 | 0.2530          | 0.1825 |
| 0.0564        | 21.73 | 30400 | 0.2484          | 0.1785 |
| 0.0593        | 22.02 | 30800 | 0.2425          | 0.1781 |
| 0.0517        | 22.3  | 31200 | 0.2613          | 0.1775 |
| 0.0528        | 22.59 | 31600 | 0.2517          | 0.1759 |
| 0.0556        | 22.87 | 32000 | 0.2494          | 0.1811 |
| 0.0507        | 23.16 | 32400 | 0.2522          | 0.1761 |
| 0.0485        | 23.45 | 32800 | 0.2344          | 0.1717 |
| 0.0504        | 23.73 | 33200 | 0.2458          | 0.1772 |
| 0.0485        | 24.02 | 33600 | 0.2497          | 0.1748 |
| 0.0436        | 24.3  | 34000 | 0.2405          | 0.1738 |
| 0.0468        | 24.59 | 34400 | 0.2446          | 0.1735 |
| 0.0443        | 24.87 | 34800 | 0.2514          | 0.1709 |
| 0.0417        | 25.16 | 35200 | 0.2515          | 0.1711 |
| 0.0399        | 25.45 | 35600 | 0.2452          | 0.1664 |
| 0.0416        | 25.73 | 36000 | 0.2438          | 0.1664 |
| 0.0412        | 26.02 | 36400 | 0.2457          | 0.1662 |
| 0.0406        | 26.3  | 36800 | 0.2475          | 0.1659 |
| 0.0376        | 26.59 | 37200 | 0.2454          | 0.1682 |
| 0.0365        | 26.88 | 37600 | 0.2511          | 0.1650 |
| 0.0355        | 27.16 | 38000 | 0.2518          | 0.1633 |
| 0.032         | 27.45 | 38400 | 0.2479          | 0.1604 |
| 0.0348        | 27.73 | 38800 | 0.2391          | 0.1599 |
| 0.0331        | 28.02 | 39200 | 0.2417          | 0.1617 |
| 0.0349        | 28.31 | 39600 | 0.2358          | 0.1590 |
| 0.0347        | 28.59 | 40000 | 0.2388          | 0.1582 |
| 0.0325        | 28.88 | 40400 | 0.2412          | 0.1564 |
| 0.0332        | 29.16 | 40800 | 0.2390          | 0.1545 |
| 0.0613        | 29.45 | 41200 | 0.0167          | 0.0141 |
| 0.0563        | 29.74 | 41600 | 0.0161          | 0.0141 |

### Framework versions

- Transformers 4.17.0
- Pytorch 1.8.1+cu111
- Datasets 2.2.1
- Tokenizers 0.12.1