---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: model-facebookptbrlarge
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# model-facebookptbrlarge

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53-portuguese](https://huggingface.co/facebook/wav2vec2-large-xlsr-53-portuguese) on the common_voice dataset (a minimal usage sketch follows the results below).
It achieves the following results on the evaluation set:
- Loss: 0.2206
- Wer: 0.1322

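A minimal transcription sketch, not part of the original card: it assumes a 16 kHz-compatible mono WAV file at `sample.wav` and a hypothetical repo id `Vkt/model-facebookptbrlarge` for this checkpoint; adjust both to your setup.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repo id; replace with the actual path of this checkpoint.
model_name = "Vkt/model-facebookptbrlarge"
processor = Wav2Vec2Processor.from_pretrained(model_name)
model = Wav2Vec2ForCTC.from_pretrained(model_name)

# Load the audio and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("sample.wav")  # assumed input file
if sample_rate != 16_000:
    waveform = torchaudio.transforms.Resample(sample_rate, 16_000)(waveform)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```
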
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP

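A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the model, dataset pipeline, and data collator are omitted, the output path is a placeholder, and the Adam betas/epsilon listed above are the `TrainingArguments` defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./model-facebookptbrlarge",  # placeholder output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed precision
)
```
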
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 5.8975 | 0.29 | 400 | 0.4131 | 0.3336 |
| 0.5131 | 0.57 | 800 | 0.4103 | 0.3293 |
| 0.4846 | 0.86 | 1200 | 0.3493 | 0.3028 |
| 0.4174 | 1.14 | 1600 | 0.3055 | 0.2730 |
| 0.4105 | 1.43 | 2000 | 0.3283 | 0.3041 |
| 0.4028 | 1.72 | 2400 | 0.3539 | 0.3210 |
| 0.386 | 2.0 | 2800 | 0.2925 | 0.2690 |
| 0.3224 | 2.29 | 3200 | 0.2842 | 0.2665 |
| 0.3122 | 2.57 | 3600 | 0.2781 | 0.2472 |
| 0.3087 | 2.86 | 4000 | 0.2794 | 0.2692 |
| 0.2878 | 3.15 | 4400 | 0.2795 | 0.2537 |
| 0.2915 | 3.43 | 4800 | 0.2764 | 0.2478 |
| 0.2816 | 3.72 | 5200 | 0.2761 | 0.2366 |
| 0.283 | 4.0 | 5600 | 0.2641 | 0.2587 |
| 0.2448 | 4.29 | 6000 | 0.2489 | 0.2417 |
| 0.247 | 4.57 | 6400 | 0.2538 | 0.2422 |
| 0.25 | 4.86 | 6800 | 0.2660 | 0.2306 |
| 0.2256 | 5.15 | 7200 | 0.2477 | 0.2267 |
| 0.2225 | 5.43 | 7600 | 0.2364 | 0.2195 |
| 0.2217 | 5.72 | 8000 | 0.2319 | 0.2139 |
| 0.2272 | 6.0 | 8400 | 0.2489 | 0.2427 |
| 0.2016 | 6.29 | 8800 | 0.2404 | 0.2181 |
| 0.1973 | 6.58 | 9200 | 0.2532 | 0.2273 |
| 0.2101 | 6.86 | 9600 | 0.2590 | 0.2100 |
| 0.1946 | 7.15 | 10000 | 0.2414 | 0.2108 |
| 0.1845 | 7.43 | 10400 | 0.2485 | 0.2124 |
| 0.1861 | 7.72 | 10800 | 0.2405 | 0.2124 |
| 0.1851 | 8.01 | 11200 | 0.2449 | 0.2062 |
| 0.1587 | 8.29 | 11600 | 0.2510 | 0.2048 |
| 0.1694 | 8.58 | 12000 | 0.2290 | 0.2059 |
| 0.1637 | 8.86 | 12400 | 0.2376 | 0.2063 |
| 0.1594 | 9.15 | 12800 | 0.2307 | 0.1967 |
| 0.1537 | 9.44 | 13200 | 0.2274 | 0.2017 |
| 0.1498 | 9.72 | 13600 | 0.2322 | 0.2025 |
| 0.1516 | 10.01 | 14000 | 0.2323 | 0.1971 |
| 0.1336 | 10.29 | 14400 | 0.2249 | 0.1920 |
| 0.134 | 10.58 | 14800 | 0.2258 | 0.2055 |
| 0.138 | 10.86 | 15200 | 0.2250 | 0.1906 |
| 0.13 | 11.15 | 15600 | 0.2423 | 0.1920 |
| 0.1302 | 11.44 | 16000 | 0.2294 | 0.1849 |
| 0.1253 | 11.72 | 16400 | 0.2193 | 0.1889 |
| 0.1219 | 12.01 | 16800 | 0.2350 | 0.1869 |
| 0.1149 | 12.29 | 17200 | 0.2350 | 0.1903 |
| 0.1161 | 12.58 | 17600 | 0.2277 | 0.1899 |
| 0.1129 | 12.87 | 18000 | 0.2416 | 0.1855 |
| 0.1091 | 13.15 | 18400 | 0.2289 | 0.1815 |
| 0.1073 | 13.44 | 18800 | 0.2383 | 0.1799 |
| 0.1135 | 13.72 | 19200 | 0.2306 | 0.1819 |
| 0.1075 | 14.01 | 19600 | 0.2283 | 0.1742 |
| 0.0971 | 14.3 | 20000 | 0.2271 | 0.1851 |
| 0.0967 | 14.58 | 20400 | 0.2395 | 0.1809 |
| 0.1039 | 14.87 | 20800 | 0.2286 | 0.1808 |
| 0.0984 | 15.15 | 21200 | 0.2303 | 0.1821 |
| 0.0922 | 15.44 | 21600 | 0.2254 | 0.1745 |
| 0.0882 | 15.73 | 22000 | 0.2280 | 0.1836 |
| 0.0859 | 16.01 | 22400 | 0.2355 | 0.1779 |
| 0.0832 | 16.3 | 22800 | 0.2347 | 0.1740 |
| 0.0854 | 16.58 | 23200 | 0.2342 | 0.1739 |
| 0.0874 | 16.87 | 23600 | 0.2316 | 0.1719 |
| 0.0808 | 17.16 | 24000 | 0.2291 | 0.1730 |
| 0.0741 | 17.44 | 24400 | 0.2308 | 0.1674 |
| 0.0815 | 17.73 | 24800 | 0.2329 | 0.1655 |
| 0.0764 | 18.01 | 25200 | 0.2514 | 0.1711 |
| 0.0719 | 18.3 | 25600 | 0.2275 | 0.1578 |
| 0.0665 | 18.58 | 26000 | 0.2367 | 0.1614 |
| 0.0693 | 18.87 | 26400 | 0.2185 | 0.1593 |
| 0.0662 | 19.16 | 26800 | 0.2266 | 0.1678 |
| 0.0612 | 19.44 | 27200 | 0.2332 | 0.1602 |
| 0.0623 | 19.73 | 27600 | 0.2283 | 0.1670 |
| 0.0659 | 20.01 | 28000 | 0.2142 | 0.1626 |
| 0.0581 | 20.3 | 28400 | 0.2198 | 0.1646 |
| 0.063 | 20.59 | 28800 | 0.2251 | 0.1588 |
| 0.0618 | 20.87 | 29200 | 0.2186 | 0.1554 |
| 0.0549 | 21.16 | 29600 | 0.2251 | 0.1490 |
| 0.058 | 21.44 | 30000 | 0.2366 | 0.1559 |
| 0.0543 | 21.73 | 30400 | 0.2262 | 0.1535 |
| 0.0529 | 22.02 | 30800 | 0.2358 | 0.1519 |
| 0.053 | 22.3 | 31200 | 0.2198 | 0.1513 |
| 0.0552 | 22.59 | 31600 | 0.2234 | 0.1503 |
| 0.0492 | 22.87 | 32000 | 0.2191 | 0.1516 |
| 0.0488 | 23.16 | 32400 | 0.2321 | 0.1500 |
| 0.0479 | 23.45 | 32800 | 0.2152 | 0.1420 |
| 0.0453 | 23.73 | 33200 | 0.2202 | 0.1453 |
| 0.0485 | 24.02 | 33600 | 0.2235 | 0.1468 |
| 0.0451 | 24.3 | 34000 | 0.2192 | 0.1455 |
| 0.041 | 24.59 | 34400 | 0.2138 | 0.1438 |
| 0.0435 | 24.87 | 34800 | 0.2335 | 0.1423 |
| 0.0404 | 25.16 | 35200 | 0.2220 | 0.1409 |
| 0.0374 | 25.45 | 35600 | 0.2366 | 0.1437 |
| 0.0405 | 25.73 | 36000 | 0.2233 | 0.1428 |
| 0.0385 | 26.02 | 36400 | 0.2208 | 0.1414 |
| 0.0373 | 26.3 | 36800 | 0.2265 | 0.1420 |
| 0.0365 | 26.59 | 37200 | 0.2174 | 0.1402 |
| 0.037 | 26.88 | 37600 | 0.2249 | 0.1397 |
| 0.0379 | 27.16 | 38000 | 0.2173 | 0.1374 |
| 0.0354 | 27.45 | 38400 | 0.2212 | 0.1381 |
| 0.034 | 27.73 | 38800 | 0.2313 | 0.1364 |
| 0.0347 | 28.02 | 39200 | 0.2230 | 0.1356 |
| 0.0318 | 28.31 | 39600 | 0.2231 | 0.1357 |
| 0.0305 | 28.59 | 40000 | 0.2281 | 0.1366 |
| 0.0307 | 28.88 | 40400 | 0.2259 | 0.1342 |
| 0.0315 | 29.16 | 40800 | 0.2252 | 0.1332 |
| 0.0314 | 29.45 | 41200 | 0.2218 | 0.1328 |
| 0.0307 | 29.74 | 41600 | 0.2206 | 0.1322 |

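The `Wer` column above is word error rate. A small sketch of how it can be computed with the pinned `datasets` version's `load_metric` helper (requires the `jiwer` package; the strings are illustrative, not from this run):

```python
from datasets import load_metric

wer_metric = load_metric("wer")  # ships with datasets 2.2.1; backed by jiwer

# Illustrative strings; in training, predictions come from CTC decoding.
predictions = ["ola mundo"]
references = ["olá mundo"]

# One substitution over two reference words -> WER = 0.5
print(wer_metric.compute(predictions=predictions, references=references))
```
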
### Framework versions

- Transformers 4.17.0
- Pytorch 1.8.1+cu111
- Datasets 2.2.1
- Tokenizers 0.12.1