van-ng committed (verified)
Commit dab6c20 · Parent: 2398262

Model save

Files changed (2):
  1. README.md +204 -34
  2. pytorch_model.bin +1 -1
README.md CHANGED
@@ -22,7 +22,7 @@ model-index:
  metrics:
  - name: Accuracy
    type: accuracy
- value: 0.061946902654867256
  ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the minds14 dataset.
 It achieves the following results on the evaluation set:
- - Loss: 2.7674
- - Accuracy: 0.0619
 
 ## Model description
 
@@ -61,42 +61,212 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
- - num_epochs: 30
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | No log | 0.98 | 14 | 2.6369 | 0.1062 |
- | No log | 1.96 | 28 | 2.6476 | 0.0973 |
- | 2.6346 | 2.95 | 42 | 2.6571 | 0.0973 |
- | 2.6346 | 4.0 | 57 | 2.6573 | 0.0796 |
- | 2.6206 | 4.98 | 71 | 2.6699 | 0.0708 |
- | 2.6206 | 5.96 | 85 | 2.6687 | 0.0619 |
- | 2.5993 | 6.95 | 99 | 2.6739 | 0.0619 |
- | 2.5993 | 8.0 | 114 | 2.6755 | 0.0531 |
- | 2.5752 | 8.98 | 128 | 2.6848 | 0.0619 |
- | 2.5752 | 9.96 | 142 | 2.6820 | 0.0354 |
- | 2.5487 | 10.95 | 156 | 2.6892 | 0.0354 |
- | 2.5487 | 12.0 | 171 | 2.6989 | 0.0442 |
- | 2.5112 | 12.98 | 185 | 2.7059 | 0.0354 |
- | 2.5112 | 13.96 | 199 | 2.7208 | 0.0442 |
- | 2.4728 | 14.95 | 213 | 2.7136 | 0.0442 |
- | 2.4728 | 16.0 | 228 | 2.7208 | 0.0442 |
- | 2.4331 | 16.98 | 242 | 2.7166 | 0.0265 |
- | 2.4331 | 17.96 | 256 | 2.7288 | 0.0442 |
- | 2.3926 | 18.95 | 270 | 2.7281 | 0.0354 |
- | 2.3926 | 20.0 | 285 | 2.7297 | 0.0531 |
- | 2.3926 | 20.98 | 299 | 2.7471 | 0.0531 |
- | 2.341 | 21.96 | 313 | 2.7498 | 0.0619 |
- | 2.341 | 22.95 | 327 | 2.7535 | 0.0619 |
- | 2.2983 | 24.0 | 342 | 2.7534 | 0.0442 |
- | 2.2983 | 24.98 | 356 | 2.7647 | 0.0442 |
- | 2.2703 | 25.96 | 370 | 2.7696 | 0.0708 |
- | 2.2703 | 26.95 | 384 | 2.7656 | 0.0531 |
- | 2.238 | 28.0 | 399 | 2.7689 | 0.0619 |
- | 2.238 | 28.98 | 413 | 2.7667 | 0.0619 |
- | 2.213 | 29.47 | 420 | 2.7674 | 0.0619 |
 
 
 ### Framework versions
 
@@ -22,7 +22,7 @@ model-index:
  metrics:
  - name: Accuracy
    type: accuracy
+ value: 0.07058823529411765
  ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the minds14 dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 7.7654
+ - Accuracy: 0.0706
 
 ## Model description
 
 
@@ -61,42 +61,212 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 200
 
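As a reading aid (not part of the generated card): with `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1`, the Trainer ramps the learning rate linearly from zero over the first 10% of optimizer steps, then decays it linearly back to zero. At 15 steps per epoch (visible in the training-results table) and 200 epochs, that is 3000 total steps, so warmup ends at step 300. A minimal sketch of the multiplier such a schedule applies to the base learning rate (the base rate itself is not shown in this hunk):

```python
def linear_schedule_factor(step: int, total_steps: int, warmup_ratio: float = 0.1) -> float:
    """Multiplier applied to the base learning rate at a given optimizer step.

    Mirrors a linear-warmup + linear-decay schedule (the Trainer's 'linear'
    lr_scheduler_type with lr_scheduler_warmup_ratio=0.1).
    """
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return step / max(1, warmup_steps)  # ramp 0 -> 1 during warmup
    # decay 1 -> 0 over the remaining steps
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# This run: 200 epochs x 15 steps/epoch = 3000 steps, warmup ends at step 300.
total = 3000
print(linear_schedule_factor(0, total))     # 0.0
print(linear_schedule_factor(300, total))   # 1.0 (warmup just finished)
print(linear_schedule_factor(1650, total))  # 0.5 (halfway through decay)
print(linear_schedule_factor(3000, total))  # 0.0
```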
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
+ | 2.6382 | 1.0 | 15 | 2.6424 | 0.1059 |
+ | 2.6403 | 2.0 | 30 | 2.6453 | 0.0706 |
+ | 2.6356 | 3.0 | 45 | 2.6431 | 0.0471 |
+ | 2.6339 | 4.0 | 60 | 2.6444 | 0.0471 |
+ | 2.6314 | 5.0 | 75 | 2.6529 | 0.0471 |
+ | 2.6296 | 6.0 | 90 | 2.6488 | 0.0588 |
+ | 2.6242 | 7.0 | 105 | 2.6569 | 0.0824 |
+ | 2.6215 | 8.0 | 120 | 2.6629 | 0.0353 |
+ | 2.6134 | 9.0 | 135 | 2.6670 | 0.0588 |
+ | 2.611 | 10.0 | 150 | 2.6836 | 0.0235 |
+ | 2.5978 | 11.0 | 165 | 2.6735 | 0.0706 |
+ | 2.5974 | 12.0 | 180 | 2.6879 | 0.0235 |
+ | 2.5974 | 13.0 | 195 | 2.6895 | 0.0471 |
+ | 2.5914 | 14.0 | 210 | 2.7066 | 0.0235 |
+ | 2.5677 | 15.0 | 225 | 2.7058 | 0.0353 |
+ | 2.567 | 16.0 | 240 | 2.7118 | 0.0471 |
+ | 2.5579 | 17.0 | 255 | 2.7248 | 0.0471 |
+ | 2.559 | 18.0 | 270 | 2.6907 | 0.0353 |
+ | 2.5335 | 19.0 | 285 | 2.7091 | 0.0706 |
+ | 2.5327 | 20.0 | 300 | 2.7387 | 0.0824 |
+ | 2.499 | 21.0 | 315 | 2.7275 | 0.0235 |
+ | 2.4624 | 22.0 | 330 | 2.7613 | 0.0706 |
+ | 2.4557 | 23.0 | 345 | 2.7961 | 0.0235 |
+ | 2.4194 | 24.0 | 360 | 2.8100 | 0.0235 |
+ | 2.4061 | 25.0 | 375 | 2.7907 | 0.0706 |
+ | 2.3704 | 26.0 | 390 | 2.8218 | 0.0588 |
+ | 2.3319 | 27.0 | 405 | 2.8153 | 0.0353 |
+ | 2.3214 | 28.0 | 420 | 2.8860 | 0.0824 |
+ | 2.3314 | 29.0 | 435 | 2.8715 | 0.0706 |
+ | 2.2418 | 30.0 | 450 | 2.8179 | 0.0588 |
+ | 2.2144 | 31.0 | 465 | 2.8949 | 0.0471 |
+ | 2.2343 | 32.0 | 480 | 2.8831 | 0.0706 |
+ | 2.1429 | 33.0 | 495 | 2.9078 | 0.0235 |
+ | 2.1252 | 34.0 | 510 | 2.8757 | 0.0824 |
+ | 2.0381 | 35.0 | 525 | 2.8561 | 0.0706 |
+ | 1.9718 | 36.0 | 540 | 2.9334 | 0.0235 |
+ | 1.9676 | 37.0 | 555 | 2.9418 | 0.0353 |
+ | 1.9167 | 38.0 | 570 | 3.0164 | 0.0353 |
+ | 1.8649 | 39.0 | 585 | 2.9759 | 0.0471 |
+ | 1.8494 | 40.0 | 600 | 2.9568 | 0.1059 |
+ | 1.768 | 41.0 | 615 | 2.9746 | 0.0588 |
+ | 1.6931 | 42.0 | 630 | 2.9224 | 0.0588 |
+ | 1.7221 | 43.0 | 645 | 2.9451 | 0.0941 |
+ | 1.6963 | 44.0 | 660 | 3.0814 | 0.0824 |
+ | 1.6515 | 45.0 | 675 | 2.9779 | 0.0941 |
+ | 1.5451 | 46.0 | 690 | 3.1134 | 0.1059 |
+ | 1.4958 | 47.0 | 705 | 2.9865 | 0.0824 |
+ | 1.432 | 48.0 | 720 | 3.1357 | 0.0588 |
+ | 1.4167 | 49.0 | 735 | 3.2270 | 0.0824 |
+ | 1.3876 | 50.0 | 750 | 3.2101 | 0.0353 |
+ | 1.3323 | 51.0 | 765 | 3.1803 | 0.0824 |
+ | 1.2767 | 52.0 | 780 | 3.2781 | 0.0353 |
+ | 1.3037 | 53.0 | 795 | 3.2814 | 0.0588 |
+ | 1.214 | 54.0 | 810 | 3.3239 | 0.0941 |
+ | 1.1582 | 55.0 | 825 | 3.2247 | 0.1176 |
+ | 1.1028 | 56.0 | 840 | 3.3801 | 0.0706 |
+ | 1.117 | 57.0 | 855 | 3.3110 | 0.0941 |
+ | 1.0498 | 58.0 | 870 | 3.3820 | 0.0588 |
+ | 0.9688 | 59.0 | 885 | 3.2971 | 0.0706 |
+ | 0.9991 | 60.0 | 900 | 3.4578 | 0.0706 |
+ | 0.915 | 61.0 | 915 | 3.5240 | 0.0706 |
+ | 0.9858 | 62.0 | 930 | 3.4743 | 0.0706 |
+ | 0.8826 | 63.0 | 945 | 3.4516 | 0.0588 |
+ | 0.8748 | 64.0 | 960 | 3.4834 | 0.0824 |
+ | 0.8671 | 65.0 | 975 | 3.4300 | 0.0706 |
+ | 0.8005 | 66.0 | 990 | 3.5403 | 0.0588 |
+ | 0.7662 | 67.0 | 1005 | 3.6394 | 0.0588 |
+ | 0.7789 | 68.0 | 1020 | 3.6355 | 0.0235 |
+ | 0.6816 | 69.0 | 1035 | 3.7145 | 0.0235 |
+ | 0.678 | 70.0 | 1050 | 3.7057 | 0.0353 |
+ | 0.6307 | 71.0 | 1065 | 3.6650 | 0.0588 |
+ | 0.6853 | 72.0 | 1080 | 3.7011 | 0.0353 |
+ | 0.5857 | 73.0 | 1095 | 3.6480 | 0.0706 |
+ | 0.5405 | 74.0 | 1110 | 3.7454 | 0.0588 |
+ | 0.6295 | 75.0 | 1125 | 3.6397 | 0.0824 |
+ | 0.5667 | 76.0 | 1140 | 3.6528 | 0.0588 |
+ | 0.5558 | 77.0 | 1155 | 3.8219 | 0.0471 |
+ | 0.4908 | 78.0 | 1170 | 3.9318 | 0.0353 |
+ | 0.4427 | 79.0 | 1185 | 3.8695 | 0.0471 |
+ | 0.4437 | 80.0 | 1200 | 4.0755 | 0.0353 |
+ | 0.3798 | 81.0 | 1215 | 4.0077 | 0.0353 |
+ | 0.477 | 82.0 | 1230 | 3.9117 | 0.0706 |
+ | 0.4199 | 83.0 | 1245 | 4.1337 | 0.0471 |
+ | 0.4037 | 84.0 | 1260 | 4.0306 | 0.0235 |
+ | 0.3283 | 85.0 | 1275 | 4.1248 | 0.0471 |
+ | 0.4361 | 86.0 | 1290 | 4.0707 | 0.0235 |
+ | 0.3949 | 87.0 | 1305 | 4.2368 | 0.0235 |
+ | 0.3577 | 88.0 | 1320 | 4.2299 | 0.0706 |
+ | 0.2885 | 89.0 | 1335 | 4.3665 | 0.0471 |
+ | 0.2737 | 90.0 | 1350 | 4.1773 | 0.0706 |
+ | 0.3 | 91.0 | 1365 | 4.5002 | 0.0471 |
+ | 0.2936 | 92.0 | 1380 | 4.5914 | 0.0235 |
+ | 0.3035 | 93.0 | 1395 | 4.3489 | 0.0471 |
+ | 0.2401 | 94.0 | 1410 | 4.3683 | 0.0706 |
+ | 0.1996 | 95.0 | 1425 | 4.4946 | 0.0588 |
+ | 0.232 | 96.0 | 1440 | 4.6429 | 0.0588 |
+ | 0.291 | 97.0 | 1455 | 4.5975 | 0.0353 |
+ | 0.2111 | 98.0 | 1470 | 4.5378 | 0.0353 |
+ | 0.1986 | 99.0 | 1485 | 4.5688 | 0.0471 |
+ | 0.2242 | 100.0 | 1500 | 4.5640 | 0.0118 |
+ | 0.1679 | 101.0 | 1515 | 4.7323 | 0.0588 |
+ | 0.1897 | 102.0 | 1530 | 4.6266 | 0.0235 |
+ | 0.2212 | 103.0 | 1545 | 4.8046 | 0.0706 |
+ | 0.2138 | 104.0 | 1560 | 4.6699 | 0.0588 |
+ | 0.1921 | 105.0 | 1575 | 4.7727 | 0.0235 |
+ | 0.1625 | 106.0 | 1590 | 4.8053 | 0.0471 |
+ | 0.1206 | 107.0 | 1605 | 5.0319 | 0.0588 |
+ | 0.1841 | 108.0 | 1620 | 4.9295 | 0.0353 |
+ | 0.1288 | 109.0 | 1635 | 4.9922 | 0.0588 |
+ | 0.1647 | 110.0 | 1650 | 5.1317 | 0.0235 |
+ | 0.1758 | 111.0 | 1665 | 5.1606 | 0.0353 |
+ | 0.1281 | 112.0 | 1680 | 5.2126 | 0.0118 |
+ | 0.1925 | 113.0 | 1695 | 5.1029 | 0.0471 |
+ | 0.1177 | 114.0 | 1710 | 5.3407 | 0.0353 |
+ | 0.128 | 115.0 | 1725 | 4.9612 | 0.0706 |
+ | 0.1078 | 116.0 | 1740 | 5.4318 | 0.0235 |
+ | 0.0747 | 117.0 | 1755 | 5.3757 | 0.0588 |
+ | 0.1359 | 118.0 | 1770 | 5.3949 | 0.0471 |
+ | 0.0971 | 119.0 | 1785 | 5.4532 | 0.0471 |
+ | 0.0671 | 120.0 | 1800 | 5.6330 | 0.0353 |
+ | 0.0819 | 121.0 | 1815 | 5.5617 | 0.0471 |
+ | 0.0892 | 122.0 | 1830 | 5.6881 | 0.0353 |
+ | 0.0861 | 123.0 | 1845 | 5.7083 | 0.0353 |
+ | 0.0649 | 124.0 | 1860 | 5.8477 | 0.0824 |
+ | 0.0674 | 125.0 | 1875 | 5.6822 | 0.0588 |
+ | 0.0788 | 126.0 | 1890 | 5.7720 | 0.0824 |
+ | 0.0439 | 127.0 | 1905 | 5.8210 | 0.0706 |
+ | 0.0586 | 128.0 | 1920 | 5.9101 | 0.0588 |
+ | 0.0674 | 129.0 | 1935 | 5.7681 | 0.0588 |
+ | 0.0563 | 130.0 | 1950 | 5.7770 | 0.0824 |
+ | 0.0284 | 131.0 | 1965 | 6.1912 | 0.0588 |
+ | 0.0717 | 132.0 | 1980 | 6.0938 | 0.0588 |
+ | 0.0424 | 133.0 | 1995 | 6.0714 | 0.0824 |
+ | 0.0768 | 134.0 | 2010 | 6.1924 | 0.0706 |
+ | 0.0592 | 135.0 | 2025 | 6.5515 | 0.0118 |
+ | 0.0217 | 136.0 | 2040 | 6.1961 | 0.0706 |
+ | 0.0544 | 137.0 | 2055 | 6.4168 | 0.0353 |
+ | 0.0417 | 138.0 | 2070 | 6.4916 | 0.0588 |
+ | 0.0339 | 139.0 | 2085 | 6.6678 | 0.0235 |
+ | 0.0208 | 140.0 | 2100 | 6.4968 | 0.0588 |
+ | 0.0436 | 141.0 | 2115 | 6.5245 | 0.0588 |
+ | 0.033 | 142.0 | 2130 | 6.6816 | 0.0706 |
+ | 0.037 | 143.0 | 2145 | 6.3041 | 0.0824 |
+ | 0.0132 | 144.0 | 2160 | 6.6597 | 0.0588 |
+ | 0.0484 | 145.0 | 2175 | 6.6440 | 0.0824 |
+ | 0.0264 | 146.0 | 2190 | 6.7801 | 0.0353 |
+ | 0.0115 | 147.0 | 2205 | 6.7156 | 0.0471 |
+ | 0.027 | 148.0 | 2220 | 6.7250 | 0.0706 |
+ | 0.0394 | 149.0 | 2235 | 6.8474 | 0.0706 |
+ | 0.0113 | 150.0 | 2250 | 6.8180 | 0.0824 |
+ | 0.0157 | 151.0 | 2265 | 6.8688 | 0.0824 |
+ | 0.0385 | 152.0 | 2280 | 6.8874 | 0.0824 |
+ | 0.0224 | 153.0 | 2295 | 7.0014 | 0.0706 |
+ | 0.0522 | 154.0 | 2310 | 7.1680 | 0.0706 |
+ | 0.0099 | 155.0 | 2325 | 7.1595 | 0.0471 |
+ | 0.01 | 156.0 | 2340 | 7.1259 | 0.0471 |
+ | 0.0144 | 157.0 | 2355 | 7.1538 | 0.0471 |
+ | 0.0175 | 158.0 | 2370 | 7.0335 | 0.0706 |
+ | 0.008 | 159.0 | 2385 | 7.0295 | 0.0588 |
+ | 0.0311 | 160.0 | 2400 | 7.1288 | 0.0706 |
+ | 0.0416 | 161.0 | 2415 | 7.1012 | 0.0471 |
+ | 0.0333 | 162.0 | 2430 | 7.3391 | 0.0588 |
+ | 0.0241 | 163.0 | 2445 | 7.2666 | 0.0588 |
+ | 0.0068 | 164.0 | 2460 | 7.1324 | 0.0706 |
+ | 0.0194 | 165.0 | 2475 | 7.1494 | 0.0824 |
+ | 0.0089 | 166.0 | 2490 | 7.2136 | 0.0824 |
+ | 0.0071 | 167.0 | 2505 | 7.2442 | 0.0706 |
+ | 0.0174 | 168.0 | 2520 | 7.3070 | 0.0588 |
+ | 0.0056 | 169.0 | 2535 | 7.3370 | 0.0588 |
+ | 0.0054 | 170.0 | 2550 | 7.3814 | 0.0588 |
+ | 0.0087 | 171.0 | 2565 | 7.3903 | 0.0588 |
+ | 0.0052 | 172.0 | 2580 | 7.4102 | 0.0588 |
+ | 0.0255 | 173.0 | 2595 | 7.3886 | 0.0588 |
+ | 0.0056 | 174.0 | 2610 | 7.4785 | 0.0588 |
+ | 0.005 | 175.0 | 2625 | 7.5349 | 0.0588 |
+ | 0.0078 | 176.0 | 2640 | 7.5136 | 0.0588 |
+ | 0.0214 | 177.0 | 2655 | 7.5146 | 0.0706 |
+ | 0.0827 | 178.0 | 2670 | 7.5079 | 0.0706 |
+ | 0.0046 | 179.0 | 2685 | 7.5157 | 0.0941 |
+ | 0.0098 | 180.0 | 2700 | 7.5161 | 0.0941 |
+ | 0.0049 | 181.0 | 2715 | 7.5169 | 0.0824 |
+ | 0.0063 | 182.0 | 2730 | 7.5643 | 0.0824 |
+ | 0.0147 | 183.0 | 2745 | 7.6032 | 0.0824 |
+ | 0.0279 | 184.0 | 2760 | 7.6901 | 0.0706 |
+ | 0.0044 | 185.0 | 2775 | 7.7511 | 0.0706 |
+ | 0.0106 | 186.0 | 2790 | 7.6778 | 0.0588 |
+ | 0.0042 | 187.0 | 2805 | 7.6374 | 0.0588 |
+ | 0.0043 | 188.0 | 2820 | 7.6470 | 0.0706 |
+ | 0.0279 | 189.0 | 2835 | 7.6876 | 0.0706 |
+ | 0.0089 | 190.0 | 2850 | 7.6849 | 0.0706 |
+ | 0.0128 | 191.0 | 2865 | 7.6929 | 0.0706 |
+ | 0.0046 | 192.0 | 2880 | 7.6891 | 0.0706 |
+ | 0.0041 | 193.0 | 2895 | 7.7036 | 0.0706 |
+ | 0.0215 | 194.0 | 2910 | 7.7134 | 0.0706 |
+ | 0.0041 | 195.0 | 2925 | 7.7340 | 0.0706 |
+ | 0.0041 | 196.0 | 2940 | 7.7637 | 0.0706 |
+ | 0.0278 | 197.0 | 2955 | 7.7695 | 0.0706 |
+ | 0.004 | 198.0 | 2970 | 7.7673 | 0.0706 |
+ | 0.0039 | 199.0 | 2985 | 7.7663 | 0.0706 |
+ | 0.01 | 200.0 | 3000 | 7.7654 | 0.0706 |
 
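One observation the generated card does not spell out: minds14 has 14 intent classes, so chance accuracy is 1/14 ≈ 0.0714. The accuracy column never meaningfully rises above that, while validation loss climbs from 2.64 to 7.77 as training loss falls toward zero, i.e. the run overfits the training split without learning the task. The header's reported accuracy, 0.07058823529411765, is also consistent with exactly 6 correct predictions on an 85-example eval split (an inference, not stated in the card). A quick sanity check of that arithmetic:

```python
from fractions import Fraction

chance = 1 / 14                      # 14 intent classes in minds14
final_acc = 0.07058823529411765      # accuracy reported in the model card

# The reported float is a simple small fraction, consistent with
# 6 correct predictions out of 85 eval examples.
frac = Fraction(final_acc).limit_denominator(200)
print(frac)              # 6/85
print(round(chance, 4))  # 0.0714
```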
 
 ### Framework versions
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:218ebe8805ccd908b74970342d5647833f879de2f316b0fa5c82d317a69feabd
+ oid sha256:e086d125e607e72c24eab8785b7b3ce290db9c0ce5b26461d57561208ef53301
 size 378363238
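For context on the pytorch_model.bin change: the repository stores only a Git LFS pointer (`version`, `oid sha256:…`, `size`), so this commit swaps the sha256 that addresses the actual ~378 MB weights held in LFS storage. A minimal sketch of verifying a downloaded blob against such a pointer, using a stand-in blob rather than the real checkpoint:

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Parse the key/value lines of a Git LFS pointer file."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo,
            "oid": digest, "size": int(fields["size"])}

def matches_pointer(blob: bytes, pointer: dict) -> bool:
    """True if the blob's byte size and SHA-256 both match the pointer."""
    return (len(blob) == pointer["size"]
            and hashlib.sha256(blob).hexdigest() == pointer["oid"])

# Stand-in example (not the real 378363238-byte checkpoint):
blob = b"dummy model weights"
pointer_text = (
    "version https://git-lfs.github.com/spec/v1\n"
    f"oid sha256:{hashlib.sha256(blob).hexdigest()}\n"
    f"size {len(blob)}\n"
)
print(matches_pointer(blob, parse_lfs_pointer(pointer_text)))  # True
```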