Sphincz committed on
Commit
fa2fa06
·
verified ·
1 Parent(s): a582d75

End of training

Browse files
README.md ADDED
@@ -0,0 +1,407 @@
1
+ ---
2
+ library_name: transformers
3
+ tags:
4
+ - generated_from_trainer
5
+ model-index:
6
+ - name: monolang_roberta
7
+ results: []
8
+ ---
9
+
10
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
11
+ should probably proofread and complete it, then remove this comment. -->
12
+
13
+ [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/iscteiul/CVTT/runs/0jwmsbus)
14
+ # monolang_roberta
15
+
16
+ This model is a fine-tuned version of an unspecified base model on an unknown dataset.
17
+ It achieves the following results on the evaluation set:
18
+ - Loss: 0.9382
19
+
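Assuming the reported loss is the mean cross-entropy over masked tokens, this corresponds to a pseudo-perplexity of roughly exp(0.9382) ≈ 2.56 on the evaluation set.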
20
+ ## Model description
21
+
22
+ More information needed
23
+
24
+ ## Intended uses & limitations
25
+
26
+ More information needed
27
+
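As a rough guide while the section above is incomplete: the config declares `RobertaForMaskedLM`, so the most direct use is masked-token prediction (fill-mask). The sketch below assumes the checkpoint is published as `Sphincz/monolang_roberta` (inferred from the commit author and model name, not stated in this card), and the example sentence is purely illustrative since the training language is not documented.

```python
# Minimal fill-mask sketch, under the assumptions stated above.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Sphincz/monolang_roberta")

# <mask> is the mask token defined in special_tokens_map.json.
for prediction in fill_mask("The capital of France is <mask>."):
    print(prediction["token_str"], round(prediction["score"], 4))
```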
28
+ ## Training and evaluation data
29
+
30
+ More information needed
31
+
32
+ ## Training procedure
33
+
34
+ ### Training hyperparameters
35
+
36
+ The following hyperparameters were used during training:
37
+ - learning_rate: 5e-05
38
+ - train_batch_size: 128
39
+ - eval_batch_size: 128
40
+ - seed: 42
41
+ - gradient_accumulation_steps: 8
42
+ - total_train_batch_size: 1024
43
+ - optimizer: AdamW (`OptimizerNames.ADAMW_HF`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
44
+ - lr_scheduler_type: linear
45
+ - num_epochs: 200
46
+ - mixed_precision_training: Native AMP
47
+
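For reference, the list above can be approximated with Hugging Face `TrainingArguments` as in the hedged sketch below; only the listed values are taken from this card, while the output directory and the evaluation cadence are assumptions (the results table suggests evaluation every 100 steps), and the exact configuration lives in `training_args.bin`.

```python
# Hedged reconstruction of the hyperparameters listed above; not the stored training_args.bin.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="monolang_roberta",   # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=8,   # 128 * 8 = 1024 total train batch size
    num_train_epochs=200,
    lr_scheduler_type="linear",
    seed=42,
    optim="adamw_hf",                # OptimizerNames.ADAMW_HF
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                       # "Native AMP" mixed precision
    eval_strategy="steps",           # assumed from the 100-step cadence in the results table
    eval_steps=100,
)
```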
48
+ ### Training results
49
+
50
+ | Training Loss | Epoch | Step | Validation Loss |
51
+ |:-------------:|:--------:|:-----:|:---------------:|
52
+ | 4.3759 | 0.5735 | 100 | 4.2809 |
53
+ | 3.8527 | 1.1434 | 200 | 3.7609 |
54
+ | 3.3289 | 1.7168 | 300 | 3.2264 |
55
+ | 2.9251 | 2.2867 | 400 | 2.8520 |
56
+ | 2.7058 | 2.8602 | 500 | 2.6128 |
57
+ | 2.5376 | 3.4301 | 600 | 2.4893 |
58
+ | 2.3017 | 4.0 | 700 | 2.3807 |
59
+ | 2.3375 | 4.5735 | 800 | 2.3030 |
60
+ | 2.2953 | 5.1434 | 900 | 2.2198 |
61
+ | 2.2365 | 5.7168 | 1000 | 2.1678 |
62
+ | 2.1709 | 6.2867 | 1100 | 2.1113 |
63
+ | 2.0904 | 6.8602 | 1200 | 2.0296 |
64
+ | 2.0443 | 7.4301 | 1300 | 1.9785 |
65
+ | 1.8951 | 8.0 | 1400 | 1.9323 |
66
+ | 1.9251 | 8.5735 | 1500 | 1.8842 |
67
+ | 1.8639 | 9.1434 | 1600 | 1.8533 |
68
+ | 1.8359 | 9.7168 | 1700 | 1.8028 |
69
+ | 1.8146 | 10.2867 | 1800 | 1.7665 |
70
+ | 1.8021 | 10.8602 | 1900 | 1.7489 |
71
+ | 1.7819 | 11.4301 | 2000 | 1.7204 |
72
+ | 1.6114 | 12.0 | 2100 | 1.6807 |
73
+ | 1.7171 | 12.5735 | 2200 | 1.6776 |
74
+ | 1.6929 | 13.1434 | 2300 | 1.6672 |
75
+ | 1.6784 | 13.7168 | 2400 | 1.6448 |
76
+ | 1.6724 | 14.2867 | 2500 | 1.5998 |
77
+ | 1.6372 | 14.8602 | 2600 | 1.5929 |
78
+ | 1.6156 | 15.4301 | 2700 | 1.5809 |
79
+ | 1.5284 | 16.0 | 2800 | 1.5574 |
80
+ | 1.5787 | 16.5735 | 2900 | 1.5392 |
81
+ | 1.5807 | 17.1434 | 3000 | 1.5195 |
82
+ | 1.5732 | 17.7168 | 3100 | 1.5140 |
83
+ | 1.5191 | 18.2867 | 3200 | 1.5144 |
84
+ | 1.5057 | 18.8602 | 3300 | 1.4768 |
85
+ | 1.5029 | 19.4301 | 3400 | 1.4749 |
86
+ | 1.3981 | 20.0 | 3500 | 1.4581 |
87
+ | 1.4977 | 20.5735 | 3600 | 1.4443 |
88
+ | 1.4729 | 21.1434 | 3700 | 1.4306 |
89
+ | 1.4637 | 21.7168 | 3800 | 1.4297 |
90
+ | 1.4432 | 22.2867 | 3900 | 1.4210 |
91
+ | 1.4343 | 22.8602 | 4000 | 1.4103 |
92
+ | 1.4179 | 23.4301 | 4100 | 1.3859 |
93
+ | 1.3212 | 24.0 | 4200 | 1.4049 |
94
+ | 1.4155 | 24.5735 | 4300 | 1.3742 |
95
+ | 1.3983 | 25.1434 | 4400 | 1.3840 |
96
+ | 1.3998 | 25.7168 | 4500 | 1.3647 |
97
+ | 1.3565 | 26.2867 | 4600 | 1.3581 |
98
+ | 1.3805 | 26.8602 | 4700 | 1.3466 |
99
+ | 1.3731 | 27.4301 | 4800 | 1.3374 |
100
+ | 1.2775 | 28.0 | 4900 | 1.3320 |
101
+ | 1.3217 | 28.5735 | 5000 | 1.3215 |
102
+ | 1.3404 | 29.1434 | 5100 | 1.3185 |
103
+ | 1.3235 | 29.7168 | 5200 | 1.3119 |
104
+ | 1.3445 | 30.2867 | 5300 | 1.3268 |
105
+ | 1.3243 | 30.8602 | 5400 | 1.3003 |
106
+ | 1.2853 | 31.4301 | 5500 | 1.2758 |
107
+ | 1.1974 | 32.0 | 5600 | 1.2880 |
108
+ | 1.2663 | 32.5735 | 5700 | 1.2987 |
109
+ | 1.2617 | 33.1434 | 5800 | 1.2877 |
110
+ | 1.2717 | 33.7168 | 5900 | 1.2772 |
111
+ | 1.2652 | 34.2867 | 6000 | 1.2786 |
112
+ | 1.2668 | 34.8602 | 6100 | 1.2532 |
113
+ | 1.2628 | 35.4301 | 6200 | 1.2404 |
114
+ | 1.1627 | 36.0 | 6300 | 1.2596 |
115
+ | 1.2476 | 36.5735 | 6400 | 1.2492 |
116
+ | 1.2457 | 37.1434 | 6500 | 1.2402 |
117
+ | 1.2393 | 37.7168 | 6600 | 1.2334 |
118
+ | 1.2378 | 38.2867 | 6700 | 1.2394 |
119
+ | 1.2272 | 38.8602 | 6800 | 1.2213 |
120
+ | 1.2204 | 39.4301 | 6900 | 1.2100 |
121
+ | 1.1453 | 40.0 | 7000 | 1.2134 |
122
+ | 1.1963 | 40.5735 | 7100 | 1.2253 |
123
+ | 1.2227 | 41.1434 | 7200 | 1.2188 |
124
+ | 1.2166 | 41.7168 | 7300 | 1.2008 |
125
+ | 1.2091 | 42.2867 | 7400 | 1.2055 |
126
+ | 1.2127 | 42.8602 | 7500 | 1.2055 |
127
+ | 1.1997 | 43.4301 | 7600 | 1.1971 |
128
+ | 1.0956 | 44.0 | 7700 | 1.1974 |
129
+ | 1.1672 | 44.5735 | 7800 | 1.1813 |
130
+ | 1.1746 | 45.1434 | 7900 | 1.1858 |
131
+ | 1.1569 | 45.7168 | 8000 | 1.1858 |
132
+ | 1.1644 | 46.2867 | 8100 | 1.1759 |
133
+ | 1.1629 | 46.8602 | 8200 | 1.1726 |
134
+ | 1.1744 | 47.4301 | 8300 | 1.1646 |
135
+ | 1.0728 | 48.0 | 8400 | 1.1666 |
136
+ | 1.1436 | 48.5735 | 8500 | 1.1801 |
137
+ | 1.126 | 49.1434 | 8600 | 1.1663 |
138
+ | 1.1428 | 49.7168 | 8700 | 1.1510 |
139
+ | 1.1298 | 50.2867 | 8800 | 1.1669 |
140
+ | 1.1083 | 50.8602 | 8900 | 1.1516 |
141
+ | 1.1404 | 51.4301 | 9000 | 1.1462 |
142
+ | 1.0507 | 52.0 | 9100 | 1.1323 |
143
+ | 1.1203 | 52.5735 | 9200 | 1.1381 |
144
+ | 1.1176 | 53.1434 | 9300 | 1.1368 |
145
+ | 1.1228 | 53.7168 | 9400 | 1.1321 |
146
+ | 1.1341 | 54.2867 | 9500 | 1.1388 |
147
+ | 1.1133 | 54.8602 | 9600 | 1.1385 |
148
+ | 1.0973 | 55.4301 | 9700 | 1.1329 |
149
+ | 1.0327 | 56.0 | 9800 | 1.1310 |
150
+ | 1.0939 | 56.5735 | 9900 | 1.1411 |
151
+ | 1.0856 | 57.1434 | 10000 | 1.1149 |
152
+ | 1.1024 | 57.7168 | 10100 | 1.1196 |
153
+ | 1.0865 | 58.2867 | 10200 | 1.1196 |
154
+ | 1.0842 | 58.8602 | 10300 | 1.1250 |
155
+ | 1.0786 | 59.4301 | 10400 | 1.1089 |
156
+ | 0.9986 | 60.0 | 10500 | 1.1002 |
157
+ | 1.0667 | 60.5735 | 10600 | 1.1080 |
158
+ | 1.064 | 61.1434 | 10700 | 1.0939 |
159
+ | 1.0784 | 61.7168 | 10800 | 1.1257 |
160
+ | 1.0663 | 62.2867 | 10900 | 1.0998 |
161
+ | 1.0794 | 62.8602 | 11000 | 1.0964 |
162
+ | 1.0457 | 63.4301 | 11100 | 1.0893 |
163
+ | 0.9931 | 64.0 | 11200 | 1.1005 |
164
+ | 1.0574 | 64.5735 | 11300 | 1.0909 |
165
+ | 1.0619 | 65.1434 | 11400 | 1.0873 |
166
+ | 1.0669 | 65.7168 | 11500 | 1.0914 |
167
+ | 1.0273 | 66.2867 | 11600 | 1.0926 |
168
+ | 1.0467 | 66.8602 | 11700 | 1.0774 |
169
+ | 1.0418 | 67.4301 | 11800 | 1.0900 |
170
+ | 0.9703 | 68.0 | 11900 | 1.0799 |
171
+ | 1.0118 | 68.5735 | 12000 | 1.0846 |
172
+ | 1.0261 | 69.1434 | 12100 | 1.0652 |
173
+ | 1.028 | 69.7168 | 12200 | 1.0750 |
174
+ | 1.0204 | 70.2867 | 12300 | 1.0674 |
175
+ | 1.0448 | 70.8602 | 12400 | 1.0690 |
176
+ | 0.9965 | 71.4301 | 12500 | 1.0509 |
177
+ | 0.9445 | 72.0 | 12600 | 1.0611 |
178
+ | 1.0125 | 72.5735 | 12700 | 1.0688 |
179
+ | 0.9895 | 73.1434 | 12800 | 1.0629 |
180
+ | 1.0051 | 73.7168 | 12900 | 1.0446 |
181
+ | 0.9876 | 74.2867 | 13000 | 1.0576 |
182
+ | 0.9867 | 74.8602 | 13100 | 1.0576 |
183
+ | 0.9902 | 75.4301 | 13200 | 1.0572 |
184
+ | 0.9248 | 76.0 | 13300 | 1.0636 |
185
+ | 0.9988 | 76.5735 | 13400 | 1.0623 |
186
+ | 0.998 | 77.1434 | 13500 | 1.0668 |
187
+ | 0.9777 | 77.7168 | 13600 | 1.0523 |
188
+ | 0.9915 | 78.2867 | 13700 | 1.0476 |
189
+ | 0.9914 | 78.8602 | 13800 | 1.0419 |
190
+ | 0.992 | 79.4301 | 13900 | 1.0407 |
191
+ | 0.922 | 80.0 | 14000 | 1.0459 |
192
+ | 0.9738 | 80.5735 | 14100 | 1.0424 |
193
+ | 0.9996 | 81.1434 | 14200 | 1.0459 |
194
+ | 0.9856 | 81.7168 | 14300 | 1.0328 |
195
+ | 0.9839 | 82.2867 | 14400 | 1.0587 |
196
+ | 0.9748 | 82.8602 | 14500 | 1.0369 |
197
+ | 0.9601 | 83.4301 | 14600 | 1.0350 |
198
+ | 0.8863 | 84.0 | 14700 | 1.0475 |
199
+ | 0.9639 | 84.5735 | 14800 | 1.0441 |
200
+ | 0.9671 | 85.1434 | 14900 | 1.0326 |
201
+ | 0.9515 | 85.7168 | 15000 | 1.0256 |
202
+ | 0.9309 | 86.2867 | 15100 | 1.0351 |
203
+ | 0.9479 | 86.8602 | 15200 | 1.0349 |
204
+ | 0.9738 | 87.4301 | 15300 | 1.0221 |
205
+ | 0.8831 | 88.0 | 15400 | 1.0311 |
206
+ | 0.9384 | 88.5735 | 15500 | 1.0231 |
207
+ | 0.9404 | 89.1434 | 15600 | 1.0369 |
208
+ | 0.9354 | 89.7168 | 15700 | 1.0220 |
209
+ | 0.9228 | 90.2867 | 15800 | 1.0354 |
210
+ | 0.9312 | 90.8602 | 15900 | 1.0232 |
211
+ | 0.9466 | 91.4301 | 16000 | 1.0174 |
212
+ | 0.8721 | 92.0 | 16100 | 1.0168 |
213
+ | 0.9328 | 92.5735 | 16200 | 1.0216 |
214
+ | 0.9331 | 93.1434 | 16300 | 1.0145 |
215
+ | 0.9435 | 93.7168 | 16400 | 1.0208 |
216
+ | 0.9253 | 94.2867 | 16500 | 1.0163 |
217
+ | 0.9231 | 94.8602 | 16600 | 1.0159 |
218
+ | 0.9273 | 95.4301 | 16700 | 1.0044 |
219
+ | 0.8571 | 96.0 | 16800 | 1.0047 |
220
+ | 0.9293 | 96.5735 | 16900 | 1.0111 |
221
+ | 0.9264 | 97.1434 | 17000 | 1.0088 |
222
+ | 0.9221 | 97.7168 | 17100 | 1.0074 |
223
+ | 0.9143 | 98.2867 | 17200 | 1.0123 |
224
+ | 0.9119 | 98.8602 | 17300 | 1.0049 |
225
+ | 0.9036 | 99.4301 | 17400 | 0.9973 |
226
+ | 0.869 | 100.0 | 17500 | 1.0031 |
227
+ | 0.9024 | 100.5735 | 17600 | 1.0054 |
228
+ | 0.896 | 101.1434 | 17700 | 0.9966 |
229
+ | 0.8883 | 101.7168 | 17800 | 1.0010 |
230
+ | 0.9053 | 102.2867 | 17900 | 1.0160 |
231
+ | 0.9263 | 102.8602 | 18000 | 1.0012 |
232
+ | 0.9042 | 103.4301 | 18100 | 1.0145 |
233
+ | 0.8341 | 104.0 | 18200 | 0.9961 |
234
+ | 0.9028 | 104.5735 | 18300 | 1.0040 |
235
+ | 0.9026 | 105.1434 | 18400 | 0.9844 |
236
+ | 0.8801 | 105.7168 | 18500 | 0.9907 |
237
+ | 0.8956 | 106.2867 | 18600 | 0.9983 |
238
+ | 0.9023 | 106.8602 | 18700 | 0.9840 |
239
+ | 0.8712 | 107.4301 | 18800 | 0.9946 |
240
+ | 0.8374 | 108.0 | 18900 | 0.9944 |
241
+ | 0.9102 | 108.5735 | 19000 | 0.9928 |
242
+ | 0.8914 | 109.1434 | 19100 | 0.9778 |
243
+ | 0.8852 | 109.7168 | 19200 | 0.9928 |
244
+ | 0.8826 | 110.2867 | 19300 | 0.9829 |
245
+ | 0.881 | 110.8602 | 19400 | 0.9747 |
246
+ | 0.878 | 111.4301 | 19500 | 0.9809 |
247
+ | 0.8219 | 112.0 | 19600 | 0.9707 |
248
+ | 0.8797 | 112.5735 | 19700 | 0.9727 |
249
+ | 0.8911 | 113.1434 | 19800 | 0.9759 |
250
+ | 0.8706 | 113.7168 | 19900 | 0.9809 |
251
+ | 0.8952 | 114.2867 | 20000 | 0.9871 |
252
+ | 0.8585 | 114.8602 | 20100 | 0.9777 |
253
+ | 0.8679 | 115.4301 | 20200 | 0.9745 |
254
+ | 0.8289 | 116.0 | 20300 | 0.9793 |
255
+ | 0.8647 | 116.5735 | 20400 | 0.9791 |
256
+ | 0.8711 | 117.1434 | 20500 | 0.9806 |
257
+ | 0.8734 | 117.7168 | 20600 | 0.9831 |
258
+ | 0.8427 | 118.2867 | 20700 | 0.9862 |
259
+ | 0.8678 | 118.8602 | 20800 | 0.9786 |
260
+ | 0.8687 | 119.4301 | 20900 | 0.9738 |
261
+ | 0.8072 | 120.0 | 21000 | 0.9758 |
262
+ | 0.854 | 120.5735 | 21100 | 0.9647 |
263
+ | 0.8679 | 121.1434 | 21200 | 0.9648 |
264
+ | 0.8652 | 121.7168 | 21300 | 0.9747 |
265
+ | 0.8621 | 122.2867 | 21400 | 0.9818 |
266
+ | 0.8446 | 122.8602 | 21500 | 0.9861 |
267
+ | 0.8618 | 123.4301 | 21600 | 0.9597 |
268
+ | 0.8102 | 124.0 | 21700 | 0.9710 |
269
+ | 0.8605 | 124.5735 | 21800 | 0.9672 |
270
+ | 0.845 | 125.1434 | 21900 | 0.9760 |
271
+ | 0.8452 | 125.7168 | 22000 | 0.9756 |
272
+ | 0.848 | 126.2867 | 22100 | 0.9624 |
273
+ | 0.8522 | 126.8602 | 22200 | 0.9559 |
274
+ | 0.8481 | 127.4301 | 22300 | 0.9682 |
275
+ | 0.7941 | 128.0 | 22400 | 0.9693 |
276
+ | 0.8381 | 128.5735 | 22500 | 0.9690 |
277
+ | 0.8498 | 129.1434 | 22600 | 0.9656 |
278
+ | 0.8558 | 129.7168 | 22700 | 0.9619 |
279
+ | 0.8241 | 130.2867 | 22800 | 0.9568 |
280
+ | 0.8422 | 130.8602 | 22900 | 0.9662 |
281
+ | 0.8446 | 131.4301 | 23000 | 0.9598 |
282
+ | 0.7789 | 132.0 | 23100 | 0.9545 |
283
+ | 0.8287 | 132.5735 | 23200 | 0.9655 |
284
+ | 0.8413 | 133.1434 | 23300 | 0.9517 |
285
+ | 0.8344 | 133.7168 | 23400 | 0.9638 |
286
+ | 0.8287 | 134.2867 | 23500 | 0.9715 |
287
+ | 0.8253 | 134.8602 | 23600 | 0.9619 |
288
+ | 0.8232 | 135.4301 | 23700 | 0.9617 |
289
+ | 0.7756 | 136.0 | 23800 | 0.9559 |
290
+ | 0.8306 | 136.5735 | 23900 | 0.9612 |
291
+ | 0.8383 | 137.1434 | 24000 | 0.9481 |
292
+ | 0.8176 | 137.7168 | 24100 | 0.9537 |
293
+ | 0.8292 | 138.2867 | 24200 | 0.9640 |
294
+ | 0.8332 | 138.8602 | 24300 | 0.9477 |
295
+ | 0.8296 | 139.4301 | 24400 | 0.9636 |
296
+ | 0.7812 | 140.0 | 24500 | 0.9598 |
297
+ | 0.8303 | 140.5735 | 24600 | 0.9559 |
298
+ | 0.8207 | 141.1434 | 24700 | 0.9397 |
299
+ | 0.8163 | 141.7168 | 24800 | 0.9461 |
300
+ | 0.8177 | 142.2867 | 24900 | 0.9570 |
301
+ | 0.8185 | 142.8602 | 25000 | 0.9555 |
302
+ | 0.821 | 143.4301 | 25100 | 0.9620 |
303
+ | 0.7833 | 144.0 | 25200 | 0.9386 |
304
+ | 0.8218 | 144.5735 | 25300 | 0.9523 |
305
+ | 0.8156 | 145.1434 | 25400 | 0.9508 |
306
+ | 0.8205 | 145.7168 | 25500 | 0.9559 |
307
+ | 0.8204 | 146.2867 | 25600 | 0.9569 |
308
+ | 0.8113 | 146.8602 | 25700 | 0.9489 |
309
+ | 0.8093 | 147.4301 | 25800 | 0.9483 |
310
+ | 0.7531 | 148.0 | 25900 | 0.9512 |
311
+ | 0.8023 | 148.5735 | 26000 | 0.9543 |
312
+ | 0.8088 | 149.1434 | 26100 | 0.9640 |
313
+ | 0.8054 | 149.7168 | 26200 | 0.9508 |
314
+ | 0.8164 | 150.2867 | 26300 | 0.9372 |
315
+ | 0.8307 | 150.8602 | 26400 | 0.9495 |
316
+ | 0.8084 | 151.4301 | 26500 | 0.9408 |
317
+ | 0.7723 | 152.0 | 26600 | 0.9561 |
318
+ | 0.7929 | 152.5735 | 26700 | 0.9448 |
319
+ | 0.7836 | 153.1434 | 26800 | 0.9495 |
320
+ | 0.8016 | 153.7168 | 26900 | 0.9471 |
321
+ | 0.8073 | 154.2867 | 27000 | 0.9490 |
322
+ | 0.811 | 154.8602 | 27100 | 0.9445 |
323
+ | 0.808 | 155.4301 | 27200 | 0.9497 |
324
+ | 0.7446 | 156.0 | 27300 | 0.9624 |
325
+ | 0.8005 | 156.5735 | 27400 | 0.9527 |
326
+ | 0.8003 | 157.1434 | 27500 | 0.9406 |
327
+ | 0.7985 | 157.7168 | 27600 | 0.9430 |
328
+ | 0.8044 | 158.2867 | 27700 | 0.9457 |
329
+ | 0.817 | 158.8602 | 27800 | 0.9419 |
330
+ | 0.8118 | 159.4301 | 27900 | 0.9360 |
331
+ | 0.7421 | 160.0 | 28000 | 0.9401 |
332
+ | 0.8 | 160.5735 | 28100 | 0.9437 |
333
+ | 0.7981 | 161.1434 | 28200 | 0.9223 |
334
+ | 0.7917 | 161.7168 | 28300 | 0.9423 |
335
+ | 0.8029 | 162.2867 | 28400 | 0.9357 |
336
+ | 0.7934 | 162.8602 | 28500 | 0.9504 |
337
+ | 0.7907 | 163.4301 | 28600 | 0.9382 |
338
+ | 0.7451 | 164.0 | 28700 | 0.9348 |
339
+ | 0.7871 | 164.5735 | 28800 | 0.9435 |
340
+ | 0.7918 | 165.1434 | 28900 | 0.9399 |
341
+ | 0.8033 | 165.7168 | 29000 | 0.9382 |
342
+ | 0.7913 | 166.2867 | 29100 | 0.9279 |
343
+ | 0.7902 | 166.8602 | 29200 | 0.9461 |
344
+ | 0.781 | 167.4301 | 29300 | 0.9376 |
345
+ | 0.7472 | 168.0 | 29400 | 0.9337 |
346
+ | 0.7756 | 168.5735 | 29500 | 0.9419 |
347
+ | 0.7944 | 169.1434 | 29600 | 0.9313 |
348
+ | 0.7868 | 169.7168 | 29700 | 0.9308 |
349
+ | 0.7914 | 170.2867 | 29800 | 0.9339 |
350
+ | 0.7904 | 170.8602 | 29900 | 0.9410 |
351
+ | 0.7923 | 171.4301 | 30000 | 0.9442 |
352
+ | 0.7428 | 172.0 | 30100 | 0.9383 |
353
+ | 0.7822 | 172.5735 | 30200 | 0.9369 |
354
+ | 0.7956 | 173.1434 | 30300 | 0.9367 |
355
+ | 0.7888 | 173.7168 | 30400 | 0.9332 |
356
+ | 0.79 | 174.2867 | 30500 | 0.9264 |
357
+ | 0.7922 | 174.8602 | 30600 | 0.9313 |
358
+ | 0.7817 | 175.4301 | 30700 | 0.9363 |
359
+ | 0.7282 | 176.0 | 30800 | 0.9198 |
360
+ | 0.7819 | 176.5735 | 30900 | 0.9448 |
361
+ | 0.7692 | 177.1434 | 31000 | 0.9359 |
362
+ | 0.7725 | 177.7168 | 31100 | 0.9350 |
363
+ | 0.7819 | 178.2867 | 31200 | 0.9376 |
364
+ | 0.7951 | 178.8602 | 31300 | 0.9235 |
365
+ | 0.7793 | 179.4301 | 31400 | 0.9309 |
366
+ | 0.7439 | 180.0 | 31500 | 0.9385 |
367
+ | 0.7843 | 180.5735 | 31600 | 0.9364 |
368
+ | 0.781 | 181.1434 | 31700 | 0.9339 |
369
+ | 0.7859 | 181.7168 | 31800 | 0.9422 |
370
+ | 0.7684 | 182.2867 | 31900 | 0.9404 |
371
+ | 0.7722 | 182.8602 | 32000 | 0.9209 |
372
+ | 0.7799 | 183.4301 | 32100 | 0.9293 |
373
+ | 0.7344 | 184.0 | 32200 | 0.9242 |
374
+ | 0.7847 | 184.5735 | 32300 | 0.9372 |
375
+ | 0.7825 | 185.1434 | 32400 | 0.9271 |
376
+ | 0.7733 | 185.7168 | 32500 | 0.9302 |
377
+ | 0.7603 | 186.2867 | 32600 | 0.9334 |
378
+ | 0.7825 | 186.8602 | 32700 | 0.9228 |
379
+ | 0.7806 | 187.4301 | 32800 | 0.9348 |
380
+ | 0.7184 | 188.0 | 32900 | 0.9262 |
381
+ | 0.7866 | 188.5735 | 33000 | 0.9271 |
382
+ | 0.7649 | 189.1434 | 33100 | 0.9236 |
383
+ | 0.7825 | 189.7168 | 33200 | 0.9378 |
384
+ | 0.7788 | 190.2867 | 33300 | 0.9328 |
385
+ | 0.7771 | 190.8602 | 33400 | 0.9311 |
386
+ | 0.7771 | 191.4301 | 33500 | 0.9356 |
387
+ | 0.722 | 192.0 | 33600 | 0.9329 |
388
+ | 0.7683 | 192.5735 | 33700 | 0.9266 |
389
+ | 0.7724 | 193.1434 | 33800 | 0.9198 |
390
+ | 0.7491 | 193.7168 | 33900 | 0.9304 |
391
+ | 0.7591 | 194.2867 | 34000 | 0.9245 |
392
+ | 0.7609 | 194.8602 | 34100 | 0.9380 |
393
+ | 0.7672 | 195.4301 | 34200 | 0.9298 |
394
+ | 0.7194 | 196.0 | 34300 | 0.9120 |
395
+ | 0.7618 | 196.5735 | 34400 | 0.9269 |
396
+ | 0.7847 | 197.1434 | 34500 | 0.9285 |
397
+ | 0.7746 | 197.7168 | 34600 | 0.9113 |
398
+ | 0.7885 | 198.2867 | 34700 | 0.9460 |
399
+ | 0.7768 | 198.8602 | 34800 | 0.9382 |
400
+
401
+
402
+ ### Framework versions
403
+
404
+ - Transformers 4.49.0
405
+ - Pytorch 2.6.0+cu124
406
+ - Datasets 3.5.0
407
+ - Tokenizers 0.21.1
config.json ADDED
@@ -0,0 +1,26 @@
1
+ {
2
+ "architectures": [
3
+ "RobertaForMaskedLM"
4
+ ],
5
+ "attention_probs_dropout_prob": 0.1,
6
+ "bos_token_id": 0,
7
+ "classifier_dropout": null,
8
+ "eos_token_id": 2,
9
+ "hidden_act": "gelu",
10
+ "hidden_dropout_prob": 0.1,
11
+ "hidden_size": 768,
12
+ "initializer_range": 0.02,
13
+ "intermediate_size": 3072,
14
+ "layer_norm_eps": 1e-12,
15
+ "max_position_embeddings": 514,
16
+ "model_type": "roberta",
17
+ "num_attention_heads": 12,
18
+ "num_hidden_layers": 6,
19
+ "pad_token_id": 1,
20
+ "position_embedding_type": "absolute",
21
+ "torch_dtype": "float32",
22
+ "transformers_version": "4.49.0",
23
+ "type_vocab_size": 1,
24
+ "use_cache": true,
25
+ "vocab_size": 30000
26
+ }
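The configuration above describes a compact 6-layer RoBERTa-style encoder with a 30k-entry vocabulary: roughly 66.6M parameters, consistent with the ≈266 MB float32 `model.safetensors` below. A minimal sketch of rebuilding the same architecture locally (randomly initialised; loading the published weights would instead go through `from_pretrained`):

```python
# Hedged sketch: instantiate the architecture defined by config.json above.
from transformers import RobertaConfig, RobertaForMaskedLM

config = RobertaConfig(
    vocab_size=30000,
    hidden_size=768,
    num_hidden_layers=6,
    num_attention_heads=12,
    intermediate_size=3072,
    max_position_embeddings=514,
    type_vocab_size=1,
    layer_norm_eps=1e-12,
    bos_token_id=0,
    pad_token_id=1,
    eos_token_id=2,
)

model = RobertaForMaskedLM(config)  # randomly initialised, same shapes as the checkpoint
print(sum(p.numel() for p in model.parameters()))  # roughly 66.6M parameters
```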
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:505eea56759ab63c93ace16da77db4cd2093d15254feac11012e4517dc5cc4f4
3
+ size 266358248
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<s>",
4
+ "lstrip": false,
5
+ "normalized": true,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "cls_token": {
10
+ "content": "<s>",
11
+ "lstrip": false,
12
+ "normalized": true,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "eos_token": {
17
+ "content": "</s>",
18
+ "lstrip": false,
19
+ "normalized": true,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "mask_token": {
24
+ "content": "<mask>",
25
+ "lstrip": true,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "pad_token": {
31
+ "content": "<pad>",
32
+ "lstrip": false,
33
+ "normalized": true,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ },
37
+ "sep_token": {
38
+ "content": "</s>",
39
+ "lstrip": false,
40
+ "normalized": true,
41
+ "rstrip": false,
42
+ "single_word": false
43
+ },
44
+ "unk_token": {
45
+ "content": "<unk>",
46
+ "lstrip": false,
47
+ "normalized": true,
48
+ "rstrip": false,
49
+ "single_word": false
50
+ }
51
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,59 @@
1
+ {
2
+ "add_prefix_space": false,
3
+ "added_tokens_decoder": {
4
+ "0": {
5
+ "content": "<s>",
6
+ "lstrip": false,
7
+ "normalized": true,
8
+ "rstrip": false,
9
+ "single_word": false,
10
+ "special": true
11
+ },
12
+ "1": {
13
+ "content": "<pad>",
14
+ "lstrip": false,
15
+ "normalized": true,
16
+ "rstrip": false,
17
+ "single_word": false,
18
+ "special": true
19
+ },
20
+ "2": {
21
+ "content": "</s>",
22
+ "lstrip": false,
23
+ "normalized": true,
24
+ "rstrip": false,
25
+ "single_word": false,
26
+ "special": true
27
+ },
28
+ "3": {
29
+ "content": "<unk>",
30
+ "lstrip": false,
31
+ "normalized": true,
32
+ "rstrip": false,
33
+ "single_word": false,
34
+ "special": true
35
+ },
36
+ "4": {
37
+ "content": "<mask>",
38
+ "lstrip": true,
39
+ "normalized": false,
40
+ "rstrip": false,
41
+ "single_word": false,
42
+ "special": true
43
+ }
44
+ },
45
+ "bos_token": "<s>",
46
+ "clean_up_tokenization_spaces": false,
47
+ "cls_token": "<s>",
48
+ "eos_token": "</s>",
49
+ "errors": "replace",
50
+ "extra_special_tokens": {},
51
+ "mask_token": "<mask>",
52
+ "max_len": 512,
53
+ "model_max_length": 512,
54
+ "pad_token": "<pad>",
55
+ "sep_token": "</s>",
56
+ "tokenizer_class": "RobertaTokenizer",
57
+ "trim_offsets": true,
58
+ "unk_token": "<unk>"
59
+ }
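The tokenizer is a byte-level BPE `RobertaTokenizer` built from `vocab.json` and `merges.txt`, with the special tokens mapped to ids 0–4 in `added_tokens_decoder` above. A small hedged check (the repository id is again an assumption):

```python
# Load the tokenizer shipped in this commit and confirm the special-token ids
# declared in added_tokens_decoder; repo id is assumed, not stated in the files.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Sphincz/monolang_roberta")

for token in ["<s>", "<pad>", "</s>", "<unk>", "<mask>"]:
    print(token, tokenizer.convert_tokens_to_ids(token))
# Expected: 0, 1, 2, 3, 4 per the mapping above
```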
training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:28b388e2e18b728de410a11992f7f67839ea5b6b5017f4237b0022f24acb3484
3
+ size 5368
vocab.json ADDED
The diff for this file is too large to render. See raw diff