MayBashendy committed · verified
Commit fa9ad6a · 1 parent: a9c080d

Training in progress, step 500

Files changed (4):
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
1
+ ---
2
+ library_name: transformers
3
+ base_model: aubmindlab/bert-base-arabertv02
4
+ tags:
5
+ - generated_from_trainer
6
+ model-index:
7
+ - name: ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task2_organization
8
+ results: []
9
+ ---
10
+
11
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
12
+ should probably proofread and complete it, then remove this comment. -->
13
+
14
+ # ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task2_organization
15
+
16
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
17
+ It achieves the following results on the evaluation set (a hedged sketch of how these metrics can be computed appears after the list):
18
+ - Loss: 1.1112
19
+ - Qwk: 0.2403
20
+ - Mse: 1.1112
21
+ - Rmse: 1.0541
22
+
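The repository does not include the metric code itself. As a hedged illustration, `Qwk` (quadratic weighted kappa), `Mse`, and `Rmse` could be computed from model predictions roughly as follows, assuming the continuous regression outputs are rounded to integer scores before computing kappa:

```python
# Minimal sketch (not the actual evaluation code) of the reported metrics:
# MSE, RMSE, and quadratic weighted kappa (Qwk) for a regression model.
# Rounding continuous predictions to integer scores for kappa is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # Kappa needs discrete categories, so round the continuous outputs.
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Example with dummy values:
print(compute_metrics(np.array([1.2, 2.8, 0.4]), np.array([1.0, 3.0, 1.0])))
```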
23
+ ## Model description
24
+
25
+ More information needed
26
+
27
+ ## Intended uses & limitations
28
+
29
+ More information needed
30
+
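As a minimal, hedged usage sketch (the checkpoint path below is a placeholder for this repository's files; per `config.json` the model has a single-label regression head, so it returns one continuous score per essay):

```python
# Hedged usage sketch: load the fine-tuned checkpoint and score an Arabic essay.
# "path/to/checkpoint" is a placeholder, e.g. a local clone of this repository.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "path/to/checkpoint"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)
model.eval()

text = "نص المقال هنا"  # an Arabic essay to score
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    # problem_type="regression" with one label -> a single continuous logit.
    score = model(**inputs).logits.squeeze(-1).item()
print(f"predicted organization score: {score:.3f}")
```

Note that AraBERT checkpoints are often paired with the `arabert` preprocessing utilities; whether such preprocessing was applied during fine-tuning is not documented here.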
31
+ ## Training and evaluation data
32
+
33
+ More information needed
34
+
35
+ ## Training procedure
36
+
37
+ ### Training hyperparameters
38
+
39
+ The following hyperparameters were used during training (a minimal Trainer sketch using them follows the list):
40
+ - learning_rate: 2e-05
41
+ - train_batch_size: 8
42
+ - eval_batch_size: 8
43
+ - seed: 42
44
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
45
+ - lr_scheduler_type: linear
46
+ - num_epochs: 100
47
+
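As a rough sketch only (not the actual training script), the hyperparameters above map onto `TrainingArguments`/`Trainer` as shown below. The one-row dummy dataset, output directory, and evaluation/logging cadence are assumptions, not values taken from this repository; the 2-step evaluation cadence and 500-step logging interval are simply consistent with the results table that follows.

```python
# Hedged sketch: maps the listed hyperparameters onto TrainingArguments.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=1, problem_type="regression"  # single continuous score
)

# Dummy stand-in data; the real training/evaluation essays are not published here.
raw = Dataset.from_dict({"text": ["نص تجريبي"], "label": [2.0]})
encoded = raw.map(lambda b: tokenizer(b["text"], truncation=True, max_length=512),
                  batched=True)

args = TrainingArguments(
    output_dir="arabert-organization",    # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps", eval_steps=2,  # assumption, matching the table's cadence
    logging_steps=500,                    # assumption; would explain the "No log" entries
)

trainer = Trainer(model=model, args=args, train_dataset=encoded,
                  eval_dataset=encoded, tokenizer=tokenizer)
trainer.train()
```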
48
+ ### Training results
49
+
50
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
51
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
52
+ | No log | 0.1667 | 2 | 4.7841 | 0.0010 | 4.7841 | 2.1873 |
53
+ | No log | 0.3333 | 4 | 2.7856 | 0.0 | 2.7856 | 1.6690 |
54
+ | No log | 0.5 | 6 | 2.0168 | -0.0164 | 2.0168 | 1.4201 |
55
+ | No log | 0.6667 | 8 | 1.7539 | 0.0444 | 1.7539 | 1.3244 |
56
+ | No log | 0.8333 | 10 | 1.5072 | 0.0076 | 1.5072 | 1.2277 |
57
+ | No log | 1.0 | 12 | 1.3825 | 0.0357 | 1.3825 | 1.1758 |
58
+ | No log | 1.1667 | 14 | 1.3544 | 0.0030 | 1.3544 | 1.1638 |
59
+ | No log | 1.3333 | 16 | 1.2901 | 0.1014 | 1.2901 | 1.1358 |
60
+ | No log | 1.5 | 18 | 1.3139 | -0.0182 | 1.3139 | 1.1462 |
61
+ | No log | 1.6667 | 20 | 1.6583 | -0.0149 | 1.6583 | 1.2878 |
62
+ | No log | 1.8333 | 22 | 1.7799 | 0.0585 | 1.7799 | 1.3341 |
63
+ | No log | 2.0 | 24 | 1.4922 | -0.0199 | 1.4922 | 1.2216 |
64
+ | No log | 2.1667 | 26 | 1.2118 | 0.1875 | 1.2118 | 1.1008 |
65
+ | No log | 2.3333 | 28 | 1.2558 | 0.1249 | 1.2558 | 1.1206 |
66
+ | No log | 2.5 | 30 | 1.2410 | 0.0980 | 1.2410 | 1.1140 |
67
+ | No log | 2.6667 | 32 | 1.1366 | 0.2408 | 1.1366 | 1.0661 |
68
+ | No log | 2.8333 | 34 | 1.1000 | 0.2939 | 1.1000 | 1.0488 |
69
+ | No log | 3.0 | 36 | 1.1246 | 0.2835 | 1.1246 | 1.0605 |
70
+ | No log | 3.1667 | 38 | 1.1201 | 0.3603 | 1.1201 | 1.0584 |
71
+ | No log | 3.3333 | 40 | 1.1878 | 0.2446 | 1.1878 | 1.0899 |
72
+ | No log | 3.5 | 42 | 1.2704 | 0.1404 | 1.2704 | 1.1271 |
73
+ | No log | 3.6667 | 44 | 1.2563 | 0.1959 | 1.2563 | 1.1209 |
74
+ | No log | 3.8333 | 46 | 1.2000 | 0.2632 | 1.2000 | 1.0954 |
75
+ | No log | 4.0 | 48 | 1.1321 | 0.3347 | 1.1321 | 1.0640 |
76
+ | No log | 4.1667 | 50 | 1.3041 | 0.1404 | 1.3041 | 1.1420 |
77
+ | No log | 4.3333 | 52 | 1.6427 | 0.0622 | 1.6427 | 1.2817 |
78
+ | No log | 4.5 | 54 | 1.5936 | 0.0622 | 1.5936 | 1.2624 |
79
+ | No log | 4.6667 | 56 | 1.3245 | 0.1552 | 1.3245 | 1.1509 |
80
+ | No log | 4.8333 | 58 | 1.0780 | 0.2167 | 1.0780 | 1.0382 |
81
+ | No log | 5.0 | 60 | 1.1052 | 0.1042 | 1.1052 | 1.0513 |
82
+ | No log | 5.1667 | 62 | 1.2020 | 0.0802 | 1.2020 | 1.0963 |
83
+ | No log | 5.3333 | 64 | 1.1696 | 0.0612 | 1.1696 | 1.0815 |
84
+ | No log | 5.5 | 66 | 1.0527 | 0.2993 | 1.0527 | 1.0260 |
85
+ | No log | 5.6667 | 68 | 1.0322 | 0.2835 | 1.0322 | 1.0160 |
86
+ | No log | 5.8333 | 70 | 1.0268 | 0.2167 | 1.0268 | 1.0133 |
87
+ | No log | 6.0 | 72 | 1.0108 | 0.2167 | 1.0108 | 1.0054 |
88
+ | No log | 6.1667 | 74 | 0.9803 | 0.3374 | 0.9803 | 0.9901 |
89
+ | No log | 6.3333 | 76 | 1.0468 | 0.1344 | 1.0468 | 1.0232 |
90
+ | No log | 6.5 | 78 | 1.1586 | 0.0843 | 1.1586 | 1.0764 |
91
+ | No log | 6.6667 | 80 | 1.1476 | 0.0843 | 1.1476 | 1.0713 |
92
+ | No log | 6.8333 | 82 | 1.0972 | 0.0857 | 1.0972 | 1.0475 |
93
+ | No log | 7.0 | 84 | 0.9757 | 0.2360 | 0.9757 | 0.9878 |
94
+ | No log | 7.1667 | 86 | 0.9784 | 0.2991 | 0.9784 | 0.9891 |
95
+ | No log | 7.3333 | 88 | 1.0107 | 0.1935 | 1.0107 | 1.0053 |
96
+ | No log | 7.5 | 90 | 1.1134 | 0.0513 | 1.1134 | 1.0552 |
97
+ | No log | 7.6667 | 92 | 1.2349 | 0.0444 | 1.2349 | 1.1113 |
98
+ | No log | 7.8333 | 94 | 1.1761 | 0.1345 | 1.1761 | 1.0845 |
99
+ | No log | 8.0 | 96 | 1.0462 | 0.1761 | 1.0462 | 1.0228 |
100
+ | No log | 8.1667 | 98 | 0.9383 | 0.4588 | 0.9383 | 0.9687 |
101
+ | No log | 8.3333 | 100 | 0.9904 | 0.3635 | 0.9904 | 0.9952 |
102
+ | No log | 8.5 | 102 | 0.9724 | 0.3590 | 0.9724 | 0.9861 |
103
+ | No log | 8.6667 | 104 | 0.9852 | 0.2331 | 0.9852 | 0.9926 |
104
+ | No log | 8.8333 | 106 | 1.2392 | 0.1896 | 1.2392 | 1.1132 |
105
+ | No log | 9.0 | 108 | 1.5681 | 0.1884 | 1.5681 | 1.2522 |
106
+ | No log | 9.1667 | 110 | 1.6556 | 0.1179 | 1.6556 | 1.2867 |
107
+ | No log | 9.3333 | 112 | 1.4848 | 0.1462 | 1.4848 | 1.2185 |
108
+ | No log | 9.5 | 114 | 1.2836 | 0.0642 | 1.2836 | 1.1330 |
109
+ | No log | 9.6667 | 116 | 1.2218 | 0.1343 | 1.2218 | 1.1054 |
110
+ | No log | 9.8333 | 118 | 1.1811 | 0.1278 | 1.1811 | 1.0868 |
111
+ | No log | 10.0 | 120 | 1.1811 | 0.1442 | 1.1811 | 1.0868 |
112
+ | No log | 10.1667 | 122 | 1.1952 | 0.1118 | 1.1952 | 1.0933 |
113
+ | No log | 10.3333 | 124 | 1.2723 | 0.1557 | 1.2723 | 1.1280 |
114
+ | No log | 10.5 | 126 | 1.2502 | 0.1557 | 1.2502 | 1.1181 |
115
+ | No log | 10.6667 | 128 | 1.1155 | 0.2470 | 1.1155 | 1.0562 |
116
+ | No log | 10.8333 | 130 | 1.0650 | 0.3021 | 1.0650 | 1.0320 |
117
+ | No log | 11.0 | 132 | 1.0575 | 0.3021 | 1.0575 | 1.0283 |
118
+ | No log | 11.1667 | 134 | 1.1922 | 0.3088 | 1.1922 | 1.0919 |
119
+ | No log | 11.3333 | 136 | 1.3939 | 0.3060 | 1.3939 | 1.1806 |
120
+ | No log | 11.5 | 138 | 1.3992 | 0.3253 | 1.3992 | 1.1829 |
121
+ | No log | 11.6667 | 140 | 1.2563 | 0.3118 | 1.2563 | 1.1208 |
122
+ | No log | 11.8333 | 142 | 1.1919 | 0.3750 | 1.1919 | 1.0918 |
123
+ | No log | 12.0 | 144 | 1.1895 | 0.3631 | 1.1895 | 1.0907 |
124
+ | No log | 12.1667 | 146 | 1.1707 | 0.3136 | 1.1707 | 1.0820 |
125
+ | No log | 12.3333 | 148 | 1.2170 | 0.3688 | 1.2170 | 1.1032 |
126
+ | No log | 12.5 | 150 | 1.2518 | 0.3277 | 1.2518 | 1.1188 |
127
+ | No log | 12.6667 | 152 | 1.4028 | 0.3324 | 1.4028 | 1.1844 |
128
+ | No log | 12.8333 | 154 | 1.3852 | 0.3324 | 1.3852 | 1.1769 |
129
+ | No log | 13.0 | 156 | 1.2217 | 0.2367 | 1.2217 | 1.1053 |
130
+ | No log | 13.1667 | 158 | 1.0816 | 0.2056 | 1.0816 | 1.0400 |
131
+ | No log | 13.3333 | 160 | 1.0821 | 0.2056 | 1.0821 | 1.0402 |
132
+ | No log | 13.5 | 162 | 1.0836 | 0.1755 | 1.0836 | 1.0410 |
133
+ | No log | 13.6667 | 164 | 1.1595 | 0.2330 | 1.1595 | 1.0768 |
134
+ | No log | 13.8333 | 166 | 1.1949 | 0.2634 | 1.1949 | 1.0931 |
135
+ | No log | 14.0 | 168 | 1.1958 | 0.3155 | 1.1958 | 1.0935 |
136
+ | No log | 14.1667 | 170 | 1.1561 | 0.3155 | 1.1561 | 1.0752 |
137
+ | No log | 14.3333 | 172 | 1.0665 | 0.2953 | 1.0665 | 1.0327 |
138
+ | No log | 14.5 | 174 | 1.0295 | 0.3377 | 1.0295 | 1.0146 |
139
+ | No log | 14.6667 | 176 | 0.9752 | 0.4165 | 0.9752 | 0.9875 |
140
+ | No log | 14.8333 | 178 | 1.0341 | 0.4186 | 1.0341 | 1.0169 |
141
+ | No log | 15.0 | 180 | 1.2186 | 0.3556 | 1.2186 | 1.1039 |
142
+ | No log | 15.1667 | 182 | 1.2450 | 0.3931 | 1.2450 | 1.1158 |
143
+ | No log | 15.3333 | 184 | 1.2422 | 0.3293 | 1.2422 | 1.1145 |
144
+ | No log | 15.5 | 186 | 1.1222 | 0.2864 | 1.1222 | 1.0594 |
145
+ | No log | 15.6667 | 188 | 1.1029 | 0.2535 | 1.1029 | 1.0502 |
146
+ | No log | 15.8333 | 190 | 1.0441 | 0.2152 | 1.0441 | 1.0218 |
147
+ | No log | 16.0 | 192 | 0.9930 | 0.3851 | 0.9930 | 0.9965 |
148
+ | No log | 16.1667 | 194 | 1.0585 | 0.3739 | 1.0585 | 1.0289 |
149
+ | No log | 16.3333 | 196 | 1.2877 | 0.3353 | 1.2877 | 1.1348 |
150
+ | No log | 16.5 | 198 | 1.3686 | 0.2895 | 1.3686 | 1.1699 |
151
+ | No log | 16.6667 | 200 | 1.2849 | 0.2647 | 1.2849 | 1.1336 |
152
+ | No log | 16.8333 | 202 | 1.2155 | 0.3136 | 1.2155 | 1.1025 |
153
+ | No log | 17.0 | 204 | 1.2273 | 0.2494 | 1.2273 | 1.1078 |
154
+ | No log | 17.1667 | 206 | 1.1773 | 0.1026 | 1.1773 | 1.0851 |
155
+ | No log | 17.3333 | 208 | 1.0958 | 0.0974 | 1.0958 | 1.0468 |
156
+ | No log | 17.5 | 210 | 1.0502 | 0.0974 | 1.0502 | 1.0248 |
157
+ | No log | 17.6667 | 212 | 1.0049 | 0.1814 | 1.0049 | 1.0025 |
158
+ | No log | 17.8333 | 214 | 1.0212 | 0.1480 | 1.0212 | 1.0106 |
159
+ | No log | 18.0 | 216 | 1.0977 | 0.1217 | 1.0977 | 1.0477 |
160
+ | No log | 18.1667 | 218 | 1.1481 | 0.1637 | 1.1481 | 1.0715 |
161
+ | No log | 18.3333 | 220 | 1.2109 | 0.1440 | 1.2109 | 1.1004 |
162
+ | No log | 18.5 | 222 | 1.2670 | 0.1795 | 1.2670 | 1.1256 |
163
+ | No log | 18.6667 | 224 | 1.2196 | 0.3071 | 1.2196 | 1.1044 |
164
+ | No log | 18.8333 | 226 | 1.1261 | 0.1943 | 1.1261 | 1.0612 |
165
+ | No log | 19.0 | 228 | 1.0440 | 0.1860 | 1.0440 | 1.0218 |
166
+ | No log | 19.1667 | 230 | 1.0296 | 0.1662 | 1.0296 | 1.0147 |
167
+ | No log | 19.3333 | 232 | 1.0606 | 0.1701 | 1.0606 | 1.0299 |
168
+ | No log | 19.5 | 234 | 1.0947 | 0.1379 | 1.0947 | 1.0463 |
169
+ | No log | 19.6667 | 236 | 1.1007 | 0.1379 | 1.1007 | 1.0492 |
170
+ | No log | 19.8333 | 238 | 1.0689 | 0.1480 | 1.0689 | 1.0339 |
171
+ | No log | 20.0 | 240 | 1.0288 | 0.1605 | 1.0288 | 1.0143 |
172
+ | No log | 20.1667 | 242 | 1.0320 | 0.1706 | 1.0320 | 1.0159 |
173
+ | No log | 20.3333 | 244 | 1.1000 | 0.1541 | 1.1000 | 1.0488 |
174
+ | No log | 20.5 | 246 | 1.1313 | 0.1185 | 1.1313 | 1.0636 |
175
+ | No log | 20.6667 | 248 | 1.0859 | 0.1541 | 1.0859 | 1.0421 |
176
+ | No log | 20.8333 | 250 | 1.0423 | 0.1605 | 1.0423 | 1.0209 |
177
+ | No log | 21.0 | 252 | 1.0494 | 0.1706 | 1.0494 | 1.0244 |
178
+ | No log | 21.1667 | 254 | 1.0725 | 0.1543 | 1.0725 | 1.0356 |
179
+ | No log | 21.3333 | 256 | 1.1434 | 0.1315 | 1.1434 | 1.0693 |
180
+ | No log | 21.5 | 258 | 1.2508 | 0.1650 | 1.2508 | 1.1184 |
181
+ | No log | 21.6667 | 260 | 1.2938 | 0.2037 | 1.2938 | 1.1375 |
182
+ | No log | 21.8333 | 262 | 1.2174 | 0.1345 | 1.2174 | 1.1033 |
183
+ | No log | 22.0 | 264 | 1.1028 | 0.1214 | 1.1028 | 1.0501 |
184
+ | No log | 22.1667 | 266 | 1.0392 | 0.1868 | 1.0392 | 1.0194 |
185
+ | No log | 22.3333 | 268 | 1.0016 | 0.3070 | 1.0016 | 1.0008 |
186
+ | No log | 22.5 | 270 | 0.9790 | 0.3243 | 0.9790 | 0.9895 |
187
+ | No log | 22.6667 | 272 | 1.0101 | 0.2121 | 1.0101 | 1.0050 |
188
+ | No log | 22.8333 | 274 | 1.1179 | 0.2343 | 1.1179 | 1.0573 |
189
+ | No log | 23.0 | 276 | 1.2234 | 0.3094 | 1.2234 | 1.1061 |
190
+ | No log | 23.1667 | 278 | 1.2513 | 0.3357 | 1.2513 | 1.1186 |
191
+ | No log | 23.3333 | 280 | 1.1673 | 0.3231 | 1.1673 | 1.0804 |
192
+ | No log | 23.5 | 282 | 1.1194 | 0.2683 | 1.1194 | 1.0580 |
193
+ | No log | 23.6667 | 284 | 1.1279 | 0.2683 | 1.1279 | 1.0620 |
194
+ | No log | 23.8333 | 286 | 1.1079 | 0.2829 | 1.1079 | 1.0526 |
195
+ | No log | 24.0 | 288 | 1.1222 | 0.2304 | 1.1222 | 1.0594 |
196
+ | No log | 24.1667 | 290 | 1.1910 | 0.1650 | 1.1910 | 1.0913 |
197
+ | No log | 24.3333 | 292 | 1.2946 | 0.2080 | 1.2946 | 1.1378 |
198
+ | No log | 24.5 | 294 | 1.3793 | 0.3316 | 1.3793 | 1.1745 |
199
+ | No log | 24.6667 | 296 | 1.3327 | 0.2354 | 1.3327 | 1.1544 |
200
+ | No log | 24.8333 | 298 | 1.2019 | 0.1538 | 1.2019 | 1.0963 |
201
+ | No log | 25.0 | 300 | 1.1026 | 0.1281 | 1.1026 | 1.0501 |
202
+ | No log | 25.1667 | 302 | 1.0787 | 0.1442 | 1.0787 | 1.0386 |
203
+ | No log | 25.3333 | 304 | 1.1119 | 0.1281 | 1.1119 | 1.0545 |
204
+ | No log | 25.5 | 306 | 1.1707 | 0.1217 | 1.1707 | 1.0820 |
205
+ | No log | 25.6667 | 308 | 1.2377 | 0.1846 | 1.2377 | 1.1125 |
206
+ | No log | 25.8333 | 310 | 1.3500 | 0.3093 | 1.3500 | 1.1619 |
207
+ | No log | 26.0 | 312 | 1.4644 | 0.2633 | 1.4644 | 1.2101 |
208
+ | No log | 26.1667 | 314 | 1.4858 | 0.3082 | 1.4858 | 1.2189 |
209
+ | No log | 26.3333 | 316 | 1.3617 | 0.3006 | 1.3617 | 1.1669 |
210
+ | No log | 26.5 | 318 | 1.2031 | 0.3525 | 1.2031 | 1.0969 |
211
+ | No log | 26.6667 | 320 | 1.0835 | 0.2909 | 1.0835 | 1.0409 |
212
+ | No log | 26.8333 | 322 | 1.0846 | 0.2288 | 1.0846 | 1.0415 |
213
+ | No log | 27.0 | 324 | 1.1290 | 0.2864 | 1.1290 | 1.0625 |
214
+ | No log | 27.1667 | 326 | 1.2367 | 0.2559 | 1.2367 | 1.1121 |
215
+ | No log | 27.3333 | 328 | 1.3322 | 0.3324 | 1.3322 | 1.1542 |
216
+ | No log | 27.5 | 330 | 1.3360 | 0.3324 | 1.3360 | 1.1559 |
217
+ | No log | 27.6667 | 332 | 1.2522 | 0.3601 | 1.2522 | 1.1190 |
218
+ | No log | 27.8333 | 334 | 1.1433 | 0.2953 | 1.1433 | 1.0692 |
219
+ | No log | 28.0 | 336 | 1.0849 | 0.2152 | 1.0849 | 1.0416 |
220
+ | No log | 28.1667 | 338 | 1.0558 | 0.2009 | 1.0558 | 1.0275 |
221
+ | No log | 28.3333 | 340 | 1.0357 | 0.2164 | 1.0357 | 1.0177 |
222
+ | No log | 28.5 | 342 | 1.0761 | 0.1853 | 1.0761 | 1.0374 |
223
+ | No log | 28.6667 | 344 | 1.1101 | 0.1696 | 1.1101 | 1.0536 |
224
+ | No log | 28.8333 | 346 | 1.1299 | 0.1696 | 1.1299 | 1.0630 |
225
+ | No log | 29.0 | 348 | 1.1596 | 0.1696 | 1.1596 | 1.0769 |
226
+ | No log | 29.1667 | 350 | 1.2182 | 0.1696 | 1.2182 | 1.1037 |
227
+ | No log | 29.3333 | 352 | 1.2905 | 0.1696 | 1.2905 | 1.1360 |
228
+ | No log | 29.5 | 354 | 1.2953 | 0.2291 | 1.2953 | 1.1381 |
229
+ | No log | 29.6667 | 356 | 1.2433 | 0.2439 | 1.2433 | 1.1150 |
230
+ | No log | 29.8333 | 358 | 1.1731 | 0.2152 | 1.1731 | 1.0831 |
231
+ | No log | 30.0 | 360 | 1.1340 | 0.1853 | 1.1340 | 1.0649 |
232
+ | No log | 30.1667 | 362 | 1.1197 | 0.2009 | 1.1197 | 1.0582 |
233
+ | No log | 30.3333 | 364 | 1.1736 | 0.1853 | 1.1736 | 1.0833 |
234
+ | No log | 30.5 | 366 | 1.2439 | 0.1557 | 1.2439 | 1.1153 |
235
+ | No log | 30.6667 | 368 | 1.3112 | 0.2664 | 1.3112 | 1.1451 |
236
+ | No log | 30.8333 | 370 | 1.4041 | 0.3087 | 1.4041 | 1.1849 |
237
+ | No log | 31.0 | 372 | 1.4113 | 0.2941 | 1.4113 | 1.1880 |
238
+ | No log | 31.1667 | 374 | 1.3308 | 0.2330 | 1.3308 | 1.1536 |
239
+ | No log | 31.3333 | 376 | 1.2113 | 0.1696 | 1.2113 | 1.1006 |
240
+ | No log | 31.5 | 378 | 1.1429 | 0.1853 | 1.1429 | 1.0691 |
241
+ | No log | 31.6667 | 380 | 1.1255 | 0.1853 | 1.1255 | 1.0609 |
242
+ | No log | 31.8333 | 382 | 1.1352 | 0.1696 | 1.1352 | 1.0654 |
243
+ | No log | 32.0 | 384 | 1.1755 | 0.1696 | 1.1755 | 1.0842 |
244
+ | No log | 32.1667 | 386 | 1.2623 | 0.2097 | 1.2623 | 1.1235 |
245
+ | No log | 32.3333 | 388 | 1.3322 | 0.1944 | 1.3322 | 1.1542 |
246
+ | No log | 32.5 | 390 | 1.3382 | 0.1991 | 1.3382 | 1.1568 |
247
+ | No log | 32.6667 | 392 | 1.2571 | 0.1750 | 1.2571 | 1.1212 |
248
+ | No log | 32.8333 | 394 | 1.1215 | 0.1541 | 1.1215 | 1.0590 |
249
+ | No log | 33.0 | 396 | 1.0436 | 0.2276 | 1.0436 | 1.0216 |
250
+ | No log | 33.1667 | 398 | 1.0099 | 0.2587 | 1.0099 | 1.0049 |
251
+ | No log | 33.3333 | 400 | 1.0268 | 0.2432 | 1.0268 | 1.0133 |
252
+ | No log | 33.5 | 402 | 1.0846 | 0.2009 | 1.0846 | 1.0415 |
253
+ | No log | 33.6667 | 404 | 1.1934 | 0.2000 | 1.1934 | 1.0924 |
254
+ | No log | 33.8333 | 406 | 1.3387 | 0.3574 | 1.3387 | 1.1570 |
255
+ | No log | 34.0 | 408 | 1.3902 | 0.3791 | 1.3902 | 1.1791 |
256
+ | No log | 34.1667 | 410 | 1.3616 | 0.3784 | 1.3616 | 1.1669 |
257
+ | No log | 34.3333 | 412 | 1.2799 | 0.2403 | 1.2799 | 1.1313 |
258
+ | No log | 34.5 | 414 | 1.2026 | 0.1801 | 1.2026 | 1.0966 |
259
+ | No log | 34.6667 | 416 | 1.1158 | 0.1696 | 1.1158 | 1.0563 |
260
+ | No log | 34.8333 | 418 | 1.0628 | 0.1860 | 1.0628 | 1.0309 |
261
+ | No log | 35.0 | 420 | 1.0157 | 0.2640 | 1.0157 | 1.0078 |
262
+ | No log | 35.1667 | 422 | 1.0076 | 0.2244 | 1.0076 | 1.0038 |
263
+ | No log | 35.3333 | 424 | 1.0387 | 0.2019 | 1.0387 | 1.0191 |
264
+ | No log | 35.5 | 426 | 1.0876 | 0.1696 | 1.0876 | 1.0429 |
265
+ | No log | 35.6667 | 428 | 1.1088 | 0.1696 | 1.1088 | 1.0530 |
266
+ | No log | 35.8333 | 430 | 1.1354 | 0.2090 | 1.1354 | 1.0656 |
267
+ | No log | 36.0 | 432 | 1.1177 | 0.2291 | 1.1177 | 1.0572 |
268
+ | No log | 36.1667 | 434 | 1.0801 | 0.2291 | 1.0801 | 1.0393 |
269
+ | No log | 36.3333 | 436 | 1.0557 | 0.2733 | 1.0557 | 1.0275 |
270
+ | No log | 36.5 | 438 | 1.0392 | 0.2733 | 1.0392 | 1.0194 |
271
+ | No log | 36.6667 | 440 | 1.0505 | 0.2857 | 1.0505 | 1.0250 |
272
+ | No log | 36.8333 | 442 | 1.0906 | 0.2367 | 1.0906 | 1.0443 |
273
+ | No log | 37.0 | 444 | 1.1003 | 0.2367 | 1.1003 | 1.0490 |
274
+ | No log | 37.1667 | 446 | 1.0798 | 0.2743 | 1.0798 | 1.0392 |
275
+ | No log | 37.3333 | 448 | 1.0683 | 0.2402 | 1.0683 | 1.0336 |
276
+ | No log | 37.5 | 450 | 1.0184 | 0.2604 | 1.0184 | 1.0092 |
277
+ | No log | 37.6667 | 452 | 0.9762 | 0.3093 | 0.9762 | 0.9880 |
278
+ | No log | 37.8333 | 454 | 0.9541 | 0.3478 | 0.9541 | 0.9768 |
279
+ | No log | 38.0 | 456 | 0.9720 | 0.3093 | 0.9720 | 0.9859 |
280
+ | No log | 38.1667 | 458 | 1.0345 | 0.2702 | 1.0345 | 1.0171 |
281
+ | No log | 38.3333 | 460 | 1.1170 | 0.2090 | 1.1170 | 1.0569 |
282
+ | No log | 38.5 | 462 | 1.1412 | 0.2090 | 1.1412 | 1.0683 |
283
+ | No log | 38.6667 | 464 | 1.1506 | 0.2090 | 1.1506 | 1.0727 |
284
+ | No log | 38.8333 | 466 | 1.1145 | 0.2000 | 1.1145 | 1.0557 |
285
+ | No log | 39.0 | 468 | 1.0370 | 0.2552 | 1.0370 | 1.0183 |
286
+ | No log | 39.1667 | 470 | 0.9865 | 0.2752 | 0.9865 | 0.9932 |
287
+ | No log | 39.3333 | 472 | 0.9559 | 0.3217 | 0.9559 | 0.9777 |
288
+ | No log | 39.5 | 474 | 0.9552 | 0.3094 | 0.9552 | 0.9774 |
289
+ | No log | 39.6667 | 476 | 0.9713 | 0.3434 | 0.9713 | 0.9856 |
290
+ | No log | 39.8333 | 478 | 0.9986 | 0.2552 | 0.9986 | 0.9993 |
291
+ | No log | 40.0 | 480 | 1.0328 | 0.2263 | 1.0328 | 1.0163 |
292
+ | No log | 40.1667 | 482 | 1.0570 | 0.2108 | 1.0570 | 1.0281 |
293
+ | No log | 40.3333 | 484 | 1.0898 | 0.1696 | 1.0898 | 1.0439 |
294
+ | No log | 40.5 | 486 | 1.1321 | 0.1801 | 1.1321 | 1.0640 |
295
+ | No log | 40.6667 | 488 | 1.1400 | 0.1801 | 1.1400 | 1.0677 |
296
+ | No log | 40.8333 | 490 | 1.0991 | 0.2000 | 1.0991 | 1.0484 |
297
+ | No log | 41.0 | 492 | 1.0648 | 0.2000 | 1.0648 | 1.0319 |
298
+ | No log | 41.1667 | 494 | 1.0418 | 0.2402 | 1.0418 | 1.0207 |
299
+ | No log | 41.3333 | 496 | 1.0322 | 0.2402 | 1.0322 | 1.0160 |
300
+ | No log | 41.5 | 498 | 1.0159 | 0.2552 | 1.0159 | 1.0079 |
301
+ | 0.3077 | 41.6667 | 500 | 1.0117 | 0.2552 | 1.0117 | 1.0058 |
302
+ | 0.3077 | 41.8333 | 502 | 1.0457 | 0.2000 | 1.0457 | 1.0226 |
303
+ | 0.3077 | 42.0 | 504 | 1.0935 | 0.2330 | 1.0935 | 1.0457 |
304
+ | 0.3077 | 42.1667 | 506 | 1.1318 | 0.2403 | 1.1318 | 1.0639 |
305
+ | 0.3077 | 42.3333 | 508 | 1.1197 | 0.2132 | 1.1197 | 1.0582 |
306
+ | 0.3077 | 42.5 | 510 | 1.1112 | 0.2403 | 1.1112 | 1.0541 |
307
+
308
+
309
+ ### Framework versions
310
+
311
+ - Transformers 4.44.2
312
+ - Pytorch 2.4.0+cu118
313
+ - Datasets 2.21.0
314
+ - Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
1
+ {
2
+ "_name_or_path": "aubmindlab/bert-base-arabertv02",
3
+ "architectures": [
4
+ "BertForSequenceClassification"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.1,
7
+ "classifier_dropout": null,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.1,
10
+ "hidden_size": 768,
11
+ "id2label": {
12
+ "0": "LABEL_0"
13
+ },
14
+ "initializer_range": 0.02,
15
+ "intermediate_size": 3072,
16
+ "label2id": {
17
+ "LABEL_0": 0
18
+ },
19
+ "layer_norm_eps": 1e-12,
20
+ "max_position_embeddings": 512,
21
+ "model_type": "bert",
22
+ "num_attention_heads": 12,
23
+ "num_hidden_layers": 12,
24
+ "pad_token_id": 0,
25
+ "position_embedding_type": "absolute",
26
+ "problem_type": "regression",
27
+ "torch_dtype": "float32",
28
+ "transformers_version": "4.44.2",
29
+ "type_vocab_size": 2,
30
+ "use_cache": true,
31
+ "vocab_size": 64000
32
+ }
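A hedged sketch of how these fields determine the task head: `"problem_type": "regression"` together with the single `id2label` entry gives `num_labels = 1`, so the classification head emits one continuous score and the Trainer uses an MSE loss. Assuming `config.json` sits in the working directory:

```python
# Hedged sketch: inspect the config fields that determine the task head.
from transformers import AutoConfig, AutoModelForSequenceClassification

config = AutoConfig.from_pretrained(".")  # reads ./config.json
print(config.problem_type)                # "regression"
print(config.num_labels)                  # 1, inferred from the single id2label entry
print(config.architectures)               # ["BertForSequenceClassification"]

# Building from the config alone gives randomly initialized weights; use
# from_pretrained on the full checkpoint to get the fine-tuned parameters.
model = AutoModelForSequenceClassification.from_config(config)
print(model.classifier.out_features)      # 1 output -> one organization score
```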
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:8027b664eefd087196bcda4e5bf9a25e241c58ff6eb8f6b186873d2ec7dcfde4
3
+ size 540799996
training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:f671349360810caf75a19c9e4d94f8307aac9aef3f402136f2cd8ad801e9ec67
3
+ size 5304
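`training_args.bin` is the `TrainingArguments` object the `Trainer` pickles alongside checkpoints. A hedged sketch of inspecting it (unpickling requires trusting the file and a compatible `transformers` install):

```python
# Hedged sketch: training_args.bin is a pickled TrainingArguments object.
import torch

training_args = torch.load("training_args.bin", weights_only=False)
print(type(training_args).__name__)    # TrainingArguments
print(training_args.learning_rate)     # 2e-05
print(training_args.num_train_epochs)  # 100
```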