MayBashendy committed on
Commit 2200dc3 · verified · 1 Parent(s): 9fbc8e5

Training in progress, step 500

Files changed (4)
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task5_organization
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task5_organization
+
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.4248
+ - Qwk: 0.1952
+ - Mse: 1.4248
+ - Rmse: 1.1937
+
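The Qwk, Mse, and Rmse values above come from the Trainer's `compute_metrics` callback, which is not included in this commit. A minimal sketch of how such metrics are commonly computed for a regression-formulated scoring head (rounding continuous predictions to integer bins for the kappa is an assumption) might look like this:

```python
# Hypothetical compute_metrics sketch -- the function actually used for this run is not in the commit.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    preds = np.asarray(preds).squeeze(-1)      # regression head emits one value per example
    labels = np.asarray(labels)
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # Quadratic weighted kappa needs discrete classes; rounding to the nearest
    # integer score is an assumption about how the scores were binned.
    qwk = cohen_kappa_score(np.rint(labels).astype(int),
                            np.rint(preds).astype(int),
                            weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```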
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
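Expressed with the `transformers` Trainer API, these settings correspond roughly to the sketch below. The output directory name is a placeholder, and the Adam betas/epsilon are simply the Trainer defaults, matching the values listed above.

```python
# Sketch only: the listed hyperparameters written out as TrainingArguments.
# The output_dir is hypothetical; model, datasets and metric function are not part of this commit.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task5_organization_run2",  # placeholder name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
# These arguments would then be passed to transformers.Trainer together with the
# fine-tuned model, the train/eval datasets, and the compute_metrics function.
```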
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.0690 | 2 | 4.0777 | -0.0074 | 4.0777 | 2.0193 |
+ | No log | 0.1379 | 4 | 2.2030 | -0.0553 | 2.2030 | 1.4843 |
+ | No log | 0.2069 | 6 | 2.0009 | 0.0116 | 2.0009 | 1.4145 |
+ | No log | 0.2759 | 8 | 1.5532 | -0.0009 | 1.5532 | 1.2463 |
+ | No log | 0.3448 | 10 | 1.2093 | 0.0545 | 1.2093 | 1.0997 |
+ | No log | 0.4138 | 12 | 1.2859 | 0.1247 | 1.2859 | 1.1340 |
+ | No log | 0.4828 | 14 | 1.8067 | 0.0705 | 1.8067 | 1.3441 |
+ | No log | 0.5517 | 16 | 2.6426 | -0.0373 | 2.6426 | 1.6256 |
+ | No log | 0.6207 | 18 | 1.8069 | 0.1185 | 1.8069 | 1.3442 |
+ | No log | 0.6897 | 20 | 1.5729 | 0.0937 | 1.5729 | 1.2542 |
+ | No log | 0.7586 | 22 | 1.6814 | 0.0429 | 1.6814 | 1.2967 |
+ | No log | 0.8276 | 24 | 1.3131 | 0.1173 | 1.3131 | 1.1459 |
+ | No log | 0.8966 | 26 | 1.1228 | 0.2520 | 1.1228 | 1.0596 |
+ | No log | 0.9655 | 28 | 1.0704 | 0.2114 | 1.0704 | 1.0346 |
+ | No log | 1.0345 | 30 | 1.0750 | 0.1732 | 1.0750 | 1.0368 |
+ | No log | 1.1034 | 32 | 1.1240 | 0.2590 | 1.1240 | 1.0602 |
+ | No log | 1.1724 | 34 | 1.1558 | 0.2212 | 1.1558 | 1.0751 |
+ | No log | 1.2414 | 36 | 1.1036 | 0.2590 | 1.1036 | 1.0505 |
+ | No log | 1.3103 | 38 | 1.0147 | 0.2352 | 1.0147 | 1.0073 |
+ | No log | 1.3793 | 40 | 1.0049 | 0.2448 | 1.0049 | 1.0025 |
+ | No log | 1.4483 | 42 | 1.0401 | 0.1969 | 1.0401 | 1.0199 |
+ | No log | 1.5172 | 44 | 1.0635 | 0.2089 | 1.0635 | 1.0313 |
+ | No log | 1.5862 | 46 | 1.3921 | 0.1220 | 1.3921 | 1.1799 |
+ | No log | 1.6552 | 48 | 1.4566 | 0.1245 | 1.4566 | 1.2069 |
+ | No log | 1.7241 | 50 | 1.2267 | 0.1085 | 1.2267 | 1.1076 |
+ | No log | 1.7931 | 52 | 1.0043 | 0.2517 | 1.0043 | 1.0022 |
+ | No log | 1.8621 | 54 | 1.1361 | 0.2528 | 1.1361 | 1.0659 |
+ | No log | 1.9310 | 56 | 1.1966 | 0.2015 | 1.1966 | 1.0939 |
+ | No log | 2.0 | 58 | 1.1022 | 0.1379 | 1.1022 | 1.0499 |
+ | No log | 2.0690 | 60 | 1.0088 | 0.2716 | 1.0088 | 1.0044 |
+ | No log | 2.1379 | 62 | 1.0367 | 0.2343 | 1.0367 | 1.0182 |
+ | No log | 2.2069 | 64 | 0.9705 | 0.3184 | 0.9705 | 0.9851 |
+ | No log | 2.2759 | 66 | 0.9682 | 0.2647 | 0.9682 | 0.9840 |
+ | No log | 2.3448 | 68 | 0.9275 | 0.2718 | 0.9275 | 0.9631 |
+ | No log | 2.4138 | 70 | 1.0230 | 0.2138 | 1.0230 | 1.0114 |
+ | No log | 2.4828 | 72 | 1.1596 | 0.0827 | 1.1596 | 1.0768 |
+ | No log | 2.5517 | 74 | 1.0546 | 0.1296 | 1.0546 | 1.0269 |
+ | No log | 2.6207 | 76 | 0.9545 | 0.3528 | 0.9545 | 0.9770 |
+ | No log | 2.6897 | 78 | 1.0927 | 0.3496 | 1.0927 | 1.0453 |
+ | No log | 2.7586 | 80 | 1.1954 | 0.3158 | 1.1954 | 1.0933 |
+ | No log | 2.8276 | 82 | 1.0532 | 0.3761 | 1.0532 | 1.0263 |
+ | No log | 2.8966 | 84 | 0.9922 | 0.2721 | 0.9922 | 0.9961 |
+ | No log | 2.9655 | 86 | 1.0108 | 0.2647 | 1.0108 | 1.0054 |
+ | No log | 3.0345 | 88 | 1.0813 | 0.2325 | 1.0813 | 1.0399 |
+ | No log | 3.1034 | 90 | 1.0408 | 0.2669 | 1.0408 | 1.0202 |
+ | No log | 3.1724 | 92 | 1.0243 | 0.1859 | 1.0243 | 1.0121 |
+ | No log | 3.2414 | 94 | 1.1254 | 0.3514 | 1.1254 | 1.0608 |
+ | No log | 3.3103 | 96 | 1.2117 | 0.3514 | 1.2117 | 1.1008 |
+ | No log | 3.3793 | 98 | 1.1550 | 0.3149 | 1.1550 | 1.0747 |
+ | No log | 3.4483 | 100 | 1.0993 | 0.2276 | 1.0993 | 1.0485 |
+ | No log | 3.5172 | 102 | 1.1970 | 0.3182 | 1.1970 | 1.0941 |
+ | No log | 3.5862 | 104 | 1.1925 | 0.3761 | 1.1925 | 1.0920 |
+ | No log | 3.6552 | 106 | 1.1372 | 0.2505 | 1.1372 | 1.0664 |
+ | No log | 3.7241 | 108 | 1.1344 | 0.2231 | 1.1344 | 1.0651 |
+ | No log | 3.7931 | 110 | 1.1560 | 0.1650 | 1.1560 | 1.0752 |
+ | No log | 3.8621 | 112 | 1.5916 | 0.2353 | 1.5916 | 1.2616 |
+ | No log | 3.9310 | 114 | 1.8023 | 0.2068 | 1.8023 | 1.3425 |
+ | No log | 4.0 | 116 | 1.5509 | 0.2506 | 1.5509 | 1.2454 |
+ | No log | 4.0690 | 118 | 1.0625 | 0.1996 | 1.0625 | 1.0308 |
+ | No log | 4.1379 | 120 | 1.1307 | 0.2352 | 1.1307 | 1.0634 |
+ | No log | 4.2069 | 122 | 1.1093 | 0.2736 | 1.1093 | 1.0532 |
+ | No log | 4.2759 | 124 | 0.9962 | 0.3392 | 0.9962 | 0.9981 |
+ | No log | 4.3448 | 126 | 1.0006 | 0.2941 | 1.0006 | 1.0003 |
+ | No log | 4.4138 | 128 | 1.0403 | 0.2012 | 1.0403 | 1.0200 |
+ | No log | 4.4828 | 130 | 1.0364 | 0.2416 | 1.0364 | 1.0180 |
+ | No log | 4.5517 | 132 | 0.9813 | 0.2887 | 0.9813 | 0.9906 |
+ | No log | 4.6207 | 134 | 0.9608 | 0.2791 | 0.9608 | 0.9802 |
+ | No log | 4.6897 | 136 | 1.0257 | 0.3405 | 1.0257 | 1.0128 |
+ | No log | 4.7586 | 138 | 1.3107 | 0.3114 | 1.3107 | 1.1448 |
+ | No log | 4.8276 | 140 | 1.3969 | 0.2806 | 1.3969 | 1.1819 |
+ | No log | 4.8966 | 142 | 1.1833 | 0.3091 | 1.1833 | 1.0878 |
+ | No log | 4.9655 | 144 | 0.9306 | 0.2239 | 0.9306 | 0.9647 |
+ | No log | 5.0345 | 146 | 1.0066 | 0.1944 | 1.0066 | 1.0033 |
+ | No log | 5.1034 | 148 | 0.9812 | 0.3344 | 0.9812 | 0.9906 |
+ | No log | 5.1724 | 150 | 1.0223 | 0.2535 | 1.0223 | 1.0111 |
+ | No log | 5.2414 | 152 | 1.3018 | 0.2958 | 1.3018 | 1.1410 |
+ | No log | 5.3103 | 154 | 1.3591 | 0.2958 | 1.3591 | 1.1658 |
+ | No log | 5.3793 | 156 | 1.1902 | 0.3021 | 1.1902 | 1.0910 |
+ | No log | 5.4483 | 158 | 1.0751 | 0.3434 | 1.0751 | 1.0369 |
+ | No log | 5.5172 | 160 | 0.9700 | 0.2702 | 0.9700 | 0.9849 |
+ | No log | 5.5862 | 162 | 0.9809 | 0.3176 | 0.9809 | 0.9904 |
+ | No log | 5.6552 | 164 | 1.0751 | 0.3434 | 1.0751 | 1.0369 |
+ | No log | 5.7241 | 166 | 1.3354 | 0.3319 | 1.3354 | 1.1556 |
+ | No log | 5.7931 | 168 | 1.5140 | 0.3022 | 1.5140 | 1.2304 |
+ | No log | 5.8621 | 170 | 1.7180 | 0.2103 | 1.7180 | 1.3107 |
+ | No log | 5.9310 | 172 | 1.7784 | 0.1291 | 1.7784 | 1.3336 |
+ | No log | 6.0 | 174 | 1.7985 | 0.1062 | 1.7985 | 1.3411 |
+ | No log | 6.0690 | 176 | 1.6318 | 0.1601 | 1.6318 | 1.2774 |
+ | No log | 6.1379 | 178 | 1.6002 | 0.2070 | 1.6002 | 1.2650 |
+ | No log | 6.2069 | 180 | 1.5862 | 0.2771 | 1.5862 | 1.2594 |
+ | No log | 6.2759 | 182 | 1.3962 | 0.3021 | 1.3962 | 1.1816 |
+ | No log | 6.3448 | 184 | 1.2736 | 0.3059 | 1.2736 | 1.1286 |
+ | No log | 6.4138 | 186 | 1.2394 | 0.2707 | 1.2394 | 1.1133 |
+ | No log | 6.4828 | 188 | 1.3231 | 0.2730 | 1.3231 | 1.1503 |
+ | No log | 6.5517 | 190 | 1.2925 | 0.2686 | 1.2925 | 1.1369 |
+ | No log | 6.6207 | 192 | 1.1695 | 0.2686 | 1.1695 | 1.0814 |
+ | No log | 6.6897 | 194 | 1.0991 | 0.3075 | 1.0991 | 1.0484 |
+ | No log | 6.7586 | 196 | 1.1485 | 0.3344 | 1.1485 | 1.0717 |
+ | No log | 6.8276 | 198 | 1.4350 | 0.2769 | 1.4350 | 1.1979 |
+ | No log | 6.8966 | 200 | 1.6563 | 0.2294 | 1.6563 | 1.2870 |
+ | No log | 6.9655 | 202 | 1.7351 | 0.1929 | 1.7351 | 1.3172 |
+ | No log | 7.0345 | 204 | 1.6633 | 0.2314 | 1.6633 | 1.2897 |
+ | No log | 7.1034 | 206 | 1.3814 | 0.2501 | 1.3814 | 1.1753 |
+ | No log | 7.1724 | 208 | 1.0909 | 0.3079 | 1.0909 | 1.0445 |
+ | No log | 7.2414 | 210 | 1.0686 | 0.3024 | 1.0686 | 1.0337 |
+ | No log | 7.3103 | 212 | 1.1751 | 0.3811 | 1.1751 | 1.0840 |
+ | No log | 7.3793 | 214 | 1.4166 | 0.2772 | 1.4166 | 1.1902 |
+ | No log | 7.4483 | 216 | 1.5202 | 0.2191 | 1.5202 | 1.2330 |
+ | No log | 7.5172 | 218 | 1.4574 | 0.2851 | 1.4574 | 1.2072 |
+ | No log | 7.5862 | 220 | 1.2213 | 0.3268 | 1.2213 | 1.1051 |
+ | No log | 7.6552 | 222 | 1.1044 | 0.2898 | 1.1044 | 1.0509 |
+ | No log | 7.7241 | 224 | 1.0758 | 0.2968 | 1.0758 | 1.0372 |
+ | No log | 7.7931 | 226 | 1.1352 | 0.3295 | 1.1352 | 1.0655 |
+ | No log | 7.8621 | 228 | 1.3372 | 0.2964 | 1.3372 | 1.1564 |
+ | No log | 7.9310 | 230 | 1.5431 | 0.2401 | 1.5431 | 1.2422 |
+ | No log | 8.0 | 232 | 1.5300 | 0.3130 | 1.5300 | 1.2369 |
+ | No log | 8.0690 | 234 | 1.2950 | 0.2572 | 1.2950 | 1.1380 |
+ | No log | 8.1379 | 236 | 1.2052 | 0.2584 | 1.2052 | 1.0978 |
+ | No log | 8.2069 | 238 | 1.2352 | 0.2308 | 1.2352 | 1.1114 |
+ | No log | 8.2759 | 240 | 1.4407 | 0.2863 | 1.4407 | 1.2003 |
+ | No log | 8.3448 | 242 | 1.6543 | 0.1631 | 1.6543 | 1.2862 |
+ | No log | 8.4138 | 244 | 1.6138 | 0.2339 | 1.6138 | 1.2704 |
+ | No log | 8.4828 | 246 | 1.4161 | 0.2555 | 1.4161 | 1.1900 |
+ | No log | 8.5517 | 248 | 1.1922 | 0.1795 | 1.1922 | 1.0919 |
+ | No log | 8.6207 | 250 | 1.1025 | 0.1343 | 1.1025 | 1.0500 |
+ | No log | 8.6897 | 252 | 1.1076 | 0.2047 | 1.1076 | 1.0524 |
+ | No log | 8.7586 | 254 | 1.1824 | 0.1730 | 1.1824 | 1.0874 |
+ | No log | 8.8276 | 256 | 1.3048 | 0.3004 | 1.3048 | 1.1423 |
+ | No log | 8.8966 | 258 | 1.5028 | 0.2731 | 1.5028 | 1.2259 |
+ | No log | 8.9655 | 260 | 1.5540 | 0.2270 | 1.5540 | 1.2466 |
+ | No log | 9.0345 | 262 | 1.4333 | 0.2975 | 1.4333 | 1.1972 |
+ | No log | 9.1034 | 264 | 1.2312 | 0.3130 | 1.2312 | 1.1096 |
+ | No log | 9.1724 | 266 | 1.1454 | 0.3115 | 1.1454 | 1.0702 |
+ | No log | 9.2414 | 268 | 1.1458 | 0.2857 | 1.1458 | 1.0704 |
+ | No log | 9.3103 | 270 | 1.2478 | 0.2640 | 1.2478 | 1.1170 |
+ | No log | 9.3793 | 272 | 1.3633 | 0.2260 | 1.3633 | 1.1676 |
+ | No log | 9.4483 | 274 | 1.4209 | 0.2555 | 1.4209 | 1.1920 |
+ | No log | 9.5172 | 276 | 1.5479 | 0.2511 | 1.5479 | 1.2441 |
+ | No log | 9.5862 | 278 | 1.5713 | 0.2533 | 1.5713 | 1.2535 |
+ | No log | 9.6552 | 280 | 1.4712 | 0.2159 | 1.4712 | 1.2129 |
+ | No log | 9.7241 | 282 | 1.3427 | 0.2230 | 1.3427 | 1.1587 |
+ | No log | 9.7931 | 284 | 1.2893 | 0.2038 | 1.2893 | 1.1355 |
+ | No log | 9.8621 | 286 | 1.2601 | 0.0661 | 1.2601 | 1.1225 |
+ | No log | 9.9310 | 288 | 1.2831 | 0.0661 | 1.2831 | 1.1328 |
+ | No log | 10.0 | 290 | 1.3784 | 0.1628 | 1.3784 | 1.1740 |
+ | No log | 10.0690 | 292 | 1.4598 | 0.2015 | 1.4598 | 1.2082 |
+ | No log | 10.1379 | 294 | 1.4200 | 0.2015 | 1.4200 | 1.1916 |
+ | No log | 10.2069 | 296 | 1.2808 | 0.2773 | 1.2808 | 1.1317 |
+ | No log | 10.2759 | 298 | 1.2366 | 0.2381 | 1.2366 | 1.1120 |
+ | No log | 10.3448 | 300 | 1.1945 | 0.2319 | 1.1945 | 1.0929 |
+ | No log | 10.4138 | 302 | 1.1812 | 0.2401 | 1.1812 | 1.0868 |
+ | No log | 10.4828 | 304 | 1.2583 | 0.2328 | 1.2583 | 1.1217 |
+ | No log | 10.5517 | 306 | 1.4523 | 0.2686 | 1.4523 | 1.2051 |
+ | No log | 10.6207 | 308 | 1.5213 | 0.2015 | 1.5213 | 1.2334 |
+ | No log | 10.6897 | 310 | 1.3975 | 0.1700 | 1.3975 | 1.1822 |
+ | No log | 10.7586 | 312 | 1.1690 | 0.1605 | 1.1690 | 1.0812 |
+ | No log | 10.8276 | 314 | 1.0938 | 0.2487 | 1.0938 | 1.0458 |
+ | No log | 10.8966 | 316 | 1.1801 | 0.2545 | 1.1801 | 1.0863 |
+ | No log | 10.9655 | 318 | 1.3224 | 0.2195 | 1.3224 | 1.1500 |
+ | No log | 11.0345 | 320 | 1.5279 | 0.1628 | 1.5279 | 1.2361 |
+ | No log | 11.1034 | 322 | 1.6426 | 0.1952 | 1.6426 | 1.2816 |
+ | No log | 11.1724 | 324 | 1.6486 | 0.2070 | 1.6486 | 1.2840 |
+ | No log | 11.2414 | 326 | 1.5838 | 0.2132 | 1.5838 | 1.2585 |
+ | No log | 11.3103 | 328 | 1.4186 | 0.2120 | 1.4186 | 1.1910 |
+ | No log | 11.3793 | 330 | 1.3323 | 0.1725 | 1.3323 | 1.1542 |
+ | No log | 11.4483 | 332 | 1.3696 | 0.1288 | 1.3696 | 1.1703 |
+ | No log | 11.5172 | 334 | 1.4180 | 0.1288 | 1.4180 | 1.1908 |
+ | No log | 11.5862 | 336 | 1.4414 | 0.1952 | 1.4414 | 1.2006 |
+ | No log | 11.6552 | 338 | 1.4568 | 0.1700 | 1.4568 | 1.2070 |
+ | No log | 11.7241 | 340 | 1.4582 | 0.1980 | 1.4582 | 1.2076 |
+ | No log | 11.7931 | 342 | 1.4446 | 0.2232 | 1.4446 | 1.2019 |
+ | No log | 11.8621 | 344 | 1.5240 | 0.1622 | 1.5240 | 1.2345 |
+ | No log | 11.9310 | 346 | 1.5550 | 0.1963 | 1.5550 | 1.2470 |
+ | No log | 12.0 | 348 | 1.6095 | 0.2223 | 1.6095 | 1.2686 |
+ | No log | 12.0690 | 350 | 1.5451 | 0.2132 | 1.5451 | 1.2430 |
+ | No log | 12.1379 | 352 | 1.4976 | 0.2132 | 1.4976 | 1.2238 |
+ | No log | 12.2069 | 354 | 1.5218 | 0.2315 | 1.5218 | 1.2336 |
+ | No log | 12.2759 | 356 | 1.4551 | 0.2206 | 1.4551 | 1.2063 |
+ | No log | 12.3448 | 358 | 1.4312 | 0.2495 | 1.4312 | 1.1963 |
+ | No log | 12.4138 | 360 | 1.3514 | 0.2315 | 1.3514 | 1.1625 |
+ | No log | 12.4828 | 362 | 1.2506 | 0.1700 | 1.2506 | 1.1183 |
+ | No log | 12.5517 | 364 | 1.1744 | 0.2424 | 1.1744 | 1.0837 |
+ | No log | 12.6207 | 366 | 1.1195 | 0.0990 | 1.1195 | 1.0581 |
+ | No log | 12.6897 | 368 | 1.0961 | 0.1019 | 1.0961 | 1.0469 |
+ | No log | 12.7586 | 370 | 1.1174 | 0.1590 | 1.1174 | 1.0571 |
+ | No log | 12.8276 | 372 | 1.2010 | 0.2195 | 1.2010 | 1.0959 |
+ | No log | 12.8966 | 374 | 1.3184 | 0.2089 | 1.3184 | 1.1482 |
+ | No log | 12.9655 | 376 | 1.4072 | 0.2315 | 1.4072 | 1.1862 |
+ | No log | 13.0345 | 378 | 1.3712 | 0.2015 | 1.3712 | 1.1710 |
+ | No log | 13.1034 | 380 | 1.2626 | 0.2283 | 1.2626 | 1.1237 |
+ | No log | 13.1724 | 382 | 1.2542 | 0.2283 | 1.2542 | 1.1199 |
+ | No log | 13.2414 | 384 | 1.3234 | 0.2395 | 1.3234 | 1.1504 |
+ | No log | 13.3103 | 386 | 1.4088 | 0.2417 | 1.4088 | 1.1869 |
+ | No log | 13.3793 | 388 | 1.4157 | 0.2315 | 1.4157 | 1.1898 |
+ | No log | 13.4483 | 390 | 1.4291 | 0.2315 | 1.4291 | 1.1955 |
+ | No log | 13.5172 | 392 | 1.2954 | 0.2395 | 1.2954 | 1.1382 |
+ | No log | 13.5862 | 394 | 1.1579 | 0.2587 | 1.1579 | 1.0760 |
+ | No log | 13.6552 | 396 | 1.1008 | 0.3453 | 1.1008 | 1.0492 |
+ | No log | 13.7241 | 398 | 1.1544 | 0.3302 | 1.1544 | 1.0745 |
+ | No log | 13.7931 | 400 | 1.2896 | 0.3280 | 1.2896 | 1.1356 |
+ | No log | 13.8621 | 402 | 1.4536 | 0.2291 | 1.4536 | 1.2056 |
+ | No log | 13.9310 | 404 | 1.4523 | 0.2132 | 1.4523 | 1.2051 |
+ | No log | 14.0 | 406 | 1.3569 | 0.1628 | 1.3569 | 1.1649 |
+ | No log | 14.0690 | 408 | 1.2561 | 0.1024 | 1.2561 | 1.1207 |
+ | No log | 14.1379 | 410 | 1.2137 | 0.0661 | 1.2137 | 1.1017 |
+ | No log | 14.2069 | 412 | 1.2105 | 0.0661 | 1.2105 | 1.1002 |
+ | No log | 14.2759 | 414 | 1.2943 | 0.1628 | 1.2943 | 1.1377 |
+ | No log | 14.3448 | 416 | 1.4275 | 0.1628 | 1.4275 | 1.1948 |
+ | No log | 14.4138 | 418 | 1.6213 | 0.1908 | 1.6213 | 1.2733 |
+ | No log | 14.4828 | 420 | 1.7265 | 0.2016 | 1.7265 | 1.3140 |
+ | No log | 14.5517 | 422 | 1.6858 | 0.2270 | 1.6858 | 1.2984 |
+ | No log | 14.6207 | 424 | 1.5951 | 0.2174 | 1.5951 | 1.2630 |
+ | No log | 14.6897 | 426 | 1.5293 | 0.2075 | 1.5293 | 1.2366 |
+ | No log | 14.7586 | 428 | 1.5205 | 0.1769 | 1.5205 | 1.2331 |
+ | No log | 14.8276 | 430 | 1.5197 | 0.2075 | 1.5197 | 1.2328 |
+ | No log | 14.8966 | 432 | 1.5783 | 0.1898 | 1.5783 | 1.2563 |
+ | No log | 14.9655 | 434 | 1.5847 | 0.2240 | 1.5847 | 1.2588 |
+ | No log | 15.0345 | 436 | 1.5304 | 0.2075 | 1.5304 | 1.2371 |
+ | No log | 15.1034 | 438 | 1.4690 | 0.1769 | 1.4690 | 1.2120 |
+ | No log | 15.1724 | 440 | 1.4156 | 0.1628 | 1.4156 | 1.1898 |
+ | No log | 15.2414 | 442 | 1.3694 | 0.1628 | 1.3694 | 1.1702 |
+ | No log | 15.3103 | 444 | 1.3671 | 0.1628 | 1.3671 | 1.1692 |
+ | No log | 15.3793 | 446 | 1.3579 | 0.1628 | 1.3579 | 1.1653 |
+ | No log | 15.4483 | 448 | 1.3262 | 0.1628 | 1.3262 | 1.1516 |
+ | No log | 15.5172 | 450 | 1.3183 | 0.1628 | 1.3183 | 1.1482 |
+ | No log | 15.5862 | 452 | 1.3863 | 0.1700 | 1.3863 | 1.1774 |
+ | No log | 15.6552 | 454 | 1.4346 | 0.2315 | 1.4346 | 1.1978 |
+ | No log | 15.7241 | 456 | 1.4326 | 0.2877 | 1.4326 | 1.1969 |
+ | No log | 15.7931 | 458 | 1.3967 | 0.2495 | 1.3967 | 1.1818 |
+ | No log | 15.8621 | 460 | 1.3378 | 0.1838 | 1.3378 | 1.1566 |
+ | No log | 15.9310 | 462 | 1.3057 | 0.1628 | 1.3057 | 1.1427 |
+ | No log | 16.0 | 464 | 1.3420 | 0.1628 | 1.3420 | 1.1584 |
+ | No log | 16.0690 | 466 | 1.2975 | 0.1628 | 1.2975 | 1.1391 |
+ | No log | 16.1379 | 468 | 1.1634 | 0.1081 | 1.1634 | 1.0786 |
+ | No log | 16.2069 | 470 | 1.0875 | 0.1322 | 1.0875 | 1.0428 |
+ | No log | 16.2759 | 472 | 1.1121 | 0.1259 | 1.1121 | 1.0545 |
+ | No log | 16.3448 | 474 | 1.2259 | 0.2227 | 1.2259 | 1.1072 |
+ | No log | 16.4138 | 476 | 1.4127 | 0.2602 | 1.4127 | 1.1886 |
+ | No log | 16.4828 | 478 | 1.5244 | 0.2315 | 1.5244 | 1.2347 |
+ | No log | 16.5517 | 480 | 1.5191 | 0.1952 | 1.5191 | 1.2325 |
+ | No log | 16.6207 | 482 | 1.4513 | 0.1838 | 1.4513 | 1.2047 |
+ | No log | 16.6897 | 484 | 1.4104 | 0.2283 | 1.4104 | 1.1876 |
+ | No log | 16.7586 | 486 | 1.3428 | 0.2283 | 1.3428 | 1.1588 |
+ | No log | 16.8276 | 488 | 1.2253 | 0.3314 | 1.2253 | 1.1070 |
+ | No log | 16.8966 | 490 | 1.1247 | 0.3191 | 1.1247 | 1.0605 |
+ | No log | 16.9655 | 492 | 1.1062 | 0.3000 | 1.1062 | 1.0518 |
+ | No log | 17.0345 | 494 | 1.1538 | 0.3018 | 1.1538 | 1.0742 |
+ | No log | 17.1034 | 496 | 1.1801 | 0.2640 | 1.1801 | 1.0863 |
+ | No log | 17.1724 | 498 | 1.1603 | 0.2260 | 1.1603 | 1.0772 |
+ | 0.3399 | 17.2414 | 500 | 1.1964 | 0.2260 | 1.1964 | 1.0938 |
+ | 0.3399 | 17.3103 | 502 | 1.1922 | 0.1838 | 1.1922 | 1.0919 |
+ | 0.3399 | 17.3793 | 504 | 1.2346 | 0.2260 | 1.2346 | 1.1111 |
+ | 0.3399 | 17.4483 | 506 | 1.3234 | 0.2260 | 1.3234 | 1.1504 |
+ | 0.3399 | 17.5172 | 508 | 1.3990 | 0.1952 | 1.3990 | 1.1828 |
+ | 0.3399 | 17.5862 | 510 | 1.4248 | 0.1952 | 1.4248 | 1.1937 |
+
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - Pytorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
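For completeness, a hedged usage sketch for the uploaded checkpoint follows. The repo id is inferred from the model name above and may differ, the tokenizer is assumed to be uploaded alongside the weights, and the single regression logit is read as the predicted organization score.

```python
# Usage sketch; repo_id is inferred from the model name and may differ,
# and it assumes the tokenizer files are also present in the repository.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)  # config has problem_type="regression"

text = "..."  # an Arabic essay to score
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # one continuous score per input
print(score)
```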
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c6ac373da80615f42926e4df109d7a20a30cf65e40a950187d8ad6da6edec58c
+ size 540799996
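The entry above is a Git LFS pointer, not the weights themselves; the ~540 MB `model.safetensors` file is resolved from the LFS store when the repository is downloaded. With `huggingface_hub` that might look like the sketch below (repo id again inferred from the model name, so an assumption):

```python
# Sketch: fetch the real safetensors file behind the LFS pointer above.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task5_organization",
    filename="model.safetensors",
)
print(local_path)  # local cache path of the downloaded weights
```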
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9dc2de8500192b9e3a4a84b0c40852b8d30ab634526ed37e240820e773ee4a6b
+ size 5368