MayBashendy committed
Commit 0c8df73 · verified · 1 parent: bb0287b

Training in progress, step 500

Files changed (4)
  1. README.md +320 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,320 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization
+
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.0495
+ - Qwk: 0.2108
+ - Mse: 1.0495
+ - Rmse: 1.0245
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.0385 | 2 | 3.9895 | -0.0319 | 3.9895 | 1.9974 |
+ | No log | 0.0769 | 4 | 2.4094 | -0.0234 | 2.4094 | 1.5522 |
+ | No log | 0.1154 | 6 | 1.8147 | 0.0008 | 1.8147 | 1.3471 |
+ | No log | 0.1538 | 8 | 1.2220 | 0.2049 | 1.2220 | 1.1054 |
+ | No log | 0.1923 | 10 | 1.1949 | 0.2192 | 1.1949 | 1.0931 |
+ | No log | 0.2308 | 12 | 1.8593 | 0.0811 | 1.8593 | 1.3636 |
+ | No log | 0.2692 | 14 | 2.2208 | 0.0970 | 2.2208 | 1.4902 |
+ | No log | 0.3077 | 16 | 1.6122 | 0.0651 | 1.6122 | 1.2697 |
+ | No log | 0.3462 | 18 | 1.3083 | 0.0 | 1.3083 | 1.1438 |
+ | No log | 0.3846 | 20 | 1.2208 | 0.0116 | 1.2208 | 1.1049 |
+ | No log | 0.4231 | 22 | 1.3889 | 0.0 | 1.3889 | 1.1785 |
+ | No log | 0.4615 | 24 | 1.3903 | 0.0 | 1.3903 | 1.1791 |
+ | No log | 0.5 | 26 | 1.2782 | 0.0116 | 1.2782 | 1.1306 |
+ | No log | 0.5385 | 28 | 1.2673 | 0.0116 | 1.2673 | 1.1258 |
+ | No log | 0.5769 | 30 | 1.1549 | 0.1114 | 1.1549 | 1.0746 |
+ | No log | 0.6154 | 32 | 1.2344 | 0.1525 | 1.2344 | 1.1110 |
+ | No log | 0.6538 | 34 | 1.5694 | 0.0 | 1.5694 | 1.2528 |
+ | No log | 0.6923 | 36 | 1.6453 | 0.0532 | 1.6453 | 1.2827 |
+ | No log | 0.7308 | 38 | 1.4894 | 0.0530 | 1.4894 | 1.2204 |
+ | No log | 0.7692 | 40 | 1.3520 | 0.1085 | 1.3520 | 1.1628 |
+ | No log | 0.8077 | 42 | 1.1858 | 0.0640 | 1.1858 | 1.0889 |
+ | No log | 0.8462 | 44 | 1.1147 | 0.1465 | 1.1147 | 1.0558 |
+ | No log | 0.8846 | 46 | 1.1093 | 0.2068 | 1.1093 | 1.0532 |
+ | No log | 0.9231 | 48 | 1.2401 | 0.1057 | 1.2401 | 1.1136 |
+ | No log | 0.9615 | 50 | 1.3351 | 0.2071 | 1.3351 | 1.1554 |
+ | No log | 1.0 | 52 | 1.4306 | 0.2174 | 1.4306 | 1.1961 |
+ | No log | 1.0385 | 54 | 1.7799 | 0.0494 | 1.7799 | 1.3341 |
+ | No log | 1.0769 | 56 | 2.0608 | 0.0422 | 2.0608 | 1.4355 |
+ | No log | 1.1154 | 58 | 1.7094 | 0.0769 | 1.7094 | 1.3075 |
+ | No log | 1.1538 | 60 | 1.3516 | 0.2371 | 1.3516 | 1.1626 |
+ | No log | 1.1923 | 62 | 1.1986 | 0.2956 | 1.1986 | 1.0948 |
+ | No log | 1.2308 | 64 | 1.1289 | 0.2114 | 1.1289 | 1.0625 |
+ | No log | 1.2692 | 66 | 1.1865 | 0.2448 | 1.1865 | 1.0893 |
+ | No log | 1.3077 | 68 | 1.2167 | 0.2448 | 1.2167 | 1.1031 |
+ | No log | 1.3462 | 70 | 1.1566 | 0.1631 | 1.1566 | 1.0754 |
+ | No log | 1.3846 | 72 | 1.1039 | 0.0884 | 1.1039 | 1.0507 |
+ | No log | 1.4231 | 74 | 1.1189 | 0.0454 | 1.1189 | 1.0578 |
+ | No log | 1.4615 | 76 | 1.1398 | 0.0458 | 1.1398 | 1.0676 |
+ | No log | 1.5 | 78 | 1.1463 | 0.1033 | 1.1463 | 1.0706 |
+ | No log | 1.5385 | 80 | 1.1344 | 0.0791 | 1.1344 | 1.0651 |
+ | No log | 1.5769 | 82 | 1.1340 | 0.0941 | 1.1340 | 1.0649 |
+ | No log | 1.6154 | 84 | 1.0970 | 0.0153 | 1.0970 | 1.0474 |
+ | No log | 1.6538 | 86 | 1.1114 | 0.1783 | 1.1114 | 1.0542 |
+ | No log | 1.6923 | 88 | 1.1726 | 0.2108 | 1.1726 | 1.0829 |
+ | No log | 1.7308 | 90 | 1.1235 | 0.1727 | 1.1235 | 1.0599 |
+ | No log | 1.7692 | 92 | 1.0471 | 0.1067 | 1.0471 | 1.0233 |
+ | No log | 1.8077 | 94 | 1.0779 | 0.2203 | 1.0779 | 1.0382 |
+ | No log | 1.8462 | 96 | 1.1292 | 0.2925 | 1.1292 | 1.0626 |
+ | No log | 1.8846 | 98 | 1.0529 | 0.2647 | 1.0529 | 1.0261 |
+ | No log | 1.9231 | 100 | 1.0341 | 0.2265 | 1.0341 | 1.0169 |
+ | No log | 1.9615 | 102 | 1.1046 | 0.2154 | 1.1046 | 1.0510 |
+ | No log | 2.0 | 104 | 1.1120 | 0.2154 | 1.1120 | 1.0545 |
+ | No log | 2.0385 | 106 | 1.1712 | 0.2375 | 1.1712 | 1.0822 |
+ | No log | 2.0769 | 108 | 1.1091 | 0.3229 | 1.1091 | 1.0532 |
+ | No log | 2.1154 | 110 | 0.9890 | 0.2239 | 0.9890 | 0.9945 |
+ | No log | 2.1538 | 112 | 0.9578 | 0.2888 | 0.9578 | 0.9787 |
+ | No log | 2.1923 | 114 | 0.9526 | 0.3614 | 0.9526 | 0.9760 |
+ | No log | 2.2308 | 116 | 0.9756 | 0.4030 | 0.9756 | 0.9877 |
+ | No log | 2.2692 | 118 | 1.0543 | 0.3822 | 1.0543 | 1.0268 |
+ | No log | 2.3077 | 120 | 1.0903 | 0.3318 | 1.0903 | 1.0442 |
+ | No log | 2.3462 | 122 | 1.2471 | 0.2655 | 1.2471 | 1.1167 |
+ | No log | 2.3846 | 124 | 1.2992 | 0.2033 | 1.2992 | 1.1398 |
+ | No log | 2.4231 | 126 | 1.2077 | 0.2333 | 1.2077 | 1.0990 |
+ | No log | 2.4615 | 128 | 1.2120 | 0.1859 | 1.2120 | 1.1009 |
+ | No log | 2.5 | 130 | 1.3328 | 0.2043 | 1.3328 | 1.1545 |
+ | No log | 2.5385 | 132 | 1.4944 | 0.1129 | 1.4944 | 1.2225 |
+ | No log | 2.5769 | 134 | 1.3687 | 0.1712 | 1.3687 | 1.1699 |
+ | No log | 2.6154 | 136 | 1.1666 | 0.2310 | 1.1666 | 1.0801 |
+ | No log | 2.6538 | 138 | 1.0825 | 0.2651 | 1.0825 | 1.0404 |
+ | No log | 2.6923 | 140 | 1.0908 | 0.3124 | 1.0908 | 1.0444 |
+ | No log | 2.7308 | 142 | 1.1413 | 0.2887 | 1.1413 | 1.0683 |
+ | No log | 2.7692 | 144 | 1.0605 | 0.2088 | 1.0605 | 1.0298 |
+ | No log | 2.8077 | 146 | 1.0012 | 0.2380 | 1.0012 | 1.0006 |
+ | No log | 2.8462 | 148 | 1.0572 | 0.3800 | 1.0572 | 1.0282 |
+ | No log | 2.8846 | 150 | 1.0578 | 0.4035 | 1.0578 | 1.0285 |
+ | No log | 2.9231 | 152 | 0.9872 | 0.4757 | 0.9872 | 0.9936 |
+ | No log | 2.9615 | 154 | 1.0135 | 0.3993 | 1.0135 | 1.0067 |
+ | No log | 3.0 | 156 | 1.1629 | 0.3182 | 1.1629 | 1.0784 |
+ | No log | 3.0385 | 158 | 1.2497 | 0.3059 | 1.2497 | 1.1179 |
+ | No log | 3.0769 | 160 | 1.1032 | 0.3506 | 1.1032 | 1.0503 |
+ | No log | 3.1154 | 162 | 0.9620 | 0.3149 | 0.9620 | 0.9808 |
+ | No log | 3.1538 | 164 | 0.9281 | 0.3737 | 0.9281 | 0.9634 |
+ | No log | 3.1923 | 166 | 0.9184 | 0.3896 | 0.9184 | 0.9583 |
+ | No log | 3.2308 | 168 | 0.8923 | 0.4083 | 0.8923 | 0.9446 |
+ | No log | 3.2692 | 170 | 0.9280 | 0.4082 | 0.9280 | 0.9633 |
+ | No log | 3.3077 | 172 | 0.9233 | 0.3682 | 0.9233 | 0.9609 |
+ | No log | 3.3462 | 174 | 0.9811 | 0.3785 | 0.9811 | 0.9905 |
+ | No log | 3.3846 | 176 | 1.0080 | 0.3424 | 1.0080 | 1.0040 |
+ | No log | 3.4231 | 178 | 1.0065 | 0.2628 | 1.0065 | 1.0032 |
+ | No log | 3.4615 | 180 | 0.9128 | 0.2291 | 0.9128 | 0.9554 |
+ | No log | 3.5 | 182 | 0.8865 | 0.3328 | 0.8865 | 0.9415 |
+ | No log | 3.5385 | 184 | 0.8664 | 0.4498 | 0.8664 | 0.9308 |
+ | No log | 3.5769 | 186 | 0.8834 | 0.3151 | 0.8834 | 0.9399 |
+ | No log | 3.6154 | 188 | 0.9954 | 0.3503 | 0.9954 | 0.9977 |
+ | No log | 3.6538 | 190 | 1.0599 | 0.3444 | 1.0599 | 1.0295 |
+ | No log | 3.6923 | 192 | 1.0136 | 0.2965 | 1.0136 | 1.0068 |
+ | No log | 3.7308 | 194 | 0.9893 | 0.2718 | 0.9893 | 0.9947 |
+ | No log | 3.7692 | 196 | 0.9656 | 0.2742 | 0.9656 | 0.9826 |
+ | No log | 3.8077 | 198 | 0.9894 | 0.3841 | 0.9894 | 0.9947 |
+ | No log | 3.8462 | 200 | 1.0555 | 0.3414 | 1.0555 | 1.0274 |
+ | No log | 3.8846 | 202 | 1.0301 | 0.3663 | 1.0301 | 1.0149 |
+ | No log | 3.9231 | 204 | 0.9628 | 0.3011 | 0.9628 | 0.9812 |
+ | No log | 3.9615 | 206 | 0.9711 | 0.1971 | 0.9711 | 0.9854 |
+ | No log | 4.0 | 208 | 0.9672 | 0.2692 | 0.9672 | 0.9835 |
+ | No log | 4.0385 | 210 | 1.0321 | 0.3207 | 1.0321 | 1.0159 |
+ | No log | 4.0769 | 212 | 1.1018 | 0.3405 | 1.1018 | 1.0496 |
+ | No log | 4.1154 | 214 | 1.0422 | 0.3483 | 1.0422 | 1.0209 |
+ | No log | 4.1538 | 216 | 0.9663 | 0.3067 | 0.9663 | 0.9830 |
+ | No log | 4.1923 | 218 | 0.9665 | 0.2577 | 0.9665 | 0.9831 |
+ | No log | 4.2308 | 220 | 0.9717 | 0.2721 | 0.9717 | 0.9857 |
+ | No log | 4.2692 | 222 | 1.0174 | 0.3921 | 1.0174 | 1.0087 |
+ | No log | 4.3077 | 224 | 1.1078 | 0.3677 | 1.1078 | 1.0525 |
+ | No log | 4.3462 | 226 | 1.1199 | 0.3654 | 1.1199 | 1.0583 |
+ | No log | 4.3846 | 228 | 1.0264 | 0.2819 | 1.0264 | 1.0131 |
+ | No log | 4.4231 | 230 | 0.9658 | 0.2239 | 0.9658 | 0.9828 |
+ | No log | 4.4615 | 232 | 0.9816 | 0.2746 | 0.9816 | 0.9908 |
+ | No log | 4.5 | 234 | 0.9657 | 0.2214 | 0.9657 | 0.9827 |
+ | No log | 4.5385 | 236 | 0.9391 | 0.2667 | 0.9391 | 0.9691 |
+ | No log | 4.5769 | 238 | 0.9201 | 0.2794 | 0.9201 | 0.9592 |
+ | No log | 4.6154 | 240 | 0.9369 | 0.4106 | 0.9369 | 0.9679 |
+ | No log | 4.6538 | 242 | 1.0534 | 0.3409 | 1.0534 | 1.0264 |
+ | No log | 4.6923 | 244 | 1.0831 | 0.3409 | 1.0831 | 1.0407 |
+ | No log | 4.7308 | 246 | 1.0801 | 0.3790 | 1.0801 | 1.0393 |
+ | No log | 4.7692 | 248 | 0.9713 | 0.2896 | 0.9713 | 0.9855 |
+ | No log | 4.8077 | 250 | 0.9102 | 0.3094 | 0.9102 | 0.9540 |
+ | No log | 4.8462 | 252 | 0.9161 | 0.3323 | 0.9161 | 0.9571 |
+ | No log | 4.8846 | 254 | 0.9278 | 0.3094 | 0.9278 | 0.9632 |
+ | No log | 4.9231 | 256 | 0.9744 | 0.2187 | 0.9744 | 0.9871 |
+ | No log | 4.9615 | 258 | 1.1460 | 0.3470 | 1.1460 | 1.0705 |
+ | No log | 5.0 | 260 | 1.2211 | 0.2851 | 1.2211 | 1.1050 |
+ | No log | 5.0385 | 262 | 1.1109 | 0.2464 | 1.1109 | 1.0540 |
+ | No log | 5.0769 | 264 | 1.0262 | 0.2865 | 1.0262 | 1.0130 |
+ | No log | 5.1154 | 266 | 1.0281 | 0.2291 | 1.0281 | 1.0140 |
+ | No log | 5.1538 | 268 | 1.0290 | 0.2865 | 1.0290 | 1.0144 |
+ | No log | 5.1923 | 270 | 1.0221 | 0.3008 | 1.0221 | 1.0110 |
+ | No log | 5.2308 | 272 | 1.0081 | 0.3840 | 1.0081 | 1.0040 |
+ | No log | 5.2692 | 274 | 1.0400 | 0.3543 | 1.0400 | 1.0198 |
+ | No log | 5.3077 | 276 | 1.1558 | 0.1919 | 1.1558 | 1.0751 |
+ | No log | 5.3462 | 278 | 1.2620 | 0.1398 | 1.2620 | 1.1234 |
+ | No log | 5.3846 | 280 | 1.1674 | 0.2424 | 1.1674 | 1.0804 |
+ | No log | 5.4231 | 282 | 1.0326 | 0.3523 | 1.0326 | 1.0162 |
+ | No log | 5.4615 | 284 | 0.9759 | 0.3280 | 0.9759 | 0.9879 |
+ | No log | 5.5 | 286 | 0.9759 | 0.2400 | 0.9759 | 0.9879 |
+ | No log | 5.5385 | 288 | 0.9775 | 0.3134 | 0.9775 | 0.9887 |
+ | No log | 5.5769 | 290 | 1.0561 | 0.2771 | 1.0561 | 1.0277 |
+ | No log | 5.6154 | 292 | 1.1517 | 0.2149 | 1.1517 | 1.0732 |
+ | No log | 5.6538 | 294 | 1.1281 | 0.2574 | 1.1281 | 1.0621 |
+ | No log | 5.6923 | 296 | 1.0435 | 0.2492 | 1.0435 | 1.0215 |
+ | No log | 5.7308 | 298 | 1.0110 | 0.2140 | 1.0110 | 1.0055 |
+ | No log | 5.7692 | 300 | 0.9847 | 0.3258 | 0.9847 | 0.9923 |
+ | No log | 5.8077 | 302 | 1.0193 | 0.3424 | 1.0193 | 1.0096 |
+ | No log | 5.8462 | 304 | 1.1097 | 0.3409 | 1.1097 | 1.0534 |
+ | No log | 5.8846 | 306 | 1.1040 | 0.3436 | 1.1040 | 1.0507 |
+ | No log | 5.9231 | 308 | 1.0028 | 0.4041 | 1.0028 | 1.0014 |
+ | No log | 5.9615 | 310 | 0.9629 | 0.2692 | 0.9629 | 0.9813 |
+ | No log | 6.0 | 312 | 0.9865 | 0.2217 | 0.9865 | 0.9932 |
+ | No log | 6.0385 | 314 | 1.0142 | 0.2391 | 1.0142 | 1.0071 |
+ | No log | 6.0769 | 316 | 0.9852 | 0.2068 | 0.9852 | 0.9926 |
+ | No log | 6.1154 | 318 | 0.9690 | 0.2794 | 0.9690 | 0.9844 |
+ | No log | 6.1538 | 320 | 1.0075 | 0.2963 | 1.0075 | 1.0037 |
+ | No log | 6.1923 | 322 | 1.0032 | 0.3264 | 1.0032 | 1.0016 |
+ | No log | 6.2308 | 324 | 0.9756 | 0.3272 | 0.9756 | 0.9877 |
+ | No log | 6.2692 | 326 | 0.9783 | 0.3840 | 0.9783 | 0.9891 |
+ | No log | 6.3077 | 328 | 0.9751 | 0.3840 | 0.9751 | 0.9874 |
+ | No log | 6.3462 | 330 | 0.9649 | 0.3403 | 0.9649 | 0.9823 |
+ | No log | 6.3846 | 332 | 0.9648 | 0.3403 | 0.9648 | 0.9823 |
+ | No log | 6.4231 | 334 | 0.9651 | 0.2944 | 0.9651 | 0.9824 |
+ | No log | 6.4615 | 336 | 0.9696 | 0.2842 | 0.9696 | 0.9847 |
+ | No log | 6.5 | 338 | 0.9643 | 0.3172 | 0.9643 | 0.9820 |
+ | No log | 6.5385 | 340 | 0.9523 | 0.3455 | 0.9523 | 0.9759 |
+ | No log | 6.5769 | 342 | 0.9485 | 0.3740 | 0.9485 | 0.9739 |
+ | No log | 6.6154 | 344 | 0.9567 | 0.3977 | 0.9567 | 0.9781 |
+ | No log | 6.6538 | 346 | 0.9654 | 0.3821 | 0.9654 | 0.9826 |
+ | No log | 6.6923 | 348 | 1.0130 | 0.3414 | 1.0130 | 1.0065 |
+ | No log | 6.7308 | 350 | 1.0148 | 0.2941 | 1.0148 | 1.0074 |
+ | No log | 6.7692 | 352 | 0.9707 | 0.2794 | 0.9707 | 0.9852 |
+ | No log | 6.8077 | 354 | 0.9960 | 0.2978 | 0.9960 | 0.9980 |
+ | No log | 6.8462 | 356 | 1.0246 | 0.2666 | 1.0246 | 1.0122 |
+ | No log | 6.8846 | 358 | 1.0015 | 0.2690 | 1.0015 | 1.0007 |
+ | No log | 6.9231 | 360 | 0.9930 | 0.2794 | 0.9930 | 0.9965 |
+ | No log | 6.9615 | 362 | 1.0774 | 0.3330 | 1.0774 | 1.0380 |
+ | No log | 7.0 | 364 | 1.0676 | 0.3207 | 1.0676 | 1.0333 |
+ | No log | 7.0385 | 366 | 0.9947 | 0.2770 | 0.9947 | 0.9973 |
+ | No log | 7.0769 | 368 | 0.9603 | 0.2944 | 0.9603 | 0.9799 |
+ | No log | 7.1154 | 370 | 0.9500 | 0.2841 | 0.9500 | 0.9747 |
+ | No log | 7.1538 | 372 | 0.9434 | 0.3156 | 0.9434 | 0.9713 |
+ | No log | 7.1923 | 374 | 0.9384 | 0.3603 | 0.9384 | 0.9687 |
+ | No log | 7.2308 | 376 | 0.9272 | 0.3603 | 0.9272 | 0.9629 |
+ | No log | 7.2692 | 378 | 0.9350 | 0.3702 | 0.9350 | 0.9670 |
+ | No log | 7.3077 | 380 | 0.9613 | 0.3372 | 0.9613 | 0.9805 |
+ | No log | 7.3462 | 382 | 0.9452 | 0.3089 | 0.9452 | 0.9722 |
+ | No log | 7.3846 | 384 | 0.9459 | 0.2416 | 0.9459 | 0.9726 |
+ | No log | 7.4231 | 386 | 0.9598 | 0.2291 | 0.9598 | 0.9797 |
+ | No log | 7.4615 | 388 | 0.9653 | 0.2742 | 0.9653 | 0.9825 |
+ | No log | 7.5 | 390 | 0.9691 | 0.2842 | 0.9691 | 0.9844 |
+ | No log | 7.5385 | 392 | 0.9761 | 0.3008 | 0.9761 | 0.9880 |
+ | No log | 7.5769 | 394 | 0.9992 | 0.3663 | 0.9992 | 0.9996 |
+ | No log | 7.6154 | 396 | 0.9769 | 0.3414 | 0.9769 | 0.9884 |
+ | No log | 7.6538 | 398 | 0.9651 | 0.3030 | 0.9651 | 0.9824 |
+ | No log | 7.6923 | 400 | 0.9622 | 0.3172 | 0.9622 | 0.9809 |
+ | No log | 7.7308 | 402 | 0.9998 | 0.3372 | 0.9998 | 0.9999 |
+ | No log | 7.7692 | 404 | 1.0088 | 0.2963 | 1.0088 | 1.0044 |
+ | No log | 7.8077 | 406 | 0.9839 | 0.3393 | 0.9839 | 0.9919 |
+ | No log | 7.8462 | 408 | 0.9671 | 0.2865 | 0.9671 | 0.9834 |
+ | No log | 7.8846 | 410 | 0.9644 | 0.2692 | 0.9644 | 0.9820 |
+ | No log | 7.9231 | 412 | 0.9581 | 0.2692 | 0.9581 | 0.9788 |
+ | No log | 7.9615 | 414 | 0.9505 | 0.2991 | 0.9505 | 0.9749 |
+ | No log | 8.0 | 416 | 0.9457 | 0.3011 | 0.9457 | 0.9725 |
+ | No log | 8.0385 | 418 | 0.9325 | 0.3403 | 0.9325 | 0.9657 |
+ | No log | 8.0769 | 420 | 0.9542 | 0.3360 | 0.9542 | 0.9768 |
+ | No log | 8.1154 | 422 | 0.9404 | 0.3214 | 0.9404 | 0.9697 |
+ | No log | 8.1538 | 424 | 0.9101 | 0.3922 | 0.9101 | 0.9540 |
+ | No log | 8.1923 | 426 | 0.9232 | 0.3476 | 0.9232 | 0.9608 |
+ | No log | 8.2308 | 428 | 0.9540 | 0.2873 | 0.9540 | 0.9767 |
+ | No log | 8.2692 | 430 | 1.0149 | 0.2724 | 1.0149 | 1.0074 |
+ | No log | 8.3077 | 432 | 1.0225 | 0.2724 | 1.0225 | 1.0112 |
+ | No log | 8.3462 | 434 | 0.9641 | 0.2849 | 0.9641 | 0.9819 |
+ | No log | 8.3846 | 436 | 0.9451 | 0.2746 | 0.9451 | 0.9722 |
+ | No log | 8.4231 | 438 | 0.9557 | 0.3763 | 0.9557 | 0.9776 |
+ | No log | 8.4615 | 440 | 0.9602 | 0.3623 | 0.9602 | 0.9799 |
+ | No log | 8.5 | 442 | 0.9452 | 0.3782 | 0.9452 | 0.9722 |
+ | No log | 8.5385 | 444 | 0.9549 | 0.4326 | 0.9549 | 0.9772 |
+ | No log | 8.5769 | 446 | 0.9548 | 0.4214 | 0.9548 | 0.9771 |
+ | No log | 8.6154 | 448 | 0.9416 | 0.4119 | 0.9416 | 0.9704 |
+ | No log | 8.6538 | 450 | 0.9478 | 0.3990 | 0.9478 | 0.9736 |
+ | No log | 8.6923 | 452 | 0.9744 | 0.4197 | 0.9744 | 0.9871 |
+ | No log | 8.7308 | 454 | 1.0438 | 0.3176 | 1.0438 | 1.0217 |
+ | No log | 8.7692 | 456 | 1.0646 | 0.2842 | 1.0646 | 1.0318 |
+ | No log | 8.8077 | 458 | 1.0499 | 0.3243 | 1.0499 | 1.0246 |
+ | No log | 8.8462 | 460 | 1.0849 | 0.3938 | 1.0849 | 1.0416 |
+ | No log | 8.8846 | 462 | 1.1348 | 0.3677 | 1.1348 | 1.0652 |
+ | No log | 8.9231 | 464 | 1.0873 | 0.3938 | 1.0873 | 1.0428 |
+ | No log | 8.9615 | 466 | 1.0389 | 0.2908 | 1.0389 | 1.0193 |
+ | No log | 9.0 | 468 | 1.0242 | 0.2865 | 1.0242 | 1.0120 |
+ | No log | 9.0385 | 470 | 1.0188 | 0.2226 | 1.0188 | 1.0094 |
+ | No log | 9.0769 | 472 | 1.0027 | 0.1908 | 1.0027 | 1.0014 |
+ | No log | 9.1154 | 474 | 0.9831 | 0.2517 | 0.9831 | 0.9915 |
+ | No log | 9.1538 | 476 | 0.9793 | 0.2492 | 0.9793 | 0.9896 |
+ | No log | 9.1923 | 478 | 0.9796 | 0.2492 | 0.9796 | 0.9897 |
+ | No log | 9.2308 | 480 | 1.0076 | 0.2623 | 1.0076 | 1.0038 |
+ | No log | 9.2692 | 482 | 1.0491 | 0.2941 | 1.0491 | 1.0243 |
+ | No log | 9.3077 | 484 | 1.0206 | 0.2941 | 1.0206 | 1.0102 |
+ | No log | 9.3462 | 486 | 0.9796 | 0.2467 | 0.9796 | 0.9897 |
+ | No log | 9.3846 | 488 | 0.9570 | 0.2291 | 0.9570 | 0.9783 |
+ | No log | 9.4231 | 490 | 0.9689 | 0.2490 | 0.9689 | 0.9843 |
+ | No log | 9.4615 | 492 | 0.9754 | 0.1918 | 0.9754 | 0.9876 |
+ | No log | 9.5 | 494 | 1.0066 | 0.2517 | 1.0066 | 1.0033 |
+ | No log | 9.5385 | 496 | 1.0662 | 0.3192 | 1.0662 | 1.0326 |
+ | No log | 9.5769 | 498 | 1.0779 | 0.2529 | 1.0779 | 1.0382 |
+ | 0.3119 | 9.6154 | 500 | 1.0520 | 0.3523 | 1.0520 | 1.0257 |
+ | 0.3119 | 9.6538 | 502 | 1.0143 | 0.3214 | 1.0143 | 1.0071 |
+ | 0.3119 | 9.6923 | 504 | 0.9721 | 0.2416 | 0.9721 | 0.9860 |
+ | 0.3119 | 9.7308 | 506 | 0.9607 | 0.2794 | 0.9607 | 0.9801 |
+ | 0.3119 | 9.7692 | 508 | 0.9735 | 0.2897 | 0.9735 | 0.9867 |
+ | 0.3119 | 9.8077 | 510 | 1.0330 | 0.3617 | 1.0330 | 1.0164 |
+ | 0.3119 | 9.8462 | 512 | 1.0397 | 0.3617 | 1.0397 | 1.0196 |
+ | 0.3119 | 9.8846 | 514 | 1.0060 | 0.3001 | 1.0060 | 1.0030 |
+ | 0.3119 | 9.9231 | 516 | 0.9824 | 0.3001 | 0.9824 | 0.9912 |
+ | 0.3119 | 9.9615 | 518 | 0.9994 | 0.2849 | 0.9994 | 0.9997 |
+ | 0.3119 | 10.0 | 520 | 1.0482 | 0.2416 | 1.0482 | 1.0238 |
+ | 0.3119 | 10.0385 | 522 | 1.0495 | 0.2108 | 1.0495 | 1.0245 |
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - PyTorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
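
The hyperparameter list and results table above contain enough detail to sketch the training setup. Below is a minimal, hedged reconstruction, not the author's actual script: the dataset and preprocessing are not published here, so `train_ds`/`eval_ds` are placeholders, and the rounding used to compute quadratic weighted kappa on regression outputs is an assumption.

```python
# Sketch of the fine-tuning setup described in the model card above.
# Assumptions: train_ds/eval_ds placeholders; Qwk computed on rounded scores.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
# num_labels=1 selects the regression head, matching config.json below
# (problem_type "regression", single LABEL_0 output).
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)

def compute_metrics(eval_pred):
    """Qwk/Mse/Rmse as reported in the results table."""
    predictions, labels = eval_pred
    preds = predictions.squeeze(-1)
    mse = mean_squared_error(labels, preds)
    # Quadratic weighted kappa needs discrete ratings; rounding is an assumption.
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int), np.rint(preds).astype(int), weights="quadratic"
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,              # values from the hyperparameter list above
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,                    # the table logs an eval every 2 steps
)
# The default AdamW optimizer uses betas=(0.9, 0.999) and epsilon=1e-08,
# matching the card, so no explicit optimizer setting is needed.

# trainer = Trainer(model=model, args=args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```

With `num_labels=1` and float labels, `Trainer` optimizes MSE loss, which is consistent with the card reporting identical values for Loss and Mse (1.0495).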
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
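
The config declares `"problem_type": "regression"` with a single `LABEL_0` output, so the model emits one score per input rather than class probabilities. A minimal inference sketch under that reading; the repo id is inferred from the model-index name above and is an assumption:

```python
# Inference sketch for the regression head described in config.json.
# Assumption: the model is published under the repo id below.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

text = "Essay text (Arabic) goes here."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    # Single logit from the regression head, used directly as the score.
    score = model(**inputs).logits.squeeze(-1).item()
print(score)
```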
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f461785a86fc4aff2e4b58cbb34a3664588cc65302e12dd491aa433b1c0e1fd3
+ size 540799996
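
What was committed here is a Git LFS pointer, not the ~540 MB weights file itself; the `oid sha256:` line records the digest of the real `model.safetensors`. A sketch of fetching the file and checking it against that digest, using `hf_hub_download` from the `huggingface_hub` library; the repo id is the same assumption as above:

```python
# Download the actual weights behind the LFS pointer and verify its sha256.
import hashlib
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization",
    filename="model.safetensors",
    revision="0c8df73",  # the commit shown at the top of this page
)

h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)
print(h.hexdigest() == "f461785a86fc4aff2e4b58cbb34a3664588cc65302e12dd491aa433b1c0e1fd3")
```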
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6c2f82fd9d5d64e33d5a99986caa3bddc01e2a5e077430680c00ca61102eb909
+ size 5304
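
`training_args.bin` is likewise an LFS pointer, here to a small serialized `TrainingArguments` object saved by the `Trainer`. Once downloaded, it can be inspected to confirm the hyperparameters in the model card; a sketch, assuming the file is already local and noting that it is a pickle, not a tensor checkpoint:

```python
# Inspect the serialized TrainingArguments; transformers must be importable
# so the pickled class can be resolved. weights_only=False is required on
# recent PyTorch because this file is a pickled Python object.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.per_device_train_batch_size, args.num_train_epochs)
```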