MayBashendy committed on
Commit d07f542 · verified · 1 parent: 6810ec8

Training in progress, step 500

Files changed (4)
  1. README.md +318 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,318 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task7_organization
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task7_organization
+
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.9402
+ - Qwk: 0.2460
+ - Mse: 0.9402
+ - Rmse: 0.9697
+
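Since the auto-generated card gives no usage snippet, here is a minimal, hedged sketch of how this checkpoint could be loaded for inference with the `transformers` API. The repository id is an assumption pieced together from the commit author and the model name, and the single continuous output follows the single-label `problem_type: "regression"` setup in the config.json added by this commit.

```python
# Hedged sketch (not part of the original card): load the checkpoint and score one essay.
# The repo id is assumed from the commit author and model name; adjust it to the actual path.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)  # one regression label per config.json

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # continuous organization score
print(score)
```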
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
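Not part of the original card: the hyperparameters listed above translate roughly into the following `TrainingArguments` sketch. This is a reconstruction under assumptions (the output directory is a placeholder, and the dataset pipeline, metric function, and evaluation/save strategy are not recorded in the card), not the original training script.

```python
# Hedged reconstruction of the listed hyperparameters as TrainingArguments; not the author's script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder path, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",   # linear decay, as listed
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,            # Adam settings, as listed
)
```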
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.1053 | 2 | 2.4641 | -0.0568 | 2.4641 | 1.5697 |
+ | No log | 0.2105 | 4 | 1.2791 | 0.1882 | 1.2791 | 1.1310 |
+ | No log | 0.3158 | 6 | 1.0031 | -0.0550 | 1.0031 | 1.0015 |
+ | No log | 0.4211 | 8 | 1.1737 | -0.1355 | 1.1737 | 1.0834 |
+ | No log | 0.5263 | 10 | 1.2532 | -0.1993 | 1.2532 | 1.1195 |
+ | No log | 0.6316 | 12 | 0.8472 | 0.0 | 0.8472 | 0.9204 |
+ | No log | 0.7368 | 14 | 0.6607 | 0.1232 | 0.6607 | 0.8128 |
+ | No log | 0.8421 | 16 | 0.6433 | 0.2676 | 0.6433 | 0.8021 |
+ | No log | 0.9474 | 18 | 0.7040 | 0.3019 | 0.7040 | 0.8391 |
+ | No log | 1.0526 | 20 | 0.7188 | 0.3019 | 0.7188 | 0.8478 |
+ | No log | 1.1579 | 22 | 0.6808 | 0.2676 | 0.6808 | 0.8251 |
+ | No log | 1.2632 | 24 | 0.6844 | 0.2676 | 0.6844 | 0.8273 |
+ | No log | 1.3684 | 26 | 0.6940 | 0.2676 | 0.6940 | 0.8331 |
+ | No log | 1.4737 | 28 | 0.7603 | 0.3125 | 0.7603 | 0.8720 |
+ | No log | 1.5789 | 30 | 0.8185 | 0.1660 | 0.8185 | 0.9047 |
+ | No log | 1.6842 | 32 | 0.7336 | 0.2748 | 0.7336 | 0.8565 |
+ | No log | 1.7895 | 34 | 0.8494 | 0.2358 | 0.8494 | 0.9217 |
+ | No log | 1.8947 | 36 | 1.0127 | 0.1955 | 1.0127 | 1.0063 |
+ | No log | 2.0 | 38 | 0.8270 | 0.1459 | 0.8270 | 0.9094 |
+ | No log | 2.1053 | 40 | 0.7232 | 0.1277 | 0.7232 | 0.8504 |
+ | No log | 2.2105 | 42 | 0.7874 | 0.2156 | 0.7874 | 0.8874 |
+ | No log | 2.3158 | 44 | 0.7714 | 0.1365 | 0.7714 | 0.8783 |
+ | No log | 2.4211 | 46 | 0.7528 | 0.0717 | 0.7528 | 0.8677 |
+ | No log | 2.5263 | 48 | 0.8233 | 0.2407 | 0.8233 | 0.9073 |
+ | No log | 2.6316 | 50 | 0.8316 | 0.2652 | 0.8316 | 0.9119 |
+ | No log | 2.7368 | 52 | 0.7702 | 0.1863 | 0.7702 | 0.8776 |
+ | No log | 2.8421 | 54 | 0.7735 | 0.2884 | 0.7735 | 0.8795 |
+ | No log | 2.9474 | 56 | 0.7935 | 0.3238 | 0.7935 | 0.8908 |
+ | No log | 3.0526 | 58 | 1.0933 | 0.1241 | 1.0933 | 1.0456 |
+ | No log | 3.1579 | 60 | 1.2316 | 0.1839 | 1.2316 | 1.1098 |
+ | No log | 3.2632 | 62 | 1.0365 | 0.2460 | 1.0365 | 1.0181 |
+ | No log | 3.3684 | 64 | 0.9874 | 0.1692 | 0.9874 | 0.9937 |
+ | No log | 3.4737 | 66 | 1.1110 | 0.2209 | 1.1110 | 1.0540 |
+ | No log | 3.5789 | 68 | 1.4546 | 0.1067 | 1.4546 | 1.2061 |
+ | No log | 3.6842 | 70 | 1.6366 | 0.1555 | 1.6366 | 1.2793 |
+ | No log | 3.7895 | 72 | 1.3934 | 0.1093 | 1.3934 | 1.1804 |
+ | No log | 3.8947 | 74 | 1.3039 | 0.1175 | 1.3039 | 1.1419 |
+ | No log | 4.0 | 76 | 1.1431 | 0.1394 | 1.1431 | 1.0692 |
+ | No log | 4.1053 | 78 | 1.1404 | 0.1976 | 1.1404 | 1.0679 |
+ | No log | 4.2105 | 80 | 1.3409 | 0.1568 | 1.3409 | 1.1580 |
+ | No log | 4.3158 | 82 | 1.1559 | 0.1618 | 1.1559 | 1.0751 |
+ | No log | 4.4211 | 84 | 1.2002 | 0.1799 | 1.2002 | 1.0955 |
+ | No log | 4.5263 | 86 | 1.5611 | 0.1169 | 1.5611 | 1.2495 |
+ | No log | 4.6316 | 88 | 1.9707 | 0.0421 | 1.9707 | 1.4038 |
+ | No log | 4.7368 | 90 | 1.9777 | 0.0421 | 1.9777 | 1.4063 |
+ | No log | 4.8421 | 92 | 1.7506 | 0.0589 | 1.7506 | 1.3231 |
+ | No log | 4.9474 | 94 | 1.5731 | 0.1195 | 1.5731 | 1.2542 |
+ | No log | 5.0526 | 96 | 1.4243 | 0.1093 | 1.4243 | 1.1934 |
+ | No log | 5.1579 | 98 | 1.3725 | 0.1093 | 1.3725 | 1.1716 |
+ | No log | 5.2632 | 100 | 1.6731 | 0.0689 | 1.6731 | 1.2935 |
+ | No log | 5.3684 | 102 | 1.6296 | 0.0300 | 1.6296 | 1.2766 |
+ | No log | 5.4737 | 104 | 1.1849 | 0.1029 | 1.1849 | 1.0885 |
+ | No log | 5.5789 | 106 | 0.9579 | 0.1661 | 0.9579 | 0.9787 |
+ | No log | 5.6842 | 108 | 0.9987 | 0.1603 | 0.9987 | 0.9994 |
+ | No log | 5.7895 | 110 | 1.2834 | 0.1458 | 1.2834 | 1.1329 |
+ | No log | 5.8947 | 112 | 1.4733 | 0.0803 | 1.4733 | 1.2138 |
+ | No log | 6.0 | 114 | 1.3218 | 0.1458 | 1.3218 | 1.1497 |
+ | No log | 6.1053 | 116 | 0.9666 | 0.1651 | 0.9666 | 0.9832 |
+ | No log | 6.2105 | 118 | 0.9062 | 0.2076 | 0.9062 | 0.9519 |
+ | No log | 6.3158 | 120 | 1.0268 | 0.1210 | 1.0268 | 1.0133 |
+ | No log | 6.4211 | 122 | 1.4521 | 0.0361 | 1.4521 | 1.2050 |
+ | No log | 6.5263 | 124 | 1.8056 | 0.0350 | 1.8056 | 1.3437 |
+ | No log | 6.6316 | 126 | 1.7978 | 0.0350 | 1.7978 | 1.3408 |
+ | No log | 6.7368 | 128 | 1.5457 | 0.0447 | 1.5457 | 1.2433 |
+ | No log | 6.8421 | 130 | 1.2205 | 0.1262 | 1.2205 | 1.1048 |
+ | No log | 6.9474 | 132 | 1.1007 | 0.1356 | 1.1007 | 1.0491 |
+ | No log | 7.0526 | 134 | 1.1800 | 0.1293 | 1.1800 | 1.0863 |
+ | No log | 7.1579 | 136 | 1.3351 | 0.1174 | 1.3351 | 1.1555 |
+ | No log | 7.2632 | 138 | 1.2657 | 0.1293 | 1.2657 | 1.1251 |
+ | No log | 7.3684 | 140 | 1.2024 | 0.1293 | 1.2024 | 1.0965 |
+ | No log | 7.4737 | 142 | 1.3602 | 0.1175 | 1.3602 | 1.1663 |
+ | No log | 7.5789 | 144 | 1.5870 | 0.0283 | 1.5870 | 1.2598 |
+ | No log | 7.6842 | 146 | 1.5829 | 0.0283 | 1.5829 | 1.2581 |
+ | No log | 7.7895 | 148 | 1.3217 | 0.1175 | 1.3217 | 1.1497 |
+ | No log | 7.8947 | 150 | 1.0634 | 0.2119 | 1.0634 | 1.0312 |
+ | No log | 8.0 | 152 | 1.0604 | 0.1787 | 1.0604 | 1.0298 |
+ | No log | 8.1053 | 154 | 1.1823 | 0.2412 | 1.1823 | 1.0873 |
+ | No log | 8.2105 | 156 | 1.3899 | 0.0873 | 1.3899 | 1.1789 |
+ | No log | 8.3158 | 158 | 1.3188 | 0.1464 | 1.3188 | 1.1484 |
+ | No log | 8.4211 | 160 | 1.0921 | 0.1709 | 1.0921 | 1.0450 |
+ | No log | 8.5263 | 162 | 0.9088 | 0.1777 | 0.9088 | 0.9533 |
+ | No log | 8.6316 | 164 | 0.8664 | 0.2692 | 0.8664 | 0.9308 |
+ | No log | 8.7368 | 166 | 0.9342 | 0.1955 | 0.9342 | 0.9665 |
+ | No log | 8.8421 | 168 | 1.2602 | 0.1458 | 1.2602 | 1.1226 |
+ | No log | 8.9474 | 170 | 1.4897 | 0.0745 | 1.4897 | 1.2205 |
+ | No log | 9.0526 | 172 | 1.3828 | 0.0829 | 1.3828 | 1.1759 |
+ | No log | 9.1579 | 174 | 1.0783 | 0.2782 | 1.0783 | 1.0384 |
+ | No log | 9.2632 | 176 | 0.8226 | 0.2352 | 0.8226 | 0.9070 |
+ | No log | 9.3684 | 178 | 0.7643 | 0.2407 | 0.7643 | 0.8743 |
+ | No log | 9.4737 | 180 | 0.7740 | 0.2718 | 0.7740 | 0.8798 |
+ | No log | 9.5789 | 182 | 0.9104 | 0.2000 | 0.9104 | 0.9541 |
+ | No log | 9.6842 | 184 | 1.2011 | 0.2045 | 1.2011 | 1.0959 |
+ | No log | 9.7895 | 186 | 1.3583 | 0.1427 | 1.3583 | 1.1655 |
+ | No log | 9.8947 | 188 | 1.3863 | 0.1275 | 1.3863 | 1.1774 |
+ | No log | 10.0 | 190 | 1.4604 | 0.1549 | 1.4604 | 1.2085 |
+ | No log | 10.1053 | 192 | 1.2898 | 0.1638 | 1.2898 | 1.1357 |
+ | No log | 10.2105 | 194 | 1.1966 | 0.1784 | 1.1966 | 1.0939 |
+ | No log | 10.3158 | 196 | 1.1865 | 0.1784 | 1.1865 | 1.0893 |
+ | No log | 10.4211 | 198 | 1.2104 | 0.1490 | 1.2104 | 1.1002 |
+ | No log | 10.5263 | 200 | 1.2296 | 0.1458 | 1.2296 | 1.1089 |
+ | No log | 10.6316 | 202 | 1.2308 | 0.1458 | 1.2308 | 1.1094 |
+ | No log | 10.7368 | 204 | 1.1178 | 0.1626 | 1.1178 | 1.0573 |
+ | No log | 10.8421 | 206 | 0.9721 | 0.1274 | 0.9721 | 0.9859 |
+ | No log | 10.9474 | 208 | 0.9787 | 0.1557 | 0.9787 | 0.9893 |
+ | No log | 11.0526 | 210 | 1.0491 | 0.0925 | 1.0491 | 1.0242 |
+ | No log | 11.1579 | 212 | 1.2200 | 0.1943 | 1.2200 | 1.1046 |
+ | No log | 11.2632 | 214 | 1.3621 | 0.1220 | 1.3621 | 1.1671 |
+ | No log | 11.3684 | 216 | 1.2789 | 0.1427 | 1.2789 | 1.1309 |
+ | No log | 11.4737 | 218 | 1.1262 | 0.2782 | 1.1262 | 1.0612 |
+ | No log | 11.5789 | 220 | 1.0803 | 0.1787 | 1.0803 | 1.0394 |
+ | No log | 11.6842 | 222 | 1.0540 | 0.1787 | 1.0540 | 1.0266 |
+ | No log | 11.7895 | 224 | 1.1171 | 0.1949 | 1.1171 | 1.0569 |
+ | No log | 11.8947 | 226 | 1.2800 | 0.0712 | 1.2800 | 1.1314 |
+ | No log | 12.0 | 228 | 1.3595 | 0.0419 | 1.3595 | 1.1660 |
+ | No log | 12.1053 | 230 | 1.3345 | 0.0694 | 1.3345 | 1.1552 |
+ | No log | 12.2105 | 232 | 1.5379 | 0.0832 | 1.5379 | 1.2401 |
+ | No log | 12.3158 | 234 | 1.7880 | 0.0932 | 1.7880 | 1.3372 |
+ | No log | 12.4211 | 236 | 1.6579 | 0.1549 | 1.6579 | 1.2876 |
+ | No log | 12.5263 | 238 | 1.3537 | 0.0952 | 1.3537 | 1.1635 |
+ | No log | 12.6316 | 240 | 1.1170 | 0.0448 | 1.1170 | 1.0569 |
+ | No log | 12.7368 | 242 | 1.0241 | 0.0448 | 1.0241 | 1.0120 |
+ | No log | 12.8421 | 244 | 1.0171 | 0.0799 | 1.0171 | 1.0085 |
+ | No log | 12.9474 | 246 | 1.1371 | 0.0585 | 1.1371 | 1.0663 |
+ | No log | 13.0526 | 248 | 1.2053 | 0.1205 | 1.2053 | 1.0979 |
+ | No log | 13.1579 | 250 | 1.1845 | 0.0538 | 1.1845 | 1.0884 |
+ | No log | 13.2632 | 252 | 1.0946 | 0.0982 | 1.0946 | 1.0463 |
+ | No log | 13.3684 | 254 | 1.1487 | 0.0982 | 1.1487 | 1.0718 |
+ | No log | 13.4737 | 256 | 1.3419 | 0.0694 | 1.3419 | 1.1584 |
+ | No log | 13.5789 | 258 | 1.5060 | 0.0086 | 1.5060 | 1.2272 |
+ | No log | 13.6842 | 260 | 1.4753 | 0.0086 | 1.4753 | 1.2146 |
+ | No log | 13.7895 | 262 | 1.2918 | 0.0459 | 1.2918 | 1.1366 |
+ | No log | 13.8947 | 264 | 1.2588 | 0.0761 | 1.2588 | 1.1220 |
+ | No log | 14.0 | 266 | 1.1447 | 0.1147 | 1.1447 | 1.0699 |
+ | No log | 14.1053 | 268 | 1.0228 | 0.1385 | 1.0228 | 1.0114 |
+ | No log | 14.2105 | 270 | 1.0093 | 0.1734 | 1.0093 | 1.0046 |
+ | No log | 14.3158 | 272 | 1.1433 | 0.0561 | 1.1433 | 1.0692 |
+ | No log | 14.4211 | 274 | 1.3866 | 0.0584 | 1.3866 | 1.1776 |
+ | No log | 14.5263 | 276 | 1.5194 | 0.0465 | 1.5194 | 1.2326 |
+ | No log | 14.6316 | 278 | 1.4404 | 0.0531 | 1.4404 | 1.2002 |
+ | No log | 14.7368 | 280 | 1.3060 | 0.1458 | 1.3060 | 1.1428 |
+ | No log | 14.8421 | 282 | 1.2034 | 0.0546 | 1.2034 | 1.0970 |
+ | No log | 14.9474 | 284 | 1.1476 | 0.0315 | 1.1476 | 1.0712 |
+ | No log | 15.0526 | 286 | 1.2144 | 0.1262 | 1.2144 | 1.1020 |
+ | No log | 15.1579 | 288 | 1.3063 | 0.0648 | 1.3063 | 1.1429 |
+ | No log | 15.2632 | 290 | 1.3645 | 0.0921 | 1.3645 | 1.1681 |
+ | No log | 15.3684 | 292 | 1.2482 | 0.0921 | 1.2482 | 1.1172 |
+ | No log | 15.4737 | 294 | 1.0915 | 0.2183 | 1.0915 | 1.0447 |
+ | No log | 15.5789 | 296 | 1.0100 | 0.2032 | 1.0100 | 1.0050 |
+ | No log | 15.6842 | 298 | 0.9617 | 0.1651 | 0.9617 | 0.9807 |
+ | No log | 15.7895 | 300 | 0.9237 | 0.1822 | 0.9237 | 0.9611 |
+ | No log | 15.8947 | 302 | 0.9719 | 0.1651 | 0.9719 | 0.9859 |
+ | No log | 16.0 | 304 | 1.0726 | 0.1389 | 1.0726 | 1.0357 |
+ | No log | 16.1053 | 306 | 1.2458 | 0.1490 | 1.2458 | 1.1162 |
+ | No log | 16.2105 | 308 | 1.4795 | 0.0519 | 1.4795 | 1.2164 |
+ | No log | 16.3158 | 310 | 1.4706 | 0.0519 | 1.4706 | 1.2127 |
+ | No log | 16.4211 | 312 | 1.2663 | 0.1233 | 1.2663 | 1.1253 |
+ | No log | 16.5263 | 314 | 1.0773 | 0.0569 | 1.0773 | 1.0379 |
+ | No log | 16.6316 | 316 | 1.0396 | 0.0592 | 1.0396 | 1.0196 |
+ | No log | 16.7368 | 318 | 1.0507 | 0.0894 | 1.0507 | 1.0251 |
+ | No log | 16.8421 | 320 | 1.0997 | 0.1394 | 1.0997 | 1.0487 |
+ | No log | 16.9474 | 322 | 1.1279 | 0.1635 | 1.1279 | 1.0620 |
+ | No log | 17.0526 | 324 | 1.1312 | 0.1635 | 1.1312 | 1.0636 |
+ | No log | 17.1579 | 326 | 1.0328 | 0.1463 | 1.0328 | 1.0163 |
+ | No log | 17.2632 | 328 | 0.9286 | 0.1612 | 0.9286 | 0.9636 |
+ | No log | 17.3684 | 330 | 0.9605 | 0.1612 | 0.9605 | 0.9800 |
+ | No log | 17.4737 | 332 | 1.0802 | 0.2183 | 1.0802 | 1.0393 |
+ | No log | 17.5789 | 334 | 1.2681 | 0.0921 | 1.2681 | 1.1261 |
+ | No log | 17.6842 | 336 | 1.3526 | 0.0343 | 1.3526 | 1.1630 |
+ | No log | 17.7895 | 338 | 1.3346 | 0.0898 | 1.3346 | 1.1553 |
+ | No log | 17.8947 | 340 | 1.3625 | 0.0873 | 1.3625 | 1.1673 |
+ | No log | 18.0 | 342 | 1.1606 | 0.1523 | 1.1606 | 1.0773 |
+ | No log | 18.1053 | 344 | 0.9443 | 0.1803 | 0.9443 | 0.9717 |
+ | No log | 18.2105 | 346 | 0.9077 | 0.2287 | 0.9077 | 0.9527 |
+ | No log | 18.3158 | 348 | 0.9189 | 0.2000 | 0.9189 | 0.9586 |
+ | No log | 18.4211 | 350 | 0.9441 | 0.2211 | 0.9441 | 0.9717 |
+ | No log | 18.5263 | 352 | 0.9530 | 0.2211 | 0.9530 | 0.9762 |
+ | No log | 18.6316 | 354 | 1.0559 | 0.2316 | 1.0559 | 1.0276 |
+ | No log | 18.7368 | 356 | 1.1002 | 0.2552 | 1.1002 | 1.0489 |
+ | No log | 18.8421 | 358 | 1.1299 | 0.1858 | 1.1299 | 1.0629 |
+ | No log | 18.9474 | 360 | 1.1509 | 0.1821 | 1.1509 | 1.0728 |
+ | No log | 19.0526 | 362 | 1.0969 | 0.1709 | 1.0969 | 1.0473 |
+ | No log | 19.1579 | 364 | 1.0309 | 0.1869 | 1.0309 | 1.0153 |
+ | No log | 19.2632 | 366 | 1.0453 | 0.1869 | 1.0453 | 1.0224 |
+ | No log | 19.3684 | 368 | 1.0596 | 0.2032 | 1.0596 | 1.0294 |
+ | No log | 19.4737 | 370 | 1.1637 | 0.1870 | 1.1637 | 1.0788 |
+ | No log | 19.5789 | 372 | 1.2256 | 0.1205 | 1.2256 | 1.1070 |
+ | No log | 19.6842 | 374 | 1.2246 | 0.1205 | 1.2246 | 1.1066 |
+ | No log | 19.7895 | 376 | 1.1160 | 0.2183 | 1.1160 | 1.0564 |
+ | No log | 19.8947 | 378 | 1.0576 | 0.1210 | 1.0576 | 1.0284 |
+ | No log | 20.0 | 380 | 1.0879 | 0.1463 | 1.0879 | 1.0430 |
+ | No log | 20.1053 | 382 | 1.1567 | 0.1463 | 1.1567 | 1.0755 |
+ | No log | 20.2105 | 384 | 1.2232 | 0.1262 | 1.2232 | 1.1060 |
+ | No log | 20.3158 | 386 | 1.2875 | 0.1233 | 1.2875 | 1.1347 |
+ | No log | 20.4211 | 388 | 1.3831 | 0.0343 | 1.3831 | 1.1761 |
+ | No log | 20.5263 | 390 | 1.4274 | 0.0708 | 1.4274 | 1.1947 |
+ | No log | 20.6316 | 392 | 1.3132 | 0.1233 | 1.3132 | 1.1459 |
+ | No log | 20.7368 | 394 | 1.1052 | 0.1747 | 1.1052 | 1.0513 |
+ | No log | 20.8421 | 396 | 0.9038 | 0.1651 | 0.9038 | 0.9507 |
+ | No log | 20.9474 | 398 | 0.8384 | 0.2632 | 0.8384 | 0.9157 |
+ | No log | 21.0526 | 400 | 0.8279 | 0.2063 | 0.8279 | 0.9099 |
+ | No log | 21.1579 | 402 | 0.8476 | 0.2297 | 0.8476 | 0.9206 |
+ | No log | 21.2632 | 404 | 0.9798 | 0.0616 | 0.9798 | 0.9899 |
+ | No log | 21.3684 | 406 | 1.2113 | 0.1591 | 1.2113 | 1.1006 |
+ | No log | 21.4737 | 408 | 1.4117 | 0.1093 | 1.4117 | 1.1882 |
+ | No log | 21.5789 | 410 | 1.5392 | 0.1285 | 1.5392 | 1.2407 |
+ | No log | 21.6842 | 412 | 1.4635 | 0.0766 | 1.4635 | 1.2097 |
+ | No log | 21.7895 | 414 | 1.1993 | 0.1324 | 1.1993 | 1.0951 |
+ | No log | 21.8947 | 416 | 0.9784 | 0.0896 | 0.9784 | 0.9891 |
+ | No log | 22.0 | 418 | 0.8859 | 0.0895 | 0.8859 | 0.9412 |
+ | No log | 22.1053 | 420 | 0.8847 | 0.0895 | 0.8847 | 0.9406 |
+ | No log | 22.2105 | 422 | 0.8522 | 0.2171 | 0.8522 | 0.9232 |
+ | No log | 22.3158 | 424 | 0.8241 | 0.2171 | 0.8241 | 0.9078 |
+ | No log | 22.4211 | 426 | 0.8996 | 0.1718 | 0.8996 | 0.9485 |
+ | No log | 22.5263 | 428 | 1.0471 | 0.0896 | 1.0471 | 1.0233 |
+ | No log | 22.6316 | 430 | 1.0850 | 0.0842 | 1.0850 | 1.0416 |
+ | No log | 22.7368 | 432 | 1.0406 | 0.0896 | 1.0406 | 1.0201 |
+ | No log | 22.8421 | 434 | 0.9429 | 0.1692 | 0.9429 | 0.9711 |
+ | No log | 22.9474 | 436 | 0.8892 | 0.2632 | 0.8892 | 0.9430 |
+ | No log | 23.0526 | 438 | 0.8751 | 0.2409 | 0.8751 | 0.9355 |
+ | No log | 23.1579 | 440 | 0.8877 | 0.1914 | 0.8877 | 0.9422 |
+ | No log | 23.2632 | 442 | 0.8993 | 0.1914 | 0.8993 | 0.9483 |
+ | No log | 23.3684 | 444 | 0.9416 | 0.2000 | 0.9416 | 0.9703 |
+ | No log | 23.4737 | 446 | 0.9549 | 0.1651 | 0.9549 | 0.9772 |
+ | No log | 23.5789 | 448 | 0.8887 | 0.1962 | 0.8887 | 0.9427 |
+ | No log | 23.6842 | 450 | 0.8707 | 0.2352 | 0.8707 | 0.9331 |
+ | No log | 23.7895 | 452 | 0.9281 | 0.2142 | 0.9281 | 0.9634 |
+ | No log | 23.8947 | 454 | 0.9651 | 0.2259 | 0.9651 | 0.9824 |
+ | No log | 24.0 | 456 | 0.9547 | 0.2000 | 0.9547 | 0.9771 |
+ | No log | 24.1053 | 458 | 0.9812 | 0.2259 | 0.9812 | 0.9905 |
+ | No log | 24.2105 | 460 | 0.9499 | 0.2000 | 0.9499 | 0.9746 |
+ | No log | 24.3158 | 462 | 0.9134 | 0.1734 | 0.9134 | 0.9557 |
+ | No log | 24.4211 | 464 | 0.8957 | 0.1542 | 0.8957 | 0.9464 |
+ | No log | 24.5263 | 466 | 0.9177 | 0.1501 | 0.9177 | 0.9580 |
+ | No log | 24.6316 | 468 | 0.9123 | 0.1144 | 0.9123 | 0.9552 |
+ | No log | 24.7368 | 470 | 0.9427 | 0.0803 | 0.9427 | 0.9709 |
+ | No log | 24.8421 | 472 | 0.9644 | 0.1045 | 0.9644 | 0.9820 |
+ | No log | 24.9474 | 474 | 0.9877 | 0.0953 | 0.9877 | 0.9938 |
+ | No log | 25.0526 | 476 | 0.9548 | 0.0746 | 0.9548 | 0.9771 |
+ | No log | 25.1579 | 478 | 0.8722 | 0.1962 | 0.8722 | 0.9339 |
+ | No log | 25.2632 | 480 | 0.8807 | 0.1962 | 0.8807 | 0.9385 |
+ | No log | 25.3684 | 482 | 0.9581 | 0.1642 | 0.9581 | 0.9788 |
+ | No log | 25.4737 | 484 | 0.9240 | 0.1379 | 0.9240 | 0.9612 |
+ | No log | 25.5789 | 486 | 0.8329 | 0.2883 | 0.8329 | 0.9126 |
+ | No log | 25.6842 | 488 | 0.7711 | 0.3312 | 0.7711 | 0.8781 |
+ | No log | 25.7895 | 490 | 0.7734 | 0.2883 | 0.7734 | 0.8794 |
+ | No log | 25.8947 | 492 | 0.7991 | 0.3167 | 0.7991 | 0.8939 |
+ | No log | 26.0 | 494 | 0.8618 | 0.3359 | 0.8618 | 0.9283 |
+ | No log | 26.1053 | 496 | 0.8989 | 0.3231 | 0.8989 | 0.9481 |
+ | No log | 26.2105 | 498 | 0.8341 | 0.3294 | 0.8341 | 0.9133 |
+ | 0.2424 | 26.3158 | 500 | 0.7303 | 0.3099 | 0.7303 | 0.8546 |
+ | 0.2424 | 26.4211 | 502 | 0.7038 | 0.2817 | 0.7038 | 0.8389 |
+ | 0.2424 | 26.5263 | 504 | 0.7111 | 0.3099 | 0.7111 | 0.8433 |
+ | 0.2424 | 26.6316 | 506 | 0.7457 | 0.3099 | 0.7457 | 0.8635 |
+ | 0.2424 | 26.7368 | 508 | 0.8348 | 0.3359 | 0.8348 | 0.9137 |
+ | 0.2424 | 26.8421 | 510 | 0.9436 | 0.2460 | 0.9436 | 0.9714 |
+ | 0.2424 | 26.9474 | 512 | 1.0461 | 0.2141 | 1.0461 | 1.0228 |
+ | 0.2424 | 27.0526 | 514 | 1.1003 | 0.1832 | 1.1003 | 1.0490 |
+ | 0.2424 | 27.1579 | 516 | 1.0420 | 0.2227 | 1.0420 | 1.0208 |
+ | 0.2424 | 27.2632 | 518 | 0.9402 | 0.2460 | 0.9402 | 0.9697 |
+
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - Pytorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
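Not part of the original card: the Qwk, Mse, and Rmse columns above correspond to standard metrics that can be computed as sketched below. This assumes quadratic-weighted Cohen's kappa over predictions rounded to integer grades, which may differ from the author's exact evaluation code.

```python
# Hedged sketch of the reported evaluation metrics (QWK, MSE, RMSE); assumes the regression
# outputs are rounded to integer grades before computing the quadratic-weighted kappa.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(predictions: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, predictions)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(predictions).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```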
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
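The config above declares a single label (one `id2label` entry) with `problem_type` set to `"regression"`, so `BertForSequenceClassification` attaches a one-unit regression head to AraBERT and trains it with MSE loss. A small hedged check (the repo id is an assumption, as in the earlier sketch):

```python
# Hedged sketch: inspect the added config; the repo id below is assumed, not stated in the commit.
from transformers import AutoConfig

config = AutoConfig.from_pretrained(
    "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task7_organization"
)
print(config.problem_type, config.num_labels)  # expected: "regression" 1 (single-output regression head)
```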
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2b66699306626b07019342082d8c66b00c957c246c2d97ba02dcf8b2307b4db7
+ size 540799996
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c60ff2489a3ffcdae9531f95d579d62815434528bcee96431b723aa5a56ed4a4
+ size 5368