MayBashendy committed
Commit 103b9a2 · verified · 1 Parent(s): d3eb44f

Training in progress, step 500

Files changed (4)
  1. README.md +318 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,318 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task2_organization
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task2_organization
+
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.9379
+ - Qwk: 0.2843
+ - Mse: 0.9379
+ - Rmse: 0.9685
+
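+ A minimal inference sketch, assuming the checkpoint is published on the Hub under the repository name in the title (a local checkpoint directory works as well). Per `config.json`, the head is a single-output regression classifier, so the raw logit is read as the predicted organization score.
+
+ ```python
+ # Hedged sketch: the repo id is assumed from the model name in this card.
+ import torch
+ from transformers import AutoTokenizer, AutoModelForSequenceClassification
+
+ model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task2_organization"
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForSequenceClassification.from_pretrained(model_id)
+ model.eval()
+
+ essay = "..."  # an Arabic essay to score for organization
+ inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
+ with torch.no_grad():
+     score = model(**inputs).logits.squeeze().item()  # single regression output
+ print(score)
+ ```
+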
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
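+ The hyperparameters listed above map roughly onto the `TrainingArguments` below; this is a hedged reconstruction, with the output path, evaluation cadence, and logging cadence inferred rather than recorded in the card.
+
+ ```python
+ # Approximate reconstruction of the listed training setup. Only the values in
+ # the list above come from the card; the rest are assumptions for illustration.
+ from transformers import TrainingArguments
+
+ training_args = TrainingArguments(
+     output_dir="arabert_task2_organization",  # assumed name, not from the card
+     learning_rate=2e-5,
+     per_device_train_batch_size=8,
+     per_device_eval_batch_size=8,
+     seed=42,
+     num_train_epochs=100,
+     lr_scheduler_type="linear",
+     adam_beta1=0.9,
+     adam_beta2=0.999,
+     adam_epsilon=1e-8,
+     evaluation_strategy="steps",  # the results table evaluates every 2 steps
+     eval_steps=2,
+     logging_steps=500,            # training loss first reported at step 500
+ )
+ ```
+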
+ ### Training results
+
50
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
51
+ |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
52
+ | No log | 0.0225 | 2 | 4.7898 | 0.0010 | 4.7898 | 2.1886 |
53
+ | No log | 0.0449 | 4 | 2.9923 | -0.0590 | 2.9923 | 1.7298 |
54
+ | No log | 0.0674 | 6 | 1.9490 | -0.0233 | 1.9490 | 1.3961 |
55
+ | No log | 0.0899 | 8 | 1.6010 | 0.0682 | 1.6010 | 1.2653 |
56
+ | No log | 0.1124 | 10 | 1.3744 | 0.0374 | 1.3744 | 1.1723 |
57
+ | No log | 0.1348 | 12 | 1.3024 | 0.0671 | 1.3024 | 1.1412 |
58
+ | No log | 0.1573 | 14 | 1.3018 | 0.1043 | 1.3018 | 1.1410 |
59
+ | No log | 0.1798 | 16 | 1.3289 | 0.0395 | 1.3289 | 1.1528 |
60
+ | No log | 0.2022 | 18 | 1.3721 | 0.0395 | 1.3721 | 1.1714 |
61
+ | No log | 0.2247 | 20 | 1.4859 | 0.0228 | 1.4859 | 1.2190 |
62
+ | No log | 0.2472 | 22 | 1.6867 | -0.0664 | 1.6867 | 1.2987 |
63
+ | No log | 0.2697 | 24 | 1.6283 | -0.0411 | 1.6283 | 1.2761 |
64
+ | No log | 0.2921 | 26 | 1.4503 | -0.0350 | 1.4503 | 1.2043 |
65
+ | No log | 0.3146 | 28 | 1.3326 | 0.1747 | 1.3326 | 1.1544 |
66
+ | No log | 0.3371 | 30 | 1.2693 | 0.1570 | 1.2693 | 1.1266 |
67
+ | No log | 0.3596 | 32 | 1.2227 | 0.2462 | 1.2227 | 1.1058 |
68
+ | No log | 0.3820 | 34 | 1.3549 | 0.1228 | 1.3549 | 1.1640 |
69
+ | No log | 0.4045 | 36 | 1.3685 | 0.1055 | 1.3685 | 1.1698 |
70
+ | No log | 0.4270 | 38 | 1.1331 | 0.3498 | 1.1331 | 1.0645 |
71
+ | No log | 0.4494 | 40 | 1.1228 | 0.2023 | 1.1228 | 1.0596 |
72
+ | No log | 0.4719 | 42 | 1.2609 | 0.0974 | 1.2609 | 1.1229 |
73
+ | No log | 0.4944 | 44 | 1.2152 | 0.1315 | 1.2152 | 1.1023 |
74
+ | No log | 0.5169 | 46 | 1.1091 | 0.1417 | 1.1091 | 1.0532 |
75
+ | No log | 0.5393 | 48 | 1.0522 | 0.3100 | 1.0522 | 1.0257 |
76
+ | No log | 0.5618 | 50 | 1.1146 | 0.2245 | 1.1146 | 1.0557 |
77
+ | No log | 0.5843 | 52 | 1.2540 | 0.1587 | 1.2540 | 1.1198 |
78
+ | No log | 0.6067 | 54 | 1.3671 | 0.1224 | 1.3671 | 1.1692 |
79
+ | No log | 0.6292 | 56 | 1.3752 | 0.1314 | 1.3752 | 1.1727 |
80
+ | No log | 0.6517 | 58 | 1.2919 | 0.1530 | 1.2919 | 1.1366 |
81
+ | No log | 0.6742 | 60 | 1.1209 | 0.2289 | 1.1209 | 1.0587 |
82
+ | No log | 0.6966 | 62 | 1.1003 | 0.2424 | 1.1003 | 1.0490 |
83
+ | No log | 0.7191 | 64 | 1.1060 | 0.2068 | 1.1060 | 1.0517 |
84
+ | No log | 0.7416 | 66 | 1.1020 | 0.2969 | 1.1020 | 1.0498 |
85
+ | No log | 0.7640 | 68 | 1.1082 | 0.2459 | 1.1082 | 1.0527 |
86
+ | No log | 0.7865 | 70 | 1.1201 | 0.3377 | 1.1201 | 1.0583 |
87
+ | No log | 0.8090 | 72 | 1.1383 | 0.3319 | 1.1383 | 1.0669 |
88
+ | No log | 0.8315 | 74 | 1.1557 | 0.3209 | 1.1557 | 1.0750 |
89
+ | No log | 0.8539 | 76 | 1.1645 | 0.4005 | 1.1645 | 1.0791 |
90
+ | No log | 0.8764 | 78 | 1.1812 | 0.4250 | 1.1812 | 1.0868 |
91
+ | No log | 0.8989 | 80 | 1.1845 | 0.3808 | 1.1845 | 1.0884 |
92
+ | No log | 0.9213 | 82 | 1.1911 | 0.3808 | 1.1911 | 1.0914 |
93
+ | No log | 0.9438 | 84 | 1.3228 | 0.2191 | 1.3228 | 1.1501 |
94
+ | No log | 0.9663 | 86 | 1.7306 | 0.2241 | 1.7306 | 1.3155 |
95
+ | No log | 0.9888 | 88 | 1.6685 | 0.2213 | 1.6685 | 1.2917 |
96
+ | No log | 1.0112 | 90 | 1.4183 | 0.2857 | 1.4183 | 1.1909 |
97
+ | No log | 1.0337 | 92 | 1.2584 | 0.3002 | 1.2584 | 1.1218 |
98
+ | No log | 1.0562 | 94 | 1.3975 | 0.2898 | 1.3975 | 1.1822 |
99
+ | No log | 1.0787 | 96 | 1.5917 | 0.2991 | 1.5917 | 1.2616 |
100
+ | No log | 1.1011 | 98 | 1.5003 | 0.3148 | 1.5003 | 1.2249 |
101
+ | No log | 1.1236 | 100 | 1.2913 | 0.3719 | 1.2913 | 1.1364 |
102
+ | No log | 1.1461 | 102 | 1.1731 | 0.2691 | 1.1731 | 1.0831 |
103
+ | No log | 1.1685 | 104 | 1.1659 | 0.2368 | 1.1659 | 1.0798 |
104
+ | No log | 1.1910 | 106 | 1.2454 | 0.3222 | 1.2454 | 1.1160 |
105
+ | No log | 1.2135 | 108 | 1.2659 | 0.3318 | 1.2659 | 1.1251 |
106
+ | No log | 1.2360 | 110 | 1.1583 | 0.3375 | 1.1583 | 1.0762 |
107
+ | No log | 1.2584 | 112 | 1.1306 | 0.3285 | 1.1306 | 1.0633 |
108
+ | No log | 1.2809 | 114 | 1.0664 | 0.3250 | 1.0664 | 1.0327 |
109
+ | No log | 1.3034 | 116 | 1.0416 | 0.3062 | 1.0416 | 1.0206 |
110
+ | No log | 1.3258 | 118 | 1.0439 | 0.3839 | 1.0439 | 1.0217 |
111
+ | No log | 1.3483 | 120 | 1.1662 | 0.4527 | 1.1662 | 1.0799 |
112
+ | No log | 1.3708 | 122 | 1.2617 | 0.3265 | 1.2617 | 1.1233 |
113
+ | No log | 1.3933 | 124 | 1.0936 | 0.2899 | 1.0936 | 1.0457 |
114
+ | No log | 1.4157 | 126 | 0.9722 | 0.3013 | 0.9722 | 0.9860 |
115
+ | No log | 1.4382 | 128 | 1.2549 | 0.3421 | 1.2549 | 1.1202 |
116
+ | No log | 1.4607 | 130 | 1.3728 | 0.3196 | 1.3728 | 1.1716 |
117
+ | No log | 1.4831 | 132 | 1.1425 | 0.4100 | 1.1425 | 1.0689 |
118
+ | No log | 1.5056 | 134 | 1.0040 | 0.2679 | 1.0040 | 1.0020 |
119
+ | No log | 1.5281 | 136 | 1.0547 | 0.3427 | 1.0547 | 1.0270 |
120
+ | No log | 1.5506 | 138 | 1.1538 | 0.4645 | 1.1538 | 1.0742 |
121
+ | No log | 1.5730 | 140 | 1.2110 | 0.4380 | 1.2110 | 1.1005 |
122
+ | No log | 1.5955 | 142 | 1.1272 | 0.3800 | 1.1272 | 1.0617 |
123
+ | No log | 1.6180 | 144 | 1.1439 | 0.3800 | 1.1439 | 1.0695 |
124
+ | No log | 1.6404 | 146 | 1.1933 | 0.3485 | 1.1933 | 1.0924 |
125
+ | No log | 1.6629 | 148 | 1.1105 | 0.3260 | 1.1105 | 1.0538 |
126
+ | No log | 1.6854 | 150 | 1.0331 | 0.2948 | 1.0331 | 1.0164 |
127
+ | No log | 1.7079 | 152 | 1.0179 | 0.3319 | 1.0179 | 1.0089 |
128
+ | No log | 1.7303 | 154 | 1.1010 | 0.3962 | 1.1010 | 1.0493 |
129
+ | No log | 1.7528 | 156 | 1.3014 | 0.4156 | 1.3014 | 1.1408 |
130
+ | No log | 1.7753 | 158 | 1.2141 | 0.3874 | 1.2141 | 1.1019 |
131
+ | No log | 1.7978 | 160 | 1.0116 | 0.4111 | 1.0116 | 1.0058 |
132
+ | No log | 1.8202 | 162 | 0.9492 | 0.3283 | 0.9492 | 0.9743 |
133
+ | No log | 1.8427 | 164 | 0.9463 | 0.2843 | 0.9463 | 0.9728 |
134
+ | No log | 1.8652 | 166 | 0.9370 | 0.3596 | 0.9370 | 0.9680 |
135
+ | No log | 1.8876 | 168 | 1.0431 | 0.4166 | 1.0431 | 1.0213 |
136
+ | No log | 1.9101 | 170 | 1.3791 | 0.3867 | 1.3791 | 1.1744 |
137
+ | No log | 1.9326 | 172 | 1.4327 | 0.3467 | 1.4327 | 1.1970 |
138
+ | No log | 1.9551 | 174 | 1.2389 | 0.3174 | 1.2389 | 1.1130 |
139
+ | No log | 1.9775 | 176 | 1.0455 | 0.1493 | 1.0455 | 1.0225 |
140
+ | No log | 2.0 | 178 | 1.0813 | 0.2651 | 1.0813 | 1.0398 |
141
+ | No log | 2.0225 | 180 | 1.1141 | 0.3418 | 1.1141 | 1.0555 |
142
+ | No log | 2.0449 | 182 | 1.0909 | 0.2113 | 1.0909 | 1.0445 |
143
+ | No log | 2.0674 | 184 | 1.1700 | 0.2686 | 1.1700 | 1.0817 |
144
+ | No log | 2.0899 | 186 | 1.2761 | 0.3215 | 1.2761 | 1.1296 |
145
+ | No log | 2.1124 | 188 | 1.2365 | 0.3636 | 1.2365 | 1.1120 |
146
+ | No log | 2.1348 | 190 | 1.1217 | 0.2608 | 1.1217 | 1.0591 |
147
+ | No log | 2.1573 | 192 | 1.0570 | 0.3185 | 1.0570 | 1.0281 |
148
+ | No log | 2.1798 | 194 | 1.0402 | 0.3327 | 1.0402 | 1.0199 |
149
+ | No log | 2.2022 | 196 | 1.0343 | 0.3327 | 1.0343 | 1.0170 |
150
+ | No log | 2.2247 | 198 | 1.0443 | 0.3344 | 1.0443 | 1.0219 |
151
+ | No log | 2.2472 | 200 | 1.0551 | 0.3344 | 1.0551 | 1.0272 |
152
+ | No log | 2.2697 | 202 | 1.0784 | 0.3380 | 1.0784 | 1.0384 |
153
+ | No log | 2.2921 | 204 | 1.0726 | 0.2483 | 1.0726 | 1.0357 |
154
+ | No log | 2.3146 | 206 | 1.0933 | 0.2562 | 1.0933 | 1.0456 |
155
+ | No log | 2.3371 | 208 | 1.0941 | 0.2830 | 1.0941 | 1.0460 |
156
+ | No log | 2.3596 | 210 | 1.1327 | 0.3308 | 1.1327 | 1.0643 |
157
+ | No log | 2.3820 | 212 | 1.2498 | 0.3771 | 1.2498 | 1.1179 |
158
+ | No log | 2.4045 | 214 | 1.2420 | 0.3350 | 1.2420 | 1.1144 |
159
+ | No log | 2.4270 | 216 | 1.1340 | 0.3097 | 1.1340 | 1.0649 |
160
+ | No log | 2.4494 | 218 | 1.0391 | 0.2834 | 1.0391 | 1.0194 |
161
+ | No log | 2.4719 | 220 | 1.0022 | 0.2057 | 1.0022 | 1.0011 |
162
+ | No log | 2.4944 | 222 | 0.9909 | 0.2057 | 0.9909 | 0.9954 |
163
+ | No log | 2.5169 | 224 | 0.9667 | 0.2610 | 0.9667 | 0.9832 |
164
+ | No log | 2.5393 | 226 | 0.9441 | 0.3117 | 0.9441 | 0.9716 |
165
+ | No log | 2.5618 | 228 | 0.9029 | 0.4084 | 0.9029 | 0.9502 |
166
+ | No log | 2.5843 | 230 | 0.8866 | 0.4485 | 0.8866 | 0.9416 |
167
+ | No log | 2.6067 | 232 | 0.8879 | 0.4059 | 0.8879 | 0.9423 |
168
+ | No log | 2.6292 | 234 | 0.9321 | 0.3059 | 0.9321 | 0.9655 |
169
+ | No log | 2.6517 | 236 | 0.9106 | 0.3704 | 0.9106 | 0.9542 |
170
+ | No log | 2.6742 | 238 | 0.9346 | 0.4603 | 0.9346 | 0.9667 |
171
+ | No log | 2.6966 | 240 | 0.9809 | 0.3926 | 0.9809 | 0.9904 |
172
+ | No log | 2.7191 | 242 | 0.9428 | 0.3583 | 0.9428 | 0.9710 |
173
+ | No log | 2.7416 | 244 | 0.9837 | 0.3110 | 0.9837 | 0.9918 |
174
+ | No log | 2.7640 | 246 | 0.9827 | 0.3062 | 0.9827 | 0.9913 |
175
+ | No log | 2.7865 | 248 | 0.9798 | 0.2965 | 0.9798 | 0.9899 |
176
+ | No log | 2.8090 | 250 | 1.0313 | 0.3110 | 1.0313 | 1.0155 |
177
+ | No log | 2.8315 | 252 | 1.2825 | 0.2024 | 1.2825 | 1.1325 |
178
+ | No log | 2.8539 | 254 | 1.3405 | 0.2549 | 1.3405 | 1.1578 |
179
+ | No log | 2.8764 | 256 | 1.1411 | 0.2864 | 1.1411 | 1.0682 |
180
+ | No log | 2.8989 | 258 | 0.9386 | 0.4016 | 0.9386 | 0.9688 |
181
+ | No log | 2.9213 | 260 | 0.9707 | 0.3613 | 0.9707 | 0.9853 |
182
+ | No log | 2.9438 | 262 | 0.9709 | 0.3834 | 0.9709 | 0.9853 |
183
+ | No log | 2.9663 | 264 | 0.9229 | 0.3263 | 0.9229 | 0.9607 |
184
+ | No log | 2.9888 | 266 | 0.9233 | 0.3558 | 0.9233 | 0.9609 |
185
+ | No log | 3.0112 | 268 | 0.9300 | 0.3163 | 0.9300 | 0.9644 |
186
+ | No log | 3.0337 | 270 | 0.9259 | 0.2763 | 0.9259 | 0.9622 |
187
+ | No log | 3.0562 | 272 | 0.9319 | 0.3164 | 0.9319 | 0.9654 |
188
+ | No log | 3.0787 | 274 | 0.9248 | 0.3804 | 0.9248 | 0.9617 |
189
+ | No log | 3.1011 | 276 | 0.9614 | 0.3043 | 0.9614 | 0.9805 |
190
+ | No log | 3.1236 | 278 | 0.9925 | 0.4166 | 0.9925 | 0.9962 |
191
+ | No log | 3.1461 | 280 | 0.9831 | 0.3841 | 0.9831 | 0.9915 |
192
+ | No log | 3.1685 | 282 | 0.9251 | 0.3164 | 0.9251 | 0.9618 |
193
+ | No log | 3.1910 | 284 | 0.9036 | 0.3873 | 0.9036 | 0.9506 |
194
+ | No log | 3.2135 | 286 | 0.8925 | 0.3879 | 0.8925 | 0.9447 |
195
+ | No log | 3.2360 | 288 | 0.8893 | 0.3738 | 0.8893 | 0.9430 |
196
+ | No log | 3.2584 | 290 | 0.9091 | 0.3493 | 0.9091 | 0.9535 |
197
+ | No log | 3.2809 | 292 | 0.9543 | 0.3335 | 0.9543 | 0.9769 |
198
+ | No log | 3.3034 | 294 | 0.9193 | 0.3621 | 0.9193 | 0.9588 |
199
+ | No log | 3.3258 | 296 | 0.9104 | 0.4002 | 0.9104 | 0.9541 |
200
+ | No log | 3.3483 | 298 | 0.8674 | 0.4119 | 0.8674 | 0.9314 |
201
+ | No log | 3.3708 | 300 | 0.9196 | 0.4550 | 0.9196 | 0.9590 |
202
+ | No log | 3.3933 | 302 | 0.9393 | 0.4480 | 0.9393 | 0.9692 |
203
+ | No log | 3.4157 | 304 | 0.8689 | 0.3704 | 0.8689 | 0.9322 |
204
+ | No log | 3.4382 | 306 | 0.8537 | 0.3762 | 0.8537 | 0.9240 |
205
+ | No log | 3.4607 | 308 | 0.8610 | 0.3557 | 0.8610 | 0.9279 |
206
+ | No log | 3.4831 | 310 | 0.8788 | 0.3145 | 0.8788 | 0.9375 |
207
+ | No log | 3.5056 | 312 | 0.8704 | 0.3859 | 0.8704 | 0.9329 |
208
+ | No log | 3.5281 | 314 | 0.9245 | 0.4076 | 0.9245 | 0.9615 |
209
+ | No log | 3.5506 | 316 | 0.9702 | 0.4543 | 0.9702 | 0.9850 |
210
+ | No log | 3.5730 | 318 | 0.9239 | 0.4880 | 0.9239 | 0.9612 |
211
+ | No log | 3.5955 | 320 | 0.8321 | 0.4730 | 0.8321 | 0.9122 |
212
+ | No log | 3.6180 | 322 | 0.8286 | 0.4491 | 0.8286 | 0.9103 |
213
+ | No log | 3.6404 | 324 | 0.8615 | 0.3991 | 0.8615 | 0.9282 |
214
+ | No log | 3.6629 | 326 | 0.8895 | 0.3454 | 0.8895 | 0.9431 |
215
+ | No log | 3.6854 | 328 | 0.8192 | 0.4424 | 0.8192 | 0.9051 |
216
+ | No log | 3.7079 | 330 | 0.8448 | 0.4898 | 0.8448 | 0.9191 |
217
+ | No log | 3.7303 | 332 | 0.8362 | 0.4898 | 0.8362 | 0.9144 |
218
+ | No log | 3.7528 | 334 | 0.8225 | 0.3639 | 0.8225 | 0.9069 |
219
+ | No log | 3.7753 | 336 | 0.8407 | 0.3437 | 0.8407 | 0.9169 |
220
+ | No log | 3.7978 | 338 | 0.8927 | 0.3363 | 0.8927 | 0.9448 |
221
+ | No log | 3.8202 | 340 | 0.8511 | 0.3695 | 0.8511 | 0.9225 |
222
+ | No log | 3.8427 | 342 | 0.8342 | 0.4118 | 0.8342 | 0.9133 |
223
+ | No log | 3.8652 | 344 | 0.8704 | 0.4054 | 0.8704 | 0.9329 |
224
+ | No log | 3.8876 | 346 | 0.8888 | 0.4054 | 0.8888 | 0.9427 |
225
+ | No log | 3.9101 | 348 | 0.9074 | 0.3408 | 0.9074 | 0.9526 |
226
+ | No log | 3.9326 | 350 | 1.0442 | 0.3770 | 1.0442 | 1.0218 |
227
+ | No log | 3.9551 | 352 | 1.0647 | 0.3409 | 1.0647 | 1.0319 |
228
+ | No log | 3.9775 | 354 | 0.9584 | 0.3070 | 0.9584 | 0.9790 |
229
+ | No log | 4.0 | 356 | 0.9194 | 0.3866 | 0.9194 | 0.9589 |
230
+ | No log | 4.0225 | 358 | 0.9634 | 0.3402 | 0.9634 | 0.9815 |
231
+ | No log | 4.0449 | 360 | 0.9266 | 0.3956 | 0.9266 | 0.9626 |
232
+ | No log | 4.0674 | 362 | 0.9108 | 0.2843 | 0.9108 | 0.9544 |
233
+ | No log | 4.0899 | 364 | 1.0308 | 0.3091 | 1.0308 | 1.0153 |
234
+ | No log | 4.1124 | 366 | 1.1552 | 0.4410 | 1.1552 | 1.0748 |
235
+ | No log | 4.1348 | 368 | 1.1453 | 0.4410 | 1.1453 | 1.0702 |
236
+ | No log | 4.1573 | 370 | 0.9973 | 0.4638 | 0.9973 | 0.9987 |
237
+ | No log | 4.1798 | 372 | 0.8591 | 0.3619 | 0.8591 | 0.9269 |
238
+ | No log | 4.2022 | 374 | 0.8464 | 0.3627 | 0.8464 | 0.9200 |
239
+ | No log | 4.2247 | 376 | 0.8584 | 0.3845 | 0.8584 | 0.9265 |
240
+ | No log | 4.2472 | 378 | 0.9299 | 0.3225 | 0.9299 | 0.9643 |
241
+ | No log | 4.2697 | 380 | 0.9817 | 0.4165 | 0.9817 | 0.9908 |
242
+ | No log | 4.2921 | 382 | 0.9878 | 0.3343 | 0.9878 | 0.9939 |
243
+ | No log | 4.3146 | 384 | 0.9297 | 0.2300 | 0.9297 | 0.9642 |
244
+ | No log | 4.3371 | 386 | 0.8894 | 0.3066 | 0.8894 | 0.9431 |
245
+ | No log | 4.3596 | 388 | 0.8856 | 0.3216 | 0.8856 | 0.9410 |
246
+ | No log | 4.3820 | 390 | 0.8705 | 0.3237 | 0.8705 | 0.9330 |
247
+ | No log | 4.4045 | 392 | 0.8913 | 0.3168 | 0.8913 | 0.9441 |
248
+ | No log | 4.4270 | 394 | 0.9036 | 0.3738 | 0.9036 | 0.9506 |
249
+ | No log | 4.4494 | 396 | 0.9355 | 0.2843 | 0.9355 | 0.9672 |
250
+ | No log | 4.4719 | 398 | 0.9554 | 0.2843 | 0.9554 | 0.9775 |
251
+ | No log | 4.4944 | 400 | 1.0114 | 0.2822 | 1.0114 | 1.0057 |
252
+ | No log | 4.5169 | 402 | 1.0712 | 0.3046 | 1.0712 | 1.0350 |
253
+ | No log | 4.5393 | 404 | 1.0287 | 0.2735 | 1.0287 | 1.0142 |
254
+ | No log | 4.5618 | 406 | 1.0309 | 0.3249 | 1.0309 | 1.0153 |
255
+ | No log | 4.5843 | 408 | 1.0175 | 0.2921 | 1.0175 | 1.0087 |
256
+ | No log | 4.6067 | 410 | 0.9902 | 0.2503 | 0.9902 | 0.9951 |
257
+ | No log | 4.6292 | 412 | 1.0155 | 0.1767 | 1.0155 | 1.0077 |
258
+ | No log | 4.6517 | 414 | 1.0209 | 0.2998 | 1.0209 | 1.0104 |
259
+ | No log | 4.6742 | 416 | 1.1030 | 0.4099 | 1.1030 | 1.0502 |
260
+ | No log | 4.6966 | 418 | 1.1633 | 0.4387 | 1.1633 | 1.0786 |
261
+ | No log | 4.7191 | 420 | 1.1085 | 0.4418 | 1.1085 | 1.0529 |
262
+ | No log | 4.7416 | 422 | 0.9395 | 0.3674 | 0.9395 | 0.9693 |
263
+ | No log | 4.7640 | 424 | 0.9778 | 0.3476 | 0.9778 | 0.9889 |
264
+ | No log | 4.7865 | 426 | 1.0229 | 0.3424 | 1.0229 | 1.0114 |
265
+ | No log | 4.8090 | 428 | 0.9421 | 0.3714 | 0.9421 | 0.9706 |
266
+ | No log | 4.8315 | 430 | 0.9675 | 0.3886 | 0.9675 | 0.9836 |
267
+ | No log | 4.8539 | 432 | 1.0543 | 0.4220 | 1.0543 | 1.0268 |
268
+ | No log | 4.8764 | 434 | 0.9973 | 0.3918 | 0.9973 | 0.9986 |
269
+ | No log | 4.8989 | 436 | 0.9117 | 0.3237 | 0.9117 | 0.9548 |
270
+ | No log | 4.9213 | 438 | 0.8720 | 0.2509 | 0.8720 | 0.9338 |
271
+ | No log | 4.9438 | 440 | 0.8465 | 0.3719 | 0.8465 | 0.9201 |
272
+ | No log | 4.9663 | 442 | 0.8427 | 0.3437 | 0.8427 | 0.9180 |
273
+ | No log | 4.9888 | 444 | 0.8556 | 0.3654 | 0.8556 | 0.9250 |
274
+ | No log | 5.0112 | 446 | 0.9448 | 0.4972 | 0.9448 | 0.9720 |
275
+ | No log | 5.0337 | 448 | 0.9425 | 0.4972 | 0.9425 | 0.9708 |
276
+ | No log | 5.0562 | 450 | 0.9619 | 0.5015 | 0.9619 | 0.9808 |
277
+ | No log | 5.0787 | 452 | 0.9440 | 0.4869 | 0.9440 | 0.9716 |
278
+ | No log | 5.1011 | 454 | 0.9802 | 0.5015 | 0.9802 | 0.9900 |
279
+ | No log | 5.1236 | 456 | 0.9895 | 0.5015 | 0.9895 | 0.9947 |
280
+ | No log | 5.1461 | 458 | 1.0262 | 0.5015 | 1.0262 | 1.0130 |
281
+ | No log | 5.1685 | 460 | 1.1085 | 0.4697 | 1.1085 | 1.0528 |
282
+ | No log | 5.1910 | 462 | 1.0470 | 0.4857 | 1.0470 | 1.0232 |
283
+ | No log | 5.2135 | 464 | 0.9219 | 0.4202 | 0.9219 | 0.9602 |
284
+ | No log | 5.2360 | 466 | 0.9111 | 0.3548 | 0.9111 | 0.9545 |
285
+ | No log | 5.2584 | 468 | 0.8833 | 0.3042 | 0.8833 | 0.9398 |
286
+ | No log | 5.2809 | 470 | 0.8870 | 0.3042 | 0.8870 | 0.9418 |
287
+ | No log | 5.3034 | 472 | 0.9191 | 0.3160 | 0.9191 | 0.9587 |
288
+ | No log | 5.3258 | 474 | 0.9657 | 0.4034 | 0.9657 | 0.9827 |
289
+ | No log | 5.3483 | 476 | 0.9905 | 0.4258 | 0.9905 | 0.9952 |
290
+ | No log | 5.3708 | 478 | 0.9303 | 0.3632 | 0.9303 | 0.9645 |
291
+ | No log | 5.3933 | 480 | 0.8790 | 0.3041 | 0.8790 | 0.9376 |
292
+ | No log | 5.4157 | 482 | 0.8790 | 0.3859 | 0.8790 | 0.9375 |
293
+ | No log | 5.4382 | 484 | 0.8641 | 0.3437 | 0.8641 | 0.9296 |
294
+ | No log | 5.4607 | 486 | 0.8624 | 0.3290 | 0.8624 | 0.9286 |
295
+ | No log | 5.4831 | 488 | 0.8689 | 0.3508 | 0.8689 | 0.9321 |
296
+ | No log | 5.5056 | 490 | 0.8406 | 0.3830 | 0.8406 | 0.9169 |
297
+ | No log | 5.5281 | 492 | 0.8474 | 0.3830 | 0.8474 | 0.9205 |
298
+ | No log | 5.5506 | 494 | 0.8655 | 0.3859 | 0.8655 | 0.9303 |
299
+ | No log | 5.5730 | 496 | 0.8625 | 0.3859 | 0.8625 | 0.9287 |
300
+ | No log | 5.5955 | 498 | 0.8304 | 0.3830 | 0.8304 | 0.9113 |
301
+ | 0.3912 | 5.6180 | 500 | 0.8759 | 0.4444 | 0.8759 | 0.9359 |
302
+ | 0.3912 | 5.6404 | 502 | 0.9440 | 0.4811 | 0.9440 | 0.9716 |
303
+ | 0.3912 | 5.6629 | 504 | 0.9122 | 0.4724 | 0.9122 | 0.9551 |
304
+ | 0.3912 | 5.6854 | 506 | 0.8377 | 0.3948 | 0.8377 | 0.9153 |
305
+ | 0.3912 | 5.7079 | 508 | 0.8391 | 0.4548 | 0.8391 | 0.9160 |
306
+ | 0.3912 | 5.7303 | 510 | 0.8559 | 0.4048 | 0.8559 | 0.9252 |
307
+ | 0.3912 | 5.7528 | 512 | 0.9112 | 0.3654 | 0.9112 | 0.9546 |
308
+ | 0.3912 | 5.7753 | 514 | 0.8912 | 0.3142 | 0.8912 | 0.9440 |
309
+ | 0.3912 | 5.7978 | 516 | 0.9229 | 0.2199 | 0.9229 | 0.9607 |
310
+ | 0.3912 | 5.8202 | 518 | 0.9379 | 0.2843 | 0.9379 | 0.9685 |
+
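+ In the table above, Qwk is presumably Cohen's quadratic weighted kappa computed on discretized scores, Mse equals the validation loss (mean squared error), and Rmse is its square root. A hedged sketch of how such metrics are typically computed; the exact discretization used here is not recorded in the card.
+
+ ```python
+ # Illustration only: example values, and rounding to integer grades is assumed.
+ import numpy as np
+ from sklearn.metrics import cohen_kappa_score, mean_squared_error
+
+ y_true = np.array([2, 3, 1, 4, 2])             # gold organization scores
+ y_pred = np.array([2.4, 2.8, 1.2, 3.6, 2.1])   # model regression outputs
+
+ mse = mean_squared_error(y_true, y_pred)
+ rmse = float(np.sqrt(mse))                      # Rmse column = sqrt(Mse)
+ qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
+ print(mse, rmse, qwk)
+ ```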
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - Pytorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
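+
+ A quick way to confirm that the runtime environment matches the versions above:
+
+ ```python
+ # Print installed versions for comparison with the list in this card.
+ import datasets, tokenizers, torch, transformers
+
+ print(transformers.__version__)  # expected 4.44.2
+ print(torch.__version__)         # expected 2.4.0+cu118
+ print(datasets.__version__)      # expected 2.21.0
+ print(tokenizers.__version__)    # expected 0.19.1
+ ```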
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f22b91fe104357c529d0ecebc4dd61d2bcd751d8c7d77ca4acfdde1380ab3711
+ size 540799996
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b85ee66c028ad71811b0029283363b0498386c6b6386cafc32b77e1551c601d2
+ size 5368