MayBashendy committed
Commit f510595 · verified · 1 Parent(s): 2a37950

Training in progress, step 500

Files changed (4)
  1. README.md +317 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,317 @@
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task2_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task2_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0123
- Qwk: 0.3711
- Mse: 1.0123
- Rmse: 1.0061

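Here Qwk is Cohen's quadratic weighted kappa and Rmse is the square root of Mse; Loss and Mse coincide, which is consistent with the MSE objective that `"problem_type": "regression"` in config.json implies. A minimal sketch of how such metrics can be computed, assuming scikit-learn and hypothetical `y_true`/`y_pred` arrays (not the actual evaluation data):

```python
# Hypothetical reproduction of the reported metric types; y_true/y_pred are
# placeholder arrays, not the real evaluation set.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])            # gold organization scores (made up)
y_pred = np.array([2.6, 2.2, 3.4, 1.9, 3.1])  # raw regression outputs (made up)

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# Kappa is defined over discrete ratings, so round the regression outputs first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Mse: {mse:.4f}  Rmse: {rmse:.4f}  Qwk: {qwk:.4f}")
```
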
## Model description

More information needed

## Intended uses & limitations

More information needed

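In the absence of author-provided guidance, a minimal inference sketch follows. The repo id below is inferred from the commit author and model name and is an assumption, not confirmed by the card:

```python
# Hedged usage sketch: single-output regression head on AraBERT, so the
# predicted score is read from `logits` directly. repo_id is an assumption.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```
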
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

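A rough `TrainingArguments` equivalent (a sketch, not the authors' script: `output_dir`, `eval_steps`, and `logging_steps` are assumptions inferred from the results table below, where evaluation runs every 2 steps and the training loss is first logged at step 500):

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task2_organization",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,        # transformers default; matches the card
    adam_beta2=0.999,      # default
    adam_epsilon=1e-8,     # default
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,          # assumed from the eval cadence in the table
    logging_steps=500,     # assumed: "No log" until step 500
)
```
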
### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0690 | 2 | 4.6454 | -0.0020 | 4.6454 | 2.1553 |
| No log | 0.1379 | 4 | 2.6055 | 0.0025 | 2.6055 | 1.6141 |
| No log | 0.2069 | 6 | 2.1625 | 0.0143 | 2.1625 | 1.4706 |
| No log | 0.2759 | 8 | 2.0563 | 0.0324 | 2.0563 | 1.4340 |
| No log | 0.3448 | 10 | 1.5247 | 0.0182 | 1.5247 | 1.2348 |
| No log | 0.4138 | 12 | 1.2363 | 0.1532 | 1.2363 | 1.1119 |
| No log | 0.4828 | 14 | 1.1886 | 0.1379 | 1.1886 | 1.0902 |
| No log | 0.5517 | 16 | 1.1939 | 0.1711 | 1.1939 | 1.0927 |
| No log | 0.6207 | 18 | 1.3784 | 0.0254 | 1.3784 | 1.1741 |
| No log | 0.6897 | 20 | 1.5929 | 0.0 | 1.5929 | 1.2621 |
| No log | 0.7586 | 22 | 1.4892 | 0.0 | 1.4892 | 1.2203 |
| No log | 0.8276 | 24 | 1.2276 | 0.1542 | 1.2276 | 1.1080 |
| No log | 0.8966 | 26 | 1.1763 | 0.2142 | 1.1763 | 1.0846 |
| No log | 0.9655 | 28 | 1.1921 | 0.1979 | 1.1921 | 1.0919 |
| No log | 1.0345 | 30 | 1.2465 | 0.1814 | 1.2465 | 1.1165 |
| No log | 1.1034 | 32 | 1.5569 | -0.0535 | 1.5569 | 1.2478 |
| No log | 1.1724 | 34 | 1.7020 | 0.0793 | 1.7020 | 1.3046 |
| No log | 1.2414 | 36 | 1.7523 | 0.0033 | 1.7523 | 1.3238 |
| No log | 1.3103 | 38 | 1.5029 | 0.1542 | 1.5029 | 1.2259 |
| No log | 1.3793 | 40 | 1.3188 | 0.0654 | 1.3188 | 1.1484 |
| No log | 1.4483 | 42 | 1.2829 | 0.1335 | 1.2829 | 1.1327 |
| No log | 1.5172 | 44 | 1.2488 | 0.1556 | 1.2488 | 1.1175 |
| No log | 1.5862 | 46 | 1.2460 | 0.2342 | 1.2460 | 1.1162 |
| No log | 1.6552 | 48 | 1.4588 | 0.1772 | 1.4588 | 1.2078 |
| No log | 1.7241 | 50 | 1.5077 | 0.1196 | 1.5077 | 1.2279 |
| No log | 1.7931 | 52 | 1.2343 | 0.1865 | 1.2343 | 1.1110 |
| No log | 1.8621 | 54 | 1.0707 | 0.3603 | 1.0707 | 1.0347 |
| No log | 1.9310 | 56 | 1.0509 | 0.3299 | 1.0509 | 1.0251 |
| No log | 2.0 | 58 | 1.0824 | 0.1752 | 1.0824 | 1.0404 |
| No log | 2.0690 | 60 | 1.0970 | 0.1752 | 1.0970 | 1.0474 |
| No log | 2.1379 | 62 | 1.1368 | 0.2835 | 1.1368 | 1.0662 |
| No log | 2.2069 | 64 | 1.3277 | 0.0838 | 1.3277 | 1.1522 |
| No log | 2.2759 | 66 | 1.4978 | 0.0723 | 1.4978 | 1.2239 |
| No log | 2.3448 | 68 | 1.4254 | 0.0811 | 1.4254 | 1.1939 |
| No log | 2.4138 | 70 | 1.2430 | 0.0688 | 1.2430 | 1.1149 |
| No log | 2.4828 | 72 | 1.1293 | 0.2935 | 1.1293 | 1.0627 |
| No log | 2.5517 | 74 | 1.0813 | 0.4072 | 1.0813 | 1.0399 |
| No log | 2.6207 | 76 | 1.0692 | 0.3921 | 1.0692 | 1.0340 |
| No log | 2.6897 | 78 | 1.0735 | 0.3902 | 1.0735 | 1.0361 |
| No log | 2.7586 | 80 | 1.1464 | 0.2690 | 1.1464 | 1.0707 |
| No log | 2.8276 | 82 | 1.2223 | 0.2675 | 1.2223 | 1.1056 |
| No log | 2.8966 | 84 | 1.2232 | 0.2346 | 1.2232 | 1.1060 |
| No log | 2.9655 | 86 | 1.2903 | 0.2019 | 1.2903 | 1.1359 |
| No log | 3.0345 | 88 | 1.2289 | 0.2346 | 1.2289 | 1.1085 |
| No log | 3.1034 | 90 | 1.0991 | 0.3031 | 1.0991 | 1.0484 |
| No log | 3.1724 | 92 | 1.0551 | 0.3855 | 1.0551 | 1.0272 |
| No log | 3.2414 | 94 | 1.1354 | 0.3018 | 1.1354 | 1.0656 |
| No log | 3.3103 | 96 | 1.1983 | 0.3530 | 1.1983 | 1.0946 |
| No log | 3.3793 | 98 | 1.0883 | 0.3633 | 1.0883 | 1.0432 |
| No log | 3.4483 | 100 | 1.0806 | 0.4050 | 1.0806 | 1.0395 |
| No log | 3.5172 | 102 | 1.1054 | 0.3897 | 1.1054 | 1.0514 |
| No log | 3.5862 | 104 | 1.0486 | 0.4318 | 1.0486 | 1.0240 |
| No log | 3.6552 | 106 | 1.0682 | 0.3985 | 1.0682 | 1.0335 |
| No log | 3.7241 | 108 | 1.0650 | 0.4215 | 1.0650 | 1.0320 |
| No log | 3.7931 | 110 | 1.0845 | 0.4215 | 1.0845 | 1.0414 |
| No log | 3.8621 | 112 | 1.0785 | 0.3923 | 1.0785 | 1.0385 |
| No log | 3.9310 | 114 | 1.0960 | 0.3460 | 1.0960 | 1.0469 |
| No log | 4.0 | 116 | 1.1551 | 0.3469 | 1.1551 | 1.0748 |
| No log | 4.0690 | 118 | 1.0701 | 0.3590 | 1.0701 | 1.0345 |
| No log | 4.1379 | 120 | 1.0676 | 0.3328 | 1.0676 | 1.0332 |
| No log | 4.2069 | 122 | 1.0709 | 0.4036 | 1.0709 | 1.0348 |
| No log | 4.2759 | 124 | 0.9802 | 0.4290 | 0.9802 | 0.9900 |
| No log | 4.3448 | 126 | 0.9626 | 0.4417 | 0.9626 | 0.9811 |
| No log | 4.4138 | 128 | 1.0460 | 0.4677 | 1.0460 | 1.0227 |
| No log | 4.4828 | 130 | 1.0330 | 0.4431 | 1.0330 | 1.0163 |
| No log | 4.5517 | 132 | 1.1244 | 0.3560 | 1.1244 | 1.0604 |
| No log | 4.6207 | 134 | 1.1056 | 0.3941 | 1.1056 | 1.0515 |
| No log | 4.6897 | 136 | 0.9625 | 0.4527 | 0.9625 | 0.9811 |
| No log | 4.7586 | 138 | 0.9949 | 0.4297 | 0.9949 | 0.9974 |
| No log | 4.8276 | 140 | 0.9952 | 0.4297 | 0.9952 | 0.9976 |
| No log | 4.8966 | 142 | 0.9702 | 0.4349 | 0.9702 | 0.9850 |
| No log | 4.9655 | 144 | 1.1275 | 0.3484 | 1.1275 | 1.0619 |
| No log | 5.0345 | 146 | 1.4800 | 0.3025 | 1.4800 | 1.2165 |
| No log | 5.1034 | 148 | 1.5980 | 0.2796 | 1.5980 | 1.2641 |
| No log | 5.1724 | 150 | 1.2760 | 0.3792 | 1.2760 | 1.1296 |
| No log | 5.2414 | 152 | 0.9558 | 0.5121 | 0.9558 | 0.9776 |
| No log | 5.3103 | 154 | 0.9440 | 0.4814 | 0.9440 | 0.9716 |
| No log | 5.3793 | 156 | 0.9740 | 0.4382 | 0.9740 | 0.9869 |
| No log | 5.4483 | 158 | 1.0761 | 0.4974 | 1.0761 | 1.0374 |
| No log | 5.5172 | 160 | 1.2551 | 0.4261 | 1.2551 | 1.1203 |
| No log | 5.5862 | 162 | 1.3741 | 0.3876 | 1.3741 | 1.1722 |
| No log | 5.6552 | 164 | 1.1897 | 0.3490 | 1.1897 | 1.0907 |
| No log | 5.7241 | 166 | 1.1177 | 0.3884 | 1.1177 | 1.0572 |
| No log | 5.7931 | 168 | 1.0083 | 0.4291 | 1.0083 | 1.0042 |
| No log | 5.8621 | 170 | 0.9754 | 0.4074 | 0.9754 | 0.9876 |
| No log | 5.9310 | 172 | 1.0399 | 0.3667 | 1.0399 | 1.0198 |
| No log | 6.0 | 174 | 1.0542 | 0.3413 | 1.0542 | 1.0267 |
| No log | 6.0690 | 176 | 0.9969 | 0.4007 | 0.9969 | 0.9984 |
| No log | 6.1379 | 178 | 0.8884 | 0.4459 | 0.8884 | 0.9425 |
| No log | 6.2069 | 180 | 0.9236 | 0.5060 | 0.9236 | 0.9610 |
| No log | 6.2759 | 182 | 1.1094 | 0.4583 | 1.1094 | 1.0533 |
| No log | 6.3448 | 184 | 1.1477 | 0.4293 | 1.1477 | 1.0713 |
| No log | 6.4138 | 186 | 0.9977 | 0.3980 | 0.9977 | 0.9988 |
| No log | 6.4828 | 188 | 0.9228 | 0.5176 | 0.9228 | 0.9606 |
| No log | 6.5517 | 190 | 0.9307 | 0.4586 | 0.9307 | 0.9647 |
| No log | 6.6207 | 192 | 0.9429 | 0.4685 | 0.9429 | 0.9710 |
| No log | 6.6897 | 194 | 0.8759 | 0.5264 | 0.8759 | 0.9359 |
| No log | 6.7586 | 196 | 0.8872 | 0.5470 | 0.8872 | 0.9419 |
| No log | 6.8276 | 198 | 1.0619 | 0.3921 | 1.0619 | 1.0305 |
| No log | 6.8966 | 200 | 1.3526 | 0.3209 | 1.3526 | 1.1630 |
| No log | 6.9655 | 202 | 1.3885 | 0.3099 | 1.3885 | 1.1783 |
| No log | 7.0345 | 204 | 1.1847 | 0.3411 | 1.1847 | 1.0884 |
| No log | 7.1034 | 206 | 0.9044 | 0.5029 | 0.9044 | 0.9510 |
| No log | 7.1724 | 208 | 0.8520 | 0.5175 | 0.8520 | 0.9230 |
| No log | 7.2414 | 210 | 0.8574 | 0.4498 | 0.8574 | 0.9259 |
| No log | 7.3103 | 212 | 0.8700 | 0.4927 | 0.8700 | 0.9327 |
| No log | 7.3793 | 214 | 0.9989 | 0.4289 | 0.9989 | 0.9994 |
| No log | 7.4483 | 216 | 1.1896 | 0.3024 | 1.1896 | 1.0907 |
| No log | 7.5172 | 218 | 1.2545 | 0.3134 | 1.2545 | 1.1200 |
| No log | 7.5862 | 220 | 1.1618 | 0.2798 | 1.1618 | 1.0779 |
| No log | 7.6552 | 222 | 1.0199 | 0.4127 | 1.0199 | 1.0099 |
| No log | 7.7241 | 224 | 0.9416 | 0.4300 | 0.9416 | 0.9704 |
| No log | 7.7931 | 226 | 0.9471 | 0.3931 | 0.9471 | 0.9732 |
| No log | 7.8621 | 228 | 0.9499 | 0.3434 | 0.9499 | 0.9746 |
| No log | 7.9310 | 230 | 1.0119 | 0.3996 | 1.0119 | 1.0059 |
| No log | 8.0 | 232 | 1.1404 | 0.2634 | 1.1404 | 1.0679 |
| No log | 8.0690 | 234 | 1.1457 | 0.3044 | 1.1457 | 1.0704 |
| No log | 8.1379 | 236 | 1.0377 | 0.4281 | 1.0377 | 1.0187 |
| No log | 8.2069 | 238 | 0.9411 | 0.5013 | 0.9411 | 0.9701 |
| No log | 8.2759 | 240 | 0.9227 | 0.4780 | 0.9227 | 0.9606 |
| No log | 8.3448 | 242 | 0.9561 | 0.4611 | 0.9561 | 0.9778 |
| No log | 8.4138 | 244 | 1.0110 | 0.3936 | 1.0110 | 1.0055 |
| No log | 8.4828 | 246 | 1.0071 | 0.3560 | 1.0071 | 1.0035 |
| No log | 8.5517 | 248 | 1.0148 | 0.3891 | 1.0148 | 1.0074 |
| No log | 8.6207 | 250 | 0.9813 | 0.3841 | 0.9813 | 0.9906 |
| No log | 8.6897 | 252 | 0.9511 | 0.4369 | 0.9511 | 0.9753 |
| No log | 8.7586 | 254 | 0.9231 | 0.4611 | 0.9231 | 0.9608 |
| No log | 8.8276 | 256 | 0.9293 | 0.4145 | 0.9293 | 0.9640 |
| No log | 8.8966 | 258 | 0.9602 | 0.4572 | 0.9602 | 0.9799 |
| No log | 8.9655 | 260 | 1.0675 | 0.4668 | 1.0675 | 1.0332 |
| No log | 9.0345 | 262 | 1.0312 | 0.4607 | 1.0312 | 1.0155 |
| No log | 9.1034 | 264 | 1.0011 | 0.4454 | 1.0011 | 1.0005 |
| No log | 9.1724 | 266 | 0.9381 | 0.4016 | 0.9381 | 0.9686 |
| No log | 9.2414 | 268 | 0.9309 | 0.4009 | 0.9309 | 0.9648 |
| No log | 9.3103 | 270 | 0.9310 | 0.4242 | 0.9310 | 0.9649 |
| No log | 9.3793 | 272 | 0.9314 | 0.4519 | 0.9314 | 0.9651 |
| No log | 9.4483 | 274 | 0.9742 | 0.3787 | 0.9742 | 0.9870 |
| No log | 9.5172 | 276 | 1.0505 | 0.3584 | 1.0505 | 1.0250 |
| No log | 9.5862 | 278 | 1.0823 | 0.2844 | 1.0823 | 1.0403 |
| No log | 9.6552 | 280 | 1.0337 | 0.2976 | 1.0337 | 1.0167 |
| No log | 9.7241 | 282 | 1.0356 | 0.2574 | 1.0356 | 1.0176 |
| No log | 9.7931 | 284 | 1.0392 | 0.2431 | 1.0392 | 1.0194 |
| No log | 9.8621 | 286 | 1.0149 | 0.3069 | 1.0149 | 1.0074 |
| No log | 9.9310 | 288 | 1.0037 | 0.3613 | 1.0037 | 1.0018 |
| No log | 10.0 | 290 | 1.0890 | 0.2727 | 1.0890 | 1.0436 |
| No log | 10.0690 | 292 | 1.1181 | 0.2978 | 1.1181 | 1.0574 |
| No log | 10.1379 | 294 | 1.0352 | 0.3716 | 1.0352 | 1.0175 |
| No log | 10.2069 | 296 | 0.9644 | 0.3654 | 0.9644 | 0.9821 |
| No log | 10.2759 | 298 | 0.9096 | 0.3884 | 0.9096 | 0.9537 |
| No log | 10.3448 | 300 | 0.9025 | 0.3769 | 0.9025 | 0.9500 |
| No log | 10.4138 | 302 | 0.9132 | 0.4244 | 0.9132 | 0.9556 |
| No log | 10.4828 | 304 | 1.0247 | 0.3225 | 1.0247 | 1.0123 |
| No log | 10.5517 | 306 | 1.1756 | 0.2857 | 1.1756 | 1.0843 |
| No log | 10.6207 | 308 | 1.1710 | 0.2716 | 1.1710 | 1.0821 |
| No log | 10.6897 | 310 | 1.0307 | 0.3655 | 1.0307 | 1.0152 |
| No log | 10.7586 | 312 | 0.8973 | 0.4328 | 0.8973 | 0.9473 |
| No log | 10.8276 | 314 | 0.8537 | 0.5374 | 0.8537 | 0.9240 |
| No log | 10.8966 | 316 | 0.8342 | 0.5279 | 0.8342 | 0.9133 |
| No log | 10.9655 | 318 | 0.8389 | 0.5250 | 0.8389 | 0.9159 |
| No log | 11.0345 | 320 | 0.9041 | 0.4201 | 0.9041 | 0.9508 |
| No log | 11.1034 | 322 | 0.9791 | 0.4291 | 0.9791 | 0.9895 |
| No log | 11.1724 | 324 | 0.9276 | 0.4648 | 0.9276 | 0.9631 |
| No log | 11.2414 | 326 | 0.8392 | 0.5080 | 0.8392 | 0.9161 |
| No log | 11.3103 | 328 | 0.8196 | 0.4563 | 0.8196 | 0.9053 |
| No log | 11.3793 | 330 | 0.8215 | 0.4526 | 0.8215 | 0.9064 |
| No log | 11.4483 | 332 | 0.8499 | 0.4722 | 0.8499 | 0.9219 |
| No log | 11.5172 | 334 | 0.8641 | 0.4946 | 0.8641 | 0.9296 |
| No log | 11.5862 | 336 | 0.8626 | 0.5213 | 0.8626 | 0.9287 |
| No log | 11.6552 | 338 | 0.8756 | 0.5045 | 0.8756 | 0.9357 |
| No log | 11.7241 | 340 | 0.9343 | 0.4519 | 0.9343 | 0.9666 |
| No log | 11.7931 | 342 | 0.9181 | 0.4526 | 0.9181 | 0.9582 |
| No log | 11.8621 | 344 | 0.8827 | 0.4526 | 0.8827 | 0.9395 |
| No log | 11.9310 | 346 | 0.9031 | 0.4036 | 0.9031 | 0.9503 |
| No log | 12.0 | 348 | 0.9501 | 0.3401 | 0.9501 | 0.9748 |
| No log | 12.0690 | 350 | 0.9368 | 0.3994 | 0.9368 | 0.9679 |
| No log | 12.1379 | 352 | 0.9132 | 0.3557 | 0.9132 | 0.9556 |
| No log | 12.2069 | 354 | 0.8988 | 0.4036 | 0.8988 | 0.9480 |
| No log | 12.2759 | 356 | 0.9501 | 0.4295 | 0.9501 | 0.9747 |
| No log | 12.3448 | 358 | 1.0398 | 0.4539 | 1.0398 | 1.0197 |
| No log | 12.4138 | 360 | 1.2429 | 0.3304 | 1.2429 | 1.1148 |
| No log | 12.4828 | 362 | 1.3039 | 0.3576 | 1.3039 | 1.1419 |
| No log | 12.5517 | 364 | 1.1888 | 0.3523 | 1.1888 | 1.0903 |
| No log | 12.6207 | 366 | 1.0657 | 0.2381 | 1.0657 | 1.0323 |
| No log | 12.6897 | 368 | 1.0717 | 0.2236 | 1.0717 | 1.0352 |
| No log | 12.7586 | 370 | 1.0196 | 0.2702 | 1.0196 | 1.0097 |
| No log | 12.8276 | 372 | 0.9746 | 0.3879 | 0.9746 | 0.9872 |
| No log | 12.8966 | 374 | 0.9594 | 0.3656 | 0.9594 | 0.9795 |
| No log | 12.9655 | 376 | 1.0135 | 0.3762 | 1.0135 | 1.0067 |
| No log | 13.0345 | 378 | 1.0540 | 0.4096 | 1.0540 | 1.0266 |
| No log | 13.1034 | 380 | 1.0768 | 0.3739 | 1.0768 | 1.0377 |
| No log | 13.1724 | 382 | 1.0794 | 0.3739 | 1.0794 | 1.0389 |
| No log | 13.2414 | 384 | 0.9824 | 0.3753 | 0.9824 | 0.9911 |
| No log | 13.3103 | 386 | 0.9750 | 0.3753 | 0.9750 | 0.9874 |
| No log | 13.3793 | 388 | 0.9401 | 0.3165 | 0.9401 | 0.9696 |
| No log | 13.4483 | 390 | 0.8928 | 0.3957 | 0.8928 | 0.9449 |
| No log | 13.5172 | 392 | 0.8779 | 0.3616 | 0.8779 | 0.9370 |
| No log | 13.5862 | 394 | 0.8979 | 0.3462 | 0.8979 | 0.9476 |
| No log | 13.6552 | 396 | 0.9734 | 0.2619 | 0.9734 | 0.9866 |
| No log | 13.7241 | 398 | 0.9776 | 0.2781 | 0.9776 | 0.9887 |
| No log | 13.7931 | 400 | 1.0028 | 0.2619 | 1.0028 | 1.0014 |
| No log | 13.8621 | 402 | 1.0322 | 0.2669 | 1.0322 | 1.0160 |
| No log | 13.9310 | 404 | 1.0472 | 0.2669 | 1.0472 | 1.0233 |
| No log | 14.0 | 406 | 1.0288 | 0.3844 | 1.0288 | 1.0143 |
| No log | 14.0690 | 408 | 0.9881 | 0.3474 | 0.9881 | 0.9940 |
| No log | 14.1379 | 410 | 0.9533 | 0.3474 | 0.9533 | 0.9764 |
| No log | 14.2069 | 412 | 0.9776 | 0.3474 | 0.9776 | 0.9887 |
| No log | 14.2759 | 414 | 0.9935 | 0.2949 | 0.9935 | 0.9968 |
| No log | 14.3448 | 416 | 0.9899 | 0.2552 | 0.9899 | 0.9949 |
| No log | 14.4138 | 418 | 0.9637 | 0.3223 | 0.9637 | 0.9817 |
| No log | 14.4828 | 420 | 0.9843 | 0.2220 | 0.9843 | 0.9921 |
| No log | 14.5517 | 422 | 1.0023 | 0.2263 | 1.0023 | 1.0012 |
| No log | 14.6207 | 424 | 1.0532 | 0.2857 | 1.0532 | 1.0263 |
| No log | 14.6897 | 426 | 1.0035 | 0.3394 | 1.0035 | 1.0018 |
| No log | 14.7586 | 428 | 0.9527 | 0.4328 | 0.9527 | 0.9761 |
| No log | 14.8276 | 430 | 1.0052 | 0.4811 | 1.0052 | 1.0026 |
| No log | 14.8966 | 432 | 1.0275 | 0.4134 | 1.0275 | 1.0137 |
| No log | 14.9655 | 434 | 0.9701 | 0.4833 | 0.9701 | 0.9850 |
| No log | 15.0345 | 436 | 0.9370 | 0.4983 | 0.9370 | 0.9680 |
| No log | 15.1034 | 438 | 0.9519 | 0.4631 | 0.9519 | 0.9756 |
| No log | 15.1724 | 440 | 0.9807 | 0.4291 | 0.9807 | 0.9903 |
| No log | 15.2414 | 442 | 1.0333 | 0.3567 | 1.0333 | 1.0165 |
| No log | 15.3103 | 444 | 1.0543 | 0.3089 | 1.0543 | 1.0268 |
| No log | 15.3793 | 446 | 0.9667 | 0.4354 | 0.9667 | 0.9832 |
| No log | 15.4483 | 448 | 0.8873 | 0.3998 | 0.8873 | 0.9420 |
| No log | 15.5172 | 450 | 0.8848 | 0.3998 | 0.8848 | 0.9407 |
| No log | 15.5862 | 452 | 0.9109 | 0.4203 | 0.9109 | 0.9544 |
| No log | 15.6552 | 454 | 0.9767 | 0.3625 | 0.9767 | 0.9883 |
| No log | 15.7241 | 456 | 1.1133 | 0.3249 | 1.1133 | 1.0551 |
| No log | 15.7931 | 458 | 1.1312 | 0.3249 | 1.1312 | 1.0636 |
| No log | 15.8621 | 460 | 1.0700 | 0.2978 | 1.0700 | 1.0344 |
| No log | 15.9310 | 462 | 0.9866 | 0.3202 | 0.9866 | 0.9933 |
| No log | 16.0 | 464 | 1.0060 | 0.1918 | 1.0060 | 1.0030 |
| No log | 16.0690 | 466 | 1.0651 | 0.2108 | 1.0651 | 1.0320 |
| No log | 16.1379 | 468 | 1.0275 | 0.2604 | 1.0275 | 1.0137 |
| No log | 16.2069 | 470 | 1.0119 | 0.3503 | 1.0119 | 1.0059 |
| No log | 16.2759 | 472 | 0.9644 | 0.4262 | 0.9644 | 0.9820 |
| No log | 16.3448 | 474 | 0.8936 | 0.4470 | 0.8936 | 0.9453 |
| No log | 16.4138 | 476 | 0.8581 | 0.4006 | 0.8581 | 0.9263 |
| No log | 16.4828 | 478 | 0.8356 | 0.3902 | 0.8356 | 0.9141 |
| No log | 16.5517 | 480 | 0.8829 | 0.4560 | 0.8829 | 0.9396 |
| No log | 16.6207 | 482 | 0.9921 | 0.4435 | 0.9921 | 0.9960 |
| No log | 16.6897 | 484 | 0.9787 | 0.4772 | 0.9787 | 0.9893 |
| No log | 16.7586 | 486 | 0.9104 | 0.3834 | 0.9104 | 0.9542 |
| No log | 16.8276 | 488 | 0.9249 | 0.3697 | 0.9249 | 0.9617 |
| No log | 16.8966 | 490 | 0.9676 | 0.3517 | 0.9676 | 0.9837 |
| No log | 16.9655 | 492 | 0.9290 | 0.3590 | 0.9290 | 0.9638 |
| No log | 17.0345 | 494 | 0.8797 | 0.4482 | 0.8797 | 0.9379 |
| No log | 17.1034 | 496 | 0.9041 | 0.3820 | 0.9041 | 0.9508 |
| No log | 17.1724 | 498 | 1.0079 | 0.4468 | 1.0079 | 1.0039 |
| 0.4117 | 17.2414 | 500 | 1.0903 | 0.4304 | 1.0903 | 1.0442 |
| 0.4117 | 17.3103 | 502 | 1.0323 | 0.4348 | 1.0323 | 1.0160 |
| 0.4117 | 17.3793 | 504 | 0.9619 | 0.4297 | 0.9619 | 0.9807 |
| 0.4117 | 17.4483 | 506 | 0.9751 | 0.4631 | 0.9751 | 0.9875 |
| 0.4117 | 17.5172 | 508 | 1.0505 | 0.3117 | 1.0505 | 1.0249 |
| 0.4117 | 17.5862 | 510 | 1.0564 | 0.3192 | 1.0564 | 1.0278 |
| 0.4117 | 17.6552 | 512 | 1.0005 | 0.3711 | 1.0005 | 1.0003 |
| 0.4117 | 17.7241 | 514 | 0.9970 | 0.3711 | 0.9970 | 0.9985 |
| 0.4117 | 17.7931 | 516 | 1.0123 | 0.3711 | 1.0123 | 1.0061 |

### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
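The single `LABEL_0` entry together with `"problem_type": "regression"` means this is a one-unit regression head on top of the AraBERT encoder. A sketch of how an equivalent head is constructed from the base checkpoint via the standard transformers API (not the authors' training script; the fine-tuned weights themselves live in model.safetensors below):

```python
# Sketch: recreate an equivalent regression head on the base model.
# This initializes a fresh, untrained head with the same config.
from transformers import AutoConfig, AutoModelForSequenceClassification

config = AutoConfig.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,               # single output -> id2label contains only LABEL_0
    problem_type="regression",  # Trainer then optimizes with MSELoss
)
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", config=config
)
```
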
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:452b7fe54ad930692f6c1a53d05a859d2080adf8da1a81ab68769d2b0bd30cce
size 540799996
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:51cec9a4219b67227bae3908a95fd1261b8e33e360aa0e17fb181c6b05bbc081
size 5304