MayBashendy committed (verified)
Commit 062df42 · Parent: 7068a47

Training in progress, step 500

Files changed (4)
  1. README.md +316 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k5_task3_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k5_task3_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2068
- Qwk (quadratic weighted kappa): -0.2217
- Mse: 1.2068
- Rmse: 1.0985

## Model description

More information needed

## Intended uses & limitations

More information needed
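
No usage notes were provided, so the snippet below is only a minimal inference sketch. The repo id is an assumption inferred from the committer and model name, not confirmed by this commit; substitute the actual checkpoint path if it differs. Because `config.json` (below) sets `problem_type: "regression"` with a single label, the head emits one logit per input, read here as the predicted organization score.

```python
# Hedged inference sketch; the repo id below is an assumption.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = (
    "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_"
    "FineTuningAraBERT_run1_AugV5_k5_task3_organization"
)
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    # Single-label regression head -> one logit, the predicted score.
    score = model(**inputs).logits.squeeze(-1).item()
print(score)
```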

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch reconstructing them as `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
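
A minimal sketch reconstructing these settings with transformers 4.44.2. The eval/save/logging cadence is inferred rather than stated: the results table below evaluates every 2 steps, and the commit message implies a checkpoint and first logged loss at step 500.

```python
# Sketch of the listed hyperparameters; cadence values are inferences.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k5_task3_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",   # table evaluates every 2 steps (inferred)
    eval_steps=2,
    logging_steps=500,       # first logged training loss at step 500 (inferred)
    save_steps=500,
)
```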

### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.1538 | 2 | 3.7193 | 0.0035 | 3.7193 | 1.9286 |
| No log | 0.3077 | 4 | 2.0465 | 0.0704 | 2.0465 | 1.4306 |
| No log | 0.4615 | 6 | 1.5715 | 0.0452 | 1.5715 | 1.2536 |
| No log | 0.6154 | 8 | 1.1011 | -0.0385 | 1.1011 | 1.0493 |
| No log | 0.7692 | 10 | 1.4501 | -0.0988 | 1.4501 | 1.2042 |
| No log | 0.9231 | 12 | 1.8211 | -0.0690 | 1.8211 | 1.3495 |
| No log | 1.0769 | 14 | 2.2098 | -0.0122 | 2.2098 | 1.4865 |
| No log | 1.2308 | 16 | 1.5168 | -0.0457 | 1.5168 | 1.2316 |
| No log | 1.3846 | 18 | 1.0273 | -0.0668 | 1.0273 | 1.0136 |
| No log | 1.5385 | 20 | 0.9127 | -0.0504 | 0.9127 | 0.9553 |
| No log | 1.6923 | 22 | 0.9253 | 0.0316 | 0.9253 | 0.9619 |
| No log | 1.8462 | 24 | 1.0840 | -0.0648 | 1.0840 | 1.0412 |
| No log | 2.0 | 26 | 1.0027 | -0.0031 | 1.0027 | 1.0014 |
| No log | 2.1538 | 28 | 0.9084 | -0.0182 | 0.9084 | 0.9531 |
| No log | 2.3077 | 30 | 0.8240 | -0.2652 | 0.8240 | 0.9078 |
| No log | 2.4615 | 32 | 0.8565 | -0.0408 | 0.8565 | 0.9255 |
| No log | 2.6154 | 34 | 1.0667 | -0.0101 | 1.0667 | 1.0328 |
| No log | 2.7692 | 36 | 1.2955 | -0.0736 | 1.2955 | 1.1382 |
| No log | 2.9231 | 38 | 1.4678 | -0.0490 | 1.4678 | 1.2115 |
| No log | 3.0769 | 40 | 1.1994 | -0.0712 | 1.1994 | 1.0952 |
| No log | 3.2308 | 42 | 0.8824 | -0.0056 | 0.8824 | 0.9393 |
| No log | 3.3846 | 44 | 0.8153 | -0.0351 | 0.8153 | 0.9029 |
| No log | 3.5385 | 46 | 0.8191 | -0.1244 | 0.8191 | 0.9051 |
| No log | 3.6923 | 48 | 1.1448 | -0.0617 | 1.1448 | 1.0700 |
| No log | 3.8462 | 50 | 1.6883 | 0.0014 | 1.6883 | 1.2993 |
| No log | 4.0 | 52 | 2.0572 | 0.0144 | 2.0572 | 1.4343 |
| No log | 4.1538 | 54 | 1.4892 | 0.0090 | 1.4892 | 1.2203 |
| No log | 4.3077 | 56 | 1.0851 | 0.0026 | 1.0851 | 1.0417 |
| No log | 4.4615 | 58 | 0.9129 | -0.0033 | 0.9129 | 0.9555 |
| No log | 4.6154 | 60 | 0.8386 | 0.0588 | 0.8386 | 0.9158 |
| No log | 4.7692 | 62 | 0.9700 | -0.1645 | 0.9700 | 0.9849 |
| No log | 4.9231 | 64 | 1.1665 | -0.0668 | 1.1665 | 1.0801 |
| No log | 5.0769 | 66 | 1.1615 | -0.0320 | 1.1615 | 1.0777 |
| No log | 5.2308 | 68 | 1.1329 | -0.0568 | 1.1329 | 1.0644 |
| No log | 5.3846 | 70 | 1.2129 | -0.0561 | 1.2129 | 1.1013 |
| No log | 5.5385 | 72 | 1.0848 | -0.0452 | 1.0848 | 1.0415 |
| No log | 5.6923 | 74 | 1.0829 | -0.0087 | 1.0829 | 1.0406 |
| No log | 5.8462 | 76 | 1.1218 | 0.0249 | 1.1218 | 1.0591 |
| No log | 6.0 | 78 | 1.3921 | -0.0238 | 1.3921 | 1.1799 |
| No log | 6.1538 | 80 | 1.2751 | -0.0497 | 1.2751 | 1.1292 |
| No log | 6.3077 | 82 | 0.9798 | 0.0109 | 0.9798 | 0.9898 |
| No log | 6.4615 | 84 | 0.9105 | 0.0633 | 0.9105 | 0.9542 |
| No log | 6.6154 | 86 | 0.9332 | 0.0586 | 0.9332 | 0.9660 |
| No log | 6.7692 | 88 | 1.2546 | -0.0777 | 1.2546 | 1.1201 |
| No log | 6.9231 | 90 | 1.6012 | -0.1155 | 1.6012 | 1.2654 |
| No log | 7.0769 | 92 | 1.2167 | 0.0175 | 1.2167 | 1.1031 |
| No log | 7.2308 | 94 | 1.0428 | -0.0362 | 1.0428 | 1.0212 |
| No log | 7.3846 | 96 | 1.0973 | -0.0732 | 1.0973 | 1.0475 |
| No log | 7.5385 | 98 | 1.4963 | -0.0942 | 1.4963 | 1.2232 |
| No log | 7.6923 | 100 | 1.4827 | -0.0311 | 1.4827 | 1.2177 |
| No log | 7.8462 | 102 | 1.2310 | -0.0820 | 1.2310 | 1.1095 |
| No log | 8.0 | 104 | 1.1404 | -0.0717 | 1.1404 | 1.0679 |
| No log | 8.1538 | 106 | 1.0853 | -0.0749 | 1.0853 | 1.0418 |
| No log | 8.3077 | 108 | 1.1576 | -0.0795 | 1.1576 | 1.0759 |
| No log | 8.4615 | 110 | 1.2816 | -0.0842 | 1.2816 | 1.1321 |
| No log | 8.6154 | 112 | 1.3359 | -0.0912 | 1.3359 | 1.1558 |
| No log | 8.7692 | 114 | 1.2597 | -0.0905 | 1.2597 | 1.1224 |
| No log | 8.9231 | 116 | 0.9925 | -0.1131 | 0.9925 | 0.9962 |
| No log | 9.0769 | 118 | 0.9895 | -0.0240 | 0.9895 | 0.9947 |
| No log | 9.2308 | 120 | 1.1331 | -0.1877 | 1.1331 | 1.0645 |
| No log | 9.3846 | 122 | 1.4308 | -0.0794 | 1.4308 | 1.1961 |
| No log | 9.5385 | 124 | 1.7116 | -0.1227 | 1.7116 | 1.3083 |
| No log | 9.6923 | 126 | 1.3232 | -0.0479 | 1.3232 | 1.1503 |
| No log | 9.8462 | 128 | 1.0722 | -0.0655 | 1.0722 | 1.0355 |
| No log | 10.0 | 130 | 0.9995 | -0.1957 | 0.9995 | 0.9997 |
| No log | 10.1538 | 132 | 1.0512 | -0.1512 | 1.0512 | 1.0253 |
| No log | 10.3077 | 134 | 1.2328 | -0.1869 | 1.2328 | 1.1103 |
| No log | 10.4615 | 136 | 1.3196 | -0.1861 | 1.3196 | 1.1487 |
| No log | 10.6154 | 138 | 1.3307 | -0.1484 | 1.3307 | 1.1535 |
| No log | 10.7692 | 140 | 1.1922 | -0.1499 | 1.1922 | 1.0919 |
| No log | 10.9231 | 142 | 1.1789 | -0.1877 | 1.1789 | 1.0858 |
| No log | 11.0769 | 144 | 1.1572 | -0.1138 | 1.1572 | 1.0757 |
| No log | 11.2308 | 146 | 1.1244 | -0.0795 | 1.1244 | 1.0604 |
| No log | 11.3846 | 148 | 1.0980 | -0.0068 | 1.0980 | 1.0479 |
| No log | 11.5385 | 150 | 1.0043 | -0.1083 | 1.0043 | 1.0021 |
| No log | 11.6923 | 152 | 1.0025 | -0.0669 | 1.0025 | 1.0013 |
| No log | 11.8462 | 154 | 0.9909 | -0.1088 | 0.9909 | 0.9954 |
| No log | 12.0 | 156 | 1.1064 | -0.0076 | 1.1064 | 1.0519 |
| No log | 12.1538 | 158 | 1.1272 | -0.0076 | 1.1272 | 1.0617 |
| No log | 12.3077 | 160 | 0.9584 | -0.0656 | 0.9584 | 0.9790 |
| No log | 12.4615 | 162 | 0.9304 | -0.0251 | 0.9304 | 0.9646 |
| No log | 12.6154 | 164 | 1.1162 | -0.0076 | 1.1162 | 1.0565 |
| No log | 12.7692 | 166 | 1.2922 | 0.0065 | 1.2922 | 1.1367 |
| No log | 12.9231 | 168 | 1.1007 | -0.0409 | 1.1007 | 1.0491 |
| No log | 13.0769 | 170 | 1.0232 | -0.1338 | 1.0232 | 1.0115 |
| No log | 13.2308 | 172 | 1.0064 | -0.1791 | 1.0064 | 1.0032 |
| No log | 13.3846 | 174 | 1.0630 | -0.1055 | 1.0630 | 1.0310 |
| No log | 13.5385 | 176 | 1.2743 | -0.0142 | 1.2743 | 1.1288 |
| No log | 13.6923 | 178 | 1.4308 | -0.0888 | 1.4308 | 1.1962 |
| No log | 13.8462 | 180 | 1.2363 | -0.0840 | 1.2363 | 1.1119 |
| No log | 14.0 | 182 | 1.0019 | -0.1083 | 1.0019 | 1.0010 |
| No log | 14.1538 | 184 | 0.9629 | -0.1006 | 0.9629 | 0.9813 |
| No log | 14.3077 | 186 | 0.9927 | 0.0099 | 0.9927 | 0.9964 |
| No log | 14.4615 | 188 | 1.1052 | -0.1221 | 1.1052 | 1.0513 |
| No log | 14.6154 | 190 | 1.1401 | -0.0471 | 1.1401 | 1.0678 |
| No log | 14.7692 | 192 | 1.1260 | -0.1169 | 1.1260 | 1.0611 |
| No log | 14.9231 | 194 | 1.0646 | -0.2191 | 1.0646 | 1.0318 |
| No log | 15.0769 | 196 | 1.0090 | -0.0963 | 1.0090 | 1.0045 |
| No log | 15.2308 | 198 | 1.0192 | -0.1449 | 1.0192 | 1.0096 |
| No log | 15.3846 | 200 | 1.0611 | -0.1446 | 1.0611 | 1.0301 |
| No log | 15.5385 | 202 | 1.0569 | -0.1496 | 1.0569 | 1.0280 |
| No log | 15.6923 | 204 | 0.9495 | -0.0672 | 0.9495 | 0.9744 |
| No log | 15.8462 | 206 | 0.9191 | 0.0099 | 0.9191 | 0.9587 |
| No log | 16.0 | 208 | 0.9901 | 0.0017 | 0.9901 | 0.9950 |
| No log | 16.1538 | 210 | 1.0368 | 0.0409 | 1.0368 | 1.0182 |
| No log | 16.3077 | 212 | 0.9741 | -0.0735 | 0.9741 | 0.9870 |
| No log | 16.4615 | 214 | 0.9788 | -0.0699 | 0.9788 | 0.9893 |
| No log | 16.6154 | 216 | 1.0205 | -0.1135 | 1.0205 | 1.0102 |
| No log | 16.7692 | 218 | 0.9992 | -0.1126 | 0.9992 | 0.9996 |
| No log | 16.9231 | 220 | 1.0027 | -0.0711 | 1.0027 | 1.0013 |
| No log | 17.0769 | 222 | 0.9937 | -0.0711 | 0.9937 | 0.9968 |
| No log | 17.2308 | 224 | 0.9908 | -0.0699 | 0.9908 | 0.9954 |
| No log | 17.3846 | 226 | 1.0754 | -0.0717 | 1.0754 | 1.0370 |
| No log | 17.5385 | 228 | 1.0301 | -0.1102 | 1.0301 | 1.0149 |
| No log | 17.6923 | 230 | 0.9372 | -0.1580 | 0.9372 | 0.9681 |
| No log | 17.8462 | 232 | 0.9727 | -0.1131 | 0.9727 | 0.9862 |
| No log | 18.0 | 234 | 1.0565 | -0.0717 | 1.0565 | 1.0279 |
| No log | 18.1538 | 236 | 1.0621 | -0.0717 | 1.0621 | 1.0306 |
| No log | 18.3077 | 238 | 1.0389 | -0.1806 | 1.0389 | 1.0193 |
| No log | 18.4615 | 240 | 1.0347 | -0.1679 | 1.0347 | 1.0172 |
| No log | 18.6154 | 242 | 1.0110 | -0.1679 | 1.0110 | 1.0055 |
| No log | 18.7692 | 244 | 1.0777 | -0.0322 | 1.0777 | 1.0381 |
| No log | 18.9231 | 246 | 1.2515 | -0.0847 | 1.2515 | 1.1187 |
| No log | 19.0769 | 248 | 1.1770 | -0.1166 | 1.1770 | 1.0849 |
| No log | 19.2308 | 250 | 1.0219 | -0.1032 | 1.0219 | 1.0109 |
| No log | 19.3846 | 252 | 0.9574 | -0.0860 | 0.9574 | 0.9785 |
| No log | 19.5385 | 254 | 0.9043 | -0.1057 | 0.9043 | 0.9510 |
| No log | 19.6923 | 256 | 0.8989 | -0.0851 | 0.8989 | 0.9481 |
| No log | 19.8462 | 258 | 0.9894 | -0.0616 | 0.9894 | 0.9947 |
| No log | 20.0 | 260 | 1.1852 | -0.0712 | 1.1852 | 1.0887 |
| No log | 20.1538 | 262 | 1.2273 | -0.1480 | 1.2273 | 1.1079 |
| No log | 20.3077 | 264 | 1.0690 | -0.1111 | 1.0690 | 1.0339 |
| No log | 20.4615 | 266 | 0.9824 | -0.1915 | 0.9824 | 0.9912 |
| No log | 20.6154 | 268 | 0.9655 | -0.1915 | 0.9655 | 0.9826 |
| No log | 20.7692 | 270 | 1.0337 | -0.1554 | 1.0337 | 1.0167 |
| No log | 20.9231 | 272 | 1.2665 | -0.0198 | 1.2665 | 1.1254 |
| No log | 21.0769 | 274 | 1.4318 | -0.0252 | 1.4318 | 1.1966 |
| No log | 21.2308 | 276 | 1.2913 | -0.0175 | 1.2913 | 1.1364 |
| No log | 21.3846 | 278 | 1.0581 | -0.1060 | 1.0581 | 1.0286 |
| No log | 21.5385 | 280 | 0.9691 | -0.1336 | 0.9691 | 0.9844 |
| No log | 21.6923 | 282 | 0.9372 | -0.1572 | 0.9372 | 0.9681 |
| No log | 21.8462 | 284 | 0.9349 | -0.1336 | 0.9349 | 0.9669 |
| No log | 22.0 | 286 | 1.0082 | -0.0008 | 1.0082 | 1.0041 |
| No log | 22.1538 | 288 | 1.3025 | -0.0600 | 1.3025 | 1.1413 |
| No log | 22.3077 | 290 | 1.4747 | -0.0655 | 1.4747 | 1.2144 |
| No log | 22.4615 | 292 | 1.3250 | -0.0324 | 1.3250 | 1.1511 |
| No log | 22.6154 | 294 | 1.0725 | 0.0283 | 1.0725 | 1.0356 |
| No log | 22.7692 | 296 | 0.9214 | -0.1077 | 0.9214 | 0.9599 |
| No log | 22.9231 | 298 | 0.8787 | -0.1066 | 0.8787 | 0.9374 |
| No log | 23.0769 | 300 | 0.8829 | -0.1077 | 0.8829 | 0.9397 |
| No log | 23.2308 | 302 | 0.9689 | 0.0377 | 0.9689 | 0.9843 |
| No log | 23.3846 | 304 | 1.1306 | 0.0065 | 1.1306 | 1.0633 |
| No log | 23.5385 | 306 | 1.1464 | -0.0608 | 1.1464 | 1.0707 |
| No log | 23.6923 | 308 | 1.0525 | 0.0277 | 1.0525 | 1.0259 |
| No log | 23.8462 | 310 | 0.9984 | -0.0030 | 0.9984 | 0.9992 |
| No log | 24.0 | 312 | 0.8993 | -0.1144 | 0.8993 | 0.9483 |
| No log | 24.1538 | 314 | 0.8753 | -0.1072 | 0.8753 | 0.9356 |
| No log | 24.3077 | 316 | 0.9165 | -0.0669 | 0.9165 | 0.9573 |
| No log | 24.4615 | 318 | 1.1212 | -0.0090 | 1.1212 | 1.0588 |
| No log | 24.6154 | 320 | 1.3377 | -0.0619 | 1.3377 | 1.1566 |
| No log | 24.7692 | 322 | 1.3515 | 0.0367 | 1.3515 | 1.1626 |
| No log | 24.9231 | 324 | 1.1422 | -0.0424 | 1.1422 | 1.0688 |
| No log | 25.0769 | 326 | 0.9964 | -0.1152 | 0.9964 | 0.9982 |
| No log | 25.2308 | 328 | 0.9575 | -0.1144 | 0.9575 | 0.9785 |
| No log | 25.3846 | 330 | 0.9886 | -0.1111 | 0.9886 | 0.9943 |
| No log | 25.5385 | 332 | 1.0416 | -0.0030 | 1.0416 | 1.0206 |
| No log | 25.6923 | 334 | 1.0901 | -0.0094 | 1.0901 | 1.0441 |
| No log | 25.8462 | 336 | 1.0502 | -0.0391 | 1.0502 | 1.0248 |
| No log | 26.0 | 338 | 0.9873 | -0.0767 | 0.9873 | 0.9936 |
| No log | 26.1538 | 340 | 0.9676 | -0.0767 | 0.9676 | 0.9837 |
| No log | 26.3077 | 342 | 0.9516 | -0.1985 | 0.9516 | 0.9755 |
| No log | 26.4615 | 344 | 0.9423 | -0.1576 | 0.9423 | 0.9707 |
| No log | 26.6154 | 346 | 0.9653 | -0.1572 | 0.9653 | 0.9825 |
| No log | 26.7692 | 348 | 1.0091 | -0.1140 | 1.0091 | 1.0046 |
| No log | 26.9231 | 350 | 1.1133 | -0.1119 | 1.1133 | 1.0551 |
| No log | 27.0769 | 352 | 1.2052 | -0.0151 | 1.2052 | 1.0978 |
| No log | 27.2308 | 354 | 1.1649 | -0.0466 | 1.1649 | 1.0793 |
| No log | 27.3846 | 356 | 1.0639 | -0.1115 | 1.0639 | 1.0315 |
| No log | 27.5385 | 358 | 1.0164 | -0.1119 | 1.0164 | 1.0082 |
| No log | 27.6923 | 360 | 0.9412 | -0.0735 | 0.9412 | 0.9701 |
| No log | 27.8462 | 362 | 0.9023 | -0.0755 | 0.9023 | 0.9499 |
| No log | 28.0 | 364 | 0.8725 | -0.1121 | 0.8725 | 0.9341 |
| No log | 28.1538 | 366 | 0.8736 | -0.1121 | 0.8736 | 0.9347 |
| No log | 28.3077 | 368 | 0.8840 | -0.1121 | 0.8840 | 0.9402 |
| No log | 28.4615 | 370 | 0.9557 | -0.0362 | 0.9557 | 0.9776 |
| No log | 28.6154 | 372 | 1.0181 | 0.0267 | 1.0181 | 1.0090 |
| No log | 28.7692 | 374 | 0.9907 | -0.0362 | 0.9907 | 0.9953 |
| No log | 28.9231 | 376 | 0.9169 | -0.1140 | 0.9169 | 0.9576 |
| No log | 29.0769 | 378 | 0.9230 | -0.1140 | 0.9230 | 0.9607 |
| No log | 29.2308 | 380 | 0.9710 | -0.1152 | 0.9710 | 0.9854 |
| No log | 29.3846 | 382 | 0.9761 | -0.1152 | 0.9761 | 0.9880 |
| No log | 29.5385 | 384 | 0.9446 | -0.1135 | 0.9446 | 0.9719 |
| No log | 29.6923 | 386 | 0.9055 | -0.1982 | 0.9055 | 0.9516 |
| No log | 29.8462 | 388 | 0.8876 | -0.1982 | 0.8876 | 0.9421 |
| No log | 30.0 | 390 | 0.9127 | -0.1572 | 0.9127 | 0.9554 |
| No log | 30.1538 | 392 | 0.9565 | -0.0322 | 0.9565 | 0.9780 |
| No log | 30.3077 | 394 | 1.0463 | -0.0118 | 1.0463 | 1.0229 |
| No log | 30.4615 | 396 | 1.0181 | -0.0456 | 1.0181 | 1.0090 |
| No log | 30.6154 | 398 | 0.9270 | -0.0723 | 0.9270 | 0.9628 |
| No log | 30.7692 | 400 | 0.8774 | -0.1524 | 0.8774 | 0.9367 |
| No log | 30.9231 | 402 | 0.8817 | -0.1524 | 0.8817 | 0.9390 |
| No log | 31.0769 | 404 | 0.8817 | -0.1524 | 0.8817 | 0.9390 |
| No log | 31.2308 | 406 | 0.9033 | -0.1135 | 0.9033 | 0.9504 |
| No log | 31.3846 | 408 | 1.0288 | -0.0118 | 1.0288 | 1.0143 |
| No log | 31.5385 | 410 | 1.2109 | 0.0687 | 1.2109 | 1.1004 |
| No log | 31.6923 | 412 | 1.2889 | 0.0224 | 1.2889 | 1.1353 |
| No log | 31.8462 | 414 | 1.1746 | -0.0575 | 1.1746 | 1.0838 |
| No log | 32.0 | 416 | 1.0522 | -0.0777 | 1.0522 | 1.0257 |
| No log | 32.1538 | 418 | 1.0285 | -0.0759 | 1.0285 | 1.0142 |
| No log | 32.3077 | 420 | 1.0063 | -0.1568 | 1.0063 | 1.0032 |
| No log | 32.4615 | 422 | 0.9941 | -0.1512 | 0.9941 | 0.9970 |
| No log | 32.6154 | 424 | 0.9962 | -0.1576 | 0.9962 | 0.9981 |
| No log | 32.7692 | 426 | 1.0088 | -0.1568 | 1.0088 | 1.0044 |
| No log | 32.9231 | 428 | 1.0328 | -0.1932 | 1.0328 | 1.0163 |
| No log | 33.0769 | 430 | 1.0583 | -0.1166 | 1.0583 | 1.0287 |
| No log | 33.2308 | 432 | 1.0397 | -0.1932 | 1.0397 | 1.0197 |
| No log | 33.3846 | 434 | 0.9957 | -0.2351 | 0.9957 | 0.9978 |
| No log | 33.5385 | 436 | 0.9594 | -0.1515 | 0.9594 | 0.9795 |
| No log | 33.6923 | 438 | 0.9706 | -0.1515 | 0.9706 | 0.9852 |
| No log | 33.8462 | 440 | 0.9998 | -0.1077 | 0.9998 | 0.9999 |
| No log | 34.0 | 442 | 1.0249 | -0.1083 | 1.0249 | 1.0124 |
| No log | 34.1538 | 444 | 1.0782 | -0.2308 | 1.0782 | 1.0384 |
| No log | 34.3077 | 446 | 1.0904 | -0.2308 | 1.0904 | 1.0442 |
| No log | 34.4615 | 448 | 1.0745 | -0.2046 | 1.0745 | 1.0366 |
| No log | 34.6154 | 450 | 1.0598 | -0.2046 | 1.0598 | 1.0295 |
| No log | 34.7692 | 452 | 1.0566 | -0.2033 | 1.0566 | 1.0279 |
| No log | 34.9231 | 454 | 1.0873 | -0.1977 | 1.0873 | 1.0427 |
| No log | 35.0769 | 456 | 1.0689 | -0.1977 | 1.0689 | 1.0339 |
| No log | 35.2308 | 458 | 1.0055 | -0.2046 | 1.0055 | 1.0027 |
| No log | 35.3846 | 460 | 0.9655 | -0.2086 | 0.9655 | 0.9826 |
| No log | 35.5385 | 462 | 0.9562 | -0.2062 | 0.9562 | 0.9778 |
| No log | 35.6923 | 464 | 0.9633 | -0.2062 | 0.9633 | 0.9815 |
| No log | 35.8462 | 466 | 0.9716 | -0.2975 | 0.9716 | 0.9857 |
| No log | 36.0 | 468 | 0.9714 | -0.2116 | 0.9714 | 0.9856 |
| No log | 36.1538 | 470 | 0.9867 | -0.2100 | 0.9867 | 0.9933 |
| No log | 36.3077 | 472 | 1.0237 | -0.1628 | 1.0237 | 1.0118 |
| No log | 36.4615 | 474 | 1.0433 | -0.1628 | 1.0433 | 1.0214 |
| No log | 36.6154 | 476 | 1.0286 | -0.1628 | 1.0286 | 1.0142 |
| No log | 36.7692 | 478 | 1.0218 | -0.1628 | 1.0218 | 1.0108 |
| No log | 36.9231 | 480 | 1.0353 | -0.1628 | 1.0353 | 1.0175 |
| No log | 37.0769 | 482 | 1.0124 | -0.1628 | 1.0124 | 1.0062 |
| No log | 37.2308 | 484 | 0.9603 | -0.1628 | 0.9603 | 0.9800 |
| No log | 37.3846 | 486 | 0.9427 | -0.1628 | 0.9427 | 0.9709 |
| No log | 37.5385 | 488 | 0.9574 | -0.1628 | 0.9574 | 0.9785 |
| No log | 37.6923 | 490 | 0.9690 | -0.2086 | 0.9690 | 0.9844 |
| No log | 37.8462 | 492 | 0.9794 | -0.1576 | 0.9794 | 0.9896 |
| No log | 38.0 | 494 | 0.9878 | -0.1576 | 0.9878 | 0.9939 |
| No log | 38.1538 | 496 | 1.0407 | -0.2072 | 1.0407 | 1.0202 |
| No log | 38.3077 | 498 | 1.1507 | -0.1169 | 1.1507 | 1.0727 |
| 0.2154 | 38.4615 | 500 | 1.1175 | -0.1212 | 1.1175 | 1.0571 |
| 0.2154 | 38.6154 | 502 | 1.0049 | -0.1628 | 1.0049 | 1.0025 |
| 0.2154 | 38.7692 | 504 | 0.9495 | -0.1187 | 0.9495 | 0.9744 |
| 0.2154 | 38.9231 | 506 | 0.9659 | -0.1576 | 0.9659 | 0.9828 |
| 0.2154 | 39.0769 | 508 | 1.0077 | -0.1512 | 1.0077 | 1.0038 |
| 0.2154 | 39.2308 | 510 | 1.0615 | -0.2343 | 1.0615 | 1.0303 |
| 0.2154 | 39.3846 | 512 | 1.0995 | -0.1985 | 1.0995 | 1.0486 |
| 0.2154 | 39.5385 | 514 | 1.2068 | -0.2217 | 1.2068 | 1.0985 |
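
The metric code itself is not part of this commit. A sketch of how the Qwk, Mse, and Rmse columns above are conventionally computed with scikit-learn, assuming regression outputs are rounded to integer grades before the kappa (that rounding is an assumption, not something the card states):

```python
# Hedged sketch of the three evaluation columns: Qwk, Mse, Rmse.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),   # round to integer grades (assumption)
        np.rint(preds).astype(int),
        weights="quadratic",           # quadratic weighted kappa
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```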

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
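
The fields that determine scoring behavior are `problem_type` and the single `id2label` entry. A quick sketch checking them (loading the file from a local path is an assumption; a repo id works the same way):

```python
# Sketch: confirm the head is a single-label regression head.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("config.json")  # local copy of the file above
assert config.problem_type == "regression"
assert config.num_labels == 1  # derived from the single id2label entry -> MSE loss
print(config.model_type, config.hidden_size, config.vocab_size)
```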
model.safetensors ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:c1519a5df839d4c2bc701e2eab230d6560534915cf8215efcdd2f4d1ffb4c1cb
size 540799996
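
This is a Git LFS pointer, not the weights themselves; the oid and size above can be used to check a downloaded copy. A sketch:

```python
# Verify a downloaded model.safetensors against the LFS pointer above.
import hashlib
from pathlib import Path

path = Path("model.safetensors")
assert path.stat().st_size == 540799996, "size mismatch"

sha = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)
assert sha.hexdigest() == "c1519a5df839d4c2bc701e2eab230d6560534915cf8215efcdd2f4d1ffb4c1cb"
```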
training_args.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:e8508d1ca1473c67188e835d22c6cde5fd0310a2eda973bab13dd7afdbe2e3aa
size 5304
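
Also an LFS pointer; the underlying file is the `TrainingArguments` object that `Trainer` saves via `torch.save`. A sketch of inspecting it (it is a pickle, so only load it from a trusted source, with a compatible transformers version installed):

```python
# Inspect the saved training arguments (pickled object, trusted source only).
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.num_train_epochs, args.seed)  # should match the README
```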