MayBashendy committed on
Commit 87c3daf · verified · 1 Parent(s): b0f5f2a

Training in progress, step 400

Files changed (4)
  1. README.md +318 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,318 @@
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k3_task8_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k3_task8_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set (see the metric sketch below):
- Loss: 0.8402
- Qwk: 0.4011
- Mse: 0.8402
- Rmse: 0.9166

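Qwk is quadratic weighted Cohen's kappa computed on the discrete label scale, while Mse/Rmse are computed on the raw regression outputs (for `problem_type: regression` the training loss is MSE, which is why Loss and Mse coincide). As a rough, hedged illustration of how such numbers can be reproduced with scikit-learn (not the exact metric code this run used; the arrays and the 0 to 2 label range below are hypothetical):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and raw regression outputs (not data from this run).
y_true = np.array([0, 1, 2, 1, 0])
y_pred = np.array([0.2, 1.4, 1.8, 0.9, 0.1])

mse = mean_squared_error(y_true, y_pred)   # corresponds to the "Mse" column
rmse = np.sqrt(mse)                        # corresponds to the "Rmse" column

# Quadratic weighted kappa needs discrete categories, so round/clip predictions first.
qwk = cohen_kappa_score(
    y_true,
    np.clip(np.rint(y_pred), 0, 2).astype(int),
    weights="quadratic",
)
print(f"Mse={mse:.4f}  Rmse={rmse:.4f}  Qwk={qwk:.4f}")
```
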
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch below for how they map onto `TrainingArguments`):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

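A minimal, hedged sketch of how these hyperparameters could be expressed with `transformers.TrainingArguments`; the dataset objects, `compute_metrics` function, and output directory name are assumptions rather than details taken from this run, and the listed Adam betas/epsilon are the Trainer defaults:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
# One regression output, matching "problem_type": "regression" in config.json.
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=1, problem_type="regression"
)

args = TrainingArguments(
    output_dir="ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k3_task8_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# train_ds, eval_ds and compute_metrics are assumed to be defined elsewhere:
# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```
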
### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.2222 | 2 | 2.6355 | -0.0385 | 2.6355 | 1.6234 |
| No log | 0.4444 | 4 | 1.2878 | -0.0482 | 1.2878 | 1.1348 |
| No log | 0.6667 | 6 | 0.8837 | -0.1736 | 0.8837 | 0.9400 |
| No log | 0.8889 | 8 | 0.7389 | -0.0096 | 0.7389 | 0.8596 |
| No log | 1.1111 | 10 | 0.6981 | 0.1045 | 0.6981 | 0.8355 |
| No log | 1.3333 | 12 | 0.6964 | 0.1014 | 0.6964 | 0.8345 |
| No log | 1.5556 | 14 | 0.6654 | 0.1321 | 0.6654 | 0.8157 |
| No log | 1.7778 | 16 | 0.6297 | 0.0436 | 0.6297 | 0.7936 |
| No log | 2.0 | 18 | 0.6026 | 0.0436 | 0.6026 | 0.7763 |
| No log | 2.2222 | 20 | 0.6275 | 0.1188 | 0.6275 | 0.7921 |
| No log | 2.4444 | 22 | 0.6305 | 0.1723 | 0.6305 | 0.7940 |
| No log | 2.6667 | 24 | 0.6117 | 0.2576 | 0.6117 | 0.7821 |
| No log | 2.8889 | 26 | 0.6438 | 0.2765 | 0.6438 | 0.8024 |
| No log | 3.1111 | 28 | 0.5908 | 0.2901 | 0.5908 | 0.7686 |
| No log | 3.3333 | 30 | 0.5569 | 0.3395 | 0.5569 | 0.7463 |
| No log | 3.5556 | 32 | 0.5268 | 0.3115 | 0.5268 | 0.7258 |
| No log | 3.7778 | 34 | 0.5134 | 0.0857 | 0.5134 | 0.7165 |
| No log | 4.0 | 36 | 0.5803 | 0.2576 | 0.5803 | 0.7618 |
| No log | 4.2222 | 38 | 0.5871 | 0.2574 | 0.5871 | 0.7662 |
| No log | 4.4444 | 40 | 0.5905 | 0.2574 | 0.5905 | 0.7684 |
| No log | 4.6667 | 42 | 0.5087 | 0.2614 | 0.5087 | 0.7132 |
| No log | 4.8889 | 44 | 0.6014 | 0.2934 | 0.6014 | 0.7755 |
| No log | 5.1111 | 46 | 0.6846 | 0.3102 | 0.6846 | 0.8274 |
| No log | 5.3333 | 48 | 0.7389 | 0.2754 | 0.7389 | 0.8596 |
| No log | 5.5556 | 50 | 0.7140 | 0.1720 | 0.7140 | 0.8450 |
| No log | 5.7778 | 52 | 0.6383 | 0.0814 | 0.6383 | 0.7989 |
| No log | 6.0 | 54 | 0.5826 | 0.0857 | 0.5826 | 0.7633 |
| No log | 6.2222 | 56 | 0.5874 | 0.1501 | 0.5874 | 0.7664 |
| No log | 6.4444 | 58 | 0.5834 | 0.2894 | 0.5834 | 0.7638 |
| No log | 6.6667 | 60 | 0.5328 | 0.3085 | 0.5328 | 0.7299 |
| No log | 6.8889 | 62 | 0.4841 | 0.3809 | 0.4841 | 0.6957 |
| No log | 7.1111 | 64 | 0.5115 | 0.4862 | 0.5115 | 0.7152 |
| No log | 7.3333 | 66 | 0.5369 | 0.3783 | 0.5369 | 0.7327 |
| No log | 7.5556 | 68 | 0.5754 | 0.3263 | 0.5754 | 0.7585 |
| No log | 7.7778 | 70 | 0.5317 | 0.4069 | 0.5317 | 0.7292 |
| No log | 8.0 | 72 | 0.4520 | 0.6209 | 0.4520 | 0.6723 |
| No log | 8.2222 | 74 | 0.4640 | 0.4729 | 0.4640 | 0.6812 |
| No log | 8.4444 | 76 | 0.4742 | 0.5018 | 0.4742 | 0.6886 |
| No log | 8.6667 | 78 | 0.4750 | 0.6348 | 0.4750 | 0.6892 |
| No log | 8.8889 | 80 | 0.5145 | 0.5680 | 0.5145 | 0.7173 |
| No log | 9.1111 | 82 | 0.5991 | 0.3581 | 0.5991 | 0.7740 |
| No log | 9.3333 | 84 | 0.6568 | 0.3259 | 0.6568 | 0.8104 |
| No log | 9.5556 | 86 | 0.5637 | 0.5367 | 0.5637 | 0.7508 |
| No log | 9.7778 | 88 | 0.5169 | 0.6210 | 0.5169 | 0.7190 |
| No log | 10.0 | 90 | 0.5337 | 0.6210 | 0.5337 | 0.7305 |
| No log | 10.2222 | 92 | 0.5709 | 0.6001 | 0.5709 | 0.7555 |
| No log | 10.4444 | 94 | 0.7220 | 0.3703 | 0.7220 | 0.8497 |
| No log | 10.6667 | 96 | 0.7394 | 0.3703 | 0.7394 | 0.8599 |
| No log | 10.8889 | 98 | 0.6761 | 0.3851 | 0.6761 | 0.8223 |
| No log | 11.1111 | 100 | 0.6624 | 0.4189 | 0.6624 | 0.8139 |
| No log | 11.3333 | 102 | 0.6601 | 0.4466 | 0.6601 | 0.8125 |
| No log | 11.5556 | 104 | 0.6171 | 0.4838 | 0.6171 | 0.7855 |
| No log | 11.7778 | 106 | 0.6275 | 0.5173 | 0.6275 | 0.7922 |
| No log | 12.0 | 108 | 0.6485 | 0.4610 | 0.6485 | 0.8053 |
| No log | 12.2222 | 110 | 0.6515 | 0.5117 | 0.6515 | 0.8071 |
| No log | 12.4444 | 112 | 0.6671 | 0.5155 | 0.6671 | 0.8167 |
| No log | 12.6667 | 114 | 0.7484 | 0.4192 | 0.7484 | 0.8651 |
| No log | 12.8889 | 116 | 0.7931 | 0.3431 | 0.7931 | 0.8906 |
| No log | 13.1111 | 118 | 0.8651 | 0.3465 | 0.8651 | 0.9301 |
| No log | 13.3333 | 120 | 0.8687 | 0.3465 | 0.8687 | 0.9320 |
| No log | 13.5556 | 122 | 0.7581 | 0.4442 | 0.7581 | 0.8707 |
| No log | 13.7778 | 124 | 0.6572 | 0.4510 | 0.6572 | 0.8107 |
| No log | 14.0 | 126 | 0.6165 | 0.4526 | 0.6165 | 0.7852 |
| No log | 14.2222 | 128 | 0.6084 | 0.4852 | 0.6084 | 0.7800 |
| No log | 14.4444 | 130 | 0.8393 | 0.4428 | 0.8393 | 0.9161 |
| No log | 14.6667 | 132 | 1.1616 | 0.3607 | 1.1616 | 1.0778 |
| No log | 14.8889 | 134 | 1.0736 | 0.3690 | 1.0736 | 1.0361 |
| No log | 15.1111 | 136 | 0.8326 | 0.4271 | 0.8326 | 0.9125 |
| No log | 15.3333 | 138 | 0.6768 | 0.4767 | 0.6768 | 0.8227 |
| No log | 15.5556 | 140 | 0.6473 | 0.4858 | 0.6473 | 0.8046 |
| No log | 15.7778 | 142 | 0.7009 | 0.4616 | 0.7009 | 0.8372 |
| No log | 16.0 | 144 | 0.8736 | 0.4051 | 0.8736 | 0.9346 |
| No log | 16.2222 | 146 | 1.0529 | 0.3705 | 1.0529 | 1.0261 |
| No log | 16.4444 | 148 | 0.9471 | 0.3969 | 0.9471 | 0.9732 |
| No log | 16.6667 | 150 | 0.7028 | 0.4568 | 0.7028 | 0.8383 |
| No log | 16.8889 | 152 | 0.5512 | 0.4901 | 0.5512 | 0.7424 |
| No log | 17.1111 | 154 | 0.5378 | 0.4652 | 0.5378 | 0.7334 |
| No log | 17.3333 | 156 | 0.5409 | 0.4942 | 0.5409 | 0.7355 |
| No log | 17.5556 | 158 | 0.5750 | 0.5032 | 0.5750 | 0.7583 |
| No log | 17.7778 | 160 | 0.6952 | 0.4450 | 0.6952 | 0.8338 |
| No log | 18.0 | 162 | 0.9442 | 0.3920 | 0.9442 | 0.9717 |
| No log | 18.2222 | 164 | 1.0041 | 0.4007 | 1.0041 | 1.0020 |
| No log | 18.4444 | 166 | 0.8952 | 0.4068 | 0.8952 | 0.9462 |
| No log | 18.6667 | 168 | 0.6795 | 0.4673 | 0.6795 | 0.8243 |
| No log | 18.8889 | 170 | 0.5997 | 0.5221 | 0.5997 | 0.7744 |
| No log | 19.1111 | 172 | 0.5774 | 0.5247 | 0.5774 | 0.7599 |
| No log | 19.3333 | 174 | 0.6169 | 0.3846 | 0.6169 | 0.7854 |
| No log | 19.5556 | 176 | 0.7978 | 0.3229 | 0.7978 | 0.8932 |
| No log | 19.7778 | 178 | 1.0104 | 0.3399 | 1.0104 | 1.0052 |
| No log | 20.0 | 180 | 1.1491 | 0.3086 | 1.1491 | 1.0720 |
| No log | 20.2222 | 182 | 1.0388 | 0.2837 | 1.0388 | 1.0192 |
| No log | 20.4444 | 184 | 0.7814 | 0.4062 | 0.7814 | 0.8840 |
| No log | 20.6667 | 186 | 0.6819 | 0.5173 | 0.6819 | 0.8258 |
| No log | 20.8889 | 188 | 0.6716 | 0.4713 | 0.6716 | 0.8195 |
| No log | 21.1111 | 190 | 0.6417 | 0.4526 | 0.6417 | 0.8011 |
| No log | 21.3333 | 192 | 0.6467 | 0.3417 | 0.6467 | 0.8042 |
| No log | 21.5556 | 194 | 0.7487 | 0.3382 | 0.7487 | 0.8653 |
| No log | 21.7778 | 196 | 0.8479 | 0.3330 | 0.8479 | 0.9208 |
| No log | 22.0 | 198 | 0.9421 | 0.3681 | 0.9421 | 0.9706 |
| No log | 22.2222 | 200 | 0.9295 | 0.3443 | 0.9295 | 0.9641 |
| No log | 22.4444 | 202 | 0.7452 | 0.4194 | 0.7452 | 0.8633 |
| No log | 22.6667 | 204 | 0.6599 | 0.4717 | 0.6599 | 0.8124 |
| No log | 22.8889 | 206 | 0.5646 | 0.6141 | 0.5646 | 0.7514 |
| No log | 23.1111 | 208 | 0.5421 | 0.5467 | 0.5421 | 0.7363 |
| No log | 23.3333 | 210 | 0.5756 | 0.4929 | 0.5756 | 0.7587 |
| No log | 23.5556 | 212 | 0.7316 | 0.3344 | 0.7316 | 0.8554 |
| No log | 23.7778 | 214 | 0.9002 | 0.3790 | 0.9002 | 0.9488 |
| No log | 24.0 | 216 | 0.9553 | 0.3781 | 0.9553 | 0.9774 |
| No log | 24.2222 | 218 | 0.8794 | 0.4034 | 0.8794 | 0.9378 |
| No log | 24.4444 | 220 | 0.6828 | 0.4691 | 0.6828 | 0.8263 |
| No log | 24.6667 | 222 | 0.5576 | 0.5351 | 0.5576 | 0.7467 |
| No log | 24.8889 | 224 | 0.5560 | 0.5651 | 0.5560 | 0.7456 |
| No log | 25.1111 | 226 | 0.5966 | 0.5117 | 0.5966 | 0.7724 |
| No log | 25.3333 | 228 | 0.6928 | 0.3981 | 0.6928 | 0.8324 |
| No log | 25.5556 | 230 | 0.7991 | 0.3494 | 0.7991 | 0.8939 |
| No log | 25.7778 | 232 | 0.7769 | 0.4005 | 0.7769 | 0.8814 |
| No log | 26.0 | 234 | 0.6793 | 0.4234 | 0.6793 | 0.8242 |
| No log | 26.2222 | 236 | 0.5978 | 0.4755 | 0.5978 | 0.7731 |
| No log | 26.4444 | 238 | 0.5401 | 0.4627 | 0.5401 | 0.7349 |
| No log | 26.6667 | 240 | 0.5230 | 0.5037 | 0.5230 | 0.7232 |
| No log | 26.8889 | 242 | 0.5638 | 0.4702 | 0.5638 | 0.7509 |
| No log | 27.1111 | 244 | 0.5721 | 0.4203 | 0.5721 | 0.7564 |
| No log | 27.3333 | 246 | 0.5930 | 0.4308 | 0.5930 | 0.7701 |
| No log | 27.5556 | 248 | 0.6033 | 0.4562 | 0.6033 | 0.7767 |
| No log | 27.7778 | 250 | 0.6601 | 0.4540 | 0.6601 | 0.8125 |
| No log | 28.0 | 252 | 0.6581 | 0.4624 | 0.6581 | 0.8112 |
| No log | 28.2222 | 254 | 0.6580 | 0.5098 | 0.6580 | 0.8112 |
| No log | 28.4444 | 256 | 0.5800 | 0.4783 | 0.5800 | 0.7616 |
| No log | 28.6667 | 258 | 0.5790 | 0.4664 | 0.5790 | 0.7609 |
| No log | 28.8889 | 260 | 0.6563 | 0.4176 | 0.6563 | 0.8101 |
| No log | 29.1111 | 262 | 0.7746 | 0.4436 | 0.7746 | 0.8801 |
| No log | 29.3333 | 264 | 0.9298 | 0.4264 | 0.9298 | 0.9643 |
| No log | 29.5556 | 266 | 0.9776 | 0.4519 | 0.9776 | 0.9887 |
| No log | 29.7778 | 268 | 0.8414 | 0.4164 | 0.8414 | 0.9173 |
| No log | 30.0 | 270 | 0.6517 | 0.4852 | 0.6517 | 0.8073 |
| No log | 30.2222 | 272 | 0.5404 | 0.6132 | 0.5404 | 0.7351 |
| No log | 30.4444 | 274 | 0.5334 | 0.5642 | 0.5334 | 0.7303 |
| No log | 30.6667 | 276 | 0.5798 | 0.4929 | 0.5798 | 0.7614 |
| No log | 30.8889 | 278 | 0.6703 | 0.4128 | 0.6703 | 0.8187 |
| No log | 31.1111 | 280 | 0.7023 | 0.3186 | 0.7023 | 0.8380 |
| No log | 31.3333 | 282 | 0.6983 | 0.3186 | 0.6983 | 0.8357 |
| No log | 31.5556 | 284 | 0.6913 | 0.3472 | 0.6913 | 0.8314 |
| No log | 31.7778 | 286 | 0.6170 | 0.4925 | 0.6170 | 0.7855 |
| No log | 32.0 | 288 | 0.6065 | 0.4974 | 0.6065 | 0.7788 |
| No log | 32.2222 | 290 | 0.6331 | 0.5373 | 0.6331 | 0.7957 |
| No log | 32.4444 | 292 | 0.6076 | 0.4982 | 0.6076 | 0.7795 |
| No log | 32.6667 | 294 | 0.5572 | 0.5692 | 0.5572 | 0.7464 |
| No log | 32.8889 | 296 | 0.5694 | 0.5165 | 0.5694 | 0.7546 |
| No log | 33.1111 | 298 | 0.5578 | 0.5592 | 0.5578 | 0.7469 |
| No log | 33.3333 | 300 | 0.5777 | 0.4853 | 0.5777 | 0.7600 |
| No log | 33.5556 | 302 | 0.6294 | 0.5013 | 0.6294 | 0.7934 |
| No log | 33.7778 | 304 | 0.6642 | 0.4692 | 0.6642 | 0.8150 |
| No log | 34.0 | 306 | 0.6851 | 0.3981 | 0.6851 | 0.8277 |
| No log | 34.2222 | 308 | 0.6754 | 0.4454 | 0.6754 | 0.8218 |
| No log | 34.4444 | 310 | 0.6819 | 0.4708 | 0.6819 | 0.8258 |
| No log | 34.6667 | 312 | 0.7561 | 0.4662 | 0.7561 | 0.8696 |
| No log | 34.8889 | 314 | 0.7818 | 0.4486 | 0.7818 | 0.8842 |
| No log | 35.1111 | 316 | 0.7748 | 0.4486 | 0.7748 | 0.8802 |
| No log | 35.3333 | 318 | 0.7130 | 0.4822 | 0.7130 | 0.8444 |
| No log | 35.5556 | 320 | 0.6499 | 0.5657 | 0.6499 | 0.8062 |
| No log | 35.7778 | 322 | 0.6589 | 0.5341 | 0.6589 | 0.8117 |
| No log | 36.0 | 324 | 0.7499 | 0.4707 | 0.7499 | 0.8660 |
| No log | 36.2222 | 326 | 0.7761 | 0.4362 | 0.7761 | 0.8810 |
| No log | 36.4444 | 328 | 0.7095 | 0.4804 | 0.7095 | 0.8423 |
| No log | 36.6667 | 330 | 0.6311 | 0.5144 | 0.6311 | 0.7944 |
| No log | 36.8889 | 332 | 0.5911 | 0.5144 | 0.5911 | 0.7688 |
| No log | 37.1111 | 334 | 0.5464 | 0.5942 | 0.5464 | 0.7392 |
| No log | 37.3333 | 336 | 0.5765 | 0.5236 | 0.5765 | 0.7593 |
| No log | 37.5556 | 338 | 0.5954 | 0.5236 | 0.5954 | 0.7716 |
| No log | 37.7778 | 340 | 0.6276 | 0.5144 | 0.6276 | 0.7922 |
| No log | 38.0 | 342 | 0.6862 | 0.4880 | 0.6862 | 0.8284 |
| No log | 38.2222 | 344 | 0.7020 | 0.4880 | 0.7020 | 0.8378 |
| No log | 38.4444 | 346 | 0.7228 | 0.4966 | 0.7228 | 0.8502 |
| No log | 38.6667 | 348 | 0.6926 | 0.5013 | 0.6926 | 0.8322 |
| No log | 38.8889 | 350 | 0.6627 | 0.4719 | 0.6627 | 0.8140 |
| No log | 39.1111 | 352 | 0.6268 | 0.5383 | 0.6268 | 0.7917 |
| No log | 39.3333 | 354 | 0.6465 | 0.4982 | 0.6465 | 0.8041 |
| No log | 39.5556 | 356 | 0.6994 | 0.5065 | 0.6994 | 0.8363 |
| No log | 39.7778 | 358 | 0.7561 | 0.4708 | 0.7561 | 0.8695 |
| No log | 40.0 | 360 | 0.8164 | 0.4436 | 0.8164 | 0.9035 |
| No log | 40.2222 | 362 | 0.8081 | 0.4436 | 0.8081 | 0.8989 |
| No log | 40.4444 | 364 | 0.7443 | 0.3955 | 0.7443 | 0.8627 |
| No log | 40.6667 | 366 | 0.6866 | 0.5013 | 0.6866 | 0.8286 |
| No log | 40.8889 | 368 | 0.6817 | 0.4659 | 0.6817 | 0.8257 |
| No log | 41.1111 | 370 | 0.7290 | 0.4490 | 0.7290 | 0.8538 |
| No log | 41.3333 | 372 | 0.8276 | 0.4397 | 0.8276 | 0.9097 |
| No log | 41.5556 | 374 | 0.8765 | 0.4452 | 0.8765 | 0.9362 |
| No log | 41.7778 | 376 | 0.8115 | 0.4169 | 0.8115 | 0.9008 |
| No log | 42.0 | 378 | 0.6996 | 0.4154 | 0.6996 | 0.8364 |
| No log | 42.2222 | 380 | 0.6504 | 0.4965 | 0.6504 | 0.8065 |
| No log | 42.4444 | 382 | 0.6270 | 0.5058 | 0.6270 | 0.7918 |
| No log | 42.6667 | 384 | 0.6276 | 0.4965 | 0.6276 | 0.7922 |
| No log | 42.8889 | 386 | 0.6304 | 0.5190 | 0.6304 | 0.7939 |
| No log | 43.1111 | 388 | 0.6182 | 0.5450 | 0.6182 | 0.7863 |
| No log | 43.3333 | 390 | 0.6439 | 0.5013 | 0.6439 | 0.8024 |
| No log | 43.5556 | 392 | 0.6321 | 0.5657 | 0.6321 | 0.7950 |
| No log | 43.7778 | 394 | 0.5714 | 0.5659 | 0.5714 | 0.7559 |
| No log | 44.0 | 396 | 0.5834 | 0.5716 | 0.5834 | 0.7638 |
| No log | 44.2222 | 398 | 0.6521 | 0.5236 | 0.6521 | 0.8076 |
| No log | 44.4444 | 400 | 0.6331 | 0.5061 | 0.6331 | 0.7957 |
| No log | 44.6667 | 402 | 0.5713 | 0.5493 | 0.5713 | 0.7559 |
| No log | 44.8889 | 404 | 0.5368 | 0.5617 | 0.5368 | 0.7327 |
| No log | 45.1111 | 406 | 0.5045 | 0.6475 | 0.5045 | 0.7103 |
| No log | 45.3333 | 408 | 0.5068 | 0.6435 | 0.5068 | 0.7119 |
| No log | 45.5556 | 410 | 0.5556 | 0.5725 | 0.5556 | 0.7454 |
| No log | 45.7778 | 412 | 0.6030 | 0.5117 | 0.6030 | 0.7765 |
| No log | 46.0 | 414 | 0.6746 | 0.4684 | 0.6746 | 0.8214 |
| No log | 46.2222 | 416 | 0.7856 | 0.4662 | 0.7856 | 0.8864 |
| No log | 46.4444 | 418 | 0.7975 | 0.4813 | 0.7975 | 0.8930 |
| No log | 46.6667 | 420 | 0.7190 | 0.4926 | 0.7190 | 0.8480 |
| No log | 46.8889 | 422 | 0.6293 | 0.4717 | 0.6293 | 0.7933 |
| No log | 47.1111 | 424 | 0.5562 | 0.5627 | 0.5562 | 0.7458 |
| No log | 47.3333 | 426 | 0.5536 | 0.5627 | 0.5536 | 0.7440 |
| No log | 47.5556 | 428 | 0.5759 | 0.5523 | 0.5759 | 0.7589 |
| No log | 47.7778 | 430 | 0.6495 | 0.4783 | 0.6495 | 0.8059 |
| No log | 48.0 | 432 | 0.7423 | 0.4001 | 0.7423 | 0.8616 |
| No log | 48.2222 | 434 | 0.7558 | 0.3681 | 0.7558 | 0.8694 |
| No log | 48.4444 | 436 | 0.6988 | 0.4082 | 0.6988 | 0.8360 |
| No log | 48.6667 | 438 | 0.6335 | 0.5019 | 0.6335 | 0.7959 |
| No log | 48.8889 | 440 | 0.5622 | 0.5223 | 0.5622 | 0.7498 |
| No log | 49.1111 | 442 | 0.5267 | 0.5918 | 0.5267 | 0.7257 |
| No log | 49.3333 | 444 | 0.5152 | 0.6249 | 0.5152 | 0.7178 |
| No log | 49.5556 | 446 | 0.5558 | 0.5503 | 0.5558 | 0.7455 |
| No log | 49.7778 | 448 | 0.6651 | 0.5636 | 0.6651 | 0.8155 |
| No log | 50.0 | 450 | 0.8430 | 0.4369 | 0.8430 | 0.9181 |
| No log | 50.2222 | 452 | 0.9658 | 0.3900 | 0.9658 | 0.9827 |
| No log | 50.4444 | 454 | 0.9813 | 0.3900 | 0.9813 | 0.9906 |
| No log | 50.6667 | 456 | 0.9036 | 0.3919 | 0.9036 | 0.9506 |
| No log | 50.8889 | 458 | 0.8074 | 0.4197 | 0.8074 | 0.8986 |
| No log | 51.1111 | 460 | 0.7200 | 0.4841 | 0.7200 | 0.8485 |
| No log | 51.3333 | 462 | 0.6824 | 0.5516 | 0.6824 | 0.8261 |
| No log | 51.5556 | 464 | 0.6952 | 0.5844 | 0.6952 | 0.8338 |
| No log | 51.7778 | 466 | 0.7562 | 0.5089 | 0.7562 | 0.8696 |
| No log | 52.0 | 468 | 0.8319 | 0.4770 | 0.8319 | 0.9121 |
| No log | 52.2222 | 470 | 0.9193 | 0.4264 | 0.9193 | 0.9588 |
| No log | 52.4444 | 472 | 0.9185 | 0.4075 | 0.9185 | 0.9584 |
| No log | 52.6667 | 474 | 0.8440 | 0.4444 | 0.8440 | 0.9187 |
| No log | 52.8889 | 476 | 0.7316 | 0.4298 | 0.7316 | 0.8553 |
| No log | 53.1111 | 478 | 0.6188 | 0.5247 | 0.6188 | 0.7866 |
| No log | 53.3333 | 480 | 0.5554 | 0.5396 | 0.5554 | 0.7452 |
| No log | 53.5556 | 482 | 0.5506 | 0.5505 | 0.5506 | 0.7420 |
| No log | 53.7778 | 484 | 0.5887 | 0.5410 | 0.5887 | 0.7672 |
| No log | 54.0 | 486 | 0.6380 | 0.4444 | 0.6380 | 0.7988 |
| No log | 54.2222 | 488 | 0.7301 | 0.4247 | 0.7301 | 0.8545 |
| No log | 54.4444 | 490 | 0.8116 | 0.4158 | 0.8116 | 0.9009 |
| No log | 54.6667 | 492 | 0.8454 | 0.4221 | 0.8454 | 0.9194 |
| No log | 54.8889 | 494 | 0.8212 | 0.4369 | 0.8212 | 0.9062 |
| No log | 55.1111 | 496 | 0.7851 | 0.4649 | 0.7851 | 0.8861 |
| No log | 55.3333 | 498 | 0.7483 | 0.5588 | 0.7483 | 0.8651 |
| 0.2662 | 55.5556 | 500 | 0.7013 | 0.5779 | 0.7013 | 0.8374 |
| 0.2662 | 55.7778 | 502 | 0.7025 | 0.5931 | 0.7025 | 0.8381 |
| 0.2662 | 56.0 | 504 | 0.7131 | 0.5931 | 0.7131 | 0.8444 |
| 0.2662 | 56.2222 | 506 | 0.6986 | 0.5922 | 0.6986 | 0.8358 |
| 0.2662 | 56.4444 | 508 | 0.6810 | 0.5978 | 0.6810 | 0.8252 |
| 0.2662 | 56.6667 | 510 | 0.6861 | 0.5636 | 0.6861 | 0.8283 |
| 0.2662 | 56.8889 | 512 | 0.7351 | 0.4523 | 0.7351 | 0.8574 |
| 0.2662 | 57.1111 | 514 | 0.8028 | 0.4016 | 0.8028 | 0.8960 |
| 0.2662 | 57.3333 | 516 | 0.8537 | 0.4011 | 0.8537 | 0.9240 |
| 0.2662 | 57.5556 | 518 | 0.8402 | 0.4011 | 0.8402 | 0.9166 |


### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
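The config declares a single-label `BertForSequenceClassification` head with `"problem_type": "regression"`, so the checkpoint returns one scalar score per input. A minimal, hedged inference sketch; the repo id is inferred from the committer and model name above and may need adjusting:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repo id (committer namespace + model name); adjust if the repo lives elsewhere.
repo_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k3_task8_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze(-1)  # one regression score per example
print(score.item())
```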
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f04bcdb16482d67fbd0086716c7cc666bc8ae51136cbb5af3a42fe6d19801915
size 540799996
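This entry is a Git LFS pointer (spec version, sha256 oid, byte size); the roughly 540 MB weight file itself lives in LFS storage. `from_pretrained` resolves it transparently, or it can be fetched directly with `huggingface_hub`; a small sketch using the same assumed repo id as above:

```python
from huggingface_hub import hf_hub_download

# Downloads the safetensors file that the LFS pointer references and returns its local path.
path = hf_hub_download(
    repo_id="MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k3_task8_organization",
    filename="model.safetensors",
)
print(path)
```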
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c39ec062f99b391655be626117f47d0a5ec8fec1715f1c2488fc6a27105ff74e
size 5240