MayBashendy committed on
Commit 303127f · verified · 1 Parent(s): 01f969a

Training in progress, step 500

Files changed (4)
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task5_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task5_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set (a metric-computation sketch follows the list):
- Loss: 0.5722
- Qwk: 0.6488
- Mse: 0.5722
- Rmse: 0.7565
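
Note that the eval Loss equals the Mse, consistent with the MSE loss the Trainer uses for `problem_type: "regression"` (see config.json below). A minimal sketch of how these metrics could be computed, assuming Qwk is Cohen's quadratic weighted kappa over scores rounded to integers (this is an assumption, not the exact `compute_metrics` used for this run):

```python
# Sketch: Qwk on rounded integer scores; Mse/Rmse on raw regression outputs.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Example: perfect predictions give qwk = 1.0 and mse = rmse = 0.0.
print(compute_metrics(np.array([1.0, 2.0, 3.0]), np.array([1.0, 2.0, 3.0])))
```
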
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching training-setup sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
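
A minimal sketch of a training setup matching these hyperparameters; the dataset loading, tokenization, and metric wiring are assumptions and are left out, so treat this as illustrative rather than the exact script behind this commit:

```python
# Sketch: reproduce the hyperparameters listed above.
from transformers import (
    AutoTokenizer,
    BertForSequenceClassification,
    Trainer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = BertForSequenceClassification.from_pretrained(
    base,
    num_labels=1,               # single regression head, matching config.json below
    problem_type="regression",  # Trainer then optimizes MSE loss
)

args = TrainingArguments(
    output_dir="ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task5_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",  # Adam betas=(0.9, 0.999) and eps=1e-8 are the defaults
    num_train_epochs=100,
)

# Assumed wiring (train_ds / eval_ds / compute_metrics not in this commit):
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   tokenizer=tokenizer, compute_metrics=compute_metrics)
# trainer.train()
```
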
### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.1667 | 2 | 3.8728 | -0.0170 | 3.8728 | 1.9679 |
| No log | 0.3333 | 4 | 2.6239 | 0.0544 | 2.6239 | 1.6198 |
| No log | 0.5 | 6 | 1.3255 | 0.0380 | 1.3255 | 1.1513 |
| No log | 0.6667 | 8 | 1.0787 | 0.2588 | 1.0787 | 1.0386 |
| No log | 0.8333 | 10 | 1.1564 | 0.1740 | 1.1564 | 1.0754 |
| No log | 1.0 | 12 | 1.2248 | 0.1740 | 1.2248 | 1.1067 |
| No log | 1.1667 | 14 | 1.7353 | -0.0130 | 1.7353 | 1.3173 |
| No log | 1.3333 | 16 | 1.6921 | 0.0365 | 1.6921 | 1.3008 |
| No log | 1.5 | 18 | 1.2117 | 0.2175 | 1.2117 | 1.1008 |
| No log | 1.6667 | 20 | 1.1073 | 0.2503 | 1.1073 | 1.0523 |
| No log | 1.8333 | 22 | 1.3909 | 0.1267 | 1.3909 | 1.1794 |
| No log | 2.0 | 24 | 1.4579 | 0.1901 | 1.4579 | 1.2074 |
| No log | 2.1667 | 26 | 1.0873 | 0.2304 | 1.0873 | 1.0427 |
| No log | 2.3333 | 28 | 1.0095 | 0.3257 | 1.0095 | 1.0048 |
| No log | 2.5 | 30 | 1.0450 | 0.3160 | 1.0450 | 1.0223 |
| No log | 2.6667 | 32 | 1.2267 | 0.3093 | 1.2267 | 1.1076 |
| No log | 2.8333 | 34 | 0.9818 | 0.3779 | 0.9818 | 0.9909 |
| No log | 3.0 | 36 | 0.8206 | 0.5002 | 0.8206 | 0.9058 |
| No log | 3.1667 | 38 | 0.8024 | 0.5330 | 0.8024 | 0.8958 |
| No log | 3.3333 | 40 | 0.7935 | 0.5188 | 0.7935 | 0.8908 |
| No log | 3.5 | 42 | 0.7753 | 0.5638 | 0.7753 | 0.8805 |
| No log | 3.6667 | 44 | 0.7787 | 0.5393 | 0.7787 | 0.8824 |
| No log | 3.8333 | 46 | 0.7684 | 0.5500 | 0.7684 | 0.8766 |
| No log | 4.0 | 48 | 0.7396 | 0.5545 | 0.7396 | 0.8600 |
| No log | 4.1667 | 50 | 0.8661 | 0.4720 | 0.8661 | 0.9307 |
| No log | 4.3333 | 52 | 0.9420 | 0.4898 | 0.9420 | 0.9706 |
| No log | 4.5 | 54 | 0.9381 | 0.5106 | 0.9381 | 0.9685 |
| No log | 4.6667 | 56 | 0.8157 | 0.5069 | 0.8157 | 0.9032 |
| No log | 4.8333 | 58 | 0.7567 | 0.5662 | 0.7567 | 0.8699 |
| No log | 5.0 | 60 | 0.7743 | 0.5490 | 0.7743 | 0.8799 |
| No log | 5.1667 | 62 | 0.7784 | 0.5490 | 0.7784 | 0.8823 |
| No log | 5.3333 | 64 | 0.7848 | 0.5784 | 0.7848 | 0.8859 |
| No log | 5.5 | 66 | 0.8824 | 0.5017 | 0.8824 | 0.9393 |
| No log | 5.6667 | 68 | 0.9304 | 0.5145 | 0.9304 | 0.9646 |
| No log | 5.8333 | 70 | 0.7821 | 0.4920 | 0.7821 | 0.8844 |
| No log | 6.0 | 72 | 1.0293 | 0.4314 | 1.0293 | 1.0145 |
| No log | 6.1667 | 74 | 1.0318 | 0.3711 | 1.0318 | 1.0158 |
| No log | 6.3333 | 76 | 0.8934 | 0.4748 | 0.8934 | 0.9452 |
| No log | 6.5 | 78 | 0.7655 | 0.6046 | 0.7655 | 0.8749 |
| No log | 6.6667 | 80 | 0.7875 | 0.5530 | 0.7875 | 0.8874 |
| No log | 6.8333 | 82 | 0.7417 | 0.5626 | 0.7417 | 0.8612 |
| No log | 7.0 | 84 | 0.8057 | 0.5540 | 0.8057 | 0.8976 |
| No log | 7.1667 | 86 | 0.8043 | 0.5540 | 0.8043 | 0.8968 |
| No log | 7.3333 | 88 | 0.7320 | 0.5955 | 0.7320 | 0.8556 |
| No log | 7.5 | 90 | 0.8484 | 0.4140 | 0.8484 | 0.9211 |
| No log | 7.6667 | 92 | 0.8145 | 0.4671 | 0.8145 | 0.9025 |
| No log | 7.8333 | 94 | 0.7213 | 0.6227 | 0.7213 | 0.8493 |
| No log | 8.0 | 96 | 0.7428 | 0.6147 | 0.7428 | 0.8618 |
| No log | 8.1667 | 98 | 0.7150 | 0.6415 | 0.7150 | 0.8456 |
| No log | 8.3333 | 100 | 0.8460 | 0.3966 | 0.8460 | 0.9198 |
| No log | 8.5 | 102 | 1.0128 | 0.4200 | 1.0128 | 1.0064 |
| No log | 8.6667 | 104 | 0.8045 | 0.3833 | 0.8045 | 0.8969 |
| No log | 8.8333 | 106 | 0.6650 | 0.6537 | 0.6650 | 0.8155 |
| No log | 9.0 | 108 | 0.6467 | 0.6229 | 0.6467 | 0.8042 |
| No log | 9.1667 | 110 | 0.6301 | 0.6364 | 0.6301 | 0.7938 |
| No log | 9.3333 | 112 | 0.6485 | 0.6564 | 0.6485 | 0.8053 |
| No log | 9.5 | 114 | 0.6091 | 0.6364 | 0.6091 | 0.7804 |
| No log | 9.6667 | 116 | 0.6600 | 0.6167 | 0.6600 | 0.8124 |
| No log | 9.8333 | 118 | 0.6347 | 0.6043 | 0.6347 | 0.7967 |
| No log | 10.0 | 120 | 0.6036 | 0.6390 | 0.6036 | 0.7769 |
| No log | 10.1667 | 122 | 0.6255 | 0.6426 | 0.6255 | 0.7909 |
| No log | 10.3333 | 124 | 0.6051 | 0.6584 | 0.6051 | 0.7779 |
| No log | 10.5 | 126 | 0.6140 | 0.6354 | 0.6140 | 0.7836 |
| No log | 10.6667 | 128 | 0.6623 | 0.6246 | 0.6623 | 0.8138 |
| No log | 10.8333 | 130 | 0.9290 | 0.5177 | 0.9290 | 0.9639 |
| No log | 11.0 | 132 | 1.0516 | 0.4177 | 1.0516 | 1.0255 |
| No log | 11.1667 | 134 | 0.8502 | 0.5164 | 0.8502 | 0.9221 |
| No log | 11.3333 | 136 | 0.6744 | 0.5301 | 0.6744 | 0.8212 |
| No log | 11.5 | 138 | 0.6629 | 0.5850 | 0.6629 | 0.8142 |
| No log | 11.6667 | 140 | 0.6463 | 0.6198 | 0.6463 | 0.8039 |
| No log | 11.8333 | 142 | 0.6505 | 0.5783 | 0.6505 | 0.8065 |
| No log | 12.0 | 144 | 0.7391 | 0.5804 | 0.7391 | 0.8597 |
| No log | 12.1667 | 146 | 0.7566 | 0.5781 | 0.7566 | 0.8698 |
| No log | 12.3333 | 148 | 0.6321 | 0.6461 | 0.6321 | 0.7951 |
| No log | 12.5 | 150 | 0.6532 | 0.6045 | 0.6532 | 0.8082 |
| No log | 12.6667 | 152 | 0.7301 | 0.6468 | 0.7301 | 0.8545 |
| No log | 12.8333 | 154 | 0.6477 | 0.6455 | 0.6477 | 0.8048 |
| No log | 13.0 | 156 | 0.6489 | 0.6115 | 0.6489 | 0.8056 |
| No log | 13.1667 | 158 | 0.7156 | 0.6499 | 0.7156 | 0.8460 |
| No log | 13.3333 | 160 | 0.6876 | 0.6394 | 0.6876 | 0.8292 |
| No log | 13.5 | 162 | 0.6307 | 0.6390 | 0.6307 | 0.7942 |
| No log | 13.6667 | 164 | 0.6670 | 0.6623 | 0.6670 | 0.8167 |
| No log | 13.8333 | 166 | 0.6412 | 0.6407 | 0.6412 | 0.8008 |
| No log | 14.0 | 168 | 0.6781 | 0.6032 | 0.6781 | 0.8235 |
| No log | 14.1667 | 170 | 0.7783 | 0.5659 | 0.7783 | 0.8822 |
| No log | 14.3333 | 172 | 0.7719 | 0.5650 | 0.7719 | 0.8786 |
| No log | 14.5 | 174 | 0.6748 | 0.6516 | 0.6748 | 0.8214 |
| No log | 14.6667 | 176 | 0.6474 | 0.6215 | 0.6474 | 0.8046 |
| No log | 14.8333 | 178 | 0.6725 | 0.6615 | 0.6725 | 0.8200 |
| No log | 15.0 | 180 | 0.7653 | 0.5719 | 0.7653 | 0.8748 |
| No log | 15.1667 | 182 | 0.7392 | 0.5895 | 0.7392 | 0.8598 |
| No log | 15.3333 | 184 | 0.6462 | 0.6115 | 0.6462 | 0.8039 |
| No log | 15.5 | 186 | 0.6698 | 0.6269 | 0.6698 | 0.8184 |
| No log | 15.6667 | 188 | 0.6987 | 0.5602 | 0.6987 | 0.8359 |
| No log | 15.8333 | 190 | 0.6987 | 0.5416 | 0.6987 | 0.8359 |
| No log | 16.0 | 192 | 0.6352 | 0.6380 | 0.6352 | 0.7970 |
| No log | 16.1667 | 194 | 0.6463 | 0.6393 | 0.6463 | 0.8039 |
| No log | 16.3333 | 196 | 0.6480 | 0.6828 | 0.6480 | 0.8050 |
| No log | 16.5 | 198 | 0.6240 | 0.6721 | 0.6240 | 0.7900 |
| No log | 16.6667 | 200 | 0.6613 | 0.6499 | 0.6613 | 0.8132 |
| No log | 16.8333 | 202 | 0.7753 | 0.5937 | 0.7753 | 0.8805 |
| No log | 17.0 | 204 | 0.7638 | 0.6356 | 0.7638 | 0.8740 |
| No log | 17.1667 | 206 | 0.6664 | 0.6525 | 0.6664 | 0.8163 |
| No log | 17.3333 | 208 | 0.6297 | 0.6259 | 0.6297 | 0.7936 |
| No log | 17.5 | 210 | 0.6244 | 0.6154 | 0.6244 | 0.7902 |
| No log | 17.6667 | 212 | 0.6769 | 0.6525 | 0.6769 | 0.8227 |
| No log | 17.8333 | 214 | 0.8226 | 0.5078 | 0.8226 | 0.9070 |
| No log | 18.0 | 216 | 0.8814 | 0.4693 | 0.8814 | 0.9388 |
| No log | 18.1667 | 218 | 0.8226 | 0.5382 | 0.8226 | 0.9070 |
| No log | 18.3333 | 220 | 0.6472 | 0.6602 | 0.6472 | 0.8045 |
| No log | 18.5 | 222 | 0.6118 | 0.6292 | 0.6118 | 0.7822 |
| No log | 18.6667 | 224 | 0.6067 | 0.6733 | 0.6067 | 0.7789 |
| No log | 18.8333 | 226 | 0.6121 | 0.6292 | 0.6121 | 0.7824 |
| No log | 19.0 | 228 | 0.6315 | 0.6097 | 0.6315 | 0.7947 |
| No log | 19.1667 | 230 | 0.6484 | 0.5561 | 0.6484 | 0.8052 |
| No log | 19.3333 | 232 | 0.5821 | 0.6292 | 0.5821 | 0.7630 |
| No log | 19.5 | 234 | 0.5561 | 0.6970 | 0.5561 | 0.7457 |
| No log | 19.6667 | 236 | 0.5627 | 0.7295 | 0.5627 | 0.7502 |
| No log | 19.8333 | 238 | 0.5384 | 0.6951 | 0.5384 | 0.7338 |
| No log | 20.0 | 240 | 0.5506 | 0.6869 | 0.5506 | 0.7420 |
| No log | 20.1667 | 242 | 0.5492 | 0.6869 | 0.5492 | 0.7411 |
| No log | 20.3333 | 244 | 0.5462 | 0.7064 | 0.5462 | 0.7391 |
| No log | 20.5 | 246 | 0.5932 | 0.6581 | 0.5932 | 0.7702 |
| No log | 20.6667 | 248 | 0.7056 | 0.5030 | 0.7056 | 0.8400 |
| No log | 20.8333 | 250 | 0.7243 | 0.4596 | 0.7243 | 0.8511 |
| No log | 21.0 | 252 | 0.6954 | 0.5348 | 0.6954 | 0.8339 |
| No log | 21.1667 | 254 | 0.6348 | 0.6234 | 0.6348 | 0.7967 |
| No log | 21.3333 | 256 | 0.5937 | 0.6572 | 0.5937 | 0.7705 |
| No log | 21.5 | 258 | 0.6168 | 0.6572 | 0.6168 | 0.7854 |
| No log | 21.6667 | 260 | 0.6508 | 0.6307 | 0.6508 | 0.8067 |
| No log | 21.8333 | 262 | 0.7009 | 0.5951 | 0.7009 | 0.8372 |
| No log | 22.0 | 264 | 0.6932 | 0.6260 | 0.6932 | 0.8326 |
| No log | 22.1667 | 266 | 0.5888 | 0.6383 | 0.5888 | 0.7673 |
| No log | 22.3333 | 268 | 0.5613 | 0.6488 | 0.5613 | 0.7492 |
| No log | 22.5 | 270 | 0.5587 | 0.6488 | 0.5587 | 0.7474 |
| No log | 22.6667 | 272 | 0.5537 | 0.6644 | 0.5537 | 0.7441 |
| No log | 22.8333 | 274 | 0.5826 | 0.6499 | 0.5826 | 0.7633 |
| No log | 23.0 | 276 | 0.6409 | 0.6484 | 0.6409 | 0.8006 |
| No log | 23.1667 | 278 | 0.6336 | 0.6707 | 0.6336 | 0.7960 |
| No log | 23.3333 | 280 | 0.5785 | 0.6581 | 0.5785 | 0.7606 |
| No log | 23.5 | 282 | 0.5593 | 0.6488 | 0.5593 | 0.7479 |
| No log | 23.6667 | 284 | 0.5759 | 0.6629 | 0.5759 | 0.7589 |
| No log | 23.8333 | 286 | 0.5926 | 0.6946 | 0.5926 | 0.7698 |
| No log | 24.0 | 288 | 0.5719 | 0.6559 | 0.5719 | 0.7562 |
| No log | 24.1667 | 290 | 0.5693 | 0.6196 | 0.5693 | 0.7545 |
| No log | 24.3333 | 292 | 0.5688 | 0.6517 | 0.5688 | 0.7542 |
| No log | 24.5 | 294 | 0.5635 | 0.6317 | 0.5635 | 0.7507 |
| No log | 24.6667 | 296 | 0.5655 | 0.6317 | 0.5655 | 0.7520 |
| No log | 24.8333 | 298 | 0.5728 | 0.6409 | 0.5728 | 0.7568 |
| No log | 25.0 | 300 | 0.5731 | 0.6675 | 0.5731 | 0.7570 |
| No log | 25.1667 | 302 | 0.5759 | 0.6479 | 0.5759 | 0.7589 |
| No log | 25.3333 | 304 | 0.5992 | 0.7088 | 0.5992 | 0.7741 |
| No log | 25.5 | 306 | 0.6026 | 0.7308 | 0.6026 | 0.7763 |
| No log | 25.6667 | 308 | 0.6128 | 0.6490 | 0.6128 | 0.7828 |
| No log | 25.8333 | 310 | 0.6430 | 0.6595 | 0.6430 | 0.8019 |
| No log | 26.0 | 312 | 0.6346 | 0.6490 | 0.6346 | 0.7966 |
| No log | 26.1667 | 314 | 0.6066 | 0.6572 | 0.6066 | 0.7789 |
| No log | 26.3333 | 316 | 0.5941 | 0.7089 | 0.5941 | 0.7708 |
| No log | 26.5 | 318 | 0.6131 | 0.6779 | 0.6131 | 0.7830 |
| No log | 26.6667 | 320 | 0.6104 | 0.6824 | 0.6104 | 0.7813 |
| No log | 26.8333 | 322 | 0.6139 | 0.6286 | 0.6139 | 0.7835 |
| No log | 27.0 | 324 | 0.6041 | 0.6639 | 0.6041 | 0.7773 |
| No log | 27.1667 | 326 | 0.6028 | 0.6824 | 0.6028 | 0.7764 |
| No log | 27.3333 | 328 | 0.6127 | 0.6908 | 0.6127 | 0.7827 |
| No log | 27.5 | 330 | 0.6167 | 0.6779 | 0.6167 | 0.7853 |
| No log | 27.6667 | 332 | 0.6147 | 0.6874 | 0.6147 | 0.7840 |
| No log | 27.8333 | 334 | 0.6147 | 0.6874 | 0.6147 | 0.7840 |
| No log | 28.0 | 336 | 0.6158 | 0.6880 | 0.6158 | 0.7847 |
| No log | 28.1667 | 338 | 0.6207 | 0.6693 | 0.6207 | 0.7879 |
| No log | 28.3333 | 340 | 0.6196 | 0.6693 | 0.6196 | 0.7871 |
| No log | 28.5 | 342 | 0.5933 | 0.7042 | 0.5933 | 0.7703 |
| No log | 28.6667 | 344 | 0.6088 | 0.6653 | 0.6088 | 0.7802 |
| No log | 28.8333 | 346 | 0.7269 | 0.5914 | 0.7269 | 0.8526 |
| No log | 29.0 | 348 | 0.7538 | 0.6002 | 0.7538 | 0.8682 |
| No log | 29.1667 | 350 | 0.6784 | 0.6707 | 0.6784 | 0.8236 |
| No log | 29.3333 | 352 | 0.5976 | 0.6393 | 0.5976 | 0.7731 |
| No log | 29.5 | 354 | 0.5772 | 0.6409 | 0.5772 | 0.7598 |
| No log | 29.6667 | 356 | 0.5821 | 0.6409 | 0.5821 | 0.7630 |
| No log | 29.8333 | 358 | 0.5890 | 0.6409 | 0.5890 | 0.7675 |
| No log | 30.0 | 360 | 0.5789 | 0.6517 | 0.5789 | 0.7609 |
| No log | 30.1667 | 362 | 0.5665 | 0.6517 | 0.5665 | 0.7526 |
| No log | 30.3333 | 364 | 0.5627 | 0.6627 | 0.5627 | 0.7501 |
| No log | 30.5 | 366 | 0.5629 | 0.6627 | 0.5629 | 0.7503 |
| No log | 30.6667 | 368 | 0.5581 | 0.6509 | 0.5581 | 0.7471 |
| No log | 30.8333 | 370 | 0.5582 | 0.6649 | 0.5582 | 0.7471 |
| No log | 31.0 | 372 | 0.5741 | 0.6498 | 0.5741 | 0.7577 |
| No log | 31.1667 | 374 | 0.5741 | 0.6498 | 0.5741 | 0.7577 |
| No log | 31.3333 | 376 | 0.5619 | 0.6641 | 0.5619 | 0.7496 |
| No log | 31.5 | 378 | 0.5539 | 0.6962 | 0.5539 | 0.7443 |
| No log | 31.6667 | 380 | 0.5600 | 0.6507 | 0.5600 | 0.7483 |
| No log | 31.8333 | 382 | 0.5693 | 0.6544 | 0.5693 | 0.7545 |
| No log | 32.0 | 384 | 0.5627 | 0.6689 | 0.5627 | 0.7501 |
| No log | 32.1667 | 386 | 0.5709 | 0.6733 | 0.5709 | 0.7556 |
| No log | 32.3333 | 388 | 0.5770 | 0.6674 | 0.5770 | 0.7596 |
| No log | 32.5 | 390 | 0.5955 | 0.6567 | 0.5955 | 0.7717 |
| No log | 32.6667 | 392 | 0.6240 | 0.6603 | 0.6240 | 0.7899 |
| No log | 32.8333 | 394 | 0.5937 | 0.6427 | 0.5937 | 0.7705 |
| No log | 33.0 | 396 | 0.5697 | 0.6164 | 0.5697 | 0.7548 |
| No log | 33.1667 | 398 | 0.5730 | 0.6185 | 0.5730 | 0.7569 |
| No log | 33.3333 | 400 | 0.5768 | 0.6400 | 0.5768 | 0.7595 |
| No log | 33.5 | 402 | 0.5918 | 0.5759 | 0.5918 | 0.7693 |
| No log | 33.6667 | 404 | 0.5899 | 0.6358 | 0.5899 | 0.7680 |
| No log | 33.8333 | 406 | 0.5899 | 0.6311 | 0.5899 | 0.7680 |
| No log | 34.0 | 408 | 0.5966 | 0.6424 | 0.5966 | 0.7724 |
| No log | 34.1667 | 410 | 0.6045 | 0.6393 | 0.6045 | 0.7775 |
| No log | 34.3333 | 412 | 0.6076 | 0.6260 | 0.6076 | 0.7795 |
| No log | 34.5 | 414 | 0.6067 | 0.6260 | 0.6067 | 0.7789 |
| No log | 34.6667 | 416 | 0.5900 | 0.6745 | 0.5900 | 0.7681 |
| No log | 34.8333 | 418 | 0.5969 | 0.6778 | 0.5969 | 0.7726 |
| No log | 35.0 | 420 | 0.6049 | 0.6207 | 0.6049 | 0.7777 |
| No log | 35.1667 | 422 | 0.6046 | 0.6207 | 0.6046 | 0.7775 |
| No log | 35.3333 | 424 | 0.6106 | 0.6207 | 0.6106 | 0.7814 |
| No log | 35.5 | 426 | 0.6124 | 0.5854 | 0.6124 | 0.7826 |
| No log | 35.6667 | 428 | 0.6459 | 0.5993 | 0.6459 | 0.8037 |
| No log | 35.8333 | 430 | 0.6933 | 0.5455 | 0.6933 | 0.8326 |
| No log | 36.0 | 432 | 0.6690 | 0.5835 | 0.6690 | 0.8179 |
| No log | 36.1667 | 434 | 0.5907 | 0.6993 | 0.5907 | 0.7686 |
| No log | 36.3333 | 436 | 0.5646 | 0.6805 | 0.5646 | 0.7514 |
| No log | 36.5 | 438 | 0.5674 | 0.6924 | 0.5674 | 0.7533 |
| No log | 36.6667 | 440 | 0.5711 | 0.6805 | 0.5711 | 0.7557 |
| No log | 36.8333 | 442 | 0.5778 | 0.6805 | 0.5778 | 0.7601 |
| No log | 37.0 | 444 | 0.5895 | 0.6286 | 0.5895 | 0.7678 |
| No log | 37.1667 | 446 | 0.6037 | 0.6165 | 0.6037 | 0.7770 |
| No log | 37.3333 | 448 | 0.6162 | 0.6133 | 0.6162 | 0.7850 |
| No log | 37.5 | 450 | 0.6187 | 0.6473 | 0.6187 | 0.7866 |
| No log | 37.6667 | 452 | 0.5903 | 0.6488 | 0.5903 | 0.7683 |
| No log | 37.8333 | 454 | 0.5963 | 0.6742 | 0.5963 | 0.7722 |
| No log | 38.0 | 456 | 0.6565 | 0.5665 | 0.6565 | 0.8102 |
| No log | 38.1667 | 458 | 0.6675 | 0.5770 | 0.6675 | 0.8170 |
| No log | 38.3333 | 460 | 0.6320 | 0.6368 | 0.6320 | 0.7950 |
| No log | 38.5 | 462 | 0.6197 | 0.6368 | 0.6197 | 0.7872 |
| No log | 38.6667 | 464 | 0.6148 | 0.6368 | 0.6148 | 0.7841 |
| No log | 38.8333 | 466 | 0.6501 | 0.6353 | 0.6501 | 0.8063 |
| No log | 39.0 | 468 | 0.6889 | 0.5451 | 0.6889 | 0.8300 |
| No log | 39.1667 | 470 | 0.7078 | 0.5558 | 0.7078 | 0.8413 |
| No log | 39.3333 | 472 | 0.6565 | 0.5665 | 0.6565 | 0.8102 |
| No log | 39.5 | 474 | 0.6081 | 0.6822 | 0.6081 | 0.7798 |
| No log | 39.6667 | 476 | 0.5969 | 0.6911 | 0.5969 | 0.7726 |
| No log | 39.8333 | 478 | 0.6001 | 0.6796 | 0.6001 | 0.7746 |
| No log | 40.0 | 480 | 0.5944 | 0.6720 | 0.5944 | 0.7710 |
| No log | 40.1667 | 482 | 0.5862 | 0.6720 | 0.5862 | 0.7656 |
| No log | 40.3333 | 484 | 0.5792 | 0.6916 | 0.5792 | 0.7611 |
| No log | 40.5 | 486 | 0.5786 | 0.6966 | 0.5786 | 0.7607 |
| No log | 40.6667 | 488 | 0.5899 | 0.6691 | 0.5899 | 0.7681 |
| No log | 40.8333 | 490 | 0.6163 | 0.6699 | 0.6163 | 0.7851 |
| No log | 41.0 | 492 | 0.6086 | 0.6795 | 0.6086 | 0.7801 |
| No log | 41.1667 | 494 | 0.5795 | 0.6349 | 0.5795 | 0.7613 |
| No log | 41.3333 | 496 | 0.5710 | 0.6699 | 0.5710 | 0.7556 |
| No log | 41.5 | 498 | 0.5721 | 0.6584 | 0.5721 | 0.7564 |
| 0.2273 | 41.6667 | 500 | 0.5742 | 0.6584 | 0.5742 | 0.7577 |
| 0.2273 | 41.8333 | 502 | 0.5792 | 0.6374 | 0.5792 | 0.7611 |
| 0.2273 | 42.0 | 504 | 0.5833 | 0.6488 | 0.5833 | 0.7638 |
| 0.2273 | 42.1667 | 506 | 0.5771 | 0.6488 | 0.5771 | 0.7597 |
| 0.2273 | 42.3333 | 508 | 0.5716 | 0.6488 | 0.5716 | 0.7561 |
| 0.2273 | 42.5 | 510 | 0.5722 | 0.6488 | 0.5722 | 0.7565 |


### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
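
The config declares a single-label `BertForSequenceClassification` head with `problem_type: "regression"`, so the model emits one continuous score per input. A minimal inference sketch; the repo id is inferred from the committer namespace and the model name, and the tokenizer is loaded from the base model because this commit adds no tokenizer files, so treat both as assumptions:

```python
# Sketch: score one essay with the fine-tuned regressor.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: committer namespace + model name from the card above.
repo_id = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task5_organization"
tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    # with problem_type "regression" and one label, logits is a (1, 1) tensor
    score = model(**inputs).logits.squeeze().item()
print(score)
```
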
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7997a9449042e7c676751c7fa42846d00563cfecb77413677d8056dabd49dc77
size 540799996
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d50019257f808506c973fc58da897154f1b6c6503fca0bcfb8458d5b7eb4fab9
size 5304
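
These two files are Git LFS pointers: the repository stores only the `sha256` oid and byte size, while the actual blobs live on the LFS backend. A sketch of downloading the weights with `huggingface_hub` and checking them against the pointer above (the repo id is again an assumption):

```python
# Sketch: fetch model.safetensors and verify its sha256 against the LFS pointer.
import hashlib
from huggingface_hub import hf_hub_download

# Assumption: committer namespace + model name from the card above.
repo_id = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task5_organization"
path = hf_hub_download(repo_id=repo_id, filename="model.safetensors")

h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

expected = "7997a9449042e7c676751c7fa42846d00563cfecb77413677d8056dabd49dc77"
assert h.hexdigest() == expected, "checksum mismatch with LFS pointer"
print("verified:", path)
```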