MayBashendy committed
Commit 7b76755 · verified · 1 parent: 5715928

Training in progress, step 500
Files changed (4)
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task2_organization
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task2_organization
+
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8361
+ - Qwk: 0.5411
+ - Mse: 0.8361
+ - Rmse: 0.9144
+
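+ Here, Qwk is quadratic weighted kappa (agreement between predicted and gold discrete scores) and Rmse is simply the square root of Mse; the Loss and Mse columns coincide throughout the results table because the model is trained as a regressor (`problem_type: "regression"` in config.json) with a mean-squared-error objective. A minimal sketch of how these metrics relate, with hypothetical `preds`/`labels` arrays and an assumed round-to-integer step before computing kappa:
+
+ ```python
+ import numpy as np
+ from sklearn.metrics import cohen_kappa_score, mean_squared_error
+
+ preds = np.array([2.7, 1.2, 0.4, 2.1])  # hypothetical continuous model outputs
+ labels = np.array([3, 1, 0, 2])         # hypothetical gold organization scores
+
+ mse = mean_squared_error(labels, preds)
+ rmse = np.sqrt(mse)  # Rmse is the square root of Mse
+ # Kappa compares discrete ratings, so the continuous outputs are rounded first.
+ qwk = cohen_kappa_score(labels, np.rint(preds).astype(int), weights="quadratic")
+ print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
+ ```
+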
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
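+ A hedged reconstruction of these settings as Hugging Face `TrainingArguments` (the output directory is an assumption taken from the run name; the eval/save cadence and dataset handling are not recorded here, and the Adam betas and epsilon shown are also the library defaults):
+
+ ```python
+ from transformers import TrainingArguments
+
+ training_args = TrainingArguments(
+     # Assumed output path; the run name comes from this model card.
+     output_dir="ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task2_organization",
+     learning_rate=2e-5,
+     per_device_train_batch_size=8,
+     per_device_eval_batch_size=8,
+     seed=42,
+     num_train_epochs=100,
+     lr_scheduler_type="linear",
+     adam_beta1=0.9,
+     adam_beta2=0.999,
+     adam_epsilon=1e-8,
+ )
+ ```
+
+ Note that the run is configured for 100 epochs, but the table below stops at epoch 10.0 (step 510), consistent with the commit message "Training in progress, step 500".
+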
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.0392 | 2 | 4.3390 | 0.0163 | 4.3390 | 2.0830 |
+ | No log | 0.0784 | 4 | 3.1711 | 0.0080 | 3.1711 | 1.7808 |
+ | No log | 0.1176 | 6 | 1.8291 | 0.0629 | 1.8291 | 1.3525 |
+ | No log | 0.1569 | 8 | 1.3150 | 0.1021 | 1.3150 | 1.1467 |
+ | No log | 0.1961 | 10 | 1.4439 | -0.0697 | 1.4439 | 1.2016 |
+ | No log | 0.2353 | 12 | 1.1873 | 0.1585 | 1.1873 | 1.0896 |
+ | No log | 0.2745 | 14 | 1.2630 | 0.0974 | 1.2630 | 1.1238 |
+ | No log | 0.3137 | 16 | 1.2934 | 0.1354 | 1.2934 | 1.1373 |
+ | No log | 0.3529 | 18 | 1.3609 | -0.0451 | 1.3609 | 1.1666 |
+ | No log | 0.3922 | 20 | 1.3466 | -0.0300 | 1.3466 | 1.1604 |
+ | No log | 0.4314 | 22 | 1.3356 | 0.0339 | 1.3356 | 1.1557 |
+ | No log | 0.4706 | 24 | 1.1176 | 0.3338 | 1.1176 | 1.0572 |
+ | No log | 0.5098 | 26 | 1.0756 | 0.2936 | 1.0756 | 1.0371 |
+ | No log | 0.5490 | 28 | 0.9859 | 0.3307 | 0.9859 | 0.9929 |
+ | No log | 0.5882 | 30 | 0.9643 | 0.3073 | 0.9643 | 0.9820 |
+ | No log | 0.6275 | 32 | 0.9007 | 0.3854 | 0.9007 | 0.9490 |
+ | No log | 0.6667 | 34 | 0.9622 | 0.4213 | 0.9622 | 0.9809 |
+ | No log | 0.7059 | 36 | 0.9708 | 0.4383 | 0.9708 | 0.9853 |
+ | No log | 0.7451 | 38 | 0.9878 | 0.4100 | 0.9878 | 0.9939 |
+ | No log | 0.7843 | 40 | 1.0558 | 0.3998 | 1.0558 | 1.0275 |
+ | No log | 0.8235 | 42 | 0.9242 | 0.5643 | 0.9242 | 0.9614 |
+ | No log | 0.8627 | 44 | 0.7955 | 0.5241 | 0.7955 | 0.8919 |
+ | No log | 0.9020 | 46 | 0.7980 | 0.5329 | 0.7980 | 0.8933 |
+ | No log | 0.9412 | 48 | 0.8442 | 0.6061 | 0.8442 | 0.9188 |
+ | No log | 0.9804 | 50 | 1.0407 | 0.4839 | 1.0407 | 1.0202 |
+ | No log | 1.0196 | 52 | 1.1216 | 0.3570 | 1.1216 | 1.0591 |
+ | No log | 1.0588 | 54 | 1.0024 | 0.4949 | 1.0024 | 1.0012 |
+ | No log | 1.0980 | 56 | 0.8697 | 0.5816 | 0.8697 | 0.9326 |
+ | No log | 1.1373 | 58 | 0.8600 | 0.5816 | 0.8600 | 0.9274 |
+ | No log | 1.1765 | 60 | 0.8561 | 0.5673 | 0.8561 | 0.9253 |
+ | No log | 1.2157 | 62 | 0.9019 | 0.4699 | 0.9019 | 0.9497 |
+ | No log | 1.2549 | 64 | 0.8546 | 0.5415 | 0.8546 | 0.9245 |
+ | No log | 1.2941 | 66 | 0.8366 | 0.5308 | 0.8366 | 0.9147 |
+ | No log | 1.3333 | 68 | 0.8373 | 0.5102 | 0.8373 | 0.9150 |
+ | No log | 1.3725 | 70 | 0.8966 | 0.4919 | 0.8966 | 0.9469 |
+ | No log | 1.4118 | 72 | 1.1326 | 0.3388 | 1.1326 | 1.0642 |
+ | No log | 1.4510 | 74 | 1.1207 | 0.3289 | 1.1207 | 1.0586 |
+ | No log | 1.4902 | 76 | 0.9146 | 0.5290 | 0.9146 | 0.9563 |
+ | No log | 1.5294 | 78 | 0.9239 | 0.5012 | 0.9239 | 0.9612 |
+ | No log | 1.5686 | 80 | 0.9205 | 0.4914 | 0.9205 | 0.9594 |
+ | No log | 1.6078 | 82 | 0.8815 | 0.4496 | 0.8815 | 0.9389 |
+ | No log | 1.6471 | 84 | 0.9873 | 0.4961 | 0.9873 | 0.9936 |
+ | No log | 1.6863 | 86 | 0.9612 | 0.5115 | 0.9612 | 0.9804 |
+ | No log | 1.7255 | 88 | 0.9322 | 0.4411 | 0.9322 | 0.9655 |
+ | No log | 1.7647 | 90 | 0.9161 | 0.4718 | 0.9161 | 0.9571 |
+ | No log | 1.8039 | 92 | 0.9150 | 0.4718 | 0.9150 | 0.9566 |
+ | No log | 1.8431 | 94 | 0.9314 | 0.4403 | 0.9314 | 0.9651 |
+ | No log | 1.8824 | 96 | 0.9626 | 0.5011 | 0.9626 | 0.9811 |
+ | No log | 1.9216 | 98 | 0.9502 | 0.4914 | 0.9502 | 0.9748 |
+ | No log | 1.9608 | 100 | 0.9553 | 0.5180 | 0.9553 | 0.9774 |
+ | No log | 2.0 | 102 | 0.9492 | 0.5196 | 0.9492 | 0.9743 |
+ | No log | 2.0392 | 104 | 0.9319 | 0.5121 | 0.9319 | 0.9653 |
+ | No log | 2.0784 | 106 | 0.9399 | 0.4358 | 0.9399 | 0.9695 |
+ | No log | 2.1176 | 108 | 1.0022 | 0.4567 | 1.0022 | 1.0011 |
+ | No log | 2.1569 | 110 | 0.9675 | 0.3985 | 0.9675 | 0.9836 |
+ | No log | 2.1961 | 112 | 1.0184 | 0.4842 | 1.0184 | 1.0091 |
+ | No log | 2.2353 | 114 | 1.0343 | 0.5439 | 1.0343 | 1.0170 |
+ | No log | 2.2745 | 116 | 1.0025 | 0.5171 | 1.0025 | 1.0012 |
+ | No log | 2.3137 | 118 | 0.9541 | 0.4731 | 0.9541 | 0.9768 |
+ | No log | 2.3529 | 120 | 0.9240 | 0.5143 | 0.9240 | 0.9612 |
+ | No log | 2.3922 | 122 | 0.8736 | 0.4894 | 0.8736 | 0.9346 |
+ | No log | 2.4314 | 124 | 0.8481 | 0.4747 | 0.8481 | 0.9209 |
+ | No log | 2.4706 | 126 | 0.8490 | 0.4939 | 0.8490 | 0.9214 |
+ | No log | 2.5098 | 128 | 0.8457 | 0.5136 | 0.8457 | 0.9196 |
+ | No log | 2.5490 | 130 | 0.9305 | 0.5704 | 0.9305 | 0.9646 |
+ | No log | 2.5882 | 132 | 0.8684 | 0.5802 | 0.8684 | 0.9319 |
+ | No log | 2.6275 | 134 | 0.8357 | 0.5669 | 0.8357 | 0.9141 |
+ | No log | 2.6667 | 136 | 0.7962 | 0.5621 | 0.7962 | 0.8923 |
+ | No log | 2.7059 | 138 | 0.9018 | 0.5431 | 0.9018 | 0.9496 |
+ | No log | 2.7451 | 140 | 1.0108 | 0.4976 | 1.0108 | 1.0054 |
+ | No log | 2.7843 | 142 | 0.8833 | 0.4667 | 0.8833 | 0.9399 |
+ | No log | 2.8235 | 144 | 0.8146 | 0.3830 | 0.8146 | 0.9025 |
+ | No log | 2.8627 | 146 | 0.8029 | 0.4908 | 0.8029 | 0.8960 |
+ | No log | 2.9020 | 148 | 0.7534 | 0.5044 | 0.7534 | 0.8680 |
+ | No log | 2.9412 | 150 | 0.7207 | 0.5675 | 0.7207 | 0.8489 |
+ | No log | 2.9804 | 152 | 0.7896 | 0.5763 | 0.7896 | 0.8886 |
+ | No log | 3.0196 | 154 | 0.8883 | 0.5458 | 0.8883 | 0.9425 |
+ | No log | 3.0588 | 156 | 0.7851 | 0.5920 | 0.7851 | 0.8861 |
+ | No log | 3.0980 | 158 | 0.7777 | 0.6271 | 0.7777 | 0.8819 |
+ | No log | 3.1373 | 160 | 0.7791 | 0.6305 | 0.7791 | 0.8827 |
+ | No log | 3.1765 | 162 | 0.7703 | 0.5937 | 0.7703 | 0.8777 |
+ | No log | 3.2157 | 164 | 0.7848 | 0.5611 | 0.7848 | 0.8859 |
+ | No log | 3.2549 | 166 | 0.8604 | 0.5892 | 0.8604 | 0.9276 |
+ | No log | 3.2941 | 168 | 0.8466 | 0.5647 | 0.8466 | 0.9201 |
+ | No log | 3.3333 | 170 | 0.8205 | 0.5735 | 0.8205 | 0.9058 |
+ | No log | 3.3725 | 172 | 0.8292 | 0.5879 | 0.8292 | 0.9106 |
+ | No log | 3.4118 | 174 | 0.8713 | 0.5601 | 0.8713 | 0.9334 |
+ | No log | 3.4510 | 176 | 0.8535 | 0.5624 | 0.8535 | 0.9238 |
+ | No log | 3.4902 | 178 | 0.8391 | 0.5424 | 0.8391 | 0.9160 |
+ | No log | 3.5294 | 180 | 0.8333 | 0.4582 | 0.8333 | 0.9128 |
+ | No log | 3.5686 | 182 | 0.8298 | 0.4936 | 0.8298 | 0.9109 |
+ | No log | 3.6078 | 184 | 0.8462 | 0.4870 | 0.8462 | 0.9199 |
+ | No log | 3.6471 | 186 | 0.9183 | 0.5495 | 0.9183 | 0.9583 |
+ | No log | 3.6863 | 188 | 1.0800 | 0.5489 | 1.0800 | 1.0392 |
+ | No log | 3.7255 | 190 | 1.1408 | 0.5358 | 1.1408 | 1.0681 |
+ | No log | 3.7647 | 192 | 1.0451 | 0.5526 | 1.0451 | 1.0223 |
+ | No log | 3.8039 | 194 | 0.8646 | 0.5039 | 0.8646 | 0.9298 |
+ | No log | 3.8431 | 196 | 0.8304 | 0.4853 | 0.8304 | 0.9113 |
+ | No log | 3.8824 | 198 | 0.8522 | 0.5008 | 0.8522 | 0.9231 |
+ | No log | 3.9216 | 200 | 0.8346 | 0.4699 | 0.8346 | 0.9136 |
+ | No log | 3.9608 | 202 | 0.8770 | 0.5305 | 0.8770 | 0.9365 |
+ | No log | 4.0 | 204 | 0.8574 | 0.4926 | 0.8574 | 0.9259 |
+ | No log | 4.0392 | 206 | 0.8427 | 0.4618 | 0.8427 | 0.9180 |
+ | No log | 4.0784 | 208 | 0.8458 | 0.3913 | 0.8458 | 0.9197 |
+ | No log | 4.1176 | 210 | 0.8551 | 0.4304 | 0.8551 | 0.9247 |
+ | No log | 4.1569 | 212 | 0.8266 | 0.4728 | 0.8266 | 0.9092 |
+ | No log | 4.1961 | 214 | 0.8493 | 0.4632 | 0.8493 | 0.9216 |
+ | No log | 4.2353 | 216 | 0.8255 | 0.4993 | 0.8255 | 0.9086 |
+ | No log | 4.2745 | 218 | 0.8336 | 0.4993 | 0.8336 | 0.9130 |
+ | No log | 4.3137 | 220 | 0.8400 | 0.5295 | 0.8400 | 0.9165 |
+ | No log | 4.3529 | 222 | 0.8757 | 0.5114 | 0.8757 | 0.9358 |
+ | No log | 4.3922 | 224 | 0.8357 | 0.5275 | 0.8357 | 0.9142 |
+ | No log | 4.4314 | 226 | 0.8055 | 0.5462 | 0.8055 | 0.8975 |
+ | No log | 4.4706 | 228 | 0.8990 | 0.4829 | 0.8990 | 0.9482 |
+ | No log | 4.5098 | 230 | 0.9906 | 0.4444 | 0.9906 | 0.9953 |
+ | No log | 4.5490 | 232 | 0.9171 | 0.4849 | 0.9171 | 0.9577 |
+ | No log | 4.5882 | 234 | 0.8014 | 0.4815 | 0.8014 | 0.8952 |
+ | No log | 4.6275 | 236 | 0.8638 | 0.5177 | 0.8638 | 0.9294 |
+ | No log | 4.6667 | 238 | 1.0202 | 0.5224 | 1.0202 | 1.0101 |
+ | No log | 4.7059 | 240 | 0.9798 | 0.4883 | 0.9798 | 0.9898 |
+ | No log | 4.7451 | 242 | 0.8498 | 0.5647 | 0.8498 | 0.9219 |
+ | No log | 4.7843 | 244 | 0.8324 | 0.5279 | 0.8324 | 0.9124 |
+ | No log | 4.8235 | 246 | 0.8286 | 0.5279 | 0.8286 | 0.9103 |
+ | No log | 4.8627 | 248 | 0.8093 | 0.5411 | 0.8093 | 0.8996 |
+ | No log | 4.9020 | 250 | 0.7805 | 0.5204 | 0.7805 | 0.8834 |
+ | No log | 4.9412 | 252 | 0.7706 | 0.5556 | 0.7706 | 0.8778 |
+ | No log | 4.9804 | 254 | 0.8082 | 0.5812 | 0.8082 | 0.8990 |
+ | No log | 5.0196 | 256 | 0.8358 | 0.5269 | 0.8358 | 0.9142 |
+ | No log | 5.0588 | 258 | 0.7784 | 0.6288 | 0.7784 | 0.8823 |
+ | No log | 5.0980 | 260 | 0.7799 | 0.6241 | 0.7799 | 0.8831 |
+ | No log | 5.1373 | 262 | 0.8340 | 0.5728 | 0.8340 | 0.9133 |
+ | No log | 5.1765 | 264 | 0.8336 | 0.5458 | 0.8336 | 0.9130 |
+ | No log | 5.2157 | 266 | 0.8319 | 0.5345 | 0.8319 | 0.9121 |
+ | No log | 5.2549 | 268 | 0.8207 | 0.4794 | 0.8207 | 0.9059 |
+ | No log | 5.2941 | 270 | 0.8046 | 0.3991 | 0.8046 | 0.8970 |
+ | No log | 5.3333 | 272 | 0.7943 | 0.3987 | 0.7943 | 0.8913 |
+ | No log | 5.3725 | 274 | 0.7896 | 0.4218 | 0.7896 | 0.8886 |
+ | No log | 5.4118 | 276 | 0.7945 | 0.4534 | 0.7945 | 0.8913 |
+ | No log | 5.4510 | 278 | 0.7943 | 0.3852 | 0.7943 | 0.8913 |
+ | No log | 5.4902 | 280 | 0.8186 | 0.4603 | 0.8186 | 0.9047 |
+ | No log | 5.5294 | 282 | 0.8494 | 0.5542 | 0.8494 | 0.9216 |
+ | No log | 5.5686 | 284 | 0.9098 | 0.5600 | 0.9098 | 0.9538 |
+ | No log | 5.6078 | 286 | 0.8693 | 0.5600 | 0.8693 | 0.9324 |
+ | No log | 5.6471 | 288 | 0.7850 | 0.5073 | 0.7850 | 0.8860 |
+ | No log | 5.6863 | 290 | 0.8154 | 0.4465 | 0.8154 | 0.9030 |
+ | No log | 5.7255 | 292 | 0.8813 | 0.4741 | 0.8813 | 0.9388 |
+ | No log | 5.7647 | 294 | 0.8395 | 0.4331 | 0.8395 | 0.9162 |
+ | No log | 5.8039 | 296 | 0.8105 | 0.4980 | 0.8105 | 0.9003 |
+ | No log | 5.8431 | 298 | 0.9452 | 0.5614 | 0.9452 | 0.9722 |
+ | No log | 5.8824 | 300 | 0.9990 | 0.5471 | 0.9990 | 0.9995 |
+ | No log | 5.9216 | 302 | 0.8805 | 0.5515 | 0.8805 | 0.9383 |
+ | No log | 5.9608 | 304 | 0.7853 | 0.4160 | 0.7853 | 0.8861 |
+ | No log | 6.0 | 306 | 0.8060 | 0.4598 | 0.8060 | 0.8978 |
+ | No log | 6.0392 | 308 | 0.7794 | 0.4598 | 0.7794 | 0.8828 |
+ | No log | 6.0784 | 310 | 0.7702 | 0.5408 | 0.7702 | 0.8776 |
+ | No log | 6.1176 | 312 | 0.8461 | 0.5658 | 0.8461 | 0.9198 |
+ | No log | 6.1569 | 314 | 0.8483 | 0.5658 | 0.8483 | 0.9211 |
+ | No log | 6.1961 | 316 | 0.8126 | 0.5390 | 0.8126 | 0.9014 |
+ | No log | 6.2353 | 318 | 0.7923 | 0.5435 | 0.7923 | 0.8901 |
+ | No log | 6.2745 | 320 | 0.7870 | 0.4834 | 0.7870 | 0.8871 |
+ | No log | 6.3137 | 322 | 0.8131 | 0.4724 | 0.8131 | 0.9017 |
+ | No log | 6.3529 | 324 | 0.9059 | 0.5577 | 0.9059 | 0.9518 |
+ | No log | 6.3922 | 326 | 0.9568 | 0.5199 | 0.9568 | 0.9782 |
+ | No log | 6.4314 | 328 | 1.0070 | 0.5280 | 1.0070 | 1.0035 |
+ | No log | 6.4706 | 330 | 0.9527 | 0.5224 | 0.9527 | 0.9760 |
+ | No log | 6.5098 | 332 | 0.8292 | 0.5660 | 0.8292 | 0.9106 |
+ | No log | 6.5490 | 334 | 0.7479 | 0.5548 | 0.7479 | 0.8648 |
+ | No log | 6.5882 | 336 | 0.7430 | 0.5439 | 0.7430 | 0.8620 |
+ | No log | 6.6275 | 338 | 0.7477 | 0.5322 | 0.7477 | 0.8647 |
+ | No log | 6.6667 | 340 | 0.7680 | 0.5971 | 0.7680 | 0.8764 |
+ | No log | 6.7059 | 342 | 0.8607 | 0.5370 | 0.8607 | 0.9277 |
+ | No log | 6.7451 | 344 | 0.9665 | 0.5304 | 0.9665 | 0.9831 |
+ | No log | 6.7843 | 346 | 0.9626 | 0.5233 | 0.9626 | 0.9811 |
+ | No log | 6.8235 | 348 | 0.9042 | 0.5130 | 0.9042 | 0.9509 |
+ | No log | 6.8627 | 350 | 0.8280 | 0.4143 | 0.8280 | 0.9100 |
+ | No log | 6.9020 | 352 | 0.8019 | 0.4620 | 0.8019 | 0.8955 |
+ | No log | 6.9412 | 354 | 0.8078 | 0.5390 | 0.8078 | 0.8988 |
+ | No log | 6.9804 | 356 | 0.8204 | 0.5487 | 0.8204 | 0.9058 |
+ | No log | 7.0196 | 358 | 0.8385 | 0.5279 | 0.8385 | 0.9157 |
+ | No log | 7.0588 | 360 | 0.8447 | 0.5465 | 0.8447 | 0.9191 |
+ | No log | 7.0980 | 362 | 0.8569 | 0.4964 | 0.8569 | 0.9257 |
+ | No log | 7.1373 | 364 | 0.9630 | 0.4991 | 0.9630 | 0.9813 |
+ | No log | 7.1765 | 366 | 1.0235 | 0.4475 | 1.0235 | 1.0117 |
+ | No log | 7.2157 | 368 | 0.9669 | 0.4197 | 0.9669 | 0.9833 |
+ | No log | 7.2549 | 370 | 0.8880 | 0.3627 | 0.8880 | 0.9423 |
+ | No log | 7.2941 | 372 | 0.9081 | 0.4363 | 0.9081 | 0.9529 |
+ | No log | 7.3333 | 374 | 0.9592 | 0.4840 | 0.9592 | 0.9794 |
+ | No log | 7.3725 | 376 | 0.9552 | 0.5014 | 0.9552 | 0.9773 |
+ | No log | 7.4118 | 378 | 0.8842 | 0.5042 | 0.8842 | 0.9403 |
+ | No log | 7.4510 | 380 | 0.8214 | 0.4340 | 0.8214 | 0.9063 |
+ | No log | 7.4902 | 382 | 0.8120 | 0.4340 | 0.8120 | 0.9011 |
+ | No log | 7.5294 | 384 | 0.8256 | 0.5411 | 0.8256 | 0.9086 |
+ | No log | 7.5686 | 386 | 0.8395 | 0.5390 | 0.8395 | 0.9162 |
+ | No log | 7.6078 | 388 | 0.8130 | 0.5548 | 0.8130 | 0.9017 |
+ | No log | 7.6471 | 390 | 0.7916 | 0.5011 | 0.7916 | 0.8897 |
+ | No log | 7.6863 | 392 | 0.7922 | 0.4760 | 0.7922 | 0.8901 |
+ | No log | 7.7255 | 394 | 0.7928 | 0.4760 | 0.7928 | 0.8904 |
+ | No log | 7.7647 | 396 | 0.7967 | 0.5361 | 0.7967 | 0.8926 |
+ | No log | 7.8039 | 398 | 0.8061 | 0.5738 | 0.8061 | 0.8978 |
+ | No log | 7.8431 | 400 | 0.8119 | 0.5730 | 0.8119 | 0.9011 |
+ | No log | 7.8824 | 402 | 0.8206 | 0.5728 | 0.8206 | 0.9059 |
+ | No log | 7.9216 | 404 | 0.8140 | 0.5611 | 0.8140 | 0.9022 |
+ | No log | 7.9608 | 406 | 0.8045 | 0.5880 | 0.8045 | 0.8969 |
+ | No log | 8.0 | 408 | 0.7933 | 0.6120 | 0.7933 | 0.8907 |
+ | No log | 8.0392 | 410 | 0.7993 | 0.5892 | 0.7993 | 0.8940 |
+ | No log | 8.0784 | 412 | 0.7860 | 0.5983 | 0.7860 | 0.8866 |
+ | No log | 8.1176 | 414 | 0.7771 | 0.5391 | 0.7771 | 0.8815 |
+ | No log | 8.1569 | 416 | 0.7740 | 0.5214 | 0.7740 | 0.8798 |
+ | No log | 8.1961 | 418 | 0.7780 | 0.5769 | 0.7780 | 0.8820 |
+ | No log | 8.2353 | 420 | 0.8088 | 0.5706 | 0.8088 | 0.8993 |
+ | No log | 8.2745 | 422 | 0.8658 | 0.5682 | 0.8658 | 0.9305 |
+ | No log | 8.3137 | 424 | 0.8623 | 0.5539 | 0.8623 | 0.9286 |
+ | No log | 8.3529 | 426 | 0.8157 | 0.5385 | 0.8157 | 0.9032 |
+ | No log | 8.3922 | 428 | 0.7895 | 0.4757 | 0.7895 | 0.8885 |
+ | No log | 8.4314 | 430 | 0.7917 | 0.4548 | 0.7917 | 0.8898 |
+ | No log | 8.4706 | 432 | 0.7948 | 0.4548 | 0.7948 | 0.8915 |
+ | No log | 8.5098 | 434 | 0.7991 | 0.5028 | 0.7991 | 0.8939 |
+ | No log | 8.5490 | 436 | 0.7882 | 0.5773 | 0.7882 | 0.8878 |
+ | No log | 8.5882 | 438 | 0.7758 | 0.5596 | 0.7758 | 0.8808 |
+ | No log | 8.6275 | 440 | 0.7697 | 0.5242 | 0.7697 | 0.8773 |
+ | No log | 8.6667 | 442 | 0.7701 | 0.5773 | 0.7701 | 0.8775 |
+ | No log | 8.7059 | 444 | 0.7728 | 0.5773 | 0.7728 | 0.8791 |
+ | No log | 8.7451 | 446 | 0.7758 | 0.5573 | 0.7758 | 0.8808 |
+ | No log | 8.7843 | 448 | 0.7774 | 0.5773 | 0.7774 | 0.8817 |
+ | No log | 8.8235 | 450 | 0.7862 | 0.5773 | 0.7862 | 0.8867 |
+ | No log | 8.8627 | 452 | 0.7919 | 0.5773 | 0.7919 | 0.8899 |
+ | No log | 8.9020 | 454 | 0.8019 | 0.5773 | 0.8019 | 0.8955 |
+ | No log | 8.9412 | 456 | 0.8108 | 0.5886 | 0.8108 | 0.9004 |
+ | No log | 8.9804 | 458 | 0.8118 | 0.5886 | 0.8118 | 0.9010 |
+ | No log | 9.0196 | 460 | 0.8145 | 0.5886 | 0.8145 | 0.9025 |
+ | No log | 9.0588 | 462 | 0.8053 | 0.5391 | 0.8053 | 0.8974 |
+ | No log | 9.0980 | 464 | 0.8075 | 0.5391 | 0.8075 | 0.8986 |
+ | No log | 9.1373 | 466 | 0.8047 | 0.5204 | 0.8047 | 0.8970 |
+ | No log | 9.1765 | 468 | 0.7909 | 0.5011 | 0.7909 | 0.8893 |
+ | No log | 9.2157 | 470 | 0.7839 | 0.5011 | 0.7839 | 0.8854 |
+ | No log | 9.2549 | 472 | 0.7871 | 0.5204 | 0.7871 | 0.8872 |
+ | No log | 9.2941 | 474 | 0.8246 | 0.5698 | 0.8246 | 0.9081 |
+ | No log | 9.3333 | 476 | 0.8261 | 0.5698 | 0.8261 | 0.9089 |
+ | No log | 9.3725 | 478 | 0.7843 | 0.5203 | 0.7843 | 0.8856 |
+ | No log | 9.4118 | 480 | 0.7642 | 0.4548 | 0.7642 | 0.8742 |
+ | No log | 9.4510 | 482 | 0.7512 | 0.4980 | 0.7512 | 0.8667 |
+ | No log | 9.4902 | 484 | 0.7383 | 0.5983 | 0.7383 | 0.8593 |
+ | No log | 9.5294 | 486 | 0.7351 | 0.5892 | 0.7351 | 0.8574 |
+ | No log | 9.5686 | 488 | 0.7181 | 0.5684 | 0.7181 | 0.8474 |
+ | No log | 9.6078 | 490 | 0.7140 | 0.5322 | 0.7140 | 0.8450 |
+ | No log | 9.6471 | 492 | 0.7296 | 0.5474 | 0.7296 | 0.8542 |
+ | No log | 9.6863 | 494 | 0.7306 | 0.5451 | 0.7306 | 0.8547 |
+ | No log | 9.7255 | 496 | 0.7309 | 0.6237 | 0.7309 | 0.8549 |
+ | No log | 9.7647 | 498 | 0.7987 | 0.5981 | 0.7987 | 0.8937 |
+ | 0.3162 | 9.8039 | 500 | 0.8345 | 0.5781 | 0.8345 | 0.9135 |
+ | 0.3162 | 9.8431 | 502 | 0.8285 | 0.5680 | 0.8285 | 0.9102 |
+ | 0.3162 | 9.8824 | 504 | 0.8142 | 0.5563 | 0.8142 | 0.9023 |
+ | 0.3162 | 9.9216 | 506 | 0.8326 | 0.5365 | 0.8326 | 0.9125 |
+ | 0.3162 | 9.9608 | 508 | 0.8258 | 0.5026 | 0.8258 | 0.9087 |
+ | 0.3162 | 10.0 | 510 | 0.8361 | 0.5411 | 0.8361 | 0.9144 |
+
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - Pytorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
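+
+ ### Inference example
+
+ config.json (below) declares `BertForSequenceClassification` with `problem_type: "regression"` and a single label, so the checkpoint loads as a regression scorer. A minimal sketch, assuming the hub id is the committer's namespace plus the model name above; the tokenizer is taken from the base model because this commit uploads only the config and weights:
+
+ ```python
+ import torch
+ from transformers import AutoTokenizer, AutoModelForSequenceClassification
+
+ # Assumed hub id: committer namespace + the model name from this card.
+ repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task2_organization"
+ tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")  # base tokenizer; not in this commit
+ model = AutoModelForSequenceClassification.from_pretrained(repo)
+ model.eval()
+
+ inputs = tokenizer("ضع نص المقال هنا", return_tensors="pt", truncation=True, max_length=512)
+ with torch.no_grad():
+     score = model(**inputs).logits.squeeze().item()  # single continuous organization score
+ print(round(score, 4))
+ ```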
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:069f64e049d0ef2ddbe5ad662d6e3951bae1651502c9482426810f3592a62c0a
+ size 540799996
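The 540,799,996-byte size is consistent with the config above: a float32 BERT-base with a 64k vocabulary has about 135.2M parameters, at 4 bytes each roughly 540.8 MB. A back-of-the-envelope check (parameter counts derived from config.json; the small remainder is safetensors header overhead):

```python
# Parameters implied by config.json, times 4 bytes (float32).
V, P, T, H, I, L = 64000, 512, 2, 768, 3072, 12
emb = V*H + P*H + T*H + 2*H                                  # embeddings + LayerNorm
per_layer = 4*(H*H + H) + 2*H + (H*I + I) + (I*H + H) + 2*H  # attention + FFN + LayerNorms
head = (H*H + H) + (H + 1)                                   # pooler + 1-label regression head
params = emb + L*per_layer + head
print(params, 4*params)  # ~135.2M parameters, ~540.8 MB
```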
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:19ae8f8ecb83bd0bac8c9fc3b36f3425816f04b5070528be194a0f813d00b108
+ size 5304