MayBashendy committed (verified)
Commit 037e48c · 1 Parent(s): a9fd6d7

Training in progress, step 500

Files changed (4)
  1. README.md +315 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,315 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k6_task2_organization
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k6_task2_organization
+
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8315
+ - Qwk: 0.4290
+ - Mse: 0.8315
+ - Rmse: 0.9119
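"Qwk" here presumably denotes the quadratic weighted kappa, and Mse/Rmse the (root) mean squared error between predicted and gold organization scores. A minimal sketch of how such numbers are typically computed for a regression-style essay-scoring head; the arrays below are illustrative placeholders, not data from this run:

```python
# Sketch only: hypothetical gold/predicted scores, not values from this model.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3.0, 2.0, 4.0, 1.0, 3.0])  # gold organization scores (hypothetical)
y_pred = np.array([2.6, 2.2, 3.7, 1.4, 3.1])  # continuous model outputs (hypothetical)

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

# QWK is defined over discrete categories, so continuous predictions are
# rounded to the nearest integer band before the quadratically weighted kappa.
qwk = cohen_kappa_score(y_true.astype(int), np.rint(y_pred).astype(int),
                        weights="quadratic")

print(f"Mse={mse:.4f}  Rmse={rmse:.4f}  Qwk={qwk:.4f}")
```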
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
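Since no usage guidance is given, here is only a minimal inference sketch. The Hub repo id is an assumption (author namespace plus model name); adjust it or point to a local checkpoint, and note that the AraBERT-specific text preprocessing recommended for the base model is not shown.

```python
# Hedged inference sketch: the repo_id below is an assumption, not confirmed by the card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k6_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)  # regression head, single output
model.eval()

text = "..."  # an Arabic essay (placeholder)
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze(-1).item()  # continuous organization score
print(score)
```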
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (a hedged `Trainer` sketch follows the list):
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
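A sketch of how these hyperparameters would typically map onto `TrainingArguments`; the dataset objects and metric function are placeholders, and the evaluation cadence is inferred from the results table (evaluations at steps 2, 4, ...), not read from `training_args.bin`.

```python
# Sketch only: placeholders and inferred settings are marked in comments.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "aubmindlab/bert-base-arabertv02"
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=1, problem_type="regression")
tokenizer = AutoTokenizer.from_pretrained(base)

args = TrainingArguments(
    output_dir="arabertv02-task2-organization",  # hypothetical output dir
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",
    eval_steps=2,            # assumed from the results table below
)

train_ds = eval_ds = None    # supply tokenized train/eval Dataset objects here
trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
                  train_dataset=train_ds, eval_dataset=eval_ds,
                  compute_metrics=None)  # plug in a QWK/MSE metric fn (see sketch above)
# trainer.train()
```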
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.08 | 2 | 4.6328 | 0.0010 | 4.6328 | 2.1524 |
+ | No log | 0.16 | 4 | 2.9019 | 0.0015 | 2.9019 | 1.7035 |
+ | No log | 0.24 | 6 | 1.7877 | 0.0198 | 1.7877 | 1.3371 |
+ | No log | 0.32 | 8 | 1.4700 | -0.0261 | 1.4700 | 1.2124 |
+ | No log | 0.4 | 10 | 1.2383 | 0.1108 | 1.2383 | 1.1128 |
+ | No log | 0.48 | 12 | 1.4928 | 0.0169 | 1.4928 | 1.2218 |
+ | No log | 0.56 | 14 | 2.0456 | 0.1060 | 2.0456 | 1.4303 |
+ | No log | 0.64 | 16 | 1.9477 | 0.1379 | 1.9477 | 1.3956 |
+ | No log | 0.72 | 18 | 1.9568 | 0.1860 | 1.9568 | 1.3989 |
+ | No log | 0.8 | 20 | 1.7803 | 0.1478 | 1.7803 | 1.3343 |
+ | No log | 0.88 | 22 | 1.6257 | 0.0724 | 1.6257 | 1.2750 |
+ | No log | 0.96 | 24 | 1.3773 | 0.0000 | 1.3773 | 1.1736 |
+ | No log | 1.04 | 26 | 1.2678 | 0.0637 | 1.2678 | 1.1260 |
+ | No log | 1.12 | 28 | 1.3445 | 0.0000 | 1.3445 | 1.1595 |
+ | No log | 1.2 | 30 | 1.3906 | 0.0000 | 1.3906 | 1.1793 |
+ | No log | 1.28 | 32 | 1.3842 | 0.0000 | 1.3842 | 1.1765 |
+ | No log | 1.36 | 34 | 1.4295 | 0.0000 | 1.4295 | 1.1956 |
+ | No log | 1.44 | 36 | 1.4228 | 0.0000 | 1.4228 | 1.1928 |
+ | No log | 1.52 | 38 | 1.1866 | 0.2014 | 1.1866 | 1.0893 |
+ | No log | 1.6 | 40 | 0.9843 | 0.3708 | 0.9843 | 0.9921 |
+ | No log | 1.68 | 42 | 1.0167 | 0.3570 | 1.0167 | 1.0083 |
+ | No log | 1.76 | 44 | 1.0929 | 0.3394 | 1.0929 | 1.0454 |
+ | No log | 1.84 | 46 | 1.1927 | 0.2298 | 1.1927 | 1.0921 |
+ | No log | 1.92 | 48 | 1.3266 | 0.1106 | 1.3266 | 1.1518 |
+ | No log | 2.0 | 50 | 1.4676 | 0.0403 | 1.4676 | 1.2115 |
+ | No log | 2.08 | 52 | 1.4086 | 0.0958 | 1.4086 | 1.1868 |
+ | No log | 2.16 | 54 | 1.4504 | 0.0811 | 1.4504 | 1.2043 |
+ | No log | 2.24 | 56 | 1.5052 | 0.0575 | 1.5052 | 1.2269 |
+ | No log | 2.32 | 58 | 1.4342 | 0.1165 | 1.4342 | 1.1976 |
+ | No log | 2.4 | 60 | 1.3514 | 0.1346 | 1.3514 | 1.1625 |
+ | No log | 2.48 | 62 | 1.2667 | 0.1865 | 1.2667 | 1.1255 |
+ | No log | 2.56 | 64 | 1.1706 | 0.3135 | 1.1706 | 1.0819 |
+ | No log | 2.64 | 66 | 1.1644 | 0.2589 | 1.1644 | 1.0791 |
+ | No log | 2.72 | 68 | 1.2308 | 0.2254 | 1.2308 | 1.1094 |
+ | No log | 2.8 | 70 | 1.3013 | 0.0165 | 1.3013 | 1.1407 |
+ | No log | 2.88 | 72 | 1.2545 | 0.0955 | 1.2545 | 1.1200 |
+ | No log | 2.96 | 74 | 1.1347 | 0.2298 | 1.1347 | 1.0652 |
+ | No log | 3.04 | 76 | 1.0337 | 0.3487 | 1.0337 | 1.0167 |
+ | No log | 3.12 | 78 | 0.9992 | 0.3487 | 0.9992 | 0.9996 |
+ | No log | 3.2 | 80 | 1.0921 | 0.2395 | 1.0921 | 1.0450 |
+ | No log | 3.28 | 82 | 1.4556 | 0.1916 | 1.4556 | 1.2065 |
+ | No log | 3.36 | 84 | 1.6264 | 0.2593 | 1.6264 | 1.2753 |
+ | No log | 3.44 | 86 | 1.3852 | 0.2500 | 1.3852 | 1.1770 |
+ | No log | 3.52 | 88 | 1.1409 | 0.2062 | 1.1409 | 1.0681 |
+ | No log | 3.6 | 90 | 1.0076 | 0.3100 | 1.0076 | 1.0038 |
+ | No log | 3.68 | 92 | 0.9301 | 0.4175 | 0.9301 | 0.9644 |
+ | No log | 3.76 | 94 | 1.0017 | 0.4045 | 1.0017 | 1.0009 |
+ | No log | 3.84 | 96 | 1.0917 | 0.3620 | 1.0917 | 1.0449 |
+ | No log | 3.92 | 98 | 1.0223 | 0.4218 | 1.0223 | 1.0111 |
+ | No log | 4.0 | 100 | 0.9517 | 0.4120 | 0.9517 | 0.9755 |
+ | No log | 4.08 | 102 | 0.8379 | 0.4848 | 0.8379 | 0.9153 |
+ | No log | 4.16 | 104 | 0.8799 | 0.4468 | 0.8799 | 0.9380 |
+ | No log | 4.24 | 106 | 0.9213 | 0.4468 | 0.9213 | 0.9598 |
+ | No log | 4.32 | 108 | 0.9304 | 0.4404 | 0.9304 | 0.9646 |
+ | No log | 4.4 | 110 | 0.9491 | 0.4839 | 0.9491 | 0.9742 |
+ | No log | 4.48 | 112 | 0.8710 | 0.5236 | 0.8710 | 0.9333 |
+ | No log | 4.56 | 114 | 0.8548 | 0.5448 | 0.8548 | 0.9246 |
+ | No log | 4.64 | 116 | 0.9003 | 0.5042 | 0.9003 | 0.9488 |
+ | No log | 4.72 | 118 | 0.8624 | 0.6276 | 0.8624 | 0.9287 |
+ | No log | 4.8 | 120 | 0.8553 | 0.5735 | 0.8553 | 0.9248 |
+ | No log | 4.88 | 122 | 0.8664 | 0.5348 | 0.8664 | 0.9308 |
+ | No log | 4.96 | 124 | 0.8798 | 0.5125 | 0.8798 | 0.9380 |
+ | No log | 5.04 | 126 | 0.8836 | 0.4681 | 0.8836 | 0.9400 |
+ | No log | 5.12 | 128 | 0.9418 | 0.4330 | 0.9418 | 0.9705 |
+ | No log | 5.2 | 130 | 0.8967 | 0.3849 | 0.8967 | 0.9469 |
+ | No log | 5.28 | 132 | 0.8977 | 0.4343 | 0.8977 | 0.9475 |
+ | No log | 5.36 | 134 | 0.9139 | 0.4440 | 0.9139 | 0.9560 |
+ | No log | 5.44 | 136 | 0.9115 | 0.5008 | 0.9115 | 0.9547 |
+ | No log | 5.52 | 138 | 0.9154 | 0.5209 | 0.9154 | 0.9568 |
+ | No log | 5.6 | 140 | 1.0092 | 0.4965 | 1.0092 | 1.0046 |
+ | No log | 5.68 | 142 | 0.9434 | 0.4594 | 0.9434 | 0.9713 |
+ | No log | 5.76 | 144 | 0.8972 | 0.4886 | 0.8972 | 0.9472 |
+ | No log | 5.84 | 146 | 0.9098 | 0.5611 | 0.9098 | 0.9538 |
+ | No log | 5.92 | 148 | 0.8903 | 0.5356 | 0.8903 | 0.9436 |
+ | No log | 6.0 | 150 | 0.9064 | 0.4272 | 0.9064 | 0.9521 |
+ | No log | 6.08 | 152 | 0.8765 | 0.5342 | 0.8765 | 0.9362 |
+ | No log | 6.16 | 154 | 0.8757 | 0.5223 | 0.8757 | 0.9358 |
+ | No log | 6.24 | 156 | 0.9016 | 0.4313 | 0.9016 | 0.9495 |
+ | No log | 6.32 | 158 | 0.8995 | 0.4859 | 0.8995 | 0.9484 |
+ | No log | 6.4 | 160 | 0.9130 | 0.4760 | 0.9130 | 0.9555 |
+ | No log | 6.48 | 162 | 0.9243 | 0.4760 | 0.9243 | 0.9614 |
+ | No log | 6.56 | 164 | 0.9278 | 0.4196 | 0.9278 | 0.9632 |
+ | No log | 6.64 | 166 | 0.9541 | 0.5338 | 0.9541 | 0.9768 |
+ | No log | 6.72 | 168 | 0.9583 | 0.5041 | 0.9583 | 0.9789 |
+ | No log | 6.8 | 170 | 0.9974 | 0.3878 | 0.9974 | 0.9987 |
+ | No log | 6.88 | 172 | 1.0346 | 0.3757 | 1.0346 | 1.0172 |
+ | No log | 6.96 | 174 | 0.9927 | 0.4517 | 0.9927 | 0.9963 |
+ | No log | 7.04 | 176 | 1.0035 | 0.4910 | 1.0035 | 1.0018 |
+ | No log | 7.12 | 178 | 1.0197 | 0.4589 | 1.0197 | 1.0098 |
+ | No log | 7.2 | 180 | 1.0088 | 0.3376 | 1.0088 | 1.0044 |
+ | No log | 7.28 | 182 | 1.0026 | 0.3463 | 1.0026 | 1.0013 |
+ | No log | 7.36 | 184 | 1.0075 | 0.3231 | 1.0075 | 1.0037 |
+ | No log | 7.44 | 186 | 1.0038 | 0.4073 | 1.0038 | 1.0019 |
+ | No log | 7.52 | 188 | 1.0182 | 0.3715 | 1.0182 | 1.0091 |
+ | No log | 7.6 | 190 | 1.0288 | 0.3490 | 1.0288 | 1.0143 |
+ | No log | 7.68 | 192 | 1.0977 | 0.3006 | 1.0977 | 1.0477 |
+ | No log | 7.76 | 194 | 1.0034 | 0.3579 | 1.0034 | 1.0017 |
+ | No log | 7.84 | 196 | 0.9724 | 0.4037 | 0.9724 | 0.9861 |
+ | No log | 7.92 | 198 | 0.9599 | 0.4032 | 0.9599 | 0.9798 |
+ | No log | 8.0 | 200 | 0.9741 | 0.4998 | 0.9741 | 0.9870 |
+ | No log | 8.08 | 202 | 1.0130 | 0.4822 | 1.0130 | 1.0065 |
+ | No log | 8.16 | 204 | 0.9405 | 0.5248 | 0.9405 | 0.9698 |
+ | No log | 8.24 | 206 | 0.9284 | 0.4672 | 0.9284 | 0.9636 |
+ | No log | 8.32 | 208 | 0.9061 | 0.4894 | 0.9061 | 0.9519 |
+ | No log | 8.4 | 210 | 0.9463 | 0.5211 | 0.9463 | 0.9728 |
+ | No log | 8.48 | 212 | 0.9002 | 0.5338 | 0.9002 | 0.9488 |
+ | No log | 8.56 | 214 | 0.9005 | 0.4523 | 0.9005 | 0.9489 |
+ | No log | 8.64 | 216 | 0.9512 | 0.4418 | 0.9512 | 0.9753 |
+ | No log | 8.72 | 218 | 0.9011 | 0.5753 | 0.9011 | 0.9493 |
+ | No log | 8.8 | 220 | 0.8745 | 0.6010 | 0.8745 | 0.9351 |
+ | No log | 8.88 | 222 | 0.8911 | 0.4812 | 0.8911 | 0.9440 |
+ | No log | 8.96 | 224 | 0.8931 | 0.4812 | 0.8931 | 0.9450 |
+ | No log | 9.04 | 226 | 0.8681 | 0.5196 | 0.8681 | 0.9317 |
+ | No log | 9.12 | 228 | 0.8736 | 0.5208 | 0.8736 | 0.9347 |
+ | No log | 9.2 | 230 | 0.8981 | 0.4762 | 0.8981 | 0.9477 |
+ | No log | 9.28 | 232 | 0.8710 | 0.5106 | 0.8710 | 0.9333 |
+ | No log | 9.36 | 234 | 0.8824 | 0.5283 | 0.8824 | 0.9394 |
+ | No log | 9.44 | 236 | 0.8819 | 0.5323 | 0.8819 | 0.9391 |
+ | No log | 9.52 | 238 | 0.8946 | 0.5283 | 0.8946 | 0.9458 |
+ | No log | 9.6 | 240 | 0.9248 | 0.5344 | 0.9248 | 0.9616 |
+ | No log | 9.68 | 242 | 0.9132 | 0.4685 | 0.9132 | 0.9556 |
+ | No log | 9.76 | 244 | 0.9778 | 0.4694 | 0.9778 | 0.9889 |
+ | No log | 9.84 | 246 | 0.9531 | 0.4454 | 0.9531 | 0.9762 |
+ | No log | 9.92 | 248 | 0.8786 | 0.4708 | 0.8786 | 0.9373 |
+ | No log | 10.0 | 250 | 0.8723 | 0.4877 | 0.8723 | 0.9340 |
+ | No log | 10.08 | 252 | 0.8662 | 0.4787 | 0.8662 | 0.9307 |
+ | No log | 10.16 | 254 | 0.8826 | 0.4392 | 0.8826 | 0.9395 |
+ | No log | 10.24 | 256 | 1.0058 | 0.4172 | 1.0058 | 1.0029 |
+ | No log | 10.32 | 258 | 0.9938 | 0.4137 | 0.9938 | 0.9969 |
+ | No log | 10.4 | 260 | 0.9158 | 0.4048 | 0.9158 | 0.9570 |
+ | No log | 10.48 | 262 | 0.9070 | 0.3933 | 0.9070 | 0.9524 |
+ | No log | 10.56 | 264 | 0.9081 | 0.4265 | 0.9081 | 0.9529 |
+ | No log | 10.64 | 266 | 0.8955 | 0.3933 | 0.8955 | 0.9463 |
+ | No log | 10.72 | 268 | 0.9233 | 0.4752 | 0.9233 | 0.9609 |
+ | No log | 10.8 | 270 | 0.9399 | 0.4752 | 0.9399 | 0.9695 |
+ | No log | 10.88 | 272 | 0.9211 | 0.3964 | 0.9211 | 0.9598 |
+ | No log | 10.96 | 274 | 0.9080 | 0.5487 | 0.9080 | 0.9529 |
+ | No log | 11.04 | 276 | 0.9178 | 0.5659 | 0.9178 | 0.9580 |
+ | No log | 11.12 | 278 | 0.8954 | 0.5309 | 0.8954 | 0.9462 |
+ | No log | 11.2 | 280 | 0.8871 | 0.4906 | 0.8871 | 0.9419 |
+ | No log | 11.28 | 282 | 0.8858 | 0.4906 | 0.8859 | 0.9412 |
+ | No log | 11.36 | 284 | 0.8859 | 0.5481 | 0.8859 | 0.9412 |
+ | No log | 11.44 | 286 | 0.9024 | 0.5365 | 0.9024 | 0.9499 |
+ | No log | 11.52 | 288 | 0.8773 | 0.5396 | 0.8773 | 0.9366 |
+ | No log | 11.6 | 290 | 0.9639 | 0.4584 | 0.9639 | 0.9818 |
+ | No log | 11.68 | 292 | 1.1839 | 0.4045 | 1.1839 | 1.0881 |
+ | No log | 11.76 | 294 | 1.0923 | 0.4539 | 1.0923 | 1.0452 |
+ | No log | 11.84 | 296 | 0.8846 | 0.4121 | 0.8846 | 0.9405 |
+ | No log | 11.92 | 298 | 0.8180 | 0.5167 | 0.8180 | 0.9044 |
+ | No log | 12.0 | 300 | 0.8281 | 0.5352 | 0.8281 | 0.9100 |
+ | No log | 12.08 | 302 | 0.8326 | 0.5474 | 0.8326 | 0.9124 |
+ | No log | 12.16 | 304 | 0.8298 | 0.5271 | 0.8298 | 0.9109 |
+ | No log | 12.24 | 306 | 0.8612 | 0.4489 | 0.8612 | 0.9280 |
+ | No log | 12.32 | 308 | 0.8361 | 0.5271 | 0.8361 | 0.9144 |
+ | No log | 12.4 | 310 | 0.8313 | 0.5119 | 0.8313 | 0.9118 |
+ | No log | 12.48 | 312 | 0.8376 | 0.4916 | 0.8376 | 0.9152 |
+ | No log | 12.56 | 314 | 0.8494 | 0.4822 | 0.8494 | 0.9216 |
+ | No log | 12.64 | 316 | 0.8665 | 0.4916 | 0.8665 | 0.9309 |
+ | No log | 12.72 | 318 | 0.8867 | 0.4354 | 0.8867 | 0.9416 |
+ | No log | 12.8 | 320 | 0.8965 | 0.4384 | 0.8965 | 0.9468 |
+ | No log | 12.88 | 322 | 0.9436 | 0.3506 | 0.9436 | 0.9714 |
+ | No log | 12.96 | 324 | 0.9456 | 0.3812 | 0.9456 | 0.9724 |
+ | No log | 13.04 | 326 | 0.9105 | 0.3529 | 0.9105 | 0.9542 |
+ | No log | 13.12 | 328 | 0.8742 | 0.4902 | 0.8742 | 0.9350 |
+ | No log | 13.2 | 330 | 0.8775 | 0.5154 | 0.8775 | 0.9367 |
+ | No log | 13.28 | 332 | 0.8743 | 0.5045 | 0.8743 | 0.9350 |
+ | No log | 13.36 | 334 | 0.8612 | 0.4980 | 0.8612 | 0.9280 |
+ | No log | 13.44 | 336 | 0.8698 | 0.4935 | 0.8698 | 0.9326 |
+ | No log | 13.52 | 338 | 0.8800 | 0.4772 | 0.8800 | 0.9381 |
+ | No log | 13.6 | 340 | 0.8666 | 0.4978 | 0.8666 | 0.9309 |
+ | No log | 13.68 | 342 | 0.8425 | 0.4889 | 0.8425 | 0.9179 |
+ | No log | 13.76 | 344 | 0.8264 | 0.5530 | 0.8264 | 0.9090 |
+ | No log | 13.84 | 346 | 0.8303 | 0.5930 | 0.8303 | 0.9112 |
+ | No log | 13.92 | 348 | 0.8257 | 0.5528 | 0.8257 | 0.9087 |
+ | No log | 14.0 | 350 | 0.8379 | 0.5528 | 0.8379 | 0.9154 |
+ | No log | 14.08 | 352 | 0.8530 | 0.4949 | 0.8530 | 0.9236 |
+ | No log | 14.16 | 354 | 0.8673 | 0.5408 | 0.8673 | 0.9313 |
+ | No log | 14.24 | 356 | 0.8748 | 0.5011 | 0.8748 | 0.9353 |
+ | No log | 14.32 | 358 | 0.8989 | 0.4770 | 0.8989 | 0.9481 |
+ | No log | 14.4 | 360 | 0.9131 | 0.4286 | 0.9131 | 0.9556 |
+ | No log | 14.48 | 362 | 0.9373 | 0.4746 | 0.9373 | 0.9681 |
+ | No log | 14.56 | 364 | 0.9498 | 0.3756 | 0.9498 | 0.9746 |
+ | No log | 14.64 | 366 | 0.9547 | 0.3852 | 0.9547 | 0.9771 |
+ | No log | 14.72 | 368 | 0.9612 | 0.3319 | 0.9612 | 0.9804 |
+ | No log | 14.8 | 370 | 0.9781 | 0.3648 | 0.9781 | 0.9890 |
+ | No log | 14.88 | 372 | 0.9813 | 0.4435 | 0.9813 | 0.9906 |
+ | No log | 14.96 | 374 | 1.0039 | 0.4726 | 1.0039 | 1.0020 |
+ | No log | 15.04 | 376 | 1.0119 | 0.4389 | 1.0119 | 1.0060 |
+ | No log | 15.12 | 378 | 0.9737 | 0.3886 | 0.9737 | 0.9868 |
+ | No log | 15.2 | 380 | 1.0385 | 0.3897 | 1.0385 | 1.0191 |
+ | No log | 15.28 | 382 | 1.1741 | 0.3491 | 1.1741 | 1.0836 |
+ | No log | 15.36 | 384 | 1.1173 | 0.3416 | 1.1173 | 1.0570 |
+ | No log | 15.44 | 386 | 0.9577 | 0.3908 | 0.9577 | 0.9786 |
+ | No log | 15.52 | 388 | 0.8953 | 0.5073 | 0.8953 | 0.9462 |
+ | No log | 15.6 | 390 | 0.8943 | 0.5331 | 0.8943 | 0.9457 |
+ | No log | 15.68 | 392 | 0.8672 | 0.4789 | 0.8672 | 0.9312 |
+ | No log | 15.76 | 394 | 0.8889 | 0.3596 | 0.8889 | 0.9428 |
+ | No log | 15.84 | 396 | 0.9534 | 0.3805 | 0.9534 | 0.9764 |
+ | No log | 15.92 | 398 | 0.9341 | 0.3933 | 0.9341 | 0.9665 |
+ | No log | 16.0 | 400 | 0.8645 | 0.4690 | 0.8645 | 0.9298 |
+ | No log | 16.08 | 402 | 0.8444 | 0.5908 | 0.8444 | 0.9189 |
+ | No log | 16.16 | 404 | 0.8495 | 0.5966 | 0.8495 | 0.9217 |
+ | No log | 16.24 | 406 | 0.8507 | 0.4949 | 0.8507 | 0.9223 |
+ | No log | 16.32 | 408 | 0.8609 | 0.4949 | 0.8609 | 0.9278 |
+ | No log | 16.4 | 410 | 0.8813 | 0.4568 | 0.8813 | 0.9388 |
+ | No log | 16.48 | 412 | 0.8756 | 0.4949 | 0.8756 | 0.9357 |
+ | No log | 16.56 | 414 | 0.8769 | 0.4374 | 0.8769 | 0.9364 |
+ | No log | 16.64 | 416 | 0.8928 | 0.4672 | 0.8928 | 0.9449 |
+ | No log | 16.72 | 418 | 0.9017 | 0.5077 | 0.9017 | 0.9496 |
+ | No log | 16.8 | 420 | 0.8687 | 0.4161 | 0.8687 | 0.9320 |
+ | No log | 16.88 | 422 | 0.8636 | 0.4885 | 0.8636 | 0.9293 |
+ | No log | 16.96 | 424 | 0.8564 | 0.4978 | 0.8564 | 0.9254 |
+ | No log | 17.04 | 426 | 0.8505 | 0.4920 | 0.8505 | 0.9223 |
+ | No log | 17.12 | 428 | 0.8524 | 0.4724 | 0.8524 | 0.9233 |
+ | No log | 17.2 | 430 | 0.8439 | 0.4724 | 0.8439 | 0.9186 |
+ | No log | 17.28 | 432 | 0.8370 | 0.4980 | 0.8370 | 0.9149 |
+ | No log | 17.36 | 434 | 0.8717 | 0.5447 | 0.8717 | 0.9337 |
+ | No log | 17.44 | 436 | 0.8857 | 0.5324 | 0.8857 | 0.9411 |
+ | No log | 17.52 | 438 | 0.8557 | 0.4373 | 0.8557 | 0.9251 |
+ | No log | 17.6 | 440 | 0.8609 | 0.3886 | 0.8609 | 0.9279 |
+ | No log | 17.68 | 442 | 0.8822 | 0.4048 | 0.8822 | 0.9392 |
+ | No log | 17.76 | 444 | 0.8712 | 0.3796 | 0.8712 | 0.9334 |
+ | No log | 17.84 | 446 | 0.8732 | 0.3762 | 0.8732 | 0.9344 |
+ | No log | 17.92 | 448 | 0.8857 | 0.3519 | 0.8857 | 0.9411 |
+ | No log | 18.0 | 450 | 0.8744 | 0.3564 | 0.8744 | 0.9351 |
+ | No log | 18.08 | 452 | 0.8687 | 0.4514 | 0.8687 | 0.9320 |
+ | No log | 18.16 | 454 | 0.8979 | 0.4121 | 0.8979 | 0.9476 |
+ | No log | 18.24 | 456 | 0.9551 | 0.3928 | 0.9551 | 0.9773 |
+ | No log | 18.32 | 458 | 0.9562 | 0.3672 | 0.9562 | 0.9779 |
+ | No log | 18.4 | 460 | 0.9173 | 0.4121 | 0.9173 | 0.9577 |
+ | No log | 18.48 | 462 | 0.8816 | 0.3788 | 0.8816 | 0.9389 |
+ | No log | 18.56 | 464 | 0.8745 | 0.5080 | 0.8745 | 0.9352 |
+ | No log | 18.64 | 466 | 0.8811 | 0.5173 | 0.8811 | 0.9387 |
+ | No log | 18.72 | 468 | 0.8806 | 0.5498 | 0.8806 | 0.9384 |
+ | No log | 18.8 | 470 | 0.8789 | 0.5322 | 0.8789 | 0.9375 |
+ | No log | 18.88 | 472 | 0.8921 | 0.4592 | 0.8921 | 0.9445 |
+ | No log | 18.96 | 474 | 0.9214 | 0.3220 | 0.9214 | 0.9599 |
+ | No log | 19.04 | 476 | 0.9032 | 0.3819 | 0.9032 | 0.9504 |
+ | No log | 19.12 | 478 | 0.8638 | 0.4323 | 0.8638 | 0.9294 |
+ | No log | 19.2 | 480 | 0.8646 | 0.5716 | 0.8646 | 0.9298 |
+ | No log | 19.28 | 482 | 0.8631 | 0.5524 | 0.8631 | 0.9290 |
+ | No log | 19.36 | 484 | 0.8694 | 0.4064 | 0.8694 | 0.9324 |
+ | No log | 19.44 | 486 | 0.9272 | 0.3852 | 0.9272 | 0.9629 |
+ | No log | 19.52 | 488 | 0.9565 | 0.3255 | 0.9565 | 0.9780 |
+ | No log | 19.6 | 490 | 0.9059 | 0.4176 | 0.9059 | 0.9518 |
+ | No log | 19.68 | 492 | 0.8658 | 0.4568 | 0.8658 | 0.9305 |
+ | No log | 19.76 | 494 | 0.8724 | 0.4800 | 0.8724 | 0.9340 |
+ | No log | 19.84 | 496 | 0.9415 | 0.4014 | 0.9415 | 0.9703 |
+ | No log | 19.92 | 498 | 0.9647 | 0.4234 | 0.9647 | 0.9822 |
+ | 0.3396 | 20.0 | 500 | 0.8851 | 0.5159 | 0.8851 | 0.9408 |
+ | 0.3396 | 20.08 | 502 | 0.8338 | 0.5223 | 0.8338 | 0.9131 |
+ | 0.3396 | 20.16 | 504 | 0.8267 | 0.5038 | 0.8267 | 0.9092 |
+ | 0.3396 | 20.24 | 506 | 0.8343 | 0.4103 | 0.8343 | 0.9134 |
+ | 0.3396 | 20.32 | 508 | 0.8546 | 0.4454 | 0.8546 | 0.9244 |
+ | 0.3396 | 20.4 | 510 | 0.8377 | 0.4064 | 0.8377 | 0.9153 |
+ | 0.3396 | 20.48 | 512 | 0.8315 | 0.4290 | 0.8315 | 0.9119 |
+
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - Pytorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
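The configuration describes a single-label `BertForSequenceClassification` head with `problem_type: "regression"`, so the model emits one continuous score per input and is trained with an MSE loss. A small sanity-check sketch of that shape, built from the public base config only (the fine-tuned weights live in `model.safetensors` below):

```python
# Sketch: verify that this configuration yields a one-dimensional regression head.
import torch
from transformers import BertConfig, BertForSequenceClassification

config = BertConfig.from_pretrained("aubmindlab/bert-base-arabertv02",
                                    num_labels=1, problem_type="regression")
model = BertForSequenceClassification(config)  # randomly initialised; real weights come from model.safetensors

dummy = torch.ones(1, 8, dtype=torch.long)     # fake token-id batch, for shape checking only
print(model(input_ids=dummy).logits.shape)     # -> torch.Size([1, 1])
print(config.problem_type)                     # -> "regression"
```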
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fcda4c330b24610d12dc680d11bf4a9fb9d0f9c2f332f7a1491b02cdc3b20492
+ size 540799996
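This entry is a Git LFS pointer, not the weights themselves; the Hub client resolves it to the roughly 540 MB safetensors file. A hedged sketch of fetching the file and checking it against the sha256 recorded above (the repo id is an assumption based on the author's namespace):

```python
# Sketch: download the real weights and verify them against the LFS pointer's sha256.
import hashlib
from huggingface_hub import hf_hub_download

repo_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k6_task2_organization"
path = hf_hub_download(repo_id=repo_id, filename="model.safetensors")

h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

expected = "fcda4c330b24610d12dc680d11bf4a9fb9d0f9c2f332f7a1491b02cdc3b20492"
print(h.hexdigest() == expected)
```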
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d4c80b9f5ade59a4158572ed4fccee1ea797512c6b1dc99024675f60aea31cf
+ size 5240