MayBashendy committed
Commit 23affbe · verified · 1 Parent(s): db26717

Training in progress, step 500

Files changed (4):
  1. README.md +259 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,259 @@
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task5_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task5_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9145
- QWK (quadratic weighted kappa): 0.2624
- MSE: 0.9145
- RMSE: 0.9563

## Model description

More information needed

## Intended uses & limitations

More information needed

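Until the sections above are filled in, the following minimal inference sketch may be useful. It is not an official snippet: the repo id is inferred from the committer and model name, the tokenizer files are assumed to be available in the repo (otherwise load the base-model tokenizer), and the single regression logit is assumed to be the task-5 organization score.

```python
# Minimal inference sketch (assumptions: repo id, availability of tokenizer files,
# and that the single regression logit is the organization score on the task's scale).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task5_organization"  # assumed

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # fall back to "aubmindlab/bert-base-arabertv02" if absent
model = AutoModelForSequenceClassification.from_pretrained(repo_id)  # num_labels=1, problem_type="regression"
model.eval()

essay = "نص المقال العربي المراد تقييم تنظيمه يوضع هنا."  # placeholder essay text
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze(-1).item()
print(f"predicted organization score: {score:.3f}")
```
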
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

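These settings map onto `TrainingArguments` roughly as in the sketch below. It is a hedged reconstruction rather than the author's actual script: the output directory and the tiny dummy dataset are placeholders, and evaluation every 2 steps is inferred from the results table. The "No log" training-loss column below is consistent with the default `logging_steps=500` exceeding the run's 400 optimizer steps.

```python
# Hedged reconstruction of the training setup from the listed hyperparameters.
# The dummy dataset and output_dir are placeholders; the card does not name the real data.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=1, problem_type="regression")

def tokenize(batch):
    enc = tokenizer(batch["text"], truncation=True, max_length=512)
    enc["labels"] = [float(s) for s in batch["score"]]
    return enc

# Two toy essays so the sketch runs end to end; the real essays/scores are not in the card.
toy = Dataset.from_dict({"text": ["مثال لمقال عربي.", "مثال آخر."], "score": [3.0, 1.0]})
toy = toy.map(tokenize, batched=True, remove_columns=["text", "score"])

args = TrainingArguments(
    output_dir="arabert_task5_organization",  # assumed path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",   # the Adam betas/epsilon above are the optimizer defaults
    eval_strategy="steps",
    eval_steps=2,                 # inferred from the eval rows every 2 steps in the table below
)

trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
                  train_dataset=toy, eval_dataset=toy)
trainer.train()
```
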
### Training results

| Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.5 | 2 | 4.0958 | 0.0130 | 4.0958 | 2.0238 |
| No log | 1.0 | 4 | 2.1088 | 0.0357 | 2.1088 | 1.4522 |
| No log | 1.5 | 6 | 1.4657 | -0.0550 | 1.4657 | 1.2107 |
| No log | 2.0 | 8 | 1.2147 | 0.2441 | 1.2147 | 1.1021 |
| No log | 2.5 | 10 | 1.1574 | 0.2217 | 1.1574 | 1.0758 |
| No log | 3.0 | 12 | 1.1017 | 0.2265 | 1.1017 | 1.0496 |
| No log | 3.5 | 14 | 1.1114 | 0.2492 | 1.1114 | 1.0542 |
| No log | 4.0 | 16 | 1.1285 | 0.2204 | 1.1285 | 1.0623 |
| No log | 4.5 | 18 | 1.2481 | 0.1682 | 1.2481 | 1.1172 |
| No log | 5.0 | 20 | 1.1151 | 0.2813 | 1.1151 | 1.0560 |
| No log | 5.5 | 22 | 1.6223 | -0.0968 | 1.6223 | 1.2737 |
| No log | 6.0 | 24 | 2.2020 | -0.2805 | 2.2020 | 1.4839 |
| No log | 6.5 | 26 | 1.8058 | -0.0775 | 1.8058 | 1.3438 |
| No log | 7.0 | 28 | 1.2023 | 0.1943 | 1.2023 | 1.0965 |
| No log | 7.5 | 30 | 1.1299 | 0.3312 | 1.1299 | 1.0630 |
| No log | 8.0 | 32 | 1.3970 | 0.2539 | 1.3970 | 1.1819 |
| No log | 8.5 | 34 | 1.5870 | 0.1160 | 1.5870 | 1.2598 |
| No log | 9.0 | 36 | 1.4716 | 0.1084 | 1.4716 | 1.2131 |
| No log | 9.5 | 38 | 1.3242 | 0.2199 | 1.3242 | 1.1508 |
| No log | 10.0 | 40 | 1.2099 | 0.3089 | 1.2099 | 1.1000 |
| No log | 10.5 | 42 | 1.4118 | 0.1080 | 1.4118 | 1.1882 |
| No log | 11.0 | 44 | 1.4426 | 0.1538 | 1.4426 | 1.2011 |
| No log | 11.5 | 46 | 1.2015 | 0.2494 | 1.2015 | 1.0961 |
| No log | 12.0 | 48 | 1.0515 | 0.3367 | 1.0515 | 1.0254 |
| No log | 12.5 | 50 | 1.0381 | 0.3485 | 1.0381 | 1.0189 |
| No log | 13.0 | 52 | 1.0968 | 0.2477 | 1.0968 | 1.0473 |
| No log | 13.5 | 54 | 1.1887 | 0.2545 | 1.1887 | 1.0903 |
| No log | 14.0 | 56 | 1.1212 | 0.1827 | 1.1212 | 1.0589 |
| No log | 14.5 | 58 | 1.0283 | 0.3030 | 1.0283 | 1.0141 |
| No log | 15.0 | 60 | 1.0368 | 0.3314 | 1.0368 | 1.0182 |
| No log | 15.5 | 62 | 1.1286 | 0.2312 | 1.1286 | 1.0624 |
| No log | 16.0 | 64 | 1.3279 | 0.1397 | 1.3279 | 1.1524 |
| No log | 16.5 | 66 | 1.3659 | 0.1628 | 1.3659 | 1.1687 |
| No log | 17.0 | 68 | 1.2715 | 0.2165 | 1.2715 | 1.1276 |
| No log | 17.5 | 70 | 1.1731 | 0.2593 | 1.1731 | 1.0831 |
| No log | 18.0 | 72 | 1.0433 | 0.2769 | 1.0433 | 1.0214 |
| No log | 18.5 | 74 | 0.9589 | 0.3236 | 0.9589 | 0.9792 |
| No log | 19.0 | 76 | 1.0167 | 0.4044 | 1.0167 | 1.0083 |
| No log | 19.5 | 78 | 1.0336 | 0.4168 | 1.0336 | 1.0166 |
| No log | 20.0 | 80 | 0.9569 | 0.3506 | 0.9569 | 0.9782 |
| No log | 20.5 | 82 | 0.9108 | 0.3236 | 0.9108 | 0.9543 |
| No log | 21.0 | 84 | 0.9356 | 0.3682 | 0.9356 | 0.9672 |
| No log | 21.5 | 86 | 1.0095 | 0.3706 | 1.0095 | 1.0047 |
| No log | 22.0 | 88 | 0.9893 | 0.3706 | 0.9893 | 0.9947 |
| No log | 22.5 | 90 | 1.0141 | 0.3424 | 1.0141 | 1.0070 |
| No log | 23.0 | 92 | 1.0610 | 0.3188 | 1.0610 | 1.0301 |
| No log | 23.5 | 94 | 0.9901 | 0.2842 | 0.9901 | 0.9951 |
| No log | 24.0 | 96 | 0.9455 | 0.2988 | 0.9455 | 0.9723 |
| No log | 24.5 | 98 | 0.9279 | 0.2767 | 0.9279 | 0.9633 |
| No log | 25.0 | 100 | 0.9208 | 0.3563 | 0.9208 | 0.9596 |
| No log | 25.5 | 102 | 0.9102 | 0.3563 | 0.9102 | 0.9540 |
| No log | 26.0 | 104 | 0.8849 | 0.3074 | 0.8849 | 0.9407 |
| No log | 26.5 | 106 | 0.8817 | 0.3356 | 0.8817 | 0.9390 |
| No log | 27.0 | 108 | 0.8892 | 0.3757 | 0.8892 | 0.9430 |
| No log | 27.5 | 110 | 0.8918 | 0.3994 | 0.8918 | 0.9443 |
| No log | 28.0 | 112 | 0.9332 | 0.3842 | 0.9332 | 0.9660 |
| No log | 28.5 | 114 | 0.9544 | 0.3710 | 0.9544 | 0.9769 |
| No log | 29.0 | 116 | 0.9169 | 0.3682 | 0.9169 | 0.9575 |
| No log | 29.5 | 118 | 0.9172 | 0.3543 | 0.9172 | 0.9577 |
| No log | 30.0 | 120 | 0.9176 | 0.3485 | 0.9176 | 0.9579 |
| No log | 30.5 | 122 | 0.9086 | 0.3777 | 0.9086 | 0.9532 |
| No log | 31.0 | 124 | 0.9187 | 0.3011 | 0.9187 | 0.9585 |
| No log | 31.5 | 126 | 0.9541 | 0.3403 | 0.9541 | 0.9768 |
| No log | 32.0 | 128 | 1.0529 | 0.3107 | 1.0529 | 1.0261 |
| No log | 32.5 | 130 | 1.1441 | 0.2155 | 1.1441 | 1.0696 |
| No log | 33.0 | 132 | 1.1422 | 0.2815 | 1.1422 | 1.0687 |
| No log | 33.5 | 134 | 1.0143 | 0.3577 | 1.0143 | 1.0071 |
| No log | 34.0 | 136 | 0.9160 | 0.2910 | 0.9160 | 0.9571 |
| No log | 34.5 | 138 | 0.8996 | 0.2932 | 0.8996 | 0.9485 |
| No log | 35.0 | 140 | 0.9014 | 0.2910 | 0.9014 | 0.9494 |
| No log | 35.5 | 142 | 0.9701 | 0.3005 | 0.9701 | 0.9849 |
| No log | 36.0 | 144 | 1.0410 | 0.2773 | 1.0410 | 1.0203 |
| No log | 36.5 | 146 | 1.0420 | 0.2864 | 1.0420 | 1.0208 |
| No log | 37.0 | 148 | 1.0021 | 0.2908 | 1.0021 | 1.0011 |
| No log | 37.5 | 150 | 0.9308 | 0.3067 | 0.9308 | 0.9648 |
| No log | 38.0 | 152 | 0.8932 | 0.3258 | 0.8932 | 0.9451 |
| No log | 38.5 | 154 | 0.8902 | 0.3673 | 0.8902 | 0.9435 |
| No log | 39.0 | 156 | 0.8918 | 0.3258 | 0.8918 | 0.9443 |
| No log | 39.5 | 158 | 0.9036 | 0.2842 | 0.9036 | 0.9506 |
| No log | 40.0 | 160 | 0.9346 | 0.3802 | 0.9346 | 0.9668 |
| No log | 40.5 | 162 | 0.9444 | 0.3687 | 0.9444 | 0.9718 |
| No log | 41.0 | 164 | 0.9220 | 0.3250 | 0.9220 | 0.9602 |
| No log | 41.5 | 166 | 0.9585 | 0.3782 | 0.9585 | 0.9790 |
| No log | 42.0 | 168 | 0.9946 | 0.2963 | 0.9946 | 0.9973 |
| No log | 42.5 | 170 | 0.9980 | 0.2963 | 0.9980 | 0.9990 |
| No log | 43.0 | 172 | 0.9514 | 0.3515 | 0.9514 | 0.9754 |
| No log | 43.5 | 174 | 0.9069 | 0.2818 | 0.9069 | 0.9523 |
| No log | 44.0 | 176 | 0.9161 | 0.2818 | 0.9161 | 0.9571 |
| No log | 44.5 | 178 | 0.9337 | 0.2942 | 0.9337 | 0.9663 |
| No log | 45.0 | 180 | 0.9516 | 0.3515 | 0.9516 | 0.9755 |
| No log | 45.5 | 182 | 0.9614 | 0.3372 | 0.9614 | 0.9805 |
| No log | 46.0 | 184 | 0.9477 | 0.3551 | 0.9477 | 0.9735 |
| No log | 46.5 | 186 | 0.9227 | 0.3145 | 0.9227 | 0.9606 |
| No log | 47.0 | 188 | 0.9062 | 0.3145 | 0.9062 | 0.9520 |
| No log | 47.5 | 190 | 0.9216 | 0.3145 | 0.9216 | 0.9600 |
| No log | 48.0 | 192 | 0.9631 | 0.3551 | 0.9631 | 0.9814 |
| No log | 48.5 | 194 | 1.0191 | 0.3236 | 1.0191 | 1.0095 |
| No log | 49.0 | 196 | 1.0533 | 0.2750 | 1.0533 | 1.0263 |
| No log | 49.5 | 198 | 1.0194 | 0.2627 | 1.0194 | 1.0097 |
| No log | 50.0 | 200 | 0.9745 | 0.3351 | 0.9745 | 0.9872 |
| No log | 50.5 | 202 | 0.9417 | 0.2942 | 0.9417 | 0.9704 |
| No log | 51.0 | 204 | 0.9257 | 0.3067 | 0.9257 | 0.9621 |
| No log | 51.5 | 206 | 0.9159 | 0.3067 | 0.9159 | 0.9570 |
| No log | 52.0 | 208 | 0.9191 | 0.3067 | 0.9191 | 0.9587 |
| No log | 52.5 | 210 | 0.9323 | 0.3067 | 0.9323 | 0.9656 |
| No log | 53.0 | 212 | 0.9546 | 0.3372 | 0.9546 | 0.9770 |
| No log | 53.5 | 214 | 0.9710 | 0.2963 | 0.9710 | 0.9854 |
| No log | 54.0 | 216 | 0.9576 | 0.3372 | 0.9576 | 0.9786 |
| No log | 54.5 | 218 | 0.9197 | 0.2842 | 0.9197 | 0.9590 |
| No log | 55.0 | 220 | 0.9126 | 0.3817 | 0.9126 | 0.9553 |
| No log | 55.5 | 222 | 0.9278 | 0.2842 | 0.9278 | 0.9632 |
| No log | 56.0 | 224 | 0.9622 | 0.2547 | 0.9622 | 0.9809 |
| No log | 56.5 | 226 | 1.0209 | 0.2674 | 1.0209 | 1.0104 |
| No log | 57.0 | 228 | 1.0529 | 0.2983 | 1.0529 | 1.0261 |
| No log | 57.5 | 230 | 1.0316 | 0.2983 | 1.0316 | 1.0157 |
| No log | 58.0 | 232 | 1.0100 | 0.2819 | 1.0100 | 1.0050 |
| No log | 58.5 | 234 | 1.0056 | 0.2819 | 1.0056 | 1.0028 |
| No log | 59.0 | 236 | 0.9830 | 0.2819 | 0.9830 | 0.9915 |
| No log | 59.5 | 238 | 0.9913 | 0.2819 | 0.9913 | 0.9957 |
| No log | 60.0 | 240 | 1.0016 | 0.2674 | 1.0016 | 1.0008 |
| No log | 60.5 | 242 | 0.9681 | 0.3229 | 0.9681 | 0.9839 |
| No log | 61.0 | 244 | 0.9263 | 0.3485 | 0.9263 | 0.9625 |
| No log | 61.5 | 246 | 0.9083 | 0.3258 | 0.9083 | 0.9531 |
| No log | 62.0 | 248 | 0.9067 | 0.3652 | 0.9067 | 0.9522 |
| No log | 62.5 | 250 | 0.9102 | 0.3360 | 0.9102 | 0.9540 |
| No log | 63.0 | 252 | 0.9300 | 0.3637 | 0.9300 | 0.9644 |
| No log | 63.5 | 254 | 0.9786 | 0.2819 | 0.9786 | 0.9892 |
| No log | 64.0 | 256 | 1.0414 | 0.2384 | 1.0414 | 1.0205 |
| No log | 64.5 | 258 | 1.0787 | 0.1970 | 1.0787 | 1.0386 |
| No log | 65.0 | 260 | 1.0550 | 0.2701 | 1.0550 | 1.0271 |
| No log | 65.5 | 262 | 0.9978 | 0.2674 | 0.9978 | 0.9989 |
| No log | 66.0 | 264 | 0.9683 | 0.3229 | 0.9683 | 0.9840 |
| No log | 66.5 | 266 | 0.9412 | 0.2842 | 0.9412 | 0.9702 |
| No log | 67.0 | 268 | 0.9231 | 0.3403 | 0.9231 | 0.9608 |
| No log | 67.5 | 270 | 0.9241 | 0.3280 | 0.9241 | 0.9613 |
| No log | 68.0 | 272 | 0.9373 | 0.3258 | 0.9373 | 0.9682 |
| No log | 68.5 | 274 | 0.9361 | 0.3326 | 0.9361 | 0.9675 |
| No log | 69.0 | 276 | 0.9307 | 0.3209 | 0.9307 | 0.9647 |
| No log | 69.5 | 278 | 0.9407 | 0.3108 | 0.9407 | 0.9699 |
| No log | 70.0 | 280 | 0.9412 | 0.3108 | 0.9412 | 0.9701 |
| No log | 70.5 | 282 | 0.9330 | 0.3326 | 0.9330 | 0.9659 |
| No log | 71.0 | 284 | 0.9425 | 0.2931 | 0.9425 | 0.9708 |
| No log | 71.5 | 286 | 0.9463 | 0.2651 | 0.9463 | 0.9728 |
| No log | 72.0 | 288 | 0.9273 | 0.3151 | 0.9273 | 0.9630 |
| No log | 72.5 | 290 | 0.9079 | 0.3403 | 0.9079 | 0.9529 |
| No log | 73.0 | 292 | 0.9012 | 0.3403 | 0.9012 | 0.9493 |
| No log | 73.5 | 294 | 0.9014 | 0.3465 | 0.9014 | 0.9494 |
| No log | 74.0 | 296 | 0.9044 | 0.3326 | 0.9044 | 0.9510 |
| No log | 74.5 | 298 | 0.9049 | 0.3326 | 0.9049 | 0.9513 |
| No log | 75.0 | 300 | 0.9128 | 0.3011 | 0.9128 | 0.9554 |
| No log | 75.5 | 302 | 0.9148 | 0.3011 | 0.9148 | 0.9565 |
| No log | 76.0 | 304 | 0.9073 | 0.3223 | 0.9073 | 0.9525 |
| No log | 76.5 | 306 | 0.8963 | 0.3697 | 0.8963 | 0.9467 |
| No log | 77.0 | 308 | 0.8897 | 0.3403 | 0.8897 | 0.9433 |
| No log | 77.5 | 310 | 0.8901 | 0.3817 | 0.8901 | 0.9435 |
| No log | 78.0 | 312 | 0.8940 | 0.3403 | 0.8940 | 0.9455 |
| No log | 78.5 | 314 | 0.9012 | 0.2988 | 0.9012 | 0.9493 |
| No log | 79.0 | 316 | 0.9176 | 0.2842 | 0.9176 | 0.9579 |
| No log | 79.5 | 318 | 0.9270 | 0.2842 | 0.9270 | 0.9628 |
| No log | 80.0 | 320 | 0.9384 | 0.3372 | 0.9384 | 0.9687 |
| No log | 80.5 | 322 | 0.9485 | 0.2963 | 0.9485 | 0.9739 |
| No log | 81.0 | 324 | 0.9675 | 0.3196 | 0.9675 | 0.9836 |
| No log | 81.5 | 326 | 1.0022 | 0.2988 | 1.0022 | 1.0011 |
| No log | 82.0 | 328 | 1.0185 | 0.2988 | 1.0185 | 1.0092 |
| No log | 82.5 | 330 | 1.0296 | 0.3006 | 1.0296 | 1.0147 |
| No log | 83.0 | 332 | 1.0237 | 0.2857 | 1.0237 | 1.0118 |
| No log | 83.5 | 334 | 1.0220 | 0.2857 | 1.0220 | 1.0109 |
| No log | 84.0 | 336 | 1.0032 | 0.2988 | 1.0032 | 1.0016 |
| No log | 84.5 | 338 | 0.9765 | 0.2908 | 0.9765 | 0.9882 |
| No log | 85.0 | 340 | 0.9550 | 0.2647 | 0.9550 | 0.9772 |
| No log | 85.5 | 342 | 0.9335 | 0.3360 | 0.9335 | 0.9662 |
| No log | 86.0 | 344 | 0.9179 | 0.3258 | 0.9179 | 0.9581 |
| No log | 86.5 | 346 | 0.9094 | 0.3817 | 0.9094 | 0.9536 |
| No log | 87.0 | 348 | 0.9048 | 0.3837 | 0.9048 | 0.9512 |
| No log | 87.5 | 350 | 0.9026 | 0.3837 | 0.9026 | 0.9500 |
| No log | 88.0 | 352 | 0.9046 | 0.3693 | 0.9046 | 0.9511 |
| No log | 88.5 | 354 | 0.9094 | 0.3258 | 0.9094 | 0.9536 |
| No log | 89.0 | 356 | 0.9135 | 0.2842 | 0.9135 | 0.9558 |
| No log | 89.5 | 358 | 0.9165 | 0.2842 | 0.9165 | 0.9573 |
| No log | 90.0 | 360 | 0.9164 | 0.2842 | 0.9164 | 0.9573 |
| No log | 90.5 | 362 | 0.9147 | 0.3151 | 0.9147 | 0.9564 |
| No log | 91.0 | 364 | 0.9111 | 0.2624 | 0.9111 | 0.9545 |
| No log | 91.5 | 366 | 0.9086 | 0.3030 | 0.9086 | 0.9532 |
| No log | 92.0 | 368 | 0.9046 | 0.3314 | 0.9046 | 0.9511 |
| No log | 92.5 | 370 | 0.9048 | 0.3314 | 0.9048 | 0.9512 |
| No log | 93.0 | 372 | 0.9071 | 0.3172 | 0.9071 | 0.9524 |
| No log | 93.5 | 374 | 0.9077 | 0.3172 | 0.9077 | 0.9528 |
| No log | 94.0 | 376 | 0.9100 | 0.3326 | 0.9100 | 0.9540 |
| No log | 94.5 | 378 | 0.9118 | 0.3326 | 0.9118 | 0.9549 |
| No log | 95.0 | 380 | 0.9144 | 0.2931 | 0.9144 | 0.9562 |
| No log | 95.5 | 382 | 0.9157 | 0.3306 | 0.9157 | 0.9569 |
| No log | 96.0 | 384 | 0.9162 | 0.3306 | 0.9162 | 0.9572 |
| No log | 96.5 | 386 | 0.9154 | 0.3306 | 0.9154 | 0.9568 |
| No log | 97.0 | 388 | 0.9151 | 0.3008 | 0.9151 | 0.9566 |
| No log | 97.5 | 390 | 0.9157 | 0.3008 | 0.9157 | 0.9569 |
| No log | 98.0 | 392 | 0.9157 | 0.3008 | 0.9157 | 0.9569 |
| No log | 98.5 | 394 | 0.9154 | 0.3008 | 0.9154 | 0.9568 |
| No log | 99.0 | 396 | 0.9151 | 0.3008 | 0.9151 | 0.9566 |
| No log | 99.5 | 398 | 0.9148 | 0.3008 | 0.9148 | 0.9564 |
| No log | 100.0 | 400 | 0.9145 | 0.2624 | 0.9145 | 0.9563 |

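For reference, the QWK / MSE / RMSE columns can be recomputed from predictions and gold scores roughly as in the sketch below. How the continuous regression outputs were discretized before computing kappa is not stated in the card, so the rounding-and-clipping step is an assumption.

```python
# Hedged sketch of the evaluation metrics (QWK, MSE, RMSE) for regression outputs.
# The discretization used for kappa is an assumption, not taken from the card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def qwk_mse_rmse(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # Quadratic weighted kappa needs discrete categories; round and clip to the label range.
    lo, hi = int(labels.min()), int(labels.max())
    preds_int = np.clip(np.rint(preds), lo, hi).astype(int)
    qwk = cohen_kappa_score(labels.astype(int), preds_int, weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Example with made-up numbers (not from the card):
print(qwk_mse_rmse(np.array([2.2, 3.8, 1.1]), np.array([2.0, 4.0, 1.0])))
```
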
### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
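The config pins the head to single-output regression (one `id2label` entry, `problem_type: "regression"`). A quick check like the one below, assuming a local copy of the file above, shows how `transformers` will interpret it.

```python
# Sketch: inspect the added config (local path assumed; a repo id works the same way).
from transformers import AutoConfig

config = AutoConfig.from_pretrained("config.json")
print(config.model_type, config.num_labels, config.problem_type)
# expected: "bert", 1 (from the single id2label entry), "regression"
```
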
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:21bb192d47063b7699d486dffc5130da2291748882274465916b1d1ff45632fc
size 540799996
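model.safetensors is stored as a Git LFS pointer, so the lines above record only the hash and size of the roughly 540 MB weights file. A sketch for downloading the real file and verifying it against the pointer's sha256 (repo id assumed) follows.

```python
# Sketch: fetch the weights and verify them against the LFS pointer's sha256 (repo id assumed).
import hashlib
from huggingface_hub import hf_hub_download

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task5_organization"
path = hf_hub_download(repo_id=repo_id, filename="model.safetensors")

h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)
print(h.hexdigest() == "21bb192d47063b7699d486dffc5130da2291748882274465916b1d1ff45632fc")
```
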
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a19c84a6c9e0d95b1e41e24bea916a1f00683e2960f3adcd82a93c7aa6b77499
size 5368