MayBashendy committed (verified)
Commit 842d4f4 · 1 Parent(s): 6c6f5b0

Training in progress, step 500

Files changed (4):
  1. README.md +318 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,318 @@
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task5_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task5_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8595
- Qwk: 0.3596
- Mse: 0.8595
- Rmse: 0.9271
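
Since the accompanying config.json declares a `BertForSequenceClassification` head with `problem_type: "regression"` and a single label, the checkpoint returns one scalar (the organization score) per input. A minimal inference sketch, assuming the checkpoint is hosted under the repo id below and using a placeholder Arabic input:

```python
# A minimal inference sketch. Assumptions: the repo id below is where this
# checkpoint lives, and the Arabic sentence is purely illustrative.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "هذا نص تجريبي لمقال."  # placeholder essay text
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 1): single-label regression head

print(f"predicted organization score: {logits.squeeze().item():.4f}")
```

Note that aubmindlab recommend preprocessing text with the `ArabertPreprocessor` from their `arabert` package before tokenization; whether that step was applied during training is not recorded in this commit.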

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
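
A sketch of how these settings map onto `TrainingArguments` in Transformers 4.44.2. The model, datasets, and metric wiring are omitted, and the eval/save cadence is inferred from the results table below and the step-500 checkpoint in this commit rather than stated anywhere in the card:

```python
# Sketch of the hyperparameters above as TrainingArguments (Transformers 4.44.2).
# `train_ds`/`eval_ds` and `model` are placeholders; the Adam betas/epsilon
# listed above are the library defaults, so they are not set explicitly.
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task5_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    eval_strategy="steps",  # inferred: the table below evaluates every 2 steps
    eval_steps=2,
    save_steps=500,         # inferred from the "step 500" checkpoint in this commit
)
# Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds).train()
```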

### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.0465 | 2 | 3.8826 | 0.0078 | 3.8826 | 1.9704 |
| No log | 0.0930 | 4 | 2.0712 | 0.0440 | 2.0712 | 1.4392 |
| No log | 0.1395 | 6 | 1.8339 | 0.0916 | 1.8339 | 1.3542 |
| No log | 0.1860 | 8 | 1.9781 | 0.1164 | 1.9781 | 1.4065 |
| No log | 0.2326 | 10 | 1.1698 | 0.1952 | 1.1698 | 1.0816 |
| No log | 0.2791 | 12 | 0.9928 | 0.4866 | 0.9928 | 0.9964 |
| No log | 0.3256 | 14 | 1.0272 | 0.2465 | 1.0272 | 1.0135 |
| No log | 0.3721 | 16 | 1.0953 | 0.2416 | 1.0953 | 1.0466 |
| No log | 0.4186 | 18 | 1.1547 | 0.1500 | 1.1547 | 1.0746 |
| No log | 0.4651 | 20 | 1.3264 | 0.1347 | 1.3264 | 1.1517 |
| No log | 0.5116 | 22 | 1.1965 | 0.2273 | 1.1965 | 1.0938 |
| No log | 0.5581 | 24 | 1.0025 | 0.3063 | 1.0025 | 1.0012 |
| No log | 0.6047 | 26 | 1.0285 | 0.2812 | 1.0285 | 1.0142 |
| No log | 0.6512 | 28 | 1.0618 | 0.2503 | 1.0618 | 1.0304 |
| No log | 0.6977 | 30 | 1.1455 | 0.2530 | 1.1455 | 1.0703 |
| No log | 0.7442 | 32 | 1.1444 | 0.2056 | 1.1444 | 1.0698 |
| No log | 0.7907 | 34 | 1.1814 | 0.1624 | 1.1814 | 1.0869 |
| No log | 0.8372 | 36 | 1.2072 | 0.0778 | 1.2072 | 1.0987 |
| No log | 0.8837 | 38 | 1.5230 | 0.0872 | 1.5230 | 1.2341 |
| No log | 0.9302 | 40 | 2.1401 | 0.0916 | 2.1401 | 1.4629 |
| No log | 0.9767 | 42 | 2.2156 | 0.0626 | 2.2156 | 1.4885 |
| No log | 1.0233 | 44 | 1.9804 | 0.0633 | 1.9804 | 1.4073 |
| No log | 1.0698 | 46 | 1.5622 | 0.1038 | 1.5622 | 1.2499 |
| No log | 1.1163 | 48 | 1.2749 | 0.2098 | 1.2749 | 1.1291 |
| No log | 1.1628 | 50 | 1.1360 | 0.2138 | 1.1360 | 1.0658 |
| No log | 1.2093 | 52 | 1.0856 | 0.2158 | 1.0856 | 1.0419 |
| No log | 1.2558 | 54 | 1.0524 | 0.1720 | 1.0524 | 1.0259 |
| No log | 1.3023 | 56 | 1.0579 | 0.3117 | 1.0579 | 1.0285 |
| No log | 1.3488 | 58 | 1.3499 | 0.2800 | 1.3499 | 1.1619 |
| No log | 1.3953 | 60 | 1.9807 | 0.1086 | 1.9807 | 1.4074 |
| No log | 1.4419 | 62 | 1.8260 | 0.1270 | 1.8260 | 1.3513 |
| No log | 1.4884 | 64 | 1.2954 | 0.2101 | 1.2954 | 1.1382 |
| No log | 1.5349 | 66 | 1.0404 | 0.2151 | 1.0404 | 1.0200 |
| No log | 1.5814 | 68 | 1.0131 | 0.2569 | 1.0131 | 1.0065 |
| No log | 1.6279 | 70 | 1.0325 | 0.2472 | 1.0325 | 1.0161 |
| No log | 1.6744 | 72 | 1.1190 | 0.1324 | 1.1190 | 1.0578 |
| No log | 1.7209 | 74 | 1.1506 | 0.1645 | 1.1506 | 1.0727 |
| No log | 1.7674 | 76 | 1.1932 | 0.1512 | 1.1932 | 1.0923 |
| No log | 1.8140 | 78 | 1.2090 | 0.1512 | 1.2090 | 1.0995 |
| No log | 1.8605 | 80 | 1.2225 | 0.1976 | 1.2225 | 1.1057 |
| No log | 1.9070 | 82 | 1.3905 | 0.2303 | 1.3905 | 1.1792 |
| No log | 1.9535 | 84 | 1.4168 | 0.2817 | 1.4168 | 1.1903 |
| No log | 2.0 | 86 | 1.2702 | 0.3893 | 1.2702 | 1.1270 |
| No log | 2.0465 | 88 | 1.1746 | 0.3590 | 1.1746 | 1.0838 |
| No log | 2.0930 | 90 | 1.1246 | 0.3302 | 1.1246 | 1.0605 |
| No log | 2.1395 | 92 | 1.1285 | 0.3460 | 1.1285 | 1.0623 |
| No log | 2.1860 | 94 | 1.1106 | 0.3617 | 1.1106 | 1.0539 |
| No log | 2.2326 | 96 | 1.0588 | 0.3042 | 1.0588 | 1.0290 |
| No log | 2.2791 | 98 | 1.0435 | 0.2738 | 1.0435 | 1.0215 |
| No log | 2.3256 | 100 | 1.0389 | 0.2025 | 1.0389 | 1.0193 |
| No log | 2.3721 | 102 | 1.0391 | 0.2025 | 1.0391 | 1.0193 |
| No log | 2.4186 | 104 | 1.0497 | 0.2357 | 1.0497 | 1.0246 |
| No log | 2.4651 | 106 | 1.0727 | 0.2738 | 1.0727 | 1.0357 |
| No log | 2.5116 | 108 | 1.0950 | 0.3042 | 1.0950 | 1.0464 |
| No log | 2.5581 | 110 | 1.0762 | 0.3406 | 1.0762 | 1.0374 |
| No log | 2.6047 | 112 | 1.0446 | 0.3613 | 1.0446 | 1.0220 |
| No log | 2.6512 | 114 | 1.0308 | 0.4058 | 1.0308 | 1.0153 |
| No log | 2.6977 | 116 | 1.0369 | 0.4239 | 1.0369 | 1.0183 |
| No log | 2.7442 | 118 | 1.0206 | 0.4717 | 1.0206 | 1.0102 |
| No log | 2.7907 | 120 | 0.9993 | 0.4286 | 0.9993 | 0.9996 |
| No log | 2.8372 | 122 | 1.0235 | 0.3958 | 1.0235 | 1.0117 |
| No log | 2.8837 | 124 | 1.0580 | 0.3902 | 1.0580 | 1.0286 |
| No log | 2.9302 | 126 | 0.9802 | 0.3713 | 0.9802 | 0.9900 |
| No log | 2.9767 | 128 | 1.0006 | 0.4351 | 1.0006 | 1.0003 |
| No log | 3.0233 | 130 | 0.9925 | 0.4344 | 0.9925 | 0.9962 |
| No log | 3.0698 | 132 | 0.9743 | 0.3678 | 0.9743 | 0.9871 |
| No log | 3.1163 | 134 | 0.9863 | 0.3802 | 0.9863 | 0.9931 |
| No log | 3.1628 | 136 | 1.0130 | 0.4045 | 1.0130 | 1.0065 |
| No log | 3.2093 | 138 | 1.0842 | 0.4416 | 1.0842 | 1.0413 |
| No log | 3.2558 | 140 | 1.1389 | 0.4057 | 1.1389 | 1.0672 |
| No log | 3.3023 | 142 | 1.0393 | 0.4519 | 1.0393 | 1.0195 |
| No log | 3.3488 | 144 | 0.9882 | 0.4567 | 0.9882 | 0.9941 |
| No log | 3.3953 | 146 | 1.1066 | 0.4706 | 1.1066 | 1.0519 |
| No log | 3.4419 | 148 | 1.1080 | 0.4603 | 1.1080 | 1.0526 |
| No log | 3.4884 | 150 | 0.9418 | 0.4062 | 0.9418 | 0.9705 |
| No log | 3.5349 | 152 | 0.9196 | 0.4138 | 0.9196 | 0.9589 |
| No log | 3.5814 | 154 | 0.9900 | 0.3840 | 0.9900 | 0.9950 |
| No log | 3.6279 | 156 | 1.1182 | 0.4355 | 1.1182 | 1.0574 |
| No log | 3.6744 | 158 | 1.1696 | 0.4056 | 1.1696 | 1.0815 |
| No log | 3.7209 | 160 | 1.1858 | 0.5153 | 1.1858 | 1.0889 |
| No log | 3.7674 | 162 | 1.5207 | 0.3141 | 1.5207 | 1.2332 |
| No log | 3.8140 | 164 | 1.6225 | 0.3390 | 1.6225 | 1.2738 |
| No log | 3.8605 | 166 | 1.3155 | 0.2982 | 1.3155 | 1.1470 |
| No log | 3.9070 | 168 | 1.0391 | 0.4685 | 1.0391 | 1.0194 |
| No log | 3.9535 | 170 | 0.9703 | 0.5060 | 0.9703 | 0.9850 |
| No log | 4.0 | 172 | 0.9581 | 0.4358 | 0.9581 | 0.9789 |
| No log | 4.0465 | 174 | 0.9287 | 0.4114 | 0.9287 | 0.9637 |
| No log | 4.0930 | 176 | 0.9172 | 0.4572 | 0.9172 | 0.9577 |
| No log | 4.1395 | 178 | 0.9015 | 0.4948 | 0.9015 | 0.9495 |
| No log | 4.1860 | 180 | 0.9275 | 0.4657 | 0.9275 | 0.9631 |
| No log | 4.2326 | 182 | 0.9387 | 0.4570 | 0.9387 | 0.9688 |
| No log | 4.2791 | 184 | 0.8812 | 0.4731 | 0.8812 | 0.9387 |
| No log | 4.3256 | 186 | 0.9410 | 0.3875 | 0.9410 | 0.9700 |
| No log | 4.3721 | 188 | 1.1095 | 0.4792 | 1.1095 | 1.0533 |
| No log | 4.4186 | 190 | 1.1052 | 0.4695 | 1.1052 | 1.0513 |
| No log | 4.4651 | 192 | 0.9941 | 0.3437 | 0.9941 | 0.9971 |
| No log | 4.5116 | 194 | 0.9563 | 0.2785 | 0.9563 | 0.9779 |
| No log | 4.5581 | 196 | 1.0133 | 0.3192 | 1.0133 | 1.0066 |
| No log | 4.6047 | 198 | 1.0122 | 0.2892 | 1.0122 | 1.0061 |
| No log | 4.6512 | 200 | 0.9783 | 0.2338 | 0.9783 | 0.9891 |
| No log | 4.6977 | 202 | 0.9664 | 0.2574 | 0.9664 | 0.9831 |
| No log | 4.7442 | 204 | 0.9442 | 0.2693 | 0.9442 | 0.9717 |
| No log | 4.7907 | 206 | 0.9410 | 0.3647 | 0.9410 | 0.9700 |
| No log | 4.8372 | 208 | 0.9069 | 0.3671 | 0.9069 | 0.9523 |
| No log | 4.8837 | 210 | 0.8733 | 0.3781 | 0.8733 | 0.9345 |
| No log | 4.9302 | 212 | 0.8615 | 0.3915 | 0.8615 | 0.9282 |
| No log | 4.9767 | 214 | 0.8662 | 0.4012 | 0.8662 | 0.9307 |
| No log | 5.0233 | 216 | 0.9264 | 0.5070 | 0.9264 | 0.9625 |
| No log | 5.0698 | 218 | 0.9382 | 0.5070 | 0.9382 | 0.9686 |
| No log | 5.1163 | 220 | 0.8878 | 0.4391 | 0.8878 | 0.9423 |
| No log | 5.1628 | 222 | 0.8854 | 0.3154 | 0.8854 | 0.9409 |
| No log | 5.2093 | 224 | 0.9413 | 0.3783 | 0.9413 | 0.9702 |
| No log | 5.2558 | 226 | 0.9243 | 0.3664 | 0.9243 | 0.9614 |
| No log | 5.3023 | 228 | 0.9028 | 0.4270 | 0.9028 | 0.9502 |
| No log | 5.3488 | 230 | 0.9116 | 0.4134 | 0.9116 | 0.9548 |
| No log | 5.3953 | 232 | 0.9151 | 0.4134 | 0.9151 | 0.9566 |
| No log | 5.4419 | 234 | 0.9240 | 0.3957 | 0.9240 | 0.9613 |
| No log | 5.4884 | 236 | 0.9029 | 0.4321 | 0.9029 | 0.9502 |
| No log | 5.5349 | 238 | 0.9534 | 0.4514 | 0.9534 | 0.9764 |
| No log | 5.5814 | 240 | 1.0161 | 0.4610 | 1.0161 | 1.0080 |
| No log | 5.6279 | 242 | 0.9859 | 0.4915 | 0.9859 | 0.9929 |
| No log | 5.6744 | 244 | 0.8727 | 0.4843 | 0.8727 | 0.9342 |
| No log | 5.7209 | 246 | 0.8498 | 0.4268 | 0.8498 | 0.9218 |
| No log | 5.7674 | 248 | 0.9071 | 0.3619 | 0.9071 | 0.9524 |
| No log | 5.8140 | 250 | 0.8972 | 0.3157 | 0.8972 | 0.9472 |
| No log | 5.8605 | 252 | 0.8908 | 0.2998 | 0.8908 | 0.9438 |
| No log | 5.9070 | 254 | 0.9525 | 0.3959 | 0.9525 | 0.9759 |
| No log | 5.9535 | 256 | 0.9973 | 0.3474 | 0.9973 | 0.9986 |
| No log | 6.0 | 258 | 0.9748 | 0.3841 | 0.9748 | 0.9873 |
| No log | 6.0465 | 260 | 0.9415 | 0.3188 | 0.9415 | 0.9703 |
| No log | 6.0930 | 262 | 0.9678 | 0.3841 | 0.9678 | 0.9838 |
| No log | 6.1395 | 264 | 0.9613 | 0.4141 | 0.9613 | 0.9805 |
| No log | 6.1860 | 266 | 0.9587 | 0.3747 | 0.9587 | 0.9791 |
| No log | 6.2326 | 268 | 1.0244 | 0.3282 | 1.0244 | 1.0121 |
| No log | 6.2791 | 270 | 1.1313 | 0.2904 | 1.1313 | 1.0636 |
| No log | 6.3256 | 272 | 1.0222 | 0.3025 | 1.0222 | 1.0110 |
| No log | 6.3721 | 274 | 0.9258 | 0.2963 | 0.9258 | 0.9622 |
| No log | 6.4186 | 276 | 0.8829 | 0.3152 | 0.8829 | 0.9396 |
| No log | 6.4651 | 278 | 0.8675 | 0.3004 | 0.8675 | 0.9314 |
| No log | 6.5116 | 280 | 0.8602 | 0.3838 | 0.8602 | 0.9275 |
| No log | 6.5581 | 282 | 0.9199 | 0.4471 | 0.9199 | 0.9591 |
| No log | 6.6047 | 284 | 0.9488 | 0.4586 | 0.9488 | 0.9741 |
| No log | 6.6512 | 286 | 0.9075 | 0.4231 | 0.9075 | 0.9526 |
| No log | 6.6977 | 288 | 0.8662 | 0.4553 | 0.8662 | 0.9307 |
| No log | 6.7442 | 290 | 0.8841 | 0.4838 | 0.8841 | 0.9403 |
| No log | 6.7907 | 292 | 0.8812 | 0.4824 | 0.8812 | 0.9387 |
| No log | 6.8372 | 294 | 0.8666 | 0.5016 | 0.8666 | 0.9309 |
| No log | 6.8837 | 296 | 0.8886 | 0.4209 | 0.8886 | 0.9426 |
| No log | 6.9302 | 298 | 0.9173 | 0.4041 | 0.9173 | 0.9577 |
| No log | 6.9767 | 300 | 0.8884 | 0.3192 | 0.8884 | 0.9425 |
| No log | 7.0233 | 302 | 0.8545 | 0.4192 | 0.8545 | 0.9244 |
| No log | 7.0698 | 304 | 0.8139 | 0.4241 | 0.8139 | 0.9022 |
| No log | 7.1163 | 306 | 0.7922 | 0.4547 | 0.7922 | 0.8901 |
| No log | 7.1628 | 308 | 0.7893 | 0.4883 | 0.7893 | 0.8884 |
| No log | 7.2093 | 310 | 0.7984 | 0.4629 | 0.7984 | 0.8935 |
| No log | 7.2558 | 312 | 0.8307 | 0.5220 | 0.8307 | 0.9114 |
| No log | 7.3023 | 314 | 0.8258 | 0.4743 | 0.8258 | 0.9087 |
| No log | 7.3488 | 316 | 0.8425 | 0.4743 | 0.8425 | 0.9179 |
| No log | 7.3953 | 318 | 0.8137 | 0.4378 | 0.8137 | 0.9020 |
| No log | 7.4419 | 320 | 0.7999 | 0.4251 | 0.7999 | 0.8944 |
| No log | 7.4884 | 322 | 0.8044 | 0.3941 | 0.8044 | 0.8969 |
| No log | 7.5349 | 324 | 0.8430 | 0.4327 | 0.8430 | 0.9182 |
| No log | 7.5814 | 326 | 0.8596 | 0.3780 | 0.8596 | 0.9271 |
| No log | 7.6279 | 328 | 0.8636 | 0.3175 | 0.8636 | 0.9293 |
| No log | 7.6744 | 330 | 0.8704 | 0.2643 | 0.8704 | 0.9330 |
| No log | 7.7209 | 332 | 0.8642 | 0.3071 | 0.8642 | 0.9296 |
| No log | 7.7674 | 334 | 0.8526 | 0.3117 | 0.8526 | 0.9234 |
| No log | 7.8140 | 336 | 0.8383 | 0.3301 | 0.8383 | 0.9156 |
| No log | 7.8605 | 338 | 0.8406 | 0.4012 | 0.8406 | 0.9168 |
| No log | 7.9070 | 340 | 0.8384 | 0.4271 | 0.8384 | 0.9156 |
| No log | 7.9535 | 342 | 0.8234 | 0.4329 | 0.8234 | 0.9074 |
| No log | 8.0 | 344 | 0.8183 | 0.5450 | 0.8183 | 0.9046 |
| No log | 8.0465 | 346 | 0.8179 | 0.5433 | 0.8179 | 0.9044 |
| No log | 8.0930 | 348 | 0.8404 | 0.4433 | 0.8404 | 0.9167 |
| No log | 8.1395 | 350 | 0.8262 | 0.4812 | 0.8262 | 0.9090 |
| No log | 8.1860 | 352 | 0.7726 | 0.5733 | 0.7726 | 0.8790 |
| No log | 8.2326 | 354 | 0.7498 | 0.4762 | 0.7498 | 0.8659 |
| No log | 8.2791 | 356 | 0.7581 | 0.4388 | 0.7581 | 0.8707 |
| No log | 8.3256 | 358 | 0.7797 | 0.5472 | 0.7797 | 0.8830 |
| No log | 8.3721 | 360 | 0.7903 | 0.5698 | 0.7903 | 0.8890 |
| No log | 8.4186 | 362 | 0.7873 | 0.3873 | 0.7873 | 0.8873 |
| No log | 8.4651 | 364 | 0.7986 | 0.4002 | 0.7986 | 0.8936 |
| No log | 8.5116 | 366 | 0.7924 | 0.3915 | 0.7924 | 0.8901 |
| No log | 8.5581 | 368 | 0.8115 | 0.4279 | 0.8115 | 0.9008 |
| No log | 8.6047 | 370 | 0.8072 | 0.4014 | 0.8072 | 0.8984 |
| No log | 8.6512 | 372 | 0.7952 | 0.4030 | 0.7952 | 0.8918 |
| No log | 8.6977 | 374 | 0.7743 | 0.3797 | 0.7743 | 0.8800 |
| No log | 8.7442 | 376 | 0.7953 | 0.3700 | 0.7953 | 0.8918 |
| No log | 8.7907 | 378 | 0.7859 | 0.3700 | 0.7859 | 0.8865 |
| No log | 8.8372 | 380 | 0.7662 | 0.3496 | 0.7662 | 0.8753 |
| No log | 8.8837 | 382 | 0.7892 | 0.4867 | 0.7892 | 0.8884 |
| No log | 8.9302 | 384 | 0.8076 | 0.4729 | 0.8076 | 0.8987 |
| No log | 8.9767 | 386 | 0.7704 | 0.4211 | 0.7704 | 0.8777 |
| No log | 9.0233 | 388 | 0.7811 | 0.3923 | 0.7811 | 0.8838 |
| No log | 9.0698 | 390 | 0.8018 | 0.3795 | 0.8018 | 0.8954 |
| No log | 9.1163 | 392 | 0.8471 | 0.4309 | 0.8471 | 0.9204 |
| No log | 9.1628 | 394 | 0.8573 | 0.3733 | 0.8573 | 0.9259 |
| No log | 9.2093 | 396 | 0.8127 | 0.3392 | 0.8127 | 0.9015 |
| No log | 9.2558 | 398 | 0.8274 | 0.3025 | 0.8274 | 0.9096 |
| No log | 9.3023 | 400 | 0.8452 | 0.3602 | 0.8452 | 0.9194 |
| No log | 9.3488 | 402 | 0.8242 | 0.3025 | 0.8242 | 0.9078 |
| No log | 9.3953 | 404 | 0.7870 | 0.3257 | 0.7870 | 0.8871 |
| No log | 9.4419 | 406 | 0.8058 | 0.4456 | 0.8058 | 0.8977 |
| No log | 9.4884 | 408 | 0.8365 | 0.3733 | 0.8365 | 0.9146 |
| No log | 9.5349 | 410 | 0.8004 | 0.4160 | 0.8004 | 0.8947 |
| No log | 9.5814 | 412 | 0.7776 | 0.3774 | 0.7776 | 0.8818 |
| No log | 9.6279 | 414 | 0.7710 | 0.3667 | 0.7710 | 0.8781 |
| No log | 9.6744 | 416 | 0.7688 | 0.3139 | 0.7688 | 0.8768 |
| No log | 9.7209 | 418 | 0.7715 | 0.4086 | 0.7715 | 0.8783 |
| No log | 9.7674 | 420 | 0.7926 | 0.4626 | 0.7926 | 0.8903 |
| No log | 9.8140 | 422 | 0.8198 | 0.4576 | 0.8198 | 0.9054 |
| No log | 9.8605 | 424 | 0.8009 | 0.5083 | 0.8009 | 0.8949 |
| No log | 9.9070 | 426 | 0.7568 | 0.5003 | 0.7568 | 0.8700 |
| No log | 9.9535 | 428 | 0.7292 | 0.3858 | 0.7292 | 0.8539 |
| No log | 10.0 | 430 | 0.7215 | 0.4151 | 0.7215 | 0.8494 |
| No log | 10.0465 | 432 | 0.7471 | 0.4643 | 0.7471 | 0.8643 |
| No log | 10.0930 | 434 | 0.7538 | 0.5495 | 0.7538 | 0.8682 |
| No log | 10.1395 | 436 | 0.7477 | 0.4411 | 0.7477 | 0.8647 |
| No log | 10.1860 | 438 | 0.7674 | 0.4411 | 0.7674 | 0.8760 |
| No log | 10.2326 | 440 | 0.7879 | 0.4626 | 0.7879 | 0.8876 |
| No log | 10.2791 | 442 | 0.8217 | 0.4730 | 0.8217 | 0.9065 |
| No log | 10.3256 | 444 | 0.8864 | 0.4433 | 0.8864 | 0.9415 |
| No log | 10.3721 | 446 | 0.8881 | 0.3902 | 0.8881 | 0.9424 |
| No log | 10.4186 | 448 | 0.8342 | 0.4746 | 0.8342 | 0.9134 |
| No log | 10.4651 | 450 | 0.8260 | 0.3961 | 0.8260 | 0.9088 |
| No log | 10.5116 | 452 | 0.8434 | 0.4241 | 0.8434 | 0.9184 |
| No log | 10.5581 | 454 | 0.8508 | 0.4626 | 0.8508 | 0.9224 |
| No log | 10.6047 | 456 | 0.8644 | 0.4490 | 0.8644 | 0.9297 |
| No log | 10.6512 | 458 | 0.8598 | 0.3961 | 0.8598 | 0.9273 |
| No log | 10.6977 | 460 | 0.8534 | 0.4086 | 0.8534 | 0.9238 |
| No log | 10.7442 | 462 | 0.8529 | 0.4746 | 0.8529 | 0.9235 |
| No log | 10.7907 | 464 | 0.8795 | 0.3822 | 0.8795 | 0.9378 |
| No log | 10.8372 | 466 | 0.8939 | 0.3822 | 0.8939 | 0.9455 |
| No log | 10.8837 | 468 | 0.8670 | 0.3799 | 0.8670 | 0.9311 |
| No log | 10.9302 | 470 | 0.8522 | 0.3178 | 0.8522 | 0.9231 |
| No log | 10.9767 | 472 | 0.8488 | 0.3178 | 0.8488 | 0.9213 |
| No log | 11.0233 | 474 | 0.8390 | 0.3178 | 0.8390 | 0.9160 |
| No log | 11.0698 | 476 | 0.8433 | 0.4204 | 0.8433 | 0.9183 |
| No log | 11.1163 | 478 | 0.8582 | 0.4198 | 0.8582 | 0.9264 |
| No log | 11.1628 | 480 | 0.8561 | 0.4579 | 0.8561 | 0.9253 |
| No log | 11.2093 | 482 | 0.8739 | 0.4579 | 0.8739 | 0.9348 |
| No log | 11.2558 | 484 | 0.8247 | 0.4612 | 0.8247 | 0.9081 |
| No log | 11.3023 | 486 | 0.8040 | 0.4355 | 0.8040 | 0.8966 |
| No log | 11.3488 | 488 | 0.8269 | 0.4355 | 0.8269 | 0.9093 |
| No log | 11.3953 | 490 | 0.8531 | 0.3476 | 0.8531 | 0.9236 |
| No log | 11.4419 | 492 | 0.8476 | 0.3476 | 0.8476 | 0.9206 |
| No log | 11.4884 | 494 | 0.8135 | 0.4051 | 0.8135 | 0.9020 |
| No log | 11.5349 | 496 | 0.7820 | 0.3998 | 0.7820 | 0.8843 |
| No log | 11.5814 | 498 | 0.7814 | 0.3877 | 0.7814 | 0.8840 |
| 0.3085 | 11.6279 | 500 | 0.7803 | 0.4151 | 0.7803 | 0.8833 |
| 0.3085 | 11.6744 | 502 | 0.7768 | 0.4305 | 0.7768 | 0.8813 |
| 0.3085 | 11.7209 | 504 | 0.8608 | 0.4708 | 0.8608 | 0.9278 |
| 0.3085 | 11.7674 | 506 | 0.9217 | 0.4902 | 0.9217 | 0.9600 |
| 0.3085 | 11.8140 | 508 | 0.8814 | 0.4593 | 0.8814 | 0.9388 |
| 0.3085 | 11.8605 | 510 | 0.8143 | 0.3879 | 0.8143 | 0.9024 |
| 0.3085 | 11.9070 | 512 | 0.8159 | 0.3877 | 0.8159 | 0.9033 |
| 0.3085 | 11.9535 | 514 | 0.8281 | 0.3590 | 0.8281 | 0.9100 |
| 0.3085 | 12.0 | 516 | 0.8218 | 0.3858 | 0.8218 | 0.9066 |
| 0.3085 | 12.0465 | 518 | 0.8595 | 0.3596 | 0.8595 | 0.9271 |
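
Two reading notes on the table: Validation Loss and Mse coincide, which is consistent with the MSE loss Transformers applies to single-label regression (Rmse is just its square root), and the Training Loss column shows "No log" until step 500 because losses are logged every 500 steps by default. Qwk is presumably quadratic weighted Cohen's kappa computed on predictions rounded to integer scores; a sketch under that assumption, since the actual compute_metrics code is not part of this commit:

```python
# Sketch of the eval metrics, assuming Qwk is quadratic weighted Cohen's kappa
# on predictions rounded to the nearest integer score (an assumption; the
# compute_metrics function used in training is not shown in this commit).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# e.g. compute_metrics(np.array([1.2, 2.8, 1.1]), np.array([1.0, 3.0, 2.0]))
```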

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
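
The single `id2label` entry plus `"problem_type": "regression"` is what makes the head emit one scalar score. A quick sanity-check sketch (repo id assumed, as before):

```python
# Quick sanity check of the head configuration above (repo id is an assumption).
from transformers import AutoConfig

config = AutoConfig.from_pretrained(
    "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task5_organization"
)
assert config.problem_type == "regression"
assert config.num_labels == 1  # one id2label entry -> a single scalar output
```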
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e4fbe9d085a357a2add1461b40416790284ad9a6a8f307c709edd3fd529be404
size 540799996
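
The three lines above are a Git LFS pointer, not the weights themselves; per the `size` field the real file is about 540 MB. A sketch of resolving it with `huggingface_hub` (repo id assumed, as before):

```python
# Fetch the actual weights file behind the LFS pointer above.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task5_organization",
    filename="model.safetensors",
)
print(local_path)  # path in the local Hugging Face cache
```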
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0b1636d329d8641e22a9540cd0a79724e81bbc75fee41244b24f16721759660e
size 5304