MayBashendy committed · Commit 266b772 (verified) · 1 Parent(s): 92ec94e

Training in progress, step 500

Files changed (4):
  1. README.md +316 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,316 @@
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k7_task2_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k7_task2_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9284
- Qwk (quadratic weighted kappa): 0.4514
- Mse: 0.9284
- Rmse: 0.9635

Loss and Mse coincide because the model is trained as a single-output regression (MSE objective; see `problem_type` in config.json below).

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a minimal `TrainingArguments` sketch follows this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

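The list above maps onto Hugging Face `TrainingArguments` roughly as sketched below. This is a reconstruction, not the original training script: the output directory, evaluation/save cadence, and dataset wiring are assumptions, and every omitted argument falls back to library defaults.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TrainingArguments

# Hyperparameters copied from the list above; everything else is assumed
# to have been left at the library defaults.
args = TrainingArguments(
    output_dir="arabert-task2-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# A single-label regression head matches the accompanying config.json
# (problem_type: "regression", one entry in id2label).
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,
    problem_type="regression",
)
tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")
```
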
### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0541 | 2 | 4.5464 | 0.0018 | 4.5464 | 2.1322 |
| No log | 0.1081 | 4 | 2.5874 | 0.0025 | 2.5874 | 1.6085 |
| No log | 0.1622 | 6 | 1.9243 | 0.0062 | 1.9243 | 1.3872 |
| No log | 0.2162 | 8 | 1.5219 | -0.0211 | 1.5219 | 1.2336 |
| No log | 0.2703 | 10 | 1.3021 | 0.0452 | 1.3021 | 1.1411 |
| No log | 0.3243 | 12 | 1.2545 | 0.1432 | 1.2545 | 1.1200 |
| No log | 0.3784 | 14 | 1.2282 | 0.1785 | 1.2282 | 1.1082 |
| No log | 0.4324 | 16 | 1.3112 | 0.1493 | 1.3112 | 1.1451 |
| No log | 0.4865 | 18 | 1.2261 | 0.2058 | 1.2261 | 1.1073 |
| No log | 0.5405 | 20 | 1.1855 | 0.2388 | 1.1855 | 1.0888 |
| No log | 0.5946 | 22 | 1.2676 | 0.2736 | 1.2676 | 1.1259 |
| No log | 0.6486 | 24 | 1.8758 | 0.1831 | 1.8758 | 1.3696 |
| No log | 0.7027 | 26 | 2.4966 | 0.1256 | 2.4966 | 1.5801 |
| No log | 0.7568 | 28 | 1.8298 | 0.2023 | 1.8298 | 1.3527 |
| No log | 0.8108 | 30 | 1.2078 | 0.2377 | 1.2078 | 1.0990 |
| No log | 0.8649 | 32 | 1.1048 | 0.2721 | 1.1048 | 1.0511 |
| No log | 0.9189 | 34 | 1.1013 | 0.2721 | 1.1013 | 1.0494 |
| No log | 0.9730 | 36 | 1.2300 | 0.2346 | 1.2300 | 1.1091 |
| No log | 1.0270 | 38 | 1.2607 | 0.2864 | 1.2607 | 1.1228 |
| No log | 1.0811 | 40 | 1.1874 | 0.2346 | 1.1874 | 1.0897 |
| No log | 1.1351 | 42 | 1.1316 | 0.2532 | 1.1316 | 1.0638 |
| No log | 1.1892 | 44 | 1.1438 | 0.2911 | 1.1438 | 1.0695 |
| No log | 1.2432 | 46 | 1.2910 | 0.2077 | 1.2910 | 1.1362 |
| No log | 1.2973 | 48 | 1.3371 | 0.2539 | 1.3371 | 1.1563 |
| No log | 1.3514 | 50 | 1.1343 | 0.4171 | 1.1343 | 1.0650 |
| No log | 1.4054 | 52 | 1.1738 | 0.4119 | 1.1738 | 1.0834 |
| No log | 1.4595 | 54 | 1.0629 | 0.3723 | 1.0629 | 1.0309 |
| No log | 1.5135 | 56 | 1.0118 | 0.3329 | 1.0118 | 1.0059 |
| No log | 1.5676 | 58 | 0.9913 | 0.3956 | 0.9913 | 0.9957 |
| No log | 1.6216 | 60 | 1.0151 | 0.3704 | 1.0151 | 1.0075 |
| No log | 1.6757 | 62 | 1.0961 | 0.3614 | 1.0961 | 1.0470 |
| No log | 1.7297 | 64 | 1.5191 | 0.2165 | 1.5191 | 1.2325 |
| No log | 1.7838 | 66 | 1.6980 | 0.2484 | 1.6980 | 1.3031 |
| No log | 1.8378 | 68 | 1.5859 | 0.2364 | 1.5859 | 1.2593 |
| No log | 1.8919 | 70 | 1.1989 | 0.2295 | 1.1989 | 1.0950 |
| No log | 1.9459 | 72 | 0.9973 | 0.3965 | 0.9973 | 0.9986 |
| No log | 2.0 | 74 | 0.9503 | 0.3965 | 0.9503 | 0.9748 |
| No log | 2.0541 | 76 | 0.9888 | 0.4121 | 0.9888 | 0.9944 |
| No log | 2.1081 | 78 | 1.2387 | 0.4052 | 1.2387 | 1.1130 |
| No log | 2.1622 | 80 | 1.4119 | 0.4167 | 1.4119 | 1.1882 |
| No log | 2.2162 | 82 | 1.0645 | 0.4597 | 1.0645 | 1.0317 |
| No log | 2.2703 | 84 | 0.9453 | 0.4489 | 0.9453 | 0.9723 |
| No log | 2.3243 | 86 | 1.0131 | 0.3317 | 1.0131 | 1.0065 |
| No log | 2.3784 | 88 | 1.0368 | 0.2921 | 1.0368 | 1.0183 |
| No log | 2.4324 | 90 | 0.9735 | 0.3738 | 0.9735 | 0.9867 |
| No log | 2.4865 | 92 | 1.0628 | 0.4406 | 1.0628 | 1.0309 |
| No log | 2.5405 | 94 | 1.1102 | 0.4048 | 1.1102 | 1.0537 |
| No log | 2.5946 | 96 | 1.0219 | 0.3154 | 1.0219 | 1.0109 |
| No log | 2.6486 | 98 | 1.0409 | 0.3084 | 1.0409 | 1.0202 |
| No log | 2.7027 | 100 | 1.0427 | 0.3176 | 1.0427 | 1.0211 |
| No log | 2.7568 | 102 | 1.0175 | 0.3750 | 1.0175 | 1.0087 |
| No log | 2.8108 | 104 | 1.1733 | 0.3763 | 1.1733 | 1.0832 |
| No log | 2.8649 | 106 | 1.0789 | 0.3731 | 1.0789 | 1.0387 |
| No log | 2.9189 | 108 | 0.9999 | 0.3327 | 0.9999 | 1.0000 |
| No log | 2.9730 | 110 | 1.0229 | 0.2995 | 1.0229 | 1.0114 |
| No log | 3.0270 | 112 | 0.9651 | 0.3293 | 0.9651 | 0.9824 |
| No log | 3.0811 | 114 | 1.0590 | 0.4459 | 1.0590 | 1.0291 |
| No log | 3.1351 | 116 | 1.4591 | 0.3216 | 1.4591 | 1.2079 |
| No log | 3.1892 | 118 | 1.4428 | 0.3216 | 1.4428 | 1.2012 |
| No log | 3.2432 | 120 | 1.1972 | 0.3763 | 1.1972 | 1.0941 |
| No log | 3.2973 | 122 | 1.0002 | 0.3886 | 1.0002 | 1.0001 |
| No log | 3.3514 | 124 | 0.9983 | 0.3855 | 0.9983 | 0.9992 |
| No log | 3.4054 | 126 | 1.1165 | 0.3776 | 1.1165 | 1.0566 |
| No log | 3.4595 | 128 | 1.6341 | 0.3016 | 1.6341 | 1.2783 |
| No log | 3.5135 | 130 | 1.7182 | 0.3230 | 1.7182 | 1.3108 |
| No log | 3.5676 | 132 | 1.2995 | 0.3884 | 1.2995 | 1.1400 |
| No log | 3.6216 | 134 | 1.1280 | 0.4327 | 1.1280 | 1.0621 |
| No log | 3.6757 | 136 | 1.2568 | 0.3192 | 1.2568 | 1.1211 |
| No log | 3.7297 | 138 | 1.1878 | 0.3340 | 1.1878 | 1.0899 |
| No log | 3.7838 | 140 | 1.0928 | 0.4035 | 1.0928 | 1.0453 |
| No log | 3.8378 | 142 | 1.1834 | 0.3378 | 1.1834 | 1.0879 |
| No log | 3.8919 | 144 | 1.4885 | 0.2614 | 1.4885 | 1.2200 |
| No log | 3.9459 | 146 | 1.5685 | 0.1706 | 1.5685 | 1.2524 |
| No log | 4.0 | 148 | 1.3022 | 0.3184 | 1.3022 | 1.1412 |
| No log | 4.0541 | 150 | 1.0827 | 0.3200 | 1.0827 | 1.0405 |
| No log | 4.1081 | 152 | 1.0809 | 0.3132 | 1.0809 | 1.0396 |
| No log | 4.1622 | 154 | 1.0863 | 0.3836 | 1.0863 | 1.0423 |
| No log | 4.2162 | 156 | 1.2981 | 0.3217 | 1.2981 | 1.1393 |
| No log | 4.2703 | 158 | 1.4814 | 0.2493 | 1.4814 | 1.2171 |
| No log | 4.3243 | 160 | 1.3127 | 0.3204 | 1.3127 | 1.1457 |
| No log | 4.3784 | 162 | 1.0511 | 0.4421 | 1.0511 | 1.0252 |
| No log | 4.4324 | 164 | 1.0127 | 0.4458 | 1.0127 | 1.0063 |
| No log | 4.4865 | 166 | 1.0119 | 0.4337 | 1.0119 | 1.0059 |
| No log | 4.5405 | 168 | 1.0040 | 0.3935 | 1.0040 | 1.0020 |
| No log | 4.5946 | 170 | 1.0690 | 0.3871 | 1.0690 | 1.0339 |
| No log | 4.6486 | 172 | 1.2316 | 0.3763 | 1.2316 | 1.1098 |
| No log | 4.7027 | 174 | 1.2871 | 0.3523 | 1.2871 | 1.1345 |
| No log | 4.7568 | 176 | 1.1301 | 0.3617 | 1.1301 | 1.0630 |
| No log | 4.8108 | 178 | 1.0654 | 0.4736 | 1.0654 | 1.0322 |
| No log | 4.8649 | 180 | 1.0839 | 0.4389 | 1.0839 | 1.0411 |
| No log | 4.9189 | 182 | 1.0701 | 0.3811 | 1.0701 | 1.0345 |
| No log | 4.9730 | 184 | 1.1251 | 0.3161 | 1.1251 | 1.0607 |
| No log | 5.0270 | 186 | 1.1319 | 0.3161 | 1.1319 | 1.0639 |
| No log | 5.0811 | 188 | 1.1298 | 0.3161 | 1.1298 | 1.0629 |
| No log | 5.1351 | 190 | 1.2430 | 0.3879 | 1.2430 | 1.1149 |
| No log | 5.1892 | 192 | 1.2123 | 0.4204 | 1.2123 | 1.1010 |
| No log | 5.2432 | 194 | 1.0784 | 0.3942 | 1.0783 | 1.0384 |
| No log | 5.2973 | 196 | 1.0997 | 0.4385 | 1.0997 | 1.0487 |
| No log | 5.3514 | 198 | 1.0970 | 0.4385 | 1.0970 | 1.0474 |
| No log | 5.4054 | 200 | 1.0727 | 0.4726 | 1.0727 | 1.0357 |
| No log | 5.4595 | 202 | 1.0522 | 0.3551 | 1.0522 | 1.0257 |
| No log | 5.5135 | 204 | 1.0912 | 0.4084 | 1.0912 | 1.0446 |
| No log | 5.5676 | 206 | 1.0797 | 0.4084 | 1.0797 | 1.0391 |
| No log | 5.6216 | 208 | 1.0527 | 0.4286 | 1.0527 | 1.0260 |
| No log | 5.6757 | 210 | 1.0501 | 0.3690 | 1.0501 | 1.0248 |
| No log | 5.7297 | 212 | 1.0725 | 0.4681 | 1.0725 | 1.0356 |
| No log | 5.7838 | 214 | 1.1102 | 0.4297 | 1.1102 | 1.0537 |
| No log | 5.8378 | 216 | 1.0454 | 0.4321 | 1.0454 | 1.0224 |
| No log | 5.8919 | 218 | 1.0385 | 0.3934 | 1.0385 | 1.0191 |
| No log | 5.9459 | 220 | 1.0179 | 0.3733 | 1.0179 | 1.0089 |
| No log | 6.0 | 222 | 1.0480 | 0.3338 | 1.0480 | 1.0237 |
| No log | 6.0541 | 224 | 1.1120 | 0.3033 | 1.1120 | 1.0545 |
| No log | 6.1081 | 226 | 1.0169 | 0.3338 | 1.0169 | 1.0084 |
| No log | 6.1622 | 228 | 0.9698 | 0.3833 | 0.9698 | 0.9848 |
| No log | 6.2162 | 230 | 0.9849 | 0.4876 | 0.9849 | 0.9924 |
| No log | 6.2703 | 232 | 0.9473 | 0.4668 | 0.9473 | 0.9733 |
| No log | 6.3243 | 234 | 0.9823 | 0.4267 | 0.9823 | 0.9911 |
| No log | 6.3784 | 236 | 0.9639 | 0.4106 | 0.9639 | 0.9818 |
| No log | 6.4324 | 238 | 0.9598 | 0.4250 | 0.9598 | 0.9797 |
| No log | 6.4865 | 240 | 0.9646 | 0.4587 | 0.9646 | 0.9821 |
| No log | 6.5405 | 242 | 0.9783 | 0.3539 | 0.9783 | 0.9891 |
| No log | 6.5946 | 244 | 1.1143 | 0.3547 | 1.1143 | 1.0556 |
| No log | 6.6486 | 246 | 1.1947 | 0.3514 | 1.1947 | 1.0930 |
| No log | 6.7027 | 248 | 1.1084 | 0.3215 | 1.1084 | 1.0528 |
| No log | 6.7568 | 250 | 0.9727 | 0.3692 | 0.9727 | 0.9862 |
| No log | 6.8108 | 252 | 0.9775 | 0.4270 | 0.9775 | 0.9887 |
| No log | 6.8649 | 254 | 0.9657 | 0.5110 | 0.9657 | 0.9827 |
| No log | 6.9189 | 256 | 1.0362 | 0.3625 | 1.0362 | 1.0179 |
| No log | 6.9730 | 258 | 1.0881 | 0.3613 | 1.0881 | 1.0431 |
| No log | 7.0270 | 260 | 1.0692 | 0.3705 | 1.0692 | 1.0340 |
| No log | 7.0811 | 262 | 1.0151 | 0.3947 | 1.0151 | 1.0075 |
| No log | 7.1351 | 264 | 0.9741 | 0.3881 | 0.9741 | 0.9870 |
| No log | 7.1892 | 266 | 0.9822 | 0.4233 | 0.9822 | 0.9911 |
| No log | 7.2432 | 268 | 0.9736 | 0.3395 | 0.9736 | 0.9867 |
| No log | 7.2973 | 270 | 0.9670 | 0.3151 | 0.9670 | 0.9834 |
| No log | 7.3514 | 272 | 0.9759 | 0.3286 | 0.9759 | 0.9879 |
| No log | 7.4054 | 274 | 0.9971 | 0.3637 | 0.9971 | 0.9986 |
| No log | 7.4595 | 276 | 1.0223 | 0.4004 | 1.0223 | 1.0111 |
| No log | 7.5135 | 278 | 1.0477 | 0.3584 | 1.0477 | 1.0236 |
| No log | 7.5676 | 280 | 1.1066 | 0.3705 | 1.1066 | 1.0520 |
| No log | 7.6216 | 282 | 1.0696 | 0.4048 | 1.0696 | 1.0342 |
| No log | 7.6757 | 284 | 1.0354 | 0.3758 | 1.0354 | 1.0175 |
| No log | 7.7297 | 286 | 1.0559 | 0.3842 | 1.0559 | 1.0276 |
| No log | 7.7838 | 288 | 1.0942 | 0.3607 | 1.0942 | 1.0460 |
| No log | 7.8378 | 290 | 1.1186 | 0.2918 | 1.1186 | 1.0576 |
| No log | 7.8919 | 292 | 1.0870 | 0.2871 | 1.0870 | 1.0426 |
| No log | 7.9459 | 294 | 1.0995 | 0.3779 | 1.0995 | 1.0486 |
| No log | 8.0 | 296 | 1.1082 | 0.3311 | 1.1082 | 1.0527 |
| No log | 8.0541 | 298 | 1.1779 | 0.1651 | 1.1779 | 1.0853 |
| No log | 8.1081 | 300 | 1.3119 | 0.1836 | 1.3119 | 1.1454 |
| No log | 8.1622 | 302 | 1.3054 | 0.1738 | 1.3054 | 1.1426 |
| No log | 8.2162 | 304 | 1.2109 | 0.1751 | 1.2109 | 1.1004 |
| No log | 8.2703 | 306 | 1.1810 | 0.2424 | 1.1810 | 1.0868 |
| No log | 8.3243 | 308 | 1.1703 | 0.3171 | 1.1703 | 1.0818 |
| No log | 8.3784 | 310 | 1.1865 | 0.2902 | 1.1865 | 1.0892 |
| No log | 8.4324 | 312 | 1.2455 | 0.2301 | 1.2455 | 1.1160 |
| No log | 8.4865 | 314 | 1.1705 | 0.3166 | 1.1705 | 1.0819 |
| No log | 8.5405 | 316 | 1.1210 | 0.3149 | 1.1210 | 1.0587 |
| No log | 8.5946 | 318 | 1.1025 | 0.3880 | 1.1025 | 1.0500 |
| No log | 8.6486 | 320 | 1.0921 | 0.4013 | 1.0921 | 1.0450 |
| No log | 8.7027 | 322 | 1.0796 | 0.3987 | 1.0796 | 1.0390 |
| No log | 8.7568 | 324 | 1.0690 | 0.3987 | 1.0690 | 1.0339 |
| No log | 8.8108 | 326 | 1.1083 | 0.3606 | 1.1083 | 1.0527 |
| No log | 8.8649 | 328 | 1.1634 | 0.2898 | 1.1634 | 1.0786 |
| No log | 8.9189 | 330 | 1.1187 | 0.3204 | 1.1187 | 1.0577 |
| No log | 8.9730 | 332 | 1.0804 | 0.4444 | 1.0804 | 1.0394 |
| No log | 9.0270 | 334 | 1.1226 | 0.3231 | 1.1226 | 1.0595 |
| No log | 9.0811 | 336 | 1.1727 | 0.3557 | 1.1727 | 1.0829 |
| No log | 9.1351 | 338 | 1.1229 | 0.3266 | 1.1229 | 1.0596 |
| No log | 9.1892 | 340 | 1.0731 | 0.4216 | 1.0731 | 1.0359 |
| No log | 9.2432 | 342 | 1.1070 | 0.3532 | 1.1070 | 1.0521 |
| No log | 9.2973 | 344 | 1.0851 | 0.3102 | 1.0851 | 1.0417 |
| No log | 9.3514 | 346 | 1.0462 | 0.4411 | 1.0462 | 1.0228 |
| No log | 9.4054 | 348 | 1.1033 | 0.4050 | 1.1033 | 1.0504 |
| No log | 9.4595 | 350 | 1.1286 | 0.3338 | 1.1286 | 1.0623 |
| No log | 9.5135 | 352 | 1.1786 | 0.3255 | 1.1786 | 1.0856 |
| No log | 9.5676 | 354 | 1.1486 | 0.2851 | 1.1486 | 1.0717 |
| No log | 9.6216 | 356 | 1.0945 | 0.3463 | 1.0945 | 1.0462 |
| No log | 9.6757 | 358 | 1.0553 | 0.3711 | 1.0553 | 1.0273 |
| No log | 9.7297 | 360 | 1.0556 | 0.4160 | 1.0556 | 1.0274 |
| No log | 9.7838 | 362 | 1.0922 | 0.4014 | 1.0922 | 1.0451 |
| No log | 9.8378 | 364 | 1.1083 | 0.3463 | 1.1083 | 1.0528 |
| No log | 9.8919 | 366 | 1.0595 | 0.3463 | 1.0595 | 1.0293 |
| No log | 9.9459 | 368 | 1.0210 | 0.3514 | 1.0210 | 1.0104 |
| No log | 10.0 | 370 | 0.9990 | 0.4254 | 0.9990 | 0.9995 |
| No log | 10.0541 | 372 | 0.9891 | 0.4381 | 0.9891 | 0.9945 |
| No log | 10.1081 | 374 | 0.9931 | 0.3977 | 0.9931 | 0.9965 |
| No log | 10.1622 | 376 | 1.0166 | 0.4444 | 1.0166 | 1.0083 |
| No log | 10.2162 | 378 | 1.0106 | 0.4444 | 1.0106 | 1.0053 |
| No log | 10.2703 | 380 | 0.9787 | 0.4547 | 0.9787 | 0.9893 |
| No log | 10.3243 | 382 | 0.9651 | 0.4547 | 0.9651 | 0.9824 |
| No log | 10.3784 | 384 | 0.9501 | 0.4996 | 0.9501 | 0.9747 |
| No log | 10.4324 | 386 | 0.9582 | 0.4980 | 0.9582 | 0.9789 |
| No log | 10.4865 | 388 | 0.9567 | 0.4980 | 0.9567 | 0.9781 |
| No log | 10.5405 | 390 | 0.9600 | 0.4768 | 0.9600 | 0.9798 |
| No log | 10.5946 | 392 | 0.9620 | 0.3788 | 0.9620 | 0.9808 |
| No log | 10.6486 | 394 | 1.0311 | 0.3939 | 1.0311 | 1.0155 |
| No log | 10.7027 | 396 | 1.1397 | 0.3152 | 1.1397 | 1.0675 |
| No log | 10.7568 | 398 | 1.1576 | 0.3067 | 1.1576 | 1.0759 |
| No log | 10.8108 | 400 | 1.1000 | 0.2807 | 1.1000 | 1.0488 |
| No log | 10.8649 | 402 | 1.0110 | 0.4719 | 1.0110 | 1.0055 |
| No log | 10.9189 | 404 | 0.9767 | 0.4254 | 0.9767 | 0.9883 |
| No log | 10.9730 | 406 | 0.9812 | 0.4254 | 0.9812 | 0.9906 |
| No log | 11.0270 | 408 | 0.9861 | 0.3925 | 0.9861 | 0.9930 |
| No log | 11.0811 | 410 | 1.0320 | 0.4118 | 1.0320 | 1.0159 |
| No log | 11.1351 | 412 | 1.0212 | 0.4308 | 1.0212 | 1.0105 |
| No log | 11.1892 | 414 | 0.9818 | 0.4103 | 0.9818 | 0.9909 |
| No log | 11.2432 | 416 | 0.9714 | 0.4071 | 0.9714 | 0.9856 |
| No log | 11.2973 | 418 | 0.9764 | 0.4106 | 0.9764 | 0.9881 |
| No log | 11.3514 | 420 | 0.9920 | 0.4388 | 0.9920 | 0.9960 |
| No log | 11.4054 | 422 | 1.0464 | 0.3305 | 1.0464 | 1.0229 |
| No log | 11.4595 | 424 | 1.0623 | 0.3347 | 1.0623 | 1.0307 |
| No log | 11.5135 | 426 | 1.0414 | 0.3625 | 1.0414 | 1.0205 |
| No log | 11.5676 | 428 | 1.0278 | 0.3490 | 1.0278 | 1.0138 |
| No log | 11.6216 | 430 | 1.0185 | 0.3796 | 1.0185 | 1.0092 |
| No log | 11.6757 | 432 | 1.0163 | 0.3842 | 1.0163 | 1.0081 |
| No log | 11.7297 | 434 | 1.0224 | 0.3720 | 1.0224 | 1.0111 |
| No log | 11.7838 | 436 | 1.0320 | 0.3804 | 1.0320 | 1.0159 |
| No log | 11.8378 | 438 | 1.0435 | 0.3539 | 1.0435 | 1.0215 |
| No log | 11.8919 | 440 | 1.0603 | 0.3312 | 1.0603 | 1.0297 |
| No log | 11.9459 | 442 | 1.0612 | 0.2943 | 1.0612 | 1.0301 |
| No log | 12.0 | 444 | 1.0559 | 0.3514 | 1.0559 | 1.0276 |
| No log | 12.0541 | 446 | 1.0906 | 0.2667 | 1.0906 | 1.0443 |
| No log | 12.1081 | 448 | 1.1386 | 0.3099 | 1.1386 | 1.0670 |
| No log | 12.1622 | 450 | 1.2056 | 0.2657 | 1.2056 | 1.0980 |
| No log | 12.2162 | 452 | 1.1651 | 0.3009 | 1.1651 | 1.0794 |
| No log | 12.2703 | 454 | 1.1035 | 0.2725 | 1.1035 | 1.0505 |
| No log | 12.3243 | 456 | 1.0530 | 0.3171 | 1.0530 | 1.0262 |
| No log | 12.3784 | 458 | 1.0436 | 0.3684 | 1.0436 | 1.0216 |
| No log | 12.4324 | 460 | 1.0523 | 0.3256 | 1.0523 | 1.0258 |
| No log | 12.4865 | 462 | 1.0331 | 0.3535 | 1.0331 | 1.0164 |
| No log | 12.5405 | 464 | 1.0159 | 0.4122 | 1.0159 | 1.0079 |
| No log | 12.5946 | 466 | 1.0367 | 0.2896 | 1.0367 | 1.0182 |
| No log | 12.6486 | 468 | 1.0440 | 0.3386 | 1.0440 | 1.0218 |
| No log | 12.7027 | 470 | 1.0328 | 0.4610 | 1.0328 | 1.0162 |
| No log | 12.7568 | 472 | 1.0544 | 0.4523 | 1.0544 | 1.0269 |
| No log | 12.8108 | 474 | 1.0774 | 0.3724 | 1.0774 | 1.0380 |
| No log | 12.8649 | 476 | 1.0597 | 0.4229 | 1.0597 | 1.0294 |
| No log | 12.9189 | 478 | 1.0271 | 0.4523 | 1.0271 | 1.0135 |
| No log | 12.9730 | 480 | 1.0058 | 0.3567 | 1.0058 | 1.0029 |
| No log | 13.0270 | 482 | 0.9852 | 0.4634 | 0.9852 | 0.9926 |
| No log | 13.0811 | 484 | 0.9735 | 0.4841 | 0.9735 | 0.9866 |
| No log | 13.1351 | 486 | 0.9668 | 0.4841 | 0.9668 | 0.9833 |
| No log | 13.1892 | 488 | 0.9603 | 0.4807 | 0.9603 | 0.9800 |
| No log | 13.2432 | 490 | 0.9834 | 0.5026 | 0.9834 | 0.9917 |
| No log | 13.2973 | 492 | 0.9714 | 0.5026 | 0.9714 | 0.9856 |
| No log | 13.3514 | 494 | 0.9416 | 0.5057 | 0.9416 | 0.9703 |
| No log | 13.4054 | 496 | 0.9300 | 0.5057 | 0.9300 | 0.9644 |
| No log | 13.4595 | 498 | 0.9431 | 0.4937 | 0.9431 | 0.9711 |
| 0.3397 | 13.5135 | 500 | 0.9273 | 0.4736 | 0.9273 | 0.9630 |
| 0.3397 | 13.5676 | 502 | 0.9140 | 0.4326 | 0.9140 | 0.9560 |
| 0.3397 | 13.6216 | 504 | 0.9400 | 0.5055 | 0.9400 | 0.9695 |
| 0.3397 | 13.6757 | 506 | 0.9273 | 0.4582 | 0.9273 | 0.9630 |
| 0.3397 | 13.7297 | 508 | 0.9122 | 0.4728 | 0.9122 | 0.9551 |
| 0.3397 | 13.7838 | 510 | 0.9249 | 0.4560 | 0.9249 | 0.9617 |
| 0.3397 | 13.8378 | 512 | 0.9235 | 0.4568 | 0.9235 | 0.9610 |
| 0.3397 | 13.8919 | 514 | 0.9284 | 0.4514 | 0.9284 | 0.9635 |

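For reference, the Qwk, Mse, and Rmse columns can be reproduced from raw predictions with scikit-learn. A minimal sketch, assuming continuous outputs are rounded to integer score bands before computing quadratic weighted kappa (the exact rounding scheme used in this evaluation is not documented, and the arrays below are illustrative):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative gold labels and raw regression outputs.
y_true = np.array([0, 1, 2, 2, 1])
y_pred_raw = np.array([0.2, 1.4, 1.6, 2.1, 0.9])

# QWK is defined over discrete labels, so round/clip the continuous scores.
y_pred = np.clip(np.rint(y_pred_raw), 0, 2).astype(int)

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred_raw)
rmse = np.sqrt(mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```
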
### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
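Because config.json declares `problem_type: "regression"` with a single label, inference yields one continuous score per input text. A minimal inference sketch; the repo id is an assumption inferred from the committer and model name above and may differ:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repo id (committer + model name from this page).
repo = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k7_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

# Any Arabic essay text; max_position_embeddings is 512 per config.json.
inputs = tokenizer("نص المقال هنا", return_tensors="pt",
                   truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # single regression score
print(score)
```
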
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0a6f33673e28d0fa202fc7d9039969d24a693f404d3e815b3c517a4fb480b8bd
size 540799996
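The file above is a Git LFS pointer (spec version, sha256 oid, byte size), not the weights themselves; 540799996 bytes is about 541 MB, consistent with a ~135M-parameter BERT stored in float32. One way to fetch the actual file is via `huggingface_hub`, again assuming the repo id inferred above:

```python
from huggingface_hub import hf_hub_download

# Downloads and caches the real weights behind the LFS pointer.
path = hf_hub_download(
    repo_id="MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k7_task2_organization",
    filename="model.safetensors",
)
print(path)
```
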
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6c7ec55e2f1ac1bc9af1a283c7815b2529c106e9394c134bfe2d0969ed23c8c7
size 5304