MayBashendy committed
Commit 6f46891 · verified · 1 Parent(s): 3f73c53

Training in progress, step 500

Files changed (4)
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k9_task2_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k9_task2_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2537
- Qwk: 0.2286
- Mse: 1.2537
- Rmse: 1.1197
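Qwk here is Cohen's kappa with quadratic weights, and Mse/Rmse are the (root) mean squared error between predicted and reference organization scores; Loss equals Mse because the model is trained as a regressor (see `problem_type: "regression"` in config.json below). The following is a minimal sketch of how such metrics can be reproduced with scikit-learn; rounding the continuous predictions to integer bands for kappa is an assumption, since the exact evaluation code is not part of this card:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    """Compute Qwk/Mse/Rmse in the style reported on this card."""
    mse = mean_squared_error(labels, preds)
    # Kappa needs discrete categories, so round continuous scores first
    # (assumed; the binning used during training is not recorded here).
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Hypothetical scores on an essay-organization scale:
print(eval_metrics(np.array([1.2, 2.8, 3.1]), np.array([1.0, 3.0, 2.0])))
```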
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
+ ### Training results
49
+
50
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
51
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
52
+ | No log | 0.0769 | 2 | 4.5887 | 0.0010 | 4.5887 | 2.1421 |
53
+ | No log | 0.1538 | 4 | 2.5948 | -0.0076 | 2.5948 | 1.6108 |
54
+ | No log | 0.2308 | 6 | 1.9632 | -0.0575 | 1.9632 | 1.4011 |
55
+ | No log | 0.3077 | 8 | 1.4715 | -0.0056 | 1.4715 | 1.2130 |
56
+ | No log | 0.3846 | 10 | 1.2843 | 0.0693 | 1.2843 | 1.1333 |
57
+ | No log | 0.4615 | 12 | 1.2648 | 0.0613 | 1.2648 | 1.1246 |
58
+ | No log | 0.5385 | 14 | 1.2493 | 0.0750 | 1.2493 | 1.1177 |
59
+ | No log | 0.6154 | 16 | 1.1945 | 0.1918 | 1.1945 | 1.0929 |
60
+ | No log | 0.6923 | 18 | 1.2436 | 0.1893 | 1.2436 | 1.1152 |
61
+ | No log | 0.7692 | 20 | 1.2397 | 0.1076 | 1.2397 | 1.1134 |
62
+ | No log | 0.8462 | 22 | 1.2230 | 0.1773 | 1.2230 | 1.1059 |
63
+ | No log | 0.9231 | 24 | 1.2207 | 0.2624 | 1.2207 | 1.1049 |
64
+ | No log | 1.0 | 26 | 1.2411 | 0.1675 | 1.2411 | 1.1141 |
65
+ | No log | 1.0769 | 28 | 1.2804 | 0.1379 | 1.2804 | 1.1316 |
66
+ | No log | 1.1538 | 30 | 1.2993 | 0.1106 | 1.2993 | 1.1399 |
67
+ | No log | 1.2308 | 32 | 1.3888 | 0.0124 | 1.3888 | 1.1785 |
68
+ | No log | 1.3077 | 34 | 1.3637 | 0.0275 | 1.3637 | 1.1678 |
69
+ | No log | 1.3846 | 36 | 1.2650 | 0.1346 | 1.2650 | 1.1247 |
70
+ | No log | 1.4615 | 38 | 1.1651 | 0.2439 | 1.1651 | 1.0794 |
71
+ | No log | 1.5385 | 40 | 1.1585 | 0.2735 | 1.1585 | 1.0763 |
72
+ | No log | 1.6154 | 42 | 1.1483 | 0.2733 | 1.1483 | 1.0716 |
73
+ | No log | 1.6923 | 44 | 1.1623 | 0.2678 | 1.1623 | 1.0781 |
74
+ | No log | 1.7692 | 46 | 1.1645 | 0.2689 | 1.1645 | 1.0791 |
75
+ | No log | 1.8462 | 48 | 1.1957 | 0.1344 | 1.1957 | 1.0935 |
76
+ | No log | 1.9231 | 50 | 1.2120 | 0.1654 | 1.2120 | 1.1009 |
77
+ | No log | 2.0 | 52 | 1.2172 | 0.1881 | 1.2172 | 1.1033 |
78
+ | No log | 2.0769 | 54 | 1.2121 | 0.1750 | 1.2121 | 1.1009 |
79
+ | No log | 2.1538 | 56 | 1.2094 | 0.0692 | 1.2094 | 1.0997 |
80
+ | No log | 2.2308 | 58 | 1.2980 | 0.1846 | 1.2980 | 1.1393 |
81
+ | No log | 2.3077 | 60 | 1.2635 | 0.1148 | 1.2635 | 1.1241 |
82
+ | No log | 2.3846 | 62 | 1.1922 | 0.1546 | 1.1922 | 1.0919 |
83
+ | No log | 2.4615 | 64 | 1.2251 | 0.1148 | 1.2251 | 1.1069 |
84
+ | No log | 2.5385 | 66 | 1.3312 | 0.1599 | 1.3312 | 1.1538 |
85
+ | No log | 2.6154 | 68 | 1.3696 | 0.1903 | 1.3696 | 1.1703 |
86
+ | No log | 2.6923 | 70 | 1.2670 | 0.2141 | 1.2670 | 1.1256 |
87
+ | No log | 2.7692 | 72 | 1.0982 | 0.2721 | 1.0982 | 1.0480 |
88
+ | No log | 2.8462 | 74 | 1.0791 | 0.2850 | 1.0791 | 1.0388 |
89
+ | No log | 2.9231 | 76 | 1.1381 | 0.3778 | 1.1381 | 1.0668 |
90
+ | No log | 3.0 | 78 | 1.1115 | 0.3211 | 1.1115 | 1.0543 |
91
+ | No log | 3.0769 | 80 | 1.2055 | 0.3197 | 1.2055 | 1.0980 |
92
+ | No log | 3.1538 | 82 | 1.3945 | 0.2903 | 1.3945 | 1.1809 |
93
+ | No log | 3.2308 | 84 | 1.4351 | 0.3125 | 1.4351 | 1.1980 |
94
+ | No log | 3.3077 | 86 | 1.2651 | 0.3047 | 1.2651 | 1.1248 |
95
+ | No log | 3.3846 | 88 | 1.2930 | 0.2773 | 1.2930 | 1.1371 |
96
+ | No log | 3.4615 | 90 | 1.3353 | 0.2846 | 1.3353 | 1.1556 |
97
+ | No log | 3.5385 | 92 | 1.7222 | 0.2156 | 1.7222 | 1.3123 |
98
+ | No log | 3.6154 | 94 | 2.0619 | 0.1187 | 2.0619 | 1.4359 |
99
+ | No log | 3.6923 | 96 | 2.1225 | 0.1299 | 2.1225 | 1.4569 |
100
+ | No log | 3.7692 | 98 | 1.9804 | 0.1077 | 1.9804 | 1.4073 |
101
+ | No log | 3.8462 | 100 | 1.6519 | 0.1884 | 1.6519 | 1.2853 |
102
+ | No log | 3.9231 | 102 | 1.3766 | 0.2518 | 1.3766 | 1.1733 |
103
+ | No log | 4.0 | 104 | 1.3744 | 0.2568 | 1.3744 | 1.1723 |
104
+ | No log | 4.0769 | 106 | 1.5088 | 0.2240 | 1.5088 | 1.2283 |
105
+ | No log | 4.1538 | 108 | 1.6043 | 0.1886 | 1.6043 | 1.2666 |
106
+ | No log | 4.2308 | 110 | 1.6458 | 0.2303 | 1.6458 | 1.2829 |
107
+ | No log | 4.3077 | 112 | 1.6020 | 0.2147 | 1.6020 | 1.2657 |
108
+ | No log | 4.3846 | 114 | 1.5498 | 0.2501 | 1.5498 | 1.2449 |
109
+ | No log | 4.4615 | 116 | 1.7384 | 0.1779 | 1.7384 | 1.3185 |
110
+ | No log | 4.5385 | 118 | 1.8970 | 0.1500 | 1.8970 | 1.3773 |
111
+ | No log | 4.6154 | 120 | 1.9298 | 0.1639 | 1.9298 | 1.3892 |
112
+ | No log | 4.6923 | 122 | 1.7030 | 0.2171 | 1.7030 | 1.3050 |
113
+ | No log | 4.7692 | 124 | 1.5666 | 0.1889 | 1.5666 | 1.2517 |
114
+ | No log | 4.8462 | 126 | 1.7671 | 0.1779 | 1.7671 | 1.3293 |
115
+ | No log | 4.9231 | 128 | 1.9572 | 0.1523 | 1.9572 | 1.3990 |
116
+ | No log | 5.0 | 130 | 1.8478 | 0.1914 | 1.8478 | 1.3593 |
117
+ | No log | 5.0769 | 132 | 1.7482 | 0.2156 | 1.7482 | 1.3222 |
118
+ | No log | 5.1538 | 134 | 1.4845 | 0.2197 | 1.4845 | 1.2184 |
119
+ | No log | 5.2308 | 136 | 1.2589 | 0.1899 | 1.2589 | 1.1220 |
120
+ | No log | 5.3077 | 138 | 1.3022 | 0.1557 | 1.3022 | 1.1411 |
121
+ | No log | 5.3846 | 140 | 1.5180 | 0.2163 | 1.5180 | 1.2321 |
122
+ | No log | 5.4615 | 142 | 1.7317 | 0.1466 | 1.7317 | 1.3160 |
123
+ | No log | 5.5385 | 144 | 1.7871 | 0.1473 | 1.7871 | 1.3368 |
124
+ | No log | 5.6154 | 146 | 1.6708 | 0.2710 | 1.6708 | 1.2926 |
125
+ | No log | 5.6923 | 148 | 1.5979 | 0.2201 | 1.5979 | 1.2641 |
126
+ | No log | 5.7692 | 150 | 1.5234 | 0.2092 | 1.5234 | 1.2343 |
127
+ | No log | 5.8462 | 152 | 1.4986 | 0.1935 | 1.4986 | 1.2242 |
128
+ | No log | 5.9231 | 154 | 1.4351 | 0.1892 | 1.4351 | 1.1980 |
129
+ | No log | 6.0 | 156 | 1.3401 | 0.1899 | 1.3401 | 1.1576 |
130
+ | No log | 6.0769 | 158 | 1.3936 | 0.1674 | 1.3936 | 1.1805 |
131
+ | No log | 6.1538 | 160 | 1.6305 | 0.1924 | 1.6305 | 1.2769 |
132
+ | No log | 6.2308 | 162 | 1.7866 | 0.2185 | 1.7866 | 1.3366 |
133
+ | No log | 6.3077 | 164 | 1.7640 | 0.2479 | 1.7640 | 1.3282 |
134
+ | No log | 6.3846 | 166 | 1.6759 | 0.2657 | 1.6759 | 1.2946 |
135
+ | No log | 6.4615 | 168 | 1.6239 | 0.2634 | 1.6239 | 1.2743 |
136
+ | No log | 6.5385 | 170 | 1.7223 | 0.2185 | 1.7223 | 1.3124 |
137
+ | No log | 6.6154 | 172 | 1.7136 | 0.1952 | 1.7136 | 1.3090 |
138
+ | No log | 6.6923 | 174 | 1.6230 | 0.2343 | 1.6230 | 1.2740 |
139
+ | No log | 6.7692 | 176 | 1.3906 | 0.1621 | 1.3906 | 1.1793 |
140
+ | No log | 6.8462 | 178 | 1.3121 | 0.1535 | 1.3121 | 1.1455 |
141
+ | No log | 6.9231 | 180 | 1.4585 | 0.2020 | 1.4585 | 1.2077 |
142
+ | No log | 7.0 | 182 | 1.4848 | 0.2020 | 1.4848 | 1.2185 |
143
+ | No log | 7.0769 | 184 | 1.3617 | 0.2115 | 1.3617 | 1.1669 |
144
+ | No log | 7.1538 | 186 | 1.3402 | 0.1621 | 1.3402 | 1.1577 |
145
+ | No log | 7.2308 | 188 | 1.2888 | 0.1252 | 1.2888 | 1.1353 |
146
+ | No log | 7.3077 | 190 | 1.2461 | 0.1674 | 1.2461 | 1.1163 |
147
+ | No log | 7.3846 | 192 | 1.4051 | 0.2074 | 1.4051 | 1.1854 |
148
+ | No log | 7.4615 | 194 | 1.5733 | 0.2550 | 1.5733 | 1.2543 |
149
+ | No log | 7.5385 | 196 | 1.5750 | 0.2806 | 1.5750 | 1.2550 |
150
+ | No log | 7.6154 | 198 | 1.5713 | 0.3026 | 1.5713 | 1.2535 |
151
+ | No log | 7.6923 | 200 | 1.4440 | 0.2822 | 1.4440 | 1.2017 |
152
+ | No log | 7.7692 | 202 | 1.5116 | 0.2873 | 1.5116 | 1.2295 |
153
+ | No log | 7.8462 | 204 | 1.5118 | 0.2759 | 1.5118 | 1.2296 |
154
+ | No log | 7.9231 | 206 | 1.3864 | 0.3537 | 1.3864 | 1.1775 |
155
+ | No log | 8.0 | 208 | 1.3713 | 0.3537 | 1.3713 | 1.1710 |
156
+ | No log | 8.0769 | 210 | 1.4720 | 0.2633 | 1.4720 | 1.2133 |
157
+ | No log | 8.1538 | 212 | 1.5045 | 0.2633 | 1.5045 | 1.2266 |
158
+ | No log | 8.2308 | 214 | 1.5253 | 0.2658 | 1.5253 | 1.2350 |
159
+ | No log | 8.3077 | 216 | 1.6157 | 0.2407 | 1.6157 | 1.2711 |
160
+ | No log | 8.3846 | 218 | 1.6588 | 0.2435 | 1.6588 | 1.2879 |
161
+ | No log | 8.4615 | 220 | 1.8262 | 0.1914 | 1.8262 | 1.3514 |
162
+ | No log | 8.5385 | 222 | 1.8406 | 0.1946 | 1.8406 | 1.3567 |
163
+ | No log | 8.6154 | 224 | 1.7575 | 0.1846 | 1.7575 | 1.3257 |
164
+ | No log | 8.6923 | 226 | 1.5631 | 0.2929 | 1.5631 | 1.2502 |
165
+ | No log | 8.7692 | 228 | 1.5317 | 0.2285 | 1.5317 | 1.2376 |
166
+ | No log | 8.8462 | 230 | 1.6257 | 0.2950 | 1.6257 | 1.2750 |
167
+ | No log | 8.9231 | 232 | 1.6429 | 0.2735 | 1.6429 | 1.2817 |
168
+ | No log | 9.0 | 234 | 1.6421 | 0.2929 | 1.6421 | 1.2814 |
169
+ | No log | 9.0769 | 236 | 1.6494 | 0.2435 | 1.6494 | 1.2843 |
170
+ | No log | 9.1538 | 238 | 1.7873 | 0.1745 | 1.7873 | 1.3369 |
171
+ | No log | 9.2308 | 240 | 2.0451 | 0.1358 | 2.0451 | 1.4301 |
172
+ | No log | 9.3077 | 242 | 2.0459 | 0.1697 | 2.0459 | 1.4303 |
173
+ | No log | 9.3846 | 244 | 1.8164 | 0.1780 | 1.8164 | 1.3477 |
174
+ | No log | 9.4615 | 246 | 1.7122 | 0.1672 | 1.7122 | 1.3085 |
175
+ | No log | 9.5385 | 248 | 1.5984 | 0.1997 | 1.5984 | 1.2643 |
176
+ | No log | 9.6154 | 250 | 1.4260 | 0.2024 | 1.4260 | 1.1942 |
177
+ | No log | 9.6923 | 252 | 1.3046 | 0.2291 | 1.3046 | 1.1422 |
178
+ | No log | 9.7692 | 254 | 1.2991 | 0.1896 | 1.2991 | 1.1398 |
179
+ | No log | 9.8462 | 256 | 1.4627 | 0.2512 | 1.4627 | 1.2094 |
180
+ | No log | 9.9231 | 258 | 1.6132 | 0.1616 | 1.6132 | 1.2701 |
181
+ | No log | 10.0 | 260 | 1.6565 | 0.1882 | 1.6565 | 1.2870 |
182
+ | No log | 10.0769 | 262 | 1.6531 | 0.2188 | 1.6531 | 1.2857 |
183
+ | No log | 10.1538 | 264 | 1.6616 | 0.2285 | 1.6616 | 1.2890 |
184
+ | No log | 10.2308 | 266 | 1.6688 | 0.2542 | 1.6688 | 1.2918 |
185
+ | No log | 10.3077 | 268 | 1.5189 | 0.2289 | 1.5189 | 1.2324 |
186
+ | No log | 10.3846 | 270 | 1.3134 | 0.1313 | 1.3134 | 1.1460 |
187
+ | No log | 10.4615 | 272 | 1.2212 | 0.1599 | 1.2212 | 1.1051 |
188
+ | No log | 10.5385 | 274 | 1.2360 | 0.1026 | 1.2360 | 1.1118 |
189
+ | No log | 10.6154 | 276 | 1.3063 | 0.1608 | 1.3063 | 1.1430 |
190
+ | No log | 10.6923 | 278 | 1.4085 | 0.1796 | 1.4085 | 1.1868 |
191
+ | No log | 10.7692 | 280 | 1.4729 | 0.2815 | 1.4729 | 1.2136 |
192
+ | No log | 10.8462 | 282 | 1.4701 | 0.2941 | 1.4701 | 1.2125 |
193
+ | No log | 10.9231 | 284 | 1.4104 | 0.2201 | 1.4104 | 1.1876 |
194
+ | No log | 11.0 | 286 | 1.3824 | 0.1935 | 1.3824 | 1.1757 |
195
+ | No log | 11.0769 | 288 | 1.3977 | 0.1708 | 1.3977 | 1.1822 |
196
+ | No log | 11.1538 | 290 | 1.4382 | 0.1568 | 1.4382 | 1.1993 |
197
+ | No log | 11.2308 | 292 | 1.3687 | 0.1889 | 1.3687 | 1.1699 |
198
+ | No log | 11.3077 | 294 | 1.3096 | 0.1498 | 1.3096 | 1.1444 |
199
+ | No log | 11.3846 | 296 | 1.2678 | 0.1345 | 1.2678 | 1.1260 |
200
+ | No log | 11.4615 | 298 | 1.2142 | 0.1641 | 1.2142 | 1.1019 |
201
+ | No log | 11.5385 | 300 | 1.2936 | 0.0433 | 1.2936 | 1.1373 |
202
+ | No log | 11.6154 | 302 | 1.4714 | 0.2239 | 1.4714 | 1.2130 |
203
+ | No log | 11.6923 | 304 | 1.6113 | 0.2188 | 1.6113 | 1.2694 |
204
+ | No log | 11.7692 | 306 | 1.6269 | 0.2334 | 1.6269 | 1.2755 |
205
+ | No log | 11.8462 | 308 | 1.5372 | 0.2272 | 1.5372 | 1.2398 |
206
+ | No log | 11.9231 | 310 | 1.4658 | 0.2676 | 1.4658 | 1.2107 |
207
+ | No log | 12.0 | 312 | 1.4831 | 0.2676 | 1.4831 | 1.2178 |
208
+ | No log | 12.0769 | 314 | 1.4233 | 0.1608 | 1.4233 | 1.1930 |
209
+ | No log | 12.1538 | 316 | 1.3396 | 0.1313 | 1.3396 | 1.1574 |
210
+ | No log | 12.2308 | 318 | 1.3519 | 0.1608 | 1.3519 | 1.1627 |
211
+ | No log | 12.3077 | 320 | 1.5316 | 0.2757 | 1.5316 | 1.2376 |
212
+ | No log | 12.3846 | 322 | 1.6268 | 0.2829 | 1.6268 | 1.2755 |
213
+ | No log | 12.4615 | 324 | 1.5757 | 0.2465 | 1.5757 | 1.2553 |
214
+ | No log | 12.5385 | 326 | 1.4129 | 0.2757 | 1.4129 | 1.1886 |
215
+ | No log | 12.6154 | 328 | 1.3608 | 0.2371 | 1.3608 | 1.1665 |
216
+ | No log | 12.6923 | 330 | 1.4233 | 0.2309 | 1.4233 | 1.1930 |
217
+ | No log | 12.7692 | 332 | 1.4721 | 0.2309 | 1.4721 | 1.2133 |
218
+ | No log | 12.8462 | 334 | 1.4753 | 0.2181 | 1.4753 | 1.2146 |
219
+ | No log | 12.9231 | 336 | 1.5044 | 0.2632 | 1.5044 | 1.2266 |
220
+ | No log | 13.0 | 338 | 1.5859 | 0.2542 | 1.5859 | 1.2593 |
221
+ | No log | 13.0769 | 340 | 1.5363 | 0.2317 | 1.5363 | 1.2395 |
222
+ | No log | 13.1538 | 342 | 1.3407 | 0.2605 | 1.3407 | 1.1579 |
223
+ | No log | 13.2308 | 344 | 1.2791 | 0.2772 | 1.2791 | 1.1310 |
224
+ | No log | 13.3077 | 346 | 1.3450 | 0.1848 | 1.3450 | 1.1598 |
225
+ | No log | 13.3846 | 348 | 1.4302 | 0.2574 | 1.4302 | 1.1959 |
226
+ | No log | 13.4615 | 350 | 1.5145 | 0.2632 | 1.5145 | 1.2306 |
227
+ | No log | 13.5385 | 352 | 1.5356 | 0.2285 | 1.5356 | 1.2392 |
228
+ | No log | 13.6154 | 354 | 1.4822 | 0.2217 | 1.4822 | 1.2175 |
229
+ | No log | 13.6923 | 356 | 1.4419 | 0.2406 | 1.4419 | 1.2008 |
230
+ | No log | 13.7692 | 358 | 1.4500 | 0.2272 | 1.4500 | 1.2042 |
231
+ | No log | 13.8462 | 360 | 1.5571 | 0.2407 | 1.5571 | 1.2478 |
232
+ | No log | 13.9231 | 362 | 1.5422 | 0.2407 | 1.5422 | 1.2418 |
233
+ | No log | 14.0 | 364 | 1.4055 | 0.2589 | 1.4055 | 1.1855 |
234
+ | No log | 14.0769 | 366 | 1.2839 | 0.2743 | 1.2839 | 1.1331 |
235
+ | No log | 14.1538 | 368 | 1.2392 | 0.2304 | 1.2392 | 1.1132 |
236
+ | No log | 14.2308 | 370 | 1.2881 | 0.1650 | 1.2881 | 1.1349 |
237
+ | No log | 14.3077 | 372 | 1.4441 | 0.2213 | 1.4441 | 1.2017 |
238
+ | No log | 14.3846 | 374 | 1.5028 | 0.2213 | 1.5028 | 1.2259 |
239
+ | No log | 14.4615 | 376 | 1.4605 | 0.1345 | 1.4605 | 1.2085 |
240
+ | No log | 14.5385 | 378 | 1.3445 | 0.1053 | 1.3445 | 1.1595 |
241
+ | No log | 14.6154 | 380 | 1.3102 | 0.1217 | 1.3102 | 1.1446 |
242
+ | No log | 14.6923 | 382 | 1.3631 | 0.2224 | 1.3631 | 1.1675 |
243
+ | No log | 14.7692 | 384 | 1.4406 | 0.2227 | 1.4406 | 1.2003 |
244
+ | No log | 14.8462 | 386 | 1.4892 | 0.2418 | 1.4892 | 1.2203 |
245
+ | No log | 14.9231 | 388 | 1.5078 | 0.2418 | 1.5078 | 1.2279 |
246
+ | No log | 15.0 | 390 | 1.6036 | 0.2378 | 1.6036 | 1.2663 |
247
+ | No log | 15.0769 | 392 | 1.6921 | 0.2620 | 1.6921 | 1.3008 |
248
+ | No log | 15.1538 | 394 | 1.6519 | 0.2595 | 1.6519 | 1.2852 |
249
+ | No log | 15.2308 | 396 | 1.6245 | 0.2595 | 1.6245 | 1.2746 |
250
+ | No log | 15.3077 | 398 | 1.6157 | 0.2595 | 1.6157 | 1.2711 |
251
+ | No log | 15.3846 | 400 | 1.6003 | 0.2708 | 1.6003 | 1.2650 |
252
+ | No log | 15.4615 | 402 | 1.5829 | 0.2620 | 1.5829 | 1.2582 |
253
+ | No log | 15.5385 | 404 | 1.6210 | 0.2644 | 1.6210 | 1.2732 |
254
+ | No log | 15.6154 | 406 | 1.6630 | 0.2620 | 1.6630 | 1.2896 |
255
+ | No log | 15.6923 | 408 | 1.6005 | 0.2659 | 1.6005 | 1.2651 |
256
+ | No log | 15.7692 | 410 | 1.5126 | 0.1698 | 1.5126 | 1.2299 |
257
+ | No log | 15.8462 | 412 | 1.4641 | 0.1698 | 1.4641 | 1.2100 |
258
+ | No log | 15.9231 | 414 | 1.4769 | 0.1283 | 1.4769 | 1.2153 |
259
+ | No log | 16.0 | 416 | 1.4597 | 0.2325 | 1.4597 | 1.2082 |
260
+ | No log | 16.0769 | 418 | 1.4697 | 0.2217 | 1.4697 | 1.2123 |
261
+ | No log | 16.1538 | 420 | 1.5294 | 0.2569 | 1.5294 | 1.2367 |
262
+ | No log | 16.2308 | 422 | 1.5453 | 0.2595 | 1.5453 | 1.2431 |
263
+ | No log | 16.3077 | 424 | 1.6401 | 0.2540 | 1.6401 | 1.2807 |
264
+ | No log | 16.3846 | 426 | 1.7721 | 0.2713 | 1.7721 | 1.3312 |
265
+ | No log | 16.4615 | 428 | 1.7402 | 0.2735 | 1.7402 | 1.3192 |
266
+ | No log | 16.5385 | 430 | 1.5810 | 0.2317 | 1.5810 | 1.2574 |
267
+ | No log | 16.6154 | 432 | 1.4097 | 0.2375 | 1.4097 | 1.1873 |
268
+ | No log | 16.6923 | 434 | 1.4217 | 0.2751 | 1.4217 | 1.1923 |
269
+ | No log | 16.7692 | 436 | 1.4788 | 0.2633 | 1.4788 | 1.2161 |
270
+ | No log | 16.8462 | 438 | 1.5685 | 0.2378 | 1.5685 | 1.2524 |
271
+ | No log | 16.9231 | 440 | 1.5344 | 0.2348 | 1.5344 | 1.2387 |
272
+ | No log | 17.0 | 442 | 1.4585 | 0.2514 | 1.4585 | 1.2077 |
273
+ | No log | 17.0769 | 444 | 1.4600 | 0.2514 | 1.4600 | 1.2083 |
274
+ | No log | 17.1538 | 446 | 1.5477 | 0.2595 | 1.5477 | 1.2441 |
275
+ | No log | 17.2308 | 448 | 1.5510 | 0.2595 | 1.5510 | 1.2454 |
276
+ | No log | 17.3077 | 450 | 1.4088 | 0.2632 | 1.4088 | 1.1869 |
277
+ | No log | 17.3846 | 452 | 1.2676 | 0.2037 | 1.2676 | 1.1259 |
278
+ | No log | 17.4615 | 454 | 1.2745 | 0.2037 | 1.2745 | 1.1290 |
279
+ | No log | 17.5385 | 456 | 1.3577 | 0.2910 | 1.3577 | 1.1652 |
280
+ | No log | 17.6154 | 458 | 1.4423 | 0.2514 | 1.4423 | 1.2009 |
281
+ | No log | 17.6923 | 460 | 1.4575 | 0.2348 | 1.4575 | 1.2073 |
282
+ | No log | 17.7692 | 462 | 1.4629 | 0.2542 | 1.4629 | 1.2095 |
283
+ | No log | 17.8462 | 464 | 1.5004 | 0.2542 | 1.5004 | 1.2249 |
284
+ | No log | 17.9231 | 466 | 1.5332 | 0.2493 | 1.5332 | 1.2382 |
285
+ | No log | 18.0 | 468 | 1.5930 | 0.2489 | 1.5930 | 1.2621 |
286
+ | No log | 18.0769 | 470 | 1.5329 | 0.2493 | 1.5329 | 1.2381 |
287
+ | No log | 18.1538 | 472 | 1.4974 | 0.2465 | 1.4974 | 1.2237 |
288
+ | No log | 18.2308 | 474 | 1.4756 | 0.2240 | 1.4756 | 1.2148 |
289
+ | No log | 18.3077 | 476 | 1.5813 | 0.2154 | 1.5813 | 1.2575 |
290
+ | No log | 18.3846 | 478 | 1.6117 | 0.2348 | 1.6117 | 1.2695 |
291
+ | No log | 18.4615 | 480 | 1.5043 | 0.2406 | 1.5043 | 1.2265 |
292
+ | No log | 18.5385 | 482 | 1.3596 | 0.2676 | 1.3596 | 1.1660 |
293
+ | No log | 18.6154 | 484 | 1.3934 | 0.2201 | 1.3934 | 1.1804 |
294
+ | No log | 18.6923 | 486 | 1.3738 | 0.1795 | 1.3738 | 1.1721 |
295
+ | No log | 18.7692 | 488 | 1.3767 | 0.1745 | 1.3767 | 1.1733 |
296
+ | No log | 18.8462 | 490 | 1.3619 | 0.1745 | 1.3619 | 1.1670 |
297
+ | No log | 18.9231 | 492 | 1.2968 | 0.2046 | 1.2968 | 1.1388 |
298
+ | No log | 19.0 | 494 | 1.2807 | 0.2367 | 1.2807 | 1.1317 |
299
+ | No log | 19.0769 | 496 | 1.2822 | 0.2503 | 1.2822 | 1.1323 |
300
+ | No log | 19.1538 | 498 | 1.3484 | 0.2773 | 1.3484 | 1.1612 |
301
+ | 0.2836 | 19.2308 | 500 | 1.4385 | 0.2903 | 1.4385 | 1.1994 |
302
+ | 0.2836 | 19.3077 | 502 | 1.4755 | 0.2782 | 1.4755 | 1.2147 |
303
+ | 0.2836 | 19.3846 | 504 | 1.3809 | 0.2647 | 1.3809 | 1.1751 |
304
+ | 0.2836 | 19.4615 | 506 | 1.3203 | 0.2371 | 1.3203 | 1.1490 |
305
+ | 0.2836 | 19.5385 | 508 | 1.2806 | 0.2371 | 1.2806 | 1.1316 |
306
+ | 0.2836 | 19.6154 | 510 | 1.2537 | 0.2286 | 1.2537 | 1.1197 |

Although `num_epochs` is 100, the table stops at epoch 19.6154 (step 510): this commit is an in-progress checkpoint saved around step 500 (per the commit message), not the end of the full run.

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
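Because the config declares a single label (`LABEL_0`) with `"problem_type": "regression"`, the model emits one continuous organization score per input. A minimal inference sketch; the repository id is an assumption pieced together from the commit author and model name, and the input string is only illustrative:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repo id: commit author + model name from the card above.
repo = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k9_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# "essay text here" in Arabic; max_length matches max_position_embeddings.
inputs = tokenizer("نص المقال هنا", return_tensors="pt",
                   truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # one regression score
print(score)
```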
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4258cf090fa27c361b1fb81d8f49c43422240b75e50fdbc5b405009da9fe241c
size 540799996
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9e7b61323d699642629f6b65422488004291c580b0111a99c7fe6fb09055a1bc
size 5304
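Both binary files are committed as Git LFS pointers (spec version, SHA-256 object id, byte size) rather than raw blobs; the 540,799,996-byte weight file is consistent with a float32 BERT-base checkpoint with a 64,000-token vocabulary. A minimal sketch of resolving such a pointer through `huggingface_hub` (the repo id is the same assumption as in the inference example above):

```python
from huggingface_hub import hf_hub_download

# Assumed repo id, as above.
repo = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k9_task2_organization"
# Resolves the LFS pointer and downloads the actual safetensors payload.
path = hf_hub_download(repo_id=repo, filename="model.safetensors")
print(path)
```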