MayBashendy committed on
Commit
a821794 · verified · 1 Parent(s): 21fe533

Training in progress, step 500

Files changed (4)
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
1
+ ---
2
+ library_name: transformers
3
+ base_model: aubmindlab/bert-base-arabertv02
4
+ tags:
5
+ - generated_from_trainer
6
+ model-index:
7
+ - name: ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization
8
+ results: []
9
+ ---
10
+
11
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
12
+ should probably proofread and complete it, then remove this comment. -->
13
+
14
+ # ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization
15
+
16
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
17
+ It achieves the following results on the evaluation set:
18
+ - Loss: 1.2914
19
+ - Qwk: 0.0510
20
+ - Mse: 1.2914
21
+ - Rmse: 1.1364
22
+
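The Loss, Qwk (quadratic weighted kappa), Mse, and Rmse values above are not explained further in the card; below is a minimal sketch of how such metrics are typically computed for a regression-style essay scorer, assuming the continuous predictions are rounded to integer scores before the kappa computation (the card does not include the actual `compute_metrics` function).

```python
# Minimal metric sketch (assumed setup: regression outputs rounded to
# integer scores for QWK; MSE/RMSE computed on the raw predictions).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels) -> dict:
    preds = np.asarray(preds).squeeze()
    labels = np.asarray(labels).squeeze()
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Dummy example: three essays with gold scores 1, 1, 3.
print(compute_metrics([1.2, 0.8, 2.9], [1.0, 1.0, 3.0]))
```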
23
+ ## Model description
24
+
25
+ More information needed
26
+
27
+ ## Intended uses & limitations
28
+
29
+ More information needed
30
+
31
+ ## Training and evaluation data
32
+
33
+ More information needed
34
+
35
+ ## Training procedure
36
+
37
+ ### Training hyperparameters
38
+
39
+ The following hyperparameters were used during training:
40
+ - learning_rate: 2e-05
41
+ - train_batch_size: 8
42
+ - eval_batch_size: 8
43
+ - seed: 42
44
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
45
+ - lr_scheduler_type: linear
46
+ - num_epochs: 100
47
+
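These hyperparameters map onto `transformers.TrainingArguments` roughly as in the sketch below; the output directory, evaluation cadence, and data pipeline are not stated in the card and are assumptions.

```python
# Rough TrainingArguments sketch matching the hyperparameters listed above.
# Dataset loading, tokenization, and compute_metrics wiring are assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```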
48
+ ### Training results
49
+
50
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
51
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
52
+ | No log | 0.1429 | 2 | 3.8892 | -0.0032 | 3.8892 | 1.9721 |
53
+ | No log | 0.2857 | 4 | 2.0468 | 0.0054 | 2.0468 | 1.4307 |
54
+ | No log | 0.4286 | 6 | 1.4195 | -0.0105 | 1.4195 | 1.1914 |
55
+ | No log | 0.5714 | 8 | 1.2807 | 0.1034 | 1.2807 | 1.1317 |
56
+ | No log | 0.7143 | 10 | 1.0967 | 0.1989 | 1.0967 | 1.0472 |
57
+ | No log | 0.8571 | 12 | 1.3777 | 0.0612 | 1.3777 | 1.1738 |
58
+ | No log | 1.0 | 14 | 1.3930 | 0.0612 | 1.3930 | 1.1802 |
59
+ | No log | 1.1429 | 16 | 1.1374 | 0.1160 | 1.1374 | 1.0665 |
60
+ | No log | 1.2857 | 18 | 1.1377 | 0.2243 | 1.1377 | 1.0666 |
61
+ | No log | 1.4286 | 20 | 1.3194 | 0.0584 | 1.3194 | 1.1486 |
62
+ | No log | 1.5714 | 22 | 1.6440 | 0.0399 | 1.6440 | 1.2822 |
63
+ | No log | 1.7143 | 24 | 1.4787 | 0.0 | 1.4787 | 1.2160 |
64
+ | No log | 1.8571 | 26 | 1.1646 | 0.0318 | 1.1646 | 1.0792 |
65
+ | No log | 2.0 | 28 | 1.0442 | 0.2663 | 1.0442 | 1.0219 |
66
+ | No log | 2.1429 | 30 | 1.0103 | 0.3071 | 1.0103 | 1.0051 |
67
+ | No log | 2.2857 | 32 | 1.0208 | 0.2921 | 1.0208 | 1.0103 |
68
+ | No log | 2.4286 | 34 | 1.0675 | 0.2515 | 1.0675 | 1.0332 |
69
+ | No log | 2.5714 | 36 | 1.1012 | 0.1752 | 1.1012 | 1.0494 |
70
+ | No log | 2.7143 | 38 | 1.0558 | 0.2615 | 1.0558 | 1.0275 |
71
+ | No log | 2.8571 | 40 | 1.0503 | 0.2505 | 1.0503 | 1.0248 |
72
+ | No log | 3.0 | 42 | 1.0718 | 0.2213 | 1.0718 | 1.0353 |
73
+ | No log | 3.1429 | 44 | 1.0529 | 0.2771 | 1.0529 | 1.0261 |
74
+ | No log | 3.2857 | 46 | 1.0525 | 0.2771 | 1.0525 | 1.0259 |
75
+ | No log | 3.4286 | 48 | 1.0795 | 0.1797 | 1.0795 | 1.0390 |
76
+ | No log | 3.5714 | 50 | 1.1653 | 0.0970 | 1.1653 | 1.0795 |
77
+ | No log | 3.7143 | 52 | 1.2110 | 0.1634 | 1.2110 | 1.1004 |
78
+ | No log | 3.8571 | 54 | 1.2047 | 0.1634 | 1.2047 | 1.0976 |
79
+ | No log | 4.0 | 56 | 1.1560 | 0.1140 | 1.1560 | 1.0752 |
80
+ | No log | 4.1429 | 58 | 1.1816 | 0.1224 | 1.1816 | 1.0870 |
81
+ | No log | 4.2857 | 60 | 1.2786 | 0.1462 | 1.2786 | 1.1308 |
82
+ | No log | 4.4286 | 62 | 1.4327 | 0.1288 | 1.4327 | 1.1969 |
83
+ | No log | 4.5714 | 64 | 1.5185 | 0.1371 | 1.5185 | 1.2323 |
84
+ | No log | 4.7143 | 66 | 1.4602 | 0.1371 | 1.4602 | 1.2084 |
85
+ | No log | 4.8571 | 68 | 1.5343 | 0.0493 | 1.5343 | 1.2387 |
86
+ | No log | 5.0 | 70 | 1.6113 | 0.0406 | 1.6113 | 1.2694 |
87
+ | No log | 5.1429 | 72 | 1.6435 | 0.0090 | 1.6435 | 1.2820 |
88
+ | No log | 5.2857 | 74 | 1.4828 | 0.0554 | 1.4828 | 1.2177 |
89
+ | No log | 5.4286 | 76 | 1.3146 | 0.0961 | 1.3146 | 1.1465 |
90
+ | No log | 5.5714 | 78 | 1.3639 | 0.0510 | 1.3639 | 1.1679 |
91
+ | No log | 5.7143 | 80 | 1.3477 | 0.0510 | 1.3477 | 1.1609 |
92
+ | No log | 5.8571 | 82 | 1.4044 | 0.0510 | 1.4044 | 1.1851 |
93
+ | No log | 6.0 | 84 | 1.5904 | 0.1058 | 1.5904 | 1.2611 |
94
+ | No log | 6.1429 | 86 | 1.5807 | 0.1058 | 1.5807 | 1.2573 |
95
+ | No log | 6.2857 | 88 | 1.4733 | 0.1486 | 1.4733 | 1.2138 |
96
+ | No log | 6.4286 | 90 | 1.3691 | 0.1142 | 1.3691 | 1.1701 |
97
+ | No log | 6.5714 | 92 | 1.3262 | 0.1142 | 1.3262 | 1.1516 |
98
+ | No log | 6.7143 | 94 | 1.2965 | 0.1142 | 1.2965 | 1.1387 |
99
+ | No log | 6.8571 | 96 | 1.2155 | 0.0510 | 1.2155 | 1.1025 |
100
+ | No log | 7.0 | 98 | 1.1349 | 0.0931 | 1.1349 | 1.0653 |
101
+ | No log | 7.1429 | 100 | 1.0748 | 0.1203 | 1.0748 | 1.0367 |
102
+ | No log | 7.2857 | 102 | 1.1166 | 0.0961 | 1.1166 | 1.0567 |
103
+ | No log | 7.4286 | 104 | 1.3030 | 0.2126 | 1.3030 | 1.1415 |
104
+ | No log | 7.5714 | 106 | 1.6692 | 0.1078 | 1.6692 | 1.2920 |
105
+ | No log | 7.7143 | 108 | 1.6714 | 0.1264 | 1.6714 | 1.2928 |
106
+ | No log | 7.8571 | 110 | 1.3940 | 0.2062 | 1.3940 | 1.1807 |
107
+ | No log | 8.0 | 112 | 1.3166 | 0.2065 | 1.3166 | 1.1474 |
108
+ | No log | 8.1429 | 114 | 1.3863 | 0.1943 | 1.3863 | 1.1774 |
109
+ | No log | 8.2857 | 116 | 1.4241 | 0.1943 | 1.4241 | 1.1933 |
110
+ | No log | 8.4286 | 118 | 1.4155 | 0.1634 | 1.4155 | 1.1897 |
111
+ | No log | 8.5714 | 120 | 1.3762 | 0.1744 | 1.3762 | 1.1731 |
112
+ | No log | 8.7143 | 122 | 1.3380 | 0.1744 | 1.3380 | 1.1567 |
113
+ | No log | 8.8571 | 124 | 1.1536 | 0.0587 | 1.1536 | 1.0740 |
114
+ | No log | 9.0 | 126 | 1.1241 | 0.1954 | 1.1241 | 1.0602 |
115
+ | No log | 9.1429 | 128 | 1.1950 | 0.0587 | 1.1950 | 1.0932 |
116
+ | No log | 9.2857 | 130 | 1.4293 | 0.2126 | 1.4293 | 1.1956 |
117
+ | No log | 9.4286 | 132 | 1.5101 | 0.1141 | 1.5101 | 1.2289 |
118
+ | No log | 9.5714 | 134 | 1.3986 | 0.2126 | 1.3986 | 1.1826 |
119
+ | No log | 9.7143 | 136 | 1.2083 | 0.0833 | 1.2083 | 1.0992 |
120
+ | No log | 9.8571 | 138 | 1.1524 | 0.0587 | 1.1524 | 1.0735 |
121
+ | No log | 10.0 | 140 | 1.2128 | 0.0833 | 1.2128 | 1.1013 |
122
+ | No log | 10.1429 | 142 | 1.3870 | 0.1814 | 1.3870 | 1.1777 |
123
+ | No log | 10.2857 | 144 | 1.4594 | 0.1703 | 1.4594 | 1.2080 |
124
+ | No log | 10.4286 | 146 | 1.5208 | 0.1703 | 1.5208 | 1.2332 |
125
+ | No log | 10.5714 | 148 | 1.4091 | 0.2126 | 1.4091 | 1.1871 |
126
+ | No log | 10.7143 | 150 | 1.2407 | 0.1814 | 1.2407 | 1.1139 |
127
+ | No log | 10.8571 | 152 | 1.1903 | 0.1142 | 1.1903 | 1.0910 |
128
+ | No log | 11.0 | 154 | 1.2027 | 0.2126 | 1.2027 | 1.0967 |
129
+ | No log | 11.1429 | 156 | 1.2788 | 0.2126 | 1.2788 | 1.1308 |
130
+ | No log | 11.2857 | 158 | 1.3535 | 0.2126 | 1.3535 | 1.1634 |
131
+ | No log | 11.4286 | 160 | 1.4454 | 0.2126 | 1.4454 | 1.2022 |
132
+ | No log | 11.5714 | 162 | 1.4563 | 0.2126 | 1.4563 | 1.2068 |
133
+ | No log | 11.7143 | 164 | 1.3317 | 0.1814 | 1.3317 | 1.1540 |
134
+ | No log | 11.8571 | 166 | 1.2794 | 0.1024 | 1.2794 | 1.1311 |
135
+ | No log | 12.0 | 168 | 1.3960 | 0.2126 | 1.3960 | 1.1815 |
136
+ | No log | 12.1429 | 170 | 1.5328 | 0.2424 | 1.5328 | 1.2381 |
137
+ | No log | 12.2857 | 172 | 1.4158 | 0.2126 | 1.4158 | 1.1899 |
138
+ | No log | 12.4286 | 174 | 1.2319 | 0.1744 | 1.2319 | 1.1099 |
139
+ | No log | 12.5714 | 176 | 1.2170 | 0.1370 | 1.2170 | 1.1032 |
140
+ | No log | 12.7143 | 178 | 1.2864 | 0.2126 | 1.2864 | 1.1342 |
141
+ | No log | 12.8571 | 180 | 1.3166 | 0.2126 | 1.3166 | 1.1474 |
142
+ | No log | 13.0 | 182 | 1.2895 | 0.2126 | 1.2895 | 1.1356 |
143
+ | No log | 13.1429 | 184 | 1.3368 | 0.2062 | 1.3368 | 1.1562 |
144
+ | No log | 13.2857 | 186 | 1.4846 | 0.2437 | 1.4846 | 1.2184 |
145
+ | No log | 13.4286 | 188 | 1.5046 | 0.1955 | 1.5046 | 1.2266 |
146
+ | No log | 13.5714 | 190 | 1.5275 | 0.1955 | 1.5275 | 1.2359 |
147
+ | No log | 13.7143 | 192 | 1.3838 | 0.1667 | 1.3838 | 1.1764 |
148
+ | No log | 13.8571 | 194 | 1.1575 | 0.2203 | 1.1575 | 1.0759 |
149
+ | No log | 14.0 | 196 | 1.1416 | 0.1202 | 1.1416 | 1.0685 |
150
+ | No log | 14.1429 | 198 | 1.2124 | 0.0833 | 1.2124 | 1.1011 |
151
+ | No log | 14.2857 | 200 | 1.1951 | 0.0833 | 1.1951 | 1.0932 |
152
+ | No log | 14.4286 | 202 | 1.2378 | 0.0401 | 1.2378 | 1.1126 |
153
+ | No log | 14.5714 | 204 | 1.3404 | 0.1228 | 1.3404 | 1.1577 |
154
+ | No log | 14.7143 | 206 | 1.3406 | 0.1943 | 1.3406 | 1.1579 |
155
+ | No log | 14.8571 | 208 | 1.2583 | 0.2260 | 1.2583 | 1.1217 |
156
+ | No log | 15.0 | 210 | 1.1997 | 0.2225 | 1.1997 | 1.0953 |
157
+ | No log | 15.1429 | 212 | 1.3221 | 0.1562 | 1.3221 | 1.1498 |
158
+ | No log | 15.2857 | 214 | 1.3260 | 0.1562 | 1.3260 | 1.1515 |
159
+ | No log | 15.4286 | 216 | 1.1711 | 0.0401 | 1.1711 | 1.0822 |
160
+ | No log | 15.5714 | 218 | 1.1853 | 0.0401 | 1.1853 | 1.0887 |
161
+ | No log | 15.7143 | 220 | 1.2868 | 0.0613 | 1.2868 | 1.1344 |
162
+ | No log | 15.8571 | 222 | 1.1762 | 0.0401 | 1.1762 | 1.0845 |
163
+ | No log | 16.0 | 224 | 1.0842 | 0.1694 | 1.0842 | 1.0413 |
164
+ | No log | 16.1429 | 226 | 1.1432 | 0.2541 | 1.1432 | 1.0692 |
165
+ | No log | 16.2857 | 228 | 1.3369 | 0.2184 | 1.3369 | 1.1562 |
166
+ | No log | 16.4286 | 230 | 1.3468 | 0.1880 | 1.3468 | 1.1605 |
167
+ | No log | 16.5714 | 232 | 1.2070 | 0.2065 | 1.2070 | 1.0986 |
168
+ | No log | 16.7143 | 234 | 1.1931 | 0.0401 | 1.1931 | 1.0923 |
169
+ | No log | 16.8571 | 236 | 1.2331 | 0.1407 | 1.2331 | 1.1104 |
170
+ | No log | 17.0 | 238 | 1.1653 | 0.1744 | 1.1653 | 1.0795 |
171
+ | No log | 17.1429 | 240 | 1.1498 | 0.1744 | 1.1498 | 1.0723 |
172
+ | No log | 17.2857 | 242 | 1.1759 | 0.1744 | 1.1759 | 1.0844 |
173
+ | No log | 17.4286 | 244 | 1.1429 | 0.1744 | 1.1429 | 1.0691 |
174
+ | No log | 17.5714 | 246 | 1.0703 | 0.1265 | 1.0703 | 1.0345 |
175
+ | No log | 17.7143 | 248 | 1.1958 | 0.1744 | 1.1958 | 1.0935 |
176
+ | No log | 17.8571 | 250 | 1.2635 | 0.1407 | 1.2635 | 1.1241 |
177
+ | No log | 18.0 | 252 | 1.1934 | 0.0401 | 1.1934 | 1.0924 |
178
+ | No log | 18.1429 | 254 | 1.0519 | 0.1265 | 1.0519 | 1.0256 |
179
+ | No log | 18.2857 | 256 | 1.0271 | 0.1694 | 1.0271 | 1.0134 |
180
+ | No log | 18.4286 | 258 | 1.0463 | 0.1694 | 1.0463 | 1.0229 |
181
+ | No log | 18.5714 | 260 | 1.1147 | 0.1892 | 1.1147 | 1.0558 |
182
+ | No log | 18.7143 | 262 | 1.2511 | 0.1407 | 1.2511 | 1.1185 |
183
+ | No log | 18.8571 | 264 | 1.2534 | 0.1407 | 1.2534 | 1.1196 |
184
+ | No log | 19.0 | 266 | 1.2755 | 0.1407 | 1.2755 | 1.1294 |
185
+ | No log | 19.1429 | 268 | 1.2158 | 0.1052 | 1.2158 | 1.1026 |
186
+ | No log | 19.2857 | 270 | 1.1028 | 0.1330 | 1.1028 | 1.0501 |
187
+ | No log | 19.4286 | 272 | 1.1391 | 0.1111 | 1.1391 | 1.0673 |
188
+ | No log | 19.5714 | 274 | 1.3093 | 0.1486 | 1.3093 | 1.1442 |
189
+ | No log | 19.7143 | 276 | 1.4771 | 0.1943 | 1.4771 | 1.2154 |
190
+ | No log | 19.8571 | 278 | 1.4338 | 0.1943 | 1.4338 | 1.1974 |
191
+ | No log | 20.0 | 280 | 1.3260 | 0.1814 | 1.3260 | 1.1515 |
192
+ | No log | 20.1429 | 282 | 1.2055 | 0.2225 | 1.2055 | 1.0980 |
193
+ | No log | 20.2857 | 284 | 1.1066 | 0.1846 | 1.1066 | 1.0520 |
194
+ | No log | 20.4286 | 286 | 1.1355 | 0.1846 | 1.1355 | 1.0656 |
195
+ | No log | 20.5714 | 288 | 1.2796 | 0.0401 | 1.2796 | 1.1312 |
196
+ | No log | 20.7143 | 290 | 1.3692 | 0.0510 | 1.3692 | 1.1701 |
197
+ | No log | 20.8571 | 292 | 1.4277 | 0.0510 | 1.4277 | 1.1949 |
198
+ | No log | 21.0 | 294 | 1.4821 | 0.1880 | 1.4821 | 1.2174 |
199
+ | No log | 21.1429 | 296 | 1.4270 | 0.2062 | 1.4270 | 1.1946 |
200
+ | No log | 21.2857 | 298 | 1.4044 | 0.2184 | 1.4044 | 1.1851 |
201
+ | No log | 21.4286 | 300 | 1.4462 | 0.1880 | 1.4462 | 1.2026 |
202
+ | No log | 21.5714 | 302 | 1.3957 | 0.0510 | 1.3957 | 1.1814 |
203
+ | No log | 21.7143 | 304 | 1.3004 | 0.0401 | 1.3004 | 1.1404 |
204
+ | No log | 21.8571 | 306 | 1.2154 | 0.0401 | 1.2154 | 1.1025 |
205
+ | No log | 22.0 | 308 | 1.1573 | 0.0833 | 1.1573 | 1.0758 |
206
+ | No log | 22.1429 | 310 | 1.1561 | 0.0833 | 1.1561 | 1.0752 |
207
+ | No log | 22.2857 | 312 | 1.2844 | 0.1744 | 1.2844 | 1.1333 |
208
+ | No log | 22.4286 | 314 | 1.4677 | 0.2474 | 1.4677 | 1.2115 |
209
+ | No log | 22.5714 | 316 | 1.6027 | 0.2058 | 1.6027 | 1.2660 |
210
+ | No log | 22.7143 | 318 | 1.5519 | 0.2474 | 1.5519 | 1.2457 |
211
+ | No log | 22.8571 | 320 | 1.4056 | 0.1880 | 1.4056 | 1.1856 |
212
+ | No log | 23.0 | 322 | 1.2393 | 0.1407 | 1.2393 | 1.1133 |
213
+ | No log | 23.1429 | 324 | 1.2411 | 0.0401 | 1.2411 | 1.1140 |
214
+ | No log | 23.2857 | 326 | 1.3306 | 0.1142 | 1.3306 | 1.1535 |
215
+ | No log | 23.4286 | 328 | 1.4888 | 0.1814 | 1.4888 | 1.2202 |
216
+ | No log | 23.5714 | 330 | 1.5817 | 0.1814 | 1.5817 | 1.2577 |
217
+ | No log | 23.7143 | 332 | 1.5769 | 0.2342 | 1.5769 | 1.2557 |
218
+ | No log | 23.8571 | 334 | 1.4318 | 0.2793 | 1.4318 | 1.1966 |
219
+ | No log | 24.0 | 336 | 1.2620 | 0.2424 | 1.2620 | 1.1234 |
220
+ | No log | 24.1429 | 338 | 1.1489 | 0.2149 | 1.1489 | 1.0719 |
221
+ | No log | 24.2857 | 340 | 1.2378 | 0.2126 | 1.2378 | 1.1126 |
222
+ | No log | 24.4286 | 342 | 1.3756 | 0.2184 | 1.3756 | 1.1728 |
223
+ | No log | 24.5714 | 344 | 1.4071 | 0.1486 | 1.4071 | 1.1862 |
224
+ | No log | 24.7143 | 346 | 1.4237 | 0.0510 | 1.4237 | 1.1932 |
225
+ | No log | 24.8571 | 348 | 1.3540 | 0.0510 | 1.3540 | 1.1636 |
226
+ | No log | 25.0 | 350 | 1.2950 | 0.0510 | 1.2950 | 1.1380 |
227
+ | No log | 25.1429 | 352 | 1.2907 | 0.0510 | 1.2907 | 1.1361 |
228
+ | No log | 25.2857 | 354 | 1.3015 | 0.0510 | 1.3015 | 1.1408 |
229
+ | No log | 25.4286 | 356 | 1.2866 | 0.0878 | 1.2866 | 1.1343 |
230
+ | No log | 25.5714 | 358 | 1.3262 | 0.1486 | 1.3262 | 1.1516 |
231
+ | No log | 25.7143 | 360 | 1.2861 | 0.0878 | 1.2861 | 1.1340 |
232
+ | No log | 25.8571 | 362 | 1.1958 | 0.1202 | 1.1958 | 1.0935 |
233
+ | No log | 26.0 | 364 | 1.1528 | 0.1202 | 1.1528 | 1.0737 |
234
+ | No log | 26.1429 | 366 | 1.2293 | 0.0781 | 1.2293 | 1.1087 |
235
+ | No log | 26.2857 | 368 | 1.3698 | 0.1562 | 1.3698 | 1.1704 |
236
+ | No log | 26.4286 | 370 | 1.4589 | 0.2568 | 1.4589 | 1.2079 |
237
+ | No log | 26.5714 | 372 | 1.4295 | 0.2292 | 1.4295 | 1.1956 |
238
+ | No log | 26.7143 | 374 | 1.3183 | 0.1228 | 1.3183 | 1.1482 |
239
+ | No log | 26.8571 | 376 | 1.2021 | 0.0833 | 1.2021 | 1.0964 |
240
+ | No log | 27.0 | 378 | 1.1621 | 0.0833 | 1.1621 | 1.0780 |
241
+ | No log | 27.1429 | 380 | 1.1720 | 0.0833 | 1.1720 | 1.0826 |
242
+ | No log | 27.2857 | 382 | 1.2039 | 0.1288 | 1.2039 | 1.0972 |
243
+ | No log | 27.4286 | 384 | 1.2674 | 0.1562 | 1.2674 | 1.1258 |
244
+ | No log | 27.5714 | 386 | 1.2723 | 0.2424 | 1.2723 | 1.1280 |
245
+ | No log | 27.7143 | 388 | 1.2644 | 0.1814 | 1.2644 | 1.1245 |
246
+ | No log | 27.8571 | 390 | 1.1576 | 0.0781 | 1.1576 | 1.0759 |
247
+ | No log | 28.0 | 392 | 1.0507 | 0.0961 | 1.0507 | 1.0251 |
248
+ | No log | 28.1429 | 394 | 1.0506 | 0.0961 | 1.0506 | 1.0250 |
249
+ | No log | 28.2857 | 396 | 1.1084 | 0.1202 | 1.1084 | 1.0528 |
250
+ | No log | 28.4286 | 398 | 1.2174 | 0.0781 | 1.2174 | 1.1034 |
251
+ | No log | 28.5714 | 400 | 1.3668 | 0.1814 | 1.3668 | 1.1691 |
252
+ | No log | 28.7143 | 402 | 1.4044 | 0.2004 | 1.4044 | 1.1851 |
253
+ | No log | 28.8571 | 404 | 1.3965 | 0.2004 | 1.3965 | 1.1817 |
254
+ | No log | 29.0 | 406 | 1.3565 | 0.2184 | 1.3565 | 1.1647 |
255
+ | No log | 29.1429 | 408 | 1.2849 | 0.2203 | 1.2849 | 1.1335 |
256
+ | No log | 29.2857 | 410 | 1.2515 | 0.1886 | 1.2515 | 1.1187 |
257
+ | No log | 29.4286 | 412 | 1.2747 | 0.0931 | 1.2747 | 1.1290 |
258
+ | No log | 29.5714 | 414 | 1.2302 | 0.0931 | 1.2302 | 1.1091 |
259
+ | No log | 29.7143 | 416 | 1.1853 | 0.0833 | 1.1853 | 1.0887 |
260
+ | No log | 29.8571 | 418 | 1.2085 | 0.0931 | 1.2085 | 1.0993 |
261
+ | No log | 30.0 | 420 | 1.2637 | 0.0510 | 1.2637 | 1.1241 |
262
+ | No log | 30.1429 | 422 | 1.3182 | 0.0510 | 1.3182 | 1.1481 |
263
+ | No log | 30.2857 | 424 | 1.3501 | 0.0510 | 1.3501 | 1.1620 |
264
+ | No log | 30.4286 | 426 | 1.3474 | 0.0510 | 1.3474 | 1.1608 |
265
+ | No log | 30.5714 | 428 | 1.3347 | 0.1886 | 1.3347 | 1.1553 |
266
+ | No log | 30.7143 | 430 | 1.4571 | 0.2184 | 1.4571 | 1.2071 |
267
+ | No log | 30.8571 | 432 | 1.4950 | 0.1880 | 1.4950 | 1.2227 |
268
+ | No log | 31.0 | 434 | 1.4071 | 0.1486 | 1.4071 | 1.1862 |
269
+ | No log | 31.1429 | 436 | 1.3217 | 0.0510 | 1.3217 | 1.1497 |
270
+ | No log | 31.2857 | 438 | 1.2657 | 0.0510 | 1.2657 | 1.1251 |
271
+ | No log | 31.4286 | 440 | 1.2749 | 0.0510 | 1.2749 | 1.1291 |
272
+ | No log | 31.5714 | 442 | 1.3215 | 0.0510 | 1.3215 | 1.1496 |
273
+ | No log | 31.7143 | 444 | 1.3451 | 0.0510 | 1.3451 | 1.1598 |
274
+ | No log | 31.8571 | 446 | 1.3870 | 0.0510 | 1.3870 | 1.1777 |
275
+ | No log | 32.0 | 448 | 1.4102 | 0.1814 | 1.4102 | 1.1875 |
276
+ | No log | 32.1429 | 450 | 1.3543 | 0.2506 | 1.3543 | 1.1637 |
277
+ | No log | 32.2857 | 452 | 1.3283 | 0.2506 | 1.3283 | 1.1525 |
278
+ | No log | 32.4286 | 454 | 1.2310 | 0.2203 | 1.2310 | 1.1095 |
279
+ | No log | 32.5714 | 456 | 1.1921 | 0.1552 | 1.1921 | 1.0918 |
280
+ | No log | 32.7143 | 458 | 1.2125 | 0.0931 | 1.2125 | 1.1011 |
281
+ | No log | 32.8571 | 460 | 1.3056 | 0.1486 | 1.3056 | 1.1426 |
282
+ | No log | 33.0 | 462 | 1.3263 | 0.1486 | 1.3263 | 1.1516 |
283
+ | No log | 33.1429 | 464 | 1.3754 | 0.2126 | 1.3754 | 1.1728 |
284
+ | No log | 33.2857 | 466 | 1.4198 | 0.2474 | 1.4198 | 1.1916 |
285
+ | No log | 33.4286 | 468 | 1.3760 | 0.2184 | 1.3760 | 1.1730 |
286
+ | No log | 33.5714 | 470 | 1.2931 | 0.1886 | 1.2931 | 1.1372 |
287
+ | No log | 33.7143 | 472 | 1.2586 | 0.1552 | 1.2586 | 1.1219 |
288
+ | No log | 33.8571 | 474 | 1.2573 | 0.1552 | 1.2573 | 1.1213 |
289
+ | No log | 34.0 | 476 | 1.3291 | 0.2260 | 1.3291 | 1.1529 |
290
+ | No log | 34.1429 | 478 | 1.4402 | 0.1562 | 1.4402 | 1.2001 |
291
+ | No log | 34.2857 | 480 | 1.5007 | 0.1562 | 1.5007 | 1.2250 |
292
+ | No log | 34.4286 | 482 | 1.5721 | 0.1562 | 1.5721 | 1.2538 |
293
+ | No log | 34.5714 | 484 | 1.5650 | 0.1562 | 1.5650 | 1.2510 |
294
+ | No log | 34.7143 | 486 | 1.5314 | 0.1562 | 1.5314 | 1.2375 |
295
+ | No log | 34.8571 | 488 | 1.4324 | 0.1880 | 1.4324 | 1.1968 |
296
+ | No log | 35.0 | 490 | 1.4050 | 0.1814 | 1.4050 | 1.1853 |
297
+ | No log | 35.1429 | 492 | 1.3430 | 0.1814 | 1.3430 | 1.1589 |
298
+ | No log | 35.2857 | 494 | 1.2296 | 0.1473 | 1.2296 | 1.1089 |
299
+ | No log | 35.4286 | 496 | 1.2014 | 0.0833 | 1.2014 | 1.0961 |
300
+ | No log | 35.5714 | 498 | 1.2827 | 0.1142 | 1.2827 | 1.1325 |
301
+ | 0.2282 | 35.7143 | 500 | 1.4196 | 0.1486 | 1.4196 | 1.1915 |
302
+ | 0.2282 | 35.8571 | 502 | 1.4867 | 0.1486 | 1.4867 | 1.2193 |
303
+ | 0.2282 | 36.0 | 504 | 1.4594 | 0.0878 | 1.4594 | 1.2080 |
304
+ | 0.2282 | 36.1429 | 506 | 1.4238 | 0.0878 | 1.4238 | 1.1932 |
305
+ | 0.2282 | 36.2857 | 508 | 1.3471 | 0.0510 | 1.3471 | 1.1606 |
306
+ | 0.2282 | 36.4286 | 510 | 1.2914 | 0.0510 | 1.2914 | 1.1364 |
307
+
308
+
309
+ ### Framework versions
310
+
311
+ - Transformers 4.44.2
312
+ - Pytorch 2.4.0+cu118
313
+ - Datasets 2.21.0
314
+ - Tokenizers 0.19.1
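As a usage note not present in the original card: the checkpoint is a single-output regression head on top of AraBERT, so inference returns one continuous organization score per essay. The repo id below is assumed from the commit author and the model name.

```python
# Minimal inference sketch (repo id assumed; single regression output).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Example Arabic essay text; truncate to the model's 512-token limit.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"Predicted organization score: {score:.2f}")
```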
config.json ADDED
@@ -0,0 +1,32 @@
1
+ {
2
+ "_name_or_path": "aubmindlab/bert-base-arabertv02",
3
+ "architectures": [
4
+ "BertForSequenceClassification"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.1,
7
+ "classifier_dropout": null,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.1,
10
+ "hidden_size": 768,
11
+ "id2label": {
12
+ "0": "LABEL_0"
13
+ },
14
+ "initializer_range": 0.02,
15
+ "intermediate_size": 3072,
16
+ "label2id": {
17
+ "LABEL_0": 0
18
+ },
19
+ "layer_norm_eps": 1e-12,
20
+ "max_position_embeddings": 512,
21
+ "model_type": "bert",
22
+ "num_attention_heads": 12,
23
+ "num_hidden_layers": 12,
24
+ "pad_token_id": 0,
25
+ "position_embedding_type": "absolute",
26
+ "problem_type": "regression",
27
+ "torch_dtype": "float32",
28
+ "transformers_version": "4.44.2",
29
+ "type_vocab_size": 2,
30
+ "use_cache": true,
31
+ "vocab_size": 64000
32
+ }
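Because `id2label` has a single entry and `problem_type` is `"regression"`, the classification head produces one continuous score and the Trainer optimizes an MSE loss. A small sketch for inspecting these fields programmatically (repo id assumed as in the usage example above):

```python
# Sketch: inspect the shipped configuration (repo id is an assumption).
from transformers import AutoConfig

config = AutoConfig.from_pretrained(
    "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization"
)
print(config.problem_type)  # "regression" -> single-score MSE objective
print(config.num_labels)    # 1, from the single id2label entry
print(config.vocab_size)    # 64000, inherited from arabertv02
```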
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:912e5419cec2ebd2137c6f77e7ca1d7c8391737e654ae813bb5a5316e252a678
3
+ size 540799996
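The three lines above are a Git LFS pointer, not the weights themselves; the actual ~540 MB `model.safetensors` can be fetched with `huggingface_hub` (repo id assumed), for example:

```python
# Sketch: download the real weights file referenced by the LFS pointer above.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization",
    filename="model.safetensors",
)
print(path)  # local cache path of the ~540 MB safetensors file
```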
training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:ba40c23aaa7b3fcafce062a4c34a0247183534fdc5f74d12f816cf43c69d8570
3
+ size 5368
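`training_args.bin` is the pickled `TrainingArguments` object the Trainer saves alongside the checkpoint. Assuming a local copy, it can be inspected as sketched below; `weights_only=False` is needed because the file stores a Python object rather than tensors, so only unpickle it if you trust the source.

```python
# Sketch: inspect the pickled TrainingArguments saved by the Trainer.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.num_train_epochs, args.seed)
```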