MayBashendy committed on
Commit 2cfc0f3 · verified · 1 Parent(s): 3b76660

Training in progress, step 500

Files changed (4)
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task5_organization
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task5_organization
+
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.5647
+ - Qwk: 0.2566
+ - Mse: 1.5647
+ - Rmse: 1.2509
+
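+ ## How to use
+
+ No usage snippet was generated for this card, so the following is a minimal sketch only: the repo id is inferred from the committer's namespace and the model name (an assumption), and the input string is a placeholder. Per `config.json`, the head is a single-output regression classifier.
+
+ ```python
+ # Hedged sketch: score one essay with the regression head.
+ # The repo id is assumed from the commit author + model name; adjust if needed.
+ import torch
+ from transformers import AutoModelForSequenceClassification, AutoTokenizer
+
+ repo = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task5_organization"
+ tokenizer = AutoTokenizer.from_pretrained(repo)
+ model = AutoModelForSequenceClassification.from_pretrained(repo)  # problem_type: regression
+
+ # Placeholder essay text; replace with a real Arabic essay.
+ inputs = tokenizer("ضع نص المقال هنا", return_tensors="pt", truncation=True, max_length=512)
+ with torch.no_grad():
+     score = model(**inputs).logits.squeeze().item()  # one scalar organization score
+ print(score)
+ ```
+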
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
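+ The same settings can be written down as `TrainingArguments`. This is a sketch for reference, not the exact invocation: only the values listed above are recorded, so the `output_dir` (and any logging/eval cadence) is an assumption.
+
+ ```python
+ # Sketch reproducing only the hyperparameters listed above.
+ from transformers import TrainingArguments
+
+ args = TrainingArguments(
+     output_dir="arabert_run1_task5_organization",  # assumption: not recorded in the card
+     learning_rate=2e-05,
+     per_device_train_batch_size=8,
+     per_device_eval_batch_size=8,
+     seed=42,
+     adam_beta1=0.9,      # optimizer: Adam with betas=(0.9, 0.999)
+     adam_beta2=0.999,
+     adam_epsilon=1e-08,
+     lr_scheduler_type="linear",
+     num_train_epochs=100,
+ )
+ ```
+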
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.0833 | 2 | 4.0352 | -0.0008 | 4.0352 | 2.0088 |
+ | No log | 0.1667 | 4 | 2.5663 | -0.0088 | 2.5663 | 1.6020 |
+ | No log | 0.25 | 6 | 1.7523 | 0.0435 | 1.7523 | 1.3237 |
+ | No log | 0.3333 | 8 | 1.1986 | 0.1255 | 1.1986 | 1.0948 |
+ | No log | 0.4167 | 10 | 1.0601 | 0.2663 | 1.0601 | 1.0296 |
+ | No log | 0.5 | 12 | 1.1813 | -0.0075 | 1.1813 | 1.0869 |
+ | No log | 0.5833 | 14 | 1.2480 | 0.0021 | 1.2480 | 1.1171 |
+ | No log | 0.6667 | 16 | 1.1057 | 0.1062 | 1.1057 | 1.0515 |
+ | No log | 0.75 | 18 | 1.0794 | 0.2366 | 1.0794 | 1.0389 |
+ | No log | 0.8333 | 20 | 1.1134 | 0.2293 | 1.1134 | 1.0552 |
+ | No log | 0.9167 | 22 | 1.2043 | 0.0790 | 1.2043 | 1.0974 |
+ | No log | 1.0 | 24 | 1.1877 | 0.1205 | 1.1877 | 1.0898 |
+ | No log | 1.0833 | 26 | 1.1884 | 0.1618 | 1.1884 | 1.0902 |
+ | No log | 1.1667 | 28 | 1.5477 | 0.0399 | 1.5477 | 1.2440 |
+ | No log | 1.25 | 30 | 1.5679 | 0.0 | 1.5679 | 1.2522 |
+ | No log | 1.3333 | 32 | 1.2898 | 0.0996 | 1.2898 | 1.1357 |
+ | No log | 1.4167 | 34 | 1.0327 | 0.1521 | 1.0327 | 1.0162 |
+ | No log | 1.5 | 36 | 1.0051 | 0.1713 | 1.0051 | 1.0025 |
+ | No log | 1.5833 | 38 | 0.9994 | 0.1313 | 0.9994 | 0.9997 |
+ | No log | 1.6667 | 40 | 1.0764 | 0.0941 | 1.0764 | 1.0375 |
+ | No log | 1.75 | 42 | 1.0842 | 0.1537 | 1.0842 | 1.0412 |
+ | No log | 1.8333 | 44 | 0.9882 | 0.1685 | 0.9882 | 0.9941 |
+ | No log | 1.9167 | 46 | 0.9861 | 0.2746 | 0.9861 | 0.9930 |
+ | No log | 2.0 | 48 | 0.9850 | 0.1532 | 0.9850 | 0.9925 |
+ | No log | 2.0833 | 50 | 1.0110 | 0.2108 | 1.0110 | 1.0055 |
+ | No log | 2.1667 | 52 | 1.1221 | 0.1846 | 1.1221 | 1.0593 |
+ | No log | 2.25 | 54 | 1.1521 | 0.2040 | 1.1521 | 1.0734 |
+ | No log | 2.3333 | 56 | 1.0663 | 0.2895 | 1.0663 | 1.0326 |
+ | No log | 2.4167 | 58 | 1.0357 | 0.2574 | 1.0357 | 1.0177 |
+ | No log | 2.5 | 60 | 0.9391 | 0.2251 | 0.9391 | 0.9691 |
+ | No log | 2.5833 | 62 | 0.9399 | 0.2276 | 0.9399 | 0.9695 |
+ | No log | 2.6667 | 64 | 1.1240 | 0.2149 | 1.1240 | 1.0602 |
+ | No log | 2.75 | 66 | 1.3706 | 0.0636 | 1.3706 | 1.1707 |
+ | No log | 2.8333 | 68 | 1.3196 | 0.1986 | 1.3196 | 1.1488 |
+ | No log | 2.9167 | 70 | 1.1582 | 0.1893 | 1.1582 | 1.0762 |
+ | No log | 3.0 | 72 | 1.1241 | 0.2567 | 1.1241 | 1.0603 |
+ | No log | 3.0833 | 74 | 1.3890 | 0.2650 | 1.3890 | 1.1786 |
+ | No log | 3.1667 | 76 | 1.6219 | 0.1575 | 1.6219 | 1.2735 |
+ | No log | 3.25 | 78 | 1.3375 | 0.2729 | 1.3375 | 1.1565 |
+ | No log | 3.3333 | 80 | 1.0166 | 0.2636 | 1.0166 | 1.0083 |
+ | No log | 3.4167 | 82 | 1.0140 | 0.2505 | 1.0140 | 1.0070 |
+ | No log | 3.5 | 84 | 1.3260 | 0.2812 | 1.3260 | 1.1515 |
+ | No log | 3.5833 | 86 | 1.7867 | 0.0094 | 1.7867 | 1.3367 |
+ | No log | 3.6667 | 88 | 1.6855 | 0.0864 | 1.6855 | 1.2982 |
+ | No log | 3.75 | 90 | 1.2645 | 0.2795 | 1.2645 | 1.1245 |
+ | No log | 3.8333 | 92 | 0.9626 | 0.2167 | 0.9626 | 0.9811 |
+ | No log | 3.9167 | 94 | 0.9549 | 0.2577 | 0.9549 | 0.9772 |
+ | No log | 4.0 | 96 | 1.0026 | 0.2238 | 1.0026 | 1.0013 |
+ | No log | 4.0833 | 98 | 1.1935 | 0.2203 | 1.1935 | 1.0925 |
+ | No log | 4.1667 | 100 | 1.4696 | -0.0122 | 1.4696 | 1.2123 |
+ | No log | 4.25 | 102 | 1.4692 | 0.0033 | 1.4692 | 1.2121 |
+ | No log | 4.3333 | 104 | 1.2665 | 0.2477 | 1.2665 | 1.1254 |
+ | No log | 4.4167 | 106 | 1.2112 | 0.2409 | 1.2112 | 1.1005 |
+ | No log | 4.5 | 108 | 1.3189 | 0.2195 | 1.3189 | 1.1484 |
+ | No log | 4.5833 | 110 | 1.4691 | 0.2588 | 1.4691 | 1.2120 |
+ | No log | 4.6667 | 112 | 1.5950 | 0.2511 | 1.5950 | 1.2629 |
+ | No log | 4.75 | 114 | 1.5122 | 0.2771 | 1.5122 | 1.2297 |
+ | No log | 4.8333 | 116 | 1.3074 | 0.2686 | 1.3074 | 1.1434 |
+ | No log | 4.9167 | 118 | 1.3080 | 0.2686 | 1.3080 | 1.1437 |
+ | No log | 5.0 | 120 | 1.5024 | 0.2292 | 1.5024 | 1.2257 |
+ | No log | 5.0833 | 122 | 1.6247 | 0.1902 | 1.6247 | 1.2747 |
+ | No log | 5.1667 | 124 | 1.5717 | 0.2342 | 1.5717 | 1.2537 |
+ | No log | 5.25 | 126 | 1.3994 | 0.2506 | 1.3994 | 1.1830 |
+ | No log | 5.3333 | 128 | 1.2788 | 0.2284 | 1.2788 | 1.1309 |
+ | No log | 5.4167 | 130 | 1.3171 | 0.2203 | 1.3171 | 1.1477 |
+ | No log | 5.5 | 132 | 1.3587 | 0.2203 | 1.3587 | 1.1656 |
+ | No log | 5.5833 | 134 | 1.4580 | 0.2555 | 1.4580 | 1.2075 |
+ | No log | 5.6667 | 136 | 1.4811 | 0.2555 | 1.4811 | 1.2170 |
+ | No log | 5.75 | 138 | 1.3859 | 0.1814 | 1.3859 | 1.1772 |
+ | No log | 5.8333 | 140 | 1.1880 | 0.2284 | 1.1880 | 1.0900 |
+ | No log | 5.9167 | 142 | 1.1413 | 0.1911 | 1.1413 | 1.0683 |
+ | No log | 6.0 | 144 | 1.2514 | 0.2730 | 1.2514 | 1.1186 |
+ | No log | 6.0833 | 146 | 1.5816 | 0.2123 | 1.5816 | 1.2576 |
+ | No log | 6.1667 | 148 | 1.6955 | 0.2317 | 1.6955 | 1.3021 |
+ | No log | 6.25 | 150 | 1.5632 | 0.2004 | 1.5632 | 1.2503 |
+ | No log | 6.3333 | 152 | 1.3769 | 0.2203 | 1.3769 | 1.1734 |
+ | No log | 6.4167 | 154 | 1.3231 | 0.2203 | 1.3231 | 1.1503 |
+ | No log | 6.5 | 156 | 1.2984 | 0.2143 | 1.2984 | 1.1395 |
+ | No log | 6.5833 | 158 | 1.3728 | 0.2455 | 1.3728 | 1.1717 |
+ | No log | 6.6667 | 160 | 1.3686 | 0.2506 | 1.3686 | 1.1699 |
+ | No log | 6.75 | 162 | 1.4669 | 0.2075 | 1.4669 | 1.2111 |
+ | No log | 6.8333 | 164 | 1.4634 | 0.2075 | 1.4634 | 1.2097 |
+ | No log | 6.9167 | 166 | 1.4068 | 0.2203 | 1.4068 | 1.1861 |
+ | No log | 7.0 | 168 | 1.4186 | 0.1886 | 1.4186 | 1.1910 |
+ | No log | 7.0833 | 170 | 1.4289 | 0.1886 | 1.4289 | 1.1954 |
+ | No log | 7.1667 | 172 | 1.5003 | 0.2075 | 1.5003 | 1.2249 |
+ | No log | 7.25 | 174 | 1.5718 | 0.2690 | 1.5718 | 1.2537 |
+ | No log | 7.3333 | 176 | 1.5038 | 0.2731 | 1.5038 | 1.2263 |
+ | No log | 7.4167 | 178 | 1.3428 | 0.2772 | 1.3428 | 1.1588 |
+ | No log | 7.5 | 180 | 1.4585 | 0.2731 | 1.4585 | 1.2077 |
+ | No log | 7.5833 | 182 | 1.7303 | 0.2159 | 1.7303 | 1.3154 |
+ | No log | 7.6667 | 184 | 1.7356 | 0.2317 | 1.7356 | 1.3174 |
+ | No log | 7.75 | 186 | 1.5632 | 0.2568 | 1.5632 | 1.2503 |
+ | No log | 7.8333 | 188 | 1.4022 | 0.2260 | 1.4022 | 1.1842 |
+ | No log | 7.9167 | 190 | 1.2962 | 0.1628 | 1.2962 | 1.1385 |
+ | No log | 8.0 | 192 | 1.2806 | 0.1628 | 1.2806 | 1.1316 |
+ | No log | 8.0833 | 194 | 1.3119 | 0.2260 | 1.3119 | 1.1454 |
+ | No log | 8.1667 | 196 | 1.3099 | 0.2555 | 1.3099 | 1.1445 |
+ | No log | 8.25 | 198 | 1.3175 | 0.2795 | 1.3175 | 1.1478 |
+ | No log | 8.3333 | 200 | 1.2971 | 0.2602 | 1.2971 | 1.1389 |
+ | No log | 8.4167 | 202 | 1.2174 | 0.2752 | 1.2174 | 1.1034 |
+ | No log | 8.5 | 204 | 1.1204 | 0.3018 | 1.1204 | 1.0585 |
+ | No log | 8.5833 | 206 | 1.1680 | 0.2752 | 1.1680 | 1.0807 |
+ | No log | 8.6667 | 208 | 1.4196 | 0.2126 | 1.4196 | 1.1915 |
+ | No log | 8.75 | 210 | 1.6397 | 0.2004 | 1.6397 | 1.2805 |
+ | No log | 8.8333 | 212 | 1.6241 | 0.1703 | 1.6241 | 1.2744 |
+ | No log | 8.9167 | 214 | 1.5937 | 0.1703 | 1.5937 | 1.2624 |
+ | No log | 9.0 | 216 | 1.6711 | 0.1832 | 1.6711 | 1.2927 |
+ | No log | 9.0833 | 218 | 1.6325 | 0.1832 | 1.6325 | 1.2777 |
+ | No log | 9.1667 | 220 | 1.5336 | 0.1880 | 1.5336 | 1.2384 |
+ | No log | 9.25 | 222 | 1.4030 | 0.2424 | 1.4030 | 1.1845 |
+ | No log | 9.3333 | 224 | 1.2429 | 0.2752 | 1.2429 | 1.1149 |
+ | No log | 9.4167 | 226 | 1.2184 | 0.1816 | 1.2184 | 1.1038 |
+ | No log | 9.5 | 228 | 1.2440 | 0.1407 | 1.2440 | 1.1153 |
+ | No log | 9.5833 | 230 | 1.3327 | 0.2474 | 1.3327 | 1.1544 |
+ | No log | 9.6667 | 232 | 1.3913 | 0.2342 | 1.3913 | 1.1795 |
+ | No log | 9.75 | 234 | 1.3686 | 0.2771 | 1.3686 | 1.1699 |
+ | No log | 9.8333 | 236 | 1.5024 | 0.2159 | 1.5024 | 1.2257 |
+ | No log | 9.9167 | 238 | 1.4099 | 0.2638 | 1.4099 | 1.1874 |
+ | No log | 10.0 | 240 | 1.3660 | 0.2809 | 1.3660 | 1.1687 |
+ | No log | 10.0833 | 242 | 1.3794 | 0.2771 | 1.3794 | 1.1745 |
+ | No log | 10.1667 | 244 | 1.3603 | 0.2793 | 1.3603 | 1.1663 |
+ | No log | 10.25 | 246 | 1.4809 | 0.2752 | 1.4809 | 1.2169 |
+ | No log | 10.3333 | 248 | 1.4229 | 0.2474 | 1.4229 | 1.1929 |
+ | No log | 10.4167 | 250 | 1.2679 | 0.2474 | 1.2679 | 1.1260 |
+ | No log | 10.5 | 252 | 1.0772 | 0.1202 | 1.0772 | 1.0379 |
+ | No log | 10.5833 | 254 | 1.0087 | 0.0636 | 1.0087 | 1.0043 |
+ | No log | 10.6667 | 256 | 1.0666 | 0.1351 | 1.0666 | 1.0328 |
+ | No log | 10.75 | 258 | 1.3097 | 0.2752 | 1.3097 | 1.1444 |
+ | No log | 10.8333 | 260 | 1.6038 | 0.2832 | 1.6038 | 1.2664 |
+ | No log | 10.9167 | 262 | 1.6791 | 0.2653 | 1.6791 | 1.2958 |
+ | No log | 11.0 | 264 | 1.5787 | 0.2474 | 1.5787 | 1.2564 |
+ | No log | 11.0833 | 266 | 1.4032 | 0.2474 | 1.4032 | 1.1846 |
+ | No log | 11.1667 | 268 | 1.2088 | 0.1552 | 1.2088 | 1.0994 |
+ | No log | 11.25 | 270 | 1.1892 | 0.1552 | 1.1892 | 1.0905 |
+ | No log | 11.3333 | 272 | 1.3378 | 0.2506 | 1.3378 | 1.1567 |
+ | No log | 11.4167 | 274 | 1.6929 | 0.1663 | 1.6929 | 1.3011 |
+ | No log | 11.5 | 276 | 1.8861 | 0.1964 | 1.8861 | 1.3734 |
+ | No log | 11.5833 | 278 | 1.8438 | 0.2007 | 1.8438 | 1.3579 |
+ | No log | 11.6667 | 280 | 1.6846 | 0.2611 | 1.6846 | 1.2979 |
+ | No log | 11.75 | 282 | 1.4717 | 0.1486 | 1.4717 | 1.2131 |
+ | No log | 11.8333 | 284 | 1.2745 | 0.0833 | 1.2745 | 1.1289 |
+ | No log | 11.9167 | 286 | 1.2078 | 0.0445 | 1.2078 | 1.0990 |
+ | No log | 12.0 | 288 | 1.2180 | 0.0445 | 1.2180 | 1.1036 |
+ | No log | 12.0833 | 290 | 1.2544 | 0.1202 | 1.2544 | 1.1200 |
+ | No log | 12.1667 | 292 | 1.3931 | 0.1407 | 1.3931 | 1.1803 |
+ | No log | 12.25 | 294 | 1.6043 | 0.2342 | 1.6043 | 1.2666 |
+ | No log | 12.3333 | 296 | 1.7359 | 0.2252 | 1.7359 | 1.3175 |
+ | No log | 12.4167 | 298 | 1.6441 | 0.2252 | 1.6441 | 1.2822 |
+ | No log | 12.5 | 300 | 1.3742 | 0.2647 | 1.3742 | 1.1723 |
+ | No log | 12.5833 | 302 | 1.1760 | 0.1110 | 1.1760 | 1.0844 |
+ | No log | 12.6667 | 304 | 1.1554 | 0.0352 | 1.1554 | 1.0749 |
+ | No log | 12.75 | 306 | 1.2672 | 0.0833 | 1.2672 | 1.1257 |
+ | No log | 12.8333 | 308 | 1.4077 | 0.2260 | 1.4077 | 1.1865 |
+ | No log | 12.9167 | 310 | 1.5072 | 0.2062 | 1.5072 | 1.2277 |
+ | No log | 13.0 | 312 | 1.4711 | 0.2239 | 1.4711 | 1.2129 |
+ | No log | 13.0833 | 314 | 1.5378 | 0.2292 | 1.5378 | 1.2401 |
+ | No log | 13.1667 | 316 | 1.4835 | 0.2239 | 1.4835 | 1.2180 |
+ | No log | 13.25 | 318 | 1.3994 | 0.2506 | 1.3994 | 1.1830 |
+ | No log | 13.3333 | 320 | 1.4193 | 0.1486 | 1.4193 | 1.1914 |
+ | No log | 13.4167 | 322 | 1.4242 | 0.1486 | 1.4242 | 1.1934 |
+ | No log | 13.5 | 324 | 1.4112 | 0.1486 | 1.4112 | 1.1879 |
+ | No log | 13.5833 | 326 | 1.3752 | 0.1952 | 1.3752 | 1.1727 |
+ | No log | 13.6667 | 328 | 1.3480 | 0.2260 | 1.3480 | 1.1610 |
+ | No log | 13.75 | 330 | 1.3547 | 0.2602 | 1.3547 | 1.1639 |
+ | No log | 13.8333 | 332 | 1.4523 | 0.2292 | 1.4523 | 1.2051 |
+ | No log | 13.9167 | 334 | 1.4150 | 0.2647 | 1.4150 | 1.1895 |
+ | No log | 14.0 | 336 | 1.2194 | 0.3056 | 1.2194 | 1.1043 |
+ | No log | 14.0833 | 338 | 1.1404 | 0.3263 | 1.1404 | 1.0679 |
+ | No log | 14.1667 | 340 | 1.2345 | 0.2795 | 1.2345 | 1.1111 |
+ | No log | 14.25 | 342 | 1.4185 | 0.2117 | 1.4185 | 1.1910 |
+ | No log | 14.3333 | 344 | 1.4374 | 0.2437 | 1.4374 | 1.1989 |
+ | No log | 14.4167 | 346 | 1.3743 | 0.2731 | 1.3743 | 1.1723 |
+ | No log | 14.5 | 348 | 1.3782 | 0.2952 | 1.3782 | 1.1740 |
+ | No log | 14.5833 | 350 | 1.5088 | 0.2110 | 1.5088 | 1.2283 |
+ | No log | 14.6667 | 352 | 1.5552 | 0.2110 | 1.5552 | 1.2471 |
+ | No log | 14.75 | 354 | 1.4647 | 0.2437 | 1.4647 | 1.2103 |
+ | No log | 14.8333 | 356 | 1.3796 | 0.2474 | 1.3796 | 1.1746 |
+ | No log | 14.9167 | 358 | 1.3484 | 0.1744 | 1.3484 | 1.1612 |
+ | No log | 15.0 | 360 | 1.4076 | 0.1407 | 1.4076 | 1.1864 |
+ | No log | 15.0833 | 362 | 1.4709 | 0.1943 | 1.4709 | 1.2128 |
+ | No log | 15.1667 | 364 | 1.4924 | 0.1943 | 1.4924 | 1.2216 |
+ | No log | 15.25 | 366 | 1.4463 | 0.2126 | 1.4463 | 1.2026 |
+ | No log | 15.3333 | 368 | 1.3736 | 0.1744 | 1.3736 | 1.1720 |
+ | No log | 15.4167 | 370 | 1.3570 | 0.1744 | 1.3570 | 1.1649 |
+ | No log | 15.5 | 372 | 1.3418 | 0.1744 | 1.3418 | 1.1584 |
+ | No log | 15.5833 | 374 | 1.2572 | 0.1744 | 1.2572 | 1.1213 |
+ | No log | 15.6667 | 376 | 1.1804 | 0.2284 | 1.1804 | 1.0865 |
+ | No log | 15.75 | 378 | 1.1705 | 0.2681 | 1.1705 | 1.0819 |
+ | No log | 15.8333 | 380 | 1.3176 | 0.2372 | 1.3176 | 1.1479 |
+ | No log | 15.9167 | 382 | 1.5117 | 0.2709 | 1.5117 | 1.2295 |
+ | No log | 16.0 | 384 | 1.6655 | 0.2793 | 1.6655 | 1.2905 |
+ | No log | 16.0833 | 386 | 1.6478 | 0.2793 | 1.6478 | 1.2837 |
+ | No log | 16.1667 | 388 | 1.5298 | 0.2709 | 1.5298 | 1.2369 |
+ | No log | 16.25 | 390 | 1.4120 | 0.2665 | 1.4120 | 1.1883 |
+ | No log | 16.3333 | 392 | 1.3130 | 0.2709 | 1.3130 | 1.1459 |
+ | No log | 16.4167 | 394 | 1.2720 | 0.2665 | 1.2720 | 1.1278 |
+ | No log | 16.5 | 396 | 1.2584 | 0.2372 | 1.2584 | 1.1218 |
+ | No log | 16.5833 | 398 | 1.2579 | 0.2372 | 1.2579 | 1.1216 |
+ | No log | 16.6667 | 400 | 1.2961 | 0.2372 | 1.2961 | 1.1384 |
+ | No log | 16.75 | 402 | 1.3656 | 0.2665 | 1.3656 | 1.1686 |
+ | No log | 16.8333 | 404 | 1.4434 | 0.2709 | 1.4434 | 1.2014 |
+ | No log | 16.9167 | 406 | 1.4231 | 0.2568 | 1.4231 | 1.1929 |
+ | No log | 17.0 | 408 | 1.3042 | 0.2292 | 1.3042 | 1.1420 |
+ | No log | 17.0833 | 410 | 1.1013 | 0.2089 | 1.1013 | 1.0494 |
+ | No log | 17.1667 | 412 | 1.0304 | 0.2051 | 1.0304 | 1.0151 |
+ | No log | 17.25 | 414 | 1.1494 | 0.2926 | 1.1494 | 1.0721 |
+ | No log | 17.3333 | 416 | 1.2900 | 0.2342 | 1.2900 | 1.1358 |
+ | No log | 17.4167 | 418 | 1.4236 | 0.2568 | 1.4236 | 1.1931 |
+ | No log | 17.5 | 420 | 1.6003 | 0.1807 | 1.6003 | 1.2650 |
+ | No log | 17.5833 | 422 | 1.6309 | 0.1860 | 1.6309 | 1.2771 |
+ | No log | 17.6667 | 424 | 1.4995 | 0.2522 | 1.4995 | 1.2245 |
+ | No log | 17.75 | 426 | 1.3087 | 0.1142 | 1.3087 | 1.1440 |
+ | No log | 17.8333 | 428 | 1.1252 | 0.1202 | 1.1252 | 1.0608 |
+ | No log | 17.9167 | 430 | 1.0885 | 0.1622 | 1.0885 | 1.0433 |
+ | No log | 18.0 | 432 | 1.2325 | 0.1142 | 1.2325 | 1.1102 |
+ | No log | 18.0833 | 434 | 1.4209 | 0.2372 | 1.4209 | 1.1920 |
+ | No log | 18.1667 | 436 | 1.4296 | 0.2126 | 1.4296 | 1.1957 |
+ | No log | 18.25 | 438 | 1.4602 | 0.2126 | 1.4602 | 1.2084 |
+ | No log | 18.3333 | 440 | 1.3469 | 0.2126 | 1.3469 | 1.1606 |
+ | No log | 18.4167 | 442 | 1.1473 | 0.1744 | 1.1473 | 1.0711 |
+ | No log | 18.5 | 444 | 1.1352 | 0.2143 | 1.1352 | 1.0655 |
+ | No log | 18.5833 | 446 | 1.2708 | 0.2239 | 1.2708 | 1.1273 |
+ | No log | 18.6667 | 448 | 1.4627 | 0.2974 | 1.4627 | 1.2094 |
+ | No log | 18.75 | 450 | 1.5287 | 0.3036 | 1.5287 | 1.2364 |
+ | No log | 18.8333 | 452 | 1.4192 | 0.3117 | 1.4192 | 1.1913 |
+ | No log | 18.9167 | 454 | 1.2709 | 0.2239 | 1.2709 | 1.1273 |
+ | No log | 19.0 | 456 | 1.1785 | 0.2341 | 1.1785 | 1.0856 |
+ | No log | 19.0833 | 458 | 1.1204 | 0.1961 | 1.1204 | 1.0585 |
+ | No log | 19.1667 | 460 | 1.1507 | 0.1202 | 1.1507 | 1.0727 |
+ | No log | 19.25 | 462 | 1.1594 | 0.1202 | 1.1594 | 1.0768 |
+ | No log | 19.3333 | 464 | 1.2459 | 0.1142 | 1.2459 | 1.1162 |
+ | No log | 19.4167 | 466 | 1.3563 | 0.2424 | 1.3563 | 1.1646 |
+ | No log | 19.5 | 468 | 1.4016 | 0.2568 | 1.4016 | 1.1839 |
+ | No log | 19.5833 | 470 | 1.3351 | 0.2522 | 1.3351 | 1.1555 |
+ | No log | 19.6667 | 472 | 1.3084 | 0.2424 | 1.3084 | 1.1438 |
+ | No log | 19.75 | 474 | 1.3187 | 0.2424 | 1.3187 | 1.1483 |
+ | No log | 19.8333 | 476 | 1.2829 | 0.1814 | 1.2829 | 1.1326 |
+ | No log | 19.9167 | 478 | 1.2050 | 0.0781 | 1.2050 | 1.0977 |
+ | No log | 20.0 | 480 | 1.1397 | 0.1202 | 1.1397 | 1.0676 |
+ | No log | 20.0833 | 482 | 1.1301 | 0.1202 | 1.1301 | 1.0631 |
+ | No log | 20.1667 | 484 | 1.1992 | 0.2143 | 1.1992 | 1.0951 |
+ | No log | 20.25 | 486 | 1.3002 | 0.2184 | 1.3002 | 1.1403 |
+ | No log | 20.3333 | 488 | 1.4296 | 0.2437 | 1.4296 | 1.1957 |
+ | No log | 20.4167 | 490 | 1.5133 | 0.2733 | 1.5133 | 1.2302 |
+ | No log | 20.5 | 492 | 1.5094 | 0.2482 | 1.5094 | 1.2286 |
+ | No log | 20.5833 | 494 | 1.4411 | 0.2653 | 1.4411 | 1.2004 |
+ | No log | 20.6667 | 496 | 1.3709 | 0.2653 | 1.3709 | 1.1708 |
+ | No log | 20.75 | 498 | 1.4505 | 0.2653 | 1.4505 | 1.2044 |
+ | 0.2673 | 20.8333 | 500 | 1.4366 | 0.2653 | 1.4366 | 1.1986 |
+ | 0.2673 | 20.9167 | 502 | 1.3565 | 0.2611 | 1.3565 | 1.1647 |
+ | 0.2673 | 21.0 | 504 | 1.4319 | 0.2653 | 1.4319 | 1.1966 |
+ | 0.2673 | 21.0833 | 506 | 1.5630 | 0.2363 | 1.5630 | 1.2502 |
+ | 0.2673 | 21.1667 | 508 | 1.5727 | 0.2363 | 1.5727 | 1.2541 |
+ | 0.2673 | 21.25 | 510 | 1.5647 | 0.2566 | 1.5647 | 1.2509 |
+
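+ The card does not record how Qwk/Mse/Rmse were computed. A common recipe for regression-head essay scoring is given below as a hedged sketch; rounding predictions (and labels) to integer scores before the kappa is an assumption, not a documented detail of this run.
+
+ ```python
+ # Hedged sketch: Qwk / Mse / Rmse from regression outputs.
+ import numpy as np
+ from sklearn.metrics import cohen_kappa_score, mean_squared_error
+
+ def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
+     mse = mean_squared_error(labels, preds)
+     qwk = cohen_kappa_score(
+         labels.round().astype(int),  # assumption: gold scores are integers
+         preds.round().astype(int),
+         weights="quadratic",
+     )
+     return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
+ ```
+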
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - Pytorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
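With a single entry in `id2label` and `"problem_type": "regression"`, the checkpoint's head emits one scalar per input. A quick hedged check (the repo id is the same assumption as in the usage sketch above):

```python
# Sketch: confirm the head is a 1-output regression head.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained(
    "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task5_organization"
)
assert cfg.num_labels == 1          # single label -> one regression output
assert cfg.problem_type == "regression"
```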
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:683e6aa8ada8e930e729bfe1e4fe1082c029eb9687d683056c7ee0f2f943e96e
+ size 540799996
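The weights are stored as a Git LFS pointer: the file committed to the repo records only the blob's `oid` (sha256) and `size`. A hedged sketch for verifying a locally fetched `model.safetensors` against this pointer (assumes the blob was already downloaded, e.g. via `git lfs pull`):

```python
# Sketch: check a downloaded file against the LFS pointer above.
import hashlib
import os

PATH = "model.safetensors"  # assumed local path after fetching the blob
EXPECTED_OID = "683e6aa8ada8e930e729bfe1e4fe1082c029eb9687d683056c7ee0f2f943e96e"
EXPECTED_SIZE = 540799996

assert os.path.getsize(PATH) == EXPECTED_SIZE, "size mismatch"
digest = hashlib.sha256()
with open(PATH, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # stream in 1 MiB chunks
        digest.update(chunk)
assert digest.hexdigest() == EXPECTED_OID, "sha256 mismatch"
```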
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2c8794d32771d4c68907ddf42584e0dc0800463b08132eaf1d52f1e2273eb2d7
+ size 5368