MayBashendy committed on
Commit 7146be6 · verified · 1 Parent(s): 223eb7f

Training in progress, step 500

Files changed (4)
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization
+   results: []
+ ---
+ 
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+ 
+ # ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization
+ 
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.3086
+ - Qwk: 0.0955
+ - Mse: 1.3086
+ - Rmse: 1.1440
+ 
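The Qwk, Mse, and Rmse figures above are the quadratic weighted kappa, mean squared error, and root mean squared error between the model's regression outputs and the gold organization scores. The evaluation code is not published with this card; the snippet below is only a minimal sketch of how such metrics are typically computed. The rounding of continuous predictions to integer score bands before computing kappa, and the array names, are assumptions rather than details taken from this repository.

```python
# Hedged sketch: QWK / MSE / RMSE for a regression-style essay scorer.
# `preds` are raw model outputs, `labels` are gold scores; both assumed 1-D.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error


def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # Kappa needs discrete categories, so round the continuous predictions
    # to the nearest integer band first (an assumption, not from the card).
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": float(mse), "rmse": rmse}


# Example: compute_metrics(np.array([1.2, 2.8, 0.4]), np.array([1.0, 3.0, 1.0]))
```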
+ ## Model description
+ 
+ More information needed
+ 
+ ## Intended uses & limitations
+ 
+ More information needed
+ 
+ ## Training and evaluation data
+ 
+ More information needed
+ 
+ ## Training procedure
+ 
+ ### Training hyperparameters
+ 
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+ 
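The card lists hyperparameters but not the training script. The following is a hedged sketch of an equivalent setup with the Transformers `Trainer`; the output directory, the tiny placeholder dataset, and the evaluation cadence (every 2 steps, matching the results table below) are assumptions, not facts from this repository.

```python
# Hedged sketch: fine-tuning aubmindlab/bert-base-arabertv02 as a one-output
# regressor with the hyperparameters listed above. The real train/eval splits
# are not published here, so a tiny dummy dataset stands in for them.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=1, problem_type="regression"
)

# Placeholder data so the sketch is self-contained (assumption, not the card's data).
enc = tokenizer(["مقال تجريبي أول", "مقال تجريبي ثان"], truncation=True, padding=True)
dummy = Dataset.from_dict({**enc, "labels": [1.0, 2.0]})

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    eval_strategy="steps",
    eval_steps=2,       # the results table below evaluates every 2 steps
    logging_steps=500,  # consistent with the "Training in progress, step 500" commit
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dummy,
    eval_dataset=dummy,
    tokenizer=tokenizer,
)
# trainer.train()  # Adam betas (0.9, 0.999) and eps 1e-08 are the Trainer defaults.
```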
+ ### Training results
+ 
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.0312 | 2 | 2.5023 | -0.1213 | 2.5023 | 1.5819 |
+ | No log | 0.0625 | 4 | 1.2220 | 0.0977 | 1.2220 | 1.1055 |
+ | No log | 0.0938 | 6 | 1.1312 | -0.1082 | 1.1312 | 1.0636 |
+ | No log | 0.125 | 8 | 0.9710 | 0.0162 | 0.9710 | 0.9854 |
+ | No log | 0.1562 | 10 | 0.9826 | 0.0927 | 0.9826 | 0.9913 |
+ | No log | 0.1875 | 12 | 1.3388 | -0.0699 | 1.3388 | 1.1571 |
+ | No log | 0.2188 | 14 | 1.3880 | -0.2648 | 1.3880 | 1.1782 |
+ | No log | 0.25 | 16 | 1.3680 | -0.2350 | 1.3680 | 1.1696 |
+ | No log | 0.2812 | 18 | 1.3002 | -0.0730 | 1.3002 | 1.1403 |
+ | No log | 0.3125 | 20 | 1.0899 | 0.1709 | 1.0899 | 1.0440 |
+ | No log | 0.3438 | 22 | 0.8732 | 0.2012 | 0.8732 | 0.9344 |
+ | No log | 0.375 | 24 | 0.7268 | 0.0804 | 0.7268 | 0.8525 |
+ | No log | 0.4062 | 26 | 0.7202 | 0.1333 | 0.7202 | 0.8486 |
+ | No log | 0.4375 | 28 | 0.7420 | 0.0393 | 0.7420 | 0.8614 |
+ | No log | 0.4688 | 30 | 0.8184 | -0.0054 | 0.8184 | 0.9046 |
+ | No log | 0.5 | 32 | 0.8781 | -0.0079 | 0.8781 | 0.9371 |
+ | No log | 0.5312 | 34 | 0.8924 | -0.0149 | 0.8924 | 0.9447 |
+ | No log | 0.5625 | 36 | 0.8819 | 0.0679 | 0.8819 | 0.9391 |
+ | No log | 0.5938 | 38 | 0.8533 | 0.0679 | 0.8533 | 0.9237 |
+ | No log | 0.625 | 40 | 0.8703 | -0.0103 | 0.8703 | 0.9329 |
+ | No log | 0.6562 | 42 | 0.9117 | -0.0127 | 0.9117 | 0.9548 |
+ | No log | 0.6875 | 44 | 0.9484 | 0.0966 | 0.9484 | 0.9739 |
+ | No log | 0.7188 | 46 | 1.0341 | 0.0891 | 1.0341 | 1.0169 |
+ | No log | 0.75 | 48 | 1.0413 | 0.1422 | 1.0413 | 1.0204 |
+ | No log | 0.7812 | 50 | 1.0125 | 0.1254 | 1.0125 | 1.0063 |
+ | No log | 0.8125 | 52 | 0.9404 | 0.1103 | 0.9404 | 0.9698 |
+ | No log | 0.8438 | 54 | 0.9451 | 0.1459 | 0.9451 | 0.9722 |
+ | No log | 0.875 | 56 | 0.9680 | 0.0966 | 0.9680 | 0.9839 |
+ | No log | 0.9062 | 58 | 0.9921 | 0.0966 | 0.9921 | 0.9961 |
+ | No log | 0.9375 | 60 | 0.9774 | 0.1007 | 0.9774 | 0.9886 |
+ | No log | 0.9688 | 62 | 0.9540 | 0.0643 | 0.9540 | 0.9767 |
+ | No log | 1.0 | 64 | 0.9653 | 0.0966 | 0.9653 | 0.9825 |
+ | No log | 1.0312 | 66 | 0.9772 | 0.0966 | 0.9772 | 0.9885 |
+ | No log | 1.0625 | 68 | 0.9840 | 0.0966 | 0.9840 | 0.9920 |
+ | No log | 1.0938 | 70 | 0.9958 | 0.1598 | 0.9958 | 0.9979 |
+ | No log | 1.125 | 72 | 0.9908 | 0.1459 | 0.9908 | 0.9954 |
+ | No log | 1.1562 | 74 | 1.0486 | 0.0653 | 1.0486 | 1.0240 |
+ | No log | 1.1875 | 76 | 1.1310 | 0.1293 | 1.1310 | 1.0635 |
+ | No log | 1.2188 | 78 | 1.0841 | 0.1065 | 1.0841 | 1.0412 |
+ | No log | 1.25 | 80 | 1.0515 | 0.0816 | 1.0515 | 1.0254 |
+ | No log | 1.2812 | 82 | 1.0326 | 0.1225 | 1.0326 | 1.0162 |
+ | No log | 1.3125 | 84 | 1.0893 | -0.0727 | 1.0893 | 1.0437 |
+ | No log | 1.3438 | 86 | 0.9711 | 0.1007 | 0.9711 | 0.9854 |
+ | No log | 1.375 | 88 | 0.9492 | 0.1007 | 0.9492 | 0.9743 |
+ | No log | 1.4062 | 90 | 0.9820 | 0.1010 | 0.9820 | 0.9909 |
+ | No log | 1.4375 | 92 | 1.0128 | 0.0637 | 1.0128 | 1.0064 |
+ | No log | 1.4688 | 94 | 1.1704 | -0.0410 | 1.1704 | 1.0818 |
+ | No log | 1.5 | 96 | 1.4772 | -0.0577 | 1.4772 | 1.2154 |
+ | No log | 1.5312 | 98 | 1.4839 | 0.0447 | 1.4839 | 1.2181 |
+ | No log | 1.5625 | 100 | 1.1599 | -0.0063 | 1.1599 | 1.0770 |
+ | No log | 1.5938 | 102 | 0.9996 | 0.1213 | 0.9996 | 0.9998 |
+ | No log | 1.625 | 104 | 1.0931 | 0.1175 | 1.0931 | 1.0455 |
+ | No log | 1.6562 | 106 | 1.1615 | 0.0757 | 1.1615 | 1.0777 |
+ | No log | 1.6875 | 108 | 1.1225 | 0.1181 | 1.1225 | 1.0595 |
+ | No log | 1.7188 | 110 | 1.1213 | 0.0149 | 1.1213 | 1.0589 |
+ | No log | 1.75 | 112 | 1.1101 | 0.0603 | 1.1101 | 1.0536 |
+ | No log | 1.7812 | 114 | 1.1715 | 0.0175 | 1.1715 | 1.0824 |
+ | No log | 1.8125 | 116 | 1.1568 | 0.0175 | 1.1568 | 1.0756 |
+ | No log | 1.8438 | 118 | 1.2667 | -0.0216 | 1.2667 | 1.1255 |
+ | No log | 1.875 | 120 | 1.2813 | 0.0114 | 1.2813 | 1.1319 |
+ | No log | 1.9062 | 122 | 1.2644 | 0.0114 | 1.2644 | 1.1245 |
+ | No log | 1.9375 | 124 | 1.1986 | 0.0860 | 1.1986 | 1.0948 |
+ | No log | 1.9688 | 126 | 1.1902 | 0.1144 | 1.1902 | 1.0910 |
+ | No log | 2.0 | 128 | 1.1468 | 0.0994 | 1.1468 | 1.0709 |
+ | No log | 2.0312 | 130 | 1.0989 | 0.0835 | 1.0989 | 1.0483 |
+ | No log | 2.0625 | 132 | 1.0767 | 0.0572 | 1.0767 | 1.0376 |
+ | No log | 2.0938 | 134 | 1.0577 | 0.0918 | 1.0577 | 1.0285 |
+ | No log | 2.125 | 136 | 1.0367 | 0.0812 | 1.0367 | 1.0182 |
+ | No log | 2.1562 | 138 | 1.0755 | 0.1978 | 1.0755 | 1.0371 |
+ | No log | 2.1875 | 140 | 1.1350 | 0.1454 | 1.1350 | 1.0654 |
+ | No log | 2.2188 | 142 | 1.1515 | 0.1542 | 1.1515 | 1.0731 |
+ | No log | 2.25 | 144 | 1.1133 | 0.1542 | 1.1133 | 1.0551 |
+ | No log | 2.2812 | 146 | 1.1054 | 0.1046 | 1.1054 | 1.0514 |
+ | No log | 2.3125 | 148 | 1.1820 | 0.1507 | 1.1820 | 1.0872 |
+ | No log | 2.3438 | 150 | 1.1754 | 0.2000 | 1.1754 | 1.0842 |
+ | No log | 2.375 | 152 | 1.1135 | 0.1140 | 1.1135 | 1.0552 |
+ | No log | 2.4062 | 154 | 1.0917 | 0.0874 | 1.0917 | 1.0448 |
+ | No log | 2.4375 | 156 | 1.0544 | 0.1773 | 1.0544 | 1.0268 |
+ | No log | 2.4688 | 158 | 1.0478 | 0.0099 | 1.0478 | 1.0236 |
+ | No log | 2.5 | 160 | 1.1526 | 0.1399 | 1.1526 | 1.0736 |
+ | No log | 2.5312 | 162 | 1.1534 | 0.0508 | 1.1534 | 1.0740 |
+ | No log | 2.5625 | 164 | 1.0525 | 0.0839 | 1.0525 | 1.0259 |
+ | No log | 2.5938 | 166 | 1.1587 | 0.0134 | 1.1587 | 1.0764 |
+ | No log | 2.625 | 168 | 1.3635 | 0.0325 | 1.3635 | 1.1677 |
+ | No log | 2.6562 | 170 | 1.3785 | 0.0512 | 1.3785 | 1.1741 |
+ | No log | 2.6875 | 172 | 1.2500 | 0.1116 | 1.2500 | 1.1180 |
+ | No log | 2.7188 | 174 | 1.1604 | 0.2342 | 1.1604 | 1.0772 |
+ | No log | 2.75 | 176 | 1.2035 | 0.0623 | 1.2035 | 1.0970 |
+ | No log | 2.7812 | 178 | 1.2072 | 0.1786 | 1.2072 | 1.0987 |
+ | No log | 2.8125 | 180 | 1.1890 | 0.2062 | 1.1890 | 1.0904 |
+ | No log | 2.8438 | 182 | 1.1183 | 0.1871 | 1.1183 | 1.0575 |
+ | No log | 2.875 | 184 | 1.0080 | 0.1808 | 1.0080 | 1.0040 |
+ | No log | 2.9062 | 186 | 0.9237 | 0.1773 | 0.9237 | 0.9611 |
+ | No log | 2.9375 | 188 | 0.9512 | 0.1090 | 0.9512 | 0.9753 |
+ | No log | 2.9688 | 190 | 1.0093 | 0.0587 | 1.0093 | 1.0047 |
+ | No log | 3.0 | 192 | 0.9591 | 0.0504 | 0.9591 | 0.9793 |
+ | No log | 3.0312 | 194 | 0.9177 | 0.2109 | 0.9177 | 0.9580 |
+ | No log | 3.0625 | 196 | 1.0465 | 0.2643 | 1.0465 | 1.0230 |
+ | No log | 3.0938 | 198 | 1.2036 | 0.1587 | 1.2036 | 1.0971 |
+ | No log | 3.125 | 200 | 1.2636 | 0.1587 | 1.2636 | 1.1241 |
+ | No log | 3.1562 | 202 | 1.2362 | 0.1874 | 1.2362 | 1.1119 |
+ | No log | 3.1875 | 204 | 1.1294 | 0.1047 | 1.1294 | 1.0628 |
+ | No log | 3.2188 | 206 | 1.0729 | 0.0973 | 1.0729 | 1.0358 |
+ | No log | 3.25 | 208 | 1.0610 | 0.0909 | 1.0610 | 1.0301 |
+ | No log | 3.2812 | 210 | 1.0558 | 0.1274 | 1.0558 | 1.0275 |
+ | No log | 3.3125 | 212 | 1.0599 | 0.2134 | 1.0599 | 1.0295 |
+ | No log | 3.3438 | 214 | 1.0292 | 0.1101 | 1.0292 | 1.0145 |
+ | No log | 3.375 | 216 | 1.0399 | 0.1464 | 1.0399 | 1.0197 |
+ | No log | 3.4062 | 218 | 1.0712 | 0.1640 | 1.0712 | 1.0350 |
+ | No log | 3.4375 | 220 | 1.1357 | 0.1712 | 1.1357 | 1.0657 |
+ | No log | 3.4688 | 222 | 1.1913 | 0.1912 | 1.1913 | 1.0915 |
+ | No log | 3.5 | 224 | 1.2288 | 0.2156 | 1.2288 | 1.1085 |
+ | No log | 3.5312 | 226 | 1.2119 | 0.1781 | 1.2119 | 1.1008 |
+ | No log | 3.5625 | 228 | 1.2681 | 0.1874 | 1.2681 | 1.1261 |
+ | No log | 3.5938 | 230 | 1.3006 | 0.1793 | 1.3006 | 1.1405 |
+ | No log | 3.625 | 232 | 1.2706 | 0.1580 | 1.2706 | 1.1272 |
+ | No log | 3.6562 | 234 | 1.1918 | 0.1476 | 1.1918 | 1.0917 |
+ | No log | 3.6875 | 236 | 1.1416 | 0.1420 | 1.1416 | 1.0684 |
+ | No log | 3.7188 | 238 | 1.1254 | 0.2216 | 1.1254 | 1.0608 |
+ | No log | 3.75 | 240 | 1.0859 | 0.2320 | 1.0859 | 1.0421 |
+ | No log | 3.7812 | 242 | 1.0451 | 0.2693 | 1.0451 | 1.0223 |
+ | No log | 3.8125 | 244 | 0.9982 | 0.1960 | 0.9982 | 0.9991 |
+ | No log | 3.8438 | 246 | 1.0047 | 0.2105 | 1.0047 | 1.0024 |
+ | No log | 3.875 | 248 | 1.0663 | 0.2336 | 1.0663 | 1.0326 |
+ | No log | 3.9062 | 250 | 1.0545 | 0.2513 | 1.0545 | 1.0269 |
+ | No log | 3.9375 | 252 | 0.9717 | 0.2861 | 0.9717 | 0.9858 |
+ | No log | 3.9688 | 254 | 0.9198 | 0.2467 | 0.9198 | 0.9590 |
+ | No log | 4.0 | 256 | 0.8988 | 0.2690 | 0.8988 | 0.9480 |
+ | No log | 4.0312 | 258 | 0.9435 | 0.2111 | 0.9435 | 0.9713 |
+ | No log | 4.0625 | 260 | 0.9589 | 0.2026 | 0.9589 | 0.9792 |
+ | No log | 4.0938 | 262 | 1.0139 | 0.2750 | 1.0139 | 1.0069 |
+ | No log | 4.125 | 264 | 1.1850 | 0.2089 | 1.1850 | 1.0886 |
+ | No log | 4.1562 | 266 | 1.2325 | 0.1789 | 1.2325 | 1.1102 |
+ | No log | 4.1875 | 268 | 1.1345 | 0.1304 | 1.1345 | 1.0651 |
+ | No log | 4.2188 | 270 | 1.0788 | 0.2097 | 1.0788 | 1.0387 |
+ | No log | 4.25 | 272 | 1.0277 | 0.2123 | 1.0277 | 1.0138 |
+ | No log | 4.2812 | 274 | 0.9841 | 0.2145 | 0.9841 | 0.9920 |
+ | No log | 4.3125 | 276 | 0.9857 | 0.1771 | 0.9857 | 0.9928 |
+ | No log | 4.3438 | 278 | 1.0551 | 0.2139 | 1.0551 | 1.0272 |
+ | No log | 4.375 | 280 | 1.1553 | 0.2014 | 1.1553 | 1.0749 |
+ | No log | 4.4062 | 282 | 1.1551 | 0.2014 | 1.1551 | 1.0748 |
+ | No log | 4.4375 | 284 | 1.1001 | 0.2442 | 1.1001 | 1.0489 |
+ | No log | 4.4688 | 286 | 1.0455 | 0.2042 | 1.0455 | 1.0225 |
+ | No log | 4.5 | 288 | 1.1047 | 0.1652 | 1.1047 | 1.0510 |
+ | No log | 4.5312 | 290 | 1.2059 | 0.2167 | 1.2059 | 1.0981 |
+ | No log | 4.5625 | 292 | 1.3335 | 0.2097 | 1.3335 | 1.1548 |
+ | No log | 4.5938 | 294 | 1.2952 | 0.2329 | 1.2952 | 1.1381 |
+ | No log | 4.625 | 296 | 1.1815 | 0.1935 | 1.1815 | 1.0869 |
+ | No log | 4.6562 | 298 | 1.0852 | 0.2294 | 1.0852 | 1.0417 |
+ | No log | 4.6875 | 300 | 0.9653 | 0.2661 | 0.9653 | 0.9825 |
+ | No log | 4.7188 | 302 | 0.9372 | 0.2029 | 0.9372 | 0.9681 |
+ | No log | 4.75 | 304 | 0.9442 | 0.1875 | 0.9442 | 0.9717 |
+ | No log | 4.7812 | 306 | 1.0386 | 0.1911 | 1.0386 | 1.0191 |
+ | No log | 4.8125 | 308 | 1.1200 | 0.1911 | 1.1200 | 1.0583 |
+ | No log | 4.8438 | 310 | 1.1049 | 0.1911 | 1.1049 | 1.0512 |
+ | No log | 4.875 | 312 | 1.0257 | 0.2066 | 1.0257 | 1.0128 |
+ | No log | 4.9062 | 314 | 0.9833 | 0.1643 | 0.9833 | 0.9916 |
+ | No log | 4.9375 | 316 | 0.9805 | 0.1862 | 0.9805 | 0.9902 |
+ | No log | 4.9688 | 318 | 1.0560 | 0.1819 | 1.0560 | 1.0276 |
+ | No log | 5.0 | 320 | 1.1556 | 0.1662 | 1.1556 | 1.0750 |
+ | No log | 5.0312 | 322 | 1.1245 | 0.1514 | 1.1245 | 1.0604 |
+ | No log | 5.0625 | 324 | 1.0811 | 0.1514 | 1.0811 | 1.0398 |
+ | No log | 5.0938 | 326 | 1.0203 | 0.1899 | 1.0203 | 1.0101 |
+ | No log | 5.125 | 328 | 0.9736 | 0.2210 | 0.9736 | 0.9867 |
+ | No log | 5.1562 | 330 | 0.9609 | 0.1661 | 0.9609 | 0.9803 |
+ | No log | 5.1875 | 332 | 1.0206 | 0.1968 | 1.0206 | 1.0103 |
+ | No log | 5.2188 | 334 | 1.1829 | 0.1434 | 1.1829 | 1.0876 |
+ | No log | 5.25 | 336 | 1.3125 | 0.2132 | 1.3125 | 1.1457 |
+ | No log | 5.2812 | 338 | 1.3056 | 0.1736 | 1.3056 | 1.1426 |
+ | No log | 5.3125 | 340 | 1.1625 | 0.1500 | 1.1625 | 1.0782 |
+ | No log | 5.3438 | 342 | 1.0520 | 0.1198 | 1.0520 | 1.0257 |
+ | No log | 5.375 | 344 | 1.0226 | 0.2049 | 1.0226 | 1.0112 |
+ | No log | 5.4062 | 346 | 1.0166 | 0.1785 | 1.0166 | 1.0083 |
+ | No log | 5.4375 | 348 | 1.0167 | 0.1785 | 1.0167 | 1.0083 |
+ | No log | 5.4688 | 350 | 1.0553 | 0.1170 | 1.0553 | 1.0273 |
+ | No log | 5.5 | 352 | 1.0883 | 0.1228 | 1.0883 | 1.0432 |
+ | No log | 5.5312 | 354 | 1.1498 | 0.0935 | 1.1498 | 1.0723 |
+ | No log | 5.5625 | 356 | 1.1782 | 0.1370 | 1.1782 | 1.0855 |
+ | No log | 5.5938 | 358 | 1.0624 | 0.0990 | 1.0624 | 1.0307 |
+ | No log | 5.625 | 360 | 0.9839 | 0.1033 | 0.9839 | 0.9919 |
+ | No log | 5.6562 | 362 | 0.9378 | 0.1597 | 0.9378 | 0.9684 |
+ | No log | 5.6875 | 364 | 0.9120 | 0.1616 | 0.9120 | 0.9550 |
+ | No log | 5.7188 | 366 | 0.9395 | 0.1541 | 0.9395 | 0.9693 |
+ | No log | 5.75 | 368 | 1.0490 | 0.1586 | 1.0490 | 1.0242 |
+ | No log | 5.7812 | 370 | 1.0871 | 0.1173 | 1.0871 | 1.0426 |
+ | No log | 5.8125 | 372 | 1.0560 | 0.1586 | 1.0560 | 1.0276 |
+ | No log | 5.8438 | 374 | 0.9929 | 0.1541 | 0.9929 | 0.9964 |
+ | No log | 5.875 | 376 | 1.0093 | 0.0909 | 1.0093 | 1.0047 |
+ | No log | 5.9062 | 378 | 1.0048 | 0.0971 | 1.0048 | 1.0024 |
+ | No log | 5.9375 | 380 | 1.0114 | 0.0971 | 1.0114 | 1.0057 |
+ | No log | 5.9688 | 382 | 0.9709 | 0.1333 | 0.9709 | 0.9853 |
+ | No log | 6.0 | 384 | 0.9601 | 0.2769 | 0.9601 | 0.9798 |
+ | No log | 6.0312 | 386 | 0.9414 | 0.1627 | 0.9414 | 0.9703 |
+ | No log | 6.0625 | 388 | 0.9732 | 0.1620 | 0.9732 | 0.9865 |
+ | No log | 6.0938 | 390 | 1.0371 | 0.1606 | 1.0371 | 1.0184 |
+ | No log | 6.125 | 392 | 1.1297 | 0.1262 | 1.1297 | 1.0629 |
+ | No log | 6.1562 | 394 | 1.1382 | 0.1294 | 1.1382 | 1.0669 |
+ | No log | 6.1875 | 396 | 1.0886 | 0.1606 | 1.0886 | 1.0433 |
+ | No log | 6.2188 | 398 | 1.0779 | 0.1046 | 1.0779 | 1.0382 |
+ | No log | 6.25 | 400 | 1.1621 | 0.0961 | 1.1621 | 1.0780 |
+ | No log | 6.2812 | 402 | 1.2607 | 0.1439 | 1.2607 | 1.1228 |
+ | No log | 6.3125 | 404 | 1.2714 | 0.1594 | 1.2714 | 1.1276 |
+ | No log | 6.3438 | 406 | 1.1905 | 0.0961 | 1.1905 | 1.0911 |
+ | No log | 6.375 | 408 | 1.1098 | 0.0848 | 1.1098 | 1.0535 |
+ | No log | 6.4062 | 410 | 1.0633 | 0.0904 | 1.0633 | 1.0312 |
+ | No log | 6.4375 | 412 | 1.0567 | 0.0906 | 1.0567 | 1.0280 |
+ | No log | 6.4688 | 414 | 1.0946 | 0.1077 | 1.0946 | 1.0462 |
+ | No log | 6.5 | 416 | 1.1540 | 0.1017 | 1.1540 | 1.0742 |
+ | No log | 6.5312 | 418 | 1.1787 | 0.0253 | 1.1787 | 1.0857 |
+ | No log | 6.5625 | 420 | 1.1802 | 0.0285 | 1.1802 | 1.0864 |
+ | No log | 6.5938 | 422 | 1.2452 | 0.0661 | 1.2452 | 1.1159 |
+ | No log | 6.625 | 424 | 1.3085 | 0.1303 | 1.3085 | 1.1439 |
+ | No log | 6.6562 | 426 | 1.3026 | 0.0576 | 1.3026 | 1.1413 |
+ | No log | 6.6875 | 428 | 1.2429 | 0.0834 | 1.2429 | 1.1149 |
+ | No log | 6.7188 | 430 | 1.2577 | 0.0834 | 1.2577 | 1.1215 |
+ | No log | 6.75 | 432 | 1.3588 | 0.0533 | 1.3588 | 1.1657 |
+ | No log | 6.7812 | 434 | 1.4340 | 0.1242 | 1.4340 | 1.1975 |
+ | No log | 6.8125 | 436 | 1.3590 | 0.1228 | 1.3590 | 1.1657 |
+ | No log | 6.8438 | 438 | 1.2182 | 0.0927 | 1.2182 | 1.1037 |
+ | No log | 6.875 | 440 | 1.1761 | 0.1374 | 1.1761 | 1.0845 |
+ | No log | 6.9062 | 442 | 1.2176 | 0.1014 | 1.2176 | 1.1034 |
+ | No log | 6.9375 | 444 | 1.2836 | 0.1317 | 1.2836 | 1.1330 |
+ | No log | 6.9688 | 446 | 1.3025 | 0.1510 | 1.3025 | 1.1413 |
+ | No log | 7.0 | 448 | 1.1714 | 0.0933 | 1.1714 | 1.0823 |
+ | No log | 7.0312 | 450 | 1.0530 | 0.0937 | 1.0530 | 1.0261 |
+ | No log | 7.0625 | 452 | 1.0204 | 0.1066 | 1.0204 | 1.0101 |
+ | No log | 7.0938 | 454 | 1.0993 | 0.1326 | 1.0993 | 1.0485 |
+ | No log | 7.125 | 456 | 1.1602 | 0.0909 | 1.1602 | 1.0771 |
+ | No log | 7.1562 | 458 | 1.2343 | 0.1031 | 1.2343 | 1.1110 |
+ | No log | 7.1875 | 460 | 1.1828 | 0.1370 | 1.1828 | 1.0876 |
+ | No log | 7.2188 | 462 | 1.0923 | 0.0762 | 1.0923 | 1.0451 |
+ | No log | 7.25 | 464 | 1.0832 | 0.1108 | 1.0832 | 1.0408 |
+ | No log | 7.2812 | 466 | 1.1399 | 0.0985 | 1.1399 | 1.0677 |
+ | No log | 7.3125 | 468 | 1.2755 | 0.0539 | 1.2755 | 1.1294 |
+ | No log | 7.3438 | 470 | 1.3345 | 0.1004 | 1.3345 | 1.1552 |
+ | No log | 7.375 | 472 | 1.2832 | 0.1031 | 1.2832 | 1.1328 |
+ | No log | 7.4062 | 474 | 1.2345 | 0.0875 | 1.2345 | 1.1111 |
+ | No log | 7.4375 | 476 | 1.1475 | 0.0468 | 1.1475 | 1.0712 |
+ | No log | 7.4688 | 478 | 1.1536 | 0.0468 | 1.1536 | 1.0741 |
+ | No log | 7.5 | 480 | 1.2130 | 0.0875 | 1.2130 | 1.1014 |
+ | No log | 7.5312 | 482 | 1.2784 | 0.0798 | 1.2784 | 1.1307 |
+ | No log | 7.5625 | 484 | 1.3619 | 0.1144 | 1.3619 | 1.1670 |
+ | No log | 7.5938 | 486 | 1.3425 | 0.0952 | 1.3425 | 1.1587 |
+ | No log | 7.625 | 488 | 1.2302 | 0.1173 | 1.2302 | 1.1092 |
+ | No log | 7.6562 | 490 | 1.1395 | 0.1017 | 1.1395 | 1.0675 |
+ | No log | 7.6875 | 492 | 1.1006 | 0.1277 | 1.1006 | 1.0491 |
+ | No log | 7.7188 | 494 | 1.1097 | 0.0496 | 1.1097 | 1.0534 |
+ | No log | 7.75 | 496 | 1.1319 | 0.1542 | 1.1319 | 1.0639 |
+ | No log | 7.7812 | 498 | 1.2514 | 0.1736 | 1.2514 | 1.1187 |
+ | 0.3987 | 7.8125 | 500 | 1.3982 | 0.1777 | 1.3982 | 1.1824 |
+ | 0.3987 | 7.8438 | 502 | 1.3870 | 0.1520 | 1.3870 | 1.1777 |
+ | 0.3987 | 7.875 | 504 | 1.2725 | 0.1310 | 1.2725 | 1.1280 |
+ | 0.3987 | 7.9062 | 506 | 1.2396 | 0.0937 | 1.2396 | 1.1134 |
+ | 0.3987 | 7.9375 | 508 | 1.2511 | 0.0655 | 1.2511 | 1.1185 |
+ | 0.3987 | 7.9688 | 510 | 1.3086 | 0.0955 | 1.3086 | 1.1440 |
+ 
+ 
+ ### Framework versions
+ 
+ - Transformers 4.44.2
+ - Pytorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
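config.json declares a `BertForSequenceClassification` head with a single label and `problem_type: "regression"`, so the checkpoint produces one continuous organization score per input. A minimal inference sketch follows; the Hub repository id is inferred from the commit author and the model name and is therefore an assumption.

```python
# Hedged sketch: scoring one essay with the single-output regression head
# described in config.json. The repo id is assumed, not confirmed by the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 1) because num_labels == 1
score = logits.squeeze().item()      # single continuous organization score
print(score)
```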
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c0c7e9c3a4be3fcf6f177475188bdf69311781066837456c9820dd5cce4bf0ba
+ size 540799996
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3af4f1ec64f1c04291edc75d1f493ecdff177bcf1679b54ef9785c716bb13374
+ size 5368
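Both model.safetensors and training_args.bin are committed as Git LFS pointers (the `oid`/`size` lines above), so a plain `git clone` without LFS only fetches these stubs. Below is a hedged sketch of pulling the real artifacts through `huggingface_hub` and inspecting the serialized training arguments; the repo id is the same assumption as in the inference sketch above.

```python
# Hedged sketch: downloading the LFS-backed files from the Hub and reading
# back the TrainingArguments the Trainer saved. Repo id is an assumption.
import torch
from huggingface_hub import hf_hub_download

repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization"

weights_path = hf_hub_download(repo, "model.safetensors")  # ~540 MB per the pointer
args_path = hf_hub_download(repo, "training_args.bin")     # ~5 KB per the pointer

# training_args.bin is a pickled TrainingArguments object; loading it requires
# trusting the repository, hence weights_only=False.
training_args = torch.load(args_path, weights_only=False)
print(training_args.learning_rate, training_args.num_train_epochs)
```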