MayBashendy committed on
Commit 5039dfe · verified · 1 Parent(s): b5ff272

Training in progress, step 500

Files changed (4)
  1. README.md +314 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,314 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k5_task7_organization
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k5_task7_organization
+
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
+ It achieves the following results on the evaluation set (a short metric-computation sketch follows the list):
+ - Loss: 0.9902
+ - Qwk: 0.1839
+ - Mse: 0.9902
+ - Rmse: 0.9951
+
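Here, Qwk is quadratic weighted Cohen's kappa, Mse is mean squared error, and Rmse is its square root. A minimal sketch of how such values could be reproduced with scikit-learn; rounding the regression outputs to integer labels for the kappa computation is an assumption, not something the card documents:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error


def eval_metrics(predictions: np.ndarray, labels: np.ndarray) -> dict:
    """Hypothetical helper mirroring the Qwk / Mse / Rmse columns reported above."""
    mse = mean_squared_error(labels, predictions)
    # Quadratic weighted kappa expects discrete labels, so the regression
    # outputs are rounded here -- an assumption, not stated in this card.
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(predictions).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}


if __name__ == "__main__":
    gold = np.array([3.0, 2.0, 4.0, 3.0])   # toy gold scores
    pred = np.array([2.8, 2.1, 3.5, 3.2])   # toy model outputs
    print(eval_metrics(pred, gold))
```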
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (see the configuration sketch after the list):
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
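The card does not include the training script. A minimal sketch of how the listed hyperparameters might map onto `transformers.TrainingArguments`; the output directory, datasets, and `compute_metrics` hook are placeholders, and any argument not listed above falls back to library defaults (the Adam betas and epsilon shown above are those defaults):

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=1 with a regression problem type matches the config.json below.
model = AutoModelForSequenceClassification.from_pretrained(
    base_model, num_labels=1, problem_type="regression"
)

# Values taken from the hyperparameter list above; everything else is a placeholder.
training_args = TrainingArguments(
    output_dir="outputs",            # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# The train/eval datasets and compute_metrics function are not specified in this card:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset,
#                   compute_metrics=compute_metrics)
# trainer.train()
```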
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.08 | 2 | 2.5052 | -0.0788 | 2.5052 | 1.5828 |
+ | No log | 0.16 | 4 | 1.2090 | 0.2164 | 1.2090 | 1.0995 |
+ | No log | 0.24 | 6 | 1.0926 | -0.1072 | 1.0926 | 1.0453 |
+ | No log | 0.32 | 8 | 1.2681 | -0.1767 | 1.2681 | 1.1261 |
+ | No log | 0.4 | 10 | 1.1750 | -0.0578 | 1.1750 | 1.0840 |
+ | No log | 0.48 | 12 | 1.1477 | 0.0761 | 1.1477 | 1.0713 |
+ | No log | 0.56 | 14 | 1.2786 | 0.0569 | 1.2786 | 1.1308 |
+ | No log | 0.64 | 16 | 1.1644 | -0.0721 | 1.1644 | 1.0791 |
+ | No log | 0.72 | 18 | 1.0627 | 0.1332 | 1.0627 | 1.0309 |
+ | No log | 0.8 | 20 | 0.9118 | -0.0171 | 0.9118 | 0.9549 |
+ | No log | 0.88 | 22 | 0.8510 | -0.0483 | 0.8510 | 0.9225 |
+ | No log | 0.96 | 24 | 0.8273 | -0.0483 | 0.8273 | 0.9095 |
+ | No log | 1.04 | 26 | 0.8270 | -0.0149 | 0.8270 | 0.9094 |
+ | No log | 1.12 | 28 | 0.8622 | 0.1648 | 0.8622 | 0.9285 |
+ | No log | 1.2 | 30 | 0.8841 | 0.1648 | 0.8841 | 0.9403 |
+ | No log | 1.28 | 32 | 0.9262 | -0.0409 | 0.9262 | 0.9624 |
+ | No log | 1.36 | 34 | 0.9100 | 0.0227 | 0.9100 | 0.9540 |
+ | No log | 1.44 | 36 | 0.9112 | 0.0966 | 0.9112 | 0.9546 |
+ | No log | 1.52 | 38 | 0.9436 | 0.2027 | 0.9436 | 0.9714 |
+ | No log | 1.6 | 40 | 0.9626 | 0.2027 | 0.9626 | 0.9811 |
+ | No log | 1.68 | 42 | 0.9619 | 0.1699 | 0.9619 | 0.9808 |
+ | No log | 1.76 | 44 | 0.9617 | 0.0208 | 0.9617 | 0.9807 |
+ | No log | 1.84 | 46 | 1.0272 | -0.0765 | 1.0272 | 1.0135 |
+ | No log | 1.92 | 48 | 1.0388 | -0.1186 | 1.0388 | 1.0192 |
+ | No log | 2.0 | 50 | 0.9754 | 0.1699 | 0.9754 | 0.9876 |
+ | No log | 2.08 | 52 | 0.9389 | 0.1699 | 0.9389 | 0.9690 |
+ | No log | 2.16 | 54 | 0.9292 | 0.0966 | 0.9292 | 0.9640 |
+ | No log | 2.24 | 56 | 0.9661 | -0.0095 | 0.9661 | 0.9829 |
+ | No log | 2.32 | 58 | 0.9542 | -0.0479 | 0.9542 | 0.9768 |
+ | No log | 2.4 | 60 | 0.9244 | 0.0236 | 0.9244 | 0.9615 |
+ | No log | 2.48 | 62 | 1.1457 | 0.1116 | 1.1457 | 1.0704 |
+ | No log | 2.56 | 64 | 1.3156 | 0.0712 | 1.3156 | 1.1470 |
+ | No log | 2.64 | 66 | 1.1972 | 0.0812 | 1.1972 | 1.0942 |
+ | No log | 2.72 | 68 | 0.9423 | 0.1598 | 0.9423 | 0.9707 |
+ | No log | 2.8 | 70 | 0.8869 | -0.0511 | 0.8869 | 0.9418 |
+ | No log | 2.88 | 72 | 0.9585 | -0.1166 | 0.9585 | 0.9790 |
+ | No log | 2.96 | 74 | 1.0078 | -0.0238 | 1.0078 | 1.0039 |
+ | No log | 3.04 | 76 | 0.9291 | -0.0143 | 0.9291 | 0.9639 |
+ | No log | 3.12 | 78 | 0.9634 | 0.2467 | 0.9634 | 0.9816 |
+ | No log | 3.2 | 80 | 1.0693 | 0.1521 | 1.0693 | 1.0341 |
+ | No log | 3.28 | 82 | 0.9920 | 0.1682 | 0.9920 | 0.9960 |
+ | No log | 3.36 | 84 | 0.8918 | 0.1866 | 0.8918 | 0.9443 |
+ | No log | 3.44 | 86 | 0.8751 | 0.2063 | 0.8751 | 0.9354 |
+ | No log | 3.52 | 88 | 1.0574 | 0.0171 | 1.0574 | 1.0283 |
+ | No log | 3.6 | 90 | 1.1079 | -0.0708 | 1.1079 | 1.0526 |
+ | No log | 3.68 | 92 | 0.9925 | 0.0977 | 0.9925 | 0.9962 |
+ | No log | 3.76 | 94 | 0.9388 | 0.1587 | 0.9388 | 0.9689 |
+ | No log | 3.84 | 96 | 0.9383 | 0.1587 | 0.9383 | 0.9686 |
+ | No log | 3.92 | 98 | 0.9591 | 0.0966 | 0.9591 | 0.9793 |
+ | No log | 4.0 | 100 | 0.9786 | 0.0208 | 0.9786 | 0.9892 |
+ | No log | 4.08 | 102 | 0.9927 | 0.0208 | 0.9927 | 0.9963 |
+ | No log | 4.16 | 104 | 1.0093 | 0.0966 | 1.0093 | 1.0047 |
+ | No log | 4.24 | 106 | 1.0243 | 0.0966 | 1.0243 | 1.0121 |
+ | No log | 4.32 | 108 | 1.0254 | 0.0966 | 1.0254 | 1.0126 |
+ | No log | 4.4 | 110 | 1.0841 | -0.1037 | 1.0841 | 1.0412 |
+ | No log | 4.48 | 112 | 1.2040 | -0.0631 | 1.2040 | 1.0973 |
+ | No log | 4.56 | 114 | 1.2715 | -0.0386 | 1.2715 | 1.1276 |
+ | No log | 4.64 | 116 | 1.2902 | -0.0488 | 1.2902 | 1.1359 |
+ | No log | 4.72 | 118 | 1.2514 | 0.0250 | 1.2514 | 1.1187 |
+ | No log | 4.8 | 120 | 1.2302 | -0.0011 | 1.2302 | 1.1091 |
+ | No log | 4.88 | 122 | 1.1976 | -0.0010 | 1.1976 | 1.0943 |
+ | No log | 4.96 | 124 | 1.1683 | 0.0091 | 1.1683 | 1.0809 |
+ | No log | 5.04 | 126 | 1.1661 | 0.0236 | 1.1661 | 1.0799 |
+ | No log | 5.12 | 128 | 1.1238 | 0.0798 | 1.1238 | 1.0601 |
+ | No log | 5.2 | 130 | 1.1200 | 0.1448 | 1.1200 | 1.0583 |
+ | No log | 5.28 | 132 | 1.1535 | 0.1077 | 1.1535 | 1.0740 |
+ | No log | 5.36 | 134 | 1.1127 | 0.0829 | 1.1127 | 1.0549 |
+ | No log | 5.44 | 136 | 1.0419 | 0.1065 | 1.0419 | 1.0208 |
+ | No log | 5.52 | 138 | 1.0256 | 0.1988 | 1.0256 | 1.0127 |
+ | No log | 5.6 | 140 | 1.0344 | 0.1988 | 1.0344 | 1.0171 |
+ | No log | 5.68 | 142 | 1.0554 | 0.1179 | 1.0554 | 1.0273 |
+ | No log | 5.76 | 144 | 1.0987 | 0.1783 | 1.0987 | 1.0482 |
+ | No log | 5.84 | 146 | 1.1345 | 0.0987 | 1.1345 | 1.0651 |
+ | No log | 5.92 | 148 | 1.1636 | 0.0561 | 1.1636 | 1.0787 |
+ | No log | 6.0 | 150 | 1.1952 | 0.1030 | 1.1952 | 1.0933 |
+ | No log | 6.08 | 152 | 1.1059 | 0.1172 | 1.1059 | 1.0516 |
+ | No log | 6.16 | 154 | 1.0210 | 0.2475 | 1.0210 | 1.0104 |
+ | No log | 6.24 | 156 | 1.0263 | 0.1860 | 1.0263 | 1.0130 |
+ | No log | 6.32 | 158 | 0.9898 | 0.2535 | 0.9898 | 0.9949 |
+ | No log | 6.4 | 160 | 0.9742 | 0.2140 | 0.9742 | 0.9870 |
+ | No log | 6.48 | 162 | 0.9907 | 0.2161 | 0.9907 | 0.9953 |
+ | No log | 6.56 | 164 | 1.0419 | 0.1843 | 1.0419 | 1.0207 |
+ | No log | 6.64 | 166 | 1.0539 | 0.1196 | 1.0539 | 1.0266 |
+ | No log | 6.72 | 168 | 1.0337 | 0.2563 | 1.0337 | 1.0167 |
+ | No log | 6.8 | 170 | 1.0392 | 0.2283 | 1.0392 | 1.0194 |
+ | No log | 6.88 | 172 | 1.0988 | 0.1268 | 1.0988 | 1.0482 |
+ | No log | 6.96 | 174 | 1.1138 | 0.1268 | 1.1138 | 1.0554 |
+ | No log | 7.04 | 176 | 1.0950 | 0.0803 | 1.0950 | 1.0464 |
+ | No log | 7.12 | 178 | 1.0651 | 0.0803 | 1.0651 | 1.0320 |
+ | No log | 7.2 | 180 | 1.0602 | 0.0995 | 1.0602 | 1.0297 |
+ | No log | 7.28 | 182 | 1.0781 | 0.1418 | 1.0781 | 1.0383 |
+ | No log | 7.36 | 184 | 1.0829 | 0.1285 | 1.0829 | 1.0406 |
+ | No log | 7.44 | 186 | 1.1294 | 0.0725 | 1.1294 | 1.0627 |
+ | No log | 7.52 | 188 | 1.2088 | 0.0899 | 1.2088 | 1.0994 |
+ | No log | 7.6 | 190 | 1.2332 | 0.0591 | 1.2332 | 1.1105 |
+ | No log | 7.68 | 192 | 1.2093 | 0.0654 | 1.2093 | 1.0997 |
+ | No log | 7.76 | 194 | 1.2554 | 0.1283 | 1.2554 | 1.1204 |
+ | No log | 7.84 | 196 | 1.2234 | 0.1085 | 1.2234 | 1.1061 |
+ | No log | 7.92 | 198 | 1.1677 | 0.0987 | 1.1677 | 1.0806 |
+ | No log | 8.0 | 200 | 1.1480 | 0.1357 | 1.1480 | 1.0714 |
+ | No log | 8.08 | 202 | 1.1509 | 0.0997 | 1.1509 | 1.0728 |
+ | No log | 8.16 | 204 | 1.2110 | 0.0401 | 1.2110 | 1.1004 |
+ | No log | 8.24 | 206 | 1.2429 | 0.1114 | 1.2429 | 1.1149 |
+ | No log | 8.32 | 208 | 1.2048 | 0.0379 | 1.2048 | 1.0976 |
+ | No log | 8.4 | 210 | 1.1151 | 0.0512 | 1.1151 | 1.0560 |
+ | No log | 8.48 | 212 | 1.0482 | 0.0891 | 1.0482 | 1.0238 |
+ | No log | 8.56 | 214 | 1.0160 | 0.0539 | 1.0160 | 1.0080 |
+ | No log | 8.64 | 216 | 1.0660 | 0.0390 | 1.0660 | 1.0325 |
+ | No log | 8.72 | 218 | 1.0851 | 0.0390 | 1.0851 | 1.0417 |
+ | No log | 8.8 | 220 | 1.0273 | 0.0575 | 1.0273 | 1.0136 |
+ | No log | 8.88 | 222 | 1.0236 | 0.0754 | 1.0236 | 1.0117 |
+ | No log | 8.96 | 224 | 1.0467 | 0.0378 | 1.0467 | 1.0231 |
+ | No log | 9.04 | 226 | 1.0779 | 0.0548 | 1.0779 | 1.0382 |
+ | No log | 9.12 | 228 | 1.0871 | 0.0190 | 1.0871 | 1.0426 |
+ | No log | 9.2 | 230 | 1.0588 | 0.0799 | 1.0588 | 1.0290 |
+ | No log | 9.28 | 232 | 0.9962 | 0.1661 | 0.9962 | 0.9981 |
+ | No log | 9.36 | 234 | 0.9630 | 0.1935 | 0.9630 | 0.9813 |
+ | No log | 9.44 | 236 | 0.9531 | 0.1935 | 0.9531 | 0.9763 |
+ | No log | 9.52 | 238 | 0.9539 | 0.2096 | 0.9539 | 0.9767 |
+ | No log | 9.6 | 240 | 0.9902 | 0.2105 | 0.9902 | 0.9951 |
+ | No log | 9.68 | 242 | 0.9783 | 0.2662 | 0.9783 | 0.9891 |
+ | No log | 9.76 | 244 | 0.9483 | 0.2096 | 0.9483 | 0.9738 |
+ | No log | 9.84 | 246 | 0.9348 | 0.1567 | 0.9348 | 0.9669 |
+ | No log | 9.92 | 248 | 0.9277 | 0.1587 | 0.9277 | 0.9632 |
+ | No log | 10.0 | 250 | 0.9380 | 0.2467 | 0.9380 | 0.9685 |
+ | No log | 10.08 | 252 | 0.9462 | 0.2467 | 0.9462 | 0.9727 |
+ | No log | 10.16 | 254 | 0.9946 | 0.2012 | 0.9946 | 0.9973 |
+ | No log | 10.24 | 256 | 1.0317 | 0.1962 | 1.0317 | 1.0157 |
+ | No log | 10.32 | 258 | 1.0724 | 0.1584 | 1.0724 | 1.0356 |
+ | No log | 10.4 | 260 | 1.0932 | 0.0993 | 1.0932 | 1.0456 |
+ | No log | 10.48 | 262 | 1.1488 | 0.0516 | 1.1488 | 1.0718 |
+ | No log | 10.56 | 264 | 1.2556 | 0.0584 | 1.2556 | 1.1205 |
+ | No log | 10.64 | 266 | 1.4257 | 0.1531 | 1.4257 | 1.1940 |
+ | No log | 10.72 | 268 | 1.4454 | 0.1067 | 1.4454 | 1.2022 |
+ | No log | 10.8 | 270 | 1.3118 | 0.0299 | 1.3118 | 1.1453 |
+ | No log | 10.88 | 272 | 1.2297 | -0.0708 | 1.2297 | 1.1089 |
+ | No log | 10.96 | 274 | 1.1414 | 0.0169 | 1.1414 | 1.0684 |
+ | No log | 11.04 | 276 | 1.1008 | 0.0686 | 1.1008 | 1.0492 |
+ | No log | 11.12 | 278 | 1.1109 | 0.0368 | 1.1109 | 1.0540 |
+ | No log | 11.2 | 280 | 1.1042 | 0.0154 | 1.1042 | 1.0508 |
+ | No log | 11.28 | 282 | 1.0903 | 0.0033 | 1.0903 | 1.0442 |
+ | No log | 11.36 | 284 | 1.0998 | 0.0404 | 1.0998 | 1.0487 |
+ | No log | 11.44 | 286 | 1.0934 | 0.0766 | 1.0934 | 1.0457 |
+ | No log | 11.52 | 288 | 1.1153 | 0.0566 | 1.1153 | 1.0561 |
+ | No log | 11.6 | 290 | 1.2493 | 0.1004 | 1.2493 | 1.1177 |
+ | No log | 11.68 | 292 | 1.2573 | 0.1283 | 1.2573 | 1.1213 |
+ | No log | 11.76 | 294 | 1.1888 | -0.0130 | 1.1888 | 1.0903 |
+ | No log | 11.84 | 296 | 1.1051 | 0.0906 | 1.1051 | 1.0512 |
+ | No log | 11.92 | 298 | 1.0945 | 0.1243 | 1.0945 | 1.0462 |
+ | No log | 12.0 | 300 | 1.0724 | 0.1779 | 1.0724 | 1.0356 |
+ | No log | 12.08 | 302 | 1.0941 | 0.1441 | 1.0941 | 1.0460 |
+ | No log | 12.16 | 304 | 1.0847 | 0.0539 | 1.0847 | 1.0415 |
+ | No log | 12.24 | 306 | 1.0019 | 0.1866 | 1.0019 | 1.0010 |
+ | No log | 12.32 | 308 | 0.9586 | 0.1567 | 0.9586 | 0.9791 |
+ | No log | 12.4 | 310 | 0.9597 | 0.0906 | 0.9597 | 0.9796 |
+ | No log | 12.48 | 312 | 0.9633 | 0.1567 | 0.9633 | 0.9815 |
+ | No log | 12.56 | 314 | 1.0295 | 0.2063 | 1.0295 | 1.0147 |
+ | No log | 12.64 | 316 | 1.1511 | 0.0666 | 1.1511 | 1.0729 |
+ | No log | 12.72 | 318 | 1.2125 | 0.0576 | 1.2125 | 1.1011 |
+ | No log | 12.8 | 320 | 1.2023 | 0.0318 | 1.2023 | 1.0965 |
+ | No log | 12.88 | 322 | 1.1911 | 0.0284 | 1.1911 | 1.0914 |
+ | No log | 12.96 | 324 | 1.1655 | 0.2066 | 1.1655 | 1.0796 |
+ | No log | 13.04 | 326 | 1.1524 | 0.1770 | 1.1524 | 1.0735 |
+ | No log | 13.12 | 328 | 1.1621 | 0.1058 | 1.1621 | 1.0780 |
+ | No log | 13.2 | 330 | 1.1102 | 0.1375 | 1.1102 | 1.0537 |
+ | No log | 13.28 | 332 | 1.0710 | 0.1375 | 1.0710 | 1.0349 |
+ | No log | 13.36 | 334 | 0.9936 | 0.1628 | 0.9936 | 0.9968 |
+ | No log | 13.44 | 336 | 0.9611 | 0.3127 | 0.9611 | 0.9804 |
+ | No log | 13.52 | 338 | 0.9803 | 0.2359 | 0.9803 | 0.9901 |
+ | No log | 13.6 | 340 | 1.0069 | 0.2009 | 1.0069 | 1.0035 |
+ | No log | 13.68 | 342 | 1.0875 | 0.0710 | 1.0875 | 1.0428 |
+ | No log | 13.76 | 344 | 1.1696 | 0.0553 | 1.1696 | 1.0815 |
+ | No log | 13.84 | 346 | 1.1737 | 0.0068 | 1.1737 | 1.0834 |
+ | No log | 13.92 | 348 | 1.1172 | 0.1355 | 1.1172 | 1.0570 |
+ | No log | 14.0 | 350 | 1.0641 | 0.2660 | 1.0641 | 1.0316 |
+ | No log | 14.08 | 352 | 1.0768 | 0.2052 | 1.0768 | 1.0377 |
+ | No log | 14.16 | 354 | 1.0149 | 0.2415 | 1.0149 | 1.0074 |
+ | No log | 14.24 | 356 | 0.9680 | 0.1176 | 0.9680 | 0.9839 |
+ | No log | 14.32 | 358 | 0.9786 | 0.1753 | 0.9786 | 0.9892 |
+ | No log | 14.4 | 360 | 0.9624 | 0.2116 | 0.9624 | 0.9810 |
+ | No log | 14.48 | 362 | 0.9095 | 0.1866 | 0.9095 | 0.9537 |
+ | No log | 14.56 | 364 | 0.9079 | 0.1289 | 0.9079 | 0.9528 |
+ | No log | 14.64 | 366 | 0.9658 | 0.1935 | 0.9658 | 0.9828 |
+ | No log | 14.72 | 368 | 0.9777 | 0.1629 | 0.9777 | 0.9888 |
+ | No log | 14.8 | 370 | 0.9819 | 0.1213 | 0.9819 | 0.9909 |
+ | No log | 14.88 | 372 | 1.0438 | 0.2202 | 1.0438 | 1.0217 |
+ | No log | 14.96 | 374 | 1.1791 | 0.0576 | 1.1791 | 1.0859 |
+ | No log | 15.04 | 376 | 1.2348 | 0.0778 | 1.2348 | 1.1112 |
+ | No log | 15.12 | 378 | 1.1648 | 0.1113 | 1.1648 | 1.0792 |
+ | No log | 15.2 | 380 | 1.0489 | 0.2415 | 1.0489 | 1.0242 |
+ | No log | 15.28 | 382 | 1.0079 | 0.0899 | 1.0079 | 1.0040 |
+ | No log | 15.36 | 384 | 1.0325 | -0.0020 | 1.0325 | 1.0161 |
+ | No log | 15.44 | 386 | 1.0193 | -0.0020 | 1.0193 | 1.0096 |
+ | No log | 15.52 | 388 | 0.9779 | 0.1253 | 0.9779 | 0.9889 |
+ | No log | 15.6 | 390 | 0.9494 | 0.2285 | 0.9494 | 0.9744 |
+ | No log | 15.68 | 392 | 0.9644 | 0.1815 | 0.9644 | 0.9820 |
+ | No log | 15.76 | 394 | 0.9584 | 0.1815 | 0.9584 | 0.9790 |
+ | No log | 15.84 | 396 | 0.9469 | 0.2285 | 0.9469 | 0.9731 |
+ | No log | 15.92 | 398 | 0.9460 | 0.1918 | 0.9460 | 0.9726 |
+ | No log | 16.0 | 400 | 0.9435 | 0.1697 | 0.9435 | 0.9713 |
+ | No log | 16.08 | 402 | 0.9396 | 0.1577 | 0.9396 | 0.9693 |
+ | No log | 16.16 | 404 | 0.9368 | 0.2345 | 0.9368 | 0.9679 |
+ | No log | 16.24 | 406 | 0.9246 | 0.2345 | 0.9246 | 0.9615 |
+ | No log | 16.32 | 408 | 0.9236 | 0.2345 | 0.9236 | 0.9610 |
+ | No log | 16.4 | 410 | 0.9522 | 0.2285 | 0.9522 | 0.9758 |
+ | No log | 16.48 | 412 | 1.0330 | 0.2253 | 1.0330 | 1.0164 |
+ | No log | 16.56 | 414 | 1.0728 | 0.2315 | 1.0728 | 1.0358 |
+ | No log | 16.64 | 416 | 1.0273 | 0.2045 | 1.0273 | 1.0135 |
+ | No log | 16.72 | 418 | 1.0008 | 0.2045 | 1.0008 | 1.0004 |
+ | No log | 16.8 | 420 | 0.9632 | 0.2227 | 0.9632 | 0.9814 |
+ | No log | 16.88 | 422 | 0.9617 | 0.1850 | 0.9617 | 0.9807 |
+ | No log | 16.96 | 424 | 0.9878 | 0.1850 | 0.9878 | 0.9939 |
+ | No log | 17.04 | 426 | 1.0172 | 0.1706 | 1.0172 | 1.0086 |
+ | No log | 17.12 | 428 | 1.0780 | 0.0808 | 1.0780 | 1.0383 |
+ | No log | 17.2 | 430 | 1.1481 | 0.0985 | 1.1481 | 1.0715 |
+ | No log | 17.28 | 432 | 1.1125 | 0.1110 | 1.1125 | 1.0547 |
+ | No log | 17.36 | 434 | 1.0264 | 0.2027 | 1.0264 | 1.0131 |
+ | No log | 17.44 | 436 | 0.9888 | 0.1539 | 0.9888 | 0.9944 |
+ | No log | 17.52 | 438 | 0.9874 | 0.0353 | 0.9874 | 0.9937 |
+ | No log | 17.6 | 440 | 0.9509 | -0.0307 | 0.9509 | 0.9752 |
+ | No log | 17.68 | 442 | 0.8930 | 0.1624 | 0.8930 | 0.9450 |
+ | No log | 17.76 | 444 | 0.9222 | 0.2527 | 0.9222 | 0.9603 |
+ | No log | 17.84 | 446 | 1.0269 | 0.1144 | 1.0269 | 1.0134 |
+ | No log | 17.92 | 448 | 1.1073 | 0.1271 | 1.1073 | 1.0523 |
+ | No log | 18.0 | 450 | 1.1036 | 0.1843 | 1.1036 | 1.0505 |
+ | No log | 18.08 | 452 | 1.0445 | 0.1493 | 1.0445 | 1.0220 |
+ | No log | 18.16 | 454 | 0.9646 | 0.2817 | 0.9646 | 0.9821 |
+ | No log | 18.24 | 456 | 0.9393 | 0.1835 | 0.9393 | 0.9692 |
+ | No log | 18.32 | 458 | 0.9397 | 0.1835 | 0.9397 | 0.9694 |
+ | No log | 18.4 | 460 | 0.9449 | 0.2305 | 0.9449 | 0.9720 |
+ | No log | 18.48 | 462 | 0.9674 | 0.2662 | 0.9674 | 0.9836 |
+ | No log | 18.56 | 464 | 0.9767 | 0.2275 | 0.9767 | 0.9883 |
+ | No log | 18.64 | 466 | 0.9711 | 0.2275 | 0.9711 | 0.9855 |
+ | No log | 18.72 | 468 | 1.0292 | 0.1454 | 1.0292 | 1.0145 |
+ | No log | 18.8 | 470 | 1.0735 | 0.1831 | 1.0735 | 1.0361 |
+ | No log | 18.88 | 472 | 1.0841 | 0.1926 | 1.0841 | 1.0412 |
+ | No log | 18.96 | 474 | 1.0576 | 0.1723 | 1.0576 | 1.0284 |
+ | No log | 19.04 | 476 | 1.0336 | 0.1542 | 1.0336 | 1.0166 |
+ | No log | 19.12 | 478 | 1.0286 | 0.1884 | 1.0286 | 1.0142 |
+ | No log | 19.2 | 480 | 1.0126 | 0.2313 | 1.0126 | 1.0063 |
+ | No log | 19.28 | 482 | 1.0173 | 0.3023 | 1.0173 | 1.0086 |
+ | No log | 19.36 | 484 | 1.0695 | 0.1926 | 1.0695 | 1.0342 |
+ | No log | 19.44 | 486 | 1.1252 | 0.1143 | 1.1252 | 1.0608 |
+ | No log | 19.52 | 488 | 1.1019 | 0.1173 | 1.1019 | 1.0497 |
+ | No log | 19.6 | 490 | 1.0209 | 0.1178 | 1.0209 | 1.0104 |
+ | No log | 19.68 | 492 | 0.9411 | 0.2012 | 0.9411 | 0.9701 |
+ | No log | 19.76 | 494 | 0.8815 | 0.2589 | 0.8815 | 0.9389 |
+ | No log | 19.84 | 496 | 0.8901 | 0.2589 | 0.8901 | 0.9435 |
+ | No log | 19.92 | 498 | 0.9237 | 0.2722 | 0.9237 | 0.9611 |
+ | 0.3088 | 20.0 | 500 | 0.9056 | 0.2589 | 0.9056 | 0.9517 |
+ | 0.3088 | 20.08 | 502 | 0.8937 | 0.2285 | 0.8937 | 0.9454 |
+ | 0.3088 | 20.16 | 504 | 0.9198 | 0.2440 | 0.9198 | 0.9591 |
+ | 0.3088 | 20.24 | 506 | 0.9478 | 0.2440 | 0.9478 | 0.9736 |
+ | 0.3088 | 20.32 | 508 | 0.9765 | 0.1884 | 0.9765 | 0.9882 |
+ | 0.3088 | 20.4 | 510 | 0.9902 | 0.1839 | 0.9902 | 0.9951 |
+
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - PyTorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
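For completeness, a minimal inference sketch. The repository id below is inferred from the username and model name in this card and is an assumption; with `num_labels=1` and `problem_type: "regression"` (see config.json below), the model returns a single organization score per input essay:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repo id, built from the username and model name in this card.
repo_id = (
    "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
    "FineTuningAraBERT_run1_AugV5_k5_task7_organization"
)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic essay to score (placeholder)
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    # The regression head emits a single logit, used directly as the score.
    score = model(**inputs).logits.squeeze(-1).item()
print(score)
```

Note that this commit corresponds to "Training in progress, step 500", so the stored weights reflect an intermediate checkpoint rather than the end of the 100-epoch schedule.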
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
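A small sketch of inspecting this configuration with `transformers.AutoConfig`, using the same assumed repository id as above (a local directory containing this `config.json` would work equally well):

```python
from transformers import AutoConfig

# Assumed repo id; see the inference sketch above.
config = AutoConfig.from_pretrained(
    "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
    "FineTuningAraBERT_run1_AugV5_k5_task7_organization"
)

print(config.problem_type)  # "regression"
print(config.num_labels)    # 1, from the single LABEL_0 entry in id2label
print(config.vocab_size)    # 64000 (AraBERTv02 vocabulary)
```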
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bfbac144561b23f2c45fad1c95dbc2f7895e2bd85cf6362d0a889655677bc820
+ size 540799996
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c6f7f2a0f34080350e529035c5b4a7ebdc7e841414bdd36177ab10d27158c353
+ size 5368