MayBashendy committed on
Commit 7d73c38 · verified · 1 Parent(s): 4dd3f19

Training in progress, step 500

Files changed (4)
  1. README.md +209 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,209 @@
+ ---
+ library_name: transformers
+ base_model: aubmindlab/bert-base-arabertv02
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k1_task7_organization
+   results: []
+ ---
+ 
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+ 
+ # ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k1_task7_organization
+ 
+ This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.9319
+ - Qwk: 0.0982
+ - Mse: 0.9319
+ - Rmse: 0.9653
+ 
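+ Loss, Mse, and Rmse are not independent here: `config.json` sets `problem_type` to `"regression"`, so the training objective is mean squared error, Loss equals Mse, and Rmse is its square root. Qwk (quadratic weighted kappa) is computed on discrete scores, which requires rounding the continuous predictions first. A minimal sketch of these relationships (the rounding scheme is an assumption, not taken from this repository):
+ 
+ ```python
+ import numpy as np
+ from sklearn.metrics import cohen_kappa_score, mean_squared_error
+ 
+ def eval_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
+     mse = mean_squared_error(labels, preds)   # identical to the reported Loss
+     rmse = float(np.sqrt(mse))                # Rmse = sqrt(Mse)
+     # Qwk needs categorical scores, so round the continuous outputs (assumed scheme).
+     qwk = cohen_kappa_score(labels.round().astype(int),
+                             preds.round().astype(int),
+                             weights="quadratic")
+     return {"loss": mse, "qwk": qwk, "mse": mse, "rmse": rmse}
+ ```
+ 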
+ ## Model description
+ 
+ More information needed
+ 
+ ## Intended uses & limitations
+ 
+ More information needed
+ 
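+ While the sections above are unfilled, the following is a hedged sketch of loading the checkpoint for inference; the repository id is inferred from the model name and committer, and the Arabic input is illustrative:
+ 
+ ```python
+ import torch
+ from transformers import AutoModelForSequenceClassification, AutoTokenizer
+ 
+ # Assumed repo id; adjust the namespace if it differs.
+ repo = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k1_task7_organization"
+ tokenizer = AutoTokenizer.from_pretrained(repo)
+ model = AutoModelForSequenceClassification.from_pretrained(repo)  # single-logit regression head
+ 
+ inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True, max_length=512)
+ with torch.no_grad():
+     score = model(**inputs).logits.squeeze().item()  # continuous organization score
+ print(score)
+ ```
+ 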
+ ## Training and evaluation data
+ 
+ More information needed
+ 
+ ## Training procedure
+ 
+ ### Training hyperparameters
+ 
+ The following hyperparameters were used during training (see the sketch after this list):
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+ 
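+ A sketch of how the listed values map onto `TrainingArguments` (everything not listed in the card is a library default or an assumption, including the placeholder `output_dir`):
+ 
+ ```python
+ from transformers import TrainingArguments
+ 
+ args = TrainingArguments(
+     output_dir="out",                  # placeholder path, not from the card
+     learning_rate=2e-5,
+     per_device_train_batch_size=8,
+     per_device_eval_batch_size=8,
+     seed=42,
+     num_train_epochs=100,
+     lr_scheduler_type="linear",
+     adam_beta1=0.9,                    # matches the betas/epsilon the card reports
+     adam_beta2=0.999,
+     adam_epsilon=1e-8,
+ )
+ ```
+ 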
+ ### Training results
+ 
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|:------:|:------:|
+ | No log | 0.6667 | 2 | 4.2154 | -0.0060 | 4.2154 | 2.0531 |
+ | No log | 1.3333 | 4 | 2.1734 | -0.0316 | 2.1734 | 1.4742 |
+ | No log | 2.0 | 6 | 1.8642 | 0.0298 | 1.8642 | 1.3653 |
+ | No log | 2.6667 | 8 | 1.6716 | 0.0233 | 1.6716 | 1.2929 |
+ | No log | 3.3333 | 10 | 0.9513 | 0.0860 | 0.9513 | 0.9753 |
+ | No log | 4.0 | 12 | 1.3112 | 0.0390 | 1.3112 | 1.1451 |
+ | No log | 4.6667 | 14 | 1.7222 | 0.0049 | 1.7222 | 1.3123 |
+ | No log | 5.3333 | 16 | 1.2439 | 0.0686 | 1.2439 | 1.1153 |
+ | No log | 6.0 | 18 | 0.8313 | 0.2813 | 0.8313 | 0.9117 |
+ | No log | 6.6667 | 20 | 0.8361 | 0.2739 | 0.8361 | 0.9144 |
+ | No log | 7.3333 | 22 | 1.0473 | 0.1184 | 1.0473 | 1.0234 |
+ | No log | 8.0 | 24 | 1.5759 | 0.0898 | 1.5759 | 1.2554 |
+ | No log | 8.6667 | 26 | 1.2758 | 0.0312 | 1.2758 | 1.1295 |
+ | No log | 9.3333 | 28 | 0.9355 | 0.2339 | 0.9355 | 0.9672 |
+ | No log | 10.0 | 30 | 0.8571 | 0.2277 | 0.8571 | 0.9258 |
+ | No log | 10.6667 | 32 | 0.9790 | 0.1600 | 0.9790 | 0.9895 |
+ | No log | 11.3333 | 34 | 1.1933 | 0.1182 | 1.1933 | 1.0924 |
+ | No log | 12.0 | 36 | 1.0288 | 0.1641 | 1.0288 | 1.0143 |
+ | No log | 12.6667 | 38 | 0.9725 | 0.1531 | 0.9725 | 0.9862 |
+ | No log | 13.3333 | 40 | 1.0411 | 0.0752 | 1.0411 | 1.0203 |
+ | No log | 14.0 | 42 | 1.1477 | 0.1056 | 1.1477 | 1.0713 |
+ | No log | 14.6667 | 44 | 1.1977 | 0.0996 | 1.1977 | 1.0944 |
+ | No log | 15.3333 | 46 | 1.2576 | -0.0026 | 1.2576 | 1.1214 |
+ | No log | 16.0 | 48 | 1.4412 | 0.0858 | 1.4412 | 1.2005 |
+ | No log | 16.6667 | 50 | 1.3153 | 0.0020 | 1.3153 | 1.1468 |
+ | No log | 17.3333 | 52 | 1.1836 | 0.0972 | 1.1836 | 1.0879 |
+ | No log | 18.0 | 54 | 1.0634 | 0.1653 | 1.0634 | 1.0312 |
+ | No log | 18.6667 | 56 | 1.1575 | 0.0671 | 1.1575 | 1.0759 |
+ | No log | 19.3333 | 58 | 1.3467 | 0.0858 | 1.3467 | 1.1605 |
+ | No log | 20.0 | 60 | 1.2003 | 0.0999 | 1.2003 | 1.0956 |
+ | No log | 20.6667 | 62 | 1.0722 | 0.1820 | 1.0722 | 1.0355 |
+ | No log | 21.3333 | 64 | 1.0590 | 0.1449 | 1.0590 | 1.0291 |
+ | No log | 22.0 | 66 | 1.1925 | 0.0966 | 1.1925 | 1.0920 |
+ | No log | 22.6667 | 68 | 1.2008 | 0.1281 | 1.2008 | 1.0958 |
+ | No log | 23.3333 | 70 | 1.1068 | 0.1472 | 1.1068 | 1.0521 |
+ | No log | 24.0 | 72 | 1.0672 | 0.0834 | 1.0672 | 1.0331 |
+ | No log | 24.6667 | 74 | 1.0327 | 0.1917 | 1.0327 | 1.0162 |
+ | No log | 25.3333 | 76 | 0.9911 | 0.1490 | 0.9911 | 0.9956 |
+ | No log | 26.0 | 78 | 0.9620 | 0.1234 | 0.9620 | 0.9808 |
+ | No log | 26.6667 | 80 | 0.8954 | 0.2449 | 0.8954 | 0.9462 |
+ | No log | 27.3333 | 82 | 0.8566 | 0.2809 | 0.8566 | 0.9255 |
+ | No log | 28.0 | 84 | 0.8661 | 0.2479 | 0.8661 | 0.9306 |
+ | No log | 28.6667 | 86 | 0.9168 | 0.1547 | 0.9168 | 0.9575 |
+ | No log | 29.3333 | 88 | 0.9412 | 0.2036 | 0.9412 | 0.9702 |
+ | No log | 30.0 | 90 | 0.9896 | 0.2238 | 0.9896 | 0.9948 |
+ | No log | 30.6667 | 92 | 1.0643 | 0.1479 | 1.0643 | 1.0317 |
+ | No log | 31.3333 | 94 | 1.0855 | 0.1545 | 1.0855 | 1.0419 |
+ | No log | 32.0 | 96 | 1.0320 | 0.1225 | 1.0320 | 1.0159 |
+ | No log | 32.6667 | 98 | 0.9797 | 0.1196 | 0.9797 | 0.9898 |
+ | No log | 33.3333 | 100 | 0.9685 | 0.1578 | 0.9685 | 0.9841 |
+ | No log | 34.0 | 102 | 0.9433 | 0.1231 | 0.9433 | 0.9712 |
+ | No log | 34.6667 | 104 | 0.9862 | 0.2164 | 0.9862 | 0.9931 |
+ | No log | 35.3333 | 106 | 1.0853 | 0.0443 | 1.0853 | 1.0418 |
+ | No log | 36.0 | 108 | 1.0853 | 0.0741 | 1.0853 | 1.0418 |
+ | No log | 36.6667 | 110 | 1.0564 | 0.0995 | 1.0564 | 1.0278 |
+ | No log | 37.3333 | 112 | 1.0430 | 0.0991 | 1.0430 | 1.0213 |
+ | No log | 38.0 | 114 | 1.0199 | 0.1600 | 1.0199 | 1.0099 |
+ | No log | 38.6667 | 116 | 1.0008 | 0.1682 | 1.0008 | 1.0004 |
+ | No log | 39.3333 | 118 | 0.9581 | 0.1682 | 0.9581 | 0.9788 |
+ | No log | 40.0 | 120 | 0.9366 | 0.2273 | 0.9366 | 0.9678 |
+ | No log | 40.6667 | 122 | 0.9447 | 0.2303 | 0.9447 | 0.9720 |
+ | No log | 41.3333 | 124 | 0.9732 | 0.1682 | 0.9732 | 0.9865 |
+ | No log | 42.0 | 126 | 0.9248 | 0.2652 | 0.9248 | 0.9617 |
+ | No log | 42.6667 | 128 | 0.9043 | 0.2061 | 0.9043 | 0.9509 |
+ | No log | 43.3333 | 130 | 0.9117 | 0.2495 | 0.9117 | 0.9548 |
+ | No log | 44.0 | 132 | 0.8953 | 0.2495 | 0.8953 | 0.9462 |
+ | No log | 44.6667 | 134 | 0.8936 | 0.2681 | 0.8936 | 0.9453 |
+ | No log | 45.3333 | 136 | 0.9254 | 0.2043 | 0.9254 | 0.9620 |
+ | No log | 46.0 | 138 | 0.9858 | 0.1846 | 0.9858 | 0.9929 |
+ | No log | 46.6667 | 140 | 0.9915 | 0.1814 | 0.9915 | 0.9957 |
+ | No log | 47.3333 | 142 | 1.0114 | 0.1446 | 1.0114 | 1.0057 |
+ | No log | 48.0 | 144 | 1.0252 | 0.1725 | 1.0252 | 1.0125 |
+ | No log | 48.6667 | 146 | 1.0566 | 0.1401 | 1.0566 | 1.0279 |
+ | No log | 49.3333 | 148 | 1.0272 | 0.1725 | 1.0272 | 1.0135 |
+ | No log | 50.0 | 150 | 1.0131 | 0.1725 | 1.0131 | 1.0065 |
+ | No log | 50.6667 | 152 | 0.9593 | 0.0685 | 0.9593 | 0.9795 |
+ | No log | 51.3333 | 154 | 0.9186 | 0.1253 | 0.9186 | 0.9584 |
+ | No log | 52.0 | 156 | 0.9174 | 0.1219 | 0.9174 | 0.9578 |
+ | No log | 52.6667 | 158 | 0.9301 | 0.1253 | 0.9301 | 0.9644 |
+ | No log | 53.3333 | 160 | 0.9585 | 0.0325 | 0.9585 | 0.9790 |
+ | No log | 54.0 | 162 | 0.9754 | 0.0685 | 0.9754 | 0.9876 |
+ | No log | 54.6667 | 164 | 0.9639 | 0.0912 | 0.9639 | 0.9818 |
+ | No log | 55.3333 | 166 | 0.9458 | 0.1871 | 0.9458 | 0.9725 |
+ | No log | 56.0 | 168 | 0.9432 | 0.1273 | 0.9432 | 0.9712 |
+ | No log | 56.6667 | 170 | 0.9894 | 0.1682 | 0.9894 | 0.9947 |
+ | No log | 57.3333 | 172 | 1.0290 | 0.1395 | 1.0290 | 1.0144 |
+ | No log | 58.0 | 174 | 1.0209 | 0.1395 | 1.0209 | 1.0104 |
+ | No log | 58.6667 | 176 | 0.9581 | 0.1682 | 0.9581 | 0.9788 |
+ | No log | 59.3333 | 178 | 0.9120 | 0.2203 | 0.9120 | 0.9550 |
+ | No log | 60.0 | 180 | 0.9185 | 0.2109 | 0.9185 | 0.9584 |
+ | No log | 60.6667 | 182 | 0.9187 | 0.2077 | 0.9187 | 0.9585 |
+ | No log | 61.3333 | 184 | 0.9137 | 0.1820 | 0.9137 | 0.9559 |
+ | No log | 62.0 | 186 | 0.9230 | 0.1888 | 0.9230 | 0.9607 |
+ | No log | 62.6667 | 188 | 0.9289 | 0.2486 | 0.9289 | 0.9638 |
+ | No log | 63.3333 | 190 | 0.9298 | 0.2192 | 0.9298 | 0.9643 |
+ | No log | 64.0 | 192 | 0.9246 | 0.2192 | 0.9246 | 0.9616 |
+ | No log | 64.6667 | 194 | 0.9413 | 0.2192 | 0.9413 | 0.9702 |
+ | No log | 65.3333 | 196 | 0.9668 | 0.1600 | 0.9668 | 0.9833 |
+ | No log | 66.0 | 198 | 0.9989 | 0.1600 | 0.9989 | 0.9994 |
+ | No log | 66.6667 | 200 | 0.9965 | 0.1600 | 0.9965 | 0.9983 |
+ | No log | 67.3333 | 202 | 0.9830 | 0.1600 | 0.9830 | 0.9914 |
+ | No log | 68.0 | 204 | 0.9687 | 0.1785 | 0.9687 | 0.9842 |
+ | No log | 68.6667 | 206 | 0.9718 | 0.2102 | 0.9718 | 0.9858 |
+ | No log | 69.3333 | 208 | 0.9771 | 0.2053 | 0.9771 | 0.9885 |
+ | No log | 70.0 | 210 | 0.9827 | 0.1825 | 0.9827 | 0.9913 |
+ | No log | 70.6667 | 212 | 0.9785 | 0.1825 | 0.9785 | 0.9892 |
+ | No log | 71.3333 | 214 | 0.9779 | 0.0627 | 0.9779 | 0.9889 |
+ | No log | 72.0 | 216 | 0.9670 | 0.0635 | 0.9670 | 0.9834 |
+ | No log | 72.6667 | 218 | 0.9447 | 0.0627 | 0.9447 | 0.9719 |
+ | No log | 73.3333 | 220 | 0.9350 | 0.0275 | 0.9350 | 0.9670 |
+ | No log | 74.0 | 222 | 0.9266 | 0.0602 | 0.9266 | 0.9626 |
+ | No log | 74.6667 | 224 | 0.9231 | 0.1506 | 0.9231 | 0.9608 |
+ | No log | 75.3333 | 226 | 0.9273 | 0.1179 | 0.9273 | 0.9630 |
+ | No log | 76.0 | 228 | 0.9238 | 0.1506 | 0.9238 | 0.9611 |
+ | No log | 76.6667 | 230 | 0.9227 | 0.0602 | 0.9227 | 0.9606 |
+ | No log | 77.3333 | 232 | 0.9349 | 0.0640 | 0.9349 | 0.9669 |
+ | No log | 78.0 | 234 | 0.9516 | 0.0692 | 0.9516 | 0.9755 |
+ | No log | 78.6667 | 236 | 0.9618 | 0.0692 | 0.9618 | 0.9807 |
+ | No log | 79.3333 | 238 | 0.9593 | 0.0692 | 0.9593 | 0.9795 |
+ | No log | 80.0 | 240 | 0.9582 | 0.0339 | 0.9582 | 0.9789 |
+ | No log | 80.6667 | 242 | 0.9522 | 0.0640 | 0.9522 | 0.9758 |
+ | No log | 81.3333 | 244 | 0.9438 | 0.0640 | 0.9438 | 0.9715 |
+ | No log | 82.0 | 246 | 0.9396 | 0.0640 | 0.9396 | 0.9693 |
+ | No log | 82.6667 | 248 | 0.9340 | 0.0640 | 0.9340 | 0.9665 |
+ | No log | 83.3333 | 250 | 0.9324 | 0.0912 | 0.9324 | 0.9656 |
+ | No log | 84.0 | 252 | 0.9336 | 0.1246 | 0.9336 | 0.9662 |
+ | No log | 84.6667 | 254 | 0.9296 | 0.1825 | 0.9296 | 0.9642 |
+ | No log | 85.3333 | 256 | 0.9219 | 0.1820 | 0.9219 | 0.9602 |
+ | No log | 86.0 | 258 | 0.9108 | 0.1487 | 0.9108 | 0.9543 |
+ | No log | 86.6667 | 260 | 0.8961 | 0.1487 | 0.8961 | 0.9466 |
+ | No log | 87.3333 | 262 | 0.8845 | 0.1820 | 0.8845 | 0.9405 |
+ | No log | 88.0 | 264 | 0.8800 | 0.2838 | 0.8800 | 0.9381 |
+ | No log | 88.6667 | 266 | 0.8824 | 0.2203 | 0.8824 | 0.9394 |
+ | No log | 89.3333 | 268 | 0.8902 | 0.1661 | 0.8902 | 0.9435 |
+ | No log | 90.0 | 270 | 0.9027 | 0.1016 | 0.9027 | 0.9501 |
+ | No log | 90.6667 | 272 | 0.9134 | 0.0982 | 0.9134 | 0.9557 |
+ | No log | 91.3333 | 274 | 0.9268 | 0.1307 | 0.9268 | 0.9627 |
+ | No log | 92.0 | 276 | 0.9343 | 0.1307 | 0.9343 | 0.9666 |
+ | No log | 92.6667 | 278 | 0.9366 | 0.1307 | 0.9366 | 0.9678 |
+ | No log | 93.3333 | 280 | 0.9405 | 0.1307 | 0.9405 | 0.9698 |
+ | No log | 94.0 | 282 | 0.9455 | 0.1307 | 0.9455 | 0.9724 |
+ | No log | 94.6667 | 284 | 0.9463 | 0.1307 | 0.9463 | 0.9728 |
+ | No log | 95.3333 | 286 | 0.9443 | 0.1307 | 0.9443 | 0.9718 |
+ | No log | 96.0 | 288 | 0.9403 | 0.1307 | 0.9403 | 0.9697 |
+ | No log | 96.6667 | 290 | 0.9371 | 0.1307 | 0.9371 | 0.9681 |
+ | No log | 97.3333 | 292 | 0.9352 | 0.1307 | 0.9352 | 0.9670 |
+ | No log | 98.0 | 294 | 0.9332 | 0.1307 | 0.9332 | 0.9660 |
+ | No log | 98.6667 | 296 | 0.9321 | 0.1307 | 0.9321 | 0.9655 |
+ | No log | 99.3333 | 298 | 0.9318 | 0.0982 | 0.9318 | 0.9653 |
+ | No log | 100.0 | 300 | 0.9319 | 0.0982 | 0.9319 | 0.9653 |
+ 
+ 
+ ### Framework versions
+ 
+ - Transformers 4.44.2
+ - Pytorch 2.4.0+cu118
+ - Datasets 2.21.0
+ - Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "aubmindlab/bert-base-arabertv02",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 64000
+ }
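With a single `id2label` entry (so `num_labels` = 1) and `problem_type` set to `"regression"`, `BertForSequenceClassification` emits one logit and the `Trainer` scores it with MSE loss, which is why the card's Loss and Mse columns coincide. A minimal sketch of recreating an equivalent head from the base model:

```python
from transformers import AutoConfig, AutoModelForSequenceClassification

# Sketch: rebuild a fresh regression head on the base model (untrained head weights).
config = AutoConfig.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1, problem_type="regression"
)
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", config=config
)
print(model.num_labels, model.config.problem_type)  # -> 1 regression
```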
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b9b1864b4552a2c3915e56373ea5be8820c02e8a9f68e889640f10ee646cec8e
+ size 540799996
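This is a Git LFS pointer rather than the weights themselves; the 540,799,996-byte payload is consistent with roughly 135M float32 parameters (540,799,996 / 4 ≈ 135.2M), plausible for AraBERTv02-base with its 64k vocabulary. A sketch of fetching the actual file, with the repo id assumed as in the inference example above:

```python
from huggingface_hub import hf_hub_download

# Assumed repo id; Git LFS resolution happens server-side on the Hub.
path = hf_hub_download(
    repo_id="MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k1_task7_organization",
    filename="model.safetensors",
)
print(path)
```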
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2b0917f95c6e1ab38b02d538f98a4bd73f2130c7cc9b6e9b7f75c01db18152c0
+ size 5368