# ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k5_task2_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.0648
- Qwk: 0.5132
- Mse: 1.0648
- Rmse: 1.0319
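The loss reported above equals the MSE, and Qwk is the quadratic weighted kappa. The exact evaluation script is not included in this card; as a rough sketch, the metrics could be reproduced along these lines, where `y_true` and `y_pred` are hypothetical arrays of gold scores and model outputs, and the assumed score range is an illustration only:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def score_metrics(y_true, y_pred, low=0, high=5):
    """Sketch of the reported metrics (MSE, RMSE, QWK).

    y_true: integer gold scores; y_pred: continuous model outputs.
    The score range (low/high) is an assumption, not taken from the card.
    """
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    # QWK needs discrete labels, so round and clip the predictions first.
    y_round = np.clip(np.rint(y_pred), low, high).astype(int)
    qwk = cohen_kappa_score(y_true, y_round, weights="quadratic")
    return {"mse": mse, "rmse": rmse, "qwk": qwk}
```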
## Model description
More information needed
## Intended uses & limitations
More information needed
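Pending a fuller description, the snippet below is a minimal inference sketch. It assumes the checkpoint exposes a single-output, regression-style sequence-classification head (as the MSE/RMSE metrics suggest) and can be loaded through the standard `AutoModelForSequenceClassification` API; the input text is a placeholder.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k5_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # placeholder: an Arabic text to be scored for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

# Assuming a single regression output, the predicted score is the raw logit.
score = logits.squeeze().item()
print(score)
```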
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
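The training script itself is not included in this card. As an illustration only, the listed settings roughly correspond to the following `TrainingArguments`; the `output_dir` is a placeholder, not taken from the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task2-organization",  # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    # Adam settings match the values listed above (the Transformers defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```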
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.0714 | 2 | 4.1340 | -0.0134 | 4.1340 | 2.0332 |
| No log | 0.1429 | 4 | 2.1800 | 0.0424 | 2.1800 | 1.4765 |
| No log | 0.2143 | 6 | 1.2438 | 0.0424 | 1.2438 | 1.1153 |
| No log | 0.2857 | 8 | 0.8558 | -0.1126 | 0.8558 | 0.9251 |
| No log | 0.3571 | 10 | 0.7392 | 0.2239 | 0.7392 | 0.8598 |
| No log | 0.4286 | 12 | 0.7716 | 0.1352 | 0.7716 | 0.8784 |
| No log | 0.5 | 14 | 0.7781 | 0.1442 | 0.7781 | 0.8821 |
| No log | 0.5714 | 16 | 0.7345 | 0.1585 | 0.7345 | 0.8570 |
| No log | 0.6429 | 18 | 0.6956 | 0.2069 | 0.6956 | 0.8340 |
| No log | 0.7143 | 20 | 0.7384 | 0.2292 | 0.7384 | 0.8593 |
| No log | 0.7857 | 22 | 0.7074 | 0.2247 | 0.7074 | 0.8411 |
| No log | 0.8571 | 24 | 0.6846 | 0.1977 | 0.6846 | 0.8274 |
| No log | 0.9286 | 26 | 0.7121 | 0.1918 | 0.7121 | 0.8439 |
| No log | 1.0 | 28 | 0.8262 | 0.1499 | 0.8262 | 0.9089 |
| No log | 1.0714 | 30 | 0.8814 | 0.1552 | 0.8814 | 0.9388 |
| No log | 1.1429 | 32 | 0.8592 | 0.2708 | 0.8592 | 0.9269 |
| No log | 1.2143 | 34 | 0.7440 | 0.2985 | 0.7440 | 0.8625 |
| No log | 1.2857 | 36 | 0.6508 | 0.2792 | 0.6508 | 0.8067 |
| No log | 1.3571 | 38 | 0.6081 | 0.3344 | 0.6081 | 0.7798 |
| No log | 1.4286 | 40 | 0.6143 | 0.3401 | 0.6143 | 0.7838 |
| No log | 1.5 | 42 | 0.6131 | 0.3547 | 0.6131 | 0.7830 |
| No log | 1.5714 | 44 | 0.5751 | 0.3548 | 0.5751 | 0.7583 |
| No log | 1.6429 | 46 | 0.5981 | 0.4481 | 0.5981 | 0.7734 |
| No log | 1.7143 | 48 | 0.6407 | 0.4145 | 0.6407 | 0.8005 |
| No log | 1.7857 | 50 | 0.7284 | 0.4531 | 0.7284 | 0.8534 |
| No log | 1.8571 | 52 | 0.7053 | 0.4555 | 0.7053 | 0.8398 |
| No log | 1.9286 | 54 | 0.6466 | 0.5027 | 0.6466 | 0.8041 |
| No log | 2.0 | 56 | 0.8035 | 0.4199 | 0.8035 | 0.8964 |
| No log | 2.0714 | 58 | 0.8386 | 0.4169 | 0.8386 | 0.9158 |
| No log | 2.1429 | 60 | 0.6310 | 0.4955 | 0.6310 | 0.7943 |
| No log | 2.2143 | 62 | 0.7127 | 0.4876 | 0.7127 | 0.8442 |
| No log | 2.2857 | 64 | 0.7533 | 0.5 | 0.7533 | 0.8679 |
| No log | 2.3571 | 66 | 0.7081 | 0.5436 | 0.7081 | 0.8415 |
| No log | 2.4286 | 68 | 0.6758 | 0.5392 | 0.6758 | 0.8221 |
| No log | 2.5 | 70 | 0.7218 | 0.4899 | 0.7218 | 0.8496 |
| No log | 2.5714 | 72 | 0.7427 | 0.5064 | 0.7427 | 0.8618 |
| No log | 2.6429 | 74 | 0.6951 | 0.5648 | 0.6951 | 0.8337 |
| No log | 2.7143 | 76 | 0.8836 | 0.5020 | 0.8836 | 0.9400 |
| No log | 2.7857 | 78 | 1.0525 | 0.4567 | 1.0525 | 1.0259 |
| No log | 2.8571 | 80 | 0.9715 | 0.4927 | 0.9715 | 0.9856 |
| No log | 2.9286 | 82 | 0.8462 | 0.5327 | 0.8462 | 0.9199 |
| No log | 3.0 | 84 | 0.8607 | 0.5567 | 0.8607 | 0.9278 |
| No log | 3.0714 | 86 | 0.9001 | 0.5543 | 0.9001 | 0.9487 |
| No log | 3.1429 | 88 | 0.9209 | 0.5357 | 0.9209 | 0.9596 |
| No log | 3.2143 | 90 | 0.9739 | 0.5485 | 0.9739 | 0.9869 |
| No log | 3.2857 | 92 | 0.9868 | 0.5200 | 0.9868 | 0.9934 |
| No log | 3.3571 | 94 | 0.9818 | 0.5419 | 0.9818 | 0.9909 |
| No log | 3.4286 | 96 | 0.9909 | 0.5306 | 0.9909 | 0.9954 |
| No log | 3.5 | 98 | 1.0381 | 0.5101 | 1.0380 | 1.0188 |
| No log | 3.5714 | 100 | 1.1269 | 0.4736 | 1.1269 | 1.0616 |
| No log | 3.6429 | 102 | 1.1923 | 0.4646 | 1.1923 | 1.0919 |
| No log | 3.7143 | 104 | 1.2656 | 0.4456 | 1.2656 | 1.1250 |
| No log | 3.7857 | 106 | 1.2508 | 0.4393 | 1.2508 | 1.1184 |
| No log | 3.8571 | 108 | 1.1542 | 0.4866 | 1.1542 | 1.0744 |
| No log | 3.9286 | 110 | 1.1126 | 0.4768 | 1.1126 | 1.0548 |
| No log | 4.0 | 112 | 1.0654 | 0.5073 | 1.0654 | 1.0322 |
| No log | 4.0714 | 114 | 1.0520 | 0.5093 | 1.0520 | 1.0257 |
| No log | 4.1429 | 116 | 1.0406 | 0.4969 | 1.0406 | 1.0201 |
| No log | 4.2143 | 118 | 1.0833 | 0.4984 | 1.0833 | 1.0408 |
| No log | 4.2857 | 120 | 1.0952 | 0.4943 | 1.0952 | 1.0465 |
| No log | 4.3571 | 122 | 1.0969 | 0.5042 | 1.0969 | 1.0473 |
| No log | 4.4286 | 124 | 1.0632 | 0.5188 | 1.0632 | 1.0311 |
| No log | 4.5 | 126 | 1.0579 | 0.4922 | 1.0579 | 1.0286 |
| No log | 4.5714 | 128 | 1.0752 | 0.5016 | 1.0752 | 1.0369 |
| No log | 4.6429 | 130 | 1.1085 | 0.5051 | 1.1085 | 1.0528 |
| No log | 4.7143 | 132 | 1.1296 | 0.5232 | 1.1296 | 1.0628 |
| No log | 4.7857 | 134 | 1.1931 | 0.4840 | 1.1931 | 1.0923 |
| No log | 4.8571 | 136 | 1.2428 | 0.4880 | 1.2428 | 1.1148 |
| No log | 4.9286 | 138 | 1.2794 | 0.4578 | 1.2794 | 1.1311 |
| No log | 5.0 | 140 | 1.2919 | 0.4456 | 1.2919 | 1.1366 |
| No log | 5.0714 | 142 | 1.2849 | 0.4598 | 1.2849 | 1.1335 |
| No log | 5.1429 | 144 | 1.2694 | 0.4757 | 1.2694 | 1.1267 |
| No log | 5.2143 | 146 | 1.2888 | 0.4671 | 1.2888 | 1.1353 |
| No log | 5.2857 | 148 | 1.3187 | 0.4722 | 1.3187 | 1.1483 |
| No log | 5.3571 | 150 | 1.3693 | 0.4502 | 1.3693 | 1.1702 |
| No log | 5.4286 | 152 | 1.3452 | 0.4459 | 1.3452 | 1.1598 |
| No log | 5.5 | 154 | 1.2316 | 0.4659 | 1.2316 | 1.1098 |
| No log | 5.5714 | 156 | 1.1410 | 0.4983 | 1.1410 | 1.0682 |
| No log | 5.6429 | 158 | 1.1304 | 0.5249 | 1.1304 | 1.0632 |
| No log | 5.7143 | 160 | 1.2080 | 0.4854 | 1.2080 | 1.0991 |
| No log | 5.7857 | 162 | 1.2421 | 0.4563 | 1.2421 | 1.1145 |
| No log | 5.8571 | 164 | 1.2036 | 0.4795 | 1.2036 | 1.0971 |
| No log | 5.9286 | 166 | 1.1568 | 0.4825 | 1.1568 | 1.0756 |
| No log | 6.0 | 168 | 1.0890 | 0.5289 | 1.0890 | 1.0435 |
| No log | 6.0714 | 170 | 1.0702 | 0.4941 | 1.0702 | 1.0345 |
| No log | 6.1429 | 172 | 1.1391 | 0.5139 | 1.1391 | 1.0673 |
| No log | 6.2143 | 174 | 1.2273 | 0.4860 | 1.2273 | 1.1078 |
| No log | 6.2857 | 176 | 1.2473 | 0.4816 | 1.2473 | 1.1168 |
| No log | 6.3571 | 178 | 1.2915 | 0.4677 | 1.2915 | 1.1364 |
| No log | 6.4286 | 180 | 1.3069 | 0.4765 | 1.3069 | 1.1432 |
| No log | 6.5 | 182 | 1.2833 | 0.4788 | 1.2833 | 1.1328 |
| No log | 6.5714 | 184 | 1.3059 | 0.4815 | 1.3059 | 1.1428 |
| No log | 6.6429 | 186 | 1.3572 | 0.4545 | 1.3572 | 1.1650 |
| No log | 6.7143 | 188 | 1.3305 | 0.4672 | 1.3305 | 1.1535 |
| No log | 6.7857 | 190 | 1.2803 | 0.4773 | 1.2803 | 1.1315 |
| No log | 6.8571 | 192 | 1.2648 | 0.4858 | 1.2648 | 1.1246 |
| No log | 6.9286 | 194 | 1.2740 | 0.4762 | 1.2740 | 1.1287 |
| No log | 7.0 | 196 | 1.2837 | 0.4669 | 1.2837 | 1.1330 |
| No log | 7.0714 | 198 | 1.2661 | 0.4793 | 1.2661 | 1.1252 |
| No log | 7.1429 | 200 | 1.2081 | 0.4966 | 1.2081 | 1.0992 |
| No log | 7.2143 | 202 | 1.1781 | 0.5056 | 1.1781 | 1.0854 |
| No log | 7.2857 | 204 | 1.1332 | 0.4838 | 1.1332 | 1.0645 |
| No log | 7.3571 | 206 | 1.1234 | 0.5107 | 1.1234 | 1.0599 |
| No log | 7.4286 | 208 | 1.1504 | 0.4917 | 1.1504 | 1.0726 |
| No log | 7.5 | 210 | 1.1706 | 0.5006 | 1.1706 | 1.0819 |
| No log | 7.5714 | 212 | 1.1806 | 0.4963 | 1.1806 | 1.0866 |
| No log | 7.6429 | 214 | 1.1888 | 0.4957 | 1.1888 | 1.0903 |
| No log | 7.7143 | 216 | 1.2087 | 0.4719 | 1.2087 | 1.0994 |
| No log | 7.7857 | 218 | 1.2207 | 0.4623 | 1.2207 | 1.1049 |
| No log | 7.8571 | 220 | 1.1893 | 0.4983 | 1.1893 | 1.0905 |
| No log | 7.9286 | 222 | 1.1220 | 0.5066 | 1.1220 | 1.0592 |
| No log | 8.0 | 224 | 1.0648 | 0.5113 | 1.0648 | 1.0319 |
| No log | 8.0714 | 226 | 1.0423 | 0.4855 | 1.0423 | 1.0209 |
| No log | 8.1429 | 228 | 1.0453 | 0.4918 | 1.0453 | 1.0224 |
| No log | 8.2143 | 230 | 1.0539 | 0.4803 | 1.0539 | 1.0266 |
| No log | 8.2857 | 232 | 1.0752 | 0.4983 | 1.0752 | 1.0369 |
| No log | 8.3571 | 234 | 1.1259 | 0.5077 | 1.1259 | 1.0611 |
| No log | 8.4286 | 236 | 1.1759 | 0.4974 | 1.1759 | 1.0844 |
| No log | 8.5 | 238 | 1.2142 | 0.4902 | 1.2142 | 1.1019 |
| No log | 8.5714 | 240 | 1.2268 | 0.4909 | 1.2268 | 1.1076 |
| No log | 8.6429 | 242 | 1.2210 | 0.4909 | 1.2210 | 1.1050 |
| No log | 8.7143 | 244 | 1.2070 | 0.4974 | 1.2070 | 1.0987 |
| No log | 8.7857 | 246 | 1.1861 | 0.4937 | 1.1861 | 1.0891 |
| No log | 8.8571 | 248 | 1.1622 | 0.5002 | 1.1622 | 1.0780 |
| No log | 8.9286 | 250 | 1.1338 | 0.4913 | 1.1338 | 1.0648 |
| No log | 9.0 | 252 | 1.1136 | 0.5059 | 1.1136 | 1.0553 |
| No log | 9.0714 | 254 | 1.1060 | 0.5059 | 1.1060 | 1.0517 |
| No log | 9.1429 | 256 | 1.1095 | 0.4966 | 1.1095 | 1.0533 |
| No log | 9.2143 | 258 | 1.1074 | 0.5080 | 1.1074 | 1.0524 |
| No log | 9.2857 | 260 | 1.1043 | 0.5080 | 1.1043 | 1.0508 |
| No log | 9.3571 | 262 | 1.0967 | 0.5080 | 1.0967 | 1.0472 |
| No log | 9.4286 | 264 | 1.0881 | 0.5175 | 1.0881 | 1.0431 |
| No log | 9.5 | 266 | 1.0793 | 0.5048 | 1.0793 | 1.0389 |
| No log | 9.5714 | 268 | 1.0760 | 0.5048 | 1.0760 | 1.0373 |
| No log | 9.6429 | 270 | 1.0724 | 0.5048 | 1.0724 | 1.0356 |
| No log | 9.7143 | 272 | 1.0661 | 0.5015 | 1.0661 | 1.0325 |
| No log | 9.7857 | 274 | 1.0635 | 0.5015 | 1.0635 | 1.0312 |
| No log | 9.8571 | 276 | 1.0634 | 0.5139 | 1.0634 | 1.0312 |
| No log | 9.9286 | 278 | 1.0640 | 0.5132 | 1.0640 | 1.0315 |
| No log | 10.0 | 280 | 1.0648 | 0.5132 | 1.0648 | 1.0319 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1