# ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k7_task3_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.5337
- Qwk: 0.4872
- Mse: 0.5337
- Rmse: 0.7306
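The three metrics above are related: RMSE is the square root of MSE (which is why the Loss and Mse rows match when MSE is the training loss), and Qwk is Cohen's kappa with quadratic weights, a standard agreement measure for ordinal scoring tasks. A minimal sketch of how they are computed, using scikit-learn on made-up labels (not the actual evaluation set):

```python
# Hedged sketch: how the card's metrics relate, on hypothetical labels.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 1, 0, 3, 2])  # hypothetical gold scores
y_pred = np.array([0, 1, 1, 2, 2, 0, 3, 3])  # hypothetical model scores

# Qwk: Cohen's kappa with quadratic weights (penalizes large ordinal errors more)
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# Mse, and Rmse as its square root, exactly as in the results table
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(round(qwk, 4), round(mse, 4), round(rmse, 4))  # → 0.8333 0.375 0.6124
```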
## Model description
More information needed
## Intended uses & limitations
More information needed
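Since usage is not documented, the sketch below shows one plausible way to load the checkpoint for inference. It assumes the repo id shown on this card and that the classification head is a single-logit regression scorer (suggested by the MSE-style loss); neither assumption is confirmed by the card.

```python
# Hedged sketch: loading the checkpoint for inference.
# MODEL_ID and the single-logit regression head are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k7_task3_organization"

def score_text(text: str) -> float:
    """Download the fine-tuned checkpoint and return its raw score for `text`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    model.eval()
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumes a single regression logit; adjust if the head is multi-class.
    return logits.squeeze().item()
```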
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.0588 | 2 | 3.0234 | 0.0264 | 3.0234 | 1.7388 |
| No log | 0.1176 | 4 | 1.7324 | 0.0390 | 1.7324 | 1.3162 |
| No log | 0.1765 | 6 | 0.8389 | 0.1588 | 0.8389 | 0.9159 |
| No log | 0.2353 | 8 | 0.6543 | 0.0685 | 0.6543 | 0.8089 |
| No log | 0.2941 | 10 | 0.5831 | 0.0569 | 0.5831 | 0.7636 |
| No log | 0.3529 | 12 | 0.5741 | 0.0569 | 0.5741 | 0.7577 |
| No log | 0.4118 | 14 | 0.5778 | 0.0569 | 0.5778 | 0.7601 |
| No log | 0.4706 | 16 | 0.5909 | 0.0569 | 0.5909 | 0.7687 |
| No log | 0.5294 | 18 | 0.8813 | 0.0333 | 0.8813 | 0.9388 |
| No log | 0.5882 | 20 | 0.6946 | -0.0314 | 0.6946 | 0.8334 |
| No log | 0.6471 | 22 | 0.6066 | 0.0909 | 0.6066 | 0.7788 |
| No log | 0.7059 | 24 | 0.5593 | 0.0 | 0.5593 | 0.7479 |
| No log | 0.7647 | 26 | 0.5900 | 0.0569 | 0.5900 | 0.7681 |
| No log | 0.8235 | 28 | 0.7008 | 0.1030 | 0.7008 | 0.8372 |
| No log | 0.8824 | 30 | 0.9765 | 0.0388 | 0.9765 | 0.9882 |
| No log | 0.9412 | 32 | 0.9335 | 0.1111 | 0.9335 | 0.9662 |
| No log | 1.0 | 34 | 0.7303 | 0.2146 | 0.7303 | 0.8546 |
| No log | 1.0588 | 36 | 0.6368 | 0.2485 | 0.6368 | 0.7980 |
| No log | 1.1176 | 38 | 0.8475 | 0.1392 | 0.8475 | 0.9206 |
| No log | 1.1765 | 40 | 0.9464 | 0.1545 | 0.9464 | 0.9728 |
| No log | 1.2353 | 42 | 0.6208 | 0.0617 | 0.6208 | 0.7879 |
| No log | 1.2941 | 44 | 0.5881 | 0.0071 | 0.5881 | 0.7669 |
| No log | 1.3529 | 46 | 0.6369 | 0.1282 | 0.6369 | 0.7981 |
| No log | 1.4118 | 48 | 0.8035 | 0.1179 | 0.8035 | 0.8964 |
| No log | 1.4706 | 50 | 0.6446 | 0.0769 | 0.6446 | 0.8029 |
| No log | 1.5294 | 52 | 0.6254 | 0.0476 | 0.6254 | 0.7908 |
| No log | 1.5882 | 54 | 0.6665 | 0.0476 | 0.6665 | 0.8164 |
| No log | 1.6471 | 56 | 0.6407 | 0.0388 | 0.6407 | 0.8004 |
| No log | 1.7059 | 58 | 0.7560 | 0.1145 | 0.7560 | 0.8695 |
| No log | 1.7647 | 60 | 0.7645 | 0.0769 | 0.7645 | 0.8743 |
| No log | 1.8235 | 62 | 0.6834 | 0.1145 | 0.6834 | 0.8267 |
| No log | 1.8824 | 64 | 0.6563 | 0.1617 | 0.6563 | 0.8101 |
| No log | 1.9412 | 66 | 0.7418 | 0.2000 | 0.7418 | 0.8613 |
| No log | 2.0 | 68 | 0.5891 | 0.1565 | 0.5891 | 0.7675 |
| No log | 2.0588 | 70 | 0.5653 | 0.1565 | 0.5653 | 0.7519 |
| No log | 2.1176 | 72 | 0.5768 | 0.1565 | 0.5768 | 0.7594 |
| No log | 2.1765 | 74 | 0.5669 | 0.0534 | 0.5669 | 0.7529 |
| No log | 2.2353 | 76 | 0.5899 | 0.0 | 0.5899 | 0.7681 |
| No log | 2.2941 | 78 | 0.6124 | -0.0081 | 0.6124 | 0.7826 |
| No log | 2.3529 | 80 | 0.5983 | 0.1788 | 0.5983 | 0.7735 |
| No log | 2.4118 | 82 | 0.7163 | 0.2390 | 0.7163 | 0.8464 |
| No log | 2.4706 | 84 | 0.6778 | 0.3224 | 0.6778 | 0.8233 |
| No log | 2.5294 | 86 | 0.9241 | 0.0539 | 0.9241 | 0.9613 |
| No log | 2.5882 | 88 | 1.0515 | 0.0222 | 1.0515 | 1.0254 |
| No log | 2.6471 | 90 | 1.0482 | 0.0222 | 1.0482 | 1.0238 |
| No log | 2.7059 | 92 | 0.9215 | 0.0279 | 0.9215 | 0.9600 |
| No log | 2.7647 | 94 | 0.8859 | 0.0871 | 0.8859 | 0.9412 |
| No log | 2.8235 | 96 | 0.6615 | 0.3161 | 0.6615 | 0.8133 |
| No log | 2.8824 | 98 | 0.5777 | 0.2914 | 0.5777 | 0.7601 |
| No log | 2.9412 | 100 | 0.6151 | 0.3333 | 0.6151 | 0.7843 |
| No log | 3.0 | 102 | 0.5805 | 0.4157 | 0.5805 | 0.7619 |
| No log | 3.0588 | 104 | 0.5755 | 0.3371 | 0.5755 | 0.7586 |
| No log | 3.1176 | 106 | 0.5445 | 0.3953 | 0.5445 | 0.7379 |
| No log | 3.1765 | 108 | 0.5909 | 0.3488 | 0.5909 | 0.7687 |
| No log | 3.2353 | 110 | 0.7475 | 0.2762 | 0.7475 | 0.8646 |
| No log | 3.2941 | 112 | 0.6556 | 0.3371 | 0.6556 | 0.8097 |
| No log | 3.3529 | 114 | 0.5010 | 0.2000 | 0.5010 | 0.7078 |
| No log | 3.4118 | 116 | 0.5359 | 0.2471 | 0.5359 | 0.7321 |
| No log | 3.4706 | 118 | 0.4966 | 0.2000 | 0.4966 | 0.7047 |
| No log | 3.5294 | 120 | 0.5086 | 0.1008 | 0.5086 | 0.7131 |
| No log | 3.5882 | 122 | 0.6008 | 0.0720 | 0.6008 | 0.7751 |
| No log | 3.6471 | 124 | 0.6364 | 0.3032 | 0.6364 | 0.7978 |
| No log | 3.7059 | 126 | 0.5422 | 0.2778 | 0.5422 | 0.7363 |
| No log | 3.7647 | 128 | 0.5321 | 0.2771 | 0.5321 | 0.7294 |
| No log | 3.8235 | 130 | 0.5728 | 0.2809 | 0.5728 | 0.7569 |
| No log | 3.8824 | 132 | 0.5373 | 0.2208 | 0.5373 | 0.7330 |
| No log | 3.9412 | 134 | 0.6874 | 0.2653 | 0.6874 | 0.8291 |
| No log | 4.0 | 136 | 0.8265 | 0.1203 | 0.8265 | 0.9091 |
| No log | 4.0588 | 138 | 0.8199 | 0.1203 | 0.8199 | 0.9055 |
| No log | 4.1176 | 140 | 0.6688 | 0.3301 | 0.6688 | 0.8178 |
| No log | 4.1765 | 142 | 0.6586 | 0.4502 | 0.6586 | 0.8116 |
| No log | 4.2353 | 144 | 0.7813 | 0.2771 | 0.7813 | 0.8839 |
| No log | 4.2941 | 146 | 0.8171 | 0.2258 | 0.8171 | 0.9040 |
| No log | 4.3529 | 148 | 0.7151 | 0.2821 | 0.7151 | 0.8456 |
| No log | 4.4118 | 150 | 0.5942 | 0.5025 | 0.5942 | 0.7709 |
| No log | 4.4706 | 152 | 0.5823 | 0.4694 | 0.5823 | 0.7631 |
| No log | 4.5294 | 154 | 0.6861 | 0.2632 | 0.6861 | 0.8283 |
| No log | 4.5882 | 156 | 0.7854 | 0.0769 | 0.7854 | 0.8862 |
| No log | 4.6471 | 158 | 0.7431 | 0.2632 | 0.7431 | 0.8620 |
| No log | 4.7059 | 160 | 0.5966 | 0.4563 | 0.5966 | 0.7724 |
| No log | 4.7647 | 162 | 0.5873 | 0.3548 | 0.5873 | 0.7664 |
| No log | 4.8235 | 164 | 0.5874 | 0.3535 | 0.5874 | 0.7664 |
| No log | 4.8824 | 166 | 0.6159 | 0.4178 | 0.6159 | 0.7848 |
| No log | 4.9412 | 168 | 0.8464 | 0.2558 | 0.8464 | 0.9200 |
| No log | 5.0 | 170 | 0.8262 | 0.2548 | 0.8262 | 0.9089 |
| No log | 5.0588 | 172 | 0.6845 | 0.4530 | 0.6845 | 0.8274 |
| No log | 5.1176 | 174 | 0.5946 | 0.4340 | 0.5946 | 0.7711 |
| No log | 5.1765 | 176 | 0.6073 | 0.4286 | 0.6073 | 0.7793 |
| No log | 5.2353 | 178 | 0.6742 | 0.3778 | 0.6742 | 0.8211 |
| No log | 5.2941 | 180 | 0.7981 | 0.2340 | 0.7981 | 0.8934 |
| No log | 5.3529 | 182 | 0.7276 | 0.3537 | 0.7276 | 0.8530 |
| No log | 5.4118 | 184 | 0.5963 | 0.4019 | 0.5963 | 0.7722 |
| No log | 5.4706 | 186 | 0.5761 | 0.4019 | 0.5761 | 0.7590 |
| No log | 5.5294 | 188 | 0.5715 | 0.4643 | 0.5715 | 0.7560 |
| No log | 5.5882 | 190 | 0.6294 | 0.3846 | 0.6294 | 0.7934 |
| No log | 5.6471 | 192 | 0.7923 | 0.2281 | 0.7923 | 0.8901 |
| No log | 5.7059 | 194 | 0.7294 | 0.2281 | 0.7294 | 0.8540 |
| No log | 5.7647 | 196 | 0.5758 | 0.3535 | 0.5758 | 0.7588 |
| No log | 5.8235 | 198 | 0.5046 | 0.5169 | 0.5046 | 0.7104 |
| No log | 5.8824 | 200 | 0.5381 | 0.4343 | 0.5381 | 0.7335 |
| No log | 5.9412 | 202 | 0.5074 | 0.4680 | 0.5074 | 0.7123 |
| No log | 6.0 | 204 | 0.4753 | 0.4819 | 0.4753 | 0.6894 |
| No log | 6.0588 | 206 | 0.5398 | 0.4051 | 0.5398 | 0.7347 |
| No log | 6.1176 | 208 | 0.5511 | 0.4051 | 0.5511 | 0.7423 |
| No log | 6.1765 | 210 | 0.5737 | 0.4112 | 0.5737 | 0.7574 |
| No log | 6.2353 | 212 | 0.5180 | 0.4468 | 0.5180 | 0.7197 |
| No log | 6.2941 | 214 | 0.4523 | 0.5 | 0.4523 | 0.6725 |
| No log | 6.3529 | 216 | 0.4564 | 0.5330 | 0.4564 | 0.6756 |
| No log | 6.4118 | 218 | 0.5107 | 0.4162 | 0.5107 | 0.7146 |
| No log | 6.4706 | 220 | 0.5730 | 0.3299 | 0.5730 | 0.7569 |
| No log | 6.5294 | 222 | 0.5559 | 0.3561 | 0.5559 | 0.7456 |
| No log | 6.5882 | 224 | 0.5378 | 0.4510 | 0.5378 | 0.7333 |
| No log | 6.6471 | 226 | 0.5846 | 0.4502 | 0.5846 | 0.7646 |
| No log | 6.7059 | 228 | 0.6800 | 0.3684 | 0.6800 | 0.8246 |
| No log | 6.7647 | 230 | 0.6793 | 0.3645 | 0.6793 | 0.8242 |
| No log | 6.8235 | 232 | 0.6854 | 0.3607 | 0.6854 | 0.8279 |
| No log | 6.8824 | 234 | 0.7075 | 0.3571 | 0.7075 | 0.8411 |
| No log | 6.9412 | 236 | 0.6295 | 0.3951 | 0.6295 | 0.7934 |
| No log | 7.0 | 238 | 0.6389 | 0.3367 | 0.6389 | 0.7993 |
| No log | 7.0588 | 240 | 0.6050 | 0.3892 | 0.6050 | 0.7778 |
| No log | 7.1176 | 242 | 0.5526 | 0.4171 | 0.5526 | 0.7433 |
| No log | 7.1765 | 244 | 0.5473 | 0.4171 | 0.5473 | 0.7398 |
| No log | 7.2353 | 246 | 0.5783 | 0.4051 | 0.5783 | 0.7605 |
| No log | 7.2941 | 248 | 0.5578 | 0.4105 | 0.5578 | 0.7468 |
| No log | 7.3529 | 250 | 0.5826 | 0.4051 | 0.5826 | 0.7633 |
| No log | 7.4118 | 252 | 0.5519 | 0.4286 | 0.5519 | 0.7429 |
| No log | 7.4706 | 254 | 0.5459 | 0.4286 | 0.5459 | 0.7388 |
| No log | 7.5294 | 256 | 0.5107 | 0.4346 | 0.5107 | 0.7146 |
| No log | 7.5882 | 258 | 0.5148 | 0.4346 | 0.5148 | 0.7175 |
| No log | 7.6471 | 260 | 0.5181 | 0.4694 | 0.5181 | 0.7198 |
| No log | 7.7059 | 262 | 0.5241 | 0.4694 | 0.5241 | 0.7240 |
| No log | 7.7647 | 264 | 0.5625 | 0.375 | 0.5625 | 0.7500 |
| No log | 7.8235 | 266 | 0.5731 | 0.375 | 0.5731 | 0.7571 |
| No log | 7.8824 | 268 | 0.5678 | 0.4167 | 0.5678 | 0.7535 |
| No log | 7.9412 | 270 | 0.5093 | 0.4839 | 0.5093 | 0.7137 |
| No log | 8.0 | 272 | 0.4730 | 0.4620 | 0.4730 | 0.6878 |
| No log | 8.0588 | 274 | 0.4659 | 0.4620 | 0.4659 | 0.6826 |
| No log | 8.1176 | 276 | 0.4739 | 0.4545 | 0.4739 | 0.6884 |
| No log | 8.1765 | 278 | 0.5039 | 0.4839 | 0.5039 | 0.7099 |
| No log | 8.2353 | 280 | 0.5659 | 0.3478 | 0.5659 | 0.7523 |
| No log | 8.2941 | 282 | 0.5902 | 0.3769 | 0.5902 | 0.7683 |
| No log | 8.3529 | 284 | 0.5853 | 0.3769 | 0.5853 | 0.7651 |
| No log | 8.4118 | 286 | 0.5710 | 0.3706 | 0.5710 | 0.7556 |
| No log | 8.4706 | 288 | 0.5891 | 0.3725 | 0.5891 | 0.7675 |
| No log | 8.5294 | 290 | 0.6263 | 0.3645 | 0.6263 | 0.7914 |
| No log | 8.5882 | 292 | 0.6119 | 0.3725 | 0.6119 | 0.7822 |
| No log | 8.6471 | 294 | 0.5964 | 0.3725 | 0.5964 | 0.7723 |
| No log | 8.7059 | 296 | 0.5338 | 0.4105 | 0.5338 | 0.7306 |
| No log | 8.7647 | 298 | 0.4930 | 0.4917 | 0.4930 | 0.7022 |
| No log | 8.8235 | 300 | 0.4884 | 0.4917 | 0.4884 | 0.6989 |
| No log | 8.8824 | 302 | 0.5065 | 0.4917 | 0.5065 | 0.7117 |
| No log | 8.9412 | 304 | 0.5510 | 0.4112 | 0.5510 | 0.7423 |
| No log | 9.0 | 306 | 0.5803 | 0.4010 | 0.5803 | 0.7617 |
| No log | 9.0588 | 308 | 0.5889 | 0.4010 | 0.5889 | 0.7674 |
| No log | 9.1176 | 310 | 0.5917 | 0.4010 | 0.5917 | 0.7692 |
| No log | 9.1765 | 312 | 0.5873 | 0.4010 | 0.5873 | 0.7664 |
| No log | 9.2353 | 314 | 0.5850 | 0.4010 | 0.5850 | 0.7648 |
| No log | 9.2941 | 316 | 0.5744 | 0.4112 | 0.5744 | 0.7579 |
| No log | 9.3529 | 318 | 0.5433 | 0.4112 | 0.5433 | 0.7371 |
| No log | 9.4118 | 320 | 0.5194 | 0.5152 | 0.5194 | 0.7207 |
| No log | 9.4706 | 322 | 0.5055 | 0.4346 | 0.5055 | 0.7110 |
| No log | 9.5294 | 324 | 0.5068 | 0.4346 | 0.5068 | 0.7119 |
| No log | 9.5882 | 326 | 0.5110 | 0.4346 | 0.5110 | 0.7148 |
| No log | 9.6471 | 328 | 0.5196 | 0.4819 | 0.5196 | 0.7209 |
| No log | 9.7059 | 330 | 0.5274 | 0.4872 | 0.5274 | 0.7262 |
| No log | 9.7647 | 332 | 0.5353 | 0.4872 | 0.5353 | 0.7316 |
| No log | 9.8235 | 334 | 0.5354 | 0.4872 | 0.5354 | 0.7317 |
| No log | 9.8824 | 336 | 0.5342 | 0.4872 | 0.5342 | 0.7309 |
| No log | 9.9412 | 338 | 0.5335 | 0.4872 | 0.5335 | 0.7304 |
| No log | 10.0 | 340 | 0.5337 | 0.4872 | 0.5337 | 0.7306 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1