# ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k5_task5_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set (a snippet showing how these metrics can be computed follows the list):
- Loss: 1.0064
- QWK (Quadratic Weighted Kappa): 0.6722
- MSE (Mean Squared Error): 1.0064
- RMSE (Root Mean Squared Error): 1.0032
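The metrics above can be reproduced with standard `scikit-learn` calls. A minimal sketch, assuming integer score labels; the `y_true`/`y_pred` arrays below are placeholders, not the model's actual evaluation outputs:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Placeholder gold labels and predictions; substitute real evaluation outputs.
y_true = np.array([0, 1, 2, 3, 4, 2, 1])
y_pred = np.array([0, 1, 2, 2, 4, 3, 1])

# QWK is Cohen's kappa with quadratic weights.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(f"QWK: {qwk:.4f}  MSE: {mse:.4f}  RMSE: {rmse:.4f}")
```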
## Model description
More information needed
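Pending a fuller description, the checkpoint loads like any `transformers` model. A minimal inference sketch, assuming the fine-tuned weights include a sequence-classification (or regression) head, as is typical for Trainer-based fine-tuning; the input sentence is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id taken from this model card; the head type is an assumption.
model_id = "MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k5_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder Arabic input; replace with real text for the organization-scoring task.
inputs = tokenizer("نص تجريبي باللغة العربية", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```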
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
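These settings map directly onto `transformers.TrainingArguments`. A minimal sketch under that assumption; `output_dir` is a placeholder, and the evaluation cadence is inferred from the results table below:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    # Evaluation every 2 steps, as the results table suggests.
    eval_strategy="steps",
    eval_steps=2,
)
```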
### Training results
| Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE |
|---|---|---|---|---|---|---|
| No log | 0.1053 | 2 | 2.3843 | 0.0013 | 2.3843 | 1.5441 |
| No log | 0.2105 | 4 | 1.5944 | 0.1566 | 1.5944 | 1.2627 |
| No log | 0.3158 | 6 | 1.2772 | 0.2324 | 1.2772 | 1.1301 |
| No log | 0.4211 | 8 | 1.3145 | 0.1754 | 1.3145 | 1.1465 |
| No log | 0.5263 | 10 | 1.6140 | 0.2257 | 1.6140 | 1.2704 |
| No log | 0.6316 | 12 | 1.7550 | 0.1582 | 1.7550 | 1.3248 |
| No log | 0.7368 | 14 | 1.8381 | 0.1046 | 1.8381 | 1.3558 |
| No log | 0.8421 | 16 | 2.3503 | 0.1870 | 2.3503 | 1.5331 |
| No log | 0.9474 | 18 | 2.5820 | 0.1474 | 2.5820 | 1.6069 |
| No log | 1.0526 | 20 | 2.2637 | 0.1757 | 2.2637 | 1.5046 |
| No log | 1.1579 | 22 | 1.7872 | 0.3104 | 1.7872 | 1.3369 |
| No log | 1.2632 | 24 | 1.6833 | 0.3508 | 1.6833 | 1.2974 |
| No log | 1.3684 | 26 | 1.6994 | 0.3617 | 1.6994 | 1.3036 |
| No log | 1.4737 | 28 | 1.7060 | 0.3464 | 1.7060 | 1.3061 |
| No log | 1.5789 | 30 | 1.5883 | 0.3565 | 1.5883 | 1.2603 |
| No log | 1.6842 | 32 | 1.4383 | 0.3101 | 1.4383 | 1.1993 |
| No log | 1.7895 | 34 | 1.3103 | 0.2345 | 1.3103 | 1.1447 |
| No log | 1.8947 | 36 | 1.3361 | 0.2883 | 1.3361 | 1.1559 |
| No log | 2.0 | 38 | 1.4164 | 0.3231 | 1.4164 | 1.1901 |
| No log | 2.1053 | 40 | 1.4206 | 0.3495 | 1.4206 | 1.1919 |
| No log | 2.2105 | 42 | 1.4006 | 0.4111 | 1.4006 | 1.1835 |
| No log | 2.3158 | 44 | 1.2878 | 0.4469 | 1.2878 | 1.1348 |
| No log | 2.4211 | 46 | 1.1340 | 0.4088 | 1.1340 | 1.0649 |
| No log | 2.5263 | 48 | 1.0553 | 0.3487 | 1.0553 | 1.0273 |
| No log | 2.6316 | 50 | 1.0257 | 0.3736 | 1.0257 | 1.0128 |
| No log | 2.7368 | 52 | 1.0068 | 0.3857 | 1.0068 | 1.0034 |
| No log | 2.8421 | 54 | 1.0373 | 0.4382 | 1.0373 | 1.0185 |
| No log | 2.9474 | 56 | 1.1622 | 0.5042 | 1.1622 | 1.0781 |
| No log | 3.0526 | 58 | 1.2329 | 0.4766 | 1.2329 | 1.1104 |
| No log | 3.1579 | 60 | 1.3819 | 0.4823 | 1.3819 | 1.1755 |
| No log | 3.2632 | 62 | 1.3987 | 0.5005 | 1.3987 | 1.1827 |
| No log | 3.3684 | 64 | 1.3614 | 0.5160 | 1.3614 | 1.1668 |
| No log | 3.4737 | 66 | 1.3244 | 0.5291 | 1.3244 | 1.1508 |
| No log | 3.5789 | 68 | 1.2792 | 0.5279 | 1.2792 | 1.1310 |
| No log | 3.6842 | 70 | 1.2875 | 0.5298 | 1.2875 | 1.1347 |
| No log | 3.7895 | 72 | 1.2371 | 0.5279 | 1.2371 | 1.1122 |
| No log | 3.8947 | 74 | 1.1923 | 0.5411 | 1.1923 | 1.0919 |
| No log | 4.0 | 76 | 1.1674 | 0.5593 | 1.1674 | 1.0805 |
| No log | 4.1053 | 78 | 1.2402 | 0.5388 | 1.2402 | 1.1137 |
| No log | 4.2105 | 80 | 1.3414 | 0.5251 | 1.3414 | 1.1582 |
| No log | 4.3158 | 82 | 1.3663 | 0.5219 | 1.3663 | 1.1689 |
| No log | 4.4211 | 84 | 1.2539 | 0.5622 | 1.2539 | 1.1198 |
| No log | 4.5263 | 86 | 1.1853 | 0.5912 | 1.1853 | 1.0887 |
| No log | 4.6316 | 88 | 1.0872 | 0.6024 | 1.0872 | 1.0427 |
| No log | 4.7368 | 90 | 1.1160 | 0.6125 | 1.1160 | 1.0564 |
| No log | 4.8421 | 92 | 1.1695 | 0.5916 | 1.1695 | 1.0815 |
| No log | 4.9474 | 94 | 1.0984 | 0.6318 | 1.0984 | 1.0480 |
| No log | 5.0526 | 96 | 1.0179 | 0.6560 | 1.0179 | 1.0089 |
| No log | 5.1579 | 98 | 0.9508 | 0.6514 | 0.9508 | 0.9751 |
| No log | 5.2632 | 100 | 0.8656 | 0.6315 | 0.8656 | 0.9304 |
| No log | 5.3684 | 102 | 0.8694 | 0.6612 | 0.8694 | 0.9324 |
| No log | 5.4737 | 104 | 0.9729 | 0.6831 | 0.9729 | 0.9864 |
| No log | 5.5789 | 106 | 0.9944 | 0.6640 | 0.9944 | 0.9972 |
| No log | 5.6842 | 108 | 0.9892 | 0.6656 | 0.9892 | 0.9946 |
| No log | 5.7895 | 110 | 0.9778 | 0.6581 | 0.9778 | 0.9889 |
| No log | 5.8947 | 112 | 0.9715 | 0.6748 | 0.9715 | 0.9856 |
| No log | 6.0 | 114 | 0.9809 | 0.6656 | 0.9809 | 0.9904 |
| No log | 6.1053 | 116 | 0.9739 | 0.6672 | 0.9739 | 0.9869 |
| No log | 6.2105 | 118 | 1.0883 | 0.6319 | 1.0883 | 1.0432 |
| No log | 6.3158 | 120 | 1.3486 | 0.5832 | 1.3486 | 1.1613 |
| No log | 6.4211 | 122 | 1.4061 | 0.5579 | 1.4061 | 1.1858 |
| No log | 6.5263 | 124 | 1.3202 | 0.5729 | 1.3202 | 1.1490 |
| No log | 6.6316 | 126 | 1.2216 | 0.5737 | 1.2216 | 1.1053 |
| No log | 6.7368 | 128 | 1.1641 | 0.5874 | 1.1641 | 1.0789 |
| No log | 6.8421 | 130 | 1.1309 | 0.6018 | 1.1309 | 1.0634 |
| No log | 6.9474 | 132 | 1.1066 | 0.6218 | 1.1066 | 1.0519 |
| No log | 7.0526 | 134 | 1.1334 | 0.6233 | 1.1334 | 1.0646 |
| No log | 7.1579 | 136 | 1.1837 | 0.5882 | 1.1837 | 1.0880 |
| No log | 7.2632 | 138 | 1.2128 | 0.5756 | 1.2128 | 1.1013 |
| No log | 7.3684 | 140 | 1.2375 | 0.5698 | 1.2375 | 1.1124 |
| No log | 7.4737 | 142 | 1.1451 | 0.6121 | 1.1451 | 1.0701 |
| No log | 7.5789 | 144 | 1.0447 | 0.6411 | 1.0447 | 1.0221 |
| No log | 7.6842 | 146 | 0.9725 | 0.6550 | 0.9725 | 0.9861 |
| No log | 7.7895 | 148 | 0.9161 | 0.6922 | 0.9161 | 0.9571 |
| No log | 7.8947 | 150 | 0.8890 | 0.6964 | 0.8890 | 0.9429 |
| No log | 8.0 | 152 | 0.8794 | 0.6905 | 0.8794 | 0.9377 |
| No log | 8.1053 | 154 | 0.9022 | 0.6850 | 0.9022 | 0.9498 |
| No log | 8.2105 | 156 | 0.9751 | 0.6622 | 0.9751 | 0.9875 |
| No log | 8.3158 | 158 | 1.0502 | 0.6544 | 1.0502 | 1.0248 |
| No log | 8.4211 | 160 | 1.0637 | 0.6666 | 1.0637 | 1.0314 |
| No log | 8.5263 | 162 | 1.0420 | 0.6625 | 1.0420 | 1.0208 |
| No log | 8.6316 | 164 | 1.0287 | 0.6550 | 1.0287 | 1.0142 |
| No log | 8.7368 | 166 | 1.0451 | 0.6633 | 1.0451 | 1.0223 |
| No log | 8.8421 | 168 | 1.0483 | 0.6633 | 1.0483 | 1.0239 |
| No log | 8.9474 | 170 | 1.0307 | 0.6722 | 1.0307 | 1.0152 |
| No log | 9.0526 | 172 | 1.0229 | 0.6722 | 1.0229 | 1.0114 |
| No log | 9.1579 | 174 | 1.0104 | 0.6722 | 1.0104 | 1.0052 |
| No log | 9.2632 | 176 | 0.9967 | 0.6731 | 0.9967 | 0.9984 |
| No log | 9.3684 | 178 | 0.9881 | 0.6731 | 0.9881 | 0.9940 |
| No log | 9.4737 | 180 | 0.9918 | 0.6731 | 0.9918 | 0.9959 |
| No log | 9.5789 | 182 | 0.9917 | 0.6640 | 0.9917 | 0.9958 |
| No log | 9.6842 | 184 | 0.9974 | 0.6722 | 0.9974 | 0.9987 |
| No log | 9.7895 | 186 | 0.9992 | 0.6722 | 0.9992 | 0.9996 |
| No log | 9.8947 | 188 | 1.0038 | 0.6722 | 1.0038 | 1.0019 |
| No log | 10.0 | 190 | 1.0064 | 0.6722 | 1.0064 | 1.0032 |
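Over the ten epochs, validation QWK climbs from near zero to roughly 0.67 while the validation loss settles around 1.0, matching the final evaluation figures reported at the top of this card.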
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1