# ArabicNewSplits5_FineTuningAraBERT_run1_AugV5_k3_task5_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.1568
- Qwk: 0.5705
- Mse: 1.1568
- Rmse: 1.0756
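
Qwk is the quadratically weighted Cohen's kappa, and the reported Loss equals the Mse, which suggests an MSE (regression-style) training objective. A minimal sketch of how these metrics can be computed, assuming `y_true` holds gold labels and `y_pred` the model's rounded predictions (both arrays here are hypothetical):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 1, 2, 4, 0])  # hypothetical gold labels
y_pred = np.array([2, 1, 2, 3, 1])  # hypothetical rounded predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse = sqrt(Mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```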
## Model description

More information needed
## Intended uses & limitations

More information needed
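
Pending details from the author, a minimal inference sketch is shown below. It assumes the checkpoint loads as a sequence-classification model with a single regression output (the equal Loss and Mse values above point that way), so treat the head configuration as an assumption:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits5_FineTuningAraBERT_run1_AugV5_k3_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)  # head type assumed

text = "..."  # an Arabic text to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
# Assuming a single regression output, round to the nearest discrete score.
score = logits.squeeze().round().item()
print(score)
```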
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
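
As referenced above, these settings map onto `transformers.TrainingArguments` roughly as follows. This is a hedged reconstruction, not the author's actual script: `output_dir` is assumed, `eval_steps=2` is inferred from the evaluation cadence in the results table below, and the Adam betas and epsilon listed above are the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ArabicNewSplits5_FineTuningAraBERT_run1_AugV5_k3_task5_organization",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    eval_strategy="steps",  # inferred: the table logs validation metrics every 2 steps
    eval_steps=2,
)
```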
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.1176 | 2 | 2.1255 | -0.0173 | 2.1255 | 1.4579 |
| No log | 0.2353 | 4 | 1.6254 | 0.1225 | 1.6254 | 1.2749 |
| No log | 0.3529 | 6 | 1.4914 | 0.1209 | 1.4914 | 1.2212 |
| No log | 0.4706 | 8 | 1.4281 | 0.1415 | 1.4281 | 1.1950 |
| No log | 0.5882 | 10 | 1.3824 | 0.1423 | 1.3824 | 1.1758 |
| No log | 0.7059 | 12 | 1.3616 | 0.1093 | 1.3616 | 1.1669 |
| No log | 0.8235 | 14 | 1.3803 | 0.1475 | 1.3803 | 1.1749 |
| No log | 0.9412 | 16 | 1.3817 | 0.1968 | 1.3817 | 1.1755 |
| No log | 1.0588 | 18 | 1.3513 | 0.1968 | 1.3513 | 1.1625 |
| No log | 1.1765 | 20 | 1.3298 | 0.1475 | 1.3298 | 1.1532 |
| No log | 1.2941 | 22 | 1.3404 | 0.1700 | 1.3404 | 1.1578 |
| No log | 1.4118 | 24 | 1.3307 | 0.2401 | 1.3307 | 1.1536 |
| No log | 1.5294 | 26 | 1.2660 | 0.2539 | 1.2660 | 1.1252 |
| No log | 1.6471 | 28 | 1.2154 | 0.2565 | 1.2154 | 1.1024 |
| No log | 1.7647 | 30 | 1.1693 | 0.3077 | 1.1693 | 1.0814 |
| No log | 1.8824 | 32 | 1.1636 | 0.3591 | 1.1636 | 1.0787 |
| No log | 2.0 | 34 | 1.1376 | 0.3764 | 1.1376 | 1.0666 |
| No log | 2.1176 | 36 | 1.1240 | 0.3801 | 1.1240 | 1.0602 |
| No log | 2.2353 | 38 | 1.1030 | 0.3872 | 1.1030 | 1.0502 |
| No log | 2.3529 | 40 | 1.1109 | 0.3313 | 1.1109 | 1.0540 |
| No log | 2.4706 | 42 | 1.1013 | 0.3142 | 1.1013 | 1.0494 |
| No log | 2.5882 | 44 | 1.0820 | 0.3259 | 1.0820 | 1.0402 |
| No log | 2.7059 | 46 | 1.0588 | 0.3411 | 1.0588 | 1.0290 |
| No log | 2.8235 | 48 | 1.0350 | 0.3546 | 1.0350 | 1.0173 |
| No log | 2.9412 | 50 | 1.0169 | 0.4318 | 1.0169 | 1.0084 |
| No log | 3.0588 | 52 | 1.0301 | 0.4990 | 1.0301 | 1.0149 |
| No log | 3.1765 | 54 | 1.0157 | 0.5033 | 1.0157 | 1.0078 |
| No log | 3.2941 | 56 | 0.9702 | 0.5150 | 0.9702 | 0.9850 |
| No log | 3.4118 | 58 | 0.9451 | 0.4624 | 0.9451 | 0.9722 |
| No log | 3.5294 | 60 | 0.9435 | 0.5108 | 0.9435 | 0.9713 |
| No log | 3.6471 | 62 | 0.9992 | 0.5786 | 0.9992 | 0.9996 |
| No log | 3.7647 | 64 | 1.0759 | 0.5354 | 1.0759 | 1.0372 |
| No log | 3.8824 | 66 | 1.1911 | 0.4913 | 1.1911 | 1.0914 |
| No log | 4.0 | 68 | 1.2722 | 0.4933 | 1.2722 | 1.1279 |
| No log | 4.1176 | 70 | 1.3234 | 0.4833 | 1.3234 | 1.1504 |
| No log | 4.2353 | 72 | 1.2167 | 0.5370 | 1.2167 | 1.1030 |
| No log | 4.3529 | 74 | 1.1915 | 0.5557 | 1.1915 | 1.0916 |
| No log | 4.4706 | 76 | 1.0835 | 0.5341 | 1.0835 | 1.0409 |
| No log | 4.5882 | 78 | 0.9906 | 0.5391 | 0.9906 | 0.9953 |
| No log | 4.7059 | 80 | 0.9552 | 0.5656 | 0.9552 | 0.9774 |
| No log | 4.8235 | 82 | 0.9968 | 0.5562 | 0.9968 | 0.9984 |
| No log | 4.9412 | 84 | 1.0622 | 0.5480 | 1.0622 | 1.0307 |
| No log | 5.0588 | 86 | 1.1321 | 0.5710 | 1.1321 | 1.0640 |
| No log | 5.1765 | 88 | 1.1553 | 0.5592 | 1.1553 | 1.0749 |
| No log | 5.2941 | 90 | 1.1985 | 0.5389 | 1.1985 | 1.0948 |
| No log | 5.4118 | 92 | 1.1636 | 0.5840 | 1.1636 | 1.0787 |
| No log | 5.5294 | 94 | 1.1425 | 0.5922 | 1.1425 | 1.0689 |
| No log | 5.6471 | 96 | 1.1819 | 0.5188 | 1.1819 | 1.0871 |
| No log | 5.7647 | 98 | 1.1689 | 0.5526 | 1.1689 | 1.0812 |
| No log | 5.8824 | 100 | 1.2249 | 0.5478 | 1.2249 | 1.1067 |
| No log | 6.0 | 102 | 1.2373 | 0.5527 | 1.2373 | 1.1124 |
| No log | 6.1176 | 104 | 1.1858 | 0.5725 | 1.1858 | 1.0889 |
| No log | 6.2353 | 106 | 1.1439 | 0.5945 | 1.1439 | 1.0695 |
| No log | 6.3529 | 108 | 1.1174 | 0.6083 | 1.1174 | 1.0571 |
| No log | 6.4706 | 110 | 1.1468 | 0.5857 | 1.1468 | 1.0709 |
| No log | 6.5882 | 112 | 1.1475 | 0.5947 | 1.1475 | 1.0712 |
| No log | 6.7059 | 114 | 1.0982 | 0.6052 | 1.0982 | 1.0479 |
| No log | 6.8235 | 116 | 1.0471 | 0.6331 | 1.0471 | 1.0233 |
| No log | 6.9412 | 118 | 0.9854 | 0.6394 | 0.9854 | 0.9927 |
| No log | 7.0588 | 120 | 0.9646 | 0.6548 | 0.9646 | 0.9821 |
| No log | 7.1765 | 122 | 1.0049 | 0.6408 | 1.0049 | 1.0025 |
| No log | 7.2941 | 124 | 1.0999 | 0.6314 | 1.0999 | 1.0488 |
| No log | 7.4118 | 126 | 1.1766 | 0.5861 | 1.1766 | 1.0847 |
| No log | 7.5294 | 128 | 1.1922 | 0.5952 | 1.1922 | 1.0919 |
| No log | 7.6471 | 130 | 1.1417 | 0.6041 | 1.1417 | 1.0685 |
| No log | 7.7647 | 132 | 1.0713 | 0.6289 | 1.0713 | 1.0351 |
| No log | 7.8824 | 134 | 1.0136 | 0.6518 | 1.0136 | 1.0068 |
| No log | 8.0 | 136 | 0.9813 | 0.6390 | 0.9813 | 0.9906 |
| No log | 8.1176 | 138 | 0.9815 | 0.6351 | 0.9815 | 0.9907 |
| No log | 8.2353 | 140 | 1.0167 | 0.6310 | 1.0167 | 1.0083 |
| No log | 8.3529 | 142 | 1.0677 | 0.6174 | 1.0677 | 1.0333 |
| No log | 8.4706 | 144 | 1.1230 | 0.5866 | 1.1230 | 1.0597 |
| No log | 8.5882 | 146 | 1.2015 | 0.5591 | 1.2015 | 1.0961 |
| No log | 8.7059 | 148 | 1.2590 | 0.5534 | 1.2590 | 1.1221 |
| No log | 8.8235 | 150 | 1.2746 | 0.5556 | 1.2746 | 1.1290 |
| No log | 8.9412 | 152 | 1.2668 | 0.5556 | 1.2668 | 1.1255 |
| No log | 9.0588 | 154 | 1.2578 | 0.5565 | 1.2578 | 1.1215 |
| No log | 9.1765 | 156 | 1.2271 | 0.5645 | 1.2271 | 1.1077 |
| No log | 9.2941 | 158 | 1.2038 | 0.5645 | 1.2038 | 1.0972 |
| No log | 9.4118 | 160 | 1.1923 | 0.5623 | 1.1923 | 1.0919 |
| No log | 9.5294 | 162 | 1.1844 | 0.5623 | 1.1844 | 1.0883 |
| No log | 9.6471 | 164 | 1.1789 | 0.5623 | 1.1789 | 1.0858 |
| No log | 9.7647 | 166 | 1.1702 | 0.5705 | 1.1702 | 1.0817 |
| No log | 9.8824 | 168 | 1.1602 | 0.5705 | 1.1602 | 1.0771 |
| No log | 10.0 | 170 | 1.1568 | 0.5705 | 1.1568 | 1.0756 |
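
For completeness, a hedged sketch of a `compute_metrics` callback that would produce the Qwk/Mse/Rmse columns above when passed to `transformers.Trainer`; rounding the regression outputs to the nearest label before computing kappa is an assumption:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    preds = predictions.squeeze()  # assumed single regression output
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(labels.round().astype(int),
                            preds.round().astype(int),
                            weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": np.sqrt(mse)}
```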
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1