# ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k2_task5_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.1626
- Qwk: 0.6083
- Mse: 1.1626
- Rmse: 1.0782
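Qwk here is the quadratic weighted kappa (Cohen's kappa with quadratic weights), the usual agreement metric for ordinal scoring tasks like this one; note that Rmse is simply the square root of Mse (sqrt(1.1626) ≈ 1.0782). A minimal, dependency-free sketch of these metrics (function names are illustrative, not part of the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred):
    """Cohen's kappa with quadratic weights over integer ratings."""
    labels = list(range(min(y_true + y_pred), max(y_true + y_pred) + 1))
    k, n = len(labels), len(y_true)
    idx = {r: i for i, r in enumerate(labels)}
    # Observed confusion matrix.
    observed = [[0.0] * k for _ in range(k)]
    for t, p in zip(y_true, y_pred):
        observed[idx[t]][idx[p]] += 1
    row = [sum(r) for r in observed]                                 # true-label histogram
    col = [sum(observed[i][j] for i in range(k)) for j in range(k)]  # predicted histogram
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            # Quadratic disagreement weight, 0 on the diagonal.
            w = (i - j) ** 2 / (k - 1) ** 2 if k > 1 else 0.0
            num += w * observed[i][j]
            den += w * row[i] * col[j] / n  # expected counts under independence
    return 1.0 - num / den if den else 1.0

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement gives a kappa of 1.0, chance-level agreement gives 0, and systematic disagreement can be negative.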
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
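With a linear scheduler and no warmup listed (assumed 0 here), the learning rate decays linearly from 2e-05 to 0 over training; the results table implies 110 optimizer steps in total (11 per epoch × 10 epochs). A small sketch of that schedule shape (the function name is illustrative; it mirrors the behavior of transformers' `get_linear_schedule_with_warmup`):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: optional warmup up to base_lr, then linear decay to 0."""
    if step < warmup_steps:
        # Warmup phase: ramp up proportionally to the current step.
        return base_lr * step / max(1, warmup_steps)
    # Decay phase: remaining fraction of the post-warmup budget.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For this run, the rate would sit at 2e-05 at step 0, 1e-05 at the halfway point (step 55), and 0 at step 110.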
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.1818 | 2 | 2.3491 | -0.0013 | 2.3491 | 1.5327 |
| No log | 0.3636 | 4 | 1.4925 | 0.2017 | 1.4925 | 1.2217 |
| No log | 0.5455 | 6 | 1.3164 | 0.1307 | 1.3164 | 1.1474 |
| No log | 0.7273 | 8 | 1.3824 | 0.1903 | 1.3824 | 1.1757 |
| No log | 0.9091 | 10 | 1.6964 | 0.3128 | 1.6964 | 1.3024 |
| No log | 1.0909 | 12 | 1.8682 | 0.3116 | 1.8682 | 1.3668 |
| No log | 1.2727 | 14 | 1.8846 | 0.3128 | 1.8846 | 1.3728 |
| No log | 1.4545 | 16 | 1.7012 | 0.3448 | 1.7012 | 1.3043 |
| No log | 1.6364 | 18 | 1.4895 | 0.3223 | 1.4895 | 1.2204 |
| No log | 1.8182 | 20 | 1.3364 | 0.2865 | 1.3364 | 1.1560 |
| No log | 2.0 | 22 | 1.2644 | 0.1628 | 1.2644 | 1.1245 |
| No log | 2.1818 | 24 | 1.2313 | 0.1903 | 1.2313 | 1.1096 |
| No log | 2.3636 | 26 | 1.1790 | 0.1628 | 1.1790 | 1.0858 |
| No log | 2.5455 | 28 | 1.1314 | 0.2202 | 1.1314 | 1.0637 |
| No log | 2.7273 | 30 | 1.1207 | 0.3277 | 1.1207 | 1.0586 |
| No log | 2.9091 | 32 | 1.1514 | 0.3766 | 1.1514 | 1.0730 |
| No log | 3.0909 | 34 | 1.2195 | 0.3711 | 1.2195 | 1.1043 |
| No log | 3.2727 | 36 | 1.3770 | 0.4037 | 1.3770 | 1.1735 |
| No log | 3.4545 | 38 | 1.4887 | 0.3906 | 1.4887 | 1.2201 |
| No log | 3.6364 | 40 | 1.4150 | 0.3902 | 1.4150 | 1.1895 |
| No log | 3.8182 | 42 | 1.2434 | 0.4161 | 1.2434 | 1.1151 |
| No log | 4.0 | 44 | 1.0707 | 0.4686 | 1.0707 | 1.0347 |
| No log | 4.1818 | 46 | 0.9741 | 0.5035 | 0.9741 | 0.9869 |
| No log | 4.3636 | 48 | 0.9428 | 0.5262 | 0.9428 | 0.9710 |
| No log | 4.5455 | 50 | 0.9151 | 0.5649 | 0.9151 | 0.9566 |
| No log | 4.7273 | 52 | 0.9797 | 0.5462 | 0.9797 | 0.9898 |
| No log | 4.9091 | 54 | 1.1162 | 0.5336 | 1.1162 | 1.0565 |
| No log | 5.0909 | 56 | 1.2237 | 0.4909 | 1.2237 | 1.1062 |
| No log | 5.2727 | 58 | 1.2223 | 0.5082 | 1.2223 | 1.1056 |
| No log | 5.4545 | 60 | 1.2128 | 0.5309 | 1.2128 | 1.1013 |
| No log | 5.6364 | 62 | 1.1364 | 0.4918 | 1.1364 | 1.0660 |
| No log | 5.8182 | 64 | 1.0813 | 0.4683 | 1.0813 | 1.0399 |
| No log | 6.0 | 66 | 1.0827 | 0.5036 | 1.0827 | 1.0405 |
| No log | 6.1818 | 68 | 1.0956 | 0.5647 | 1.0956 | 1.0467 |
| No log | 6.3636 | 70 | 1.0943 | 0.5571 | 1.0943 | 1.0461 |
| No log | 6.5455 | 72 | 1.0891 | 0.5477 | 1.0891 | 1.0436 |
| No log | 6.7273 | 74 | 1.0977 | 0.5642 | 1.0977 | 1.0477 |
| No log | 6.9091 | 76 | 1.0979 | 0.5645 | 1.0979 | 1.0478 |
| No log | 7.0909 | 78 | 1.1484 | 0.5521 | 1.1484 | 1.0716 |
| No log | 7.2727 | 80 | 1.1266 | 0.5567 | 1.1266 | 1.0614 |
| No log | 7.4545 | 82 | 1.1001 | 0.5791 | 1.1001 | 1.0488 |
| No log | 7.6364 | 84 | 1.0252 | 0.5918 | 1.0252 | 1.0125 |
| No log | 7.8182 | 86 | 0.9597 | 0.5946 | 0.9597 | 0.9797 |
| No log | 8.0 | 88 | 0.9549 | 0.6095 | 0.9549 | 0.9772 |
| No log | 8.1818 | 90 | 0.9901 | 0.6134 | 0.9901 | 0.9950 |
| No log | 8.3636 | 92 | 1.0505 | 0.6174 | 1.0505 | 1.0249 |
| No log | 8.5455 | 94 | 1.1476 | 0.6083 | 1.1476 | 1.0713 |
| No log | 8.7273 | 96 | 1.2085 | 0.5830 | 1.2085 | 1.0993 |
| No log | 8.9091 | 98 | 1.2312 | 0.5551 | 1.2312 | 1.1096 |
| No log | 9.0909 | 100 | 1.2390 | 0.5551 | 1.2390 | 1.1131 |
| No log | 9.2727 | 102 | 1.2360 | 0.5551 | 1.2360 | 1.1117 |
| No log | 9.4545 | 104 | 1.2087 | 0.5871 | 1.2087 | 1.0994 |
| No log | 9.6364 | 106 | 1.1831 | 0.5956 | 1.1831 | 1.0877 |
| No log | 9.8182 | 108 | 1.1679 | 0.6083 | 1.1679 | 1.0807 |
| No log | 10.0 | 110 | 1.1626 | 0.6083 | 1.1626 | 1.0782 |
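Because the card reports MSE/RMSE alongside Qwk, the head presumably emits a continuous score that is rounded to an integer rating for kappa; the exact rubric range is not documented, so the 0–5 range below (suggested by "Score5" in the model name) and the helper names are assumptions. A hedged inference sketch:

```python
def clip_score(raw, lo=0, hi=5):
    """Round a raw regression output to the nearest integer rating and
    clamp it to an assumed 0-5 rubric range (not documented in the card)."""
    return max(lo, min(hi, round(raw)))

def run_inference(text):
    """Score one essay with the fine-tuned checkpoint.
    Needs network access, `transformers`, and `torch`; not executed here."""
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    import torch

    model_id = ("MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_"
                "FineTuningAraBERT_run3_AugV5_k2_task5_organization")
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        raw = model(**inputs).logits.squeeze().item()
    return clip_score(raw)
```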
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1