# ArabicNewSplits5_FineTuningAraBERT_run2_AugV5_k2_task1_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.7955
- Qwk (quadratic weighted kappa): 0.6859
- Mse: 0.7955
- Rmse: 0.8919
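
As a minimal sketch (not the author's actual evaluation code), these metrics can be computed with scikit-learn; `preds` and `labels` below are hypothetical placeholders for the model's predicted scores and the gold scores:

```python
# Minimal sketch of the reported metrics using scikit-learn; `preds` and
# `labels` are hypothetical placeholders, not the actual evaluation data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

preds = np.array([3, 2, 4, 1])   # hypothetical predicted scores
labels = np.array([3, 3, 4, 2])  # hypothetical gold scores

qwk = cohen_kappa_score(labels, preds, weights="quadratic")  # Qwk
mse = mean_squared_error(labels, preds)                      # Mse
rmse = np.sqrt(mse)                                          # Rmse
print(f"Qwk: {qwk:.4f}, Mse: {mse:.4f}, Rmse: {rmse:.4f}")
```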
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
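
A minimal `TrainingArguments` sketch reproducing these settings under Transformers 4.44.2; `output_dir` is a placeholder, not the author's actual path:

```python
# Sketch of a TrainingArguments configuration matching the hyperparameters
# listed above; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",            # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # epsilon=1e-08
)
```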
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.125 | 2 | 5.1059 | -0.0238 | 5.1059 | 2.2596 |
| No log | 0.25 | 4 | 2.9393 | 0.0863 | 2.9393 | 1.7144 |
| No log | 0.375 | 6 | 1.8752 | 0.1210 | 1.8752 | 1.3694 |
| No log | 0.5 | 8 | 1.6197 | 0.1017 | 1.6197 | 1.2727 |
| No log | 0.625 | 10 | 1.9296 | -0.0344 | 1.9296 | 1.3891 |
| No log | 0.75 | 12 | 1.8116 | -0.1098 | 1.8116 | 1.3459 |
| No log | 0.875 | 14 | 2.1000 | -0.1223 | 2.1000 | 1.4491 |
| No log | 1.0 | 16 | 2.1515 | -0.0859 | 2.1515 | 1.4668 |
| No log | 1.125 | 18 | 2.5569 | -0.0317 | 2.5569 | 1.5990 |
| No log | 1.25 | 20 | 1.8494 | 0.0290 | 1.8494 | 1.3599 |
| No log | 1.375 | 22 | 1.1987 | 0.2562 | 1.1987 | 1.0948 |
| No log | 1.5 | 24 | 1.1159 | 0.3640 | 1.1159 | 1.0563 |
| No log | 1.625 | 26 | 1.1712 | 0.3681 | 1.1712 | 1.0822 |
| No log | 1.75 | 28 | 1.1964 | 0.3576 | 1.1964 | 1.0938 |
| No log | 1.875 | 30 | 1.1917 | 0.3201 | 1.1917 | 1.0917 |
| No log | 2.0 | 32 | 1.3469 | 0.1222 | 1.3469 | 1.1605 |
| No log | 2.125 | 34 | 1.6531 | 0.0540 | 1.6531 | 1.2857 |
| No log | 2.25 | 36 | 1.6781 | 0.1355 | 1.6781 | 1.2954 |
| No log | 2.375 | 38 | 1.3813 | 0.1723 | 1.3813 | 1.1753 |
| No log | 2.5 | 40 | 1.0940 | 0.3416 | 1.0940 | 1.0460 |
| No log | 2.625 | 42 | 1.0161 | 0.3882 | 1.0161 | 1.0080 |
| No log | 2.75 | 44 | 1.0008 | 0.4038 | 1.0008 | 1.0004 |
| No log | 2.875 | 46 | 1.0746 | 0.3615 | 1.0746 | 1.0367 |
| No log | 3.0 | 48 | 1.0037 | 0.4337 | 1.0037 | 1.0019 |
| No log | 3.125 | 50 | 0.8813 | 0.4934 | 0.8813 | 0.9388 |
| No log | 3.25 | 52 | 0.8219 | 0.4930 | 0.8219 | 0.9066 |
| No log | 3.375 | 54 | 0.8188 | 0.4842 | 0.8188 | 0.9049 |
| No log | 3.5 | 56 | 0.7809 | 0.5433 | 0.7809 | 0.8837 |
| No log | 3.625 | 58 | 0.7510 | 0.5510 | 0.7510 | 0.8666 |
| No log | 3.75 | 60 | 0.9111 | 0.5707 | 0.9111 | 0.9545 |
| No log | 3.875 | 62 | 1.4708 | 0.4556 | 1.4708 | 1.2128 |
| No log | 4.0 | 64 | 1.6728 | 0.4538 | 1.6728 | 1.2934 |
| No log | 4.125 | 66 | 1.4221 | 0.4596 | 1.4221 | 1.1925 |
| No log | 4.25 | 68 | 1.0266 | 0.6027 | 1.0266 | 1.0132 |
| No log | 4.375 | 70 | 0.7413 | 0.6308 | 0.7413 | 0.8610 |
| No log | 4.5 | 72 | 0.6523 | 0.6924 | 0.6523 | 0.8076 |
| No log | 4.625 | 74 | 0.6940 | 0.6997 | 0.6940 | 0.8330 |
| No log | 4.75 | 76 | 0.7507 | 0.6364 | 0.7507 | 0.8664 |
| No log | 4.875 | 78 | 0.7390 | 0.6021 | 0.7390 | 0.8597 |
| No log | 5.0 | 80 | 0.7044 | 0.6501 | 0.7044 | 0.8393 |
| No log | 5.125 | 82 | 0.6646 | 0.6428 | 0.6646 | 0.8152 |
| No log | 5.25 | 84 | 0.6654 | 0.6671 | 0.6654 | 0.8157 |
| No log | 5.375 | 86 | 0.6971 | 0.6339 | 0.6971 | 0.8349 |
| No log | 5.5 | 88 | 0.6896 | 0.6264 | 0.6896 | 0.8304 |
| No log | 5.625 | 90 | 0.6604 | 0.6501 | 0.6604 | 0.8126 |
| No log | 5.75 | 92 | 0.6874 | 0.6969 | 0.6874 | 0.8291 |
| No log | 5.875 | 94 | 0.7291 | 0.7012 | 0.7291 | 0.8539 |
| No log | 6.0 | 96 | 0.7497 | 0.6711 | 0.7497 | 0.8658 |
| No log | 6.125 | 98 | 0.7371 | 0.6993 | 0.7371 | 0.8586 |
| No log | 6.25 | 100 | 0.7079 | 0.6742 | 0.7079 | 0.8414 |
| No log | 6.375 | 102 | 0.7039 | 0.7041 | 0.7039 | 0.8390 |
| No log | 6.5 | 104 | 0.7211 | 0.6852 | 0.7211 | 0.8492 |
| No log | 6.625 | 106 | 0.7157 | 0.6945 | 0.7157 | 0.8460 |
| No log | 6.75 | 108 | 0.7312 | 0.7117 | 0.7312 | 0.8551 |
| No log | 6.875 | 110 | 0.7477 | 0.7181 | 0.7477 | 0.8647 |
| No log | 7.0 | 112 | 0.7433 | 0.7181 | 0.7433 | 0.8622 |
| No log | 7.125 | 114 | 0.7286 | 0.7151 | 0.7286 | 0.8536 |
| No log | 7.25 | 116 | 0.7257 | 0.7187 | 0.7257 | 0.8519 |
| No log | 7.375 | 118 | 0.7132 | 0.7158 | 0.7132 | 0.8445 |
| No log | 7.5 | 120 | 0.7199 | 0.7210 | 0.7199 | 0.8484 |
| No log | 7.625 | 122 | 0.7381 | 0.6968 | 0.7381 | 0.8591 |
| No log | 7.75 | 124 | 0.7452 | 0.6956 | 0.7452 | 0.8632 |
| No log | 7.875 | 126 | 0.7650 | 0.6751 | 0.7650 | 0.8747 |
| No log | 8.0 | 128 | 0.7856 | 0.6654 | 0.7856 | 0.8863 |
| No log | 8.125 | 130 | 0.7923 | 0.6647 | 0.7923 | 0.8901 |
| No log | 8.25 | 132 | 0.7799 | 0.6629 | 0.7799 | 0.8831 |
| No log | 8.375 | 134 | 0.7776 | 0.6693 | 0.7776 | 0.8818 |
| No log | 8.5 | 136 | 0.7540 | 0.6820 | 0.7540 | 0.8683 |
| No log | 8.625 | 138 | 0.7308 | 0.6838 | 0.7308 | 0.8549 |
| No log | 8.75 | 140 | 0.7165 | 0.6859 | 0.7165 | 0.8465 |
| No log | 8.875 | 142 | 0.7172 | 0.6859 | 0.7172 | 0.8469 |
| No log | 9.0 | 144 | 0.7227 | 0.6919 | 0.7227 | 0.8501 |
| No log | 9.125 | 146 | 0.7270 | 0.6919 | 0.7270 | 0.8526 |
| No log | 9.25 | 148 | 0.7373 | 0.7050 | 0.7373 | 0.8587 |
| No log | 9.375 | 150 | 0.7509 | 0.7031 | 0.7509 | 0.8665 |
| No log | 9.5 | 152 | 0.7659 | 0.7012 | 0.7659 | 0.8751 |
| No log | 9.625 | 154 | 0.7805 | 0.6859 | 0.7805 | 0.8835 |
| No log | 9.75 | 156 | 0.7887 | 0.6859 | 0.7887 | 0.8881 |
| No log | 9.875 | 158 | 0.7930 | 0.6859 | 0.7930 | 0.8905 |
| No log | 10.0 | 160 | 0.7955 | 0.6859 | 0.7955 | 0.8919 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
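
Under the versions above, the checkpoint can be loaded for inference as in the following sketch. Reading the logits as a regression-style score is an assumption inferred from the reported Mse/Rmse metrics; the card does not state the task head explicitly:

```python
# Minimal inference sketch; the regression-style reading of the logits is
# an assumption based on the Mse/Rmse metrics reported in this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits5_FineTuningAraBERT_run2_AugV5_k2_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

text = "..."  # an Arabic text to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze())  # predicted score(s)
```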