# ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k2_task1_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the metrics):

- Loss: 0.6671
- Qwk (quadratic weighted kappa): 0.6974
- Mse (mean squared error): 0.6671
- Rmse (root mean squared error): 0.8167
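
Since the intended uses are not documented below, the snippet that follows is only a plausible loading and inference sketch with the Transformers library. It assumes the checkpoint carries a sequence-classification head whose (possibly single) output is read as an organization score; that assumption is consistent with the Mse/Rmse/Qwk metrics above but is not confirmed by this card.

```python
# Hedged inference sketch: the head type and label setup are assumptions,
# not documented in this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k2_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "..."  # an Arabic response to be scored for organization (placeholder)
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels); a single value if the head is a regressor

print(logits)
```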
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a Trainer-based sketch reproducing them follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
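
The listed values map directly onto the Trainer API. The sketch below reproduces them; dataset loading, preprocessing, and the metric function are omitted because they are not documented here, and names such as the output directory are illustrative.

```python
# Hedged training sketch: only the hyperparameters above are taken from the card;
# everything else (output path, datasets, metrics) is a placeholder.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base)  # num_labels / problem_type not documented

args = TrainingArguments(
    output_dir="ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k2_task1_organization",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # The card reports "Adam with betas=(0.9,0.999) and epsilon=1e-08";
    # Trainer's default AdamW matches these values.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

trainer = Trainer(
    model=model,
    args=args,
    tokenizer=tokenizer,
    # train_dataset=..., eval_dataset=..., compute_metrics=...  (not documented in this card)
)
# trainer.train()
```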
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.1429 | 2 | 5.4036 | -0.0607 | 5.4036 | 2.3246 |
| No log | 0.2857 | 4 | 3.4226 | 0.0699 | 3.4226 | 1.8500 |
| No log | 0.4286 | 6 | 2.0746 | 0.0686 | 2.0746 | 1.4403 |
| No log | 0.5714 | 8 | 2.1220 | -0.1206 | 2.1220 | 1.4567 |
| No log | 0.7143 | 10 | 1.7488 | -0.0557 | 1.7488 | 1.3224 |
| No log | 0.8571 | 12 | 1.1309 | 0.2244 | 1.1309 | 1.0634 |
| No log | 1.0 | 14 | 1.0081 | 0.3512 | 1.0081 | 1.0040 |
| No log | 1.1429 | 16 | 1.1122 | 0.4034 | 1.1122 | 1.0546 |
| No log | 1.2857 | 18 | 1.2652 | 0.3733 | 1.2652 | 1.1248 |
| No log | 1.4286 | 20 | 1.0840 | 0.4516 | 1.0840 | 1.0411 |
| No log | 1.5714 | 22 | 0.8305 | 0.4377 | 0.8305 | 0.9113 |
| No log | 1.7143 | 24 | 0.8954 | 0.5642 | 0.8954 | 0.9462 |
| No log | 1.8571 | 26 | 0.7952 | 0.5460 | 0.7952 | 0.8917 |
| No log | 2.0 | 28 | 0.7532 | 0.4901 | 0.7532 | 0.8679 |
| No log | 2.1429 | 30 | 1.0572 | 0.4589 | 1.0572 | 1.0282 |
| No log | 2.2857 | 32 | 1.4316 | 0.3599 | 1.4316 | 1.1965 |
| No log | 2.4286 | 34 | 1.2943 | 0.4359 | 1.2943 | 1.1377 |
| No log | 2.5714 | 36 | 0.8761 | 0.5258 | 0.8761 | 0.9360 |
| No log | 2.7143 | 38 | 0.6940 | 0.5978 | 0.6940 | 0.8331 |
| No log | 2.8571 | 40 | 0.6180 | 0.6794 | 0.6180 | 0.7861 |
| No log | 3.0 | 42 | 0.6043 | 0.6840 | 0.6043 | 0.7774 |
| No log | 3.1429 | 44 | 0.5957 | 0.6787 | 0.5957 | 0.7718 |
| No log | 3.2857 | 46 | 0.6139 | 0.7249 | 0.6139 | 0.7835 |
| No log | 3.4286 | 48 | 0.7543 | 0.6925 | 0.7543 | 0.8685 |
| No log | 3.5714 | 50 | 1.0967 | 0.5348 | 1.0967 | 1.0472 |
| No log | 3.7143 | 52 | 1.0532 | 0.5603 | 1.0532 | 1.0263 |
| No log | 3.8571 | 54 | 0.7502 | 0.7004 | 0.7502 | 0.8661 |
| No log | 4.0 | 56 | 0.6011 | 0.7488 | 0.6011 | 0.7753 |
| No log | 4.1429 | 58 | 0.7006 | 0.7249 | 0.7006 | 0.8370 |
| No log | 4.2857 | 60 | 0.6896 | 0.7231 | 0.6896 | 0.8304 |
| No log | 4.4286 | 62 | 0.5953 | 0.7253 | 0.5953 | 0.7716 |
| No log | 4.5714 | 64 | 0.6813 | 0.6865 | 0.6813 | 0.8254 |
| No log | 4.7143 | 66 | 1.1222 | 0.5641 | 1.1222 | 1.0593 |
| No log | 4.8571 | 68 | 1.2716 | 0.5026 | 1.2716 | 1.1277 |
| No log | 5.0 | 70 | 1.0454 | 0.6121 | 1.0454 | 1.0224 |
| No log | 5.1429 | 72 | 0.7051 | 0.6638 | 0.7051 | 0.8397 |
| No log | 5.2857 | 74 | 0.5653 | 0.7759 | 0.5653 | 0.7519 |
| No log | 5.4286 | 76 | 0.6266 | 0.7387 | 0.6266 | 0.7916 |
| No log | 5.5714 | 78 | 0.7251 | 0.7164 | 0.7251 | 0.8515 |
| No log | 5.7143 | 80 | 0.7244 | 0.7144 | 0.7244 | 0.8511 |
| No log | 5.8571 | 82 | 0.6263 | 0.7467 | 0.6263 | 0.7914 |
| No log | 6.0 | 84 | 0.5819 | 0.7670 | 0.5819 | 0.7628 |
| No log | 6.1429 | 86 | 0.6471 | 0.7182 | 0.6471 | 0.8044 |
| No log | 6.2857 | 88 | 0.7067 | 0.6879 | 0.7067 | 0.8406 |
| No log | 6.4286 | 90 | 0.6720 | 0.7055 | 0.6720 | 0.8198 |
| No log | 6.5714 | 92 | 0.5988 | 0.7410 | 0.5988 | 0.7738 |
| No log | 6.7143 | 94 | 0.5634 | 0.7398 | 0.5634 | 0.7506 |
| No log | 6.8571 | 96 | 0.5652 | 0.7591 | 0.5652 | 0.7518 |
| No log | 7.0 | 98 | 0.5678 | 0.7497 | 0.5678 | 0.7535 |
| No log | 7.1429 | 100 | 0.6251 | 0.7158 | 0.6251 | 0.7906 |
| No log | 7.2857 | 102 | 0.6673 | 0.7093 | 0.6673 | 0.8169 |
| No log | 7.4286 | 104 | 0.6510 | 0.7093 | 0.6510 | 0.8068 |
| No log | 7.5714 | 106 | 0.6540 | 0.7128 | 0.6540 | 0.8087 |
| No log | 7.7143 | 108 | 0.6296 | 0.7314 | 0.6296 | 0.7935 |
| No log | 7.8571 | 110 | 0.6195 | 0.7370 | 0.6195 | 0.7871 |
| No log | 8.0 | 112 | 0.6149 | 0.7350 | 0.6149 | 0.7842 |
| No log | 8.1429 | 114 | 0.6015 | 0.7359 | 0.6015 | 0.7756 |
| No log | 8.2857 | 116 | 0.5851 | 0.7494 | 0.5851 | 0.7649 |
| No log | 8.4286 | 118 | 0.5916 | 0.7494 | 0.5916 | 0.7692 |
| No log | 8.5714 | 120 | 0.5958 | 0.7494 | 0.5958 | 0.7719 |
| No log | 8.7143 | 122 | 0.5948 | 0.7439 | 0.5948 | 0.7712 |
| No log | 8.8571 | 124 | 0.6006 | 0.7446 | 0.6006 | 0.7750 |
| No log | 9.0 | 126 | 0.6219 | 0.7370 | 0.6219 | 0.7886 |
| No log | 9.1429 | 128 | 0.6589 | 0.6974 | 0.6589 | 0.8117 |
| No log | 9.2857 | 130 | 0.6896 | 0.7017 | 0.6896 | 0.8304 |
| No log | 9.4286 | 132 | 0.6983 | 0.6921 | 0.6983 | 0.8357 |
| No log | 9.5714 | 134 | 0.6906 | 0.6957 | 0.6906 | 0.8310 |
| No log | 9.7143 | 136 | 0.6796 | 0.6974 | 0.6796 | 0.8244 |
| No log | 9.8571 | 138 | 0.6696 | 0.6974 | 0.6696 | 0.8183 |
| No log | 10.0 | 140 | 0.6671 | 0.6974 | 0.6671 | 0.8167 |
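
For reference, the Qwk, Mse, and Rmse columns can be reproduced with standard scikit-learn/NumPy calls. The sketch below is only an assumption about the evaluation code: in particular, rounding the continuous predictions to integer scores before computing kappa is a common choice but is not documented in this card, and the arrays are illustrative.

```python
# Hedged metric sketch: Qwk = quadratic weighted kappa, Mse = mean squared error,
# Rmse = its square root. The discretization step for kappa is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1])          # gold organization scores (illustrative)
y_pred = np.array([2.8, 2.1, 3.6, 1.4])  # model outputs (illustrative)

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```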
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1