# ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k4_task2_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.0012
- Qwk (quadratic weighted kappa): 0.4175
- Mse: 1.0012
- Rmse: 1.0006
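Qwk measures agreement between predicted and gold ordinal scores, weighting disagreements by their squared distance; Rmse is the square root of Mse. A minimal pure-Python sketch of these metrics (illustrative only, not the exact evaluation code used for this model):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer ratings in [0, n_classes)."""
    n = len(y_true)
    # Observed co-occurrence matrix of (true, predicted) ratings.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_t = [y_true.count(i) for i in range(n_classes)]
    hist_p = [y_pred.count(i) for i in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            expected = hist_t[i] * hist_p[j] / n     # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Example with illustrative ratings (not this model's predictions):
scores_true = [0, 1, 2, 3, 2, 1]
scores_pred = [0, 1, 2, 2, 2, 1]
qwk = quadratic_weighted_kappa(scores_true, scores_pred, n_classes=4)
rmse = mse(scores_true, scores_pred) ** 0.5
```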
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
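These values correspond to the standard `transformers.TrainingArguments` fields; a sketch of the mapping as plain keyword arguments (the field names assume the Hugging Face `Trainer` API, and anything not listed above, such as an output directory, is omitted):

```python
# The card's hyperparameters expressed as TrainingArguments kwargs
# (a sketch of the likely configuration, not the authors' actual script).
training_kwargs = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 10,
    # Adam betas/epsilon as listed above (the Transformers defaults).
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
}
```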
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.0870 | 2 | 3.9118 | -0.0151 | 3.9118 | 1.9778 |
| No log | 0.1739 | 4 | 2.1183 | 0.1132 | 2.1183 | 1.4554 |
| No log | 0.2609 | 6 | 1.0761 | 0.0773 | 1.0761 | 1.0374 |
| No log | 0.3478 | 8 | 0.8501 | 0.0630 | 0.8501 | 0.9220 |
| No log | 0.4348 | 10 | 0.9004 | 0.0031 | 0.9004 | 0.9489 |
| No log | 0.5217 | 12 | 0.7704 | 0.1618 | 0.7704 | 0.8777 |
| No log | 0.6087 | 14 | 0.7600 | 0.1136 | 0.7600 | 0.8718 |
| No log | 0.6957 | 16 | 0.7979 | 0.1494 | 0.7979 | 0.8932 |
| No log | 0.7826 | 18 | 0.8657 | 0.1549 | 0.8657 | 0.9305 |
| No log | 0.8696 | 20 | 0.8648 | 0.0417 | 0.8648 | 0.9299 |
| No log | 0.9565 | 22 | 0.8356 | 0.0331 | 0.8356 | 0.9141 |
| No log | 1.0435 | 24 | 0.7924 | 0.0471 | 0.7924 | 0.8902 |
| No log | 1.1304 | 26 | 0.7878 | 0.0704 | 0.7878 | 0.8876 |
| No log | 1.2174 | 28 | 0.7406 | 0.1056 | 0.7406 | 0.8606 |
| No log | 1.3043 | 30 | 0.7211 | 0.1273 | 0.7211 | 0.8492 |
| No log | 1.3913 | 32 | 0.7135 | 0.0502 | 0.7135 | 0.8447 |
| No log | 1.4783 | 34 | 0.7150 | 0.1469 | 0.7150 | 0.8456 |
| No log | 1.5652 | 36 | 0.7148 | 0.1238 | 0.7148 | 0.8455 |
| No log | 1.6522 | 38 | 0.7196 | 0.1121 | 0.7196 | 0.8483 |
| No log | 1.7391 | 40 | 0.7121 | 0.1601 | 0.7121 | 0.8439 |
| No log | 1.8261 | 42 | 0.7006 | 0.2461 | 0.7006 | 0.8370 |
| No log | 1.9130 | 44 | 0.6840 | 0.2839 | 0.6840 | 0.8271 |
| No log | 2.0 | 46 | 0.6537 | 0.3539 | 0.6537 | 0.8085 |
| No log | 2.0870 | 48 | 0.6446 | 0.3520 | 0.6446 | 0.8028 |
| No log | 2.1739 | 50 | 0.6769 | 0.2305 | 0.6769 | 0.8227 |
| No log | 2.2609 | 52 | 0.7000 | 0.2135 | 0.7000 | 0.8367 |
| No log | 2.3478 | 54 | 0.7202 | 0.2507 | 0.7202 | 0.8486 |
| No log | 2.4348 | 56 | 0.8102 | 0.2812 | 0.8102 | 0.9001 |
| No log | 2.5217 | 58 | 0.8097 | 0.3422 | 0.8097 | 0.8998 |
| No log | 2.6087 | 60 | 0.6935 | 0.3570 | 0.6935 | 0.8328 |
| No log | 2.6957 | 62 | 0.6159 | 0.3537 | 0.6159 | 0.7848 |
| No log | 2.7826 | 64 | 0.6038 | 0.3622 | 0.6038 | 0.7771 |
| No log | 2.8696 | 66 | 0.5996 | 0.4007 | 0.5996 | 0.7743 |
| No log | 2.9565 | 68 | 0.5943 | 0.3812 | 0.5943 | 0.7709 |
| No log | 3.0435 | 70 | 0.5921 | 0.3898 | 0.5921 | 0.7694 |
| No log | 3.1304 | 72 | 0.6042 | 0.3915 | 0.6042 | 0.7773 |
| No log | 3.2174 | 74 | 0.5910 | 0.4246 | 0.5910 | 0.7688 |
| No log | 3.3043 | 76 | 0.6194 | 0.4360 | 0.6194 | 0.7870 |
| No log | 3.3913 | 78 | 0.6849 | 0.4828 | 0.6849 | 0.8276 |
| No log | 3.4783 | 80 | 0.7219 | 0.5036 | 0.7219 | 0.8496 |
| No log | 3.5652 | 82 | 0.7642 | 0.4639 | 0.7642 | 0.8742 |
| No log | 3.6522 | 84 | 0.8401 | 0.4180 | 0.8401 | 0.9166 |
| No log | 3.7391 | 86 | 0.9305 | 0.3886 | 0.9305 | 0.9647 |
| No log | 3.8261 | 88 | 1.0770 | 0.3815 | 1.0770 | 1.0378 |
| No log | 3.9130 | 90 | 1.2188 | 0.3716 | 1.2188 | 1.1040 |
| No log | 4.0 | 92 | 1.2197 | 0.3378 | 1.2197 | 1.1044 |
| No log | 4.0870 | 94 | 1.1489 | 0.3493 | 1.1489 | 1.0719 |
| No log | 4.1739 | 96 | 1.0119 | 0.3491 | 1.0119 | 1.0059 |
| No log | 4.2609 | 98 | 1.0181 | 0.3137 | 1.0181 | 1.0090 |
| No log | 4.3478 | 100 | 1.0421 | 0.3293 | 1.0421 | 1.0208 |
| No log | 4.4348 | 102 | 1.1706 | 0.3332 | 1.1706 | 1.0820 |
| No log | 4.5217 | 104 | 1.2294 | 0.3263 | 1.2294 | 1.1088 |
| No log | 4.6087 | 106 | 1.1801 | 0.3422 | 1.1801 | 1.0863 |
| No log | 4.6957 | 108 | 1.1240 | 0.3524 | 1.1240 | 1.0602 |
| No log | 4.7826 | 110 | 1.1276 | 0.3429 | 1.1276 | 1.0619 |
| No log | 4.8696 | 112 | 0.9882 | 0.4070 | 0.9882 | 0.9941 |
| No log | 4.9565 | 114 | 0.9466 | 0.4263 | 0.9466 | 0.9729 |
| No log | 5.0435 | 116 | 0.9981 | 0.3908 | 0.9981 | 0.9990 |
| No log | 5.1304 | 118 | 1.0370 | 0.4016 | 1.0370 | 1.0183 |
| No log | 5.2174 | 120 | 0.9567 | 0.4227 | 0.9567 | 0.9781 |
| No log | 5.3043 | 122 | 0.8924 | 0.4203 | 0.8924 | 0.9447 |
| No log | 5.3913 | 124 | 0.8350 | 0.4396 | 0.8350 | 0.9138 |
| No log | 5.4783 | 126 | 0.7920 | 0.4332 | 0.7920 | 0.8899 |
| No log | 5.5652 | 128 | 0.7727 | 0.4715 | 0.7727 | 0.8791 |
| No log | 5.6522 | 130 | 0.7724 | 0.4825 | 0.7724 | 0.8788 |
| No log | 5.7391 | 132 | 0.7936 | 0.4720 | 0.7936 | 0.8908 |
| No log | 5.8261 | 134 | 0.8582 | 0.4344 | 0.8582 | 0.9264 |
| No log | 5.9130 | 136 | 0.9540 | 0.4173 | 0.9540 | 0.9767 |
| No log | 6.0 | 138 | 1.0602 | 0.3988 | 1.0602 | 1.0297 |
| No log | 6.0870 | 140 | 1.1745 | 0.3560 | 1.1745 | 1.0838 |
| No log | 6.1739 | 142 | 1.2040 | 0.3513 | 1.2040 | 1.0973 |
| No log | 6.2609 | 144 | 1.1555 | 0.3586 | 1.1555 | 1.0749 |
| No log | 6.3478 | 146 | 1.0164 | 0.3798 | 1.0164 | 1.0082 |
| No log | 6.4348 | 148 | 0.8653 | 0.4533 | 0.8653 | 0.9302 |
| No log | 6.5217 | 150 | 0.8094 | 0.4556 | 0.8094 | 0.8997 |
| No log | 6.6087 | 152 | 0.8350 | 0.4440 | 0.8350 | 0.9138 |
| No log | 6.6957 | 154 | 0.8802 | 0.4303 | 0.8802 | 0.9382 |
| No log | 6.7826 | 156 | 0.8713 | 0.4303 | 0.8713 | 0.9334 |
| No log | 6.8696 | 158 | 0.8534 | 0.4113 | 0.8534 | 0.9238 |
| No log | 6.9565 | 160 | 0.8750 | 0.4303 | 0.8750 | 0.9354 |
| No log | 7.0435 | 162 | 0.9513 | 0.4155 | 0.9513 | 0.9753 |
| No log | 7.1304 | 164 | 1.0220 | 0.4013 | 1.0220 | 1.0109 |
| No log | 7.2174 | 166 | 1.0570 | 0.4018 | 1.0570 | 1.0281 |
| No log | 7.3043 | 168 | 1.0874 | 0.3712 | 1.0874 | 1.0428 |
| No log | 7.3913 | 170 | 1.0394 | 0.4149 | 1.0394 | 1.0195 |
| No log | 7.4783 | 172 | 1.0058 | 0.4166 | 1.0058 | 1.0029 |
| No log | 7.5652 | 174 | 0.9940 | 0.4071 | 0.9940 | 0.9970 |
| No log | 7.6522 | 176 | 0.9708 | 0.3977 | 0.9708 | 0.9853 |
| No log | 7.7391 | 178 | 0.8996 | 0.4484 | 0.8996 | 0.9484 |
| No log | 7.8261 | 180 | 0.8133 | 0.4125 | 0.8133 | 0.9018 |
| No log | 7.9130 | 182 | 0.7649 | 0.4333 | 0.7649 | 0.8746 |
| No log | 8.0 | 184 | 0.7612 | 0.4698 | 0.7612 | 0.8724 |
| No log | 8.0870 | 186 | 0.7924 | 0.4059 | 0.7924 | 0.8902 |
| No log | 8.1739 | 188 | 0.8252 | 0.3929 | 0.8252 | 0.9084 |
| No log | 8.2609 | 190 | 0.8657 | 0.4267 | 0.8657 | 0.9304 |
| No log | 8.3478 | 192 | 0.8972 | 0.4450 | 0.8972 | 0.9472 |
| No log | 8.4348 | 194 | 0.9271 | 0.4703 | 0.9271 | 0.9628 |
| No log | 8.5217 | 196 | 0.9515 | 0.4050 | 0.9515 | 0.9755 |
| No log | 8.6087 | 198 | 0.9979 | 0.3822 | 0.9979 | 0.9989 |
| No log | 8.6957 | 200 | 1.0351 | 0.3822 | 1.0351 | 1.0174 |
| No log | 8.7826 | 202 | 1.0650 | 0.3822 | 1.0650 | 1.0320 |
| No log | 8.8696 | 204 | 1.0792 | 0.3808 | 1.0792 | 1.0388 |
| No log | 8.9565 | 206 | 1.0746 | 0.3758 | 1.0746 | 1.0366 |
| No log | 9.0435 | 208 | 1.0552 | 0.3822 | 1.0552 | 1.0272 |
| No log | 9.1304 | 210 | 1.0515 | 0.3822 | 1.0515 | 1.0254 |
| No log | 9.2174 | 212 | 1.0454 | 0.4056 | 1.0454 | 1.0225 |
| No log | 9.3043 | 214 | 1.0300 | 0.4056 | 1.0300 | 1.0149 |
| No log | 9.3913 | 216 | 1.0149 | 0.4126 | 1.0149 | 1.0074 |
| No log | 9.4783 | 218 | 1.0110 | 0.4126 | 1.0110 | 1.0055 |
| No log | 9.5652 | 220 | 1.0028 | 0.4247 | 1.0028 | 1.0014 |
| No log | 9.6522 | 222 | 1.0028 | 0.4175 | 1.0028 | 1.0014 |
| No log | 9.7391 | 224 | 1.0005 | 0.4175 | 1.0005 | 1.0003 |
| No log | 9.8261 | 226 | 1.0027 | 0.4175 | 1.0027 | 1.0014 |
| No log | 9.9130 | 228 | 1.0018 | 0.4175 | 1.0018 | 1.0009 |
| No log | 10.0 | 230 | 1.0012 | 0.4175 | 1.0012 | 1.0006 |
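The log also lets us infer the rough size of the training set: epoch 2.0 is reached at step 46, so training ran at 23 optimizer steps per epoch, and with a train batch size of 8 that implies roughly 177-184 training examples. This is arithmetic inferred from the table, not a figure stated in the card:

```python
# Inferred from the training log above (an inference, not stated data).
steps_at_epoch_2 = 46
steps_per_epoch = steps_at_epoch_2 // 2          # 23 steps per epoch
batch_size = 8
max_train_examples = steps_per_epoch * batch_size            # 184
min_train_examples = (steps_per_epoch - 1) * batch_size + 1  # 177
```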
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1