# ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k3_task2_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.8467
- Qwk: 0.5477
- Mse: 0.8467
- Rmse: 0.9202
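The card does not say how these metrics are computed, but Qwk is conventionally quadratic weighted kappa and Rmse is the square root of Mse (0.9202 ≈ √0.8467, consistent with the table below). A minimal stdlib-only sketch of both, assuming integer class labels:

```python
import math
from collections import Counter

def qwk(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: agreement between two integer ratings,
    penalising disagreements by squared distance between classes."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    hist_t, hist_p, n = Counter(y_true), Counter(y_pred), len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            E = hist_t[i] * hist_p[j] / n            # expected count under independence
            num += w * O[i][j]
            den += w * E
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy ratings on a 4-class scale (illustrative only, not from this model)
y_true = [0, 1, 2, 3, 3, 2]
y_pred = [0, 1, 1, 3, 2, 2]
print(qwk(y_true, y_pred, 4))
print(mse(y_true, y_pred), math.sqrt(mse(y_true, y_pred)))
```

Perfect agreement yields a kappa of exactly 1.0, and chance-level agreement yields 0.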
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
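The results table below shows 19 optimizer steps per epoch (190 steps over 10 epochs), so the linear scheduler decays the learning rate from 2e-05 to 0 over 190 steps. A small sketch of this schedule, assuming no warmup (the card does not mention any) and the transformers-style linear decay:

```python
def linear_lr(step, total_steps=190, base_lr=2e-05, warmup_steps=0):
    """Transformers-style linear schedule: optional linear warmup,
    then linear decay from base_lr down to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))    # base learning rate at the start
print(linear_lr(95))   # half the base rate at the midpoint
print(linear_lr(190))  # decayed to zero at the final step
```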
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.1053 | 2 | 3.9673 | 0.0094 | 3.9673 | 1.9918 |
| No log | 0.2105 | 4 | 2.6822 | 0.1025 | 2.6822 | 1.6377 |
| No log | 0.3158 | 6 | 1.3908 | 0.1538 | 1.3908 | 1.1793 |
| No log | 0.4211 | 8 | 0.7873 | 0.1007 | 0.7873 | 0.8873 |
| No log | 0.5263 | 10 | 0.6311 | 0.2730 | 0.6311 | 0.7944 |
| No log | 0.6316 | 12 | 0.6469 | 0.2243 | 0.6469 | 0.8043 |
| No log | 0.7368 | 14 | 0.7144 | 0.3627 | 0.7144 | 0.8452 |
| No log | 0.8421 | 16 | 0.8002 | 0.3241 | 0.8002 | 0.8945 |
| No log | 0.9474 | 18 | 0.7107 | 0.4081 | 0.7107 | 0.8430 |
| No log | 1.0526 | 20 | 0.7170 | 0.4110 | 0.7170 | 0.8468 |
| No log | 1.1579 | 22 | 0.7221 | 0.4498 | 0.7221 | 0.8498 |
| No log | 1.2632 | 24 | 0.6036 | 0.4620 | 0.6036 | 0.7769 |
| No log | 1.3684 | 26 | 0.5649 | 0.4986 | 0.5649 | 0.7516 |
| No log | 1.4737 | 28 | 0.5911 | 0.5505 | 0.5911 | 0.7688 |
| No log | 1.5789 | 30 | 0.6415 | 0.5633 | 0.6415 | 0.8009 |
| No log | 1.6842 | 32 | 0.6119 | 0.5738 | 0.6119 | 0.7823 |
| No log | 1.7895 | 34 | 0.6802 | 0.5354 | 0.6802 | 0.8247 |
| No log | 1.8947 | 36 | 0.6793 | 0.5625 | 0.6793 | 0.8242 |
| No log | 2.0 | 38 | 0.6515 | 0.6376 | 0.6515 | 0.8072 |
| No log | 2.1053 | 40 | 0.5963 | 0.5919 | 0.5963 | 0.7722 |
| No log | 2.2105 | 42 | 0.5536 | 0.5850 | 0.5536 | 0.7440 |
| No log | 2.3158 | 44 | 0.5082 | 0.5736 | 0.5082 | 0.7129 |
| No log | 2.4211 | 46 | 0.5296 | 0.5802 | 0.5296 | 0.7278 |
| No log | 2.5263 | 48 | 0.6308 | 0.5490 | 0.6308 | 0.7942 |
| No log | 2.6316 | 50 | 0.5533 | 0.5682 | 0.5533 | 0.7438 |
| No log | 2.7368 | 52 | 0.5295 | 0.5682 | 0.5295 | 0.7277 |
| No log | 2.8421 | 54 | 0.6095 | 0.5580 | 0.6095 | 0.7807 |
| No log | 2.9474 | 56 | 0.5851 | 0.5702 | 0.5851 | 0.7649 |
| No log | 3.0526 | 58 | 0.6289 | 0.5605 | 0.6289 | 0.7930 |
| No log | 3.1579 | 60 | 0.7864 | 0.5339 | 0.7864 | 0.8868 |
| No log | 3.2632 | 62 | 0.6782 | 0.5373 | 0.6782 | 0.8235 |
| No log | 3.3684 | 64 | 0.6301 | 0.5836 | 0.6301 | 0.7938 |
| No log | 3.4737 | 66 | 0.6849 | 0.5883 | 0.6849 | 0.8276 |
| No log | 3.5789 | 68 | 0.7523 | 0.6040 | 0.7523 | 0.8673 |
| No log | 3.6842 | 70 | 0.8357 | 0.5869 | 0.8357 | 0.9142 |
| No log | 3.7895 | 72 | 0.8797 | 0.6072 | 0.8797 | 0.9379 |
| No log | 3.8947 | 74 | 0.9402 | 0.5402 | 0.9402 | 0.9696 |
| No log | 4.0 | 76 | 0.9999 | 0.5100 | 0.9999 | 1.0000 |
| No log | 4.1053 | 78 | 0.9803 | 0.5443 | 0.9803 | 0.9901 |
| No log | 4.2105 | 80 | 0.9204 | 0.5600 | 0.9204 | 0.9594 |
| No log | 4.3158 | 82 | 0.9263 | 0.5294 | 0.9263 | 0.9625 |
| No log | 4.4211 | 84 | 0.8952 | 0.5571 | 0.8952 | 0.9462 |
| No log | 4.5263 | 86 | 0.8823 | 0.5688 | 0.8823 | 0.9393 |
| No log | 4.6316 | 88 | 0.8412 | 0.5495 | 0.8412 | 0.9172 |
| No log | 4.7368 | 90 | 0.8315 | 0.5624 | 0.8315 | 0.9119 |
| No log | 4.8421 | 92 | 0.8203 | 0.5565 | 0.8203 | 0.9057 |
| No log | 4.9474 | 94 | 0.7567 | 0.5837 | 0.7567 | 0.8699 |
| No log | 5.0526 | 96 | 0.7108 | 0.5976 | 0.7108 | 0.8431 |
| No log | 5.1579 | 98 | 0.6743 | 0.6234 | 0.6743 | 0.8211 |
| No log | 5.2632 | 100 | 0.6841 | 0.6083 | 0.6841 | 0.8271 |
| No log | 5.3684 | 102 | 0.7509 | 0.5865 | 0.7509 | 0.8665 |
| No log | 5.4737 | 104 | 0.7275 | 0.5728 | 0.7275 | 0.8530 |
| No log | 5.5789 | 106 | 0.7217 | 0.5693 | 0.7217 | 0.8495 |
| No log | 5.6842 | 108 | 0.6804 | 0.6388 | 0.6804 | 0.8249 |
| No log | 5.7895 | 110 | 0.6936 | 0.5888 | 0.6936 | 0.8328 |
| No log | 5.8947 | 112 | 0.6944 | 0.5851 | 0.6944 | 0.8333 |
| No log | 6.0 | 114 | 0.6965 | 0.6303 | 0.6965 | 0.8346 |
| No log | 6.1053 | 116 | 0.7656 | 0.5464 | 0.7656 | 0.8750 |
| No log | 6.2105 | 118 | 0.8533 | 0.5130 | 0.8533 | 0.9238 |
| No log | 6.3158 | 120 | 0.9362 | 0.5214 | 0.9362 | 0.9676 |
| No log | 6.4211 | 122 | 0.9121 | 0.5257 | 0.9121 | 0.9550 |
| No log | 6.5263 | 124 | 0.8437 | 0.5639 | 0.8437 | 0.9185 |
| No log | 6.6316 | 126 | 0.8342 | 0.5697 | 0.8342 | 0.9134 |
| No log | 6.7368 | 128 | 0.8350 | 0.5778 | 0.8350 | 0.9138 |
| No log | 6.8421 | 130 | 0.8175 | 0.5773 | 0.8175 | 0.9041 |
| No log | 6.9474 | 132 | 0.8173 | 0.5795 | 0.8173 | 0.9041 |
| No log | 7.0526 | 134 | 0.8256 | 0.5562 | 0.8256 | 0.9086 |
| No log | 7.1579 | 136 | 0.8222 | 0.5440 | 0.8222 | 0.9067 |
| No log | 7.2632 | 138 | 0.8116 | 0.5488 | 0.8116 | 0.9009 |
| No log | 7.3684 | 140 | 0.8196 | 0.5306 | 0.8196 | 0.9053 |
| No log | 7.4737 | 142 | 0.7951 | 0.5601 | 0.7951 | 0.8917 |
| No log | 7.5789 | 144 | 0.7689 | 0.5963 | 0.7689 | 0.8769 |
| No log | 7.6842 | 146 | 0.7616 | 0.5854 | 0.7616 | 0.8727 |
| No log | 7.7895 | 148 | 0.7695 | 0.5958 | 0.7695 | 0.8772 |
| No log | 7.8947 | 150 | 0.7971 | 0.5326 | 0.7971 | 0.8928 |
| No log | 8.0 | 152 | 0.8415 | 0.5296 | 0.8415 | 0.9173 |
| No log | 8.1053 | 154 | 0.8715 | 0.5187 | 0.8715 | 0.9335 |
| No log | 8.2105 | 156 | 0.8557 | 0.5249 | 0.8557 | 0.9251 |
| No log | 8.3158 | 158 | 0.8248 | 0.5380 | 0.8248 | 0.9082 |
| No log | 8.4211 | 160 | 0.7961 | 0.5640 | 0.7961 | 0.8922 |
| No log | 8.5263 | 162 | 0.7866 | 0.5841 | 0.7866 | 0.8869 |
| No log | 8.6316 | 164 | 0.7879 | 0.5977 | 0.7879 | 0.8876 |
| No log | 8.7368 | 166 | 0.8029 | 0.5977 | 0.8029 | 0.8961 |
| No log | 8.8421 | 168 | 0.8055 | 0.5850 | 0.8055 | 0.8975 |
| No log | 8.9474 | 170 | 0.8175 | 0.5763 | 0.8175 | 0.9041 |
| No log | 9.0526 | 172 | 0.8334 | 0.5514 | 0.8334 | 0.9129 |
| No log | 9.1579 | 174 | 0.8388 | 0.5490 | 0.8388 | 0.9159 |
| No log | 9.2632 | 176 | 0.8452 | 0.5160 | 0.8452 | 0.9193 |
| No log | 9.3684 | 178 | 0.8450 | 0.5272 | 0.8450 | 0.9193 |
| No log | 9.4737 | 180 | 0.8456 | 0.5272 | 0.8456 | 0.9196 |
| No log | 9.5789 | 182 | 0.8487 | 0.5272 | 0.8487 | 0.9212 |
| No log | 9.6842 | 184 | 0.8460 | 0.5346 | 0.8460 | 0.9198 |
| No log | 9.7895 | 186 | 0.8449 | 0.5477 | 0.8449 | 0.9192 |
| No log | 9.8947 | 188 | 0.8456 | 0.5477 | 0.8456 | 0.9196 |
| No log | 10.0 | 190 | 0.8467 | 0.5477 | 0.8467 | 0.9202 |
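The final checkpoint (loss 0.8467, Qwk 0.5477) is not the strongest row in the table: validation loss bottoms out at step 44 (0.5082) and Qwk peaks at step 108 (0.6388). Since the card does not mention `load_best_model_at_end`, the uploaded weights are presumably from the final step. A sketch of selecting a better checkpoint from logged metrics (rows copied from the table above):

```python
# (step, validation_loss, qwk) for a few rows of the results table
rows = [
    (38, 0.6515, 0.6376),
    (44, 0.5082, 0.5736),
    (108, 0.6804, 0.6388),
    (190, 0.8467, 0.5477),  # final checkpoint
]

best_by_qwk = max(rows, key=lambda r: r[2])   # maximize agreement metric
best_by_loss = min(rows, key=lambda r: r[1])  # minimize validation loss
print(best_by_qwk)
print(best_by_loss)
```

Which criterion to prefer depends on the downstream use: Qwk rewards ordinal agreement, while loss tracks raw squared error.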
## Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
## Base model

MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k3_task2_organization is fine-tuned from [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02).