# ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k4_task5_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.6675
- Qwk: 0.7524
- Mse: 0.6675
- Rmse: 0.8170
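The three reported metrics are closely related: Mse is the mean squared error between predicted and gold scores, Rmse is its square root, and Qwk is Cohen's kappa with quadratic weights, which penalizes large rating disagreements more heavily. A minimal stdlib sketch, using hypothetical toy ratings (not the model's actual data):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Cohen's kappa with quadratic weights over integer ratings."""
    n = max_rating - min_rating + 1
    # Observed confusion matrix of (true, predicted) rating pairs
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1
    # Marginal histograms for the expected (chance) agreement
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_t[i] * hist_p[j] / total
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical gold scores and predictions on a 2-5 scale
y_true = [3, 4, 2, 5, 4, 3]
y_pred = [3, 4, 3, 4, 4, 2]

m = mse(y_true, y_pred)    # 0.5
r = math.sqrt(m)           # ~0.7071
k = quadratic_weighted_kappa(y_true, y_pred, 2, 5)
```

Note that Rmse here is always the square root of Mse, which matches the table below (e.g. 0.8170 ≈ √0.6675).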
## Model description
More information needed
## Intended uses & limitations
More information needed
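Although the intended use is undocumented, the checkpoint can be loaded with the standard `transformers` auto classes. This is a hedged sketch only: the card does not state the task head, so it assumes a single-output regression head scoring essay organization (the example sentence is arbitrary, and downloading the checkpoint requires network access):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = (
    "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_"
    "FineTuningAraBERT_run2_AugV5_k4_task5_organization"
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Score an Arabic text (assumed regression head: one logit = predicted score)
inputs = tokenizer("نص تجريبي", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
```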
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
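With a linear scheduler and no stated warmup, the learning rate decays from 2e-05 to zero over the 190 optimizer steps shown in the results table (10 epochs × 19 batches). A small sketch of that schedule, assuming zero warmup steps since the card does not mention any:

```python
BASE_LR = 2e-5
TOTAL_STEPS = 190  # final "Step" value in the results table below

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

halfway = linear_lr(95)  # halfway through training: 1e-05
```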
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.1053 | 2 | 2.1414 | -0.0366 | 2.1414 | 1.4634 |
| No log | 0.2105 | 4 | 1.5026 | 0.1988 | 1.5026 | 1.2258 |
| No log | 0.3158 | 6 | 1.5402 | 0.1701 | 1.5402 | 1.2410 |
| No log | 0.4211 | 8 | 1.6605 | 0.1913 | 1.6605 | 1.2886 |
| No log | 0.5263 | 10 | 1.6187 | 0.3313 | 1.6187 | 1.2723 |
| No log | 0.6316 | 12 | 1.3881 | 0.2660 | 1.3881 | 1.1782 |
| No log | 0.7368 | 14 | 1.2697 | 0.2024 | 1.2697 | 1.1268 |
| No log | 0.8421 | 16 | 1.2324 | 0.1833 | 1.2324 | 1.1101 |
| No log | 0.9474 | 18 | 1.1768 | 0.3222 | 1.1768 | 1.0848 |
| No log | 1.0526 | 20 | 1.0956 | 0.4003 | 1.0956 | 1.0467 |
| No log | 1.1579 | 22 | 1.0042 | 0.4526 | 1.0042 | 1.0021 |
| No log | 1.2632 | 24 | 0.9121 | 0.5020 | 0.9121 | 0.9550 |
| No log | 1.3684 | 26 | 0.8841 | 0.5338 | 0.8841 | 0.9402 |
| No log | 1.4737 | 28 | 0.9223 | 0.5938 | 0.9223 | 0.9604 |
| No log | 1.5789 | 30 | 0.7955 | 0.6390 | 0.7955 | 0.8919 |
| No log | 1.6842 | 32 | 0.7909 | 0.6646 | 0.7909 | 0.8893 |
| No log | 1.7895 | 34 | 0.7963 | 0.6301 | 0.7963 | 0.8924 |
| No log | 1.8947 | 36 | 0.8144 | 0.6062 | 0.8144 | 0.9024 |
| No log | 2.0 | 38 | 0.7819 | 0.6138 | 0.7819 | 0.8842 |
| No log | 2.1053 | 40 | 0.7939 | 0.6083 | 0.7939 | 0.8910 |
| No log | 2.2105 | 42 | 0.9838 | 0.6118 | 0.9838 | 0.9919 |
| No log | 2.3158 | 44 | 1.3096 | 0.5509 | 1.3096 | 1.1444 |
| No log | 2.4211 | 46 | 1.1906 | 0.5632 | 1.1906 | 1.0911 |
| No log | 2.5263 | 48 | 0.8502 | 0.6613 | 0.8502 | 0.9220 |
| No log | 2.6316 | 50 | 0.7552 | 0.7231 | 0.7552 | 0.8690 |
| No log | 2.7368 | 52 | 0.8079 | 0.6743 | 0.8079 | 0.8988 |
| No log | 2.8421 | 54 | 0.9290 | 0.6527 | 0.9290 | 0.9638 |
| No log | 2.9474 | 56 | 0.8550 | 0.6807 | 0.8550 | 0.9247 |
| No log | 3.0526 | 58 | 0.7125 | 0.7060 | 0.7125 | 0.8441 |
| No log | 3.1579 | 60 | 0.6894 | 0.7183 | 0.6894 | 0.8303 |
| No log | 3.2632 | 62 | 0.7292 | 0.7039 | 0.7292 | 0.8539 |
| No log | 3.3684 | 64 | 0.7778 | 0.7092 | 0.7778 | 0.8819 |
| No log | 3.4737 | 66 | 0.7113 | 0.6988 | 0.7113 | 0.8434 |
| No log | 3.5789 | 68 | 0.6831 | 0.6946 | 0.6831 | 0.8265 |
| No log | 3.6842 | 70 | 0.6848 | 0.6993 | 0.6848 | 0.8276 |
| No log | 3.7895 | 72 | 0.7707 | 0.7207 | 0.7707 | 0.8779 |
| No log | 3.8947 | 74 | 1.0486 | 0.6540 | 1.0486 | 1.0240 |
| No log | 4.0 | 76 | 1.0505 | 0.6490 | 1.0505 | 1.0249 |
| No log | 4.1053 | 78 | 0.8648 | 0.7006 | 0.8648 | 0.9299 |
| No log | 4.2105 | 80 | 0.6926 | 0.7183 | 0.6926 | 0.8322 |
| No log | 4.3158 | 82 | 0.6722 | 0.7236 | 0.6722 | 0.8199 |
| No log | 4.4211 | 84 | 0.6806 | 0.7098 | 0.6806 | 0.8250 |
| No log | 4.5263 | 86 | 0.7511 | 0.7020 | 0.7511 | 0.8667 |
| No log | 4.6316 | 88 | 0.9051 | 0.6477 | 0.9051 | 0.9513 |
| No log | 4.7368 | 90 | 0.9295 | 0.6603 | 0.9295 | 0.9641 |
| No log | 4.8421 | 92 | 0.7987 | 0.7080 | 0.7987 | 0.8937 |
| No log | 4.9474 | 94 | 0.7268 | 0.7251 | 0.7268 | 0.8525 |
| No log | 5.0526 | 96 | 0.7203 | 0.7335 | 0.7203 | 0.8487 |
| No log | 5.1579 | 98 | 0.7136 | 0.7375 | 0.7136 | 0.8447 |
| No log | 5.2632 | 100 | 0.6877 | 0.7370 | 0.6877 | 0.8293 |
| No log | 5.3684 | 102 | 0.6865 | 0.7291 | 0.6865 | 0.8286 |
| No log | 5.4737 | 104 | 0.7784 | 0.7133 | 0.7784 | 0.8823 |
| No log | 5.5789 | 106 | 0.9155 | 0.6782 | 0.9155 | 0.9568 |
| No log | 5.6842 | 108 | 0.9501 | 0.6672 | 0.9501 | 0.9747 |
| No log | 5.7895 | 110 | 0.8647 | 0.6769 | 0.8647 | 0.9299 |
| No log | 5.8947 | 112 | 0.7154 | 0.7422 | 0.7154 | 0.8458 |
| No log | 6.0 | 114 | 0.6352 | 0.7260 | 0.6352 | 0.7970 |
| No log | 6.1053 | 116 | 0.6756 | 0.7395 | 0.6756 | 0.8219 |
| No log | 6.2105 | 118 | 0.6838 | 0.7040 | 0.6838 | 0.8269 |
| No log | 6.3158 | 120 | 0.6344 | 0.7305 | 0.6344 | 0.7965 |
| No log | 6.4211 | 122 | 0.6380 | 0.7359 | 0.6380 | 0.7988 |
| No log | 6.5263 | 124 | 0.7472 | 0.7399 | 0.7472 | 0.8644 |
| No log | 6.6316 | 126 | 0.7882 | 0.7368 | 0.7882 | 0.8878 |
| No log | 6.7368 | 128 | 0.7457 | 0.7343 | 0.7457 | 0.8635 |
| No log | 6.8421 | 130 | 0.6803 | 0.7467 | 0.6803 | 0.8248 |
| No log | 6.9474 | 132 | 0.6668 | 0.7416 | 0.6668 | 0.8166 |
| No log | 7.0526 | 134 | 0.6845 | 0.7490 | 0.6845 | 0.8273 |
| No log | 7.1579 | 136 | 0.6811 | 0.7335 | 0.6811 | 0.8253 |
| No log | 7.2632 | 138 | 0.6608 | 0.7443 | 0.6608 | 0.8129 |
| No log | 7.3684 | 140 | 0.6622 | 0.7457 | 0.6622 | 0.8138 |
| No log | 7.4737 | 142 | 0.6478 | 0.7364 | 0.6478 | 0.8048 |
| No log | 7.5789 | 144 | 0.6517 | 0.7443 | 0.6517 | 0.8073 |
| No log | 7.6842 | 146 | 0.6616 | 0.7420 | 0.6616 | 0.8134 |
| No log | 7.7895 | 148 | 0.6832 | 0.7392 | 0.6832 | 0.8265 |
| No log | 7.8947 | 150 | 0.7262 | 0.7644 | 0.7262 | 0.8522 |
| No log | 8.0 | 152 | 0.7482 | 0.7686 | 0.7482 | 0.8650 |
| No log | 8.1053 | 154 | 0.7216 | 0.7602 | 0.7216 | 0.8495 |
| No log | 8.2105 | 156 | 0.6763 | 0.7599 | 0.6763 | 0.8224 |
| No log | 8.3158 | 158 | 0.6504 | 0.7453 | 0.6504 | 0.8065 |
| No log | 8.4211 | 160 | 0.6450 | 0.7497 | 0.6450 | 0.8031 |
| No log | 8.5263 | 162 | 0.6512 | 0.7430 | 0.6512 | 0.8069 |
| No log | 8.6316 | 164 | 0.6584 | 0.7467 | 0.6584 | 0.8114 |
| No log | 8.7368 | 166 | 0.6686 | 0.7524 | 0.6686 | 0.8177 |
| No log | 8.8421 | 168 | 0.6707 | 0.7524 | 0.6707 | 0.8190 |
| No log | 8.9474 | 170 | 0.6831 | 0.7524 | 0.6831 | 0.8265 |
| No log | 9.0526 | 172 | 0.6784 | 0.7524 | 0.6784 | 0.8236 |
| No log | 9.1579 | 174 | 0.6695 | 0.7487 | 0.6695 | 0.8183 |
| No log | 9.2632 | 176 | 0.6647 | 0.7473 | 0.6647 | 0.8153 |
| No log | 9.3684 | 178 | 0.6635 | 0.7473 | 0.6635 | 0.8145 |
| No log | 9.4737 | 180 | 0.6632 | 0.7487 | 0.6632 | 0.8144 |
| No log | 9.5789 | 182 | 0.6639 | 0.7487 | 0.6639 | 0.8148 |
| No log | 9.6842 | 184 | 0.6640 | 0.7487 | 0.6640 | 0.8148 |
| No log | 9.7895 | 186 | 0.6664 | 0.7524 | 0.6664 | 0.8163 |
| No log | 9.8947 | 188 | 0.6674 | 0.7524 | 0.6674 | 0.8170 |
| No log | 10.0 | 190 | 0.6675 | 0.7524 | 0.6675 | 0.8170 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1