# ArabicNewSplits5_FineTuningAraBERT_run3_AugV5_k3_task1_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 0.7450
- Qwk: 0.7077
- Mse: 0.7450
- Rmse: 0.8632
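Below is a minimal, hedged sketch of loading this checkpoint for inference with the `transformers` API. The head type (single-output regression vs. classification) is an assumption inferred from the MSE/QWK metrics; the card does not document the task setup.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Model id from this card; the head type is assumed, not documented.
model_id = "MayBashendy/ArabicNewSplits5_FineTuningAraBERT_run3_AugV5_k3_task1_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # an Arabic passage to score (placeholder)
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# If this is a single-unit regression head (consistent with the MSE/QWK metrics),
# the raw logit is the predicted score; otherwise take an argmax over classes.
print(logits)
```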
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
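The listing below is a hedged sketch of an equivalent `TrainingArguments` configuration (argument names per Transformers 4.44.2). The dataset, model, and metric function are not documented in this card, so they are left as placeholders.

```python
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
    eval_strategy="steps",
    eval_steps=2,  # the results table logs evaluation every 2 steps
)

# trainer = Trainer(
#     model=model,                  # fine-tuned from aubmindlab/bert-base-arabertv02
#     args=training_args,
#     train_dataset=train_dataset,  # not documented in this card
#     eval_dataset=eval_dataset,    # not documented in this card
#     compute_metrics=compute_metrics,
# )
# trainer.train()
```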
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.0909 | 2 | 5.4652 | -0.0207 | 5.4652 | 2.3378 |
| No log | 0.1818 | 4 | 3.3809 | 0.0611 | 3.3809 | 1.8387 |
| No log | 0.2727 | 6 | 1.9927 | 0.0840 | 1.9927 | 1.4116 |
| No log | 0.3636 | 8 | 1.5362 | 0.0187 | 1.5362 | 1.2394 |
| No log | 0.4545 | 10 | 1.3395 | 0.1263 | 1.3395 | 1.1574 |
| No log | 0.5455 | 12 | 1.2577 | 0.2686 | 1.2577 | 1.1215 |
| No log | 0.6364 | 14 | 1.2555 | 0.2541 | 1.2555 | 1.1205 |
| No log | 0.7273 | 16 | 1.3868 | 0.1104 | 1.3868 | 1.1776 |
| No log | 0.8182 | 18 | 1.5574 | 0.1270 | 1.5574 | 1.2480 |
| No log | 0.9091 | 20 | 1.4184 | 0.1834 | 1.4184 | 1.1910 |
| No log | 1.0 | 22 | 1.2148 | 0.2866 | 1.2148 | 1.1022 |
| No log | 1.0909 | 24 | 1.2918 | 0.2892 | 1.2918 | 1.1366 |
| No log | 1.1818 | 26 | 1.1910 | 0.3524 | 1.1910 | 1.0913 |
| No log | 1.2727 | 28 | 1.0363 | 0.5231 | 1.0363 | 1.0180 |
| No log | 1.3636 | 30 | 1.0043 | 0.5238 | 1.0043 | 1.0021 |
| No log | 1.4545 | 32 | 0.9816 | 0.5244 | 0.9816 | 0.9908 |
| No log | 1.5455 | 34 | 1.2631 | 0.4983 | 1.2631 | 1.1239 |
| No log | 1.6364 | 36 | 1.4017 | 0.4388 | 1.4017 | 1.1839 |
| No log | 1.7273 | 38 | 1.0116 | 0.5560 | 1.0116 | 1.0058 |
| No log | 1.8182 | 40 | 0.8942 | 0.5624 | 0.8942 | 0.9456 |
| No log | 1.9091 | 42 | 0.9205 | 0.5587 | 0.9205 | 0.9594 |
| No log | 2.0 | 44 | 0.9202 | 0.5671 | 0.9202 | 0.9593 |
| No log | 2.0909 | 46 | 0.8065 | 0.5973 | 0.8065 | 0.8981 |
| No log | 2.1818 | 48 | 0.8410 | 0.5766 | 0.8410 | 0.9171 |
| No log | 2.2727 | 50 | 0.8365 | 0.5764 | 0.8365 | 0.9146 |
| No log | 2.3636 | 52 | 0.8466 | 0.5711 | 0.8466 | 0.9201 |
| No log | 2.4545 | 54 | 0.8609 | 0.5951 | 0.8609 | 0.9278 |
| No log | 2.5455 | 56 | 0.7560 | 0.6394 | 0.7560 | 0.8695 |
| No log | 2.6364 | 58 | 0.6699 | 0.6566 | 0.6699 | 0.8184 |
| No log | 2.7273 | 60 | 0.6846 | 0.6070 | 0.6846 | 0.8274 |
| No log | 2.8182 | 62 | 0.6367 | 0.6861 | 0.6367 | 0.7980 |
| No log | 2.9091 | 64 | 0.6384 | 0.6861 | 0.6384 | 0.7990 |
| No log | 3.0 | 66 | 0.8295 | 0.6169 | 0.8295 | 0.9108 |
| No log | 3.0909 | 68 | 0.8862 | 0.6026 | 0.8862 | 0.9414 |
| No log | 3.1818 | 70 | 0.7573 | 0.6408 | 0.7573 | 0.8702 |
| No log | 3.2727 | 72 | 0.6174 | 0.6932 | 0.6174 | 0.7857 |
| No log | 3.3636 | 74 | 0.7613 | 0.6102 | 0.7613 | 0.8725 |
| No log | 3.4545 | 76 | 0.8870 | 0.5083 | 0.8870 | 0.9418 |
| No log | 3.5455 | 78 | 0.7687 | 0.6105 | 0.7687 | 0.8768 |
| No log | 3.6364 | 80 | 0.6294 | 0.7255 | 0.6294 | 0.7933 |
| No log | 3.7273 | 82 | 0.7687 | 0.6987 | 0.7687 | 0.8767 |
| No log | 3.8182 | 84 | 0.8426 | 0.6412 | 0.8426 | 0.9179 |
| No log | 3.9091 | 86 | 0.7471 | 0.6967 | 0.7471 | 0.8643 |
| No log | 4.0 | 88 | 0.6445 | 0.7367 | 0.6445 | 0.8028 |
| No log | 4.0909 | 90 | 0.7149 | 0.7045 | 0.7149 | 0.8455 |
| No log | 4.1818 | 92 | 0.8078 | 0.6342 | 0.8078 | 0.8988 |
| No log | 4.2727 | 94 | 0.7739 | 0.6719 | 0.7739 | 0.8797 |
| No log | 4.3636 | 96 | 0.6948 | 0.7430 | 0.6948 | 0.8335 |
| No log | 4.4545 | 98 | 0.6851 | 0.7240 | 0.6851 | 0.8277 |
| No log | 4.5455 | 100 | 0.6741 | 0.7416 | 0.6741 | 0.8210 |
| No log | 4.6364 | 102 | 0.6582 | 0.7407 | 0.6582 | 0.8113 |
| No log | 4.7273 | 104 | 0.6647 | 0.7205 | 0.6647 | 0.8153 |
| No log | 4.8182 | 106 | 0.6671 | 0.6988 | 0.6671 | 0.8167 |
| No log | 4.9091 | 108 | 0.6605 | 0.7001 | 0.6605 | 0.8127 |
| No log | 5.0 | 110 | 0.6552 | 0.7013 | 0.6552 | 0.8094 |
| No log | 5.0909 | 112 | 0.6655 | 0.6977 | 0.6655 | 0.8158 |
| No log | 5.1818 | 114 | 0.6892 | 0.6603 | 0.6892 | 0.8302 |
| No log | 5.2727 | 116 | 0.7353 | 0.6719 | 0.7353 | 0.8575 |
| No log | 5.3636 | 118 | 0.7838 | 0.6623 | 0.7838 | 0.8854 |
| No log | 5.4545 | 120 | 0.8208 | 0.6661 | 0.8208 | 0.9060 |
| No log | 5.5455 | 122 | 0.7975 | 0.6609 | 0.7975 | 0.8930 |
| No log | 5.6364 | 124 | 0.7567 | 0.7126 | 0.7567 | 0.8699 |
| No log | 5.7273 | 126 | 0.7276 | 0.7239 | 0.7276 | 0.8530 |
| No log | 5.8182 | 128 | 0.7156 | 0.7173 | 0.7156 | 0.8459 |
| No log | 5.9091 | 130 | 0.7048 | 0.7236 | 0.7048 | 0.8395 |
| No log | 6.0 | 132 | 0.6983 | 0.7116 | 0.6983 | 0.8356 |
| No log | 6.0909 | 134 | 0.7042 | 0.7117 | 0.7042 | 0.8392 |
| No log | 6.1818 | 136 | 0.7342 | 0.6825 | 0.7342 | 0.8568 |
| No log | 6.2727 | 138 | 0.7190 | 0.6993 | 0.7190 | 0.8479 |
| No log | 6.3636 | 140 | 0.7000 | 0.7108 | 0.7000 | 0.8367 |
| No log | 6.4545 | 142 | 0.7061 | 0.7144 | 0.7061 | 0.8403 |
| No log | 6.5455 | 144 | 0.7106 | 0.7179 | 0.7106 | 0.8430 |
| No log | 6.6364 | 146 | 0.7249 | 0.7159 | 0.7249 | 0.8514 |
| No log | 6.7273 | 148 | 0.7462 | 0.7140 | 0.7462 | 0.8638 |
| No log | 6.8182 | 150 | 0.7533 | 0.7169 | 0.7533 | 0.8679 |
| No log | 6.9091 | 152 | 0.7582 | 0.7266 | 0.7582 | 0.8707 |
| No log | 7.0 | 154 | 0.7649 | 0.7113 | 0.7649 | 0.8746 |
| No log | 7.0909 | 156 | 0.7746 | 0.6998 | 0.7746 | 0.8801 |
| No log | 7.1818 | 158 | 0.7774 | 0.6830 | 0.7774 | 0.8817 |
| No log | 7.2727 | 160 | 0.7809 | 0.6865 | 0.7809 | 0.8837 |
| No log | 7.3636 | 162 | 0.7808 | 0.7012 | 0.7808 | 0.8836 |
| No log | 7.4545 | 164 | 0.7811 | 0.6928 | 0.7811 | 0.8838 |
| No log | 7.5455 | 166 | 0.7881 | 0.6877 | 0.7881 | 0.8877 |
| No log | 7.6364 | 168 | 0.7932 | 0.6906 | 0.7932 | 0.8906 |
| No log | 7.7273 | 170 | 0.7866 | 0.6906 | 0.7866 | 0.8869 |
| No log | 7.8182 | 172 | 0.7859 | 0.7082 | 0.7859 | 0.8865 |
| No log | 7.9091 | 174 | 0.7787 | 0.7052 | 0.7787 | 0.8824 |
| No log | 8.0 | 176 | 0.7686 | 0.6933 | 0.7686 | 0.8767 |
| No log | 8.0909 | 178 | 0.7657 | 0.6909 | 0.7657 | 0.8751 |
| No log | 8.1818 | 180 | 0.7693 | 0.6872 | 0.7693 | 0.8771 |
| No log | 8.2727 | 182 | 0.7769 | 0.6872 | 0.7769 | 0.8814 |
| No log | 8.3636 | 184 | 0.7849 | 0.6836 | 0.7849 | 0.8860 |
| No log | 8.4545 | 186 | 0.7939 | 0.6926 | 0.7939 | 0.8910 |
| No log | 8.5455 | 188 | 0.7947 | 0.7016 | 0.7947 | 0.8915 |
| No log | 8.6364 | 190 | 0.7915 | 0.7016 | 0.7915 | 0.8897 |
| No log | 8.7273 | 192 | 0.7827 | 0.6973 | 0.7827 | 0.8847 |
| No log | 8.8182 | 194 | 0.7708 | 0.7108 | 0.7708 | 0.8779 |
| No log | 8.9091 | 196 | 0.7620 | 0.7048 | 0.7620 | 0.8729 |
| No log | 9.0 | 198 | 0.7530 | 0.7031 | 0.7530 | 0.8677 |
| No log | 9.0909 | 200 | 0.7490 | 0.6970 | 0.7490 | 0.8655 |
| No log | 9.1818 | 202 | 0.7457 | 0.6970 | 0.7457 | 0.8635 |
| No log | 9.2727 | 204 | 0.7441 | 0.6970 | 0.7441 | 0.8626 |
| No log | 9.3636 | 206 | 0.7453 | 0.6982 | 0.7453 | 0.8633 |
| No log | 9.4545 | 208 | 0.7445 | 0.6994 | 0.7445 | 0.8628 |
| No log | 9.5455 | 210 | 0.7434 | 0.7138 | 0.7434 | 0.8622 |
| No log | 9.6364 | 212 | 0.7429 | 0.7138 | 0.7429 | 0.8619 |
| No log | 9.7273 | 214 | 0.7431 | 0.7077 | 0.7431 | 0.8620 |
| No log | 9.8182 | 216 | 0.7437 | 0.7077 | 0.7437 | 0.8624 |
| No log | 9.9091 | 218 | 0.7446 | 0.7077 | 0.7446 | 0.8629 |
| No log | 10.0 | 220 | 0.7450 | 0.7077 | 0.7450 | 0.8632 |
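As a hedged illustration of how the Qwk, Mse, and Rmse columns are typically computed (the exact evaluation code for this run is not included in the card): QWK via scikit-learn's `cohen_kappa_score` with quadratic weights, MSE via `mean_squared_error`, and RMSE as the square root of MSE.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    # Rounding continuous predictions to integer labels before QWK is a common
    # choice for regression-style scoring; it is an assumption here.
    qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```

For example, the final row above reports Mse = 0.7450 and Rmse = 0.8632, consistent with rmse = sqrt(mse).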
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1