# ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k5_task3_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (see the usage sketch after the metrics):
- Loss: 0.8963
- Qwk: 0.1736
- Mse: 0.8963
- Rmse: 0.9467
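The snippet below is a minimal inference sketch, not the authors' published usage code: it assumes the checkpoint exposes a single-logit regression-style head (plausible given the MSE/RMSE/QWK metrics above), and the repository id is taken from the card title. The example text is a placeholder.

```python
# Hedged usage sketch: verify num_labels in the checkpoint config before relying on this.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k5_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # placeholder: an Arabic response to be scored for organization
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumes a single-output head
print(score)
```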
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hypothetical `Trainer` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
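As a rough guide, the hyperparameters above map onto the Hugging Face `Trainer` roughly as follows. This is a hedged reconstruction under stated assumptions, not the actual training script: the dataset splits, output directory name, and `num_labels` are placeholders, and the Adam betas/epsilon listed above are the `transformers` defaults.

```python
# Hypothetical reconstruction of the training setup; the real script and data are not published.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          TrainingArguments)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)  # assumption

args = TrainingArguments(
    output_dir="arabert-task3-organization",  # placeholder name
    learning_rate=2e-5,                       # learning_rate
    per_device_train_batch_size=8,            # train_batch_size
    per_device_eval_batch_size=8,             # eval_batch_size
    num_train_epochs=10,                      # num_epochs
    seed=42,                                  # seed
    lr_scheduler_type="linear",               # lr_scheduler_type
)

# train_dataset / eval_dataset are placeholders for the unpublished tokenized splits:
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```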
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.08 | 2 | 3.3350 | -0.0258 | 3.3350 | 1.8262 |
| No log | 0.16 | 4 | 1.7722 | -0.0101 | 1.7722 | 1.3312 |
| No log | 0.24 | 6 | 1.2167 | 0.0255 | 1.2167 | 1.1030 |
| No log | 0.32 | 8 | 1.6673 | 0.0390 | 1.6673 | 1.2912 |
| No log | 0.4 | 10 | 2.3463 | 0.0793 | 2.3463 | 1.5318 |
| No log | 0.48 | 12 | 1.1652 | 0.0 | 1.1652 | 1.0794 |
| No log | 0.56 | 14 | 0.6792 | 0.1724 | 0.6792 | 0.8242 |
| No log | 0.64 | 16 | 0.5965 | 0.0 | 0.5965 | 0.7723 |
| No log | 0.72 | 18 | 0.5670 | 0.0 | 0.5670 | 0.7530 |
| No log | 0.8 | 20 | 0.5668 | -0.0081 | 0.5668 | 0.7529 |
| No log | 0.88 | 22 | 0.6409 | 0.0 | 0.6409 | 0.8005 |
| No log | 0.96 | 24 | 0.6173 | 0.0222 | 0.6173 | 0.7857 |
| No log | 1.04 | 26 | 0.7052 | 0.0409 | 0.7052 | 0.8397 |
| No log | 1.12 | 28 | 0.7815 | 0.0145 | 0.7815 | 0.8840 |
| No log | 1.2 | 30 | 0.6089 | 0.2418 | 0.6089 | 0.7803 |
| No log | 1.28 | 32 | 0.5907 | 0.1385 | 0.5907 | 0.7686 |
| No log | 1.36 | 34 | 0.7185 | 0.2485 | 0.7185 | 0.8476 |
| No log | 1.44 | 36 | 0.6760 | 0.2000 | 0.6760 | 0.8222 |
| No log | 1.52 | 38 | 0.5871 | 0.2308 | 0.5871 | 0.7662 |
| No log | 1.6 | 40 | 0.9571 | 0.1290 | 0.9571 | 0.9783 |
| No log | 1.68 | 42 | 0.7634 | 0.1238 | 0.7634 | 0.8737 |
| No log | 1.76 | 44 | 0.6060 | 0.2889 | 0.6060 | 0.7785 |
| No log | 1.84 | 46 | 0.6206 | 0.2516 | 0.6206 | 0.7878 |
| No log | 1.92 | 48 | 0.6676 | 0.0983 | 0.6676 | 0.8171 |
| No log | 2.0 | 50 | 0.6673 | 0.2821 | 0.6673 | 0.8169 |
| No log | 2.08 | 52 | 0.8527 | -0.0595 | 0.8527 | 0.9234 |
| No log | 2.16 | 54 | 0.8476 | -0.0909 | 0.8476 | 0.9206 |
| No log | 2.24 | 56 | 0.5771 | 0.2632 | 0.5771 | 0.7597 |
| No log | 2.32 | 58 | 0.6328 | 0.0108 | 0.6328 | 0.7955 |
| No log | 2.4 | 60 | 0.6786 | 0.0359 | 0.6786 | 0.8238 |
| No log | 2.48 | 62 | 0.7128 | 0.0680 | 0.7128 | 0.8443 |
| No log | 2.56 | 64 | 0.5723 | 0.4556 | 0.5723 | 0.7565 |
| No log | 2.64 | 66 | 0.7020 | 0.1746 | 0.7020 | 0.8378 |
| No log | 2.72 | 68 | 0.6541 | 0.3224 | 0.6541 | 0.8088 |
| No log | 2.8 | 70 | 0.6134 | 0.2184 | 0.6134 | 0.7832 |
| No log | 2.88 | 72 | 0.7513 | 0.0578 | 0.7513 | 0.8668 |
| No log | 2.96 | 74 | 0.6110 | 0.2258 | 0.6110 | 0.7816 |
| No log | 3.04 | 76 | 0.6326 | 0.2967 | 0.6326 | 0.7953 |
| No log | 3.12 | 78 | 0.7566 | 0.1429 | 0.7566 | 0.8698 |
| No log | 3.2 | 80 | 0.5921 | 0.4409 | 0.5921 | 0.7695 |
| No log | 3.28 | 82 | 0.5527 | 0.3478 | 0.5527 | 0.7434 |
| No log | 3.36 | 84 | 0.5819 | 0.3478 | 0.5819 | 0.7628 |
| No log | 3.44 | 86 | 0.8029 | 0.0654 | 0.8029 | 0.8960 |
| No log | 3.52 | 88 | 1.1760 | 0.0178 | 1.1760 | 1.0844 |
| No log | 3.6 | 90 | 1.0855 | 0.0790 | 1.0855 | 1.0419 |
| No log | 3.68 | 92 | 0.7676 | 0.3188 | 0.7676 | 0.8761 |
| No log | 3.76 | 94 | 0.6331 | 0.3299 | 0.6331 | 0.7957 |
| No log | 3.84 | 96 | 0.6438 | 0.3462 | 0.6438 | 0.8023 |
| No log | 3.92 | 98 | 0.9316 | 0.1278 | 0.9316 | 0.9652 |
| No log | 4.0 | 100 | 1.0614 | 0.0831 | 1.0614 | 1.0303 |
| No log | 4.08 | 102 | 0.8179 | 0.1525 | 0.8179 | 0.9044 |
| No log | 4.16 | 104 | 0.6224 | 0.3951 | 0.6224 | 0.7889 |
| No log | 4.24 | 106 | 0.6481 | 0.4010 | 0.6481 | 0.8051 |
| No log | 4.32 | 108 | 0.8782 | 0.1220 | 0.8782 | 0.9371 |
| No log | 4.4 | 110 | 1.0105 | 0.1014 | 1.0105 | 1.0052 |
| No log | 4.48 | 112 | 1.0695 | 0.1014 | 1.0695 | 1.0342 |
| No log | 4.56 | 114 | 0.9170 | 0.1939 | 0.9170 | 0.9576 |
| No log | 4.64 | 116 | 0.9819 | 0.0882 | 0.9819 | 0.9909 |
| No log | 4.72 | 118 | 0.9113 | 0.2569 | 0.9113 | 0.9546 |
| No log | 4.8 | 120 | 0.8898 | 0.2903 | 0.8898 | 0.9433 |
| No log | 4.88 | 122 | 1.0287 | 0.1241 | 1.0287 | 1.0143 |
| No log | 4.96 | 124 | 0.9667 | 0.2126 | 0.9667 | 0.9832 |
| No log | 5.04 | 126 | 0.7501 | 0.3514 | 0.7501 | 0.8661 |
| No log | 5.12 | 128 | 0.8012 | 0.1861 | 0.8012 | 0.8951 |
| No log | 5.2 | 130 | 0.8178 | 0.1930 | 0.8178 | 0.9043 |
| No log | 5.28 | 132 | 1.0378 | 0.1331 | 1.0378 | 1.0187 |
| No log | 5.36 | 134 | 1.1549 | 0.0530 | 1.1549 | 1.0747 |
| No log | 5.44 | 136 | 1.0429 | 0.0769 | 1.0429 | 1.0212 |
| No log | 5.52 | 138 | 0.7929 | 0.1928 | 0.7929 | 0.8905 |
| No log | 5.6 | 140 | 0.7470 | 0.2830 | 0.7470 | 0.8643 |
| No log | 5.68 | 142 | 0.8859 | 0.1795 | 0.8859 | 0.9412 |
| No log | 5.76 | 144 | 1.1328 | 0.1111 | 1.1328 | 1.0644 |
| No log | 5.84 | 146 | 1.0759 | 0.0791 | 1.0759 | 1.0372 |
| No log | 5.92 | 148 | 0.8187 | 0.1705 | 0.8187 | 0.9048 |
| No log | 6.0 | 150 | 0.7301 | 0.3433 | 0.7301 | 0.8545 |
| No log | 6.08 | 152 | 0.6831 | 0.3548 | 0.6831 | 0.8265 |
| No log | 6.16 | 154 | 0.7160 | 0.3433 | 0.7160 | 0.8461 |
| No log | 6.24 | 156 | 0.8565 | 0.1392 | 0.8565 | 0.9255 |
| No log | 6.32 | 158 | 0.8876 | 0.1169 | 0.8876 | 0.9421 |
| No log | 6.4 | 160 | 0.7593 | 0.3103 | 0.7593 | 0.8714 |
| No log | 6.48 | 162 | 0.7757 | 0.2692 | 0.7757 | 0.8808 |
| No log | 6.56 | 164 | 0.8268 | 0.2364 | 0.8268 | 0.9093 |
| No log | 6.64 | 166 | 0.7779 | 0.2692 | 0.7779 | 0.8820 |
| No log | 6.72 | 168 | 0.8085 | 0.2294 | 0.8085 | 0.8992 |
| No log | 6.8 | 170 | 0.8454 | 0.2146 | 0.8454 | 0.9195 |
| No log | 6.88 | 172 | 0.9151 | 0.0598 | 0.9151 | 0.9566 |
| No log | 6.96 | 174 | 0.8822 | 0.0569 | 0.8822 | 0.9393 |
| No log | 7.04 | 176 | 0.7859 | 0.2830 | 0.7859 | 0.8865 |
| No log | 7.12 | 178 | 0.7581 | 0.2300 | 0.7581 | 0.8707 |
| No log | 7.2 | 180 | 0.7590 | 0.2692 | 0.7590 | 0.8712 |
| No log | 7.28 | 182 | 0.7170 | 0.2692 | 0.7170 | 0.8468 |
| No log | 7.36 | 184 | 0.7227 | 0.2692 | 0.7227 | 0.8501 |
| No log | 7.44 | 186 | 0.7160 | 0.2692 | 0.7160 | 0.8461 |
| No log | 7.52 | 188 | 0.7300 | 0.2780 | 0.7300 | 0.8544 |
| No log | 7.6 | 190 | 0.8455 | 0.1799 | 0.8455 | 0.9195 |
| No log | 7.68 | 192 | 0.8673 | 0.1148 | 0.8673 | 0.9313 |
| No log | 7.76 | 194 | 0.7691 | 0.2830 | 0.7691 | 0.8770 |
| No log | 7.84 | 196 | 0.6987 | 0.2986 | 0.6987 | 0.8359 |
| No log | 7.92 | 198 | 0.6945 | 0.3301 | 0.6945 | 0.8334 |
| No log | 8.0 | 200 | 0.7564 | 0.2579 | 0.7564 | 0.8697 |
| No log | 8.08 | 202 | 0.8433 | 0.1261 | 0.8433 | 0.9183 |
| No log | 8.16 | 204 | 0.9240 | 0.1197 | 0.9240 | 0.9612 |
| No log | 8.24 | 206 | 0.9431 | 0.1524 | 0.9431 | 0.9712 |
| No log | 8.32 | 208 | 0.8929 | 0.2066 | 0.8929 | 0.9449 |
| No log | 8.4 | 210 | 0.8194 | 0.1930 | 0.8194 | 0.9052 |
| No log | 8.48 | 212 | 0.7849 | 0.2857 | 0.7849 | 0.8859 |
| No log | 8.56 | 214 | 0.7856 | 0.2857 | 0.7856 | 0.8863 |
| No log | 8.64 | 216 | 0.8527 | 0.1931 | 0.8527 | 0.9234 |
| No log | 8.72 | 218 | 0.9015 | 0.1385 | 0.9015 | 0.9494 |
| No log | 8.8 | 220 | 0.9844 | 0.1765 | 0.9844 | 0.9922 |
| No log | 8.88 | 222 | 1.0516 | 0.1254 | 1.0516 | 1.0255 |
| No log | 8.96 | 224 | 1.0236 | 0.1524 | 1.0236 | 1.0117 |
| No log | 9.04 | 226 | 0.9811 | 0.1765 | 0.9811 | 0.9905 |
| No log | 9.12 | 228 | 0.8925 | 0.0720 | 0.8925 | 0.9447 |
| No log | 9.2 | 230 | 0.8111 | 0.2838 | 0.8111 | 0.9006 |
| No log | 9.28 | 232 | 0.7786 | 0.2877 | 0.7786 | 0.8824 |
| No log | 9.36 | 234 | 0.7815 | 0.2877 | 0.7815 | 0.8840 |
| No log | 9.44 | 236 | 0.8107 | 0.2364 | 0.8107 | 0.9004 |
| No log | 9.52 | 238 | 0.8441 | 0.2000 | 0.8441 | 0.9188 |
| No log | 9.6 | 240 | 0.8898 | 0.2068 | 0.8898 | 0.9433 |
| No log | 9.68 | 242 | 0.9103 | 0.1756 | 0.9103 | 0.9541 |
| No log | 9.76 | 244 | 0.9111 | 0.1756 | 0.9111 | 0.9545 |
| No log | 9.84 | 246 | 0.9030 | 0.1756 | 0.9030 | 0.9503 |
| No log | 9.92 | 248 | 0.8976 | 0.1736 | 0.8976 | 0.9474 |
| No log | 10.0 | 250 | 0.8963 | 0.1736 | 0.8963 | 0.9467 |
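For reference, Qwk in the table is quadratic weighted Cohen's kappa, Mse is mean squared error, and Rmse is its square root. The card's exact evaluation code is not published; the sketch below shows one common way to compute these metrics with scikit-learn, using placeholder gold scores and predictions.

```python
# Hedged metric sketch with placeholder data; not the authors' evaluation script.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 2])          # placeholder gold scores
y_pred = np.array([2.4, 2.8, 1.2, 2.1])  # placeholder model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK needs discrete labels, so continuous predictions are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f} RMSE={rmse:.4f} QWK={qwk:.4f}")
```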
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1