# ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k7_task5_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.7934
- Qwk: 0.7193
- Mse: 0.7934
- Rmse: 0.8907
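Since the sections below do not yet document intended usage, the following is a minimal inference sketch using the `transformers` library. It assumes the checkpoint carries a single-output regression-style scoring head, which is consistent with the MSE/RMSE metrics reported above but not confirmed by this card; the Arabic input string is purely illustrative.

```python
# Minimal inference sketch (assumes a single-logit scoring head,
# consistent with the MSE/RMSE metrics reported above).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k7_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "نص تجريبي للتقييم"  # illustrative Arabic input
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# With a single-label regression head, the raw logit is the predicted score.
predicted_score = outputs.logits.squeeze().item()
print(predicted_score)
```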
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
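The values listed above map onto the Hugging Face `Trainer` API roughly as sketched below. This is a reconstruction under stated assumptions, not the original training script; `output_dir`, `train_dataset`, and `eval_dataset` are placeholders.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# This reconstructs the configuration; it is not the original training script.
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="./results",          # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the default
    # adam_beta1 / adam_beta2 / adam_epsilon values in TrainingArguments.
)

# trainer = Trainer(
#     model=model,                   # the AraBERT model with a scoring head
#     args=training_args,
#     train_dataset=train_dataset,   # placeholder dataset objects
#     eval_dataset=eval_dataset,
# )
# trainer.train()
```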
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.0690 | 2 | 2.4237 | 0.0218 | 2.4237 | 1.5568 |
| No log | 0.1379 | 4 | 1.6356 | 0.1531 | 1.6356 | 1.2789 |
| No log | 0.2069 | 6 | 1.7021 | -0.0198 | 1.7021 | 1.3047 |
| No log | 0.2759 | 8 | 1.9767 | -0.0458 | 1.9767 | 1.4059 |
| No log | 0.3448 | 10 | 1.6797 | 0.0477 | 1.6797 | 1.2960 |
| No log | 0.4138 | 12 | 1.7713 | 0.1804 | 1.7713 | 1.3309 |
| No log | 0.4828 | 14 | 1.6965 | 0.2549 | 1.6965 | 1.3025 |
| No log | 0.5517 | 16 | 1.4704 | 0.1022 | 1.4704 | 1.2126 |
| No log | 0.6207 | 18 | 1.3690 | 0.2247 | 1.3690 | 1.1701 |
| No log | 0.6897 | 20 | 1.2419 | 0.2828 | 1.2419 | 1.1144 |
| No log | 0.7586 | 22 | 1.2231 | 0.3051 | 1.2231 | 1.1059 |
| No log | 0.8276 | 24 | 1.3204 | 0.3975 | 1.3204 | 1.1491 |
| No log | 0.8966 | 26 | 1.3899 | 0.4345 | 1.3899 | 1.1789 |
| No log | 0.9655 | 28 | 1.2591 | 0.4514 | 1.2591 | 1.1221 |
| No log | 1.0345 | 30 | 1.1682 | 0.4543 | 1.1682 | 1.0808 |
| No log | 1.1034 | 32 | 1.0887 | 0.3441 | 1.0887 | 1.0434 |
| No log | 1.1724 | 34 | 1.0401 | 0.3388 | 1.0401 | 1.0199 |
| No log | 1.2414 | 36 | 1.0121 | 0.4677 | 1.0121 | 1.0061 |
| No log | 1.3103 | 38 | 0.9846 | 0.5064 | 0.9846 | 0.9923 |
| No log | 1.3793 | 40 | 0.9986 | 0.5534 | 0.9986 | 0.9993 |
| No log | 1.4483 | 42 | 1.1755 | 0.5146 | 1.1755 | 1.0842 |
| No log | 1.5172 | 44 | 1.2066 | 0.5320 | 1.2066 | 1.0985 |
| No log | 1.5862 | 46 | 0.9554 | 0.5259 | 0.9554 | 0.9775 |
| No log | 1.6552 | 48 | 0.9105 | 0.5267 | 0.9105 | 0.9542 |
| No log | 1.7241 | 50 | 0.9738 | 0.5933 | 0.9738 | 0.9868 |
| No log | 1.7931 | 52 | 1.1649 | 0.5737 | 1.1649 | 1.0793 |
| No log | 1.8621 | 54 | 1.3403 | 0.5388 | 1.3403 | 1.1577 |
| No log | 1.9310 | 56 | 1.2097 | 0.5559 | 1.2097 | 1.0998 |
| No log | 2.0 | 58 | 0.8857 | 0.6040 | 0.8857 | 0.9411 |
| No log | 2.0690 | 60 | 0.8336 | 0.5618 | 0.8336 | 0.9130 |
| No log | 2.1379 | 62 | 0.9106 | 0.6245 | 0.9106 | 0.9543 |
| No log | 2.2069 | 64 | 1.0864 | 0.6001 | 1.0864 | 1.0423 |
| No log | 2.2759 | 66 | 1.2740 | 0.5290 | 1.2740 | 1.1287 |
| No log | 2.3448 | 68 | 1.3788 | 0.4896 | 1.3788 | 1.1742 |
| No log | 2.4138 | 70 | 1.2829 | 0.5593 | 1.2829 | 1.1326 |
| No log | 2.4828 | 72 | 0.9299 | 0.6442 | 0.9299 | 0.9643 |
| No log | 2.5517 | 74 | 0.7051 | 0.6158 | 0.7051 | 0.8397 |
| No log | 2.6207 | 76 | 0.6648 | 0.6418 | 0.6648 | 0.8153 |
| No log | 2.6897 | 78 | 0.6938 | 0.7149 | 0.6938 | 0.8330 |
| No log | 2.7586 | 80 | 0.9464 | 0.6906 | 0.9464 | 0.9728 |
| No log | 2.8276 | 82 | 1.0435 | 0.6688 | 1.0435 | 1.0215 |
| No log | 2.8966 | 84 | 0.7632 | 0.7162 | 0.7632 | 0.8736 |
| No log | 2.9655 | 86 | 0.6370 | 0.7423 | 0.6370 | 0.7981 |
| No log | 3.0345 | 88 | 0.6074 | 0.7486 | 0.6074 | 0.7793 |
| No log | 3.1034 | 90 | 0.6258 | 0.7240 | 0.6258 | 0.7911 |
| No log | 3.1724 | 92 | 0.8455 | 0.7142 | 0.8455 | 0.9195 |
| No log | 3.2414 | 94 | 1.0873 | 0.6864 | 1.0873 | 1.0427 |
| No log | 3.3103 | 96 | 0.9572 | 0.6982 | 0.9572 | 0.9784 |
| No log | 3.3793 | 98 | 0.6723 | 0.7369 | 0.6723 | 0.8200 |
| No log | 3.4483 | 100 | 0.6277 | 0.7322 | 0.6277 | 0.7923 |
| No log | 3.5172 | 102 | 0.6738 | 0.7250 | 0.6738 | 0.8208 |
| No log | 3.5862 | 104 | 0.9134 | 0.7065 | 0.9134 | 0.9557 |
| No log | 3.6552 | 106 | 1.1224 | 0.6354 | 1.1224 | 1.0595 |
| No log | 3.7241 | 108 | 1.0948 | 0.6600 | 1.0948 | 1.0463 |
| No log | 3.7931 | 110 | 0.9453 | 0.6768 | 0.9453 | 0.9722 |
| No log | 3.8621 | 112 | 0.8189 | 0.6982 | 0.8189 | 0.9049 |
| No log | 3.9310 | 114 | 0.7611 | 0.7156 | 0.7611 | 0.8724 |
| No log | 4.0 | 116 | 0.7537 | 0.7214 | 0.7537 | 0.8681 |
| No log | 4.0690 | 118 | 0.8941 | 0.7082 | 0.8941 | 0.9456 |
| No log | 4.1379 | 120 | 1.0579 | 0.6827 | 1.0579 | 1.0285 |
| No log | 4.2069 | 122 | 0.9839 | 0.6857 | 0.9839 | 0.9919 |
| No log | 4.2759 | 124 | 0.8662 | 0.7132 | 0.8662 | 0.9307 |
| No log | 4.3448 | 126 | 0.7625 | 0.7118 | 0.7625 | 0.8732 |
| No log | 4.4138 | 128 | 0.7985 | 0.7024 | 0.7985 | 0.8936 |
| No log | 4.4828 | 130 | 0.7693 | 0.7113 | 0.7693 | 0.8771 |
| No log | 4.5517 | 132 | 0.7209 | 0.7262 | 0.7209 | 0.8491 |
| No log | 4.6207 | 134 | 0.6816 | 0.7389 | 0.6816 | 0.8256 |
| No log | 4.6897 | 136 | 0.7249 | 0.7318 | 0.7249 | 0.8514 |
| No log | 4.7586 | 138 | 0.7139 | 0.7268 | 0.7139 | 0.8449 |
| No log | 4.8276 | 140 | 0.6701 | 0.7236 | 0.6701 | 0.8186 |
| No log | 4.8966 | 142 | 0.6117 | 0.7499 | 0.6117 | 0.7821 |
| No log | 4.9655 | 144 | 0.6388 | 0.7467 | 0.6388 | 0.7993 |
| No log | 5.0345 | 146 | 0.7752 | 0.7218 | 0.7752 | 0.8804 |
| No log | 5.1034 | 148 | 0.9248 | 0.6999 | 0.9248 | 0.9617 |
| No log | 5.1724 | 150 | 0.9205 | 0.6953 | 0.9205 | 0.9594 |
| No log | 5.2414 | 152 | 0.8355 | 0.7106 | 0.8355 | 0.9140 |
| No log | 5.3103 | 154 | 0.7413 | 0.7183 | 0.7413 | 0.8610 |
| No log | 5.3793 | 156 | 0.6418 | 0.7183 | 0.6418 | 0.8011 |
| No log | 5.4483 | 158 | 0.6179 | 0.7123 | 0.6179 | 0.7861 |
| No log | 5.5172 | 160 | 0.6459 | 0.7145 | 0.6459 | 0.8037 |
| No log | 5.5862 | 162 | 0.7252 | 0.7093 | 0.7252 | 0.8516 |
| No log | 5.6552 | 164 | 0.7932 | 0.7242 | 0.7932 | 0.8906 |
| No log | 5.7241 | 166 | 0.8840 | 0.6943 | 0.8840 | 0.9402 |
| No log | 5.7931 | 168 | 0.8298 | 0.7173 | 0.8298 | 0.9109 |
| No log | 5.8621 | 170 | 0.7177 | 0.7361 | 0.7177 | 0.8471 |
| No log | 5.9310 | 172 | 0.6996 | 0.7361 | 0.6996 | 0.8364 |
| No log | 6.0 | 174 | 0.7218 | 0.7339 | 0.7218 | 0.8496 |
| No log | 6.0690 | 176 | 0.7826 | 0.7251 | 0.7826 | 0.8847 |
| No log | 6.1379 | 178 | 0.8148 | 0.7178 | 0.8148 | 0.9027 |
| No log | 6.2069 | 180 | 0.8451 | 0.7229 | 0.8451 | 0.9193 |
| No log | 6.2759 | 182 | 0.9488 | 0.6738 | 0.9488 | 0.9741 |
| No log | 6.3448 | 184 | 0.9824 | 0.6426 | 0.9824 | 0.9911 |
| No log | 6.4138 | 186 | 0.8886 | 0.7029 | 0.8886 | 0.9426 |
| No log | 6.4828 | 188 | 0.7471 | 0.7430 | 0.7471 | 0.8644 |
| No log | 6.5517 | 190 | 0.6096 | 0.7390 | 0.6096 | 0.7808 |
| No log | 6.6207 | 192 | 0.5688 | 0.7246 | 0.5688 | 0.7542 |
| No log | 6.6897 | 194 | 0.5694 | 0.7159 | 0.5694 | 0.7546 |
| No log | 6.7586 | 196 | 0.5940 | 0.7331 | 0.5940 | 0.7707 |
| No log | 6.8276 | 198 | 0.6617 | 0.7504 | 0.6617 | 0.8135 |
| No log | 6.8966 | 200 | 0.7658 | 0.7400 | 0.7658 | 0.8751 |
| No log | 6.9655 | 202 | 0.8488 | 0.7015 | 0.8488 | 0.9213 |
| No log | 7.0345 | 204 | 0.9101 | 0.6798 | 0.9101 | 0.9540 |
| No log | 7.1034 | 206 | 0.8836 | 0.6960 | 0.8836 | 0.9400 |
| No log | 7.1724 | 208 | 0.7926 | 0.7211 | 0.7926 | 0.8903 |
| No log | 7.2414 | 210 | 0.7514 | 0.7295 | 0.7514 | 0.8668 |
| No log | 7.3103 | 212 | 0.7475 | 0.7223 | 0.7475 | 0.8646 |
| No log | 7.3793 | 214 | 0.7629 | 0.7256 | 0.7629 | 0.8735 |
| No log | 7.4483 | 216 | 0.8083 | 0.7112 | 0.8083 | 0.8991 |
| No log | 7.5172 | 218 | 0.8610 | 0.7022 | 0.8610 | 0.9279 |
| No log | 7.5862 | 220 | 0.8850 | 0.6937 | 0.8850 | 0.9408 |
| No log | 7.6552 | 222 | 0.8562 | 0.7012 | 0.8562 | 0.9253 |
| No log | 7.7241 | 224 | 0.8093 | 0.7083 | 0.8093 | 0.8996 |
| No log | 7.7931 | 226 | 0.7987 | 0.7148 | 0.7987 | 0.8937 |
| No log | 7.8621 | 228 | 0.8105 | 0.7083 | 0.8105 | 0.9003 |
| No log | 7.9310 | 230 | 0.8310 | 0.7044 | 0.8310 | 0.9116 |
| No log | 8.0 | 232 | 0.8891 | 0.6777 | 0.8891 | 0.9429 |
| No log | 8.0690 | 234 | 0.9351 | 0.6642 | 0.9351 | 0.9670 |
| No log | 8.1379 | 236 | 0.9719 | 0.6720 | 0.9719 | 0.9859 |
| No log | 8.2069 | 238 | 0.9786 | 0.6720 | 0.9786 | 0.9892 |
| No log | 8.2759 | 240 | 0.9981 | 0.6720 | 0.9981 | 0.9990 |
| No log | 8.3448 | 242 | 0.9913 | 0.6720 | 0.9913 | 0.9956 |
| No log | 8.4138 | 244 | 0.9554 | 0.6720 | 0.9554 | 0.9774 |
| No log | 8.4828 | 246 | 0.9107 | 0.6867 | 0.9107 | 0.9543 |
| No log | 8.5517 | 248 | 0.8319 | 0.7044 | 0.8319 | 0.9121 |
| No log | 8.6207 | 250 | 0.7460 | 0.7249 | 0.7460 | 0.8637 |
| No log | 8.6897 | 252 | 0.6954 | 0.7267 | 0.6954 | 0.8339 |
| No log | 8.7586 | 254 | 0.6815 | 0.7267 | 0.6815 | 0.8255 |
| No log | 8.8276 | 256 | 0.6963 | 0.7267 | 0.6963 | 0.8344 |
| No log | 8.8966 | 258 | 0.7274 | 0.7262 | 0.7274 | 0.8529 |
| No log | 8.9655 | 260 | 0.7600 | 0.7218 | 0.7600 | 0.8718 |
| No log | 9.0345 | 262 | 0.7957 | 0.7083 | 0.7957 | 0.8920 |
| No log | 9.1034 | 264 | 0.8408 | 0.7083 | 0.8408 | 0.9169 |
| No log | 9.1724 | 266 | 0.8613 | 0.6994 | 0.8613 | 0.9281 |
| No log | 9.2414 | 268 | 0.8672 | 0.7030 | 0.8672 | 0.9312 |
| No log | 9.3103 | 270 | 0.8832 | 0.7033 | 0.8832 | 0.9398 |
| No log | 9.3793 | 272 | 0.8913 | 0.7033 | 0.8913 | 0.9441 |
| No log | 9.4483 | 274 | 0.8777 | 0.7033 | 0.8777 | 0.9369 |
| No log | 9.5172 | 276 | 0.8656 | 0.7071 | 0.8656 | 0.9304 |
| No log | 9.5862 | 278 | 0.8491 | 0.7156 | 0.8491 | 0.9214 |
| No log | 9.6552 | 280 | 0.8320 | 0.7083 | 0.8320 | 0.9121 |
| No log | 9.7241 | 282 | 0.8167 | 0.7083 | 0.8167 | 0.9037 |
| No log | 9.7931 | 284 | 0.8053 | 0.7083 | 0.8053 | 0.8974 |
| No log | 9.8621 | 286 | 0.7969 | 0.7083 | 0.7969 | 0.8927 |
| No log | 9.9310 | 288 | 0.7948 | 0.7083 | 0.7948 | 0.8915 |
| No log | 10.0 | 290 | 0.7934 | 0.7193 | 0.7934 | 0.8907 |
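The Qwk, Mse, and Rmse columns above can be reproduced from model predictions and gold scores with standard scikit-learn utilities. The sketch below assumes continuous predictions are rounded to integer score labels before computing quadratic weighted kappa; that rounding convention is an assumption, as the exact metric code is not documented in this card.

```python
# Sketch of how the reported metrics (Qwk, Mse, Rmse) are typically computed.
# Assumes continuous predictions are rounded to integer labels for Qwk;
# the exact convention used during training is not documented in this card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(predictions: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, predictions)
    rmse = np.sqrt(mse)
    qwk = cohen_kappa_score(
        labels.astype(int),
        np.rint(predictions).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Illustrative call with dummy values:
print(evaluate(np.array([3.2, 4.8, 2.1]), np.array([3, 5, 2])))
```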
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1