# ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k5_task2_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the training data is not documented in this card). It achieves the following results on the evaluation set; a hedged sketch for reproducing these metrics is shown after the list:
- Loss: 0.8418
- Qwk: 0.5446
- Mse: 0.8418
- Rmse: 0.9175
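
The evaluation pipeline itself is not documented here. As a rough guide only, the sketch below shows one way these numbers could be reproduced, assuming the checkpoint is a single-logit regression head, that Qwk is quadratic weighted Cohen's kappa on rounded scores, and that Mse/Rmse are computed on the raw predictions; `texts` and `gold_scores` are hypothetical placeholders for the undocumented evaluation split.

```python
# Minimal sketch (not the author's actual evaluation code): reproduce Qwk / Mse / Rmse,
# assuming a single-logit regression head and quadratic weighted kappa on rounded scores.
import numpy as np
import torch
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k5_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

texts = ["نص عربي للتقييم", "نص آخر"]  # hypothetical evaluation texts
gold_scores = [3.0, 2.0]               # hypothetical gold organization scores

with torch.no_grad():
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    preds = model(**batch).logits.squeeze(-1).numpy()

mse = mean_squared_error(gold_scores, preds)
rmse = float(np.sqrt(mse))
qwk = cohen_kappa_score(np.round(gold_scores).astype(int),
                        np.round(preds).astype(int),
                        weights="quadratic")
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```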
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training; a hedged `TrainingArguments` sketch is shown after the list:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
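
As noted above, the sketch below shows one way these values might map onto Hugging Face `TrainingArguments`. Dataset loading, tokenization, and the metric function are omitted because they are not documented; `output_dir` is a hypothetical path, and the eval-every-2-steps schedule is inferred from the step column of the results table.

```python
# Hedged sketch: mapping the listed hyperparameters onto TrainingArguments.
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TrainingArguments

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)  # assumed regression head

args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",  # evaluation every 2 steps, inferred from the results table
    eval_steps=2,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's AdamW defaults.
)
# These arguments would then be passed to a Trainer together with the
# (undocumented) tokenized train/eval splits and a compute_metrics function.
```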
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.0667 | 2 | 3.9087 | 0.0073 | 3.9087 | 1.9770 |
| No log | 0.1333 | 4 | 2.5833 | 0.0712 | 2.5833 | 1.6073 |
| No log | 0.2 | 6 | 1.2544 | 0.0864 | 1.2544 | 1.1200 |
| No log | 0.2667 | 8 | 0.9597 | 0.0133 | 0.9597 | 0.9796 |
| No log | 0.3333 | 10 | 0.7471 | 0.1640 | 0.7471 | 0.8643 |
| No log | 0.4 | 12 | 0.7705 | 0.1819 | 0.7705 | 0.8778 |
| No log | 0.4667 | 14 | 0.7979 | 0.1314 | 0.7979 | 0.8933 |
| No log | 0.5333 | 16 | 0.7581 | 0.1395 | 0.7581 | 0.8707 |
| No log | 0.6 | 18 | 0.7397 | 0.1647 | 0.7397 | 0.8600 |
| No log | 0.6667 | 20 | 0.7099 | 0.1971 | 0.7099 | 0.8425 |
| No log | 0.7333 | 22 | 0.7010 | 0.1273 | 0.7010 | 0.8372 |
| No log | 0.8 | 24 | 0.7177 | 0.1366 | 0.7177 | 0.8472 |
| No log | 0.8667 | 26 | 0.8379 | 0.1353 | 0.8379 | 0.9154 |
| No log | 0.9333 | 28 | 0.7564 | 0.2806 | 0.7564 | 0.8697 |
| No log | 1.0 | 30 | 0.6858 | 0.3309 | 0.6858 | 0.8281 |
| No log | 1.0667 | 32 | 0.6881 | 0.2909 | 0.6881 | 0.8295 |
| No log | 1.1333 | 34 | 0.6822 | 0.4207 | 0.6822 | 0.8259 |
| No log | 1.2 | 36 | 0.8444 | 0.3721 | 0.8444 | 0.9189 |
| No log | 1.2667 | 38 | 0.8600 | 0.3435 | 0.8600 | 0.9273 |
| No log | 1.3333 | 40 | 0.7534 | 0.4394 | 0.7534 | 0.8680 |
| No log | 1.4 | 42 | 0.6784 | 0.5024 | 0.6784 | 0.8236 |
| No log | 1.4667 | 44 | 0.5946 | 0.4672 | 0.5946 | 0.7711 |
| No log | 1.5333 | 46 | 0.6481 | 0.4593 | 0.6481 | 0.8050 |
| No log | 1.6 | 48 | 0.6918 | 0.5079 | 0.6918 | 0.8317 |
| No log | 1.6667 | 50 | 0.7510 | 0.5149 | 0.7510 | 0.8666 |
| No log | 1.7333 | 52 | 0.6936 | 0.4738 | 0.6936 | 0.8328 |
| No log | 1.8 | 54 | 0.8754 | 0.4355 | 0.8754 | 0.9356 |
| No log | 1.8667 | 56 | 0.8376 | 0.4243 | 0.8376 | 0.9152 |
| No log | 1.9333 | 58 | 0.6211 | 0.4783 | 0.6211 | 0.7881 |
| No log | 2.0 | 60 | 0.6583 | 0.5249 | 0.6583 | 0.8114 |
| No log | 2.0667 | 62 | 0.6457 | 0.5249 | 0.6457 | 0.8035 |
| No log | 2.1333 | 64 | 0.5836 | 0.4965 | 0.5836 | 0.7639 |
| No log | 2.2 | 66 | 0.5813 | 0.4860 | 0.5813 | 0.7624 |
| No log | 2.2667 | 68 | 0.6015 | 0.5169 | 0.6015 | 0.7755 |
| No log | 2.3333 | 70 | 0.7137 | 0.5183 | 0.7137 | 0.8448 |
| No log | 2.4 | 72 | 0.6953 | 0.5173 | 0.6953 | 0.8338 |
| No log | 2.4667 | 74 | 0.7411 | 0.5131 | 0.7411 | 0.8609 |
| No log | 2.5333 | 76 | 0.7957 | 0.5269 | 0.7957 | 0.8920 |
| No log | 2.6 | 78 | 0.8659 | 0.5313 | 0.8659 | 0.9305 |
| No log | 2.6667 | 80 | 1.0167 | 0.4988 | 1.0167 | 1.0083 |
| No log | 2.7333 | 82 | 0.8816 | 0.5457 | 0.8816 | 0.9389 |
| No log | 2.8 | 84 | 0.7920 | 0.5052 | 0.7920 | 0.8900 |
| No log | 2.8667 | 86 | 0.7748 | 0.5632 | 0.7748 | 0.8802 |
| No log | 2.9333 | 88 | 0.8061 | 0.5323 | 0.8061 | 0.8978 |
| No log | 3.0 | 90 | 0.9968 | 0.5073 | 0.9968 | 0.9984 |
| No log | 3.0667 | 92 | 1.0534 | 0.4973 | 1.0534 | 1.0264 |
| No log | 3.1333 | 94 | 0.8439 | 0.5199 | 0.8439 | 0.9186 |
| No log | 3.2 | 96 | 0.7243 | 0.5370 | 0.7243 | 0.8511 |
| No log | 3.2667 | 98 | 0.7074 | 0.5165 | 0.7074 | 0.8411 |
| No log | 3.3333 | 100 | 0.6395 | 0.5287 | 0.6395 | 0.7997 |
| No log | 3.4 | 102 | 0.7569 | 0.5415 | 0.7569 | 0.8700 |
| No log | 3.4667 | 104 | 1.0081 | 0.4374 | 1.0081 | 1.0040 |
| No log | 3.5333 | 106 | 1.0646 | 0.4337 | 1.0646 | 1.0318 |
| No log | 3.6 | 108 | 0.9327 | 0.4845 | 0.9327 | 0.9658 |
| No log | 3.6667 | 110 | 0.7144 | 0.5162 | 0.7144 | 0.8452 |
| No log | 3.7333 | 112 | 0.7274 | 0.5392 | 0.7274 | 0.8529 |
| No log | 3.8 | 114 | 0.7615 | 0.5298 | 0.7615 | 0.8726 |
| No log | 3.8667 | 116 | 0.8260 | 0.5241 | 0.8260 | 0.9089 |
| No log | 3.9333 | 118 | 0.8997 | 0.4943 | 0.8997 | 0.9485 |
| No log | 4.0 | 120 | 0.9964 | 0.4717 | 0.9964 | 0.9982 |
| No log | 4.0667 | 122 | 0.9794 | 0.5098 | 0.9794 | 0.9896 |
| No log | 4.1333 | 124 | 0.9436 | 0.5100 | 0.9436 | 0.9714 |
| No log | 4.2 | 126 | 0.9806 | 0.4812 | 0.9806 | 0.9902 |
| No log | 4.2667 | 128 | 0.8875 | 0.5105 | 0.8875 | 0.9421 |
| No log | 4.3333 | 130 | 0.8028 | 0.5514 | 0.8028 | 0.8960 |
| No log | 4.4 | 132 | 0.9107 | 0.4898 | 0.9107 | 0.9543 |
| No log | 4.4667 | 134 | 0.9055 | 0.4850 | 0.9055 | 0.9516 |
| No log | 4.5333 | 136 | 0.7901 | 0.5248 | 0.7901 | 0.8889 |
| No log | 4.6 | 138 | 0.8166 | 0.4900 | 0.8166 | 0.9037 |
| No log | 4.6667 | 140 | 0.8988 | 0.5088 | 0.8988 | 0.9481 |
| No log | 4.7333 | 142 | 0.8272 | 0.4900 | 0.8272 | 0.9095 |
| No log | 4.8 | 144 | 0.7977 | 0.5582 | 0.7977 | 0.8932 |
| No log | 4.8667 | 146 | 0.8618 | 0.5149 | 0.8618 | 0.9284 |
| No log | 4.9333 | 148 | 1.0207 | 0.4837 | 1.0207 | 1.0103 |
| No log | 5.0 | 150 | 1.0666 | 0.4743 | 1.0666 | 1.0327 |
| No log | 5.0667 | 152 | 1.0752 | 0.4827 | 1.0752 | 1.0369 |
| No log | 5.1333 | 154 | 1.1249 | 0.5141 | 1.1249 | 1.0606 |
| No log | 5.2 | 156 | 1.1341 | 0.5065 | 1.1341 | 1.0649 |
| No log | 5.2667 | 158 | 1.1405 | 0.4992 | 1.1405 | 1.0679 |
| No log | 5.3333 | 160 | 1.0910 | 0.5006 | 1.0910 | 1.0445 |
| No log | 5.4 | 162 | 1.0075 | 0.5154 | 1.0075 | 1.0037 |
| No log | 5.4667 | 164 | 0.9254 | 0.5020 | 0.9254 | 0.9620 |
| No log | 5.5333 | 166 | 0.8517 | 0.5228 | 0.8517 | 0.9229 |
| No log | 5.6 | 168 | 0.7643 | 0.5844 | 0.7643 | 0.8742 |
| No log | 5.6667 | 170 | 0.7332 | 0.5298 | 0.7332 | 0.8563 |
| No log | 5.7333 | 172 | 0.7287 | 0.5315 | 0.7287 | 0.8537 |
| No log | 5.8 | 174 | 0.7238 | 0.5423 | 0.7238 | 0.8507 |
| No log | 5.8667 | 176 | 0.7229 | 0.5159 | 0.7229 | 0.8503 |
| No log | 5.9333 | 178 | 0.7549 | 0.5183 | 0.7549 | 0.8688 |
| No log | 6.0 | 180 | 0.7805 | 0.5151 | 0.7805 | 0.8835 |
| No log | 6.0667 | 182 | 0.8353 | 0.5048 | 0.8353 | 0.9140 |
| No log | 6.1333 | 184 | 0.8425 | 0.5177 | 0.8425 | 0.9179 |
| No log | 6.2 | 186 | 0.8033 | 0.4978 | 0.8033 | 0.8963 |
| No log | 6.2667 | 188 | 0.7930 | 0.5135 | 0.7930 | 0.8905 |
| No log | 6.3333 | 190 | 0.7732 | 0.5213 | 0.7732 | 0.8793 |
| No log | 6.4 | 192 | 0.7699 | 0.5177 | 0.7699 | 0.8775 |
| No log | 6.4667 | 194 | 0.7946 | 0.5112 | 0.7946 | 0.8914 |
| No log | 6.5333 | 196 | 0.8173 | 0.5217 | 0.8173 | 0.9040 |
| No log | 6.6 | 198 | 0.7908 | 0.5130 | 0.7908 | 0.8893 |
| No log | 6.6667 | 200 | 0.7575 | 0.5117 | 0.7575 | 0.8703 |
| No log | 6.7333 | 202 | 0.7460 | 0.5072 | 0.7460 | 0.8637 |
| No log | 6.8 | 204 | 0.7452 | 0.5188 | 0.7452 | 0.8633 |
| No log | 6.8667 | 206 | 0.7760 | 0.5116 | 0.7760 | 0.8809 |
| No log | 6.9333 | 208 | 0.7973 | 0.5321 | 0.7973 | 0.8929 |
| No log | 7.0 | 210 | 0.7722 | 0.5116 | 0.7722 | 0.8788 |
| No log | 7.0667 | 212 | 0.7624 | 0.5193 | 0.7624 | 0.8732 |
| No log | 7.1333 | 214 | 0.8022 | 0.5138 | 0.8022 | 0.8957 |
| No log | 7.2 | 216 | 0.8248 | 0.5138 | 0.8248 | 0.9082 |
| No log | 7.2667 | 218 | 0.8536 | 0.5130 | 0.8536 | 0.9239 |
| No log | 7.3333 | 220 | 0.8509 | 0.5211 | 0.8509 | 0.9224 |
| No log | 7.4 | 222 | 0.8476 | 0.5220 | 0.8476 | 0.9207 |
| No log | 7.4667 | 224 | 0.8480 | 0.5220 | 0.8480 | 0.9209 |
| No log | 7.5333 | 226 | 0.8400 | 0.5288 | 0.8400 | 0.9165 |
| No log | 7.6 | 228 | 0.8181 | 0.5254 | 0.8181 | 0.9045 |
| No log | 7.6667 | 230 | 0.8097 | 0.5327 | 0.8097 | 0.8998 |
| No log | 7.7333 | 232 | 0.8313 | 0.5426 | 0.8313 | 0.9117 |
| No log | 7.8 | 234 | 0.8696 | 0.4888 | 0.8696 | 0.9325 |
| No log | 7.8667 | 236 | 0.9084 | 0.4758 | 0.9084 | 0.9531 |
| No log | 7.9333 | 238 | 0.8867 | 0.4929 | 0.8867 | 0.9417 |
| No log | 8.0 | 240 | 0.8510 | 0.5236 | 0.8510 | 0.9225 |
| No log | 8.0667 | 242 | 0.8183 | 0.5313 | 0.8183 | 0.9046 |
| No log | 8.1333 | 244 | 0.7980 | 0.5342 | 0.7980 | 0.8933 |
| No log | 8.2 | 246 | 0.7963 | 0.5342 | 0.7963 | 0.8924 |
| No log | 8.2667 | 248 | 0.8144 | 0.5236 | 0.8144 | 0.9025 |
| No log | 8.3333 | 250 | 0.8306 | 0.5286 | 0.8306 | 0.9114 |
| No log | 8.4 | 252 | 0.8166 | 0.5236 | 0.8166 | 0.9036 |
| No log | 8.4667 | 254 | 0.8063 | 0.5236 | 0.8063 | 0.8980 |
| No log | 8.5333 | 256 | 0.8070 | 0.5236 | 0.8070 | 0.8983 |
| No log | 8.6 | 258 | 0.8024 | 0.5306 | 0.8024 | 0.8957 |
| No log | 8.6667 | 260 | 0.8059 | 0.5370 | 0.8059 | 0.8977 |
| No log | 8.7333 | 262 | 0.8138 | 0.5303 | 0.8138 | 0.9021 |
| No log | 8.8 | 264 | 0.8383 | 0.5299 | 0.8383 | 0.9156 |
| No log | 8.8667 | 266 | 0.8516 | 0.5289 | 0.8516 | 0.9228 |
| No log | 8.9333 | 268 | 0.8621 | 0.5289 | 0.8621 | 0.9285 |
| No log | 9.0 | 270 | 0.8755 | 0.5060 | 0.8755 | 0.9357 |
| No log | 9.0667 | 272 | 0.8858 | 0.4998 | 0.8858 | 0.9412 |
| No log | 9.1333 | 274 | 0.8964 | 0.5044 | 0.8964 | 0.9468 |
| No log | 9.2 | 276 | 0.8990 | 0.5044 | 0.8990 | 0.9481 |
| No log | 9.2667 | 278 | 0.8877 | 0.4998 | 0.8877 | 0.9422 |
| No log | 9.3333 | 280 | 0.8767 | 0.5060 | 0.8767 | 0.9363 |
| No log | 9.4 | 282 | 0.8739 | 0.5060 | 0.8739 | 0.9348 |
| No log | 9.4667 | 284 | 0.8700 | 0.5060 | 0.8700 | 0.9327 |
| No log | 9.5333 | 286 | 0.8605 | 0.5402 | 0.8605 | 0.9276 |
| No log | 9.6 | 288 | 0.8554 | 0.5402 | 0.8554 | 0.9249 |
| No log | 9.6667 | 290 | 0.8501 | 0.5413 | 0.8501 | 0.9220 |
| No log | 9.7333 | 292 | 0.8474 | 0.5413 | 0.8474 | 0.9206 |
| No log | 9.8 | 294 | 0.8442 | 0.5446 | 0.8442 | 0.9188 |
| No log | 9.8667 | 296 | 0.8421 | 0.5446 | 0.8421 | 0.9176 |
| No log | 9.9333 | 298 | 0.8419 | 0.5446 | 0.8419 | 0.9175 |
| No log | 10.0 | 300 | 0.8418 | 0.5446 | 0.8418 | 0.9175 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1