ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k9_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7637
  • Qwk: 0.3770
  • Mse: 0.7637
  • Rmse: 0.8739
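Qwk here is presumably the quadratic weighted kappa commonly reported for ordinal scoring tasks, and Rmse is simply the square root of Mse (√0.7637 ≈ 0.8739). A minimal, self-contained sketch of these metrics (the function names and rating range are illustrative, not taken from the training script):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Quadratic weighted kappa for integer ratings in [min_rating, max_rating]."""
    n = max_rating - min_rating + 1
    # Observed confusion matrix
    obs = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        obs[t - min_rating][p - min_rating] += 1
    total = len(y_true)
    hist_true = [sum(row) for row in obs]
    hist_pred = [sum(obs[i][j] for i in range(n)) for j in range(n)]
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2  # quadratic disagreement weight
            expected = hist_true[i] * hist_pred[j] / total
            num += w * obs[i][j]
            den += w * expected
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Perfect agreement yields a kappa of 1.0, chance-level agreement 0.0, and systematic disagreement goes negative, which matches the small negative Qwk values in some early-epoch rows below.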

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
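No warmup steps are listed, so with lr_scheduler_type linear the learning rate presumably decays from 2e-05 to 0 over the 460 optimizer steps logged in the results table (10 epochs × 46 steps per epoch). A sketch under that zero-warmup assumption:

```python
def linear_lr(step, total_steps=460, base_lr=2e-05):
    """Linearly decayed learning rate, assuming zero warmup steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

For example, the rate would be roughly 1e-05 halfway through training (step 230) and reach 0 at the final step 460.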

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0435 2 3.0767 0.0 3.0767 1.7541
No log 0.0870 4 1.5809 -0.0070 1.5809 1.2573
No log 0.1304 6 0.9183 0.1181 0.9183 0.9583
No log 0.1739 8 0.5958 -0.0159 0.5958 0.7719
No log 0.2174 10 0.6595 0.0569 0.6595 0.8121
No log 0.2609 12 0.7397 0.125 0.7397 0.8600
No log 0.3043 14 0.5919 0.0222 0.5919 0.7694
No log 0.3478 16 0.7587 0.1398 0.7587 0.8710
No log 0.3913 18 0.6329 0.1905 0.6329 0.7956
No log 0.4348 20 0.5513 0.0 0.5513 0.7425
No log 0.4783 22 0.5785 0.0 0.5785 0.7606
No log 0.5217 24 0.6235 0.0 0.6235 0.7896
No log 0.5652 26 0.7279 0.0720 0.7279 0.8532
No log 0.6087 28 0.7084 0.0720 0.7084 0.8417
No log 0.6522 30 0.5774 0.0 0.5774 0.7598
No log 0.6957 32 0.6808 0.25 0.6808 0.8251
No log 0.7391 34 0.8761 0.1429 0.8761 0.9360
No log 0.7826 36 0.6933 0.2000 0.6933 0.8326
No log 0.8261 38 0.6070 0.0815 0.6070 0.7791
No log 0.8696 40 0.6528 -0.0303 0.6528 0.8080
No log 0.9130 42 0.6546 -0.0435 0.6546 0.8091
No log 0.9565 44 0.7482 0.0409 0.7482 0.8650
No log 1.0 46 1.1291 0.0522 1.1291 1.0626
No log 1.0435 48 1.2372 0.0286 1.2372 1.1123
No log 1.0870 50 0.8891 0.0980 0.8891 0.9429
No log 1.1304 52 0.6693 -0.0435 0.6693 0.8181
No log 1.1739 54 0.6496 -0.0233 0.6496 0.8060
No log 1.2174 56 0.5538 -0.0233 0.5538 0.7442
No log 1.2609 58 0.6283 0.2549 0.6283 0.7926
No log 1.3043 60 0.6231 0.3035 0.6231 0.7894
No log 1.3478 62 0.6268 0.2688 0.6268 0.7917
No log 1.3913 64 0.5544 0.1467 0.5544 0.7446
No log 1.4348 66 0.6364 0.2000 0.6364 0.7977
No log 1.4783 68 0.6168 0.2281 0.6168 0.7853
No log 1.5217 70 0.6036 0.1141 0.6036 0.7769
No log 1.5652 72 0.6701 0.3016 0.6701 0.8186
No log 1.6087 74 0.6008 0.2749 0.6008 0.7751
No log 1.6522 76 0.5000 0.2281 0.5000 0.7071
No log 1.6957 78 1.1195 0.2509 1.1195 1.0580
No log 1.7391 80 0.9173 0.3134 0.9173 0.9578
No log 1.7826 82 0.6029 0.3797 0.6029 0.7764
No log 1.8261 84 1.2490 0.1126 1.2490 1.1176
No log 1.8696 86 1.2738 0.0130 1.2738 1.1286
No log 1.9130 88 0.7033 0.3862 0.7033 0.8387
No log 1.9565 90 0.7118 0.3067 0.7118 0.8437
No log 2.0 92 1.1631 0.1498 1.1631 1.0785
No log 2.0435 94 0.9191 0.2066 0.9191 0.9587
No log 2.0870 96 0.5638 0.2941 0.5638 0.7509
No log 2.1304 98 0.6993 0.3333 0.6993 0.8362
No log 2.1739 100 1.0596 0.1648 1.0596 1.0294
No log 2.2174 102 1.0970 0.2243 1.0970 1.0474
No log 2.2609 104 0.8346 0.3208 0.8346 0.9136
No log 2.3043 106 0.5240 0.4573 0.5240 0.7239
No log 2.3478 108 0.5797 0.5215 0.5797 0.7614
No log 2.3913 110 1.0379 0.2814 1.0379 1.0188
No log 2.4348 112 1.7872 0.1025 1.7872 1.3369
No log 2.4783 114 1.6071 0.1011 1.6071 1.2677
No log 2.5217 116 0.6953 0.3585 0.6953 0.8339
No log 2.5652 118 0.6760 0.3770 0.6760 0.8222
No log 2.6087 120 0.8103 0.2787 0.8103 0.9002
No log 2.6522 122 0.5985 0.4690 0.5985 0.7736
No log 2.6957 124 0.4884 0.3478 0.4884 0.6988
No log 2.7391 126 0.5189 0.3258 0.5189 0.7204
No log 2.7826 128 0.5006 0.3591 0.5006 0.7076
No log 2.8261 130 0.4966 0.3043 0.4966 0.7047
No log 2.8696 132 0.7001 0.4622 0.7001 0.8367
No log 2.9130 134 0.9242 0.25 0.9242 0.9613
No log 2.9565 136 0.6277 0.4444 0.6277 0.7923
No log 3.0 138 0.5700 0.4862 0.5700 0.7550
No log 3.0435 140 0.5380 0.3769 0.5380 0.7335
No log 3.0870 142 0.6544 0.4632 0.6544 0.8090
No log 3.1304 144 0.8944 0.2366 0.8944 0.9457
No log 3.1739 146 0.7557 0.3633 0.7557 0.8693
No log 3.2174 148 0.5274 0.3371 0.5274 0.7262
No log 3.2609 150 0.4839 0.4526 0.4839 0.6957
No log 3.3043 152 0.4935 0.4167 0.4935 0.7025
No log 3.3478 154 0.4992 0.4404 0.4992 0.7065
No log 3.3913 156 0.6902 0.3684 0.6902 0.8308
No log 3.4348 158 1.0859 0.1834 1.0859 1.0421
No log 3.4783 160 0.9912 0.1828 0.9912 0.9956
No log 3.5217 162 0.7240 0.4138 0.7240 0.8509
No log 3.5652 164 0.5812 0.4902 0.5812 0.7623
No log 3.6087 166 0.5595 0.4851 0.5595 0.7480
No log 3.6522 168 0.5707 0.4732 0.5707 0.7554
No log 3.6957 170 0.8166 0.2450 0.8166 0.9036
No log 3.7391 172 0.9741 0.2121 0.9741 0.9869
No log 3.7826 174 0.8072 0.2469 0.8072 0.8984
No log 3.8261 176 0.6142 0.4010 0.6142 0.7837
No log 3.8696 178 0.6332 0.3962 0.6332 0.7957
No log 3.9130 180 0.9433 0.2286 0.9433 0.9712
No log 3.9565 182 1.3822 0.1605 1.3822 1.1757
No log 4.0 184 1.4448 0.1579 1.4448 1.2020
No log 4.0435 186 0.9042 0.2836 0.9042 0.9509
No log 4.0870 188 0.6301 0.4400 0.6301 0.7938
No log 4.1304 190 0.6601 0.4667 0.6601 0.8125
No log 4.1739 192 0.9076 0.2537 0.9076 0.9527
No log 4.2174 194 1.1617 0.1847 1.1617 1.0778
No log 4.2609 196 0.9325 0.2000 0.9325 0.9657
No log 4.3043 198 0.9288 0.1807 0.9288 0.9637
No log 4.3478 200 0.7755 0.1790 0.7755 0.8806
No log 4.3913 202 0.7968 0.2134 0.7968 0.8927
No log 4.4348 204 0.8107 0.2134 0.8107 0.9004
No log 4.4783 206 0.7167 0.2982 0.7167 0.8466
No log 4.5217 208 0.7621 0.3043 0.7621 0.8730
No log 4.5652 210 0.7973 0.2140 0.7973 0.8929
No log 4.6087 212 0.9829 0.2353 0.9829 0.9914
No log 4.6522 214 0.9919 0.2347 0.9919 0.9959
No log 4.6957 216 0.8991 0.2459 0.8991 0.9482
No log 4.7391 218 0.8022 0.2838 0.8022 0.8956
No log 4.7826 220 0.8196 0.2838 0.8196 0.9053
No log 4.8261 222 0.7311 0.2811 0.7311 0.8550
No log 4.8696 224 0.8040 0.2137 0.8040 0.8966
No log 4.9130 226 0.9003 0.2846 0.9003 0.9488
No log 4.9565 228 1.0362 0.2727 1.0362 1.0179
No log 5.0 230 1.0936 0.2409 1.0936 1.0458
No log 5.0435 232 0.8847 0.3195 0.8847 0.9406
No log 5.0870 234 0.6766 0.2464 0.6766 0.8226
No log 5.1304 236 0.7283 0.3153 0.7283 0.8534
No log 5.1739 238 0.7629 0.3480 0.7629 0.8735
No log 5.2174 240 0.8992 0.2672 0.8992 0.9482
No log 5.2609 242 0.9530 0.2672 0.9530 0.9762
No log 5.3043 244 0.7827 0.3391 0.7827 0.8847
No log 5.3478 246 0.7766 0.4386 0.7766 0.8813
No log 5.3913 248 0.8896 0.2062 0.8896 0.9432
No log 5.4348 250 0.9423 0.1769 0.9423 0.9707
No log 5.4783 252 0.8823 0.2129 0.8823 0.9393
No log 5.5217 254 0.7718 0.3667 0.7718 0.8785
No log 5.5652 256 0.6605 0.4 0.6605 0.8127
No log 5.6087 258 0.7441 0.3739 0.7441 0.8626
No log 5.6522 260 1.0563 0.1940 1.0563 1.0278
No log 5.6957 262 1.1841 0.1942 1.1841 1.0882
No log 5.7391 264 1.1726 0.1941 1.1726 1.0829
No log 5.7826 266 0.8883 0.1811 0.8883 0.9425
No log 5.8261 268 0.6613 0.4067 0.6613 0.8132
No log 5.8696 270 0.6486 0.4067 0.6486 0.8054
No log 5.9130 272 0.7401 0.3333 0.7401 0.8603
No log 5.9565 274 0.9689 0.1935 0.9689 0.9843
No log 6.0 276 1.0676 0.1635 1.0676 1.0332
No log 6.0435 278 0.8516 0.1935 0.8516 0.9228
No log 6.0870 280 0.7236 0.3793 0.7236 0.8507
No log 6.1304 282 0.7567 0.3559 0.7567 0.8699
No log 6.1739 284 0.6789 0.3214 0.6789 0.8240
No log 6.2174 286 0.5813 0.4563 0.5813 0.7624
No log 6.2609 288 0.5757 0.4902 0.5757 0.7588
No log 6.3043 290 0.5799 0.5094 0.5799 0.7615
No log 6.3478 292 0.6270 0.4751 0.6270 0.7918
No log 6.3913 294 0.8460 0.2803 0.8460 0.9198
No log 6.4348 296 0.9360 0.2113 0.9360 0.9674
No log 6.4783 298 0.7695 0.3793 0.7695 0.8772
No log 6.5217 300 0.7464 0.3418 0.7464 0.8639
No log 6.5652 302 0.8025 0.2803 0.8025 0.8958
No log 6.6087 304 0.8170 0.3115 0.8170 0.9039
No log 6.6522 306 0.7212 0.3793 0.7212 0.8492
No log 6.6957 308 0.6643 0.4286 0.6643 0.8150
No log 6.7391 310 0.6508 0.4286 0.6508 0.8067
No log 6.7826 312 0.7078 0.3793 0.7078 0.8413
No log 6.8261 314 0.8436 0.2846 0.8436 0.9185
No log 6.8696 316 0.8585 0.2829 0.8585 0.9266
No log 6.9130 318 0.7994 0.3115 0.7994 0.8941
No log 6.9565 320 0.7047 0.4185 0.7047 0.8395
No log 7.0 322 0.6965 0.4595 0.6965 0.8346
No log 7.0435 324 0.6998 0.4489 0.6998 0.8365
No log 7.0870 326 0.7987 0.2803 0.7987 0.8937
No log 7.1304 328 0.7787 0.3755 0.7787 0.8824
No log 7.1739 330 0.6711 0.4737 0.6711 0.8192
No log 7.2174 332 0.6640 0.4439 0.6640 0.8149
No log 7.2609 334 0.7337 0.3793 0.7337 0.8566
No log 7.3043 336 0.8550 0.2191 0.8550 0.9247
No log 7.3478 338 0.9192 0.2119 0.9192 0.9587
No log 7.3913 340 0.8563 0.1875 0.8563 0.9254
No log 7.4348 342 0.7284 0.3793 0.7284 0.8534
No log 7.4783 344 0.6214 0.5113 0.6214 0.7883
No log 7.5217 346 0.6094 0.4815 0.6094 0.7806
No log 7.5652 348 0.6796 0.4286 0.6796 0.8244
No log 7.6087 350 0.8662 0.3258 0.8662 0.9307
No log 7.6522 352 1.0289 0.2053 1.0289 1.0143
No log 7.6957 354 0.9938 0.2055 0.9938 0.9969
No log 7.7391 356 0.8960 0.2119 0.8960 0.9466
No log 7.7826 358 0.8431 0.3258 0.8431 0.9182
No log 7.8261 360 0.8627 0.3258 0.8627 0.9288
No log 7.8696 362 0.9074 0.2119 0.9074 0.9526
No log 7.9130 364 0.9042 0.2119 0.9042 0.9509
No log 7.9565 366 0.9019 0.2561 0.9019 0.9497
No log 8.0 368 0.7954 0.4373 0.7954 0.8919
No log 8.0435 370 0.7386 0.4239 0.7386 0.8594
No log 8.0870 372 0.7923 0.4240 0.7923 0.8901
No log 8.1304 374 0.8736 0.3143 0.8736 0.9347
No log 8.1739 376 0.9382 0.2056 0.9382 0.9686
No log 8.2174 378 0.9864 0.2055 0.9864 0.9932
No log 8.2609 380 0.9279 0.2056 0.9279 0.9633
No log 8.3043 382 0.9038 0.2552 0.9038 0.9507
No log 8.3478 384 0.9578 0.2056 0.9578 0.9787
No log 8.3913 386 1.0206 0.2055 1.0206 1.0102
No log 8.4348 388 0.9641 0.2056 0.9641 0.9819
No log 8.4783 390 0.8735 0.2756 0.8735 0.9346
No log 8.5217 392 0.8275 0.3414 0.8275 0.9097
No log 8.5652 394 0.7724 0.3882 0.7724 0.8789
No log 8.6087 396 0.7532 0.4240 0.7532 0.8679
No log 8.6522 398 0.7819 0.3414 0.7819 0.8842
No log 8.6957 400 0.8476 0.2756 0.8476 0.9207
No log 8.7391 402 0.8647 0.2424 0.8647 0.9299
No log 8.7826 404 0.8836 0.2347 0.8836 0.9400
No log 8.8261 406 0.8293 0.3414 0.8293 0.9106
No log 8.8696 408 0.7912 0.3414 0.7912 0.8895
No log 8.9130 410 0.7702 0.3770 0.7702 0.8776
No log 8.9565 412 0.7538 0.4240 0.7538 0.8682
No log 9.0 414 0.7380 0.4240 0.7380 0.8591
No log 9.0435 416 0.7456 0.4240 0.7456 0.8635
No log 9.0870 418 0.7409 0.4240 0.7409 0.8607
No log 9.1304 420 0.7571 0.4008 0.7571 0.8701
No log 9.1739 422 0.8098 0.3414 0.8098 0.8999
No log 9.2174 424 0.8332 0.3414 0.8332 0.9128
No log 9.2609 426 0.8240 0.3414 0.8240 0.9078
No log 9.3043 428 0.8400 0.3414 0.8400 0.9165
No log 9.3478 430 0.8327 0.3414 0.8327 0.9125
No log 9.3913 432 0.8250 0.3414 0.8250 0.9083
No log 9.4348 434 0.8500 0.3588 0.8500 0.9220
No log 9.4783 436 0.8836 0.3455 0.8836 0.9400
No log 9.5217 438 0.8951 0.2347 0.8951 0.9461
No log 9.5652 440 0.8859 0.2347 0.8859 0.9412
No log 9.6087 442 0.8677 0.3588 0.8677 0.9315
No log 9.6522 444 0.8559 0.3414 0.8559 0.9252
No log 9.6957 446 0.8357 0.3414 0.8357 0.9141
No log 9.7391 448 0.8070 0.3770 0.8070 0.8983
No log 9.7826 450 0.7842 0.3770 0.7842 0.8856
No log 9.8261 452 0.7770 0.3770 0.7770 0.8815
No log 9.8696 454 0.7690 0.3770 0.7690 0.8769
No log 9.9130 456 0.7642 0.3770 0.7642 0.8742
No log 9.9565 458 0.7636 0.3770 0.7636 0.8739
No log 10.0 460 0.7637 0.3770 0.7637 0.8739
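The final-epoch Qwk (0.3770) is noticeably below the best mid-training values (e.g. 0.5215 at step 108), so if intermediate checkpoints were saved, selecting one by validation Qwk rather than keeping the last epoch may give a better model. A sketch, using a handful of rows transcribed from the table above:

```python
# (step, validation_loss, qwk) rows transcribed from the table above
rows = [
    (108, 0.5797, 0.5215),
    (150, 0.4839, 0.4526),
    (344, 0.6214, 0.5113),
    (460, 0.7637, 0.3770),
]
best_by_qwk = max(rows, key=lambda r: r[2])   # highest rater agreement
best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
```

Note the two criteria disagree here (step 108 vs step 150), which is common when the loss is MSE but the reported target metric is an agreement score.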

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1