ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6274
  • QWK (quadratic weighted kappa): 0.4721
  • MSE: 0.6274
  • RMSE: 0.7921
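The card reports QWK alongside MSE and RMSE but does not define them. As a point of reference, the following is a minimal, dependency-free sketch of how these three metrics are conventionally computed for ordinal essay scores; the function names (`quadratic_weighted_kappa`, `mse`, `rmse`) are illustrative, not part of this model's code.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred):
    """Cohen's kappa with quadratic weights over integer ordinal labels."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n = hi - lo + 1
    # observed confusion matrix
    O = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        O[t - lo][p - lo] += 1
    # marginal histograms, used to build the expected (chance) matrix
    hist_t = [0] * n
    hist_p = [0] * n
    for t in y_true:
        hist_t[t - lo] += 1
    for p in y_pred:
        hist_p[p - lo] += 1
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2)  # quadratic disagreement weight
            expected = hist_t[i] * hist_p[j] / total
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Note that when MSE is computed on the same per-example losses as the validation loss, the two columns coincide, which is why Loss and MSE match (0.6274) in the table above.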

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
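The linear scheduler listed above decays the learning rate from its initial value to zero over the total number of training steps. A minimal sketch of that shape, assuming zero warmup steps (the card does not list any); `linear_lr` is an illustrative helper, not part of the training code:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given step under a linear warmup + linear decay schedule.

    Mirrors the shape of the Transformers 'linear' scheduler: ramp up over
    warmup_steps, then decay linearly from base_lr to zero at total_steps.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

With warmup_steps=0 the rate starts at 2e-05 and reaches zero exactly at the final step, which matches the hyperparameters above.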

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0909 2 2.6233 -0.0729 2.6233 1.6196
No log 0.1818 4 1.0803 0.0659 1.0803 1.0394
No log 0.2727 6 0.8665 0.1050 0.8665 0.9308
No log 0.3636 8 1.1225 0.1243 1.1225 1.0595
No log 0.4545 10 1.0349 0.1747 1.0349 1.0173
No log 0.5455 12 0.7965 0.0327 0.7965 0.8924
No log 0.6364 14 0.7823 0.1786 0.7823 0.8845
No log 0.7273 16 0.8328 0.2435 0.8328 0.9126
No log 0.8182 18 0.7585 0.1786 0.7585 0.8709
No log 0.9091 20 0.7039 0.0393 0.7039 0.8390
No log 1.0 22 0.7225 0.2085 0.7225 0.8500
No log 1.0909 24 0.6752 0.1846 0.6752 0.8217
No log 1.1818 26 0.6690 0.1961 0.6690 0.8179
No log 1.2727 28 0.6678 0.1846 0.6678 0.8172
No log 1.3636 30 0.7158 0.1183 0.7158 0.8461
No log 1.4545 32 0.8079 0.1718 0.8079 0.8988
No log 1.5455 34 0.8210 0.1962 0.8210 0.9061
No log 1.6364 36 0.8253 0.1962 0.8253 0.9085
No log 1.7273 38 0.8457 0.1416 0.8457 0.9196
No log 1.8182 40 0.8355 0.1416 0.8355 0.9141
No log 1.9091 42 0.7766 0.1866 0.7766 0.8813
No log 2.0 44 0.7471 0.0643 0.7471 0.8644
No log 2.0909 46 0.7339 0.1094 0.7339 0.8567
No log 2.1818 48 0.7182 0.1393 0.7182 0.8475
No log 2.2727 50 0.7123 0.1624 0.7123 0.8440
No log 2.3636 52 0.7360 0.0781 0.7360 0.8579
No log 2.4545 54 0.7020 0.2386 0.7020 0.8378
No log 2.5455 56 0.7032 0.3628 0.7032 0.8386
No log 2.6364 58 0.7098 0.2929 0.7098 0.8425
No log 2.7273 60 0.7501 0.2204 0.7501 0.8661
No log 2.8182 62 0.7546 0.3253 0.7546 0.8687
No log 2.9091 64 0.7435 0.3106 0.7435 0.8623
No log 3.0 66 0.7383 0.3106 0.7383 0.8592
No log 3.0909 68 0.7405 0.3667 0.7405 0.8605
No log 3.1818 70 0.7400 0.3375 0.7400 0.8603
No log 3.2727 72 0.7317 0.3375 0.7317 0.8554
No log 3.3636 74 0.7513 0.3034 0.7513 0.8668
No log 3.4545 76 0.7508 0.2109 0.7508 0.8665
No log 3.5455 78 0.7849 0.1672 0.7849 0.8860
No log 3.6364 80 0.8217 0.1416 0.8217 0.9065
No log 3.7273 82 0.7749 0.1452 0.7749 0.8803
No log 3.8182 84 0.7634 0.1935 0.7634 0.8738
No log 3.9091 86 0.7705 0.2353 0.7705 0.8778
No log 4.0 88 0.8020 0.2591 0.8020 0.8955
No log 4.0909 90 0.8450 0.2806 0.8450 0.9192
No log 4.1818 92 0.8603 0.2843 0.8603 0.9275
No log 4.2727 94 0.8955 0.2962 0.8955 0.9463
No log 4.3636 96 0.8751 0.2893 0.8751 0.9354
No log 4.4545 98 0.8140 0.1888 0.8140 0.9022
No log 4.5455 100 0.8328 0.2273 0.8328 0.9126
No log 4.6364 102 0.8554 0.3163 0.8554 0.9249
No log 4.7273 104 0.8488 0.2594 0.8488 0.9213
No log 4.8182 106 0.8406 0.2622 0.8406 0.9168
No log 4.9091 108 0.8434 0.2121 0.8434 0.9184
No log 5.0 110 0.8433 0.3417 0.8433 0.9183
No log 5.0909 112 0.7757 0.3209 0.7757 0.8807
No log 5.1818 114 0.7020 0.3015 0.7020 0.8378
No log 5.2727 116 0.7168 0.2835 0.7168 0.8467
No log 5.3636 118 0.7627 0.2781 0.7627 0.8734
No log 5.4545 120 0.8263 0.2463 0.8263 0.9090
No log 5.5455 122 0.8532 0.3131 0.8532 0.9237
No log 5.6364 124 0.8047 0.2058 0.8047 0.8971
No log 5.7273 126 0.7979 0.2270 0.7979 0.8932
No log 5.8182 128 0.8438 0.2720 0.8438 0.9186
No log 5.9091 130 0.9696 0.2392 0.9696 0.9847
No log 6.0 132 1.0502 0.0378 1.0502 1.0248
No log 6.0909 134 1.0171 0.1336 1.0171 1.0085
No log 6.1818 136 0.9815 0.1173 0.9815 0.9907
No log 6.2727 138 0.9023 0.1779 0.9023 0.9499
No log 6.3636 140 0.8495 0.2681 0.8495 0.9217
No log 6.4545 142 0.8793 0.1801 0.8793 0.9377
No log 6.5455 144 0.8401 0.2439 0.8401 0.9165
No log 6.6364 146 0.7994 0.2958 0.7994 0.8941
No log 6.7273 148 0.8615 0.2805 0.8615 0.9282
No log 6.8182 150 0.9047 0.2163 0.9047 0.9512
No log 6.9091 152 0.8459 0.2633 0.8459 0.9197
No log 7.0 154 0.7870 0.2809 0.7870 0.8871
No log 7.0909 156 0.7690 0.3209 0.7690 0.8769
No log 7.1818 158 0.7731 0.3231 0.7731 0.8793
No log 7.2727 160 0.7630 0.3197 0.7630 0.8735
No log 7.3636 162 0.7927 0.3130 0.7927 0.8904
No log 7.4545 164 0.8615 0.3711 0.8615 0.9282
No log 7.5455 166 0.8821 0.3343 0.8821 0.9392
No log 7.6364 168 0.8348 0.4044 0.8348 0.9137
No log 7.7273 170 0.8121 0.3775 0.8121 0.9011
No log 7.8182 172 0.7656 0.3508 0.7656 0.8750
No log 7.9091 174 0.7145 0.4259 0.7145 0.8453
No log 8.0 176 0.6969 0.4892 0.6969 0.8348
No log 8.0909 178 0.7050 0.4037 0.7050 0.8396
No log 8.1818 180 0.8740 0.3934 0.8740 0.9349
No log 8.2727 182 1.0833 0.3442 1.0833 1.0408
No log 8.3636 184 1.0769 0.3259 1.0769 1.0377
No log 8.4545 186 0.9940 0.3074 0.9940 0.9970
No log 8.5455 188 0.8139 0.4404 0.8139 0.9022
No log 8.6364 190 0.7203 0.3863 0.7203 0.8487
No log 8.7273 192 0.7043 0.4225 0.7043 0.8392
No log 8.8182 194 0.7134 0.3393 0.7134 0.8446
No log 8.9091 196 0.7541 0.3612 0.7541 0.8684
No log 9.0 198 0.7614 0.3550 0.7614 0.8726
No log 9.0909 200 0.7620 0.4154 0.7620 0.8730
No log 9.1818 202 0.7973 0.4116 0.7973 0.8929
No log 9.2727 204 0.7710 0.3906 0.7710 0.8781
No log 9.3636 206 0.7676 0.3669 0.7676 0.8761
No log 9.4545 208 0.7497 0.4428 0.7497 0.8658
No log 9.5455 210 0.7970 0.3402 0.7970 0.8928
No log 9.6364 212 0.8017 0.3402 0.8017 0.8954
No log 9.7273 214 0.8091 0.3402 0.8091 0.8995
No log 9.8182 216 0.8788 0.3355 0.8788 0.9375
No log 9.9091 218 0.8689 0.3410 0.8689 0.9321
No log 10.0 220 0.8349 0.3626 0.8349 0.9137
No log 10.0909 222 0.8269 0.3285 0.8269 0.9093
No log 10.1818 224 0.8504 0.3719 0.8504 0.9222
No log 10.2727 226 0.8573 0.3719 0.8573 0.9259
No log 10.3636 228 0.7585 0.3950 0.7585 0.8709
No log 10.4545 230 0.7052 0.4600 0.7052 0.8398
No log 10.5455 232 0.7345 0.3239 0.7345 0.8570
No log 10.6364 234 0.7098 0.2709 0.7098 0.8425
No log 10.7273 236 0.6884 0.4535 0.6884 0.8297
No log 10.8182 238 0.8164 0.3719 0.8164 0.9035
No log 10.9091 240 1.0029 0.2977 1.0029 1.0015
No log 11.0 242 1.0025 0.2591 1.0025 1.0013
No log 11.0909 244 0.8777 0.3312 0.8777 0.9369
No log 11.1818 246 0.7995 0.3456 0.7995 0.8942
No log 11.2727 248 0.7731 0.4350 0.7731 0.8793
No log 11.3636 250 0.7751 0.4350 0.7751 0.8804
No log 11.4545 252 0.7830 0.4430 0.7830 0.8849
No log 11.5455 254 0.7538 0.3891 0.7538 0.8682
No log 11.6364 256 0.7362 0.3042 0.7362 0.8580
No log 11.7273 258 0.7492 0.3133 0.7492 0.8656
No log 11.8182 260 0.7554 0.3714 0.7554 0.8692
No log 11.9091 262 0.7877 0.3841 0.7877 0.8875
No log 12.0 264 0.8341 0.3648 0.8341 0.9133
No log 12.0909 266 0.8276 0.3888 0.8276 0.9097
No log 12.1818 268 0.7674 0.3665 0.7674 0.8760
No log 12.2727 270 0.7658 0.2747 0.7658 0.8751
No log 12.3636 272 0.7917 0.3035 0.7917 0.8898
No log 12.4545 274 0.8348 0.3531 0.8348 0.9137
No log 12.5455 276 0.8557 0.3425 0.8557 0.9250
No log 12.6364 278 0.8842 0.3648 0.8842 0.9403
No log 12.7273 280 0.8505 0.3292 0.8505 0.9222
No log 12.8182 282 0.7906 0.3635 0.7906 0.8891
No log 12.9091 284 0.7718 0.2999 0.7718 0.8785
No log 13.0 286 0.7743 0.3297 0.7743 0.8799
No log 13.0909 288 0.7704 0.3616 0.7704 0.8777
No log 13.1818 290 0.7914 0.4106 0.7914 0.8896
No log 13.2727 292 0.8418 0.4051 0.8418 0.9175
No log 13.3636 294 0.8307 0.3780 0.8307 0.9114
No log 13.4545 296 0.7881 0.3508 0.7881 0.8878
No log 13.5455 298 0.7687 0.3432 0.7687 0.8768
No log 13.6364 300 0.7953 0.3196 0.7953 0.8918
No log 13.7273 302 0.8235 0.3402 0.8235 0.9075
No log 13.8182 304 0.9357 0.3886 0.9357 0.9673
No log 13.9091 306 0.9674 0.3018 0.9674 0.9836
No log 14.0 308 0.9217 0.3563 0.9217 0.9601
No log 14.0909 310 0.9115 0.3867 0.9115 0.9547
No log 14.1818 312 0.8846 0.3760 0.8846 0.9405
No log 14.2727 314 0.8581 0.3667 0.8581 0.9263
No log 14.3636 316 0.8623 0.3505 0.8623 0.9286
No log 14.4545 318 0.8375 0.4031 0.8375 0.9151
No log 14.5455 320 0.8232 0.4031 0.8232 0.9073
No log 14.6364 322 0.7755 0.4740 0.7755 0.8806
No log 14.7273 324 0.7524 0.4393 0.7524 0.8674
No log 14.8182 326 0.7987 0.4097 0.7987 0.8937
No log 14.9091 328 0.8197 0.3700 0.8197 0.9054
No log 15.0 330 0.8101 0.3489 0.8101 0.9000
No log 15.0909 332 0.8074 0.3822 0.8074 0.8985
No log 15.1818 334 0.8035 0.4031 0.8035 0.8964
No log 15.2727 336 0.7611 0.3822 0.7611 0.8724
No log 15.3636 338 0.7277 0.4183 0.7277 0.8531
No log 15.4545 340 0.7267 0.4116 0.7267 0.8525
No log 15.5455 342 0.7507 0.3822 0.7507 0.8664
No log 15.6364 344 0.7894 0.3700 0.7894 0.8885
No log 15.7273 346 0.7974 0.3700 0.7974 0.8930
No log 15.8182 348 0.7534 0.3822 0.7534 0.8680
No log 15.9091 350 0.6938 0.4448 0.6938 0.8329
No log 16.0 352 0.6958 0.4699 0.6958 0.8342
No log 16.0909 354 0.6991 0.4486 0.6991 0.8361
No log 16.1818 356 0.7017 0.4290 0.7017 0.8377
No log 16.2727 358 0.7301 0.4051 0.7301 0.8545
No log 16.3636 360 0.8177 0.3805 0.8177 0.9042
No log 16.4545 362 0.8281 0.3805 0.8281 0.9100
No log 16.5455 364 0.7943 0.3548 0.7943 0.8912
No log 16.6364 366 0.7776 0.3930 0.7776 0.8818
No log 16.7273 368 0.7679 0.3950 0.7679 0.8763
No log 16.8182 370 0.7841 0.4163 0.7841 0.8855
No log 16.9091 372 0.8323 0.3586 0.8323 0.9123
No log 17.0 374 0.8290 0.3586 0.8290 0.9105
No log 17.0909 376 0.7713 0.3822 0.7713 0.8782
No log 17.1818 378 0.7213 0.4001 0.7213 0.8493
No log 17.2727 380 0.7053 0.4322 0.7053 0.8398
No log 17.3636 382 0.7042 0.4341 0.7042 0.8391
No log 17.4545 384 0.7074 0.4321 0.7074 0.8411
No log 17.5455 386 0.7030 0.4321 0.7030 0.8385
No log 17.6364 388 0.7212 0.4408 0.7212 0.8493
No log 17.7273 390 0.7392 0.3865 0.7392 0.8598
No log 17.8182 392 0.7696 0.3865 0.7696 0.8773
No log 17.9091 394 0.7462 0.3930 0.7462 0.8638
No log 18.0 396 0.6888 0.4290 0.6888 0.8299
No log 18.0909 398 0.6691 0.4402 0.6691 0.8180
No log 18.1818 400 0.6685 0.4516 0.6685 0.8176
No log 18.2727 402 0.6716 0.4840 0.6716 0.8195
No log 18.3636 404 0.7233 0.4116 0.7233 0.8505
No log 18.4545 406 0.8132 0.3620 0.8132 0.9018
No log 18.5455 408 0.8011 0.3948 0.8011 0.8951
No log 18.6364 410 0.7159 0.3489 0.7159 0.8461
No log 18.7273 412 0.6646 0.4700 0.6646 0.8152
No log 18.8182 414 0.6688 0.4820 0.6688 0.8178
No log 18.9091 416 0.6751 0.4820 0.6751 0.8217
No log 19.0 418 0.6923 0.4037 0.6923 0.8320
No log 19.0909 420 0.7402 0.3842 0.7402 0.8604
No log 19.1818 422 0.7649 0.3608 0.7649 0.8746
No log 19.2727 424 0.7840 0.3402 0.7840 0.8854
No log 19.3636 426 0.8219 0.3059 0.8219 0.9066
No log 19.4545 428 0.8122 0.3207 0.8122 0.9012
No log 19.5455 430 0.8296 0.3440 0.8296 0.9108
No log 19.6364 432 0.8110 0.3207 0.8110 0.9005
No log 19.7273 434 0.7610 0.4081 0.7610 0.8723
No log 19.8182 436 0.7403 0.4179 0.7403 0.8604
No log 19.9091 438 0.7576 0.4281 0.7576 0.8704
No log 20.0 440 0.8562 0.3505 0.8562 0.9253
No log 20.0909 442 0.9484 0.3760 0.9484 0.9739
No log 20.1818 444 0.9627 0.3760 0.9627 0.9812
No log 20.2727 446 0.8984 0.3760 0.8984 0.9479
No log 20.3636 448 0.8107 0.4112 0.8107 0.9004
No log 20.4545 450 0.8219 0.4350 0.8219 0.9066
No log 20.5455 452 0.9052 0.3425 0.9052 0.9514
No log 20.6364 454 1.0969 0.2977 1.0969 1.0473
No log 20.7273 456 1.1574 0.2754 1.1574 1.0758
No log 20.8182 458 1.0356 0.3305 1.0356 1.0176
No log 20.9091 460 0.8538 0.2900 0.8538 0.9240
No log 21.0 462 0.7925 0.3689 0.7925 0.8902
No log 21.0909 464 0.7649 0.4044 0.7649 0.8746
No log 21.1818 466 0.7746 0.4044 0.7746 0.8801
No log 21.2727 468 0.7966 0.3626 0.7966 0.8925
No log 21.3636 470 0.7837 0.3976 0.7837 0.8853
No log 21.4545 472 0.7758 0.3997 0.7758 0.8808
No log 21.5455 474 0.7704 0.3307 0.7704 0.8778
No log 21.6364 476 0.7870 0.3343 0.7870 0.8871
No log 21.7273 478 0.7433 0.3997 0.7433 0.8622
No log 21.8182 480 0.7016 0.4091 0.7016 0.8376
No log 21.9091 482 0.6893 0.3862 0.6893 0.8303
No log 22.0 484 0.6936 0.4222 0.6936 0.8329
No log 22.0909 486 0.7220 0.4017 0.7220 0.8497
No log 22.1818 488 0.7515 0.3548 0.7515 0.8669
No log 22.2727 490 0.7452 0.4232 0.7452 0.8632
No log 22.3636 492 0.6935 0.3931 0.6935 0.8327
No log 22.4545 494 0.6745 0.4397 0.6745 0.8213
No log 22.5455 496 0.6646 0.3910 0.6646 0.8152
No log 22.6364 498 0.6572 0.3577 0.6572 0.8107
0.3752 22.7273 500 0.6768 0.4724 0.6768 0.8227
0.3752 22.8182 502 0.6931 0.4354 0.6931 0.8325
0.3752 22.9091 504 0.7049 0.4568 0.7049 0.8396
0.3752 23.0 506 0.6812 0.4085 0.6812 0.8253
0.3752 23.0909 508 0.6783 0.4302 0.6783 0.8236
0.3752 23.1818 510 0.7260 0.4777 0.7260 0.8521
0.3752 23.2727 512 0.8463 0.4286 0.8463 0.9200
0.3752 23.3636 514 0.8752 0.4286 0.8752 0.9355
0.3752 23.4545 516 0.7918 0.3929 0.7918 0.8899
0.3752 23.5455 518 0.6914 0.4980 0.6914 0.8315
0.3752 23.6364 520 0.6334 0.4336 0.6334 0.7959
0.3752 23.7273 522 0.6201 0.4413 0.6201 0.7874
0.3752 23.8182 524 0.6243 0.4562 0.6243 0.7902
0.3752 23.9091 526 0.6191 0.4639 0.6191 0.7868
0.3752 24.0 528 0.6274 0.4721 0.6274 0.7921

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
