ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k16_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8173
  • Qwk (quadratic weighted kappa): 0.3780
  • Mse (mean squared error): 0.8173
  • Rmse (root mean squared error): 0.9041
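The metrics above can be reproduced without any ML framework. As a minimal sketch (the label encoding and number of classes are assumptions; the card does not state them), quadratic weighted kappa compares the observed confusion matrix against the one expected by chance, with disagreements penalized by squared distance:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa over integer labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal histograms of true and predicted labels
    hist_t = [sum(O[i][j] for j in range(n_classes)) for i in range(n_classes)]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Weighted observed disagreement vs. disagreement expected by chance
    num = sum(w[i][j] * O[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * hist_t[i] * hist_p[j] / n
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error between true and predicted labels."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Hypothetical toy labels, purely for illustration
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 1, 2], 3))  # → 0.8
print(rmse([0, 1, 2, 2], [0, 1, 1, 2]))                         # → 0.5
```

Note that when predictions equal labels, Mse equals the validation loss only because this run evaluates with an MSE-style objective; Rmse is simply its square root, which matches the table below row by row.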

Model description

More information needed

Intended uses & limitations

More information needed
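Although the card gives no usage details, a fine-tuned AraBERT checkpoint like this one can typically be loaded with the standard `transformers` auto classes. This is a hedged sketch only: the choice of `AutoModelForSequenceClassification` assumes the essay-scoring head was trained as a sequence-classification/regression head, which the card does not confirm.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k16_task7_organization"

# Assumption: the checkpoint exposes a sequence-classification head
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Hypothetical Arabic essay snippet, for illustration only
inputs = tokenizer("نص المقال هنا", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits)
```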

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0244 2 2.6347 -0.0407 2.6347 1.6232
No log 0.0488 4 1.4763 0.0461 1.4763 1.2150
No log 0.0732 6 1.1378 -0.1866 1.1378 1.0667
No log 0.0976 8 1.0700 -0.0215 1.0700 1.0344
No log 0.1220 10 1.1709 -0.0652 1.1709 1.0821
No log 0.1463 12 1.2764 -0.0856 1.2764 1.1298
No log 0.1707 14 1.3335 -0.2062 1.3335 1.1548
No log 0.1951 16 1.3049 -0.1171 1.3049 1.1423
No log 0.2195 18 1.0628 -0.1111 1.0628 1.0309
No log 0.2439 20 0.8706 0.0 0.8706 0.9330
No log 0.2683 22 0.8118 0.0 0.8118 0.9010
No log 0.2927 24 0.8199 0.1617 0.8199 0.9055
No log 0.3171 26 0.8968 0.1718 0.8968 0.9470
No log 0.3415 28 1.0342 0.0924 1.0342 1.0169
No log 0.3659 30 1.0600 0.0653 1.0600 1.0296
No log 0.3902 32 1.0181 0.0509 1.0181 1.0090
No log 0.4146 34 0.9053 0.0541 0.9053 0.9515
No log 0.4390 36 0.7510 0.1139 0.7510 0.8666
No log 0.4634 38 0.7448 0.1282 0.7448 0.8630
No log 0.4878 40 0.7894 0.1372 0.7894 0.8885
No log 0.5122 42 0.7035 0.0481 0.7035 0.8388
No log 0.5366 44 0.6790 0.0798 0.6790 0.8240
No log 0.5610 46 0.7270 0.2345 0.7270 0.8527
No log 0.5854 48 0.7632 0.1359 0.7632 0.8736
No log 0.6098 50 0.7717 0.1407 0.7717 0.8784
No log 0.6341 52 0.7477 0.1752 0.7477 0.8647
No log 0.6585 54 0.7502 0.1183 0.7502 0.8662
No log 0.6829 56 0.7555 0.1103 0.7555 0.8692
No log 0.7073 58 0.7624 0.1452 0.7624 0.8732
No log 0.7317 60 0.7806 0.1353 0.7806 0.8835
No log 0.7561 62 0.8056 0.0643 0.8056 0.8976
No log 0.7805 64 0.8549 0.1007 0.8549 0.9246
No log 0.8049 66 0.8493 0.1007 0.8493 0.9216
No log 0.8293 68 0.8275 0.0643 0.8275 0.9097
No log 0.8537 70 0.7690 0.1981 0.7690 0.8769
No log 0.8780 72 0.7090 0.2095 0.7090 0.8420
No log 0.9024 74 0.7455 0.2012 0.7455 0.8634
No log 0.9268 76 0.7706 0.1304 0.7706 0.8778
No log 0.9512 78 0.7334 0.0679 0.7334 0.8564
No log 0.9756 80 0.7437 0.2883 0.7437 0.8624
No log 1.0 82 0.7896 0.2817 0.7896 0.8886
No log 1.0244 84 0.7697 0.1918 0.7697 0.8773
No log 1.0488 86 0.7551 0.1050 0.7551 0.8690
No log 1.0732 88 0.8043 0.2395 0.8043 0.8968
No log 1.0976 90 0.8401 0.2612 0.8401 0.9166
No log 1.1220 92 0.7947 0.2673 0.7947 0.8914
No log 1.1463 94 0.7374 0.1941 0.7374 0.8587
No log 1.1707 96 0.6831 0.1775 0.6831 0.8265
No log 1.1951 98 0.6922 0.4044 0.6922 0.8320
No log 1.2195 100 0.7629 0.3032 0.7629 0.8735
No log 1.2439 102 0.7610 0.3195 0.7610 0.8724
No log 1.2683 104 0.7388 0.2113 0.7388 0.8595
No log 1.2927 106 0.8581 0.2583 0.8581 0.9263
No log 1.3171 108 0.8845 0.2224 0.8845 0.9405
No log 1.3415 110 0.8028 0.1597 0.8028 0.8960
No log 1.3659 112 0.7694 0.1489 0.7694 0.8772
No log 1.3902 114 0.8041 0.1972 0.8041 0.8967
No log 1.4146 116 0.8473 0.1884 0.8473 0.9205
No log 1.4390 118 0.9178 0.3012 0.9178 0.9580
No log 1.4634 120 0.9061 0.2861 0.9061 0.9519
No log 1.4878 122 0.7943 0.2692 0.7943 0.8912
No log 1.5122 124 0.7121 0.3341 0.7121 0.8439
No log 1.5366 126 0.6947 0.3782 0.6947 0.8335
No log 1.5610 128 0.6983 0.2819 0.6983 0.8356
No log 1.5854 130 0.7007 0.2819 0.7007 0.8370
No log 1.6098 132 0.7106 0.3545 0.7106 0.8429
No log 1.6341 134 0.6803 0.3814 0.6803 0.8248
No log 1.6585 136 0.6114 0.4468 0.6114 0.7819
No log 1.6829 138 0.6153 0.4468 0.6153 0.7844
No log 1.7073 140 0.6498 0.4402 0.6498 0.8061
No log 1.7317 142 0.6852 0.3022 0.6852 0.8278
No log 1.7561 144 0.7081 0.3906 0.7081 0.8415
No log 1.7805 146 0.7369 0.3879 0.7369 0.8584
No log 1.8049 148 0.7415 0.3625 0.7415 0.8611
No log 1.8293 150 0.7467 0.3658 0.7467 0.8641
No log 1.8537 152 0.7344 0.3151 0.7344 0.8569
No log 1.8780 154 0.7269 0.4176 0.7269 0.8526
No log 1.9024 156 0.7071 0.4073 0.7071 0.8409
No log 1.9268 158 0.6852 0.4073 0.6852 0.8278
No log 1.9512 160 0.6657 0.4222 0.6657 0.8159
No log 1.9756 162 0.6833 0.4302 0.6833 0.8266
No log 2.0 164 0.7063 0.4920 0.7063 0.8404
No log 2.0244 166 0.7343 0.4719 0.7343 0.8569
No log 2.0488 168 0.7147 0.3971 0.7147 0.8454
No log 2.0732 170 0.6738 0.4173 0.6738 0.8208
No log 2.0976 172 0.6640 0.4635 0.6640 0.8148
No log 2.1220 174 0.6577 0.4037 0.6577 0.8110
No log 2.1463 176 0.7362 0.3950 0.7362 0.8580
No log 2.1707 178 0.7612 0.3930 0.7612 0.8725
No log 2.1951 180 0.7063 0.3679 0.7063 0.8404
No log 2.2195 182 0.6849 0.3933 0.6849 0.8276
No log 2.2439 184 0.7097 0.3598 0.7097 0.8424
No log 2.2683 186 0.7445 0.4125 0.7445 0.8629
No log 2.2927 188 0.7513 0.3679 0.7513 0.8668
No log 2.3171 190 0.7703 0.3353 0.7703 0.8777
No log 2.3415 192 0.7516 0.3486 0.7516 0.8669
No log 2.3659 194 0.7057 0.4543 0.7057 0.8401
No log 2.3902 196 0.6923 0.3961 0.6923 0.8321
No log 2.4146 198 0.6967 0.2953 0.6967 0.8347
No log 2.4390 200 0.6972 0.3569 0.6972 0.8350
No log 2.4634 202 0.7338 0.3261 0.7338 0.8566
No log 2.4878 204 0.8323 0.3456 0.8323 0.9123
No log 2.5122 206 0.8771 0.3076 0.8771 0.9365
No log 2.5366 208 0.8007 0.3586 0.8007 0.8948
No log 2.5610 210 0.7161 0.4375 0.7161 0.8462
No log 2.5854 212 0.7054 0.4278 0.7054 0.8399
No log 2.6098 214 0.7180 0.4535 0.7180 0.8474
No log 2.6341 216 0.7707 0.4356 0.7707 0.8779
No log 2.6585 218 0.7865 0.4754 0.7865 0.8869
No log 2.6829 220 0.7508 0.4543 0.7508 0.8665
No log 2.7073 222 0.7166 0.4516 0.7166 0.8465
No log 2.7317 224 0.6872 0.4575 0.6872 0.8290
No log 2.7561 226 0.6998 0.4475 0.6998 0.8365
No log 2.7805 228 0.7647 0.3447 0.7647 0.8745
No log 2.8049 230 0.7664 0.4031 0.7664 0.8754
No log 2.8293 232 0.7368 0.3780 0.7368 0.8583
No log 2.8537 234 0.7533 0.3822 0.7533 0.8679
No log 2.8780 236 0.7588 0.3802 0.7588 0.8711
No log 2.9024 238 0.6905 0.4543 0.6905 0.8310
No log 2.9268 240 0.6555 0.4738 0.6555 0.8096
No log 2.9512 242 0.6573 0.4274 0.6573 0.8108
No log 2.9756 244 0.6916 0.3854 0.6916 0.8316
No log 3.0 246 0.7519 0.3570 0.7519 0.8671
No log 3.0244 248 0.7730 0.3675 0.7730 0.8792
No log 3.0488 250 0.6925 0.4058 0.6925 0.8322
No log 3.0732 252 0.6337 0.5319 0.6337 0.7961
No log 3.0976 254 0.6499 0.4810 0.6499 0.8061
No log 3.1220 256 0.6818 0.4794 0.6818 0.8257
No log 3.1463 258 0.6722 0.4315 0.6722 0.8199
No log 3.1707 260 0.6965 0.5074 0.6965 0.8346
No log 3.1951 262 0.7968 0.3538 0.7968 0.8926
No log 3.2195 264 0.7956 0.3937 0.7956 0.8919
No log 3.2439 266 0.7161 0.4618 0.7161 0.8462
No log 3.2683 268 0.7123 0.4381 0.7123 0.8440
No log 3.2927 270 0.7781 0.3510 0.7781 0.8821
No log 3.3171 272 0.8301 0.3285 0.8301 0.9111
No log 3.3415 274 0.7632 0.3740 0.7632 0.8736
No log 3.3659 276 0.6958 0.3910 0.6958 0.8342
No log 3.3902 278 0.6689 0.4166 0.6689 0.8179
No log 3.4146 280 0.6820 0.4379 0.6820 0.8258
No log 3.4390 282 0.7480 0.3888 0.7480 0.8648
No log 3.4634 284 0.8153 0.4124 0.8153 0.9029
No log 3.4878 286 0.8440 0.3543 0.8440 0.9187
No log 3.5122 288 0.7752 0.4423 0.7752 0.8805
No log 3.5366 290 0.6867 0.4574 0.6867 0.8287
No log 3.5610 292 0.7219 0.4230 0.7219 0.8496
No log 3.5854 294 0.7615 0.3808 0.7615 0.8727
No log 3.6098 296 0.7221 0.4832 0.7221 0.8497
No log 3.6341 298 0.6888 0.4308 0.6888 0.8300
No log 3.6585 300 0.7482 0.3656 0.7482 0.8650
No log 3.6829 302 0.8096 0.3503 0.8096 0.8998
No log 3.7073 304 0.7863 0.4232 0.7863 0.8867
No log 3.7317 306 0.7241 0.4094 0.7241 0.8509
No log 3.7561 308 0.7202 0.4128 0.7202 0.8487
No log 3.7805 310 0.7355 0.3924 0.7355 0.8576
No log 3.8049 312 0.7154 0.4422 0.7154 0.8458
No log 3.8293 314 0.7042 0.3969 0.7042 0.8392
No log 3.8537 316 0.7360 0.4620 0.7360 0.8579
No log 3.8780 318 0.8593 0.3389 0.8593 0.9270
No log 3.9024 320 0.9697 0.3325 0.9697 0.9847
No log 3.9268 322 0.9488 0.3522 0.9488 0.9741
No log 3.9512 324 0.8591 0.3151 0.8591 0.9269
No log 3.9756 326 0.7592 0.3615 0.7592 0.8713
No log 4.0 328 0.7228 0.3615 0.7228 0.8501
No log 4.0244 330 0.7029 0.4125 0.7029 0.8384
No log 4.0488 332 0.7027 0.3961 0.7027 0.8382
No log 4.0732 334 0.7092 0.3601 0.7092 0.8422
No log 4.0976 336 0.7136 0.3578 0.7136 0.8447
No log 4.1220 338 0.7224 0.3530 0.7224 0.8499
No log 4.1463 340 0.7739 0.4263 0.7739 0.8797
No log 4.1707 342 0.8593 0.4173 0.8593 0.9270
No log 4.1951 344 0.8289 0.3955 0.8289 0.9104
No log 4.2195 346 0.8033 0.3677 0.8033 0.8962
No log 4.2439 348 0.7876 0.2926 0.7876 0.8875
No log 4.2683 350 0.7685 0.2813 0.7685 0.8766
No log 4.2927 352 0.7515 0.2943 0.7515 0.8669
No log 4.3171 354 0.7484 0.3224 0.7484 0.8651
No log 4.3415 356 0.7412 0.3144 0.7412 0.8609
No log 4.3659 358 0.7388 0.4342 0.7388 0.8596
No log 4.3902 360 0.7456 0.3478 0.7456 0.8635
No log 4.4146 362 0.7424 0.3817 0.7424 0.8616
No log 4.4390 364 0.7533 0.3980 0.7533 0.8679
No log 4.4634 366 0.7775 0.3958 0.7775 0.8817
No log 4.4878 368 0.7835 0.3567 0.7835 0.8851
No log 4.5122 370 0.7868 0.3183 0.7868 0.8870
No log 4.5366 372 0.7342 0.3746 0.7342 0.8569
No log 4.5610 374 0.6767 0.3399 0.6767 0.8226
No log 4.5854 376 0.6608 0.3224 0.6608 0.8129
No log 4.6098 378 0.6584 0.3813 0.6584 0.8114
No log 4.6341 380 0.6637 0.3937 0.6637 0.8147
No log 4.6585 382 0.6547 0.3861 0.6547 0.8092
No log 4.6829 384 0.6498 0.4096 0.6498 0.8061
No log 4.7073 386 0.6502 0.3856 0.6502 0.8063
No log 4.7317 388 0.6470 0.4526 0.6470 0.8043
No log 4.7561 390 0.6449 0.5070 0.6449 0.8031
No log 4.7805 392 0.6674 0.4754 0.6674 0.8169
No log 4.8049 394 0.7184 0.4371 0.7184 0.8476
No log 4.8293 396 0.7323 0.4371 0.7323 0.8557
No log 4.8537 398 0.6958 0.4550 0.6958 0.8342
No log 4.8780 400 0.6999 0.4360 0.6999 0.8366
No log 4.9024 402 0.7163 0.4713 0.7163 0.8463
No log 4.9268 404 0.7329 0.4713 0.7329 0.8561
No log 4.9512 406 0.7403 0.4848 0.7403 0.8604
No log 4.9756 408 0.7473 0.4557 0.7473 0.8645
No log 5.0 410 0.7428 0.4746 0.7428 0.8618
No log 5.0244 412 0.7477 0.4202 0.7477 0.8647
No log 5.0488 414 0.7278 0.3991 0.7278 0.8531
No log 5.0732 416 0.7119 0.4126 0.7119 0.8437
No log 5.0976 418 0.6951 0.4236 0.6951 0.8337
No log 5.1220 420 0.6978 0.4413 0.6978 0.8353
No log 5.1463 422 0.6950 0.4271 0.6950 0.8337
No log 5.1707 424 0.6583 0.4719 0.6583 0.8114
No log 5.1951 426 0.6482 0.4885 0.6482 0.8051
No log 5.2195 428 0.6481 0.4885 0.6481 0.8050
No log 5.2439 430 0.6567 0.4222 0.6567 0.8104
No log 5.2683 432 0.6912 0.4201 0.6912 0.8314
No log 5.2927 434 0.7026 0.3714 0.7026 0.8382
No log 5.3171 436 0.7362 0.2395 0.7362 0.8580
No log 5.3415 438 0.7934 0.2866 0.7934 0.8907
No log 5.3659 440 0.8113 0.2804 0.8113 0.9007
No log 5.3902 442 0.7762 0.3052 0.7762 0.8810
No log 5.4146 444 0.7010 0.4496 0.7010 0.8372
No log 5.4390 446 0.6870 0.4715 0.6870 0.8289
No log 5.4634 448 0.6864 0.4715 0.6864 0.8285
No log 5.4878 450 0.6669 0.4288 0.6669 0.8167
No log 5.5122 452 0.6734 0.3442 0.6734 0.8206
No log 5.5366 454 0.7156 0.3350 0.7156 0.8459
No log 5.5610 456 0.8039 0.2475 0.8039 0.8966
No log 5.5854 458 0.8284 0.2532 0.8284 0.9102
No log 5.6098 460 0.7929 0.2895 0.7929 0.8904
No log 5.6341 462 0.7609 0.2884 0.7609 0.8723
No log 5.6585 464 0.7479 0.3178 0.7479 0.8648
No log 5.6829 466 0.7232 0.3939 0.7232 0.8504
No log 5.7073 468 0.7179 0.3856 0.7179 0.8473
No log 5.7317 470 0.7308 0.3957 0.7308 0.8549
No log 5.7561 472 0.7413 0.4019 0.7413 0.8610
No log 5.7805 474 0.7445 0.3501 0.7445 0.8628
No log 5.8049 476 0.7470 0.3107 0.7470 0.8643
No log 5.8293 478 0.7699 0.2883 0.7699 0.8775
No log 5.8537 480 0.8002 0.2694 0.8002 0.8945
No log 5.8780 482 0.8237 0.2937 0.8237 0.9076
No log 5.9024 484 0.8158 0.3162 0.8158 0.9032
No log 5.9268 486 0.7896 0.3470 0.7896 0.8886
No log 5.9512 488 0.7853 0.3470 0.7853 0.8862
No log 5.9756 490 0.7713 0.3818 0.7713 0.8783
No log 6.0 492 0.7705 0.3926 0.7705 0.8778
No log 6.0244 494 0.7566 0.4533 0.7566 0.8699
No log 6.0488 496 0.7416 0.4278 0.7416 0.8612
No log 6.0732 498 0.7431 0.3417 0.7431 0.8620
0.4153 6.0976 500 0.7380 0.3946 0.7380 0.8591
0.4153 6.1220 502 0.7244 0.4658 0.7244 0.8511
0.4153 6.1463 504 0.7238 0.4778 0.7238 0.8507
0.4153 6.1707 506 0.7166 0.4962 0.7166 0.8465
0.4153 6.1951 508 0.7050 0.5018 0.7050 0.8396
0.4153 6.2195 510 0.7060 0.4394 0.7060 0.8402
0.4153 6.2439 512 0.7068 0.4126 0.7068 0.8407
0.4153 6.2683 514 0.7024 0.4268 0.7024 0.8381
0.4153 6.2927 516 0.7100 0.4382 0.7100 0.8426
0.4153 6.3171 518 0.7269 0.3902 0.7269 0.8526
0.4153 6.3415 520 0.7357 0.3409 0.7357 0.8578
0.4153 6.3659 522 0.7419 0.3861 0.7419 0.8613
0.4153 6.3902 524 0.7750 0.3690 0.7750 0.8803
0.4153 6.4146 526 0.8173 0.3780 0.8173 0.9041

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 0.1B params
  • Tensor type: F32 (Safetensors)