ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how metrics of this kind are typically computed follows the list):

  • Loss: 0.7411
  • Qwk: 0.3746
  • Mse: 0.7411
  • Rmse: 0.8609
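
The following is a minimal, hedged sketch of how such metrics (Qwk = quadratic weighted kappa, Mse, Rmse) are typically computed with scikit-learn. The label range, rounding strategy, and sample values are illustrative assumptions; the card does not specify how the evaluation was implemented.

```python
# Hedged sketch only: the label set, rounding, and sample values are assumed,
# not taken from this card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate_scores(y_true, y_pred):
    """Compute QWK, MSE, and RMSE for essay-score predictions."""
    y_rounded = np.rint(y_pred).astype(int)  # QWK needs discrete labels
    qwk = cohen_kappa_score(y_true, y_rounded, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Toy example (illustrative values only)
print(evaluate_scores(np.array([3, 2, 4, 1]), np.array([2.8, 2.1, 3.4, 1.6])))
```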

Model description

More information needed

Intended uses & limitations

More information needed
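
No usage details are provided, but the sketch below shows one plausible way to load this checkpoint for scoring the organization of an Arabic essay. The single-score (regression-style) head is an assumption inferred from the MSE/RMSE metrics above, not something the card states.

```python
# Hedged sketch, assuming a single-score classification/regression head;
# the card does not document the intended input format or label scale.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "ضع نص المقال هنا"  # placeholder Arabic essay text
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # a single organization score if the saved head is 1-dimensional
```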

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
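
A minimal sketch of how these values map onto Hugging Face TrainingArguments, assuming the standard Trainer API was used (the actual training script is not shown in the card); the output path is a placeholder.

```python
# Hedged sketch: maps the listed hyperparameters onto TrainingArguments.
# Dataset loading, compute_metrics, and the output path are not part of the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```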

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0208 2 2.7222 -0.0891 2.7222 1.6499
No log 0.0417 4 1.4544 0.1514 1.4544 1.2060
No log 0.0625 6 1.1931 -0.1255 1.1931 1.0923
No log 0.0833 8 1.1510 -0.0490 1.1510 1.0728
No log 0.1042 10 1.0718 0.0719 1.0718 1.0353
No log 0.125 12 1.0309 0.0561 1.0309 1.0153
No log 0.1458 14 1.0513 0.1057 1.0513 1.0253
No log 0.1667 16 1.1621 0.0894 1.1621 1.0780
No log 0.1875 18 1.2067 -0.0386 1.2067 1.0985
No log 0.2083 20 1.1952 -0.0944 1.1952 1.0932
No log 0.2292 22 1.0945 -0.0662 1.0945 1.0462
No log 0.25 24 0.9429 -0.0232 0.9429 0.9710
No log 0.2708 26 0.8736 -0.0578 0.8736 0.9347
No log 0.2917 28 0.7925 -0.0500 0.7925 0.8902
No log 0.3125 30 0.8524 -0.0605 0.8524 0.9232
No log 0.3333 32 1.0937 0.0993 1.0937 1.0458
No log 0.3542 34 1.1347 0.0566 1.1347 1.0652
No log 0.375 36 1.1469 0.0487 1.1469 1.0709
No log 0.3958 38 1.0707 0.0891 1.0707 1.0347
No log 0.4167 40 0.9619 0.1142 0.9619 0.9808
No log 0.4375 42 0.9115 -0.0103 0.9115 0.9547
No log 0.4583 44 0.9204 0.0 0.9204 0.9594
No log 0.4792 46 0.9178 -0.0483 0.9178 0.9580
No log 0.5 48 0.8845 -0.0563 0.8845 0.9405
No log 0.5208 50 0.9759 0.1142 0.9759 0.9879
No log 0.5417 52 1.0950 0.0288 1.0950 1.0464
No log 0.5625 54 1.1052 0.0747 1.1052 1.0513
No log 0.5833 56 1.0874 0.0154 1.0874 1.0428
No log 0.6042 58 1.1156 -0.0744 1.1156 1.0562
No log 0.625 60 1.0183 0.0643 1.0183 1.0091
No log 0.6458 62 0.9622 0.0643 0.9622 0.9809
No log 0.6667 64 0.9702 0.0327 0.9702 0.9850
No log 0.6875 66 0.9433 0.0295 0.9433 0.9712
No log 0.7083 68 0.8932 0.0679 0.8932 0.9451
No log 0.7292 70 0.9151 0.1672 0.9151 0.9566
No log 0.75 72 0.8962 0.2982 0.8962 0.9467
No log 0.7708 74 0.9529 0.3516 0.9529 0.9762
No log 0.7917 76 0.9064 0.2982 0.9064 0.9521
No log 0.8125 78 0.8803 0.2632 0.8803 0.9383
No log 0.8333 80 0.8145 0.1225 0.8145 0.9025
No log 0.8542 82 0.8045 0.0679 0.8045 0.8969
No log 0.875 84 0.8646 0.0327 0.8646 0.9298
No log 0.8958 86 0.9124 -0.0479 0.9124 0.9552
No log 0.9167 88 0.9241 -0.0025 0.9241 0.9613
No log 0.9375 90 0.8833 0.0283 0.8833 0.9399
No log 0.9583 92 0.8810 0.0509 0.8810 0.9386
No log 0.9792 94 0.9347 0.1416 0.9347 0.9668
No log 1.0 96 0.9712 0.1962 0.9712 0.9855
No log 1.0208 98 0.9039 0.2352 0.9039 0.9507
No log 1.0417 100 0.8126 0.2227 0.8126 0.9014
No log 1.0625 102 0.8542 0.0377 0.8542 0.9242
No log 1.0833 104 1.0544 0.0309 1.0544 1.0269
No log 1.1042 106 1.2171 0.0440 1.2171 1.1032
No log 1.125 108 1.3066 0.0467 1.3066 1.1431
No log 1.1458 110 1.2892 0.0971 1.2892 1.1354
No log 1.1667 112 1.1287 0.1679 1.1287 1.0624
No log 1.1875 114 1.0896 0.1654 1.0896 1.0438
No log 1.2083 116 1.0087 0.1689 1.0087 1.0043
No log 1.2292 118 0.9069 0.1421 0.9069 0.9523
No log 1.25 120 0.8222 0.2712 0.8222 0.9068
No log 1.2708 122 0.7615 0.3031 0.7615 0.8726
No log 1.2917 124 0.7743 0.2057 0.7743 0.8799
No log 1.3125 126 0.7835 0.0944 0.7835 0.8852
No log 1.3333 128 0.7385 0.0810 0.7385 0.8594
No log 1.3542 130 0.7211 0.2285 0.7211 0.8492
No log 1.375 132 0.8259 0.2574 0.8259 0.9088
No log 1.3958 134 0.9213 0.1740 0.9213 0.9599
No log 1.4167 136 0.9545 0.0868 0.9545 0.9770
No log 1.4375 138 0.8739 0.0583 0.8739 0.9348
No log 1.4583 140 0.7793 0.1624 0.7793 0.8828
No log 1.4792 142 0.8385 0.1209 0.8385 0.9157
No log 1.5 144 1.0758 0.0931 1.0758 1.0372
No log 1.5208 146 1.1376 0.0931 1.1376 1.0666
No log 1.5417 148 1.0944 0.0896 1.0944 1.0461
No log 1.5625 150 1.0728 0.1296 1.0728 1.0357
No log 1.5833 152 1.0623 0.1476 1.0623 1.0307
No log 1.6042 154 1.0323 0.2584 1.0323 1.0160
No log 1.625 156 0.9938 0.2292 0.9938 0.9969
No log 1.6458 158 0.9842 0.2052 0.9842 0.9921
No log 1.6667 160 0.9972 0.1384 0.9972 0.9986
No log 1.6875 162 0.9875 0.1452 0.9875 0.9937
No log 1.7083 164 0.9212 0.2010 0.9212 0.9598
No log 1.7292 166 0.8697 0.1710 0.8697 0.9326
No log 1.75 168 0.8633 0.2204 0.8633 0.9291
No log 1.7708 170 0.8718 0.2253 0.8718 0.9337
No log 1.7917 172 0.8893 0.2392 0.8893 0.9430
No log 1.8125 174 0.8593 0.1686 0.8593 0.9270
No log 1.8333 176 0.8514 0.2148 0.8514 0.9227
No log 1.8542 178 0.8732 0.1475 0.8732 0.9345
No log 1.875 180 0.8814 0.1741 0.8814 0.9388
No log 1.8958 182 0.8751 0.2591 0.8751 0.9355
No log 1.9167 184 0.8940 0.2237 0.8940 0.9455
No log 1.9375 186 0.9128 0.1786 0.9128 0.9554
No log 1.9583 188 0.8828 0.1935 0.8828 0.9396
No log 1.9792 190 0.8698 0.0604 0.8698 0.9326
No log 2.0 192 0.8648 0.0652 0.8648 0.9299
No log 2.0208 194 0.8709 0.0584 0.8709 0.9332
No log 2.0417 196 0.9129 0.1416 0.9129 0.9555
No log 2.0625 198 0.9047 0.2063 0.9047 0.9511
No log 2.0833 200 0.8904 0.3305 0.8904 0.9436
No log 2.1042 202 0.8398 0.3368 0.8398 0.9164
No log 2.125 204 0.7905 0.3235 0.7905 0.8891
No log 2.1458 206 0.7761 0.2718 0.7761 0.8810
No log 2.1667 208 0.7738 0.2718 0.7738 0.8796
No log 2.1875 210 0.7714 0.3060 0.7714 0.8783
No log 2.2083 212 0.7932 0.2226 0.7932 0.8906
No log 2.2292 214 0.8017 0.1907 0.8017 0.8954
No log 2.25 216 0.7636 0.2842 0.7636 0.8738
No log 2.2708 218 0.7770 0.2685 0.7770 0.8815
No log 2.2917 220 0.8116 0.2953 0.8116 0.9009
No log 2.3125 222 0.8167 0.2505 0.8167 0.9037
No log 2.3333 224 0.8611 0.1056 0.8611 0.9279
No log 2.3542 226 0.8869 0.0878 0.8869 0.9417
No log 2.375 228 0.8392 0.2519 0.8392 0.9161
No log 2.3958 230 0.8008 0.3643 0.8008 0.8949
No log 2.4167 232 0.8005 0.2980 0.8005 0.8947
No log 2.4375 234 0.7887 0.3910 0.7887 0.8881
No log 2.4583 236 0.8142 0.2546 0.8142 0.9023
No log 2.4792 238 0.8483 0.2501 0.8483 0.9210
No log 2.5 240 0.8328 0.2501 0.8328 0.9126
No log 2.5208 242 0.7671 0.2573 0.7671 0.8759
No log 2.5417 244 0.7587 0.3811 0.7587 0.8711
No log 2.5625 246 0.7910 0.3811 0.7910 0.8894
No log 2.5833 248 0.8327 0.3366 0.8327 0.9125
No log 2.6042 250 0.8647 0.3366 0.8647 0.9299
No log 2.625 252 0.8894 0.3299 0.8894 0.9431
No log 2.6458 254 0.9394 0.2116 0.9394 0.9692
No log 2.6667 256 0.9273 0.1900 0.9273 0.9630
No log 2.6875 258 0.9393 0.1630 0.9393 0.9692
No log 2.7083 260 0.9695 0.2232 0.9695 0.9846
No log 2.7292 262 0.9562 0.2359 0.9562 0.9779
No log 2.75 264 0.9116 0.2270 0.9116 0.9548
No log 2.7708 266 0.8898 0.1697 0.8898 0.9433
No log 2.7917 268 0.8495 0.2563 0.8495 0.9217
No log 2.8125 270 0.8177 0.3007 0.8177 0.9042
No log 2.8333 272 0.8228 0.3146 0.8228 0.9071
No log 2.8542 274 0.8206 0.3393 0.8206 0.9059
No log 2.875 276 0.8087 0.3931 0.8087 0.8993
No log 2.8958 278 0.8107 0.3701 0.8107 0.9004
No log 2.9167 280 0.8377 0.2634 0.8377 0.9153
No log 2.9375 282 0.8933 0.2499 0.8933 0.9451
No log 2.9583 284 0.9018 0.2735 0.9018 0.9496
No log 2.9792 286 0.8785 0.2469 0.8785 0.9373
No log 3.0 288 0.9066 0.2450 0.9066 0.9522
No log 3.0208 290 0.8697 0.1837 0.8697 0.9326
No log 3.0417 292 0.8498 0.2853 0.8498 0.9218
No log 3.0625 294 0.8859 0.3930 0.8859 0.9412
No log 3.0833 296 0.8938 0.2857 0.8938 0.9454
No log 3.1042 298 0.8430 0.3121 0.8430 0.9182
No log 3.125 300 0.8708 0.2342 0.8708 0.9332
No log 3.1458 302 0.9617 0.2013 0.9617 0.9807
No log 3.1667 304 0.9197 0.2342 0.9197 0.9590
No log 3.1875 306 0.8998 0.2415 0.8998 0.9486
No log 3.2083 308 0.9412 0.1926 0.9412 0.9701
No log 3.2292 310 0.9152 0.1672 0.9152 0.9566
No log 3.25 312 0.8201 0.2835 0.8201 0.9056
No log 3.2708 314 0.8034 0.1548 0.8034 0.8963
No log 3.2917 316 0.8595 0.2097 0.8595 0.9271
No log 3.3125 318 0.8081 0.2680 0.8081 0.8990
No log 3.3333 320 0.7670 0.3618 0.7670 0.8758
No log 3.3542 322 0.7553 0.3060 0.7553 0.8691
No log 3.375 324 0.7852 0.3383 0.7852 0.8861
No log 3.3958 326 0.7565 0.3347 0.7565 0.8697
No log 3.4167 328 0.7398 0.2877 0.7398 0.8601
No log 3.4375 330 0.7805 0.3393 0.7805 0.8835
No log 3.4583 332 0.7616 0.3425 0.7616 0.8727
No log 3.4792 334 0.7390 0.3253 0.7390 0.8596
No log 3.5 336 0.7561 0.3398 0.7561 0.8695
No log 3.5208 338 0.7948 0.3293 0.7948 0.8915
No log 3.5417 340 0.8094 0.3252 0.8094 0.8997
No log 3.5625 342 0.7734 0.3398 0.7734 0.8794
No log 3.5833 344 0.8168 0.3060 0.8168 0.9038
No log 3.6042 346 0.8441 0.3060 0.8441 0.9187
No log 3.625 348 0.7927 0.3433 0.7927 0.8904
No log 3.6458 350 0.7619 0.3042 0.7619 0.8729
No log 3.6667 352 0.7708 0.3336 0.7708 0.8779
No log 3.6875 354 0.7482 0.3618 0.7482 0.8650
No log 3.7083 356 0.7487 0.3458 0.7487 0.8653
No log 3.7292 358 0.7464 0.3549 0.7464 0.8639
No log 3.75 360 0.7473 0.4094 0.7473 0.8645
No log 3.7708 362 0.7741 0.3643 0.7741 0.8798
No log 3.7917 364 0.8471 0.3066 0.8471 0.9204
No log 3.8125 366 0.8280 0.3066 0.8280 0.9099
No log 3.8333 368 0.7502 0.3583 0.7502 0.8661
No log 3.8542 370 0.7117 0.4073 0.7117 0.8436
No log 3.875 372 0.7034 0.3864 0.7034 0.8387
No log 3.8958 374 0.7044 0.3791 0.7044 0.8393
No log 3.9167 376 0.7196 0.4295 0.7196 0.8483
No log 3.9375 378 0.7271 0.3964 0.7271 0.8527
No log 3.9583 380 0.7050 0.4107 0.7050 0.8396
No log 3.9792 382 0.7074 0.3950 0.7074 0.8411
No log 4.0 384 0.7358 0.3946 0.7358 0.8578
No log 4.0208 386 0.7301 0.4169 0.7301 0.8544
No log 4.0417 388 0.7239 0.3455 0.7239 0.8508
No log 4.0625 390 0.7413 0.3597 0.7413 0.8610
No log 4.0833 392 0.7544 0.3372 0.7544 0.8685
No log 4.1042 394 0.7563 0.3372 0.7563 0.8697
No log 4.125 396 0.7737 0.2985 0.7737 0.8796
No log 4.1458 398 0.8195 0.2936 0.8195 0.9053
No log 4.1667 400 0.8483 0.2784 0.8483 0.9211
No log 4.1875 402 0.8691 0.3006 0.8691 0.9322
No log 4.2083 404 0.7842 0.3734 0.7842 0.8856
No log 4.2292 406 0.7387 0.3335 0.7387 0.8595
No log 4.25 408 0.7326 0.3433 0.7326 0.8559
No log 4.2708 410 0.7427 0.2769 0.7427 0.8618
No log 4.2917 412 0.7519 0.3201 0.7519 0.8671
No log 4.3125 414 0.7310 0.2682 0.7310 0.8550
No log 4.3333 416 0.7146 0.3398 0.7146 0.8453
No log 4.3542 418 0.7173 0.3352 0.7173 0.8470
No log 4.375 420 0.7242 0.3581 0.7242 0.8510
No log 4.3958 422 0.7115 0.4244 0.7115 0.8435
No log 4.4167 424 0.7167 0.4595 0.7167 0.8466
No log 4.4375 426 0.7074 0.4595 0.7074 0.8411
No log 4.4583 428 0.6964 0.3738 0.6964 0.8345
No log 4.4792 430 0.6867 0.3649 0.6867 0.8287
No log 4.5 432 0.6864 0.3649 0.6864 0.8285
No log 4.5208 434 0.6867 0.3788 0.6867 0.8287
No log 4.5417 436 0.6898 0.3738 0.6898 0.8305
No log 4.5625 438 0.6780 0.4678 0.6780 0.8234
No log 4.5833 440 0.6871 0.4458 0.6871 0.8289
No log 4.6042 442 0.7138 0.3769 0.7138 0.8449
No log 4.625 444 0.7163 0.3864 0.7163 0.8464
No log 4.6458 446 0.7050 0.3762 0.7050 0.8397
No log 4.6667 448 0.7050 0.4022 0.7050 0.8397
No log 4.6875 450 0.7012 0.3762 0.7012 0.8374
No log 4.7083 452 0.7171 0.4221 0.7171 0.8468
No log 4.7292 454 0.7560 0.3854 0.7560 0.8695
No log 4.75 456 0.7499 0.4119 0.7499 0.8659
No log 4.7708 458 0.7361 0.3899 0.7361 0.8580
No log 4.7917 460 0.7406 0.3990 0.7406 0.8606
No log 4.8125 462 0.7402 0.4422 0.7402 0.8604
No log 4.8333 464 0.7787 0.3568 0.7787 0.8825
No log 4.8542 466 0.8461 0.3612 0.8461 0.9198
No log 4.875 468 0.7945 0.3810 0.7945 0.8913
No log 4.8958 470 0.7212 0.4179 0.7212 0.8492
No log 4.9167 472 0.7731 0.3633 0.7731 0.8793
No log 4.9375 474 0.8753 0.2323 0.8753 0.9356
No log 4.9583 476 0.8475 0.2323 0.8475 0.9206
No log 4.9792 478 0.7538 0.4212 0.7538 0.8682
No log 5.0 480 0.7481 0.4375 0.7481 0.8649
No log 5.0208 482 0.7492 0.4314 0.7492 0.8655
No log 5.0417 484 0.7307 0.4276 0.7307 0.8548
No log 5.0625 486 0.7434 0.4274 0.7434 0.8622
No log 5.0833 488 0.7654 0.3878 0.7654 0.8749
No log 5.1042 490 0.7840 0.3939 0.7840 0.8854
No log 5.125 492 0.7622 0.3814 0.7622 0.8730
No log 5.1458 494 0.7446 0.4129 0.7446 0.8629
No log 5.1667 496 0.7381 0.3899 0.7381 0.8591
No log 5.1875 498 0.7332 0.4129 0.7332 0.8562
0.5303 5.2083 500 0.7479 0.4031 0.7479 0.8648
0.5303 5.2292 502 0.7160 0.3972 0.7160 0.8461
0.5303 5.25 504 0.6910 0.3618 0.6910 0.8313
0.5303 5.2708 506 0.7265 0.4451 0.7265 0.8524
0.5303 5.2917 508 0.7181 0.4473 0.7181 0.8474
0.5303 5.3125 510 0.6793 0.3937 0.6793 0.8242
0.5303 5.3333 512 0.7175 0.4249 0.7175 0.8470
0.5303 5.3542 514 0.7670 0.3807 0.7670 0.8758
0.5303 5.375 516 0.7514 0.4282 0.7514 0.8668
0.5303 5.3958 518 0.7100 0.4435 0.7100 0.8426
0.5303 5.4167 520 0.7415 0.3885 0.7415 0.8611
0.5303 5.4375 522 0.7392 0.3769 0.7392 0.8597
0.5303 5.4583 524 0.7141 0.4586 0.7141 0.8450
0.5303 5.4792 526 0.7694 0.4140 0.7694 0.8771
0.5303 5.5 528 0.8019 0.4230 0.8019 0.8955
0.5303 5.5208 530 0.7347 0.4321 0.7347 0.8571
0.5303 5.5417 532 0.6938 0.4417 0.6938 0.8329
0.5303 5.5625 534 0.7218 0.4413 0.7218 0.8496
0.5303 5.5833 536 0.7250 0.3817 0.7250 0.8514
0.5303 5.6042 538 0.7308 0.4285 0.7308 0.8548
0.5303 5.625 540 0.7885 0.4210 0.7885 0.8880
0.5303 5.6458 542 0.8309 0.4210 0.8309 0.9115
0.5303 5.6667 544 0.8094 0.4210 0.8094 0.8997
0.5303 5.6875 546 0.7970 0.4038 0.7970 0.8927
0.5303 5.7083 548 0.7671 0.3990 0.7671 0.8758
0.5303 5.7292 550 0.7589 0.3791 0.7589 0.8711
0.5303 5.75 552 0.7631 0.3990 0.7631 0.8735
0.5303 5.7708 554 0.7560 0.3703 0.7560 0.8695
0.5303 5.7917 556 0.7411 0.3746 0.7411 0.8609

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1