ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not documented. It achieves the following results on the evaluation set:

  • Loss: 0.7467
  • Qwk: 0.0697
  • Mse: 0.7467
  • Rmse: 0.8641
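The reported metrics can be reproduced from raw predictions. A minimal NumPy sketch (assuming integer essay-organization scores as labels and rounded model outputs as predictions; function names are illustrative, not from this repo):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, as reported in the Qwk column."""
    y_true = np.asarray(y_true, dtype=int)
    y_pred = np.asarray(y_pred, dtype=int)
    # Observed agreement (confusion) matrix.
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Expected matrix under chance agreement, scaled to the same total count.
    hist_true = observed.sum(axis=1)
    hist_pred = observed.sum(axis=0)
    expected = np.outer(hist_true, hist_pred) / len(y_true)
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    idx = np.arange(n_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root (the Mse and Rmse columns)."""
    err = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    mse = float(np.mean(err ** 2))
    return mse, mse ** 0.5
```

Note that when Loss equals Mse (as in the summary above), the model was trained with a regression-style mean-squared-error objective.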

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
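With the linear scheduler, the learning rate decays in a straight line from 2e-05 toward zero over training. A sketch of that schedule (the total of 1000 optimizer steps is an assumption, inferred from the ~10 steps per epoch visible in the results table times 100 epochs; no warmup is configured here):

```python
def linear_lr(step, base_lr=2e-5, total_steps=1000, warmup_steps=0):
    """Learning rate under a linear schedule: optional linear warmup,
    then straight-line decay reaching zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Step 0 uses the configured 2e-05; halfway through, the rate is 1e-05.
```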

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.2 2 2.4280 -0.0646 2.4280 1.5582
No log 0.4 4 1.0881 0.2875 1.0881 1.0431
No log 0.6 6 1.0474 -0.1517 1.0474 1.0234
No log 0.8 8 1.3691 -0.1706 1.3691 1.1701
No log 1.0 10 1.2737 -0.1706 1.2737 1.1286
No log 1.2 12 1.0007 0.0283 1.0007 1.0003
No log 1.4 14 0.9275 0.1183 0.9275 0.9630
No log 1.6 16 0.8221 0.0428 0.8221 0.9067
No log 1.8 18 0.8087 0.0 0.8087 0.8993
No log 2.0 20 0.7870 0.0 0.7870 0.8871
No log 2.2 22 0.7671 0.0 0.7671 0.8758
No log 2.4 24 0.7781 0.0 0.7781 0.8821
No log 2.6 26 0.8770 -0.0320 0.8770 0.9365
No log 2.8 28 1.0144 -0.0076 1.0144 1.0072
No log 3.0 30 0.8925 -0.0700 0.8925 0.9447
No log 3.2 32 0.7964 0.0481 0.7964 0.8924
No log 3.4 34 0.7693 0.1674 0.7693 0.8771
No log 3.6 36 0.8324 0.2285 0.8324 0.9123
No log 3.8 38 0.8665 0.2319 0.8665 0.9309
No log 4.0 40 0.9179 -0.0045 0.9179 0.9581
No log 4.2 42 1.2147 0.0367 1.2147 1.1021
No log 4.4 44 1.0913 -0.0033 1.0913 1.0446
No log 4.6 46 0.8643 0.2063 0.8643 0.9297
No log 4.8 48 0.8581 0.1550 0.8581 0.9263
No log 5.0 50 0.8888 0.1815 0.8888 0.9428
No log 5.2 52 0.8949 0.1766 0.8949 0.9460
No log 5.4 54 0.8389 0.1699 0.8389 0.9159
No log 5.6 56 0.8780 0.0410 0.8780 0.9370
No log 5.8 58 1.0235 0.0975 1.0235 1.0117
No log 6.0 60 1.0118 0.1259 1.0118 1.0059
No log 6.2 62 0.8937 0.1498 0.8937 0.9454
No log 6.4 64 0.8858 0.1541 0.8858 0.9412
No log 6.6 66 0.8867 0.1541 0.8867 0.9416
No log 6.8 68 0.8969 0.0930 0.8969 0.9471
No log 7.0 70 0.8899 0.1760 0.8899 0.9434
No log 7.2 72 0.9062 0.1866 0.9062 0.9519
No log 7.4 74 1.0502 0.1271 1.0502 1.0248
No log 7.6 76 0.9760 0.1712 0.9760 0.9879
No log 7.8 78 0.8485 0.1303 0.8485 0.9211
No log 8.0 80 0.8510 0.1379 0.8510 0.9225
No log 8.2 82 0.8621 0.1969 0.8621 0.9285
No log 8.4 84 0.8653 0.2747 0.8653 0.9302
No log 8.6 86 0.8761 0.2987 0.8761 0.9360
No log 8.8 88 0.9020 0.2593 0.9020 0.9498
No log 9.0 90 0.8735 0.2888 0.8735 0.9346
No log 9.2 92 0.8758 0.2256 0.8758 0.9359
No log 9.4 94 0.8433 0.2936 0.8433 0.9183
No log 9.6 96 0.8244 0.3296 0.8244 0.9080
No log 9.8 98 0.8399 0.3060 0.8399 0.9165
No log 10.0 100 0.8450 0.3060 0.8450 0.9192
No log 10.2 102 0.8358 0.3478 0.8358 0.9142
No log 10.4 104 0.9045 0.0678 0.9045 0.9511
No log 10.6 106 0.8931 0.0702 0.8931 0.9450
No log 10.8 108 0.8382 0.1379 0.8382 0.9155
No log 11.0 110 0.8312 0.2475 0.8312 0.9117
No log 11.2 112 0.8191 0.2360 0.8191 0.9050
No log 11.4 114 0.8313 0.1797 0.8313 0.9118
No log 11.6 116 0.8490 0.1179 0.8490 0.9214
No log 11.8 118 0.8143 0.1179 0.8143 0.9024
No log 12.0 120 0.7943 0.3002 0.7943 0.8912
No log 12.2 122 0.7994 0.2973 0.7994 0.8941
No log 12.4 124 0.8554 0.2633 0.8554 0.9249
No log 12.6 126 0.8489 0.2633 0.8489 0.9213
No log 12.8 128 0.8310 0.2561 0.8310 0.9116
No log 13.0 130 0.8112 0.1471 0.8112 0.9006
No log 13.2 132 0.8178 0.1697 0.8178 0.9043
No log 13.4 134 0.9047 0.2899 0.9047 0.9511
No log 13.6 136 1.0909 0.1142 1.0909 1.0445
No log 13.8 138 1.0766 0.1743 1.0766 1.0376
No log 14.0 140 0.9207 0.2495 0.9207 0.9595
No log 14.2 142 0.8547 0.2072 0.8547 0.9245
No log 14.4 144 0.9249 0.1156 0.9249 0.9617
No log 14.6 146 0.8786 0.1494 0.8786 0.9373
No log 14.8 148 0.8118 0.1760 0.8118 0.9010
No log 15.0 150 0.8315 0.2261 0.8315 0.9119
No log 15.2 152 0.8498 0.1740 0.8498 0.9218
No log 15.4 154 0.8157 0.2590 0.8157 0.9031
No log 15.6 156 0.8095 0.2424 0.8095 0.8997
No log 15.8 158 0.8147 0.1870 0.8147 0.9026
No log 16.0 160 0.7790 0.0741 0.7790 0.8826
No log 16.2 162 0.7751 0.2353 0.7751 0.8804
No log 16.4 164 0.8274 0.2995 0.8274 0.9096
No log 16.6 166 0.8519 0.2521 0.8519 0.9230
No log 16.8 168 0.7997 0.2558 0.7997 0.8943
No log 17.0 170 0.7449 0.1386 0.7449 0.8631
No log 17.2 172 0.7760 0.1873 0.7760 0.8809
No log 17.4 174 0.7825 0.2926 0.7825 0.8846
No log 17.6 176 0.7185 0.1133 0.7185 0.8477
No log 17.8 178 0.6876 0.1456 0.6876 0.8292
No log 18.0 180 0.7060 0.2685 0.7060 0.8402
No log 18.2 182 0.7141 0.2471 0.7141 0.8450
No log 18.4 184 0.7117 0.2652 0.7117 0.8436
No log 18.6 186 0.7238 0.2287 0.7238 0.8508
No log 18.8 188 0.7424 0.2182 0.7424 0.8617
No log 19.0 190 0.7592 0.2132 0.7592 0.8713
No log 19.2 192 0.7723 0.2772 0.7723 0.8788
No log 19.4 194 0.7658 0.2458 0.7658 0.8751
No log 19.6 196 0.7629 0.2405 0.7629 0.8735
No log 19.8 198 0.7542 0.2749 0.7542 0.8684
No log 20.0 200 0.8205 0.2995 0.8205 0.9058
No log 20.2 202 0.8206 0.3399 0.8206 0.9059
No log 20.4 204 0.7879 0.2784 0.7879 0.8876
No log 20.6 206 0.7408 0.2589 0.7408 0.8607
No log 20.8 208 0.7042 0.1407 0.7042 0.8392
No log 21.0 210 0.7484 0.1528 0.7484 0.8651
No log 21.2 212 0.8096 0.2068 0.8096 0.8998
No log 21.4 214 0.7809 0.1716 0.7809 0.8837
No log 21.6 216 0.7705 0.2475 0.7705 0.8778
No log 21.8 218 0.8606 0.3586 0.8606 0.9277
No log 22.0 220 0.9064 0.3586 0.9064 0.9521
No log 22.2 222 0.8311 0.3590 0.8311 0.9116
No log 22.4 224 0.7556 0.1353 0.7556 0.8693
No log 22.6 226 0.7622 -0.0023 0.7622 0.8731
No log 22.8 228 0.7908 0.1716 0.7908 0.8893
No log 23.0 230 0.7844 0.2349 0.7844 0.8857
No log 23.2 232 0.7792 0.2379 0.7792 0.8827
No log 23.4 234 0.7998 0.2784 0.7998 0.8943
No log 23.6 236 0.8093 0.2899 0.8093 0.8996
No log 23.8 238 0.7995 0.3127 0.7995 0.8942
No log 24.0 240 0.7712 0.1835 0.7712 0.8782
No log 24.2 242 0.7589 0.1813 0.7589 0.8711
No log 24.4 244 0.7670 0.1133 0.7670 0.8758
No log 24.6 246 0.7585 0.1850 0.7585 0.8709
No log 24.8 248 0.7605 0.2590 0.7605 0.8721
No log 25.0 250 0.7949 0.3121 0.7949 0.8916
No log 25.2 252 0.8086 0.3121 0.8086 0.8992
No log 25.4 254 0.7797 0.2161 0.7797 0.8830
No log 25.6 256 0.7776 0.2713 0.7776 0.8818
No log 25.8 258 0.8006 0.1775 0.8006 0.8948
No log 26.0 260 0.7976 0.2683 0.7976 0.8931
No log 26.2 262 0.8170 0.2445 0.8170 0.9039
No log 26.4 264 0.8895 0.3320 0.8895 0.9432
No log 26.6 266 0.9028 0.3320 0.9028 0.9501
No log 26.8 268 0.8500 0.3723 0.8500 0.9220
No log 27.0 270 0.7773 0.2237 0.7773 0.8816
No log 27.2 272 0.7466 0.1432 0.7466 0.8640
No log 27.4 274 0.7358 0.1432 0.7358 0.8578
No log 27.6 276 0.7184 0.1400 0.7184 0.8476
No log 27.8 278 0.7469 0.2913 0.7469 0.8642
No log 28.0 280 0.8341 0.4167 0.8341 0.9133
No log 28.2 282 0.9025 0.3480 0.9025 0.9500
No log 28.4 284 0.8798 0.3480 0.8798 0.9380
No log 28.6 286 0.7983 0.3305 0.7983 0.8935
No log 28.8 288 0.7546 0.2530 0.7546 0.8687
No log 29.0 290 0.7365 0.3198 0.7365 0.8582
No log 29.2 292 0.7642 0.3369 0.7642 0.8742
No log 29.4 294 0.7576 0.3369 0.7576 0.8704
No log 29.6 296 0.7301 0.3603 0.7301 0.8545
No log 29.8 298 0.7250 0.2182 0.7250 0.8515
No log 30.0 300 0.7324 0.2471 0.7324 0.8558
No log 30.2 302 0.7285 0.2471 0.7285 0.8535
No log 30.4 304 0.7248 0.2973 0.7248 0.8514
No log 30.6 306 0.7439 0.3859 0.7439 0.8625
No log 30.8 308 0.7629 0.3716 0.7629 0.8734
No log 31.0 310 0.7752 0.3093 0.7752 0.8804
No log 31.2 312 0.7833 0.3433 0.7833 0.8850
No log 31.4 314 0.7665 0.2535 0.7665 0.8755
No log 31.6 316 0.7582 0.2862 0.7582 0.8707
No log 31.8 318 0.7551 0.4081 0.7551 0.8690
No log 32.0 320 0.7494 0.4081 0.7494 0.8657
No log 32.2 322 0.7288 0.3144 0.7288 0.8537
No log 32.4 324 0.7226 0.2360 0.7226 0.8500
No log 32.6 326 0.7280 0.2392 0.7280 0.8532
No log 32.8 328 0.7445 0.2092 0.7445 0.8628
No log 33.0 330 0.7433 0.2092 0.7433 0.8622
No log 33.2 332 0.7497 0.3144 0.7497 0.8658
No log 33.4 334 0.7552 0.3088 0.7552 0.8690
No log 33.6 336 0.7637 0.3355 0.7637 0.8739
No log 33.8 338 0.7622 0.2751 0.7622 0.8730
No log 34.0 340 0.7537 0.3253 0.7537 0.8681
No log 34.2 342 0.7522 0.2621 0.7522 0.8673
No log 34.4 344 0.7579 0.2530 0.7579 0.8706
No log 34.6 346 0.7845 0.3399 0.7845 0.8857
No log 34.8 348 0.8104 0.3918 0.8104 0.9002
No log 35.0 350 0.8344 0.4167 0.8344 0.9135
No log 35.2 352 0.8210 0.4167 0.8210 0.9061
No log 35.4 354 0.7846 0.3662 0.7846 0.8858
No log 35.6 356 0.7560 0.2813 0.7560 0.8695
No log 35.8 358 0.7321 0.3551 0.7321 0.8556
No log 36.0 360 0.7285 0.2113 0.7285 0.8535
No log 36.2 362 0.7275 0.1760 0.7275 0.8529
No log 36.4 364 0.7351 0.2973 0.7351 0.8574
No log 36.6 366 0.7917 0.3121 0.7917 0.8898
No log 36.8 368 0.8597 0.3092 0.8597 0.9272
No log 37.0 370 0.8741 0.3320 0.8741 0.9349
No log 37.2 372 0.8339 0.3092 0.8339 0.9132
No log 37.4 374 0.7728 0.3088 0.7728 0.8791
No log 37.6 376 0.7437 0.2684 0.7437 0.8624
No log 37.8 378 0.7441 0.0741 0.7441 0.8626
No log 38.0 380 0.7444 0.1133 0.7444 0.8628
No log 38.2 382 0.7398 0.0330 0.7398 0.8601
No log 38.4 384 0.7364 0.1050 0.7364 0.8581
No log 38.6 386 0.7439 0.1988 0.7439 0.8625
No log 38.8 388 0.7504 0.2652 0.7504 0.8663
No log 39.0 390 0.7610 0.2590 0.7610 0.8724
No log 39.2 392 0.7666 0.2877 0.7666 0.8755
No log 39.4 394 0.7729 0.2877 0.7729 0.8791
No log 39.6 396 0.7772 0.2943 0.7772 0.8816
No log 39.8 398 0.7828 0.2327 0.7828 0.8848
No log 40.0 400 0.7829 0.2327 0.7829 0.8848
No log 40.2 402 0.7780 0.2270 0.7780 0.8820
No log 40.4 404 0.7750 0.2590 0.7750 0.8804
No log 40.6 406 0.7717 0.3224 0.7717 0.8785
No log 40.8 408 0.7689 0.3224 0.7689 0.8769
No log 41.0 410 0.7627 0.1935 0.7627 0.8733
No log 41.2 412 0.7545 0.1303 0.7545 0.8686
No log 41.4 414 0.7458 0.0652 0.7458 0.8636
No log 41.6 416 0.7394 0.0652 0.7394 0.8599
No log 41.8 418 0.7392 0.0652 0.7392 0.8598
No log 42.0 420 0.7425 0.1432 0.7425 0.8617
No log 42.2 422 0.7477 0.1697 0.7477 0.8647
No log 42.4 424 0.7587 0.1341 0.7587 0.8710
No log 42.6 426 0.7674 0.1673 0.7674 0.8760
No log 42.8 428 0.7715 0.1673 0.7715 0.8783
No log 43.0 430 0.7743 0.2023 0.7743 0.8800
No log 43.2 432 0.7788 0.1697 0.7788 0.8825
No log 43.4 434 0.7774 0.2004 0.7774 0.8817
No log 43.6 436 0.7802 0.1672 0.7802 0.8833
No log 43.8 438 0.7689 0.1697 0.7689 0.8769
No log 44.0 440 0.7487 0.1393 0.7487 0.8653
No log 44.2 442 0.7407 0.1009 0.7407 0.8607
No log 44.4 444 0.7507 0.1686 0.7507 0.8664
No log 44.6 446 0.7712 0.3471 0.7712 0.8782
No log 44.8 448 0.8306 0.3918 0.8306 0.9114
No log 45.0 450 0.8624 0.3243 0.8624 0.9286
No log 45.2 452 0.8467 0.3918 0.8467 0.9202
No log 45.4 454 0.8022 0.3996 0.8022 0.8956
No log 45.6 456 0.7728 0.3545 0.7728 0.8791
No log 45.8 458 0.7627 0.2621 0.7627 0.8733
No log 46.0 460 0.7581 0.1341 0.7581 0.8707
No log 46.2 462 0.7569 0.1341 0.7569 0.8700
No log 46.4 464 0.7586 0.1341 0.7586 0.8710
No log 46.6 466 0.7676 0.2023 0.7676 0.8761
No log 46.8 468 0.7907 0.3050 0.7907 0.8892
No log 47.0 470 0.8157 0.3196 0.8157 0.9031
No log 47.2 472 0.8223 0.2261 0.8223 0.9068
No log 47.4 474 0.8078 0.2063 0.8078 0.8988
No log 47.6 476 0.7929 0.2063 0.7929 0.8904
No log 47.8 478 0.7951 0.2063 0.7951 0.8917
No log 48.0 480 0.8073 0.2379 0.8073 0.8985
No log 48.2 482 0.8317 0.2981 0.8317 0.9120
No log 48.4 484 0.8679 0.3737 0.8679 0.9316
No log 48.6 486 0.8842 0.3544 0.8842 0.9403
No log 48.8 488 0.8807 0.3737 0.8807 0.9384
No log 49.0 490 0.8616 0.3737 0.8616 0.9282
No log 49.2 492 0.8484 0.3471 0.8484 0.9211
No log 49.4 494 0.8225 0.2319 0.8225 0.9069
No log 49.6 496 0.8099 0.2379 0.8099 0.9000
No log 49.8 498 0.8027 0.2379 0.8027 0.8960
0.2553 50.0 500 0.8057 0.2379 0.8057 0.8976
0.2553 50.2 502 0.8048 0.2379 0.8048 0.8971
0.2553 50.4 504 0.7961 0.1988 0.7961 0.8922
0.2553 50.6 506 0.7901 0.2685 0.7901 0.8889
0.2553 50.8 508 0.7940 0.2685 0.7940 0.8911
0.2553 51.0 510 0.8051 0.2847 0.8051 0.8973
0.2553 51.2 512 0.8122 0.2847 0.8122 0.9012
0.2553 51.4 514 0.8100 0.2847 0.8100 0.9000
0.2553 51.6 516 0.8060 0.2847 0.8060 0.8978
0.2553 51.8 518 0.8079 0.2847 0.8079 0.8988
0.2553 52.0 520 0.7941 0.2621 0.7941 0.8911
0.2553 52.2 522 0.7804 0.2685 0.7804 0.8834
0.2553 52.4 524 0.7663 0.2685 0.7663 0.8754
0.2553 52.6 526 0.7580 0.1737 0.7580 0.8706
0.2553 52.8 528 0.7571 0.1737 0.7571 0.8701
0.2553 53.0 530 0.7622 0.2294 0.7622 0.8730
0.2553 53.2 532 0.7758 0.2685 0.7758 0.8808
0.2553 53.4 534 0.8022 0.3088 0.8022 0.8957
0.2553 53.6 536 0.8215 0.3287 0.8215 0.9064
0.2553 53.8 538 0.8320 0.3287 0.8320 0.9121
0.2553 54.0 540 0.8215 0.3287 0.8215 0.9063
0.2553 54.2 542 0.7971 0.3688 0.7971 0.8928
0.2553 54.4 544 0.7801 0.2652 0.7801 0.8832
0.2553 54.6 546 0.7654 0.1673 0.7654 0.8749
0.2553 54.8 548 0.7574 0.0971 0.7574 0.8703
0.2553 55.0 550 0.7524 0.0971 0.7524 0.8674
0.2553 55.2 552 0.7489 0.0283 0.7489 0.8654
0.2553 55.4 554 0.7467 0.0697 0.7467 0.8641

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
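A small stdlib-only sketch for comparing a local environment against these pinned versions (package names are the usual PyPI names; the `torch` pin drops the `+cu118` build tag, and `check_versions` is an illustrative helper, not part of this repo):

```python
import importlib.metadata as md

PINNED = {
    "transformers": "4.44.2",
    "torch": "2.4.0",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}

def check_versions(pinned=PINNED):
    """Return {package: (installed_version, pinned_version)}; packages that
    are absent report 'not installed' instead of raising."""
    report = {}
    for pkg, want in pinned.items():
        try:
            have = md.version(pkg)
        except md.PackageNotFoundError:
            have = "not installed"
        report[pkg] = (have, want)
    return report

for pkg, (have, want) in check_versions().items():
    print(f"{pkg}: installed={have}, pinned={want}")
```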
Model size

  • 0.1B params (Safetensors, F32 tensors)

Model tree

This model was fine-tuned from aubmindlab/bert-base-arabertv02 (full repo id: MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k4_task7_organization).