ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in the card metadata). It achieves the following results on the evaluation set:

  • Loss: 0.8116
  • Qwk (quadratic weighted kappa): 0.1052
  • Mse: 0.8116 (identical to the loss, suggesting MSE was the training objective)
  • Rmse: 0.9009
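Qwk measures agreement between predicted and reference ordinal scores, penalizing errors by the squared distance between classes; Rmse is simply the square root of Mse. A minimal sketch of how the three metrics relate, using hypothetical rating vectors (not the actual evaluation data):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: agreement between ordinal ratings,
    penalizing disagreements by squared class distance."""
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Quadratic penalty matrix: 0 on the diagonal, growing with distance.
    idx = np.arange(n_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected confusion matrix under chance (outer product of marginals).
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical ordinal scores (e.g. essay-organization ratings on a 0-4 scale)
y_true = np.array([0, 1, 2, 3, 4, 2, 1])
y_pred = np.array([0, 2, 2, 3, 3, 1, 1])

mse = float(np.mean((y_true - y_pred) ** 2))
rmse = float(np.sqrt(mse))  # Rmse is just the square root of Mse
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=5)
```

Perfect agreement gives a kappa of 1.0, chance-level agreement gives 0, and the Mse/Rmse pair always satisfies rmse = sqrt(mse), as in the evaluation numbers above (0.9009 ≈ √0.8116).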

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
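The linear lr_scheduler_type decays the learning rate from its initial value to zero over the course of training. A pure-Python sketch of that schedule (warmup steps assumed to be zero since none are listed, and total_steps below is a hypothetical value, not this run's actual step count):

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule:
    ramp up over warmup_steps, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# With the hyperparameters above: full 2e-05 at the start,
# half of it midway through training, zero at the end.
lr_start = linear_schedule_lr(0, total_steps=1000)
lr_mid = linear_schedule_lr(500, total_steps=1000)
lr_end = linear_schedule_lr(1000, total_steps=1000)
```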

Training results

Training Loss  Epoch  Step  Validation Loss  Qwk  Mse  Rmse
("No log" in the Training Loss column means the running training loss had not yet been logged; it first appears at step 500.)
No log 0.0444 2 2.6359 -0.0262 2.6359 1.6235
No log 0.0889 4 1.3851 0.0750 1.3851 1.1769
No log 0.1333 6 1.2984 -0.1983 1.2984 1.1395
No log 0.1778 8 1.2589 -0.2787 1.2589 1.1220
No log 0.2222 10 1.1997 -0.1356 1.1997 1.0953
No log 0.2667 12 1.2070 -0.1705 1.2070 1.0986
No log 0.3111 14 1.3576 -0.1384 1.3576 1.1652
No log 0.3556 16 1.6718 0.0289 1.6718 1.2930
No log 0.4 18 1.5585 0.0278 1.5585 1.2484
No log 0.4444 20 1.1516 -0.0961 1.1516 1.0731
No log 0.4889 22 0.9965 -0.0288 0.9965 0.9982
No log 0.5333 24 0.8746 0.0725 0.8746 0.9352
No log 0.5778 26 0.8535 0.0410 0.8535 0.9238
No log 0.6222 28 0.8574 0.0937 0.8574 0.9260
No log 0.6667 30 0.8894 0.0509 0.8894 0.9431
No log 0.7111 32 0.8933 0.0947 0.8933 0.9452
No log 0.7556 34 0.9631 0.0609 0.9631 0.9814
No log 0.8 36 1.0419 0.0329 1.0419 1.0207
No log 0.8444 38 1.1758 0.0095 1.1758 1.0843
No log 0.8889 40 1.1874 0.0126 1.1874 1.0897
No log 0.9333 42 0.9553 -0.0484 0.9553 0.9774
No log 0.9778 44 0.8113 0.1737 0.8113 0.9007
No log 1.0222 46 0.9473 0.2651 0.9473 0.9733
No log 1.0667 48 1.0228 0.2757 1.0228 1.0113
No log 1.1111 50 1.0274 0.2543 1.0274 1.0136
No log 1.1556 52 0.8945 0.2063 0.8945 0.9458
No log 1.2 54 0.7828 0.0757 0.7828 0.8847
No log 1.2444 56 0.8118 0.0804 0.8118 0.9010
No log 1.2889 58 0.9450 0.0506 0.9450 0.9721
No log 1.3333 60 1.1381 0.0661 1.1381 1.0668
No log 1.3778 62 1.1564 0.0643 1.1564 1.0754
No log 1.4222 64 1.0107 0.2092 1.0107 1.0053
No log 1.4667 66 0.9387 0.2034 0.9387 0.9689
No log 1.5111 68 0.9827 0.2118 0.9827 0.9913
No log 1.5556 70 1.0083 0.1460 1.0083 1.0041
No log 1.6 72 0.9884 0.0823 0.9884 0.9942
No log 1.6444 74 1.0193 0.1155 1.0193 1.0096
No log 1.6889 76 0.9372 0.0813 0.9372 0.9681
No log 1.7333 78 0.8497 0.0 0.8497 0.9218
No log 1.7778 80 0.8160 0.1440 0.8160 0.9033
No log 1.8222 82 0.8222 0.1440 0.8222 0.9068
No log 1.8667 84 0.8815 0.0727 0.8815 0.9389
No log 1.9111 86 1.0441 0.1427 1.0441 1.0218
No log 1.9556 88 1.0435 0.1122 1.0435 1.0215
No log 2.0 90 1.0138 0.0722 1.0138 1.0069
No log 2.0444 92 0.9973 0.1311 0.9973 0.9986
No log 2.0889 94 0.9859 0.1627 0.9859 0.9929
No log 2.1333 96 0.9563 0.1627 0.9563 0.9779
No log 2.1778 98 0.9327 0.1289 0.9327 0.9658
No log 2.2222 100 0.9467 0.1661 0.9467 0.9730
No log 2.2667 102 0.9298 0.1289 0.9298 0.9643
No log 2.3111 104 1.0249 0.1121 1.0249 1.0124
No log 2.3556 106 0.9742 0.1419 0.9742 0.9870
No log 2.4 108 0.8918 0.1410 0.8918 0.9444
No log 2.4444 110 0.8916 0.1455 0.8916 0.9443
No log 2.4889 112 0.8683 0.1684 0.8683 0.9318
No log 2.5333 114 0.8724 0.1986 0.8724 0.9340
No log 2.5778 116 0.8877 0.2279 0.8877 0.9422
No log 2.6222 118 0.9303 0.2192 0.9303 0.9645
No log 2.6667 120 0.9948 0.1801 0.9948 0.9974
No log 2.7111 122 0.9409 0.2053 0.9409 0.9700
No log 2.7556 124 0.9339 0.2256 0.9339 0.9664
No log 2.8 126 0.9156 0.2172 0.9156 0.9568
No log 2.8444 128 0.9249 0.1901 0.9249 0.9617
No log 2.8889 130 0.8510 0.2099 0.8510 0.9225
No log 2.9333 132 0.8177 0.2099 0.8177 0.9043
No log 2.9778 134 0.8225 0.1373 0.8225 0.9069
No log 3.0222 136 0.7945 0.2748 0.7945 0.8913
No log 3.0667 138 0.7980 0.2413 0.7980 0.8933
No log 3.1111 140 0.8029 0.2413 0.8029 0.8960
No log 3.1556 142 0.8051 0.2224 0.8051 0.8973
No log 3.2 144 0.8027 0.2193 0.8027 0.8960
No log 3.2444 146 0.7993 0.2224 0.7993 0.8940
No log 3.2889 148 0.7874 0.2224 0.7874 0.8874
No log 3.3333 150 0.7901 0.2398 0.7901 0.8889
No log 3.3778 152 0.7811 0.1733 0.7811 0.8838
No log 3.4222 154 0.8737 0.4020 0.8737 0.9347
No log 3.4667 156 1.0100 0.2702 1.0100 1.0050
No log 3.5111 158 1.0095 0.3068 1.0095 1.0047
No log 3.5556 160 0.8831 0.3007 0.8831 0.9397
No log 3.6 162 0.9029 0.2287 0.9029 0.9502
No log 3.6444 164 0.9058 0.2929 0.9058 0.9517
No log 3.6889 166 1.0173 0.3115 1.0173 1.0086
No log 3.7333 168 1.3004 0.0996 1.3004 1.1403
No log 3.7778 170 1.2563 0.1270 1.2563 1.1209
No log 3.8222 172 1.0818 0.2066 1.0818 1.0401
No log 3.8667 174 0.9456 0.1627 0.9456 0.9724
No log 3.9111 176 0.9349 0.1361 0.9349 0.9669
No log 3.9556 178 0.9030 0.2838 0.9030 0.9503
No log 4.0 180 0.9461 0.3023 0.9461 0.9727
No log 4.0444 182 1.0352 0.2437 1.0352 1.0175
No log 4.0889 184 0.9958 0.2576 0.9958 0.9979
No log 4.1333 186 0.9167 0.3015 0.9167 0.9575
No log 4.1778 188 0.9129 0.3051 0.9129 0.9554
No log 4.2222 190 0.9945 0.3195 0.9945 0.9973
No log 4.2667 192 1.0099 0.3606 1.0099 1.0049
No log 4.3111 194 0.8876 0.2359 0.8876 0.9421
No log 4.3556 196 0.8267 0.2279 0.8267 0.9092
No log 4.4 198 0.8204 0.2279 0.8204 0.9058
No log 4.4444 200 0.9194 0.2808 0.9194 0.9588
No log 4.4889 202 1.2250 0.1875 1.2250 1.1068
No log 4.5333 204 1.2601 0.1814 1.2601 1.1225
No log 4.5778 206 1.0034 0.2934 1.0034 1.0017
No log 4.6222 208 0.8106 0.2413 0.8106 0.9004
No log 4.6667 210 0.9043 0.0827 0.9043 0.9509
No log 4.7111 212 0.9055 0.0827 0.9055 0.9516
No log 4.7556 214 0.8110 0.1760 0.8110 0.9006
No log 4.8 216 0.8798 0.2328 0.8798 0.9380
No log 4.8444 218 0.9761 0.2781 0.9761 0.9880
No log 4.8889 220 0.9206 0.3319 0.9206 0.9595
No log 4.9333 222 0.8489 0.2877 0.8489 0.9214
No log 4.9778 224 0.8348 0.2717 0.8348 0.9137
No log 5.0222 226 0.8470 0.1052 0.8470 0.9203
No log 5.0667 228 0.8462 0.1051 0.8462 0.9199
No log 5.1111 230 0.8673 0.2843 0.8673 0.9313
No log 5.1556 232 0.8991 0.2780 0.8991 0.9482
No log 5.2 234 0.8971 0.2809 0.8971 0.9472
No log 5.2444 236 0.9060 0.1684 0.9060 0.9519
No log 5.2889 238 0.9053 0.2334 0.9053 0.9515
No log 5.3333 240 0.9373 0.3146 0.9373 0.9681
No log 5.3778 242 1.0271 0.2097 1.0271 1.0134
No log 5.4222 244 1.0298 0.2965 1.0299 1.0148
No log 5.4667 246 0.9210 0.2236 0.9210 0.9597
No log 5.5111 248 0.8376 0.2535 0.8376 0.9152
No log 5.5556 250 0.8448 0.1426 0.8448 0.9191
No log 5.6 252 0.8342 0.1483 0.8342 0.9133
No log 5.6444 254 0.7888 0.0687 0.7888 0.8882
No log 5.6889 256 0.7758 0.2413 0.7758 0.8808
No log 5.7333 258 0.7959 0.2717 0.7959 0.8921
No log 5.7778 260 0.8048 0.2413 0.8048 0.8971
No log 5.8222 262 0.8135 0.2019 0.8135 0.9019
No log 5.8667 264 0.8136 0.2004 0.8136 0.9020
No log 5.9111 266 0.8098 0.2413 0.8098 0.8999
No log 5.9556 268 0.8242 0.2413 0.8242 0.9078
No log 6.0 270 0.8091 0.1723 0.8091 0.8995
No log 6.0444 272 0.8233 -0.0045 0.8233 0.9073
No log 6.0889 274 0.8266 -0.0045 0.8266 0.9092
No log 6.1333 276 0.8093 0.0661 0.8093 0.8996
No log 6.1778 278 0.8317 0.2819 0.8317 0.9120
No log 6.2222 280 0.8980 0.3224 0.8980 0.9476
No log 6.2667 282 0.8922 0.2621 0.8922 0.9446
No log 6.3111 284 0.8717 0.2004 0.8717 0.9336
No log 6.3556 286 0.8900 0.2224 0.8900 0.9434
No log 6.4 288 0.9279 0.1779 0.9279 0.9633
No log 6.4444 290 0.9783 0.1846 0.9783 0.9891
No log 6.4889 292 0.9799 0.1846 0.9799 0.9899
No log 6.5333 294 0.9523 0.1875 0.9523 0.9759
No log 6.5778 296 0.9195 0.2084 0.9195 0.9589
No log 6.6222 298 0.8853 0.2224 0.8853 0.9409
No log 6.6667 300 0.8695 0.2334 0.8695 0.9325
No log 6.7111 302 0.8553 0.2224 0.8553 0.9248
No log 6.7556 304 0.8778 0.2809 0.8778 0.9369
No log 6.8 306 0.9131 0.2633 0.9131 0.9555
No log 6.8444 308 0.9371 0.2082 0.9371 0.9681
No log 6.8889 310 0.9964 0.1954 0.9964 0.9982
No log 6.9333 312 0.9810 0.1662 0.9810 0.9905
No log 6.9778 314 0.9012 0.1902 0.9012 0.9493
No log 7.0222 316 0.8692 0.2535 0.8692 0.9323
No log 7.0667 318 0.8493 0.2038 0.8493 0.9216
No log 7.1111 320 0.8565 0.2395 0.8565 0.9254
No log 7.1556 322 0.9115 0.2370 0.9115 0.9548
No log 7.2 324 0.9655 0.2420 0.9655 0.9826
No log 7.2444 326 0.9343 0.2370 0.9343 0.9666
No log 7.2889 328 0.8350 0.2819 0.8350 0.9138
No log 7.3333 330 0.8209 0.1775 0.8209 0.9060
No log 7.3778 332 0.8376 0.1775 0.8376 0.9152
No log 7.4222 334 0.8704 0.1916 0.8704 0.9330
No log 7.4667 336 0.9128 0.2420 0.9128 0.9554
No log 7.5111 338 0.8765 0.2691 0.8765 0.9362
No log 7.5556 340 0.8454 0.3092 0.8454 0.9195
No log 7.6 342 0.8451 0.2717 0.8451 0.9193
No log 7.6444 344 0.8583 0.1011 0.8583 0.9265
No log 7.6889 346 0.8601 0.1011 0.8601 0.9274
No log 7.7333 348 0.8624 0.2327 0.8624 0.9286
No log 7.7778 350 0.9519 0.2832 0.9519 0.9756
No log 7.8222 352 0.9583 0.2832 0.9583 0.9789
No log 7.8667 354 0.8634 0.2685 0.8634 0.9292
No log 7.9111 356 0.8198 0.1723 0.8198 0.9054
No log 7.9556 358 0.8147 0.1723 0.8147 0.9026
No log 8.0 360 0.8215 0.1723 0.8215 0.9064
No log 8.0444 362 0.8336 0.1353 0.8336 0.9130
No log 8.0889 364 0.8173 0.1723 0.8173 0.9041
No log 8.1333 366 0.8196 0.1723 0.8196 0.9053
No log 8.1778 368 0.8571 0.3296 0.8571 0.9258
No log 8.2222 370 0.9102 0.3121 0.9102 0.9540
No log 8.2667 372 0.9329 0.2823 0.9329 0.9658
No log 8.3111 374 0.8755 0.3183 0.8755 0.9357
No log 8.3556 376 0.7710 0.3011 0.7710 0.8781
No log 8.4 378 0.7340 0.3011 0.7340 0.8568
No log 8.4444 380 0.7042 0.2717 0.7042 0.8392
No log 8.4889 382 0.7055 0.2717 0.7055 0.8400
No log 8.5333 384 0.7392 0.2685 0.7392 0.8597
No log 8.5778 386 0.7665 0.3092 0.7665 0.8755
No log 8.6222 388 0.7612 0.3092 0.7612 0.8724
No log 8.6667 390 0.7153 0.2379 0.7153 0.8458
No log 8.7111 392 0.7109 0.1093 0.7109 0.8431
No log 8.7556 394 0.7375 0.1479 0.7375 0.8588
No log 8.8 396 0.7598 0.2748 0.7598 0.8716
No log 8.8444 398 0.7953 0.2907 0.7953 0.8918
No log 8.8889 400 0.8405 0.2424 0.8405 0.9168
No log 8.9333 402 0.8447 0.2669 0.8447 0.9191
No log 8.9778 404 0.7924 0.3211 0.7924 0.8902
No log 9.0222 406 0.7662 0.2334 0.7662 0.8753
No log 9.0667 408 0.7583 0.1052 0.7583 0.8708
No log 9.1111 410 0.7442 0.2748 0.7442 0.8626
No log 9.1556 412 0.7633 0.2063 0.7633 0.8736
No log 9.2 414 0.8311 0.3737 0.8311 0.9117
No log 9.2444 416 0.8523 0.3409 0.8523 0.9232
No log 9.2889 418 0.8084 0.3387 0.8084 0.8991
No log 9.3333 420 0.7706 0.3092 0.7706 0.8778
No log 9.3778 422 0.7789 0.2685 0.7789 0.8826
No log 9.4222 424 0.7944 0.2877 0.7944 0.8913
No log 9.4667 426 0.8166 0.2877 0.8166 0.9037
No log 9.5111 428 0.8115 0.2389 0.8115 0.9008
No log 9.5556 430 0.7795 0.3211 0.7795 0.8829
No log 9.6 432 0.7854 0.2936 0.7854 0.8862
No log 9.6444 434 0.7995 0.2621 0.7995 0.8942
No log 9.6889 436 0.7947 0.2413 0.7947 0.8915
No log 9.7333 438 0.8003 0.2063 0.8003 0.8946
No log 9.7778 440 0.8170 0.2685 0.8170 0.9039
No log 9.8222 442 0.8873 0.3127 0.8873 0.9420
No log 9.8667 444 0.9800 0.3219 0.9800 0.9900
No log 9.9111 446 0.9525 0.3710 0.9525 0.9760
No log 9.9556 448 0.9189 0.3590 0.9189 0.9586
No log 10.0 450 0.8698 0.3092 0.8698 0.9326
No log 10.0444 452 0.8922 0.3387 0.8922 0.9446
No log 10.0889 454 0.9395 0.3653 0.9395 0.9693
No log 10.1333 456 0.9246 0.3662 0.9246 0.9615
No log 10.1778 458 0.8814 0.3471 0.8814 0.9388
No log 10.2222 460 0.8492 0.2787 0.8492 0.9215
No log 10.2667 462 0.8421 0.3092 0.8421 0.9176
No log 10.3111 464 0.8645 0.3544 0.8645 0.9298
No log 10.3556 466 0.8910 0.3934 0.8910 0.9439
No log 10.4 468 0.8799 0.3699 0.8799 0.9380
No log 10.4444 470 0.8139 0.4479 0.8139 0.9022
No log 10.4889 472 0.7709 0.3387 0.7709 0.8780
No log 10.5333 474 0.7667 0.3387 0.7667 0.8756
No log 10.5778 476 0.8047 0.3387 0.8047 0.8970
No log 10.6222 478 0.8216 0.3387 0.8216 0.9064
No log 10.6667 480 0.8629 0.3794 0.8629 0.9289
No log 10.7111 482 0.8957 0.3645 0.8957 0.9464
No log 10.7556 484 0.9055 0.3710 0.9055 0.9516
No log 10.8 486 0.8552 0.2576 0.8552 0.9248
No log 10.8444 488 0.8169 0.3092 0.8169 0.9038
No log 10.8889 490 0.7826 0.2685 0.7826 0.8846
No log 10.9333 492 0.8042 0.2751 0.8042 0.8968
No log 10.9778 494 0.8034 0.2751 0.8034 0.8963
No log 11.0222 496 0.7819 0.2530 0.7819 0.8843
No log 11.0667 498 0.7964 0.2751 0.7964 0.8924
0.3426 11.1111 500 0.8086 0.2661 0.8086 0.8992
0.3426 11.1556 502 0.8663 0.3060 0.8663 0.9308
0.3426 11.2 504 0.9105 0.3645 0.9105 0.9542
0.3426 11.2444 506 0.8904 0.3409 0.8904 0.9436
0.3426 11.2889 508 0.8564 0.3167 0.8564 0.9254
0.3426 11.3333 510 0.8348 0.3312 0.8348 0.9136
0.3426 11.3778 512 0.8483 0.3312 0.8483 0.9210
0.3426 11.4222 514 0.8790 0.3167 0.8790 0.9376
0.3426 11.4667 516 0.8506 0.3312 0.8506 0.9223
0.3426 11.5111 518 0.8137 0.2787 0.8137 0.9021
0.3426 11.5556 520 0.8157 0.2379 0.8157 0.9031
0.3426 11.6 522 0.8387 0.3088 0.8387 0.9158
0.3426 11.6444 524 0.8496 0.3287 0.8496 0.9217
0.3426 11.6889 526 0.8177 0.2530 0.8177 0.9043
0.3426 11.7333 528 0.8001 0.2379 0.8001 0.8945
0.3426 11.7778 530 0.8045 0.1723 0.8045 0.8969
0.3426 11.8222 532 0.8121 0.1052 0.8121 0.9012
0.3426 11.8667 534 0.8116 0.1052 0.8116 0.9009

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: ~0.1B parameters (F32, Safetensors)
Model tree

This model is a fine-tune of aubmindlab/bert-base-arabertv02, published as MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization.