ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9965
  • QWK: 0.1044
  • MSE: 0.9965
  • RMSE: 0.9982
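For context, QWK (quadratic weighted kappa) measures agreement between predicted and gold ordinal scores, with 1.0 meaning perfect agreement and 0 meaning chance-level agreement; the 0.1044 reported here indicates only weak agreement. A minimal NumPy sketch of the metric (the function name and toy labels are illustrative, not taken from this run's training code):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..n_classes-1."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Observed confusion matrix
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Expected matrix under independence (outer product of the marginals)
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    # Quadratic disagreement weights: (i - j)^2 scaled to [0, 1]
    i, j = np.indices((n_classes, n_classes))
    weights = (i - j) ** 2 / (n_classes - 1) ** 2
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()
```

Perfect agreement yields 1.0, and a fully reversed ordinal ranking yields a negative value, which is why small positive scores like the one above signal weak agreement.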

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
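The linear scheduler decays the learning rate from its initial value to zero over the course of training, optionally after a warmup phase (Transformers implements this as get_linear_schedule_with_warmup). A small self-contained sketch of that schedule; the function name and the total-step count are illustrative:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# e.g. with 36 optimizer steps per epoch and 100 epochs, total_steps = 3600:
# the rate starts at 2e-05, halves at the midpoint, and reaches 0 at the end.
```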

Training results

The training loss is logged every 500 steps, so rows before step 500 show "No log".

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0556 2 2.6546 -0.0593 2.6546 1.6293
No log 0.1111 4 1.2753 0.0997 1.2753 1.1293
No log 0.1667 6 0.9133 0.0535 0.9133 0.9557
No log 0.2222 8 0.8322 0.0771 0.8322 0.9123
No log 0.2778 10 0.7834 0.0410 0.7834 0.8851
No log 0.3333 12 0.8037 0.1321 0.8037 0.8965
No log 0.3889 14 0.7549 0.1184 0.7549 0.8688
No log 0.4444 16 0.7330 0.1184 0.7330 0.8562
No log 0.5 18 0.7192 0.0804 0.7192 0.8480
No log 0.5556 20 0.7348 0.1786 0.7348 0.8572
No log 0.6111 22 0.7275 0.1786 0.7275 0.8529
No log 0.6667 24 0.7219 0.0937 0.7219 0.8497
No log 0.7222 26 0.7166 0.0428 0.7166 0.8465
No log 0.7778 28 0.7286 0.0295 0.7286 0.8536
No log 0.8333 30 0.7940 0.1142 0.7940 0.8911
No log 0.8889 32 0.8344 -0.0269 0.8344 0.9135
No log 0.9444 34 0.8587 -0.0269 0.8587 0.9267
No log 1.0 36 0.8241 0.0129 0.8241 0.9078
No log 1.0556 38 0.7394 0.0393 0.7394 0.8599
No log 1.1111 40 0.7009 0.1327 0.7009 0.8372
No log 1.1667 42 0.7203 0.0846 0.7203 0.8487
No log 1.2222 44 0.7328 0.0376 0.7328 0.8560
No log 1.2778 46 0.7769 0.1400 0.7769 0.8814
No log 1.3333 48 0.7690 0.1400 0.7690 0.8769
No log 1.3889 50 0.7785 0.2751 0.7785 0.8823
No log 1.4444 52 0.7155 0.2413 0.7155 0.8459
No log 1.5 54 0.9689 0.2643 0.9689 0.9843
No log 1.5556 56 0.9172 0.2697 0.9172 0.9577
No log 1.6111 58 0.6743 0.3408 0.6743 0.8212
No log 1.6667 60 0.7050 0.3665 0.7050 0.8396
No log 1.7222 62 0.8148 0.3305 0.8148 0.9027
No log 1.7778 64 1.0671 0.2999 1.0671 1.0330
No log 1.8333 66 1.1267 0.1935 1.1267 1.0614
No log 1.8889 68 1.2371 0.1002 1.2371 1.1123
No log 1.9444 70 1.4000 0.1401 1.4000 1.1832
No log 2.0 72 1.0699 0.2554 1.0699 1.0343
No log 2.0556 74 0.8151 0.2953 0.8151 0.9028
No log 2.1111 76 0.7686 0.3031 0.7686 0.8767
No log 2.1667 78 0.7794 0.3408 0.7794 0.8829
No log 2.2222 80 1.0362 0.2643 1.0362 1.0179
No log 2.2778 82 1.2021 0.1109 1.2021 1.0964
No log 2.3333 84 1.3732 -0.0991 1.3732 1.1718
No log 2.3889 86 1.3007 -0.0257 1.3007 1.1405
No log 2.4444 88 0.9696 0.1766 0.9696 0.9847
No log 2.5 90 0.7766 0.0930 0.7766 0.8813
No log 2.5556 92 0.7665 0.2116 0.7665 0.8755
No log 2.6111 94 0.6899 0.2327 0.6899 0.8306
No log 2.6667 96 0.7055 0.3155 0.7055 0.8399
No log 2.7222 98 0.8247 0.3371 0.8247 0.9081
No log 2.7778 100 0.8312 0.3371 0.8312 0.9117
No log 2.8333 102 0.7184 0.4165 0.7184 0.8476
No log 2.8889 104 0.6760 0.3581 0.6760 0.8222
No log 2.9444 106 0.6782 0.3556 0.6782 0.8235
No log 3.0 108 0.8702 0.3623 0.8702 0.9328
No log 3.0556 110 1.1767 0.2115 1.1767 1.0847
No log 3.1111 112 1.0802 0.2682 1.0802 1.0393
No log 3.1667 114 0.7772 0.3630 0.7772 0.8816
No log 3.2222 116 0.7059 0.2994 0.7059 0.8402
No log 3.2778 118 0.7862 0.3505 0.7862 0.8867
No log 3.3333 120 0.7452 0.3316 0.7452 0.8633
No log 3.3889 122 0.6980 0.2684 0.6980 0.8354
No log 3.4444 124 0.7901 0.3564 0.7901 0.8889
No log 3.5 126 0.8911 0.3938 0.8911 0.9440
No log 3.5556 128 0.9007 0.3938 0.9007 0.9490
No log 3.6111 130 0.7964 0.3630 0.7964 0.8924
No log 3.6667 132 0.7309 0.3060 0.7309 0.8549
No log 3.7222 134 0.7263 0.3116 0.7263 0.8523
No log 3.7778 136 0.7919 0.2812 0.7919 0.8899
No log 3.8333 138 0.9654 0.3439 0.9654 0.9826
No log 3.8889 140 1.0088 0.2948 1.0088 1.0044
No log 3.9444 142 0.8781 0.3008 0.8781 0.9371
No log 4.0 144 0.7780 0.2784 0.7780 0.8820
No log 4.0556 146 0.7882 0.2784 0.7882 0.8878
No log 4.1111 148 0.9037 0.2726 0.9037 0.9506
No log 4.1667 150 1.0446 0.3029 1.0446 1.0220
No log 4.2222 152 1.0160 0.2806 1.0160 1.0080
No log 4.2778 154 0.8929 0.2297 0.8929 0.9449
No log 4.3333 156 0.8393 0.2109 0.8393 0.9161
No log 4.3889 158 0.8415 0.3504 0.8415 0.9173
No log 4.4444 160 0.8754 0.2808 0.8754 0.9356
No log 4.5 162 0.9574 0.2626 0.9574 0.9785
No log 4.5556 164 1.0389 0.3082 1.0389 1.0192
No log 4.6111 166 0.9983 0.3082 0.9983 0.9992
No log 4.6667 168 0.9032 0.2923 0.9032 0.9504
No log 4.7222 170 0.7896 0.2815 0.7896 0.8886
No log 4.7778 172 0.7980 0.3243 0.7980 0.8933
No log 4.8333 174 0.7847 0.2916 0.7847 0.8858
No log 4.8889 176 0.7936 0.2319 0.7936 0.8909
No log 4.9444 178 0.8450 0.2063 0.8450 0.9192
No log 5.0 180 0.8983 0.1867 0.8983 0.9478
No log 5.0556 182 0.9091 0.1867 0.9091 0.9535
No log 5.1111 184 0.8618 0.1718 0.8618 0.9284
No log 5.1667 186 0.8316 0.1866 0.8316 0.9119
No log 5.2222 188 0.9009 0.2142 0.9009 0.9492
No log 5.2778 190 0.9415 0.2358 0.9415 0.9703
No log 5.3333 192 1.0603 0.2119 1.0603 1.0297
No log 5.3889 194 1.0557 0.2119 1.0557 1.0275
No log 5.4444 196 0.9433 0.2358 0.9433 0.9712
No log 5.5 198 0.8836 0.2244 0.8836 0.9400
No log 5.5556 200 0.9140 0.2142 0.9140 0.9560
No log 5.6111 202 1.0767 0.2810 1.0767 1.0376
No log 5.6667 204 1.2505 0.2084 1.2505 1.1183
No log 5.7222 206 1.3327 0.1458 1.3327 1.1544
No log 5.7778 208 1.2176 0.2392 1.2176 1.1034
No log 5.8333 210 1.1306 0.2075 1.1306 1.0633
No log 5.8889 212 1.0227 0.2460 1.0227 1.0113
No log 5.9444 214 0.9800 0.2810 0.9800 0.9899
No log 6.0 216 0.9584 0.2810 0.9584 0.9790
No log 6.0556 218 1.0157 0.2460 1.0157 1.0078
No log 6.1111 220 1.1102 0.2833 1.1102 1.0537
No log 6.1667 222 1.0744 0.2833 1.0744 1.0365
No log 6.2222 224 0.9706 0.3347 0.9706 0.9852
No log 6.2778 226 0.9404 0.2810 0.9404 0.9698
No log 6.3333 228 0.9420 0.2670 0.9420 0.9705
No log 6.3889 230 0.9304 0.1962 0.9304 0.9645
No log 6.4444 232 1.0131 0.2142 1.0131 1.0065
No log 6.5 234 1.2174 0.0894 1.2174 1.1034
No log 6.5556 236 1.4668 0.0969 1.4668 1.2111
No log 6.6111 238 1.4500 0.0993 1.4500 1.2041
No log 6.6667 240 1.2109 0.1535 1.2109 1.1004
No log 6.7222 242 0.9755 0.2142 0.9755 0.9877
No log 6.7778 244 0.9473 0.2193 0.9473 0.9733
No log 6.8333 246 1.0603 0.1869 1.0603 1.0297
No log 6.8889 248 1.1707 0.1909 1.1707 1.0820
No log 6.9444 250 1.1697 0.2412 1.1697 1.0815
No log 7.0 252 1.2312 0.1713 1.2312 1.1096
No log 7.0556 254 1.1376 0.2482 1.1376 1.0666
No log 7.1111 256 0.9507 0.2029 0.9507 0.9750
No log 7.1667 258 0.7763 0.2027 0.7763 0.8811
No log 7.2222 260 0.7355 0.1922 0.7355 0.8576
No log 7.2778 262 0.7637 0.2085 0.7637 0.8739
No log 7.3333 264 0.9312 0.1962 0.9312 0.9650
No log 7.3889 266 1.1819 0.2501 1.1819 1.0872
No log 7.4444 268 1.2457 0.1638 1.2457 1.1161
No log 7.5 270 1.1364 0.3337 1.1364 1.0660
No log 7.5556 272 0.9774 0.2193 0.9774 0.9886
No log 7.6111 274 0.9411 0.2193 0.9411 0.9701
No log 7.6667 276 0.9875 0.2142 0.9875 0.9937
No log 7.7222 278 0.9694 0.2142 0.9694 0.9846
No log 7.7778 280 0.9926 0.3473 0.9926 0.9963
No log 7.8333 282 1.0200 0.3516 1.0200 1.0099
No log 7.8889 284 0.9929 0.3287 0.9929 0.9964
No log 7.9444 286 0.9666 0.3231 0.9666 0.9831
No log 8.0 288 0.9308 0.2574 0.9308 0.9648
No log 8.0556 290 0.9621 0.2574 0.9621 0.9808
No log 8.1111 292 1.0236 0.1651 1.0236 1.0117
No log 8.1667 294 1.1880 0.2367 1.1880 1.0899
No log 8.2222 296 1.3663 0.0992 1.3663 1.1689
No log 8.2778 298 1.3799 0.1041 1.3799 1.1747
No log 8.3333 300 1.1694 0.2324 1.1694 1.0814
No log 8.3889 302 0.9271 0.2244 0.9271 0.9629
No log 8.4444 304 0.7951 0.2652 0.7951 0.8917
No log 8.5 306 0.7634 0.2471 0.7634 0.8737
No log 8.5556 308 0.7808 0.2652 0.7808 0.8836
No log 8.6111 310 0.8002 0.2527 0.8002 0.8945
No log 8.6667 312 0.9041 0.2632 0.9041 0.9508
No log 8.7222 314 1.0021 0.1348 1.0021 1.0010
No log 8.7778 316 1.0409 0.1612 1.0409 1.0202
No log 8.8333 318 1.0547 0.1535 1.0547 1.0270
No log 8.8889 320 1.1255 0.2635 1.1255 1.0609
No log 8.9444 322 1.0459 0.2412 1.0459 1.0227
No log 9.0 324 1.0186 0.1147 1.0186 1.0093
No log 9.0556 326 1.0454 0.1178 1.0454 1.0225
No log 9.1111 328 1.0503 0.1277 1.0503 1.0249
No log 9.1667 330 0.9570 0.2193 0.9570 0.9783
No log 9.2222 332 1.0083 0.2193 1.0083 1.0041
No log 9.2778 334 0.9766 0.2244 0.9766 0.9882
No log 9.3333 336 0.9766 0.2244 0.9766 0.9882
No log 9.3889 338 0.9704 0.2244 0.9704 0.9851
No log 9.4444 340 0.9347 0.2632 0.9347 0.9668
No log 9.5 342 0.9507 0.2632 0.9507 0.9750
No log 9.5556 344 1.0398 0.1911 1.0398 1.0197
No log 9.6111 346 1.0946 0.0894 1.0946 1.0462
No log 9.6667 348 1.0422 0.1612 1.0422 1.0209
No log 9.7222 350 1.0922 0.1210 1.0922 1.0451
No log 9.7778 352 1.1244 0.0569 1.1244 1.0604
No log 9.8333 354 1.1717 0.1356 1.1717 1.0824
No log 9.8889 356 1.1302 0.1356 1.1302 1.0631
No log 9.9444 358 0.9414 0.3105 0.9414 0.9703
No log 10.0 360 0.8083 0.2632 0.8083 0.8991
No log 10.0556 362 0.7743 0.2883 0.7743 0.8799
No log 10.1111 364 0.8033 0.2754 0.8033 0.8962
No log 10.1667 366 0.8797 0.2843 0.8797 0.9379
No log 10.2222 368 1.0038 0.2460 1.0038 1.0019
No log 10.2778 370 1.0090 0.1911 1.0090 1.0045
No log 10.3333 372 0.9197 0.2843 0.9197 0.9590
No log 10.3889 374 0.9157 0.2574 0.9157 0.9569
No log 10.4444 376 0.9940 0.1723 0.9940 0.9970
No log 10.5 378 1.0495 0.1557 1.0495 1.0245
No log 10.5556 380 1.0497 0.1304 1.0497 1.0245
No log 10.6111 382 1.0078 0.1304 1.0078 1.0039
No log 10.6667 384 0.8825 0.2632 0.8825 0.9394
No log 10.7222 386 0.8275 0.2467 0.8275 0.9097
No log 10.7778 388 0.8367 0.2467 0.8367 0.9147
No log 10.8333 390 0.8344 0.2527 0.8344 0.9135
No log 10.8889 392 0.8668 0.2692 0.8668 0.9310
No log 10.9444 394 0.9259 0.2193 0.9259 0.9622
No log 11.0 396 1.0289 0.2389 1.0289 1.0144
No log 11.0556 398 1.0886 0.2529 1.0886 1.0434
No log 11.1111 400 1.1749 0.1858 1.1749 1.0839
No log 11.1667 402 1.0777 0.2529 1.0777 1.0381
No log 11.2222 404 0.9007 0.2463 0.9007 0.9490
No log 11.2778 406 0.7597 0.3092 0.7597 0.8716
No log 11.3333 408 0.7123 0.2537 0.7123 0.8440
No log 11.3889 410 0.7303 0.2537 0.7303 0.8546
No log 11.4444 412 0.8155 0.2527 0.8155 0.9031
No log 11.5 414 0.9817 0.2866 0.9817 0.9908
No log 11.5556 416 1.1696 0.1497 1.1696 1.0815
No log 11.6111 418 1.2034 0.1497 1.2034 1.0970
No log 11.6667 420 1.0732 0.2271 1.0732 1.0360
No log 11.7222 422 0.9332 0.2142 0.9332 0.9660
No log 11.7778 424 0.8148 0.2227 0.8148 0.9027
No log 11.8333 426 0.7609 0.3092 0.7609 0.8723
No log 11.8889 428 0.8005 0.2227 0.8005 0.8947
No log 11.9444 430 0.9069 0.2518 0.9069 0.9523
No log 12.0 432 1.0162 0.1897 1.0162 1.0081
No log 12.0556 434 1.1694 0.2392 1.1694 1.0814
No log 12.1111 436 1.2191 0.2306 1.2191 1.1041
No log 12.1667 438 1.1936 0.2412 1.1936 1.0925
No log 12.2222 440 1.0998 0.2504 1.0998 1.0487
No log 12.2778 442 1.0597 0.2504 1.0597 1.0294
No log 12.3333 444 0.9757 0.1911 0.9757 0.9878
No log 12.3889 446 0.8601 0.2244 0.8601 0.9274
No log 12.4444 448 0.8071 0.2116 0.8071 0.8984
No log 12.5 450 0.8156 0.2409 0.8156 0.9031
No log 12.5556 452 0.8984 0.2297 0.8984 0.9478
No log 12.6111 454 1.0381 0.2487 1.0381 1.0188
No log 12.6667 456 1.0533 0.2343 1.0533 1.0263
No log 12.7222 458 1.0023 0.2866 1.0023 1.0012
No log 12.7778 460 0.9411 0.2982 0.9411 0.9701
No log 12.8333 462 0.8594 0.2632 0.8594 0.9270
No log 12.8889 464 0.7886 0.2718 0.7886 0.8880
No log 12.9444 466 0.7827 0.2407 0.7827 0.8847
No log 13.0 468 0.7767 0.2407 0.7767 0.8813
No log 13.0556 470 0.7681 0.2787 0.7681 0.8764
No log 13.1111 472 0.8132 0.2227 0.8132 0.9018
No log 13.1667 474 0.8431 0.2754 0.8431 0.9182
No log 13.2222 476 0.8578 0.3032 0.8578 0.9262
No log 13.2778 478 0.8452 0.3032 0.8452 0.9194
No log 13.3333 480 0.8673 0.3105 0.8673 0.9313
No log 13.3889 482 0.9200 0.2923 0.9200 0.9592
No log 13.4444 484 1.0200 0.2601 1.0200 1.0099
No log 13.5 486 1.1227 0.2457 1.1227 1.0596
No log 13.5556 488 1.1941 0.2367 1.1941 1.0928
No log 13.6111 490 1.0901 0.2457 1.0901 1.0441
No log 13.6667 492 0.9472 0.2000 0.9472 0.9733
No log 13.7222 494 0.8449 0.2754 0.8449 0.9192
No log 13.7778 496 0.8054 0.2527 0.8054 0.8974
No log 13.8333 498 0.8127 0.3099 0.8127 0.9015
0.2808 13.8889 500 0.8777 0.2574 0.8777 0.9369
0.2808 13.9444 502 0.9984 0.1499 0.9984 0.9992
0.2808 14.0 504 1.1016 0.2504 1.1016 1.0496
0.2808 14.0556 506 1.1035 0.2732 1.1035 1.0505
0.2808 14.1111 508 1.1295 0.2732 1.1295 1.0628
0.2808 14.1667 510 1.1452 0.2732 1.1452 1.0702
0.2808 14.2222 512 1.1287 0.2732 1.1287 1.0624
0.2808 14.2778 514 1.1542 0.1810 1.1542 1.0743
0.2808 14.3333 516 1.1514 0.1810 1.1514 1.0730
0.2808 14.3889 518 1.0839 0.1846 1.0839 1.0411
0.2808 14.4444 520 0.9800 0.2670 0.9800 0.9899
0.2808 14.5 522 0.9747 0.2308 0.9747 0.9873
0.2808 14.5556 524 0.9557 0.1955 0.9557 0.9776
0.2808 14.6111 526 0.9533 0.1955 0.9533 0.9764
0.2808 14.6667 528 0.9735 0.1612 0.9735 0.9867
0.2808 14.7222 530 0.9965 0.1044 0.9965 0.9982

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
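To reproduce the environment, the versions above can be pinned in a requirements file (a sketch; the cu118 build of PyTorch additionally requires installing from the matching CUDA wheel index):

```text
transformers==4.44.2
torch==2.4.0
datasets==2.21.0
tokenizers==0.19.1
```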
Model details

  • Model size: 0.1B params
  • Tensor type: F32 (Safetensors)
