ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for reproducing these metrics follows the list):

  • Loss: 0.7313
  • Qwk: 0.2193
  • Mse: 0.7313
  • Rmse: 0.8552
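
For reference, these metrics can be computed from model predictions with scikit-learn. The sketch below is illustrative only: the arrays are hypothetical, and treating the scores as integer ordinal labels for the quadratic weighted kappa (Qwk) is an assumption about the task, not something the card states.

```python
# Minimal sketch: computing Qwk, MSE, and RMSE from predictions.
# `y_true` and `y_pred` are hypothetical; Qwk assumes the scores are
# (or can be rounded to) integer ordinal labels.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 1, 3])            # hypothetical gold scores
y_pred = np.array([0.2, 1.1, 1.8, 0.9, 2.7])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```

Note that the reported Loss equals Mse (0.7313), which is consistent with an MSE training objective.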

Model description

More information needed

Intended uses & limitations

More information needed
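
No usage guidance is provided. As a hedged illustration only, the snippet below shows one plausible way to load the checkpoint; treating it as a sequence classifier whose raw logits act as a score (a regression-style head) is inferred from the MSE/RMSE metrics above and is not confirmed by the card.

```python
# Hedged sketch: one plausible way to run this checkpoint. Loading it as a
# sequence classifier and reading the raw logits as a score is an assumption
# inferred from the MSE/RMSE metrics, not confirmed by the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = ("MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT"
           "_run1_AugV5_k19_task7_organization")
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# "A sample essay text" in Arabic
inputs = tokenizer("نص مقال تجريبي", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels)
print(logits)
```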

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
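
As a minimal sketch, these values map onto a transformers TrainingArguments object as shown below; the output directory is a hypothetical placeholder, and the model and datasets are omitted because the card does not specify them.

```python
# Sketch: the listed hyperparameters as a TrainingArguments object
# (Transformers 4.44.2 API). The model and datasets are not specified by
# the card, so the Trainer construction is left commented out.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical output path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```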

Training results

Evaluation ran every 2 steps. "No log" in the training-loss column means no training loss had been logged yet; the first logged value appears at step 500.

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0317 2 2.6268 -0.0109 2.6268 1.6207
No log 0.0635 4 1.3499 0.1000 1.3499 1.1619
No log 0.0952 6 0.9620 -0.0550 0.9620 0.9808
No log 0.1270 8 1.3578 -0.0471 1.3578 1.1652
No log 0.1587 10 1.2481 -0.0733 1.2481 1.1172
No log 0.1905 12 0.9136 0.1221 0.9136 0.9558
No log 0.2222 14 0.8425 0.1094 0.8425 0.9179
No log 0.2540 16 0.8038 0.0 0.8038 0.8965
No log 0.2857 18 0.7850 0.0 0.7850 0.8860
No log 0.3175 20 0.7713 0.0 0.7713 0.8782
No log 0.3492 22 0.7838 0.0 0.7838 0.8853
No log 0.3810 24 0.8451 0.1770 0.8451 0.9193
No log 0.4127 26 0.8933 0.2769 0.8933 0.9452
No log 0.4444 28 0.7988 0.2156 0.7988 0.8938
No log 0.4762 30 0.7136 -0.0027 0.7136 0.8448
No log 0.5079 32 0.7036 0.1187 0.7036 0.8388
No log 0.5397 34 0.7372 0.1456 0.7372 0.8586
No log 0.5714 36 0.7852 0.0444 0.7852 0.8861
No log 0.6032 38 0.9132 -0.0228 0.9132 0.9556
No log 0.6349 40 0.8923 0.0145 0.8923 0.9446
No log 0.6667 42 0.7901 0.0481 0.7901 0.8889
No log 0.6984 44 0.7431 0.2454 0.7431 0.8620
No log 0.7302 46 0.7169 0.1272 0.7169 0.8467
No log 0.7619 48 0.8347 0.0 0.8347 0.9136
No log 0.7937 50 0.9373 0.0 0.9373 0.9682
No log 0.8254 52 0.9355 0.0 0.9355 0.9672
No log 0.8571 54 0.8647 0.0 0.8647 0.9299
No log 0.8889 56 0.8061 0.0 0.8061 0.8979
No log 0.9206 58 0.7828 0.0 0.7828 0.8848
No log 0.9524 60 0.7792 0.0 0.7792 0.8827
No log 0.9841 62 0.7980 0.0 0.7980 0.8933
No log 1.0159 64 0.8152 0.0 0.8152 0.9029
No log 1.0476 66 0.8441 0.0 0.8441 0.9187
No log 1.0794 68 0.9179 0.1711 0.9179 0.9581
No log 1.1111 70 0.8751 0.1352 0.8751 0.9354
No log 1.1429 72 0.8854 0.0947 0.8854 0.9409
No log 1.1746 74 0.9144 0.0522 0.9144 0.9562
No log 1.2063 76 0.8456 0.0481 0.8456 0.9196
No log 1.2381 78 0.7870 0.0717 0.7870 0.8872
No log 1.2698 80 0.7815 0.0717 0.7815 0.8840
No log 1.3016 82 0.7831 0.0798 0.7831 0.8849
No log 1.3333 84 0.8263 0.0509 0.8263 0.9090
No log 1.3651 86 0.9857 0.0966 0.9857 0.9928
No log 1.3968 88 1.2214 0.1468 1.2214 1.1052
No log 1.4286 90 1.2053 0.1234 1.2053 1.0979
No log 1.4603 92 1.0389 0.2702 1.0389 1.0193
No log 1.4921 94 0.8623 0.3444 0.8623 0.9286
No log 1.5238 96 0.7409 0.0851 0.7409 0.8607
No log 1.5556 98 0.7610 0.0798 0.7610 0.8724
No log 1.5873 100 0.8358 0.0798 0.8358 0.9142
No log 1.6190 102 1.0402 -0.0424 1.0402 1.0199
No log 1.6508 104 1.0845 -0.0316 1.0845 1.0414
No log 1.6825 106 0.8925 0.0464 0.8925 0.9447
No log 1.7143 108 0.8573 0.0024 0.8573 0.9259
No log 1.7460 110 0.8069 0.0408 0.8069 0.8983
No log 1.7778 112 0.8444 0.0437 0.8444 0.9189
No log 1.8095 114 0.8878 0.1281 0.8878 0.9422
No log 1.8413 116 1.1026 0.0710 1.1026 1.0500
No log 1.8730 118 1.0548 0.1271 1.0548 1.0270
No log 1.9048 120 1.0378 0.0980 1.0378 1.0187
No log 1.9365 122 0.8730 0.0026 0.8730 0.9344
No log 1.9683 124 0.7755 -0.0027 0.7755 0.8806
No log 2.0 126 0.7685 0.0393 0.7685 0.8767
No log 2.0317 128 0.7676 0.1187 0.7676 0.8761
No log 2.0635 130 0.7741 0.0393 0.7741 0.8798
No log 2.0952 132 0.7805 0.0410 0.7805 0.8835
No log 2.1270 134 0.8172 0.1673 0.8172 0.9040
No log 2.1587 136 0.8997 0.2277 0.8997 0.9485
No log 2.1905 138 0.9256 0.2254 0.9256 0.9621
No log 2.2222 140 0.8195 0.3031 0.8195 0.9052
No log 2.2540 142 0.7363 0.1561 0.7363 0.8581
No log 2.2857 144 0.7376 0.1863 0.7376 0.8589
No log 2.3175 146 0.7344 0.1863 0.7344 0.8570
No log 2.3492 148 0.7553 0.1224 0.7553 0.8691
No log 2.3810 150 0.8162 0.1304 0.8162 0.9034
No log 2.4127 152 0.7930 0.1255 0.7930 0.8905
No log 2.4444 154 0.7830 0.1586 0.7830 0.8849
No log 2.4762 156 0.7571 0.1569 0.7571 0.8701
No log 2.5079 158 0.7640 0.2023 0.7640 0.8741
No log 2.5397 160 0.7731 0.2270 0.7731 0.8793
No log 2.5714 162 0.7743 0.2951 0.7743 0.8800
No log 2.6032 164 0.7927 0.2843 0.7927 0.8903
No log 2.6349 166 0.8208 0.2661 0.8208 0.9060
No log 2.6667 168 0.7707 0.4006 0.7707 0.8779
No log 2.6984 170 0.8028 0.2879 0.8028 0.8960
No log 2.7302 172 0.9098 0.2895 0.9098 0.9539
No log 2.7619 174 0.8578 0.2991 0.8578 0.9262
No log 2.7937 176 0.7826 0.2998 0.7826 0.8847
No log 2.8254 178 0.7500 0.3982 0.7500 0.8661
No log 2.8571 180 0.8295 0.3367 0.8295 0.9108
No log 2.8889 182 0.9964 0.0679 0.9964 0.9982
No log 2.9206 184 0.9434 0.0925 0.9434 0.9713
No log 2.9524 186 0.8342 0.1218 0.8342 0.9134
No log 2.9841 188 0.7720 0.2243 0.7720 0.8786
No log 3.0159 190 0.8439 0.2153 0.8439 0.9186
No log 3.0476 192 0.8505 0.2153 0.8505 0.9222
No log 3.0794 194 0.7806 0.2677 0.7806 0.8835
No log 3.1111 196 0.8832 -0.0426 0.8832 0.9398
No log 3.1429 198 1.0015 0.0580 1.0015 1.0008
No log 3.1746 200 0.9382 0.0559 0.9382 0.9686
No log 3.2063 202 0.8289 -0.0026 0.8289 0.9104
No log 3.2381 204 0.8023 0.1187 0.8023 0.8957
No log 3.2698 206 0.8002 0.1187 0.8002 0.8945
No log 3.3016 208 0.7929 0.1983 0.7929 0.8904
No log 3.3333 210 0.8199 0.0376 0.8199 0.9055
No log 3.3651 212 0.8759 0.0822 0.8759 0.9359
No log 3.3968 214 0.8439 0.2264 0.8439 0.9186
No log 3.4286 216 0.7945 0.2451 0.7945 0.8913
No log 3.4603 218 0.7745 0.2717 0.7745 0.8801
No log 3.4921 220 0.7685 0.2890 0.7685 0.8766
No log 3.5238 222 0.7834 0.2181 0.7834 0.8851
No log 3.5556 224 0.8250 0.1176 0.8250 0.9083
No log 3.5873 226 0.8406 0.1558 0.8406 0.9168
No log 3.6190 228 0.8371 0.2126 0.8371 0.9149
No log 3.6508 230 0.8186 0.2224 0.8186 0.9048
No log 3.6825 232 0.8594 0.2009 0.8594 0.9270
No log 3.7143 234 0.8921 0.2009 0.8921 0.9445
No log 3.7460 236 0.8619 0.2009 0.8619 0.9284
No log 3.7778 238 0.8312 0.2152 0.8312 0.9117
No log 3.8095 240 0.8494 0.2318 0.8494 0.9217
No log 3.8413 242 0.8306 0.2744 0.8306 0.9114
No log 3.8730 244 0.8479 0.2389 0.8479 0.9208
No log 3.9048 246 0.9091 0.1826 0.9091 0.9535
No log 3.9365 248 0.9514 0.0531 0.9514 0.9754
No log 3.9683 250 0.8524 0.2319 0.8524 0.9233
No log 4.0 252 0.8281 0.2160 0.8281 0.9100
No log 4.0317 254 1.0160 0.1176 1.0160 1.0080
No log 4.0635 256 1.1229 0.0689 1.1229 1.0597
No log 4.0952 258 1.0342 0.1176 1.0342 1.0170
No log 4.1270 260 0.8462 0.1538 0.8462 0.9199
No log 4.1587 262 0.8161 0.3081 0.8161 0.9034
No log 4.1905 264 0.8869 0.1800 0.8869 0.9417
No log 4.2222 266 0.8746 0.1961 0.8746 0.9352
No log 4.2540 268 0.8283 0.2591 0.8283 0.9101
No log 4.2857 270 0.8411 0.1961 0.8411 0.9171
No log 4.3175 272 0.8428 0.1961 0.8428 0.9180
No log 4.3492 274 0.8298 0.2591 0.8298 0.9109
No log 4.3810 276 0.8560 0.2009 0.8560 0.9252
No log 4.4127 278 0.8378 0.2389 0.8378 0.9153
No log 4.4444 280 0.7927 0.2270 0.7927 0.8903
No log 4.4762 282 0.7818 0.2327 0.7818 0.8842
No log 4.5079 284 0.7859 0.2270 0.7859 0.8865
No log 4.5397 286 0.8002 0.2058 0.8002 0.8945
No log 4.5714 288 0.7870 0.2058 0.7870 0.8871
No log 4.6032 290 0.7579 0.3667 0.7579 0.8706
No log 4.6349 292 0.7461 0.3252 0.7461 0.8638
No log 4.6667 294 0.7361 0.4012 0.7361 0.8580
No log 4.6984 296 0.7656 0.4092 0.7656 0.8750
No log 4.7302 298 0.7820 0.4059 0.7820 0.8843
No log 4.7619 300 0.7779 0.3399 0.7779 0.8820
No log 4.7937 302 0.7438 0.3885 0.7438 0.8625
No log 4.8254 304 0.7547 0.2877 0.7547 0.8687
No log 4.8571 306 0.7874 0.2813 0.7874 0.8874
No log 4.8889 308 0.7837 0.2717 0.7837 0.8853
No log 4.9206 310 0.7783 0.2353 0.7783 0.8822
No log 4.9524 312 0.7938 0.2283 0.7938 0.8910
No log 4.9841 314 0.8091 0.2611 0.8091 0.8995
No log 5.0159 316 0.7886 0.2674 0.7886 0.8880
No log 5.0476 318 0.7752 0.2955 0.7752 0.8805
No log 5.0794 320 0.7979 0.2530 0.7979 0.8933
No log 5.1111 322 0.8513 0.1566 0.8513 0.9226
No log 5.1429 324 0.8443 0.1607 0.8443 0.9189
No log 5.1746 326 0.7871 0.3961 0.7871 0.8872
No log 5.2063 328 0.7838 0.3467 0.7838 0.8853
No log 5.2381 330 0.7890 0.3467 0.7890 0.8882
No log 5.2698 332 0.7893 0.3122 0.7893 0.8884
No log 5.3016 334 0.8225 0.3816 0.8225 0.9069
No log 5.3333 336 0.8231 0.3839 0.8231 0.9073
No log 5.3651 338 0.8287 0.2540 0.8287 0.9103
No log 5.3968 340 0.8916 0.2332 0.8916 0.9442
No log 5.4286 342 0.8777 0.2355 0.8777 0.9369
No log 5.4603 344 0.8186 0.2595 0.8186 0.9048
No log 5.4921 346 0.7899 0.2342 0.7899 0.8887
No log 5.5238 348 0.7791 0.2058 0.7791 0.8827
No log 5.5556 350 0.7763 0.1550 0.7763 0.8811
No log 5.5873 352 0.7721 0.1550 0.7721 0.8787
No log 5.6190 354 0.7645 0.1550 0.7645 0.8744
No log 5.6508 356 0.7692 0.1550 0.7692 0.8771
No log 5.6825 358 0.7903 0.1336 0.7903 0.8890
No log 5.7143 360 0.8221 0.1984 0.8221 0.9067
No log 5.7460 362 0.8543 0.2595 0.8543 0.9243
No log 5.7778 364 0.8478 0.3122 0.8478 0.9208
No log 5.8095 366 0.9313 0.2183 0.9313 0.9650
No log 5.8413 368 0.9865 0.2201 0.9865 0.9932
No log 5.8730 370 0.9018 0.2696 0.9018 0.9496
No log 5.9048 372 0.8168 0.2780 0.8168 0.9038
No log 5.9365 374 0.7753 0.2543 0.7753 0.8805
No log 5.9683 376 0.7710 0.2884 0.7710 0.8781
No log 6.0 378 0.7921 0.2843 0.7921 0.8900
No log 6.0317 380 0.7904 0.3051 0.7904 0.8890
No log 6.0635 382 0.8163 0.2751 0.8163 0.9035
No log 6.0952 384 0.8681 0.3068 0.8681 0.9317
No log 6.1270 386 0.8460 0.2617 0.8460 0.9198
No log 6.1587 388 0.8083 0.2993 0.8083 0.8990
No log 6.1905 390 0.7765 0.3482 0.7765 0.8812
No log 6.2222 392 0.7500 0.3714 0.7500 0.8660
No log 6.2540 394 0.7489 0.2715 0.7489 0.8654
No log 6.2857 396 0.7750 0.2943 0.7750 0.8803
No log 6.3175 398 0.8595 0.2294 0.8595 0.9271
No log 6.3492 400 0.8959 0.2702 0.8959 0.9465
No log 6.3810 402 0.8403 0.3001 0.8403 0.9167
No log 6.4127 404 0.7700 0.3434 0.7700 0.8775
No log 6.4444 406 0.7613 0.2901 0.7613 0.8725
No log 6.4762 408 0.7684 0.2838 0.7684 0.8766
No log 6.5079 410 0.7809 0.3366 0.7809 0.8837
No log 6.5397 412 0.7863 0.2424 0.7863 0.8867
No log 6.5714 414 0.8011 0.3172 0.8011 0.8951
No log 6.6032 416 0.8229 0.3417 0.8229 0.9071
No log 6.6349 418 0.8365 0.2993 0.8365 0.9146
No log 6.6667 420 0.8315 0.2749 0.8315 0.9119
No log 6.6984 422 0.8329 0.2866 0.8329 0.9126
No log 6.7302 424 0.8118 0.3314 0.8118 0.9010
No log 6.7619 426 0.8218 0.3314 0.8218 0.9066
No log 6.7937 428 0.8043 0.2777 0.8043 0.8968
No log 6.8254 430 0.7883 0.2806 0.7883 0.8879
No log 6.8571 432 0.7732 0.2867 0.7732 0.8793
No log 6.8889 434 0.7760 0.1853 0.7760 0.8809
No log 6.9206 436 0.7921 0.3382 0.7921 0.8900
No log 6.9524 438 0.7770 0.2419 0.7770 0.8815
No log 6.9841 440 0.7706 0.2449 0.7706 0.8779
No log 7.0159 442 0.7533 0.3133 0.7533 0.8679
No log 7.0476 444 0.7527 0.3149 0.7527 0.8676
No log 7.0794 446 0.7571 0.2652 0.7571 0.8701
No log 7.1111 448 0.7538 0.3213 0.7538 0.8682
No log 7.1429 450 0.7520 0.3417 0.7520 0.8672
No log 7.1746 452 0.7658 0.3034 0.7658 0.8751
No log 7.2063 454 0.8022 0.2962 0.8022 0.8957
No log 7.2381 456 0.7753 0.3076 0.7753 0.8805
No log 7.2698 458 0.7564 0.3738 0.7564 0.8697
No log 7.3016 460 0.7993 0.2130 0.7993 0.8941
No log 7.3333 462 0.8111 0.2236 0.8111 0.9006
No log 7.3651 464 0.7701 0.2519 0.7701 0.8775
No log 7.3968 466 0.7735 0.2777 0.7735 0.8795
No log 7.4286 468 0.9514 0.2303 0.9514 0.9754
No log 7.4603 470 1.0735 0.1799 1.0735 1.0361
No log 7.4921 472 1.0192 0.1835 1.0192 1.0096
No log 7.5238 474 0.8638 0.1859 0.8638 0.9294
No log 7.5556 476 0.7978 0.3512 0.7978 0.8932
No log 7.5873 478 0.7958 0.3578 0.7958 0.8921
No log 7.6190 480 0.8217 0.2038 0.8217 0.9065
No log 7.6508 482 0.8199 0.2281 0.8199 0.9055
No log 7.6825 484 0.7903 0.2831 0.7903 0.8890
No log 7.7143 486 0.7774 0.2947 0.7774 0.8817
No log 7.7460 488 0.7691 0.3260 0.7691 0.8770
No log 7.7778 490 0.7451 0.3955 0.7451 0.8632
No log 7.8095 492 0.7433 0.4137 0.7433 0.8622
No log 7.8413 494 0.7492 0.4211 0.7492 0.8656
No log 7.8730 496 0.7502 0.4211 0.7502 0.8662
No log 7.9048 498 0.7416 0.4137 0.7416 0.8612
0.4124 7.9365 500 0.7378 0.3350 0.7378 0.8590
0.4124 7.9683 502 0.7313 0.2234 0.7313 0.8552
0.4124 8.0 504 0.7329 0.2774 0.7329 0.8561
0.4124 8.0317 506 0.7327 0.3258 0.7327 0.8560
0.4124 8.0635 508 0.7402 0.2884 0.7402 0.8603
0.4124 8.0952 510 0.7576 0.3683 0.7576 0.8704
0.4124 8.1270 512 0.7504 0.3478 0.7504 0.8663
0.4124 8.1587 514 0.7513 0.3944 0.7513 0.8668
0.4124 8.1905 516 0.7575 0.3725 0.7575 0.8703
0.4124 8.2222 518 0.7473 0.3360 0.7473 0.8645
0.4124 8.2540 520 0.7425 0.3352 0.7425 0.8617
0.4124 8.2857 522 0.7948 0.2722 0.7948 0.8915
0.4124 8.3175 524 0.8564 0.2754 0.8564 0.9254
0.4124 8.3492 526 0.8359 0.2887 0.8359 0.9143
0.4124 8.3810 528 0.7599 0.2621 0.7599 0.8717
0.4124 8.4127 530 0.7313 0.2193 0.7313 0.8552

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B parameters (tensor type F32, safetensors)