ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card does not name it). It achieves the following results on the evaluation set:

  • Loss: 0.8070
  • QWK (quadratic weighted kappa): 0.3440
  • MSE: 0.8070
  • RMSE: 0.8984
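
Loss and MSE are identical here, consistent with a mean-squared-error regression objective, and RMSE is simply the square root of the MSE. QWK (quadratic weighted kappa) is the agreement metric conventionally used for ordinal essay scores. The card does not publish its scoring code, so the following is an illustrative pure-Python sketch of QWK, not the authors' implementation:

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (QWK).

    Disagreements are penalized by the squared distance between the
    predicted and true ordinal labels, normalized so kappa is 1.0 for
    perfect agreement and ~0.0 for chance-level agreement.
    """
    n = len(y_true)
    # Observed confusion matrix O[true][pred].
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms; the expected matrix under chance agreement
    # is the outer product of the marginals divided by n.
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den
```

On this scale a QWK of 0.3440 indicates fair, well-below-human agreement with the reference scores.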

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
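
The `linear` scheduler above decays the learning rate from its base value to zero over the planned training horizon, after an optional warmup. A minimal sketch follows; the zero warmup and the 9000-step horizon (100 epochs at the ~90 optimizer steps per epoch implied by the results table below) are assumptions, since the card does not report them directly:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear schedule in the style of the `linear` lr_scheduler_type:
    ramp up over `warmup_steps`, then decay linearly to zero.
    warmup_steps=0 is an assumption; the card reports no warmup."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

# Planned horizon if all 100 epochs had run: ~90 steps/epoch * 100 epochs.
total_steps = 9000
```

So halfway through the planned run the learning rate would be 1e-05, reaching zero at the final step.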

Training results

The table below lists validation metrics at each logged evaluation step. Columns are: Training Loss, Epoch, Step, Validation Loss, QWK, MSE, RMSE. "No log" means the training loss had not yet been recorded; it is logged every 500 steps and first appears at step 500. Although num_epochs was set to 100, logging stops at epoch ~6.42 (step 578), so training appears to have ended early (e.g. via an early-stopping callback).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0222 2 2.9772 -0.0863 2.9772 1.7255
No log 0.0444 4 1.5682 -0.0020 1.5682 1.2523
No log 0.0667 6 1.2809 -0.2100 1.2809 1.1318
No log 0.0889 8 1.1245 -0.0379 1.1245 1.0604
No log 0.1111 10 1.3013 -0.0255 1.3013 1.1408
No log 0.1333 12 1.3404 -0.1185 1.3404 1.1578
No log 0.1556 14 1.3048 -0.1071 1.3048 1.1423
No log 0.1778 16 1.3349 -0.1703 1.3349 1.1554
No log 0.2 18 1.1499 -0.0133 1.1499 1.0724
No log 0.2222 20 0.9887 0.1054 0.9887 0.9943
No log 0.2444 22 0.8566 0.1904 0.8566 0.9255
No log 0.2667 24 0.9049 0.0570 0.9049 0.9513
No log 0.2889 26 0.9932 0.0974 0.9932 0.9966
No log 0.3111 28 1.2268 0.0745 1.2268 1.1076
No log 0.3333 30 0.9545 0.1288 0.9545 0.9770
No log 0.3556 32 0.7269 0.3363 0.7269 0.8526
No log 0.3778 34 0.7651 0.3833 0.7651 0.8747
No log 0.4 36 0.8966 0.3118 0.8966 0.9469
No log 0.4222 38 0.9287 0.2824 0.9287 0.9637
No log 0.4444 40 0.7266 0.3122 0.7266 0.8524
No log 0.4667 42 0.6879 0.3211 0.6879 0.8294
No log 0.4889 44 0.7119 0.3788 0.7119 0.8437
No log 0.5111 46 0.7218 0.3599 0.7218 0.8496
No log 0.5333 48 0.7241 0.3198 0.7241 0.8509
No log 0.5556 50 0.8483 0.2819 0.8483 0.9210
No log 0.5778 52 1.1327 0.2094 1.1327 1.0643
No log 0.6 54 1.0427 0.2122 1.0427 1.0211
No log 0.6222 56 0.7655 0.2826 0.7655 0.8749
No log 0.6444 58 0.7282 0.2034 0.7282 0.8533
No log 0.6667 60 0.7723 0.2944 0.7723 0.8788
No log 0.6889 62 0.8839 0.3009 0.8839 0.9402
No log 0.7111 64 0.7922 0.2612 0.7922 0.8900
No log 0.7333 66 0.6780 0.2205 0.6780 0.8234
No log 0.7556 68 0.6533 0.3197 0.6533 0.8083
No log 0.7778 70 0.6592 0.2787 0.6592 0.8119
No log 0.8 72 0.6496 0.1922 0.6496 0.8060
No log 0.8222 74 0.7294 0.0078 0.7294 0.8541
No log 0.8444 76 0.9453 0.2784 0.9453 0.9723
No log 0.8667 78 1.1904 0.1971 1.1904 1.0911
No log 0.8889 80 1.2177 0.1232 1.2177 1.1035
No log 0.9111 82 1.0446 0.0356 1.0446 1.0221
No log 0.9333 84 0.8866 0.1367 0.8866 0.9416
No log 0.9556 86 0.9478 0.1558 0.9478 0.9736
No log 0.9778 88 1.0145 0.0656 1.0145 1.0072
No log 1.0 90 1.0662 0.1770 1.0662 1.0326
No log 1.0222 92 1.1382 0.0941 1.1382 1.0669
No log 1.0444 94 1.2625 0.0278 1.2625 1.1236
No log 1.0667 96 1.2591 0.0767 1.2591 1.1221
No log 1.0889 98 1.0938 0.0965 1.0938 1.0459
No log 1.1111 100 0.9922 0.2020 0.9922 0.9961
No log 1.1333 102 0.9411 0.1842 0.9411 0.9701
No log 1.1556 104 0.8866 0.1440 0.8866 0.9416
No log 1.1778 106 0.8614 0.1448 0.8614 0.9281
No log 1.2 108 0.8328 0.1448 0.8328 0.9126
No log 1.2222 110 0.7977 0.1448 0.7977 0.8932
No log 1.2444 112 0.7737 0.2745 0.7737 0.8796
No log 1.2667 114 0.7698 0.2745 0.7698 0.8774
No log 1.2889 116 0.7644 0.2745 0.7644 0.8743
No log 1.3111 118 0.7569 0.1961 0.7569 0.8700
No log 1.3333 120 0.7474 0.1961 0.7474 0.8645
No log 1.3556 122 0.7221 0.2412 0.7221 0.8498
No log 1.3778 124 0.7540 0.1548 0.7540 0.8683
No log 1.4 126 0.9783 0.1512 0.9783 0.9891
No log 1.4222 128 1.0457 0.2330 1.0457 1.0226
No log 1.4444 130 1.0283 0.2330 1.0283 1.0141
No log 1.4667 132 0.8535 0.2714 0.8535 0.9239
No log 1.4889 134 0.8114 0.3350 0.8114 0.9008
No log 1.5111 136 0.7751 0.2992 0.7751 0.8804
No log 1.5333 138 0.8662 0.2678 0.8662 0.9307
No log 1.5556 140 1.0776 0.1856 1.0776 1.0381
No log 1.5778 142 0.8344 0.1909 0.8344 0.9134
No log 1.6 144 0.6543 0.2996 0.6543 0.8089
No log 1.6222 146 0.6839 0.2819 0.6839 0.8270
No log 1.6444 148 0.7130 0.2506 0.7130 0.8444
No log 1.6667 150 0.8010 0.1789 0.8010 0.8950
No log 1.6889 152 0.8615 0.1809 0.8615 0.9282
No log 1.7111 154 0.8754 0.2030 0.8754 0.9356
No log 1.7333 156 0.9645 0.2815 0.9645 0.9821
No log 1.7556 158 1.1437 0.2482 1.1437 1.0695
No log 1.7778 160 1.3959 0.0476 1.3959 1.1815
No log 1.8 162 1.3915 0.0506 1.3915 1.1796
No log 1.8222 164 1.1090 0.2278 1.1090 1.0531
No log 1.8444 166 0.8613 0.1795 0.8613 0.9280
No log 1.8667 168 0.8339 0.1942 0.8339 0.9132
No log 1.8889 170 0.8409 0.1904 0.8409 0.9170
No log 1.9111 172 0.8796 0.0838 0.8796 0.9379
No log 1.9333 174 0.8745 0.0833 0.8745 0.9352
No log 1.9556 176 0.8445 0.1550 0.8445 0.9190
No log 1.9778 178 0.9423 0.1102 0.9423 0.9707
No log 2.0 180 1.0306 0.1479 1.0306 1.0152
No log 2.0222 182 1.0711 0.1467 1.0711 1.0349
No log 2.0444 184 1.0005 0.0146 1.0005 1.0002
No log 2.0667 186 1.0747 0.2097 1.0747 1.0367
No log 2.0889 188 1.2531 0.0449 1.2531 1.1194
No log 2.1111 190 1.1579 0.0931 1.1579 1.0761
No log 2.1333 192 0.9701 0.1842 0.9701 0.9850
No log 2.1556 194 0.8926 0.2027 0.8926 0.9448
No log 2.1778 196 0.8971 0.2173 0.8971 0.9472
No log 2.2 198 0.8712 0.2662 0.8712 0.9334
No log 2.2222 200 0.8265 0.2852 0.8265 0.9091
No log 2.2444 202 0.8809 0.1857 0.8809 0.9385
No log 2.2667 204 0.9822 0.1557 0.9822 0.9911
No log 2.2889 206 0.9908 0.1869 0.9908 0.9954
No log 2.3111 208 0.8538 0.0955 0.8538 0.9240
No log 2.3333 210 0.7208 0.1981 0.7208 0.8490
No log 2.3556 212 0.6476 0.2317 0.6476 0.8048
No log 2.3778 214 0.6344 0.3213 0.6344 0.7965
No log 2.4 216 0.6234 0.3305 0.6234 0.7896
No log 2.4222 218 0.7082 0.2984 0.7082 0.8415
No log 2.4444 220 0.8272 0.3455 0.8272 0.9095
No log 2.4667 222 0.8079 0.3395 0.8079 0.8989
No log 2.4889 224 0.7037 0.3302 0.7037 0.8389
No log 2.5111 226 0.6739 0.3762 0.6739 0.8209
No log 2.5333 228 0.7943 0.4486 0.7943 0.8913
No log 2.5556 230 0.8405 0.4684 0.8405 0.9168
No log 2.5778 232 0.7878 0.2845 0.7878 0.8876
No log 2.6 234 0.7621 0.3492 0.7621 0.8730
No log 2.6222 236 0.7315 0.3042 0.7315 0.8553
No log 2.6444 238 0.7302 0.3681 0.7302 0.8545
No log 2.6667 240 0.7121 0.3042 0.7121 0.8439
No log 2.6889 242 0.7127 0.2295 0.7127 0.8442
No log 2.7111 244 0.7291 0.2608 0.7291 0.8539
No log 2.7333 246 0.7517 0.3455 0.7517 0.8670
No log 2.7556 248 0.7583 0.3156 0.7583 0.8708
No log 2.7778 250 0.7893 0.4031 0.7893 0.8884
No log 2.8 252 0.8400 0.3572 0.8400 0.9165
No log 2.8222 254 0.8563 0.3572 0.8563 0.9254
No log 2.8444 256 0.8074 0.3052 0.8074 0.8985
No log 2.8667 258 0.7976 0.3226 0.7976 0.8931
No log 2.8889 260 0.8063 0.3043 0.8063 0.8979
No log 2.9111 262 0.8647 0.3378 0.8647 0.9299
No log 2.9333 264 0.8723 0.3331 0.8723 0.9340
No log 2.9556 266 0.8556 0.3131 0.8556 0.9250
No log 2.9778 268 0.8559 0.2715 0.8559 0.9252
No log 3.0 270 0.8409 0.2511 0.8409 0.9170
No log 3.0222 272 0.8445 0.2911 0.8445 0.9190
No log 3.0444 274 0.8580 0.2362 0.8580 0.9263
No log 3.0667 276 0.8321 0.2385 0.8321 0.9122
No log 3.0889 278 0.8000 0.2652 0.8000 0.8944
No log 3.1111 280 0.8216 0.1793 0.8216 0.9064
No log 3.1333 282 0.8707 0.1347 0.8707 0.9331
No log 3.1556 284 0.8871 0.2327 0.8871 0.9419
No log 3.1778 286 0.9131 0.2570 0.9131 0.9556
No log 3.2 288 0.9530 0.2796 0.9530 0.9762
No log 3.2222 290 0.9306 0.2796 0.9306 0.9647
No log 3.2444 292 0.8609 0.1984 0.8609 0.9278
No log 3.2667 294 0.8470 0.2071 0.8470 0.9203
No log 3.2889 296 0.8492 0.1902 0.8492 0.9215
No log 3.3111 298 0.8267 0.1902 0.8267 0.9092
No log 3.3333 300 0.8108 0.2004 0.8108 0.9004
No log 3.3556 302 0.8420 0.2126 0.8420 0.9176
No log 3.3778 304 0.8322 0.1782 0.8322 0.9123
No log 3.4 306 0.8567 0.1782 0.8567 0.9256
No log 3.4222 308 0.9293 0.1053 0.9293 0.9640
No log 3.4444 310 0.9567 0.1210 0.9567 0.9781
No log 3.4667 312 1.0548 0.2099 1.0548 1.0270
No log 3.4889 314 1.1254 0.1913 1.1254 1.0609
No log 3.5111 316 1.1439 0.1498 1.1439 1.0696
No log 3.5333 318 1.1876 0.0898 1.1876 1.0898
No log 3.5556 320 1.1984 0.1254 1.1984 1.0947
No log 3.5778 322 1.0820 0.0615 1.0820 1.0402
No log 3.6 324 0.9712 0.1066 0.9712 0.9855
No log 3.6222 326 0.9357 0.0212 0.9357 0.9673
No log 3.6444 328 0.8844 0.1820 0.8844 0.9404
No log 3.6667 330 0.8462 0.1901 0.8462 0.9199
No log 3.6889 332 0.8364 0.1800 0.8364 0.9145
No log 3.7111 334 0.8534 0.1368 0.8534 0.9238
No log 3.7333 336 0.8535 0.1740 0.8535 0.9239
No log 3.7556 338 0.8615 0.0835 0.8615 0.9282
No log 3.7778 340 0.9008 0.1455 0.9008 0.9491
No log 3.8 342 0.9459 0.1090 0.9459 0.9726
No log 3.8222 344 0.9486 0.0915 0.9486 0.9739
No log 3.8444 346 0.9348 0.0817 0.9348 0.9669
No log 3.8667 348 0.9149 0.1602 0.9149 0.9565
No log 3.8889 350 0.8622 0.1916 0.8622 0.9285
No log 3.9111 352 0.7940 0.2058 0.7940 0.8911
No log 3.9333 354 0.7739 0.2182 0.7739 0.8797
No log 3.9556 356 0.7779 0.2216 0.7779 0.8820
No log 3.9778 358 0.7720 0.2113 0.7720 0.8786
No log 4.0 360 0.7902 0.2058 0.7902 0.8889
No log 4.0222 362 0.8138 0.2366 0.8138 0.9021
No log 4.0444 364 0.8313 0.2038 0.8313 0.9117
No log 4.0667 366 0.8693 0.1137 0.8693 0.9324
No log 4.0889 368 0.9013 0.2335 0.9013 0.9493
No log 4.1111 370 0.8953 0.1479 0.8953 0.9462
No log 4.1333 372 0.8825 0.1649 0.8825 0.9394
No log 4.1556 374 0.8630 0.1603 0.8630 0.9290
No log 4.1778 376 0.8559 0.3452 0.8559 0.9252
No log 4.2 378 0.8303 0.2633 0.8303 0.9112
No log 4.2222 380 0.8129 0.1779 0.8129 0.9016
No log 4.2444 382 0.8035 0.1857 0.8035 0.8964
No log 4.2667 384 0.7918 0.1627 0.7918 0.8898
No log 4.2889 386 0.8000 0.2395 0.8000 0.8944
No log 4.3111 388 0.8196 0.2980 0.8196 0.9053
No log 4.3333 390 0.8303 0.2020 0.8303 0.9112
No log 4.3556 392 0.8906 0.1307 0.8906 0.9437
No log 4.3778 394 0.9365 0.1619 0.9365 0.9677
No log 4.4 396 0.9410 0.2001 0.9410 0.9701
No log 4.4222 398 0.9633 0.2771 0.9633 0.9815
No log 4.4444 400 1.0001 0.3984 1.0001 1.0001
No log 4.4667 402 0.9515 0.3526 0.9515 0.9754
No log 4.4889 404 0.8539 0.2831 0.8539 0.9240
No log 4.5111 406 0.8315 0.1741 0.8315 0.9118
No log 4.5333 408 0.8839 0.2652 0.8839 0.9402
No log 4.5556 410 0.8565 0.2796 0.8565 0.9255
No log 4.5778 412 0.7407 0.2053 0.7407 0.8606
No log 4.6 414 0.7544 0.3051 0.7544 0.8686
No log 4.6222 416 0.8820 0.4133 0.8820 0.9392
No log 4.6444 418 0.9924 0.4536 0.9924 0.9962
No log 4.6667 420 0.9244 0.4479 0.9244 0.9615
No log 4.6889 422 0.7939 0.3780 0.7939 0.8910
No log 4.7111 424 0.7305 0.3628 0.7305 0.8547
No log 4.7333 426 0.8087 0.4219 0.8087 0.8993
No log 4.7556 428 0.7659 0.4321 0.7659 0.8751
No log 4.7778 430 0.6901 0.3902 0.6901 0.8307
No log 4.8 432 0.7057 0.4428 0.7057 0.8401
No log 4.8222 434 0.7857 0.3889 0.7857 0.8864
No log 4.8444 436 0.7849 0.3544 0.7849 0.8859
No log 4.8667 438 0.7480 0.3567 0.7480 0.8649
No log 4.8889 440 0.6696 0.3452 0.6696 0.8183
No log 4.9111 442 0.6246 0.3677 0.6246 0.7903
No log 4.9333 444 0.6288 0.3704 0.6288 0.7930
No log 4.9556 446 0.6419 0.4262 0.6419 0.8012
No log 4.9778 448 0.6552 0.4287 0.6552 0.8094
No log 5.0 450 0.6726 0.4037 0.6726 0.8201
No log 5.0222 452 0.6799 0.3928 0.6799 0.8245
No log 5.0444 454 0.6844 0.3201 0.6844 0.8273
No log 5.0667 456 0.6673 0.3978 0.6673 0.8169
No log 5.0889 458 0.6704 0.4006 0.6704 0.8188
No log 5.1111 460 0.6675 0.3643 0.6675 0.8170
No log 5.1333 462 0.6479 0.3106 0.6479 0.8049
No log 5.1556 464 0.6331 0.3863 0.6331 0.7957
No log 5.1778 466 0.6327 0.3863 0.6327 0.7954
No log 5.2 468 0.6421 0.3863 0.6421 0.8013
No log 5.2222 470 0.6511 0.3995 0.6511 0.8069
No log 5.2444 472 0.6801 0.3078 0.6801 0.8247
No log 5.2667 474 0.7612 0.3732 0.7612 0.8724
No log 5.2889 476 0.7972 0.4334 0.7972 0.8929
No log 5.3111 478 0.7378 0.3958 0.7378 0.8589
No log 5.3333 480 0.7102 0.2563 0.7102 0.8427
No log 5.3556 482 0.7431 0.1982 0.7431 0.8620
No log 5.3778 484 0.7534 0.1982 0.7534 0.8680
No log 5.4 486 0.7157 0.2746 0.7157 0.8460
No log 5.4222 488 0.7230 0.3613 0.7230 0.8503
No log 5.4444 490 0.7400 0.3231 0.7400 0.8602
No log 5.4667 492 0.6999 0.3353 0.6999 0.8366
No log 5.4889 494 0.6682 0.3380 0.6682 0.8175
No log 5.5111 496 0.6768 0.3320 0.6768 0.8227
No log 5.5333 498 0.6773 0.3729 0.6773 0.8230
0.4216 5.5556 500 0.6736 0.3554 0.6736 0.8207
0.4216 5.5778 502 0.7371 0.3526 0.7371 0.8585
0.4216 5.6 504 0.8047 0.4038 0.8047 0.8971
0.4216 5.6222 506 0.7665 0.3955 0.7665 0.8755
0.4216 5.6444 508 0.6850 0.3763 0.6850 0.8277
0.4216 5.6667 510 0.6522 0.2622 0.6522 0.8076
0.4216 5.6889 512 0.6659 0.3042 0.6659 0.8160
0.4216 5.7111 514 0.6727 0.4575 0.6727 0.8202
0.4216 5.7333 516 0.6721 0.3995 0.6721 0.8198
0.4216 5.7556 518 0.6813 0.3817 0.6813 0.8254
0.4216 5.7778 520 0.6833 0.4365 0.6833 0.8266
0.4216 5.8 522 0.6963 0.5317 0.6963 0.8345
0.4216 5.8222 524 0.6841 0.5104 0.6841 0.8271
0.4216 5.8444 526 0.6968 0.5159 0.6968 0.8348
0.4216 5.8667 528 0.6861 0.4315 0.6861 0.8283
0.4216 5.8889 530 0.6872 0.3760 0.6872 0.8290
0.4216 5.9111 532 0.6722 0.4108 0.6722 0.8199
0.4216 5.9333 534 0.6519 0.4524 0.6519 0.8074
0.4216 5.9556 536 0.6575 0.4327 0.6575 0.8108
0.4216 5.9778 538 0.7134 0.4579 0.7134 0.8447
0.4216 6.0 540 0.7364 0.4133 0.7364 0.8581
0.4216 6.0222 542 0.6950 0.4574 0.6950 0.8337
0.4216 6.0444 544 0.6310 0.5324 0.6310 0.7943
0.4216 6.0667 546 0.6040 0.5133 0.6040 0.7772
0.4216 6.0889 548 0.5935 0.4382 0.5935 0.7704
0.4216 6.1111 550 0.6312 0.5195 0.6312 0.7945
0.4216 6.1333 552 0.6814 0.3896 0.6814 0.8255
0.4216 6.1556 554 0.6690 0.4406 0.6690 0.8179
0.4216 6.1778 556 0.6453 0.4992 0.6453 0.8033
0.4216 6.2 558 0.6473 0.4278 0.6473 0.8046
0.4216 6.2222 560 0.6782 0.4089 0.6782 0.8235
0.4216 6.2444 562 0.7040 0.4097 0.7040 0.8390
0.4216 6.2667 564 0.7107 0.4622 0.7107 0.8430
0.4216 6.2889 566 0.7163 0.4270 0.7163 0.8463
0.4216 6.3111 568 0.7221 0.4212 0.7221 0.8498
0.4216 6.3333 570 0.7016 0.3452 0.7016 0.8376
0.4216 6.3556 572 0.6952 0.2360 0.6952 0.8338
0.4216 6.3778 574 0.7047 0.1918 0.7047 0.8395
0.4216 6.4 576 0.7241 0.2953 0.7241 0.8510
0.4216 6.4222 578 0.8070 0.3440 0.8070 0.8984
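
A quick sanity check on the table: epoch 1.0 is logged at step 90, so there were about 90 optimizer steps per epoch; combined with train_batch_size = 8, that implies a training set on the order of 720 examples (an estimate, not a figure reported by the card):

```python
# Epoch 1.0 is logged at step 90, so ~90 optimizer steps per epoch.
steps_per_epoch = 90
train_batch_size = 8

# Rough upper bound on the training-set size (the last batch may be partial).
approx_train_examples = steps_per_epoch * train_batch_size
```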

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (safetensors, F32 tensors)

Full model ID: MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k18_task7_organization, finetuned from aubmindlab/bert-base-arabertv02 (listed among 4019 finetunes of the base model).