ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not documented). It achieves the following results on the evaluation set:

  • Loss: 0.6533
  • QWK: 0.5579
  • MSE: 0.6533
  • RMSE: 0.8082
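QWK (quadratic weighted kappa) measures agreement between predicted and gold ordinal scores, penalizing large disagreements quadratically; RMSE is the root of the mean squared error. A minimal pure-Python sketch of both metrics (the label values in the example are illustrative, not from this model's data):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..n_classes-1."""
    n = len(y_true)
    # Observed agreement matrix
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Expected matrix from the marginal label histograms
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    exp = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
           for i in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * obs[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * exp[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error between two score sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Perfect agreement gives QWK = 1.0
print(quadratic_weighted_kappa([0, 1, 2, 1], [0, 1, 2, 1], 3))  # → 1.0
```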

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
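With lr_scheduler_type: linear, the learning rate decays linearly from its initial value to 0 over the course of training. A minimal sketch of that schedule (assuming zero warmup steps, since none are listed above; the step counts are illustrative):

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_schedule_lr(0, 1000))    # → 2e-05
print(linear_schedule_lr(500, 1000))  # → 1e-05
```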

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0211 2 3.8878 -0.0172 3.8878 1.9717
No log 0.0421 4 2.1197 0.1266 2.1197 1.4559
No log 0.0632 6 1.4797 -0.0267 1.4797 1.2164
No log 0.0842 8 1.0331 0.2288 1.0331 1.0164
No log 0.1053 10 1.1162 0.2944 1.1162 1.0565
No log 0.1263 12 1.0773 0.2865 1.0773 1.0379
No log 0.1474 14 1.0048 0.3519 1.0048 1.0024
No log 0.1684 16 0.9294 0.3583 0.9294 0.9640
No log 0.1895 18 0.9248 0.3561 0.9248 0.9617
No log 0.2105 20 0.9221 0.3646 0.9221 0.9602
No log 0.2316 22 0.9957 0.2857 0.9957 0.9978
No log 0.2526 24 1.4618 0.2803 1.4618 1.2091
No log 0.2737 26 1.6526 0.2742 1.6526 1.2855
No log 0.2947 28 1.2694 0.2667 1.2694 1.1267
No log 0.3158 30 0.9206 0.3143 0.9206 0.9595
No log 0.3368 32 0.8415 0.3837 0.8415 0.9173
No log 0.3579 34 0.9117 0.3883 0.9117 0.9548
No log 0.3789 36 0.9644 0.3629 0.9644 0.9820
No log 0.4 38 0.9809 0.3164 0.9809 0.9904
No log 0.4211 40 0.9453 0.3308 0.9453 0.9723
No log 0.4421 42 0.9269 0.2518 0.9269 0.9627
No log 0.4632 44 0.9438 0.3625 0.9438 0.9715
No log 0.4842 46 0.8691 0.4820 0.8691 0.9322
No log 0.5053 48 1.0037 0.3431 1.0037 1.0018
No log 0.5263 50 1.7812 0.1630 1.7812 1.3346
No log 0.5474 52 1.9082 0.1593 1.9082 1.3814
No log 0.5684 54 1.2591 0.3411 1.2591 1.1221
No log 0.5895 56 0.8387 0.4828 0.8387 0.9158
No log 0.6105 58 0.9062 0.4695 0.9062 0.9519
No log 0.6316 60 0.9663 0.4578 0.9663 0.9830
No log 0.6526 62 0.8854 0.4517 0.8854 0.9409
No log 0.6737 64 0.8050 0.5523 0.8050 0.8972
No log 0.6947 66 0.8164 0.4959 0.8164 0.9035
No log 0.7158 68 0.9288 0.4233 0.9288 0.9637
No log 0.7368 70 0.9111 0.4104 0.9111 0.9545
No log 0.7579 72 0.7187 0.5961 0.7187 0.8477
No log 0.7789 74 0.6712 0.5735 0.6712 0.8193
No log 0.8 76 0.6963 0.5480 0.6963 0.8345
No log 0.8211 78 0.7482 0.5260 0.7482 0.8650
No log 0.8421 80 0.7119 0.5832 0.7119 0.8438
No log 0.8632 82 0.7737 0.5769 0.7737 0.8796
No log 0.8842 84 0.8216 0.5489 0.8216 0.9064
No log 0.9053 86 0.8944 0.5671 0.8944 0.9457
No log 0.9263 88 0.9017 0.5744 0.9017 0.9496
No log 0.9474 90 0.8359 0.5917 0.8359 0.9143
No log 0.9684 92 0.7722 0.5680 0.7722 0.8787
No log 0.9895 94 0.7784 0.6046 0.7784 0.8823
No log 1.0105 96 0.7992 0.5893 0.7992 0.8940
No log 1.0316 98 0.7506 0.5949 0.7506 0.8664
No log 1.0526 100 0.6970 0.5439 0.6970 0.8349
No log 1.0737 102 0.6824 0.5066 0.6824 0.8261
No log 1.0947 104 0.6653 0.6022 0.6653 0.8156
No log 1.1158 106 0.8143 0.6640 0.8143 0.9024
No log 1.1368 108 0.9773 0.5543 0.9773 0.9886
No log 1.1579 110 0.8856 0.5956 0.8856 0.9411
No log 1.1789 112 0.6787 0.6724 0.6787 0.8238
No log 1.2 114 0.6127 0.6125 0.6127 0.7828
No log 1.2211 116 0.5913 0.6102 0.5913 0.7689
No log 1.2421 118 0.5910 0.6685 0.5910 0.7687
No log 1.2632 120 0.5799 0.6519 0.5799 0.7615
No log 1.2842 122 0.6040 0.6766 0.6040 0.7772
No log 1.3053 124 0.5820 0.6639 0.5820 0.7629
No log 1.3263 126 0.6125 0.6455 0.6125 0.7826
No log 1.3474 128 0.7026 0.6238 0.7026 0.8382
No log 1.3684 130 0.7485 0.6731 0.7485 0.8651
No log 1.3895 132 0.7067 0.6897 0.7067 0.8407
No log 1.4105 134 0.6204 0.6852 0.6204 0.7877
No log 1.4316 136 0.5575 0.5742 0.5575 0.7467
No log 1.4526 138 0.5991 0.6302 0.5991 0.7740
No log 1.4737 140 0.5776 0.6035 0.5776 0.7600
No log 1.4947 142 0.6134 0.6692 0.6134 0.7832
No log 1.5158 144 0.6538 0.6751 0.6538 0.8086
No log 1.5368 146 0.6311 0.6988 0.6311 0.7944
No log 1.5579 148 0.5874 0.6459 0.5874 0.7664
No log 1.5789 150 0.5548 0.6356 0.5548 0.7448
No log 1.6 152 0.5932 0.6626 0.5932 0.7702
No log 1.6211 154 0.6156 0.6867 0.6156 0.7846
No log 1.6421 156 0.6376 0.6900 0.6376 0.7985
No log 1.6632 158 0.5900 0.6518 0.5900 0.7681
No log 1.6842 160 0.6416 0.6599 0.6416 0.8010
No log 1.7053 162 0.6472 0.6470 0.6472 0.8045
No log 1.7263 164 0.5741 0.6667 0.5741 0.7577
No log 1.7474 166 0.6198 0.6412 0.6198 0.7873
No log 1.7684 168 0.7316 0.6199 0.7316 0.8553
No log 1.7895 170 0.6742 0.6301 0.6742 0.8211
No log 1.8105 172 0.6061 0.6028 0.6061 0.7785
No log 1.8316 174 0.5636 0.6545 0.5636 0.7507
No log 1.8526 176 0.5649 0.6578 0.5649 0.7516
No log 1.8737 178 0.5744 0.6743 0.5744 0.7579
No log 1.8947 180 0.5874 0.7048 0.5874 0.7664
No log 1.9158 182 0.5879 0.6877 0.5879 0.7667
No log 1.9368 184 0.5879 0.6833 0.5879 0.7667
No log 1.9579 186 0.6152 0.7144 0.6152 0.7843
No log 1.9789 188 0.6098 0.6981 0.6098 0.7809
No log 2.0 190 0.5754 0.7184 0.5754 0.7585
No log 2.0211 192 0.5352 0.6704 0.5352 0.7316
No log 2.0421 194 0.5745 0.7027 0.5745 0.7579
No log 2.0632 196 0.6462 0.6554 0.6462 0.8039
No log 2.0842 198 0.6697 0.6120 0.6697 0.8184
No log 2.1053 200 0.6014 0.6628 0.6014 0.7755
No log 2.1263 202 0.5472 0.6812 0.5472 0.7397
No log 2.1474 204 0.5398 0.6981 0.5398 0.7347
No log 2.1684 206 0.5347 0.7279 0.5347 0.7312
No log 2.1895 208 0.5194 0.7216 0.5194 0.7207
No log 2.2105 210 0.5148 0.7320 0.5148 0.7175
No log 2.2316 212 0.5397 0.7180 0.5397 0.7347
No log 2.2526 214 0.5512 0.7083 0.5512 0.7424
No log 2.2737 216 0.6078 0.6828 0.6078 0.7796
No log 2.2947 218 0.6092 0.6775 0.6092 0.7805
No log 2.3158 220 0.6017 0.6324 0.6017 0.7757
No log 2.3368 222 0.6682 0.6403 0.6682 0.8174
No log 2.3579 224 0.7419 0.6436 0.7419 0.8613
No log 2.3789 226 0.6821 0.6253 0.6821 0.8259
No log 2.4 228 0.5920 0.6051 0.5920 0.7694
No log 2.4211 230 0.5816 0.6562 0.5816 0.7626
No log 2.4421 232 0.5854 0.6429 0.5854 0.7651
No log 2.4632 234 0.6317 0.5988 0.6317 0.7948
No log 2.4842 236 0.7165 0.5895 0.7165 0.8464
No log 2.5053 238 0.6806 0.6269 0.6806 0.8250
No log 2.5263 240 0.5810 0.6157 0.5810 0.7622
No log 2.5474 242 0.5820 0.6330 0.5820 0.7629
No log 2.5684 244 0.6124 0.5554 0.6124 0.7826
No log 2.5895 246 0.6242 0.5770 0.6242 0.7900
No log 2.6105 248 0.6023 0.6473 0.6023 0.7761
No log 2.6316 250 0.5962 0.6399 0.5962 0.7721
No log 2.6526 252 0.5711 0.6843 0.5711 0.7557
No log 2.6737 254 0.5540 0.7083 0.5540 0.7443
No log 2.6947 256 0.5459 0.7083 0.5459 0.7388
No log 2.7158 258 0.5356 0.7318 0.5356 0.7318
No log 2.7368 260 0.5244 0.7501 0.5244 0.7241
No log 2.7579 262 0.5183 0.7189 0.5183 0.7199
No log 2.7789 264 0.5408 0.6736 0.5408 0.7354
No log 2.8 266 0.5723 0.6781 0.5723 0.7565
No log 2.8211 268 0.5631 0.6605 0.5631 0.7504
No log 2.8421 270 0.5412 0.7119 0.5412 0.7357
No log 2.8632 272 0.4984 0.7305 0.4984 0.7060
No log 2.8842 274 0.4878 0.7314 0.4878 0.6984
No log 2.9053 276 0.5185 0.7398 0.5185 0.7200
No log 2.9263 278 0.5765 0.7020 0.5765 0.7593
No log 2.9474 280 0.5671 0.7141 0.5671 0.7530
No log 2.9684 282 0.5548 0.6805 0.5548 0.7448
No log 2.9895 284 0.5914 0.6849 0.5914 0.7690
No log 3.0105 286 0.6614 0.5777 0.6614 0.8133
No log 3.0316 288 0.7040 0.5455 0.7040 0.8390
No log 3.0526 290 0.8083 0.5704 0.8083 0.8991
No log 3.0737 292 0.7866 0.5759 0.7866 0.8869
No log 3.0947 294 0.7487 0.5948 0.7487 0.8652
No log 3.1158 296 0.6771 0.5862 0.6771 0.8229
No log 3.1368 298 0.6439 0.5776 0.6439 0.8024
No log 3.1579 300 0.6805 0.5634 0.6805 0.8249
No log 3.1789 302 0.6570 0.5543 0.6570 0.8105
No log 3.2 304 0.6091 0.6421 0.6091 0.7804
No log 3.2211 306 0.5980 0.6528 0.5980 0.7733
No log 3.2421 308 0.6725 0.6371 0.6725 0.8200
No log 3.2632 310 0.7828 0.5495 0.7828 0.8848
No log 3.2842 312 0.7761 0.5198 0.7761 0.8810
No log 3.3053 314 0.6841 0.5766 0.6841 0.8271
No log 3.3263 316 0.6327 0.5948 0.6327 0.7954
No log 3.3474 318 0.6538 0.6564 0.6538 0.8086
No log 3.3684 320 0.6576 0.6564 0.6576 0.8109
No log 3.3895 322 0.6877 0.6753 0.6877 0.8293
No log 3.4105 324 0.6616 0.6640 0.6616 0.8134
No log 3.4316 326 0.7597 0.5769 0.7597 0.8716
No log 3.4526 328 0.7098 0.5963 0.7098 0.8425
No log 3.4737 330 0.6021 0.6365 0.6021 0.7760
No log 3.4947 332 0.6091 0.6097 0.6091 0.7804
No log 3.5158 334 0.6204 0.5626 0.6204 0.7876
No log 3.5368 336 0.6614 0.5821 0.6614 0.8133
No log 3.5579 338 0.6837 0.5677 0.6837 0.8268
No log 3.5789 340 0.6309 0.5776 0.6309 0.7943
No log 3.6 342 0.6150 0.7028 0.6150 0.7842
No log 3.6211 344 0.6085 0.7265 0.6085 0.7801
No log 3.6421 346 0.6103 0.7124 0.6103 0.7812
No log 3.6632 348 0.6387 0.6556 0.6387 0.7992
No log 3.6842 350 0.6024 0.6482 0.6024 0.7761
No log 3.7053 352 0.5739 0.6221 0.5739 0.7576
No log 3.7263 354 0.5625 0.5960 0.5625 0.7500
No log 3.7474 356 0.5572 0.6065 0.5572 0.7464
No log 3.7684 358 0.5579 0.6415 0.5579 0.7470
No log 3.7895 360 0.6144 0.5875 0.6144 0.7839
No log 3.8105 362 0.7103 0.6263 0.7103 0.8428
No log 3.8316 364 0.7617 0.5849 0.7617 0.8728
No log 3.8526 366 0.6813 0.6123 0.6813 0.8254
No log 3.8737 368 0.5977 0.5730 0.5977 0.7731
No log 3.8947 370 0.5824 0.6415 0.5824 0.7631
No log 3.9158 372 0.6284 0.5719 0.6284 0.7927
No log 3.9368 374 0.7763 0.5861 0.7763 0.8811
No log 3.9579 376 0.8683 0.5526 0.8683 0.9318
No log 3.9789 378 0.8511 0.5541 0.8511 0.9226
No log 4.0 380 0.7707 0.5110 0.7707 0.8779
No log 4.0211 382 0.6694 0.5970 0.6694 0.8182
No log 4.0421 384 0.5965 0.5810 0.5965 0.7723
No log 4.0632 386 0.5887 0.5882 0.5887 0.7673
No log 4.0842 388 0.5731 0.6354 0.5731 0.7570
No log 4.1053 390 0.6079 0.5622 0.6079 0.7797
No log 4.1263 392 0.6360 0.5666 0.6360 0.7975
No log 4.1474 394 0.6240 0.6141 0.6240 0.7900
No log 4.1684 396 0.5423 0.6835 0.5423 0.7364
No log 4.1895 398 0.5070 0.7064 0.5070 0.7120
No log 4.2105 400 0.5184 0.7124 0.5184 0.7200
No log 4.2316 402 0.5396 0.6919 0.5396 0.7346
No log 4.2526 404 0.5273 0.7158 0.5273 0.7261
No log 4.2737 406 0.5447 0.6820 0.5447 0.7381
No log 4.2947 408 0.5644 0.6545 0.5644 0.7512
No log 4.3158 410 0.5585 0.6903 0.5585 0.7473
No log 4.3368 412 0.5702 0.6365 0.5702 0.7551
No log 4.3579 414 0.5849 0.6243 0.5849 0.7648
No log 4.3789 416 0.6017 0.6560 0.6017 0.7757
No log 4.4 418 0.5974 0.6536 0.5974 0.7729
No log 4.4211 420 0.5706 0.6736 0.5706 0.7554
No log 4.4421 422 0.5678 0.6888 0.5678 0.7535
No log 4.4632 424 0.5452 0.6717 0.5452 0.7384
No log 4.4842 426 0.5519 0.6476 0.5519 0.7429
No log 4.5053 428 0.5809 0.6012 0.5809 0.7622
No log 4.5263 430 0.5737 0.6028 0.5737 0.7575
No log 4.5474 432 0.5678 0.6335 0.5678 0.7535
No log 4.5684 434 0.5670 0.6370 0.5670 0.7530
No log 4.5895 436 0.5803 0.6335 0.5803 0.7618
No log 4.6105 438 0.6320 0.5566 0.6320 0.7950
No log 4.6316 440 0.5985 0.5688 0.5985 0.7736
No log 4.6526 442 0.5302 0.6853 0.5302 0.7282
No log 4.6737 444 0.4971 0.7064 0.4971 0.7051
No log 4.6947 446 0.4973 0.6935 0.4973 0.7052
No log 4.7158 448 0.5041 0.7054 0.5041 0.7100
No log 4.7368 450 0.5150 0.7294 0.5150 0.7176
No log 4.7579 452 0.5173 0.6877 0.5173 0.7192
No log 4.7789 454 0.5255 0.6680 0.5255 0.7249
No log 4.8 456 0.5260 0.6680 0.5260 0.7252
No log 4.8211 458 0.5206 0.6501 0.5206 0.7216
No log 4.8421 460 0.5250 0.6835 0.5250 0.7246
No log 4.8632 462 0.5130 0.7003 0.5130 0.7163
No log 4.8842 464 0.5341 0.6406 0.5341 0.7308
No log 4.9053 466 0.5450 0.6188 0.5450 0.7382
No log 4.9263 468 0.5599 0.6520 0.5599 0.7483
No log 4.9474 470 0.5976 0.6225 0.5976 0.7730
No log 4.9684 472 0.5526 0.6476 0.5526 0.7434
No log 4.9895 474 0.5349 0.6756 0.5349 0.7313
No log 5.0105 476 0.5444 0.6938 0.5444 0.7378
No log 5.0316 478 0.5439 0.6568 0.5439 0.7375
No log 5.0526 480 0.5801 0.6753 0.5801 0.7616
No log 5.0737 482 0.7030 0.6260 0.7030 0.8384
No log 5.0947 484 0.7308 0.6439 0.7308 0.8549
No log 5.1158 486 0.6374 0.5930 0.6374 0.7984
No log 5.1368 488 0.5605 0.6476 0.5605 0.7487
No log 5.1579 490 0.5499 0.6602 0.5499 0.7415
No log 5.1789 492 0.5551 0.6572 0.5551 0.7450
No log 5.2 494 0.5499 0.6820 0.5499 0.7416
No log 5.2211 496 0.5854 0.6169 0.5854 0.7651
No log 5.2421 498 0.6682 0.6115 0.6682 0.8174
0.3146 5.2632 500 0.7250 0.6235 0.7250 0.8515
0.3146 5.2842 502 0.6909 0.6301 0.6909 0.8312
0.3146 5.3053 504 0.6623 0.6309 0.6623 0.8138
0.3146 5.3263 506 0.6058 0.5766 0.6058 0.7784
0.3146 5.3474 508 0.5627 0.6501 0.5627 0.7501
0.3146 5.3684 510 0.5597 0.6501 0.5597 0.7481
0.3146 5.3895 512 0.5917 0.5953 0.5917 0.7692
0.3146 5.4105 514 0.6606 0.6019 0.6606 0.8127
0.3146 5.4316 516 0.7507 0.6260 0.7507 0.8664
0.3146 5.4526 518 0.8735 0.5965 0.8735 0.9346
0.3146 5.4737 520 0.8611 0.6011 0.8611 0.9280
0.3146 5.4947 522 0.7846 0.4987 0.7846 0.8857
0.3146 5.5158 524 0.6931 0.5346 0.6931 0.8325
0.3146 5.5368 526 0.6533 0.5579 0.6533 0.8082
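In the table, Loss and MSE are identical at every step, which suggests the model was trained as a regressor with a mean-squared-error objective; RMSE is simply the square root of MSE. A quick sanity check on the final evaluation row:

```python
import math

mse = 0.6533            # final validation MSE from the table above
rmse = math.sqrt(mse)
# Matches the reported RMSE of 0.8082 up to rounding of the displayed MSE
assert abs(rmse - 0.8082) < 1e-3
```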

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task5_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02