ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k18_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8554
  • QWK: 0.3577
  • MSE: 0.8554
  • RMSE: 0.9249
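The QWK figure is the quadratic weighted kappa between predicted and gold scores, and RMSE is the square root of MSE (0.9249² ≈ 0.8554, consistent with the numbers above). As a self-contained sketch, these two metrics are typically computed as follows (pure Python; the label values are illustrative, not from this model's dataset):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa between two integer ratings in [0, n_classes)."""
    n = len(y_true)
    # Observed rating co-occurrence matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under independence of the two raters.
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic disagreement weights: zero on the diagonal, growing with distance.
    w = [[((i - j) ** 2) / ((n_classes - 1) ** 2) for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * observed[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * expected[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error between two score lists."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Perfect agreement gives QWK = 1; chance-level agreement gives QWK = 0.
print(quadratic_weighted_kappa([0, 1, 2], [0, 1, 2], 3))       # → 1.0
print(quadratic_weighted_kappa([0, 0, 1, 1], [0, 1, 0, 1], 2))  # → 0.0
```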

Model description

More information needed

Intended uses & limitations

More information needed
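Since the card does not document the task head or label mapping, the following is only a hypothetical usage sketch: it assumes the checkpoint loads as a sequence-classification model producing a single essay-organization score, which is an assumption, not the author's confirmed setup.

```python
# Hypothetical usage sketch -- the head type and label mapping are assumptions.
REPO_ID = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
           "FineTuningAraBERT_run3_AugV5_k18_task2_organization")

def score_essay(text: str, repo_id: str = REPO_ID) -> float:
    """Download the checkpoint and return its raw score for one essay.

    Imports are deferred so this file can be read/imported without
    transformers and torch installed; calling the function requires both.
    """
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSequenceClassification.from_pretrained(repo_id)
    model.eval()
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumes a single-output (regression-style) head; adapt if the
    # checkpoint actually exposes multiple class logits.
    return logits.squeeze().item()
```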

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
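With lr_scheduler_type linear and no warmup configured, the learning rate decays linearly from 2e-05 to zero over the planned total number of optimizer steps. A small sketch of that schedule (the total-step count below is derived from the log's 102 steps per epoch times the configured 100 epochs; treat it as illustrative):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05,
              warmup_steps: int = 0) -> float:
    """Linear warmup (zero here) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total = 10200  # 102 optimizer steps per epoch * 100 configured epochs
print(linear_lr(0, total))           # base_lr at the start
print(linear_lr(total // 2, total))  # half of base_lr midway
print(linear_lr(total, total))       # zero at the end
```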

Training results

Although num_epochs was set to 100, the log below ends at epoch 5.24 (step 534), so training appears to have been stopped early; the evaluation metrics reported at the top correspond to this last logged step.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0196 2 4.6204 0.0144 4.6204 2.1495
No log 0.0392 4 2.9896 0.0005 2.9896 1.7290
No log 0.0588 6 1.6829 0.0198 1.6829 1.2973
No log 0.0784 8 1.2900 0.0205 1.2900 1.1358
No log 0.0980 10 1.2350 0.1176 1.2350 1.1113
No log 0.1176 12 1.1748 0.1521 1.1748 1.0839
No log 0.1373 14 1.1659 0.3510 1.1659 1.0798
No log 0.1569 16 1.1843 0.2333 1.1843 1.0883
No log 0.1765 18 1.1389 0.2781 1.1389 1.0672
No log 0.1961 20 1.1126 0.3095 1.1126 1.0548
No log 0.2157 22 1.4269 0.1224 1.4269 1.1945
No log 0.2353 24 1.4780 0.1387 1.4780 1.2157
No log 0.2549 26 1.1061 0.2572 1.1061 1.0517
No log 0.2745 28 1.0697 0.3525 1.0697 1.0343
No log 0.2941 30 1.1101 0.2486 1.1101 1.0536
No log 0.3137 32 1.0913 0.2843 1.0913 1.0447
No log 0.3333 34 1.2261 0.2292 1.2261 1.1073
No log 0.3529 36 1.5355 0.2568 1.5355 1.2391
No log 0.3725 38 1.5865 0.2413 1.5865 1.2595
No log 0.3922 40 1.3248 0.3188 1.3248 1.1510
No log 0.4118 42 1.1968 0.3440 1.1968 1.0940
No log 0.4314 44 1.1626 0.4279 1.1626 1.0782
No log 0.4510 46 1.1182 0.3443 1.1182 1.0574
No log 0.4706 48 1.0861 0.4126 1.0861 1.0422
No log 0.4902 50 1.0315 0.4462 1.0315 1.0156
No log 0.5098 52 1.1136 0.3458 1.1136 1.0553
No log 0.5294 54 1.4072 0.3053 1.4072 1.1862
No log 0.5490 56 1.2467 0.3058 1.2467 1.1166
No log 0.5686 58 0.9509 0.5021 0.9509 0.9752
No log 0.5882 60 1.0805 0.4953 1.0805 1.0395
No log 0.6078 62 1.2237 0.5440 1.2237 1.1062
No log 0.6275 64 0.9658 0.6054 0.9658 0.9828
No log 0.6471 66 0.9080 0.5049 0.9080 0.9529
No log 0.6667 68 1.1809 0.3259 1.1809 1.0867
No log 0.6863 70 1.3410 0.3208 1.3410 1.1580
No log 0.7059 72 1.1727 0.3002 1.1727 1.0829
No log 0.7255 74 0.8710 0.5127 0.8710 0.9333
No log 0.7451 76 0.9508 0.4927 0.9508 0.9751
No log 0.7647 78 1.0453 0.4339 1.0453 1.0224
No log 0.7843 80 0.9824 0.4410 0.9824 0.9912
No log 0.8039 82 0.8669 0.5073 0.8669 0.9311
No log 0.8235 84 0.8179 0.5699 0.8179 0.9044
No log 0.8431 86 0.8438 0.5404 0.8438 0.9186
No log 0.8627 88 0.9826 0.5055 0.9826 0.9913
No log 0.8824 90 0.9287 0.5129 0.9287 0.9637
No log 0.9020 92 0.7637 0.6175 0.7637 0.8739
No log 0.9216 94 0.7498 0.6725 0.7498 0.8659
No log 0.9412 96 0.7650 0.6771 0.7650 0.8746
No log 0.9608 98 0.7164 0.6996 0.7164 0.8464
No log 0.9804 100 0.6996 0.6824 0.6996 0.8364
No log 1.0 102 0.7931 0.5451 0.7931 0.8906
No log 1.0196 104 0.9259 0.4833 0.9259 0.9622
No log 1.0392 106 0.7563 0.5821 0.7563 0.8697
No log 1.0588 108 0.7046 0.6629 0.7046 0.8394
No log 1.0784 110 0.8580 0.5398 0.8580 0.9263
No log 1.0980 112 0.9565 0.5475 0.9565 0.9780
No log 1.1176 114 0.7736 0.6662 0.7736 0.8795
No log 1.1373 116 0.7376 0.6445 0.7376 0.8589
No log 1.1569 118 0.7856 0.6115 0.7856 0.8863
No log 1.1765 120 0.6954 0.6396 0.6954 0.8339
No log 1.1961 122 0.6875 0.6664 0.6875 0.8292
No log 1.2157 124 0.6876 0.6352 0.6876 0.8292
No log 1.2353 126 0.6888 0.6352 0.6888 0.8299
No log 1.2549 128 0.6881 0.6402 0.6881 0.8295
No log 1.2745 130 0.6927 0.6277 0.6927 0.8323
No log 1.2941 132 0.7463 0.6252 0.7463 0.8639
No log 1.3137 134 0.7670 0.6012 0.7670 0.8758
No log 1.3333 136 0.9614 0.5657 0.9614 0.9805
No log 1.3529 138 0.9277 0.5585 0.9277 0.9632
No log 1.3725 140 0.7803 0.5802 0.7803 0.8834
No log 1.3922 142 0.8226 0.5458 0.8226 0.9070
No log 1.4118 144 0.8101 0.5458 0.8101 0.9001
No log 1.4314 146 0.7551 0.5885 0.7551 0.8690
No log 1.4510 148 0.7624 0.5547 0.7624 0.8732
No log 1.4706 150 0.7562 0.5498 0.7562 0.8696
No log 1.4902 152 0.7361 0.5729 0.7361 0.8580
No log 1.5098 154 0.7286 0.6059 0.7286 0.8536
No log 1.5294 156 0.7338 0.5898 0.7338 0.8566
No log 1.5490 158 0.8126 0.5577 0.8126 0.9014
No log 1.5686 160 0.8004 0.5710 0.8004 0.8946
No log 1.5882 162 0.7641 0.5662 0.7641 0.8741
No log 1.6078 164 0.7908 0.5296 0.7908 0.8893
No log 1.6275 166 0.9003 0.4867 0.9003 0.9489
No log 1.6471 168 0.8449 0.5075 0.8449 0.9192
No log 1.6667 170 0.7844 0.4792 0.7844 0.8856
No log 1.6863 172 0.9004 0.4661 0.9004 0.9489
No log 1.7059 174 1.1120 0.4731 1.1120 1.0545
No log 1.7255 176 1.0008 0.5159 1.0008 1.0004
No log 1.7451 178 0.7708 0.5686 0.7708 0.8780
No log 1.7647 180 0.8104 0.5312 0.8104 0.9002
No log 1.7843 182 0.8643 0.5236 0.8643 0.9297
No log 1.8039 184 0.7834 0.5642 0.7834 0.8851
No log 1.8235 186 0.8809 0.6092 0.8809 0.9386
No log 1.8431 188 1.0137 0.5698 1.0137 1.0068
No log 1.8627 190 0.8852 0.6102 0.8852 0.9409
No log 1.8824 192 0.7535 0.5971 0.7535 0.8680
No log 1.9020 194 0.7278 0.5830 0.7278 0.8531
No log 1.9216 196 0.7669 0.5593 0.7669 0.8757
No log 1.9412 198 0.9546 0.4748 0.9546 0.9770
No log 1.9608 200 0.9448 0.4833 0.9448 0.9720
No log 1.9804 202 0.8405 0.4861 0.8405 0.9168
No log 2.0 204 0.7723 0.5334 0.7723 0.8788
No log 2.0196 206 0.8033 0.6489 0.8033 0.8963
No log 2.0392 208 0.9542 0.4645 0.9542 0.9768
No log 2.0588 210 0.9860 0.4596 0.9860 0.9930
No log 2.0784 212 0.8711 0.4293 0.8711 0.9333
No log 2.0980 214 0.8251 0.5508 0.8251 0.9083
No log 2.1176 216 0.8761 0.5138 0.8761 0.9360
No log 2.1373 218 0.9382 0.4794 0.9382 0.9686
No log 2.1569 220 0.8839 0.4877 0.8839 0.9402
No log 2.1765 222 0.8453 0.4848 0.8453 0.9194
No log 2.1961 224 0.8347 0.5495 0.8347 0.9136
No log 2.2157 226 0.8240 0.5759 0.8240 0.9077
No log 2.2353 228 0.8893 0.5414 0.8893 0.9430
No log 2.2549 230 1.1656 0.4856 1.1656 1.0796
No log 2.2745 232 1.2346 0.4822 1.2346 1.1111
No log 2.2941 234 1.0498 0.4557 1.0498 1.0246
No log 2.3137 236 0.8615 0.4470 0.8615 0.9282
No log 2.3333 238 0.8250 0.4681 0.8250 0.9083
No log 2.3529 240 0.8169 0.5538 0.8169 0.9038
No log 2.3725 242 0.8124 0.5510 0.8124 0.9014
No log 2.3922 244 0.8510 0.5519 0.8510 0.9225
No log 2.4118 246 0.8238 0.5635 0.8238 0.9076
No log 2.4314 248 0.7762 0.5198 0.7762 0.8810
No log 2.4510 250 0.7958 0.5137 0.7958 0.8921
No log 2.4706 252 0.8348 0.6151 0.8348 0.9136
No log 2.4902 254 0.8874 0.5792 0.8874 0.9420
No log 2.5098 256 0.8598 0.5792 0.8598 0.9273
No log 2.5294 258 0.8391 0.5519 0.8391 0.9160
No log 2.5490 260 0.8172 0.5259 0.8172 0.9040
No log 2.5686 262 0.8428 0.5291 0.8428 0.9180
No log 2.5882 264 0.8046 0.5688 0.8046 0.8970
No log 2.6078 266 0.8116 0.5411 0.8116 0.9009
No log 2.6275 268 0.9131 0.5872 0.9131 0.9555
No log 2.6471 270 0.8831 0.5556 0.8831 0.9397
No log 2.6667 272 0.8215 0.5992 0.8215 0.9064
No log 2.6863 274 0.8054 0.5131 0.8054 0.8974
No log 2.7059 276 0.8179 0.5027 0.8179 0.9044
No log 2.7255 278 0.8612 0.3787 0.8612 0.9280
No log 2.7451 280 0.8725 0.3590 0.8725 0.9341
No log 2.7647 282 0.8423 0.3363 0.8423 0.9178
No log 2.7843 284 0.8014 0.4042 0.8014 0.8952
No log 2.8039 286 0.7627 0.4645 0.7627 0.8733
No log 2.8235 288 0.7394 0.5296 0.7394 0.8599
No log 2.8431 290 0.7268 0.5705 0.7268 0.8526
No log 2.8627 292 0.7226 0.5634 0.7226 0.8500
No log 2.8824 294 0.7706 0.5060 0.7706 0.8778
No log 2.9020 296 0.7602 0.5057 0.7602 0.8719
No log 2.9216 298 0.7229 0.6307 0.7229 0.8502
No log 2.9412 300 0.8277 0.5844 0.8277 0.9098
No log 2.9608 302 0.8630 0.5716 0.8630 0.9290
No log 2.9804 304 0.8023 0.5670 0.8023 0.8957
No log 3.0 306 0.8313 0.5106 0.8313 0.9117
No log 3.0196 308 0.8957 0.5087 0.8957 0.9464
No log 3.0392 310 0.8571 0.5307 0.8571 0.9258
No log 3.0588 312 0.8292 0.5510 0.8292 0.9106
No log 3.0784 314 0.8519 0.4527 0.8519 0.9230
No log 3.0980 316 0.9554 0.3954 0.9554 0.9774
No log 3.1176 318 0.9706 0.4043 0.9706 0.9852
No log 3.1373 320 0.8450 0.4521 0.8450 0.9193
No log 3.1569 322 0.8174 0.5661 0.8174 0.9041
No log 3.1765 324 0.8743 0.4376 0.8743 0.9350
No log 3.1961 326 0.8513 0.5007 0.8513 0.9227
No log 3.2157 328 0.8925 0.4316 0.8925 0.9447
No log 3.2353 330 0.7953 0.5443 0.7953 0.8918
No log 3.2549 332 0.7808 0.5073 0.7808 0.8836
No log 3.2745 334 0.8960 0.5071 0.8960 0.9466
No log 3.2941 336 0.9778 0.4714 0.9778 0.9888
No log 3.3137 338 0.9459 0.4533 0.9459 0.9726
No log 3.3333 340 0.8513 0.4579 0.8513 0.9227
No log 3.3529 342 0.8020 0.4965 0.8020 0.8956
No log 3.3725 344 0.8194 0.5208 0.8194 0.9052
No log 3.3922 346 0.8307 0.5208 0.8307 0.9114
No log 3.4118 348 0.8201 0.5348 0.8201 0.9056
No log 3.4314 350 0.8413 0.6129 0.8413 0.9172
No log 3.4510 352 0.9125 0.6034 0.9125 0.9552
No log 3.4706 354 0.9370 0.5851 0.9370 0.9680
No log 3.4902 356 0.9342 0.5384 0.9342 0.9665
No log 3.5098 358 0.8775 0.4722 0.8775 0.9368
No log 3.5294 360 0.8190 0.4278 0.8190 0.9050
No log 3.5490 362 0.8015 0.4646 0.8015 0.8953
No log 3.5686 364 0.7964 0.5233 0.7964 0.8924
No log 3.5882 366 0.8238 0.4907 0.8238 0.9076
No log 3.6078 368 0.8076 0.5428 0.8076 0.8987
No log 3.6275 370 0.7806 0.5738 0.7806 0.8835
No log 3.6471 372 0.8619 0.4860 0.8619 0.9284
No log 3.6667 374 0.8520 0.5540 0.8520 0.9230
No log 3.6863 376 0.7902 0.5738 0.7902 0.8889
No log 3.7059 378 0.8061 0.4572 0.8061 0.8978
No log 3.7255 380 0.8225 0.4236 0.8225 0.9069
No log 3.7451 382 0.9044 0.4681 0.9044 0.9510
No log 3.7647 384 0.9291 0.4681 0.9291 0.9639
No log 3.7843 386 0.8897 0.4031 0.8897 0.9432
No log 3.8039 388 0.8030 0.4587 0.8030 0.8961
No log 3.8235 390 0.7439 0.5312 0.7439 0.8625
No log 3.8431 392 0.7584 0.5245 0.7584 0.8708
No log 3.8627 394 0.7489 0.5633 0.7489 0.8654
No log 3.8824 396 0.7682 0.4778 0.7682 0.8765
No log 3.9020 398 0.8700 0.5385 0.8700 0.9327
No log 3.9216 400 0.9640 0.5379 0.9640 0.9818
No log 3.9412 402 0.9724 0.5309 0.9724 0.9861
No log 3.9608 404 0.8785 0.4074 0.8785 0.9373
No log 3.9804 406 0.7957 0.4105 0.7957 0.8920
No log 4.0 408 0.7729 0.4465 0.7729 0.8791
No log 4.0196 410 0.7597 0.5013 0.7597 0.8716
No log 4.0392 412 0.7162 0.5555 0.7162 0.8463
No log 4.0588 414 0.6966 0.6129 0.6966 0.8346
No log 4.0784 416 0.7008 0.6227 0.7008 0.8371
No log 4.0980 418 0.7094 0.6860 0.7094 0.8423
No log 4.1176 420 0.8491 0.5066 0.8491 0.9215
No log 4.1373 422 1.0418 0.5584 1.0418 1.0207
No log 4.1569 424 1.0330 0.4733 1.0330 1.0164
No log 4.1765 426 0.9341 0.4166 0.9341 0.9665
No log 4.1961 428 0.8621 0.4331 0.8621 0.9285
No log 4.2157 430 0.7760 0.5184 0.7760 0.8809
No log 4.2353 432 0.7405 0.6265 0.7405 0.8605
No log 4.2549 434 0.7674 0.5902 0.7674 0.8760
No log 4.2745 436 0.8198 0.5926 0.8198 0.9054
No log 4.2941 438 0.8063 0.5714 0.8063 0.8979
No log 4.3137 440 0.7578 0.5661 0.7578 0.8705
No log 4.3333 442 0.8443 0.5554 0.8443 0.9188
No log 4.3529 444 0.8368 0.5098 0.8368 0.9147
No log 4.3725 446 0.8218 0.4297 0.8218 0.9066
No log 4.3922 448 0.8627 0.4072 0.8627 0.9288
No log 4.4118 450 0.9267 0.4375 0.9267 0.9626
No log 4.4314 452 0.9745 0.5109 0.9745 0.9872
No log 4.4510 454 0.9423 0.5304 0.9423 0.9707
No log 4.4706 456 0.8382 0.5472 0.8382 0.9155
No log 4.4902 458 0.7694 0.5521 0.7694 0.8772
No log 4.5098 460 0.7641 0.5521 0.7641 0.8741
No log 4.5294 462 0.7523 0.5570 0.7523 0.8674
No log 4.5490 464 0.7935 0.5086 0.7935 0.8908
No log 4.5686 466 0.8757 0.5553 0.8757 0.9358
No log 4.5882 468 0.9582 0.5184 0.9582 0.9789
No log 4.6078 470 0.9159 0.4851 0.9159 0.9570
No log 4.6275 472 0.8093 0.4553 0.8093 0.8996
No log 4.6471 474 0.7614 0.4646 0.7614 0.8726
No log 4.6667 476 0.7559 0.4646 0.7559 0.8694
No log 4.6863 478 0.7757 0.4681 0.7757 0.8807
No log 4.7059 480 0.8644 0.5408 0.8644 0.9297
No log 4.7255 482 0.9252 0.5184 0.9252 0.9619
No log 4.7451 484 0.8707 0.5408 0.8707 0.9331
No log 4.7647 486 0.7455 0.6175 0.7455 0.8634
No log 4.7843 488 0.7165 0.5716 0.7165 0.8465
No log 4.8039 490 0.7159 0.6620 0.7159 0.8461
No log 4.8235 492 0.7412 0.6335 0.7412 0.8609
No log 4.8431 494 0.8078 0.5849 0.8078 0.8988
No log 4.8627 496 0.9103 0.5199 0.9103 0.9541
No log 4.8824 498 1.0077 0.5055 1.0077 1.0038
0.3436 4.9020 500 0.9560 0.4894 0.9560 0.9778
0.3436 4.9216 502 0.8881 0.5423 0.8881 0.9424
0.3436 4.9412 504 0.8003 0.5264 0.8003 0.8946
0.3436 4.9608 506 0.7580 0.5725 0.7580 0.8706
0.3436 4.9804 508 0.7564 0.6195 0.7564 0.8697
0.3436 5.0 510 0.7483 0.6319 0.7483 0.8651
0.3436 5.0196 512 0.7366 0.5735 0.7366 0.8583
0.3436 5.0392 514 0.7464 0.5260 0.7464 0.8639
0.3436 5.0588 516 0.7727 0.6029 0.7727 0.8791
0.3436 5.0784 518 0.7934 0.5547 0.7934 0.8907
0.3436 5.0980 520 0.8551 0.5359 0.8551 0.9247
0.3436 5.1176 522 0.8942 0.5339 0.8942 0.9456
0.3436 5.1373 524 0.8778 0.4983 0.8778 0.9369
0.3436 5.1569 526 0.8619 0.4553 0.8619 0.9284
0.3436 5.1765 528 0.8395 0.4105 0.8395 0.9163
0.3436 5.1961 530 0.8187 0.3663 0.8187 0.9048
0.3436 5.2157 532 0.8232 0.3804 0.8232 0.9073
0.3436 5.2353 534 0.8554 0.3577 0.8554 0.9249

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

~0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k18_task2_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.