ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7129
  • QWK: 0.4846
  • MSE: 0.7129
  • RMSE: 0.8443
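
The metrics above can be reproduced with standard tools: quadratic weighted kappa (QWK) via scikit-learn's cohen_kappa_score, and RMSE as the square root of MSE. Note that Loss and MSE are identical in this card, which suggests an MSE training objective. The labels below are made up for illustration; they are not the model's actual predictions.

```python
# Sketch: computing QWK, MSE, and RMSE for an ordinal essay-scoring
# task. The example scores are hypothetical, not the model's outputs.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 2, 3, 1]   # hypothetical gold organization scores
y_pred = [0, 1, 1, 2, 3, 2]   # hypothetical model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5             # RMSE is just the square root of MSE

print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```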

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
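
With lr_scheduler_type "linear", the learning rate decays from its initial value to zero over the planned run. A minimal sketch of that schedule, assuming zero warmup (none is listed above):

```python
# Sketch of the "linear" lr_scheduler_type above: the learning rate
# decays from learning_rate to zero over the full planned run.
# Zero warmup is assumed, since no warmup steps are listed.
LEARNING_RATE = 2e-05

def linear_lr(step: int, total_steps: int) -> float:
    """Learning rate at optimizer step `step` under linear decay."""
    return LEARNING_RATE * max(0.0, 1.0 - step / total_steps)

# The training log shows 12 optimizer steps per epoch; with
# num_epochs=100 the planned run would be ~1200 steps, so the rate
# is halved at step 600.
print(linear_lr(600, 1200))
```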

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.1667 2 4.5738 0.0042 4.5738 2.1386
No log 0.3333 4 2.6516 0.0226 2.6516 1.6284
No log 0.5 6 1.5934 0.0504 1.5934 1.2623
No log 0.6667 8 1.3205 -0.0480 1.3205 1.1491
No log 0.8333 10 1.1874 0.1314 1.1874 1.0897
No log 1.0 12 1.2210 0.1585 1.2210 1.1050
No log 1.1667 14 1.2988 0.0532 1.2988 1.1397
No log 1.3333 16 1.2651 0.1793 1.2651 1.1248
No log 1.5 18 1.2810 0.0731 1.2810 1.1318
No log 1.6667 20 1.2866 0.0984 1.2866 1.1343
No log 1.8333 22 1.3124 0.0275 1.3124 1.1456
No log 2.0 24 1.1180 0.3145 1.1180 1.0573
No log 2.1667 26 1.0805 0.2336 1.0805 1.0395
No log 2.3333 28 1.0652 0.3734 1.0652 1.0321
No log 2.5 30 1.2279 0.1460 1.2279 1.1081
No log 2.6667 32 1.2595 0.1875 1.2595 1.1223
No log 2.8333 34 1.0734 0.2679 1.0734 1.0361
No log 3.0 36 1.3543 0.3570 1.3543 1.1637
No log 3.1667 38 1.5064 0.3134 1.5064 1.2274
No log 3.3333 40 1.1489 0.2959 1.1489 1.0719
No log 3.5 42 1.2058 0.3131 1.2058 1.0981
No log 3.6667 44 1.5414 0.2962 1.5414 1.2415
No log 3.8333 46 1.3787 0.2944 1.3787 1.1742
No log 4.0 48 1.0455 0.4035 1.0455 1.0225
No log 4.1667 50 1.0173 0.4493 1.0173 1.0086
No log 4.3333 52 1.0173 0.4569 1.0173 1.0086
No log 4.5 54 0.9711 0.4898 0.9711 0.9854
No log 4.6667 56 0.9573 0.5391 0.9573 0.9784
No log 4.8333 58 1.0068 0.4040 1.0068 1.0034
No log 5.0 60 1.0767 0.3361 1.0767 1.0376
No log 5.1667 62 1.0661 0.3570 1.0661 1.0325
No log 5.3333 64 0.9877 0.4411 0.9877 0.9938
No log 5.5 66 0.9382 0.5316 0.9382 0.9686
No log 5.6667 68 0.9546 0.4643 0.9546 0.9770
No log 5.8333 70 0.9942 0.4809 0.9942 0.9971
No log 6.0 72 1.0088 0.5050 1.0088 1.0044
No log 6.1667 74 1.1518 0.3767 1.1518 1.0732
No log 6.3333 76 1.1131 0.4012 1.1131 1.0551
No log 6.5 78 0.9589 0.5257 0.9589 0.9793
No log 6.6667 80 0.9335 0.4643 0.9335 0.9662
No log 6.8333 82 0.9971 0.4661 0.9971 0.9985
No log 7.0 84 0.9871 0.4475 0.9871 0.9935
No log 7.1667 86 0.9072 0.4582 0.9072 0.9525
No log 7.3333 88 1.0474 0.5029 1.0474 1.0234
No log 7.5 90 1.1935 0.4736 1.1935 1.0925
No log 7.6667 92 1.1196 0.4554 1.1196 1.0581
No log 7.8333 94 0.9253 0.5335 0.9253 0.9619
No log 8.0 96 0.9460 0.4847 0.9460 0.9726
No log 8.1667 98 0.9632 0.4567 0.9632 0.9815
No log 8.3333 100 0.9374 0.4996 0.9374 0.9682
No log 8.5 102 0.9531 0.4505 0.9531 0.9763
No log 8.6667 104 0.9574 0.4731 0.9574 0.9785
No log 8.8333 106 1.0202 0.4594 1.0202 1.0101
No log 9.0 108 1.1479 0.4318 1.1479 1.0714
No log 9.1667 110 1.0756 0.3615 1.0756 1.0371
No log 9.3333 112 0.9670 0.4026 0.9670 0.9834
No log 9.5 114 0.9799 0.5067 0.9799 0.9899
No log 9.6667 116 1.0379 0.5102 1.0379 1.0188
No log 9.8333 118 1.0578 0.4378 1.0578 1.0285
No log 10.0 120 1.1015 0.4601 1.1015 1.0495
No log 10.1667 122 1.0661 0.4378 1.0661 1.0325
No log 10.3333 124 0.9738 0.5148 0.9738 0.9868
No log 10.5 126 0.9338 0.4767 0.9338 0.9664
No log 10.6667 128 0.9280 0.5276 0.9280 0.9633
No log 10.8333 130 0.9426 0.5412 0.9426 0.9709
No log 11.0 132 0.8675 0.5268 0.8675 0.9314
No log 11.1667 134 0.8538 0.4273 0.8538 0.9240
No log 11.3333 136 0.8918 0.4833 0.8918 0.9444
No log 11.5 138 0.8616 0.4918 0.8616 0.9282
No log 11.6667 140 0.8226 0.5163 0.8226 0.9070
No log 11.8333 142 0.8763 0.6135 0.8763 0.9361
No log 12.0 144 0.9147 0.6085 0.9147 0.9564
No log 12.1667 146 0.9416 0.5560 0.9416 0.9704
No log 12.3333 148 0.7914 0.6157 0.7914 0.8896
No log 12.5 150 0.7575 0.5302 0.7575 0.8703
No log 12.6667 152 0.8597 0.5614 0.8597 0.9272
No log 12.8333 154 0.8815 0.5297 0.8815 0.9389
No log 13.0 156 0.8288 0.5014 0.8288 0.9104
No log 13.1667 158 0.7541 0.5835 0.7541 0.8684
No log 13.3333 160 0.7379 0.6084 0.7379 0.8590
No log 13.5 162 0.7158 0.5872 0.7158 0.8460
No log 13.6667 164 0.7282 0.6534 0.7282 0.8533
No log 13.8333 166 0.7259 0.5954 0.7259 0.8520
No log 14.0 168 0.6829 0.6356 0.6829 0.8264
No log 14.1667 170 0.6948 0.6486 0.6948 0.8336
No log 14.3333 172 0.7157 0.5571 0.7157 0.8460
No log 14.5 174 0.7202 0.4792 0.7202 0.8487
No log 14.6667 176 0.7554 0.4703 0.7554 0.8691
No log 14.8333 178 0.7210 0.5497 0.7210 0.8491
No log 15.0 180 0.6909 0.6129 0.6909 0.8312
No log 15.1667 182 0.7114 0.5370 0.7114 0.8434
No log 15.3333 184 0.7373 0.5789 0.7373 0.8587
No log 15.5 186 0.7708 0.5789 0.7708 0.8780
No log 15.6667 188 0.8071 0.6337 0.8071 0.8984
No log 15.8333 190 0.8637 0.6059 0.8637 0.9294
No log 16.0 192 0.7930 0.5853 0.7930 0.8905
No log 16.1667 194 0.7486 0.5889 0.7486 0.8652
No log 16.3333 196 0.8699 0.6423 0.8699 0.9327
No log 16.5 198 0.8946 0.6085 0.8946 0.9458
No log 16.6667 200 0.7960 0.5557 0.7960 0.8922
No log 16.8333 202 0.8051 0.4540 0.8051 0.8973
No log 17.0 204 0.8766 0.4463 0.8766 0.9363
No log 17.1667 206 0.9336 0.4641 0.9336 0.9662
No log 17.3333 208 0.9064 0.5348 0.9064 0.9521
No log 17.5 210 0.8108 0.5086 0.8108 0.9005
No log 17.6667 212 0.7652 0.5645 0.7652 0.8747
No log 17.8333 214 0.7691 0.5645 0.7691 0.8770
No log 18.0 216 0.7524 0.5645 0.7524 0.8674
No log 18.1667 218 0.7571 0.5671 0.7571 0.8701
No log 18.3333 220 0.7583 0.5242 0.7583 0.8708
No log 18.5 222 0.7791 0.4553 0.7791 0.8827
No log 18.6667 224 0.8174 0.4386 0.8174 0.9041
No log 18.8333 226 0.7979 0.4722 0.7979 0.8933
No log 19.0 228 0.7480 0.4937 0.7480 0.8649
No log 19.1667 230 0.7303 0.5582 0.7303 0.8546
No log 19.3333 232 0.7223 0.5838 0.7223 0.8499
No log 19.5 234 0.7068 0.6086 0.7068 0.8407
No log 19.6667 236 0.7148 0.6191 0.7148 0.8454
No log 19.8333 238 0.7331 0.5588 0.7331 0.8562
No log 20.0 240 0.7250 0.5846 0.7250 0.8515
No log 20.1667 242 0.7540 0.5838 0.7540 0.8683
No log 20.3333 244 0.7735 0.5838 0.7735 0.8795
No log 20.5 246 0.7568 0.5838 0.7568 0.8700
No log 20.6667 248 0.7279 0.5937 0.7279 0.8532
No log 20.8333 250 0.7393 0.5783 0.7393 0.8598
No log 21.0 252 0.7450 0.5674 0.7450 0.8631
No log 21.1667 254 0.7353 0.5432 0.7353 0.8575
No log 21.3333 256 0.7385 0.5746 0.7385 0.8594
No log 21.5 258 0.7562 0.5783 0.7562 0.8696
No log 21.6667 260 0.8017 0.5721 0.8017 0.8954
No log 21.8333 262 0.7940 0.5866 0.7940 0.8911
No log 22.0 264 0.7806 0.5401 0.7806 0.8835
No log 22.1667 266 0.8024 0.5503 0.8024 0.8958
No log 22.3333 268 0.8055 0.5416 0.8055 0.8975
No log 22.5 270 0.7699 0.5462 0.7699 0.8774
No log 22.6667 272 0.7811 0.5397 0.7811 0.8838
No log 22.8333 274 0.7691 0.5607 0.7691 0.8770
No log 23.0 276 0.7510 0.5397 0.7510 0.8666
No log 23.1667 278 0.7433 0.5693 0.7433 0.8621
No log 23.3333 280 0.7584 0.5810 0.7584 0.8709
No log 23.5 282 0.7404 0.5902 0.7404 0.8605
No log 23.6667 284 0.7656 0.6023 0.7656 0.8750
No log 23.8333 286 0.8507 0.5893 0.8507 0.9223
No log 24.0 288 0.8589 0.5920 0.8589 0.9268
No log 24.1667 290 0.7785 0.5861 0.7785 0.8823
No log 24.3333 292 0.7456 0.4968 0.7456 0.8635
No log 24.5 294 0.7985 0.4764 0.7985 0.8936
No log 24.6667 296 0.8261 0.5405 0.8261 0.9089
No log 24.8333 298 0.8043 0.4764 0.8043 0.8968
No log 25.0 300 0.7709 0.4737 0.7709 0.8780
No log 25.1667 302 0.7648 0.4747 0.7648 0.8745
No log 25.3333 304 0.7681 0.4898 0.7681 0.8764
No log 25.5 306 0.7655 0.5010 0.7655 0.8749
No log 25.6667 308 0.7927 0.5083 0.7927 0.8904
No log 25.8333 310 0.8591 0.6151 0.8591 0.9269
No log 26.0 312 0.8698 0.6427 0.8698 0.9326
No log 26.1667 314 0.8287 0.5683 0.8287 0.9103
No log 26.3333 316 0.7751 0.5083 0.7751 0.8804
No log 26.5 318 0.7600 0.4841 0.7600 0.8718
No log 26.6667 320 0.7605 0.5422 0.7605 0.8721
No log 26.8333 322 0.7704 0.5094 0.7704 0.8777
No log 27.0 324 0.7899 0.5094 0.7899 0.8888
No log 27.1667 326 0.7904 0.5094 0.7904 0.8890
No log 27.3333 328 0.7849 0.4816 0.7849 0.8860
No log 27.5 330 0.7718 0.4828 0.7718 0.8785
No log 27.6667 332 0.7691 0.4816 0.7691 0.8770
No log 27.8333 334 0.7782 0.4794 0.7782 0.8822
No log 28.0 336 0.7606 0.5202 0.7606 0.8721
No log 28.1667 338 0.7520 0.5498 0.7520 0.8672
No log 28.3333 340 0.7443 0.5385 0.7443 0.8627
No log 28.5 342 0.7356 0.5498 0.7356 0.8577
No log 28.6667 344 0.7330 0.5202 0.7330 0.8562
No log 28.8333 346 0.7243 0.5749 0.7243 0.8511
No log 29.0 348 0.7259 0.6077 0.7259 0.8520
No log 29.1667 350 0.7386 0.6182 0.7386 0.8594
No log 29.3333 352 0.7428 0.6334 0.7428 0.8618
No log 29.5 354 0.7410 0.5958 0.7410 0.8608
No log 29.6667 356 0.7717 0.5249 0.7717 0.8785
No log 29.8333 358 0.8013 0.5361 0.8013 0.8952
No log 30.0 360 0.7924 0.5249 0.7924 0.8902
No log 30.1667 362 0.7916 0.4816 0.7916 0.8897
No log 30.3333 364 0.8222 0.5202 0.8222 0.9067
No log 30.5 366 0.8410 0.6100 0.8410 0.9171
No log 30.6667 368 0.8266 0.6068 0.8266 0.9092
No log 30.8333 370 0.7738 0.5674 0.7738 0.8797
No log 31.0 372 0.7451 0.4966 0.7451 0.8632
No log 31.1667 374 0.7760 0.5646 0.7760 0.8809
No log 31.3333 376 0.8133 0.5671 0.8133 0.9018
No log 31.5 378 0.8492 0.5864 0.8492 0.9215
No log 31.6667 380 0.8324 0.5633 0.8324 0.9123
No log 31.8333 382 0.8013 0.5556 0.8013 0.8951
No log 32.0 384 0.7863 0.4937 0.7863 0.8868
No log 32.1667 386 0.7751 0.4937 0.7751 0.8804
No log 32.3333 388 0.7681 0.5073 0.7681 0.8764
No log 32.5 390 0.7647 0.5057 0.7647 0.8745
No log 32.6667 392 0.7719 0.5057 0.7719 0.8786
No log 32.8333 394 0.7944 0.5521 0.7944 0.8913
No log 33.0 396 0.7997 0.5521 0.7997 0.8943
No log 33.1667 398 0.7781 0.5186 0.7781 0.8821
No log 33.3333 400 0.7712 0.5302 0.7712 0.8782
No log 33.5 402 0.7705 0.5204 0.7705 0.8778
No log 33.6667 404 0.7648 0.5176 0.7648 0.8745
No log 33.8333 406 0.7793 0.4816 0.7793 0.8828
No log 34.0 408 0.7857 0.4816 0.7857 0.8864
No log 34.1667 410 0.7888 0.5186 0.7888 0.8881
No log 34.3333 412 0.8175 0.5443 0.8175 0.9041
No log 34.5 414 0.8416 0.5472 0.8416 0.9174
No log 34.6667 416 0.8830 0.6041 0.8830 0.9397
No log 34.8333 418 0.9028 0.5876 0.9028 0.9501
No log 35.0 420 0.8451 0.5361 0.8451 0.9193
No log 35.1667 422 0.7935 0.5320 0.7935 0.8908
No log 35.3333 424 0.7609 0.5089 0.7609 0.8723
No log 35.5 426 0.7761 0.5759 0.7761 0.8809
No log 35.6667 428 0.8144 0.5769 0.8144 0.9025
No log 35.8333 430 0.8024 0.6091 0.8024 0.8958
No log 36.0 432 0.7442 0.5239 0.7442 0.8627
No log 36.1667 434 0.7058 0.5184 0.7058 0.8401
No log 36.3333 436 0.7036 0.5648 0.7036 0.8388
No log 36.5 438 0.7265 0.5178 0.7265 0.8523
No log 36.6667 440 0.7560 0.5756 0.7560 0.8695
No log 36.8333 442 0.7919 0.5693 0.7919 0.8899
No log 37.0 444 0.8101 0.5830 0.8101 0.9001
No log 37.1667 446 0.8183 0.5520 0.8183 0.9046
No log 37.3333 448 0.8098 0.5787 0.8098 0.8999
No log 37.5 450 0.7887 0.5386 0.7887 0.8881
No log 37.6667 452 0.7818 0.5365 0.7818 0.8842
No log 37.8333 454 0.7739 0.5411 0.7739 0.8797
No log 38.0 456 0.7669 0.5648 0.7669 0.8757
No log 38.1667 458 0.7548 0.5814 0.7548 0.8688
No log 38.3333 460 0.7272 0.5869 0.7272 0.8528
No log 38.5 462 0.7055 0.5194 0.7055 0.8399
No log 38.6667 464 0.7001 0.5271 0.7001 0.8367
No log 38.8333 466 0.6872 0.5011 0.6872 0.8290
No log 39.0 468 0.6851 0.5810 0.6851 0.8277
No log 39.1667 470 0.6932 0.5783 0.6932 0.8326
No log 39.3333 472 0.7132 0.6089 0.7132 0.8445
No log 39.5 474 0.7170 0.6059 0.7170 0.8468
No log 39.6667 476 0.6971 0.6089 0.6971 0.8349
No log 39.8333 478 0.6675 0.5921 0.6675 0.8170
No log 40.0 480 0.7048 0.5870 0.7048 0.8395
No log 40.1667 482 0.7440 0.5988 0.7440 0.8626
No log 40.3333 484 0.7538 0.5540 0.7538 0.8682
No log 40.5 486 0.7545 0.5439 0.7545 0.8686
No log 40.6667 488 0.7491 0.5248 0.7491 0.8655
No log 40.8333 490 0.7655 0.5708 0.7655 0.8749
No log 41.0 492 0.7949 0.6226 0.7949 0.8916
No log 41.1667 494 0.8232 0.6226 0.8232 0.9073
No log 41.3333 496 0.8121 0.6226 0.8121 0.9012
No log 41.5 498 0.7638 0.5708 0.7638 0.8740
0.2663 41.6667 500 0.7275 0.5073 0.7275 0.8529
0.2663 41.8333 502 0.7270 0.5102 0.7270 0.8526
0.2663 42.0 504 0.7427 0.5329 0.7427 0.8618
0.2663 42.1667 506 0.7379 0.5082 0.7379 0.8590
0.2663 42.3333 508 0.7395 0.5411 0.7395 0.8599
0.2663 42.5 510 0.7496 0.5949 0.7496 0.8658
0.2663 42.6667 512 0.7666 0.5708 0.7666 0.8756
0.2663 42.8333 514 0.7660 0.5733 0.7660 0.8752
0.2663 43.0 516 0.7491 0.4805 0.7491 0.8655
0.2663 43.1667 518 0.7415 0.5057 0.7415 0.8611
0.2663 43.3333 520 0.7380 0.4864 0.7380 0.8590
0.2663 43.5 522 0.7458 0.4864 0.7458 0.8636
0.2663 43.6667 524 0.7577 0.5611 0.7577 0.8704
0.2663 43.8333 526 0.7575 0.5611 0.7575 0.8703
0.2663 44.0 528 0.7523 0.4864 0.7523 0.8673
0.2663 44.1667 530 0.7560 0.4864 0.7560 0.8695
0.2663 44.3333 532 0.7656 0.4864 0.7656 0.8750
0.2663 44.5 534 0.7725 0.5055 0.7725 0.8789
0.2663 44.6667 536 0.7706 0.5611 0.7706 0.8778
0.2663 44.8333 538 0.7639 0.6089 0.7639 0.8740
0.2663 45.0 540 0.7430 0.5957 0.7430 0.8620
0.2663 45.1667 542 0.7224 0.4828 0.7224 0.8500
0.2663 45.3333 544 0.7151 0.4620 0.7151 0.8456
0.2663 45.5 546 0.7087 0.4620 0.7087 0.8418
0.2663 45.6667 548 0.7094 0.4828 0.7094 0.8423
0.2663 45.8333 550 0.7129 0.4846 0.7129 0.8443
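
The log stops at epoch 45.83 of the configured 100, and the final checkpoint (step 550, QWK 0.4846) is not the strongest evaluation seen: QWK peaks above 0.65 around epochs 13-14. A small sketch of picking the best evaluation from such a log; the entries below are a short excerpt of rows from the table above, as (step, validation loss, QWK) tuples:

```python
# Sketch: selecting the best checkpoint from an eval log like the one
# above. These rows are a hypothetical excerpt, not the full table.
log = [
    (144, 0.9147, 0.6085),
    (150, 0.7575, 0.5302),
    (168, 0.6829, 0.6356),
    (478, 0.6675, 0.5921),
    (550, 0.7129, 0.4846),
]

best_by_qwk = max(log, key=lambda row: row[2])
best_by_loss = min(log, key=lambda row: row[1])
print("best QWK at step", best_by_qwk[0])      # step 168 in this excerpt
print("lowest loss at step", best_by_loss[0])  # step 478 in this excerpt
```

Which criterion to prefer depends on the downstream use; for ordinal scoring tasks, QWK is usually the deciding metric.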

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 0.1B params
  • Tensor type: F32 (Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task2_organization

  • Finetuned from aubmindlab/bert-base-arabertv02