ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6129
  • QWK: 0.6196
  • MSE: 0.6129
  • RMSE: 0.7829
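For reference, QWK (quadratic weighted kappa), MSE, and RMSE for ordinal essay scores can be computed with scikit-learn. A minimal sketch with hypothetical predictions and gold labels — the metric values above come from the trainer's own evaluation loop, not from this snippet:

```python
# Sketch: computing QWK, MSE and RMSE for ordinal essay-scoring labels.
# The preds/labels below are hypothetical, for illustration only.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

preds = [3, 2, 4, 1, 3]   # hypothetical model predictions (rounded scores)
labels = [3, 3, 4, 2, 3]  # hypothetical gold scores

qwk = cohen_kappa_score(labels, preds, weights="quadratic")
mse = mean_squared_error(labels, preds)
rmse = mse ** 0.5

print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```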

Model description

More information needed

Intended uses & limitations

More information needed
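A minimal usage sketch. It assumes the model carries a sequence-classification head with a single regression output (which the matching MSE/loss values suggest, but the card does not document); the score scale and preprocessing are likewise assumptions:

```python
# Hypothetical usage sketch: load the fine-tuned checkpoint and score one
# Arabic essay. Assumes a single-output regression head, which is not
# documented in the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```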

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0333 2 3.8529 -0.0323 3.8529 1.9629
No log 0.0667 4 2.8781 -0.0373 2.8781 1.6965
No log 0.1 6 1.7732 -0.0508 1.7732 1.3316
No log 0.1333 8 1.2923 0.0431 1.2923 1.1368
No log 0.1667 10 1.2743 0.0970 1.2743 1.1288
No log 0.2 12 1.0597 0.1864 1.0597 1.0294
No log 0.2333 14 0.9864 0.2492 0.9864 0.9932
No log 0.2667 16 0.9444 0.2716 0.9444 0.9718
No log 0.3 18 0.9646 0.3143 0.9646 0.9821
No log 0.3333 20 1.0074 0.3469 1.0074 1.0037
No log 0.3667 22 0.8989 0.3954 0.8989 0.9481
No log 0.4 24 0.8844 0.4094 0.8844 0.9404
No log 0.4333 26 0.7437 0.5048 0.7437 0.8624
No log 0.4667 28 0.7568 0.4722 0.7568 0.8700
No log 0.5 30 0.7413 0.5827 0.7413 0.8610
No log 0.5333 32 1.4014 0.3221 1.4014 1.1838
No log 0.5667 34 1.7551 0.2875 1.7551 1.3248
No log 0.6 36 1.2591 0.3830 1.2591 1.1221
No log 0.6333 38 0.7153 0.5364 0.7153 0.8458
No log 0.6667 40 1.1213 0.4052 1.1213 1.0589
No log 0.7 42 1.1355 0.3836 1.1355 1.0656
No log 0.7333 44 0.7742 0.5368 0.7742 0.8799
No log 0.7667 46 0.7534 0.4554 0.7534 0.8680
No log 0.8 48 0.7326 0.5069 0.7326 0.8559
No log 0.8333 50 0.7263 0.6012 0.7263 0.8522
No log 0.8667 52 0.7313 0.5905 0.7313 0.8551
No log 0.9 54 0.7555 0.6198 0.7555 0.8692
No log 0.9333 56 0.7376 0.6317 0.7376 0.8588
No log 0.9667 58 0.7351 0.5937 0.7351 0.8574
No log 1.0 60 0.9417 0.5894 0.9417 0.9704
No log 1.0333 62 0.9729 0.5754 0.9729 0.9863
No log 1.0667 64 0.7149 0.5974 0.7149 0.8455
No log 1.1 66 0.6619 0.5784 0.6619 0.8136
No log 1.1333 68 0.6775 0.5971 0.6775 0.8231
No log 1.1667 70 0.7087 0.5894 0.7087 0.8418
No log 1.2 72 0.6704 0.5399 0.6704 0.8188
No log 1.2333 74 0.7017 0.5771 0.7017 0.8377
No log 1.2667 76 0.7106 0.5894 0.7106 0.8429
No log 1.3 78 0.7874 0.5148 0.7874 0.8873
No log 1.3333 80 0.6546 0.5512 0.6546 0.8091
No log 1.3667 82 0.6195 0.6104 0.6195 0.7871
No log 1.4 84 0.6350 0.6341 0.6350 0.7969
No log 1.4333 86 0.6764 0.5892 0.6764 0.8224
No log 1.4667 88 0.8914 0.6023 0.8914 0.9441
No log 1.5 90 0.9679 0.5245 0.9679 0.9838
No log 1.5333 92 0.7186 0.6229 0.7186 0.8477
No log 1.5667 94 0.6881 0.6560 0.6881 0.8295
No log 1.6 96 0.6845 0.6125 0.6845 0.8273
No log 1.6333 98 0.6436 0.6482 0.6436 0.8022
No log 1.6667 100 0.6867 0.5625 0.6867 0.8287
No log 1.7 102 0.6705 0.5828 0.6705 0.8188
No log 1.7333 104 0.6269 0.6473 0.6269 0.7918
No log 1.7667 106 0.6539 0.6361 0.6539 0.8086
No log 1.8 108 0.6292 0.6510 0.6292 0.7932
No log 1.8333 110 0.6268 0.5971 0.6268 0.7917
No log 1.8667 112 0.6400 0.5890 0.6400 0.8000
No log 1.9 114 0.5909 0.6796 0.5909 0.7687
No log 1.9333 116 0.6191 0.6581 0.6191 0.7868
No log 1.9667 118 0.6218 0.6751 0.6218 0.7885
No log 2.0 120 0.5765 0.6456 0.5765 0.7593
No log 2.0333 122 0.7166 0.6324 0.7166 0.8465
No log 2.0667 124 0.6193 0.6203 0.6193 0.7869
No log 2.1 126 0.5646 0.6764 0.5646 0.7514
No log 2.1333 128 0.6239 0.6775 0.6239 0.7898
No log 2.1667 130 0.5926 0.6712 0.5926 0.7698
No log 2.2 132 0.6233 0.6265 0.6233 0.7895
No log 2.2333 134 0.6579 0.5806 0.6579 0.8111
No log 2.2667 136 0.6457 0.6186 0.6457 0.8036
No log 2.3 138 0.6358 0.5833 0.6358 0.7973
No log 2.3333 140 0.6224 0.6184 0.6224 0.7890
No log 2.3667 142 0.6724 0.6253 0.6724 0.8200
No log 2.4 144 0.6935 0.6403 0.6935 0.8327
No log 2.4333 146 0.6230 0.6721 0.6230 0.7893
No log 2.4667 148 0.5888 0.6788 0.5888 0.7673
No log 2.5 150 0.5786 0.6822 0.5786 0.7607
No log 2.5333 152 0.5865 0.6699 0.5865 0.7658
No log 2.5667 154 0.6917 0.5708 0.6917 0.8317
No log 2.6 156 0.9146 0.4624 0.9146 0.9564
No log 2.6333 158 0.8815 0.4378 0.8815 0.9389
No log 2.6667 160 0.6421 0.5696 0.6421 0.8013
No log 2.7 162 0.6228 0.6352 0.6228 0.7892
No log 2.7333 164 0.7959 0.5208 0.7959 0.8921
No log 2.7667 166 0.7971 0.5208 0.7971 0.8928
No log 2.8 168 0.6065 0.6687 0.6065 0.7788
No log 2.8333 170 0.6140 0.5887 0.6140 0.7836
No log 2.8667 172 0.7094 0.5583 0.7094 0.8422
No log 2.9 174 0.6507 0.6062 0.6507 0.8067
No log 2.9333 176 0.5810 0.6622 0.5810 0.7622
No log 2.9667 178 0.6100 0.6921 0.6100 0.7810
No log 3.0 180 0.6402 0.6714 0.6402 0.8001
No log 3.0333 182 0.7587 0.5666 0.7587 0.8711
No log 3.0667 184 0.8022 0.5495 0.8022 0.8956
No log 3.1 186 0.7126 0.5895 0.7126 0.8442
No log 3.1333 188 0.6276 0.6796 0.6276 0.7922
No log 3.1667 190 0.7148 0.5439 0.7148 0.8455
No log 3.2 192 0.7396 0.5439 0.7396 0.8600
No log 3.2333 194 0.7841 0.5318 0.7841 0.8855
No log 3.2667 196 0.6623 0.5614 0.6623 0.8138
No log 3.3 198 0.5916 0.6724 0.5916 0.7691
No log 3.3333 200 0.6323 0.6265 0.6323 0.7951
No log 3.3667 202 0.6097 0.6778 0.6097 0.7808
No log 3.4 204 0.6270 0.6516 0.6270 0.7919
No log 3.4333 206 0.6577 0.6116 0.6577 0.8110
No log 3.4667 208 0.6113 0.6452 0.6113 0.7819
No log 3.5 210 0.6098 0.6788 0.6098 0.7809
No log 3.5333 212 0.6170 0.6762 0.6170 0.7855
No log 3.5667 214 0.6277 0.6575 0.6277 0.7923
No log 3.6 216 0.6389 0.6252 0.6389 0.7993
No log 3.6333 218 0.6425 0.6291 0.6425 0.8016
No log 3.6667 220 0.6292 0.6291 0.6292 0.7932
No log 3.7 222 0.6024 0.6882 0.6024 0.7762
No log 3.7333 224 0.7485 0.5474 0.7485 0.8652
No log 3.7667 226 0.7615 0.5370 0.7615 0.8726
No log 3.8 228 0.6004 0.7323 0.6004 0.7749
No log 3.8333 230 0.5738 0.6936 0.5738 0.7575
No log 3.8667 232 0.6850 0.5842 0.6850 0.8276
No log 3.9 234 0.7221 0.5330 0.7221 0.8498
No log 3.9333 236 0.6207 0.6361 0.6207 0.7879
No log 3.9667 238 0.5805 0.6498 0.5805 0.7619
No log 4.0 240 0.6006 0.6593 0.6006 0.7750
No log 4.0333 242 0.6234 0.5999 0.6234 0.7895
No log 4.0667 244 0.5936 0.6427 0.5936 0.7704
No log 4.1 246 0.5902 0.6664 0.5902 0.7683
No log 4.1333 248 0.5961 0.6664 0.5961 0.7721
No log 4.1667 250 0.5776 0.6753 0.5776 0.7600
No log 4.2 252 0.6699 0.5451 0.6699 0.8185
No log 4.2333 254 0.7759 0.4860 0.7759 0.8808
No log 4.2667 256 0.6889 0.5845 0.6889 0.8300
No log 4.3 258 0.5801 0.6265 0.5801 0.7617
No log 4.3333 260 0.5349 0.6944 0.5349 0.7314
No log 4.3667 262 0.5364 0.6664 0.5364 0.7324
No log 4.4 264 0.5426 0.6772 0.5426 0.7366
No log 4.4333 266 0.5593 0.6507 0.5593 0.7479
No log 4.4667 268 0.5947 0.5862 0.5947 0.7712
No log 4.5 270 0.5605 0.6940 0.5605 0.7486
No log 4.5333 272 0.5505 0.7154 0.5505 0.7420
No log 4.5667 274 0.5499 0.7154 0.5499 0.7416
No log 4.6 276 0.5634 0.6593 0.5634 0.7506
No log 4.6333 278 0.5670 0.6593 0.5670 0.7530
No log 4.6667 280 0.5639 0.6460 0.5639 0.7509
No log 4.7 282 0.5869 0.6664 0.5869 0.7661
No log 4.7333 284 0.5857 0.6655 0.5857 0.7653
No log 4.7667 286 0.5601 0.6857 0.5601 0.7484
No log 4.8 288 0.6120 0.6248 0.6120 0.7823
No log 4.8333 290 0.6637 0.6101 0.6637 0.8147
No log 4.8667 292 0.5915 0.6729 0.5915 0.7691
No log 4.9 294 0.5467 0.7223 0.5467 0.7394
No log 4.9333 296 0.5533 0.6664 0.5533 0.7439
No log 4.9667 298 0.5937 0.6782 0.5937 0.7705
No log 5.0 300 0.6255 0.6807 0.6255 0.7909
No log 5.0333 302 0.6032 0.5913 0.6032 0.7767
No log 5.0667 304 0.5855 0.6097 0.5855 0.7652
No log 5.1 306 0.6110 0.6139 0.6110 0.7817
No log 5.1333 308 0.5751 0.6139 0.5751 0.7583
No log 5.1667 310 0.5393 0.6753 0.5393 0.7344
No log 5.2 312 0.5931 0.6774 0.5931 0.7701
No log 5.2333 314 0.6212 0.6630 0.6212 0.7881
No log 5.2667 316 0.5573 0.6745 0.5573 0.7465
No log 5.3 318 0.5465 0.6753 0.5465 0.7393
No log 5.3333 320 0.5699 0.6165 0.5699 0.7549
No log 5.3667 322 0.5879 0.5949 0.5879 0.7667
No log 5.4 324 0.5681 0.6054 0.5681 0.7537
No log 5.4333 326 0.5567 0.6479 0.5567 0.7461
No log 5.4667 328 0.5628 0.6588 0.5628 0.7502
No log 5.5 330 0.5347 0.6822 0.5347 0.7312
No log 5.5333 332 0.5327 0.6822 0.5327 0.7299
No log 5.5667 334 0.5503 0.6814 0.5503 0.7418
No log 5.6 336 0.5527 0.6814 0.5527 0.7434
No log 5.6333 338 0.5669 0.6320 0.5669 0.7529
No log 5.6667 340 0.6060 0.6143 0.6060 0.7784
No log 5.7 342 0.6080 0.6272 0.6080 0.7798
No log 5.7333 344 0.5682 0.6433 0.5682 0.7538
No log 5.7667 346 0.5411 0.6796 0.5411 0.7356
No log 5.8 348 0.5360 0.6772 0.5360 0.7321
No log 5.8333 350 0.5474 0.7078 0.5474 0.7398
No log 5.8667 352 0.5875 0.6653 0.5875 0.7665
No log 5.9 354 0.5855 0.6653 0.5855 0.7652
No log 5.9333 356 0.5682 0.6049 0.5682 0.7538
No log 5.9667 358 0.5809 0.5759 0.5809 0.7622
No log 6.0 360 0.5881 0.5882 0.5881 0.7669
No log 6.0333 362 0.5795 0.6195 0.5795 0.7613
No log 6.0667 364 0.5550 0.5934 0.5550 0.7450
No log 6.1 366 0.5559 0.6537 0.5559 0.7456
No log 6.1333 368 0.5583 0.6814 0.5583 0.7472
No log 6.1667 370 0.5500 0.6753 0.5500 0.7417
No log 6.2 372 0.5509 0.6593 0.5509 0.7422
No log 6.2333 374 0.5669 0.6368 0.5669 0.7529
No log 6.2667 376 0.5821 0.6344 0.5821 0.7630
No log 6.3 378 0.5702 0.6778 0.5702 0.7551
No log 6.3333 380 0.5717 0.6753 0.5717 0.7561
No log 6.3667 382 0.6330 0.6547 0.6330 0.7956
No log 6.4 384 0.6299 0.6547 0.6299 0.7937
No log 6.4333 386 0.5857 0.6566 0.5857 0.7653
No log 6.4667 388 0.5725 0.6364 0.5725 0.7567
No log 6.5 390 0.5679 0.6932 0.5679 0.7536
No log 6.5333 392 0.5776 0.6872 0.5776 0.7600
No log 6.5667 394 0.5743 0.6908 0.5743 0.7578
No log 6.6 396 0.5529 0.6658 0.5529 0.7436
No log 6.6333 398 0.6116 0.6620 0.6116 0.7820
No log 6.6667 400 0.6240 0.5913 0.6240 0.7899
No log 6.7 402 0.5788 0.6566 0.5788 0.7608
No log 6.7333 404 0.5722 0.6209 0.5722 0.7564
No log 6.7667 406 0.5823 0.6664 0.5823 0.7631
No log 6.8 408 0.5593 0.7034 0.5593 0.7479
No log 6.8333 410 0.5627 0.6234 0.5627 0.7501
No log 6.8667 412 0.5781 0.6533 0.5781 0.7603
No log 6.9 414 0.5673 0.6402 0.5673 0.7532
No log 6.9333 416 0.5386 0.6292 0.5386 0.7339
No log 6.9667 418 0.5456 0.6753 0.5456 0.7386
No log 7.0 420 0.5682 0.6814 0.5682 0.7538
No log 7.0333 422 0.5770 0.6814 0.5770 0.7596
No log 7.0667 424 0.5743 0.6641 0.5743 0.7578
No log 7.1 426 0.5834 0.6753 0.5834 0.7638
No log 7.1333 428 0.6030 0.6479 0.6030 0.7765
No log 7.1667 430 0.6053 0.6196 0.6053 0.7780
No log 7.2 432 0.6141 0.6370 0.6141 0.7836
No log 7.2333 434 0.6154 0.6370 0.6154 0.7845
No log 7.2667 436 0.6291 0.6035 0.6291 0.7932
No log 7.3 438 0.6092 0.6451 0.6092 0.7805
No log 7.3333 440 0.6057 0.6488 0.6057 0.7783
No log 7.3667 442 0.6162 0.6518 0.6162 0.7850
No log 7.4 444 0.6261 0.5394 0.6261 0.7912
No log 7.4333 446 0.6283 0.5426 0.6283 0.7927
No log 7.4667 448 0.6375 0.5549 0.6375 0.7984
No log 7.5 450 0.6256 0.5783 0.6256 0.7909
No log 7.5333 452 0.6084 0.6606 0.6084 0.7800
No log 7.5667 454 0.6049 0.6724 0.6049 0.7778
No log 7.6 456 0.6023 0.6196 0.6023 0.7761
No log 7.6333 458 0.6021 0.6724 0.6021 0.7760
No log 7.6667 460 0.6276 0.6301 0.6276 0.7922
No log 7.7 462 0.6263 0.6094 0.6263 0.7914
No log 7.7333 464 0.6118 0.5758 0.6118 0.7822
No log 7.7667 466 0.5971 0.6196 0.5971 0.7727
No log 7.8 468 0.5855 0.6307 0.5855 0.7652
No log 7.8333 470 0.5856 0.6814 0.5856 0.7652
No log 7.8667 472 0.5771 0.6944 0.5771 0.7597
No log 7.9 474 0.5662 0.6788 0.5662 0.7524
No log 7.9333 476 0.5900 0.6435 0.5900 0.7681
No log 7.9667 478 0.5856 0.6572 0.5856 0.7653
No log 8.0 480 0.5597 0.6237 0.5597 0.7481
No log 8.0333 482 0.5694 0.6706 0.5694 0.7546
No log 8.0667 484 0.5778 0.6387 0.5778 0.7601
No log 8.1 486 0.5815 0.5653 0.5815 0.7626
No log 8.1333 488 0.5806 0.5653 0.5806 0.7620
No log 8.1667 490 0.5788 0.6087 0.5788 0.7608
No log 8.2 492 0.5647 0.6087 0.5647 0.7515
No log 8.2333 494 0.5596 0.6932 0.5596 0.7480
No log 8.2667 496 0.5642 0.6712 0.5642 0.7511
No log 8.3 498 0.5746 0.6814 0.5746 0.7581
0.2735 8.3333 500 0.5782 0.6259 0.5782 0.7604
0.2735 8.3667 502 0.5973 0.6195 0.5973 0.7729
0.2735 8.4 504 0.5934 0.5988 0.5934 0.7703
0.2735 8.4333 506 0.5820 0.6186 0.5820 0.7629
0.2735 8.4667 508 0.6054 0.6311 0.6054 0.7781
0.2735 8.5 510 0.5990 0.6241 0.5990 0.7739
0.2735 8.5333 512 0.5906 0.6371 0.5906 0.7685
0.2735 8.5667 514 0.5518 0.6672 0.5518 0.7428
0.2735 8.6 516 0.5626 0.6708 0.5626 0.7501
0.2735 8.6333 518 0.6484 0.5711 0.6484 0.8053
0.2735 8.6667 520 0.6486 0.6182 0.6486 0.8053
0.2735 8.7 522 0.5508 0.7260 0.5508 0.7421
0.2735 8.7333 524 0.5735 0.6805 0.5735 0.7573
0.2735 8.7667 526 0.6072 0.6626 0.6072 0.7792
0.2735 8.8 528 0.5839 0.6642 0.5839 0.7641
0.2735 8.8333 530 0.5803 0.6259 0.5803 0.7618
0.2735 8.8667 532 0.6002 0.6526 0.6002 0.7747
0.2735 8.9 534 0.6060 0.6087 0.6060 0.7784
0.2735 8.9333 536 0.6439 0.5712 0.6439 0.8024
0.2735 8.9667 538 0.6634 0.5490 0.6634 0.8145
0.2735 9.0 540 0.6447 0.5485 0.6447 0.8030
0.2735 9.0333 542 0.6129 0.6196 0.6129 0.7829

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32 tensors)

Model tree

MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task5_organization is fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base model).