ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k6_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5980
  • Qwk (quadratic weighted kappa): 0.5403
  • Mse (mean squared error): 0.5980
  • Rmse (root mean squared error): 0.7733
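Qwk is quadratic weighted kappa, the usual agreement metric for ordinal essay scores, and Rmse is simply the square root of Mse (hence 0.7733 ≈ √0.5980). A minimal sketch of how these metrics can be computed with scikit-learn; the labels below are illustrative, not taken from this model's evaluation set:

```python
from math import sqrt

from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative ordinal scores (e.g. essay-organization ratings) -- not real model output.
y_true = [0, 1, 2, 2, 3]
y_pred = [0, 1, 1, 2, 3]

# Quadratic weighted kappa: chance-corrected agreement in which larger
# disagreements between ordinal labels are penalized quadratically.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

mse = mean_squared_error(y_true, y_pred)
rmse = sqrt(mse)  # Rmse is just the square root of Mse, as in the card above.

print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```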

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
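With lr_scheduler_type: linear and no warmup, the learning rate decays linearly from 2e-05 at step 0 to 0 at the final step, mirroring Transformers' get_linear_schedule_with_warmup with zero warmup steps. A minimal sketch of that schedule; the total of 3000 steps is inferred from the log below, where epoch 1.0 falls at step 30 and training is configured for 100 epochs:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Learning rate under a linear decay schedule with no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Epoch 1.0 falls at step 30 in the log, so 100 epochs is roughly 3000 steps.
total = 3000
print(linear_lr(0, total))     # start of training: 2e-05
print(linear_lr(1500, total))  # halfway: 1e-05
print(linear_lr(3000, total))  # end of training: 0.0
```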

Training results

Training loss is reported every 500 steps, so rows before step 500 show "No log" in the training-loss column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0667 2 4.6141 -0.0268 4.6141 2.1480
No log 0.1333 4 2.4448 0.0848 2.4448 1.5636
No log 0.2 6 1.6423 -0.0144 1.6423 1.2815
No log 0.2667 8 1.3962 0.0682 1.3962 1.1816
No log 0.3333 10 1.1183 0.1755 1.1183 1.0575
No log 0.4 12 1.0777 0.1722 1.0777 1.0381
No log 0.4667 14 1.0159 0.2262 1.0159 1.0079
No log 0.5333 16 0.9666 0.2416 0.9666 0.9831
No log 0.6 18 0.9363 0.2441 0.9363 0.9676
No log 0.6667 20 0.9227 0.3100 0.9227 0.9606
No log 0.7333 22 1.0149 0.3069 1.0149 1.0074
No log 0.8 24 1.6133 0.3112 1.6133 1.2702
No log 0.8667 26 1.7585 0.2383 1.7585 1.3261
No log 0.9333 28 1.2075 0.3519 1.2075 1.0989
No log 1.0 30 0.7489 0.4642 0.7489 0.8654
No log 1.0667 32 0.9352 0.2825 0.9352 0.9670
No log 1.1333 34 0.9128 0.2873 0.9128 0.9554
No log 1.2 36 0.7713 0.4494 0.7713 0.8782
No log 1.2667 38 0.7482 0.5174 0.7482 0.8650
No log 1.3333 40 0.7336 0.5315 0.7336 0.8565
No log 1.4 42 0.7046 0.5111 0.7046 0.8394
No log 1.4667 44 0.7720 0.6072 0.7720 0.8786
No log 1.5333 46 1.1151 0.4440 1.1151 1.0560
No log 1.6 48 1.1475 0.4718 1.1475 1.0712
No log 1.6667 50 0.7578 0.5881 0.7578 0.8705
No log 1.7333 52 0.8691 0.5718 0.8691 0.9323
No log 1.8 54 0.8962 0.5314 0.8962 0.9467
No log 1.8667 56 0.7200 0.5774 0.7200 0.8485
No log 1.9333 58 0.7913 0.5642 0.7913 0.8896
No log 2.0 60 0.8550 0.6211 0.8550 0.9247
No log 2.0667 62 0.9935 0.5164 0.9935 0.9967
No log 2.1333 64 0.8153 0.6442 0.8153 0.9029
No log 2.2 66 0.7110 0.6035 0.7110 0.8432
No log 2.2667 68 0.8923 0.5715 0.8923 0.9446
No log 2.3333 70 0.9147 0.5458 0.9147 0.9564
No log 2.4 72 0.8043 0.6298 0.8043 0.8968
No log 2.4667 74 0.8272 0.6303 0.8272 0.9095
No log 2.5333 76 0.8205 0.6358 0.8205 0.9058
No log 2.6 78 0.7415 0.6674 0.7415 0.8611
No log 2.6667 80 0.6961 0.5726 0.6961 0.8343
No log 2.7333 82 0.7240 0.5678 0.7240 0.8509
No log 2.8 84 0.7402 0.5661 0.7402 0.8603
No log 2.8667 86 0.8347 0.5568 0.8347 0.9136
No log 2.9333 88 0.9746 0.5374 0.9746 0.9872
No log 3.0 90 0.9513 0.5713 0.9513 0.9753
No log 3.0667 92 0.7110 0.6421 0.7110 0.8432
No log 3.1333 94 0.8270 0.6584 0.8270 0.9094
No log 3.2 96 0.8161 0.6429 0.8161 0.9034
No log 3.2667 98 0.7744 0.6704 0.7744 0.8800
No log 3.3333 100 0.8571 0.6664 0.8571 0.9258
No log 3.4 102 0.7703 0.7032 0.7703 0.8777
No log 3.4667 104 0.6746 0.6886 0.6746 0.8213
No log 3.5333 106 0.5855 0.6876 0.5855 0.7652
No log 3.6 108 0.5934 0.6528 0.5934 0.7703
No log 3.6667 110 0.6286 0.6412 0.6286 0.7928
No log 3.7333 112 0.5998 0.6572 0.5998 0.7745
No log 3.8 114 0.6452 0.6899 0.6452 0.8032
No log 3.8667 116 0.6824 0.6824 0.6824 0.8261
No log 3.9333 118 0.6464 0.6377 0.6464 0.8040
No log 4.0 120 0.7442 0.6513 0.7442 0.8627
No log 4.0667 122 0.6643 0.6142 0.6643 0.8150
No log 4.1333 124 0.6646 0.6950 0.6646 0.8152
No log 4.2 126 0.8283 0.6110 0.8283 0.9101
No log 4.2667 128 0.7647 0.6321 0.7647 0.8745
No log 4.3333 130 0.6332 0.6479 0.6332 0.7957
No log 4.4 132 0.6183 0.6111 0.6183 0.7863
No log 4.4667 134 0.6497 0.5869 0.6497 0.8060
No log 4.5333 136 0.6439 0.6534 0.6439 0.8025
No log 4.6 138 0.7948 0.6599 0.7948 0.8915
No log 4.6667 140 0.9640 0.5081 0.9640 0.9818
No log 4.7333 142 0.7958 0.6596 0.7958 0.8921
No log 4.8 144 0.6424 0.6186 0.6424 0.8015
No log 4.8667 146 0.7282 0.5852 0.7282 0.8533
No log 4.9333 148 0.7297 0.5852 0.7297 0.8542
No log 5.0 150 0.6838 0.5742 0.6838 0.8269
No log 5.0667 152 0.7490 0.6473 0.7490 0.8654
No log 5.1333 154 0.7092 0.6516 0.7092 0.8422
No log 5.2 156 0.6710 0.5838 0.6710 0.8192
No log 5.2667 158 0.6662 0.5693 0.6662 0.8162
No log 5.3333 160 0.6533 0.6345 0.6533 0.8083
No log 5.4 162 0.6614 0.6354 0.6614 0.8133
No log 5.4667 164 0.6401 0.6534 0.6401 0.8000
No log 5.5333 166 0.6408 0.6453 0.6408 0.8005
No log 5.6 168 0.6437 0.6286 0.6437 0.8023
No log 5.6667 170 0.6670 0.6453 0.6670 0.8167
No log 5.7333 172 0.6935 0.6970 0.6935 0.8328
No log 5.8 174 0.6769 0.6691 0.6769 0.8227
No log 5.8667 176 0.6652 0.5817 0.6652 0.8156
No log 5.9333 178 0.6919 0.6028 0.6919 0.8318
No log 6.0 180 0.6928 0.5874 0.6928 0.8324
No log 6.0667 182 0.6935 0.5828 0.6935 0.8328
No log 6.1333 184 0.6636 0.6306 0.6636 0.8146
No log 6.2 186 0.6494 0.5874 0.6494 0.8058
No log 6.2667 188 0.6397 0.5487 0.6397 0.7998
No log 6.3333 190 0.6659 0.5188 0.6659 0.8160
No log 6.4 192 0.6626 0.5074 0.6626 0.8140
No log 6.4667 194 0.6174 0.5396 0.6174 0.7858
No log 6.5333 196 0.6780 0.6100 0.6780 0.8234
No log 6.6 198 0.8957 0.5818 0.8957 0.9464
No log 6.6667 200 0.9016 0.5782 0.9016 0.9495
No log 6.7333 202 0.7430 0.6180 0.7430 0.8620
No log 6.8 204 0.6613 0.6294 0.6613 0.8132
No log 6.8667 206 0.6925 0.5922 0.6925 0.8322
No log 6.9333 208 0.7389 0.6136 0.7389 0.8596
No log 7.0 210 0.7188 0.6356 0.7188 0.8478
No log 7.0667 212 0.6135 0.5585 0.6135 0.7833
No log 7.1333 214 0.6336 0.6167 0.6336 0.7960
No log 7.2 216 0.6471 0.6485 0.6471 0.8044
No log 7.2667 218 0.6070 0.6553 0.6070 0.7791
No log 7.3333 220 0.6168 0.6272 0.6168 0.7854
No log 7.4 222 0.6253 0.6804 0.6253 0.7908
No log 7.4667 224 0.6164 0.6337 0.6164 0.7851
No log 7.5333 226 0.6144 0.5657 0.6144 0.7838
No log 7.6 228 0.6125 0.5654 0.6125 0.7826
No log 7.6667 230 0.6171 0.5949 0.6171 0.7856
No log 7.7333 232 0.6142 0.6111 0.6142 0.7837
No log 7.8 234 0.6175 0.5776 0.6175 0.7858
No log 7.8667 236 0.5966 0.6359 0.5966 0.7724
No log 7.9333 238 0.6043 0.6377 0.6043 0.7774
No log 8.0 240 0.6195 0.6135 0.6195 0.7871
No log 8.0667 242 0.5996 0.5675 0.5996 0.7744
No log 8.1333 244 0.5957 0.6297 0.5957 0.7718
No log 8.2 246 0.5912 0.5969 0.5912 0.7689
No log 8.2667 248 0.5929 0.6448 0.5929 0.7700
No log 8.3333 250 0.5764 0.6703 0.5764 0.7592
No log 8.4 252 0.5596 0.6951 0.5596 0.7481
No log 8.4667 254 0.5517 0.6850 0.5517 0.7427
No log 8.5333 256 0.5508 0.6871 0.5508 0.7421
No log 8.6 258 0.5637 0.7269 0.5637 0.7508
No log 8.6667 260 0.6067 0.7099 0.6067 0.7789
No log 8.7333 262 0.5851 0.7135 0.5851 0.7649
No log 8.8 264 0.5269 0.7122 0.5269 0.7259
No log 8.8667 266 0.5394 0.6916 0.5394 0.7345
No log 8.9333 268 0.5444 0.6634 0.5444 0.7379
No log 9.0 270 0.5852 0.6886 0.5852 0.7650
No log 9.0667 272 0.6057 0.6554 0.6057 0.7782
No log 9.1333 274 0.6305 0.6388 0.6305 0.7940
No log 9.2 276 0.6188 0.6581 0.6188 0.7867
No log 9.2667 278 0.6215 0.6617 0.6215 0.7884
No log 9.3333 280 0.5947 0.6345 0.5947 0.7712
No log 9.4 282 0.6032 0.6007 0.6032 0.7766
No log 9.4667 284 0.6234 0.6259 0.6234 0.7895
No log 9.5333 286 0.5939 0.6555 0.5939 0.7707
No log 9.6 288 0.5737 0.6039 0.5737 0.7574
No log 9.6667 290 0.5992 0.6223 0.5992 0.7741
No log 9.7333 292 0.5988 0.6937 0.5988 0.7738
No log 9.8 294 0.6072 0.7126 0.6072 0.7792
No log 9.8667 296 0.6284 0.7026 0.6284 0.7927
No log 9.9333 298 0.6164 0.7026 0.6164 0.7851
No log 10.0 300 0.5999 0.5726 0.5999 0.7745
No log 10.0667 302 0.6073 0.5506 0.6073 0.7793
No log 10.1333 304 0.6279 0.5582 0.6279 0.7924
No log 10.2 306 0.6459 0.5419 0.6459 0.8037
No log 10.2667 308 0.6382 0.5669 0.6382 0.7989
No log 10.3333 310 0.6179 0.6491 0.6179 0.7860
No log 10.4 312 0.6085 0.6740 0.6085 0.7801
No log 10.4667 314 0.5940 0.6594 0.5940 0.7707
No log 10.5333 316 0.5995 0.6116 0.5995 0.7743
No log 10.6 318 0.5812 0.6087 0.5812 0.7624
No log 10.6667 320 0.5742 0.6701 0.5742 0.7578
No log 10.7333 322 0.6087 0.6519 0.6087 0.7802
No log 10.8 324 0.7347 0.6322 0.7347 0.8571
No log 10.8667 326 0.8297 0.6327 0.8297 0.9109
No log 10.9333 328 0.7644 0.6141 0.7644 0.8743
No log 11.0 330 0.6319 0.6464 0.6319 0.7949
No log 11.0667 332 0.5776 0.5898 0.5776 0.7600
No log 11.1333 334 0.6075 0.6116 0.6075 0.7794
No log 11.2 336 0.6091 0.6290 0.6091 0.7804
No log 11.2667 338 0.5843 0.6560 0.5843 0.7644
No log 11.3333 340 0.6865 0.6151 0.6865 0.8285
No log 11.4 342 0.7601 0.5832 0.7601 0.8719
No log 11.4667 344 0.7710 0.5666 0.7710 0.8781
No log 11.5333 346 0.7507 0.5292 0.7507 0.8664
No log 11.6 348 0.7082 0.6082 0.7082 0.8416
No log 11.6667 350 0.6443 0.5964 0.6443 0.8027
No log 11.7333 352 0.6092 0.6167 0.6092 0.7805
No log 11.8 354 0.5982 0.6167 0.5982 0.7735
No log 11.8667 356 0.5987 0.6439 0.5987 0.7738
No log 11.9333 358 0.6145 0.6656 0.6145 0.7839
No log 12.0 360 0.5942 0.6301 0.5942 0.7708
No log 12.0667 362 0.5739 0.5819 0.5739 0.7576
No log 12.1333 364 0.5621 0.6256 0.5621 0.7497
No log 12.2 366 0.5525 0.6510 0.5525 0.7433
No log 12.2667 368 0.5611 0.6701 0.5611 0.7491
No log 12.3333 370 0.5780 0.6664 0.5780 0.7603
No log 12.4 372 0.5878 0.6562 0.5878 0.7667
No log 12.4667 374 0.5680 0.6701 0.5680 0.7537
No log 12.5333 376 0.5734 0.6597 0.5734 0.7572
No log 12.6 378 0.5872 0.6664 0.5872 0.7663
No log 12.6667 380 0.5897 0.6664 0.5897 0.7679
No log 12.7333 382 0.5784 0.6224 0.5784 0.7605
No log 12.8 384 0.5874 0.6721 0.5874 0.7664
No log 12.8667 386 0.5919 0.6368 0.5919 0.7693
No log 12.9333 388 0.5745 0.5841 0.5745 0.7580
No log 13.0 390 0.5841 0.6642 0.5841 0.7643
No log 13.0667 392 0.6321 0.6263 0.6321 0.7950
No log 13.1333 394 0.6215 0.6468 0.6215 0.7884
No log 13.2 396 0.5787 0.6748 0.5787 0.7607
No log 13.2667 398 0.5695 0.6358 0.5695 0.7547
No log 13.3333 400 0.5773 0.6174 0.5773 0.7598
No log 13.4 402 0.5775 0.6089 0.5775 0.7599
No log 13.4667 404 0.5771 0.6124 0.5771 0.7597
No log 13.5333 406 0.6219 0.6356 0.6219 0.7886
No log 13.6 408 0.6610 0.6443 0.6610 0.8130
No log 13.6667 410 0.6685 0.6443 0.6685 0.8176
No log 13.7333 412 0.6480 0.6519 0.6480 0.8050
No log 13.8 414 0.5927 0.6301 0.5927 0.7699
No log 13.8667 416 0.5613 0.6266 0.5613 0.7492
No log 13.9333 418 0.5340 0.6447 0.5340 0.7308
No log 14.0 420 0.5214 0.6622 0.5214 0.7221
No log 14.0667 422 0.5227 0.6484 0.5227 0.7230
No log 14.1333 424 0.5429 0.6597 0.5429 0.7368
No log 14.2 426 0.5942 0.6326 0.5942 0.7708
No log 14.2667 428 0.6274 0.6198 0.6274 0.7921
No log 14.3333 430 0.6255 0.5986 0.6255 0.7909
No log 14.4 432 0.6452 0.5346 0.6452 0.8032
No log 14.4667 434 0.6505 0.5442 0.6505 0.8065
No log 14.5333 436 0.6533 0.5975 0.6533 0.8083
No log 14.6 438 0.6356 0.6147 0.6356 0.7972
No log 14.6667 440 0.5948 0.6048 0.5948 0.7712
No log 14.7333 442 0.5762 0.6447 0.5762 0.7591
No log 14.8 444 0.5673 0.6482 0.5673 0.7532
No log 14.8667 446 0.5739 0.6581 0.5739 0.7576
No log 14.9333 448 0.6041 0.6238 0.6041 0.7772
No log 15.0 450 0.6067 0.6017 0.6067 0.7789
No log 15.0667 452 0.5975 0.5759 0.5975 0.7730
No log 15.1333 454 0.5948 0.5724 0.5948 0.7712
No log 15.2 456 0.5929 0.6124 0.5929 0.7700
No log 15.2667 458 0.6042 0.6254 0.6042 0.7773
No log 15.3333 460 0.6188 0.6431 0.6188 0.7867
No log 15.4 462 0.6369 0.6272 0.6369 0.7981
No log 15.4667 464 0.6451 0.5737 0.6451 0.8032
No log 15.5333 466 0.6562 0.5071 0.6562 0.8101
No log 15.6 468 0.6640 0.5274 0.6640 0.8148
No log 15.6667 470 0.6668 0.5498 0.6668 0.8166
No log 15.7333 472 0.6467 0.5498 0.6467 0.8042
No log 15.8 474 0.6183 0.5510 0.6183 0.7863
No log 15.8667 476 0.5985 0.5487 0.5985 0.7736
No log 15.9333 478 0.5919 0.6018 0.5919 0.7693
No log 16.0 480 0.6052 0.6278 0.6052 0.7779
No log 16.0667 482 0.6005 0.6278 0.6005 0.7749
No log 16.1333 484 0.5935 0.5487 0.5935 0.7704
No log 16.2 486 0.5900 0.5505 0.5900 0.7681
No log 16.2667 488 0.5915 0.5505 0.5915 0.7691
No log 16.3333 490 0.5857 0.5988 0.5857 0.7653
No log 16.4 492 0.5854 0.6089 0.5854 0.7651
No log 16.4667 494 0.5811 0.5945 0.5811 0.7623
No log 16.5333 496 0.6040 0.6119 0.6040 0.7772
No log 16.6 498 0.6709 0.6217 0.6709 0.8191
0.2543 16.6667 500 0.6956 0.6137 0.6956 0.8340
0.2543 16.7333 502 0.6299 0.6713 0.6299 0.7937
0.2543 16.8 504 0.5629 0.6423 0.5629 0.7503
0.2543 16.8667 506 0.5758 0.6854 0.5758 0.7588
0.2543 16.9333 508 0.5732 0.6854 0.5732 0.7571
0.2543 17.0 510 0.5590 0.6078 0.5590 0.7476
0.2543 17.0667 512 0.5964 0.6278 0.5964 0.7723
0.2543 17.1333 514 0.6247 0.5943 0.6247 0.7904
0.2543 17.2 516 0.6249 0.5902 0.6249 0.7905
0.2543 17.2667 518 0.6150 0.6247 0.6150 0.7842
0.2543 17.3333 520 0.6210 0.6709 0.6210 0.7881
0.2543 17.4 522 0.6255 0.6391 0.6255 0.7909
0.2543 17.4667 524 0.6266 0.6103 0.6266 0.7916
0.2543 17.5333 526 0.6213 0.5732 0.6213 0.7882
0.2543 17.6 528 0.6205 0.5587 0.6205 0.7877
0.2543 17.6667 530 0.6094 0.5166 0.6094 0.7806
0.2543 17.7333 532 0.5980 0.5403 0.5980 0.7733

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B parameters (F32 tensors, Safetensors format)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k6_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.