ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k6_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5574
  • QWK: 0.6205
  • MSE: 0.5574
  • RMSE: 0.7466
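The metrics above can be reproduced from predicted and gold ratings with a short pure-Python sketch. QWK (quadratic weighted kappa) is the usual agreement metric for ordinal essay scores; MSE and RMSE here are computed over the same integer ratings. The function names are illustrative, not from this repository.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK between integer ratings in 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix of (true, predicted) rating pairs
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Expected matrix under chance agreement (outer product of the marginals)
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
         for i in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, 1 at the extremes
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error over paired ratings."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Note that MSE equals the reported loss here, which suggests the model was trained with a mean-squared-error objective on the rating scale.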

Model description

More information needed

Intended uses & limitations

More information needed
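Although the intended use is undocumented, the checkpoint can be loaded like any sequence-classification model from the Hub. The sketch below is a hypothetical helper, not part of this repository; it assumes a single-output regression head (num_labels=1), which matches the MSE/RMSE metrics reported above, but the actual head type and score scale are not documented in this card.

```python
from typing import List

# Full Hub ID of this checkpoint
MODEL_ID = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
            "FineTuningAraBERT_run3_AugV5_k6_task5_organization")

def score_essays(essays: List[str], model_id: str = MODEL_ID) -> List[float]:
    """Return one predicted organization score per Arabic essay (assumed regression head)."""
    # Imports deferred so the sketch can be read without transformers installed.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()
    with torch.no_grad():
        batch = tokenizer(essays, padding=True, truncation=True, return_tensors="pt")
        logits = model(**batch).logits  # shape (batch, 1) if a regression head
    return logits.squeeze(-1).tolist()
```

If the head is in fact a classifier, the logits would instead need an argmax over the label dimension.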

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100 (the log below stops at epoch 18.8, so training appears to have ended early, e.g. via early stopping or checkpoint selection)
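The hyperparameters above, collected as a plain dict, plus a hypothetical helper showing how they would map onto transformers.TrainingArguments. The output_dir is an assumption; everything else is taken from the list above.

```python
# Hyperparameters as reported in this card
HYPERPARAMS = {
    "learning_rate": 2e-05,
    "train_batch_size": 8,
    "eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_epochs": 100,
}

def build_training_args(hp=HYPERPARAMS, output_dir="./output"):
    """Map the card's hyperparameters onto TrainingArguments (output_dir is assumed)."""
    # Deferred import so the dict is usable without transformers installed.
    from transformers import TrainingArguments
    return TrainingArguments(
        output_dir=output_dir,
        learning_rate=hp["learning_rate"],
        per_device_train_batch_size=hp["train_batch_size"],
        per_device_eval_batch_size=hp["eval_batch_size"],
        seed=hp["seed"],
        adam_beta1=hp["adam_beta1"],
        adam_beta2=hp["adam_beta2"],
        adam_epsilon=hp["adam_epsilon"],
        lr_scheduler_type=hp["lr_scheduler_type"],
        num_train_epochs=hp["num_epochs"],
    )
```

With these defaults the Trainer uses Adam(W) with the stated betas and epsilon and a linearly decaying learning-rate schedule.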

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0667 2 4.0844 0.0182 4.0844 2.0210
No log 0.1333 4 2.3922 0.0462 2.3922 1.5467
No log 0.2 6 1.6118 -0.0486 1.6118 1.2696
No log 0.2667 8 1.2688 0.1005 1.2688 1.1264
No log 0.3333 10 1.5654 0.0856 1.5654 1.2511
No log 0.4 12 1.0546 0.2351 1.0546 1.0269
No log 0.4667 14 0.9966 0.3056 0.9966 0.9983
No log 0.5333 16 0.9898 0.3386 0.9898 0.9949
No log 0.6 18 0.9138 0.3759 0.9138 0.9559
No log 0.6667 20 1.2085 0.2934 1.2085 1.0993
No log 0.7333 22 1.2815 0.2892 1.2815 1.1320
No log 0.8 24 1.3310 0.2806 1.3310 1.1537
No log 0.8667 26 0.9576 0.5342 0.9576 0.9786
No log 0.9333 28 0.7222 0.5804 0.7222 0.8498
No log 1.0 30 0.7976 0.5656 0.7976 0.8931
No log 1.0667 32 1.0321 0.5599 1.0321 1.0159
No log 1.1333 34 1.2858 0.4080 1.2858 1.1339
No log 1.2 36 0.7981 0.5801 0.7981 0.8934
No log 1.2667 38 0.7553 0.6632 0.7553 0.8691
No log 1.3333 40 0.8199 0.6322 0.8199 0.9055
No log 1.4 42 0.6936 0.6205 0.6936 0.8328
No log 1.4667 44 0.7081 0.6303 0.7081 0.8415
No log 1.5333 46 0.7408 0.6457 0.7408 0.8607
No log 1.6 48 0.9223 0.5641 0.9223 0.9604
No log 1.6667 50 0.8546 0.5853 0.8546 0.9245
No log 1.7333 52 0.7808 0.6084 0.7808 0.8836
No log 1.8 54 0.6947 0.6550 0.6947 0.8335
No log 1.8667 56 0.7209 0.6714 0.7209 0.8491
No log 1.9333 58 0.7613 0.6420 0.7613 0.8726
No log 2.0 60 0.6973 0.6410 0.6973 0.8350
No log 2.0667 62 0.6665 0.6602 0.6665 0.8164
No log 2.1333 64 0.7759 0.6388 0.7759 0.8809
No log 2.2 66 0.9086 0.5694 0.9086 0.9532
No log 2.2667 68 0.7962 0.6465 0.7962 0.8923
No log 2.3333 70 0.7326 0.6159 0.7326 0.8559
No log 2.4 72 0.6720 0.6613 0.6720 0.8198
No log 2.4667 74 0.6556 0.6521 0.6556 0.8097
No log 2.5333 76 0.5972 0.6429 0.5972 0.7728
No log 2.6 78 0.8167 0.6110 0.8167 0.9037
No log 2.6667 80 0.9703 0.5189 0.9703 0.9850
No log 2.7333 82 0.9274 0.5799 0.9274 0.9630
No log 2.8 84 0.6509 0.6763 0.6509 0.8068
No log 2.8667 86 0.6117 0.6590 0.6117 0.7821
No log 2.9333 88 0.6347 0.6392 0.6347 0.7967
No log 3.0 90 0.7227 0.6236 0.7227 0.8501
No log 3.0667 92 0.9444 0.5021 0.9444 0.9718
No log 3.1333 94 0.7926 0.5947 0.7926 0.8903
No log 3.2 96 0.7662 0.6182 0.7662 0.8753
No log 3.2667 98 0.8484 0.5819 0.8484 0.9211
No log 3.3333 100 0.9059 0.5536 0.9059 0.9518
No log 3.4 102 0.6689 0.6925 0.6689 0.8179
No log 3.4667 104 0.6180 0.6398 0.6180 0.7861
No log 3.5333 106 0.6187 0.6601 0.6187 0.7866
No log 3.6 108 0.6143 0.7136 0.6143 0.7838
No log 3.6667 110 0.8558 0.5455 0.8558 0.9251
No log 3.7333 112 0.9295 0.5172 0.9295 0.9641
No log 3.8 114 0.7353 0.6067 0.7353 0.8575
No log 3.8667 116 0.5780 0.6235 0.5780 0.7603
No log 3.9333 118 0.5920 0.6597 0.5920 0.7694
No log 4.0 120 0.5810 0.6553 0.5810 0.7622
No log 4.0667 122 0.6468 0.6394 0.6468 0.8043
No log 4.1333 124 0.7194 0.6227 0.7194 0.8481
No log 4.2 126 0.7009 0.5938 0.7009 0.8372
No log 4.2667 128 0.6064 0.7004 0.6064 0.7787
No log 4.3333 130 0.5730 0.7423 0.5730 0.7570
No log 4.4 132 0.5795 0.6246 0.5795 0.7612
No log 4.4667 134 0.7719 0.5678 0.7719 0.8786
No log 4.5333 136 0.8488 0.6172 0.8488 0.9213
No log 4.6 138 0.6394 0.6090 0.6394 0.7996
No log 4.6667 140 0.5894 0.6491 0.5894 0.7677
No log 4.7333 142 0.5934 0.6672 0.5934 0.7704
No log 4.8 144 0.5762 0.7210 0.5762 0.7591
No log 4.8667 146 0.5932 0.6644 0.5932 0.7702
No log 4.9333 148 0.5960 0.6311 0.5960 0.7720
No log 5.0 150 0.5788 0.7224 0.5788 0.7608
No log 5.0667 152 0.5892 0.6713 0.5892 0.7676
No log 5.1333 154 0.6109 0.6265 0.6109 0.7816
No log 5.2 156 0.6662 0.6253 0.6662 0.8162
No log 5.2667 158 0.6850 0.5870 0.6850 0.8277
No log 5.3333 160 0.6842 0.6038 0.6842 0.8272
No log 5.4 162 0.6553 0.6013 0.6553 0.8095
No log 5.4667 164 0.6044 0.7126 0.6044 0.7775
No log 5.5333 166 0.6464 0.6337 0.6464 0.8040
No log 5.6 168 0.6240 0.6459 0.6240 0.7899
No log 5.6667 170 0.6124 0.6288 0.6124 0.7826
No log 5.7333 172 0.6353 0.6004 0.6353 0.7971
No log 5.8 174 0.8391 0.5814 0.8391 0.9160
No log 5.8667 176 0.9554 0.5186 0.9554 0.9774
No log 5.9333 178 0.7599 0.5711 0.7599 0.8717
No log 6.0 180 0.6000 0.5694 0.6000 0.7746
No log 6.0667 182 0.5818 0.6317 0.5818 0.7628
No log 6.1333 184 0.5790 0.6312 0.5790 0.7609
No log 6.2 186 0.6360 0.5467 0.6360 0.7975
No log 6.2667 188 0.7329 0.6212 0.7329 0.8561
No log 6.3333 190 0.6497 0.5578 0.6497 0.8060
No log 6.4 192 0.6333 0.5669 0.6333 0.7958
No log 6.4667 194 0.7175 0.6296 0.7175 0.8471
No log 6.5333 196 0.7373 0.5809 0.7373 0.8587
No log 6.6 198 0.6541 0.6269 0.6541 0.8088
No log 6.6667 200 0.6268 0.6788 0.6268 0.7917
No log 6.7333 202 0.6087 0.6518 0.6087 0.7802
No log 6.8 204 0.6032 0.6134 0.6032 0.7766
No log 6.8667 206 0.6175 0.5926 0.6175 0.7858
No log 6.9333 208 0.6465 0.6144 0.6465 0.8041
No log 7.0 210 0.6594 0.6144 0.6594 0.8120
No log 7.0667 212 0.6373 0.6281 0.6373 0.7983
No log 7.1333 214 0.6575 0.5826 0.6575 0.8109
No log 7.2 216 0.6986 0.5347 0.6986 0.8358
No log 7.2667 218 0.6599 0.5463 0.6599 0.8124
No log 7.3333 220 0.6576 0.5917 0.6576 0.8109
No log 7.4 222 0.6474 0.6028 0.6474 0.8046
No log 7.4667 224 0.6310 0.5505 0.6310 0.7944
No log 7.5333 226 0.6347 0.6078 0.6347 0.7967
No log 7.6 228 0.6820 0.5455 0.6820 0.8258
No log 7.6667 230 0.6149 0.6383 0.6149 0.7842
No log 7.7333 232 0.6208 0.5872 0.6208 0.7879
No log 7.8 234 0.6657 0.6556 0.6657 0.8159
No log 7.8667 236 0.5736 0.6656 0.5736 0.7574
No log 7.9333 238 0.5863 0.5662 0.5863 0.7657
No log 8.0 240 0.6791 0.5578 0.6791 0.8240
No log 8.0667 242 0.6402 0.5770 0.6402 0.8001
No log 8.1333 244 0.5613 0.5961 0.5613 0.7492
No log 8.2 246 0.5682 0.6745 0.5682 0.7538
No log 8.2667 248 0.5852 0.5833 0.5852 0.7650
No log 8.3333 250 0.5899 0.6230 0.5899 0.7680
No log 8.4 252 0.6318 0.5450 0.6318 0.7949
No log 8.4667 254 0.6593 0.5560 0.6593 0.8120
No log 8.5333 256 0.6378 0.5770 0.6378 0.7986
No log 8.6 258 0.5995 0.6164 0.5995 0.7743
No log 8.6667 260 0.6155 0.6455 0.6155 0.7845
No log 8.7333 262 0.6000 0.6924 0.6000 0.7746
No log 8.8 264 0.6107 0.6311 0.6107 0.7815
No log 8.8667 266 0.6447 0.5748 0.6447 0.8030
No log 8.9333 268 0.6421 0.5887 0.6421 0.8013
No log 9.0 270 0.5957 0.6272 0.5957 0.7718
No log 9.0667 272 0.6071 0.6370 0.6071 0.7791
No log 9.1333 274 0.6017 0.6175 0.6017 0.7757
No log 9.2 276 0.5746 0.6272 0.5746 0.7580
No log 9.2667 278 0.5689 0.6649 0.5689 0.7543
No log 9.3333 280 0.5991 0.6288 0.5991 0.7740
No log 9.4 282 0.5964 0.6060 0.5964 0.7723
No log 9.4667 284 0.5788 0.7034 0.5788 0.7608
No log 9.5333 286 0.5917 0.6916 0.5917 0.7692
No log 9.6 288 0.5826 0.6345 0.5826 0.7633
No log 9.6667 290 0.5769 0.6317 0.5769 0.7595
No log 9.7333 292 0.5849 0.6312 0.5849 0.7648
No log 9.8 294 0.6691 0.5928 0.6691 0.8180
No log 9.8667 296 0.6913 0.5914 0.6913 0.8314
No log 9.9333 298 0.5963 0.6835 0.5963 0.7722
No log 10.0 300 0.5643 0.7095 0.5643 0.7512
No log 10.0667 302 0.5720 0.6704 0.5720 0.7563
No log 10.1333 304 0.5491 0.6317 0.5491 0.7410
No log 10.2 306 0.5453 0.6437 0.5453 0.7384
No log 10.2667 308 0.5552 0.6322 0.5552 0.7451
No log 10.3333 310 0.5664 0.6215 0.5664 0.7526
No log 10.4 312 0.5648 0.6411 0.5648 0.7515
No log 10.4667 314 0.6130 0.6072 0.6130 0.7829
No log 10.5333 316 0.6084 0.6072 0.6084 0.7800
No log 10.6 318 0.5844 0.6386 0.5844 0.7645
No log 10.6667 320 0.5452 0.6195 0.5452 0.7384
No log 10.7333 322 0.5450 0.7025 0.5450 0.7382
No log 10.8 324 0.5521 0.5999 0.5521 0.7430
No log 10.8667 326 0.6336 0.6308 0.6336 0.7960
No log 10.9333 328 0.7304 0.6389 0.7304 0.8546
No log 11.0 330 0.6984 0.6550 0.6984 0.8357
No log 11.0667 332 0.5793 0.6545 0.5793 0.7611
No log 11.1333 334 0.5756 0.6528 0.5756 0.7587
No log 11.2 336 0.6900 0.6476 0.6900 0.8307
No log 11.2667 338 0.7173 0.6336 0.7173 0.8469
No log 11.3333 340 0.6375 0.6500 0.6375 0.7984
No log 11.4 342 0.5578 0.6517 0.5578 0.7469
No log 11.4667 344 0.5447 0.6822 0.5447 0.7380
No log 11.5333 346 0.5415 0.6947 0.5415 0.7359
No log 11.6 348 0.5393 0.6658 0.5393 0.7344
No log 11.6667 350 0.5457 0.6619 0.5457 0.7387
No log 11.7333 352 0.5865 0.6623 0.5865 0.7658
No log 11.8 354 0.6186 0.6491 0.6186 0.7865
No log 11.8667 356 0.5897 0.6588 0.5897 0.7679
No log 11.9333 358 0.5540 0.6704 0.5540 0.7443
No log 12.0 360 0.5475 0.6564 0.5475 0.7399
No log 12.0667 362 0.5349 0.6748 0.5349 0.7314
No log 12.1333 364 0.5230 0.6614 0.5230 0.7232
No log 12.2 366 0.5209 0.6518 0.5209 0.7217
No log 12.2667 368 0.5208 0.6966 0.5208 0.7217
No log 12.3333 370 0.5216 0.6966 0.5216 0.7222
No log 12.4 372 0.5408 0.6830 0.5408 0.7354
No log 12.4667 374 0.5364 0.6796 0.5364 0.7324
No log 12.5333 376 0.5517 0.6748 0.5517 0.7428
No log 12.6 378 0.5481 0.6896 0.5481 0.7403
No log 12.6667 380 0.5545 0.6931 0.5545 0.7446
No log 12.7333 382 0.5548 0.6246 0.5548 0.7448
No log 12.8 384 0.5669 0.5982 0.5669 0.7529
No log 12.8667 386 0.5609 0.5988 0.5609 0.7489
No log 12.9333 388 0.5426 0.6400 0.5426 0.7366
No log 13.0 390 0.5299 0.7095 0.5299 0.7279
No log 13.0667 392 0.5257 0.6995 0.5257 0.7250
No log 13.1333 394 0.5754 0.6262 0.5754 0.7585
No log 13.2 396 0.6364 0.6380 0.6364 0.7978
No log 13.2667 398 0.6928 0.6488 0.6928 0.8323
No log 13.3333 400 0.6407 0.6214 0.6407 0.8005
No log 13.4 402 0.5522 0.6507 0.5522 0.7431
No log 13.4667 404 0.5447 0.6880 0.5447 0.7381
No log 13.5333 406 0.6199 0.6768 0.6199 0.7873
No log 13.6 408 0.6347 0.6768 0.6347 0.7967
No log 13.6667 410 0.5800 0.6764 0.5800 0.7616
No log 13.7333 412 0.5484 0.6087 0.5484 0.7405
No log 13.8 414 0.5660 0.5977 0.5660 0.7523
No log 13.8667 416 0.5788 0.5894 0.5788 0.7608
No log 13.9333 418 0.5589 0.6087 0.5589 0.7476
No log 14.0 420 0.5315 0.6649 0.5315 0.7290
No log 14.0667 422 0.5222 0.6780 0.5222 0.7227
No log 14.1333 424 0.5181 0.6780 0.5181 0.7198
No log 14.2 426 0.5148 0.6947 0.5148 0.7175
No log 14.2667 428 0.5391 0.6507 0.5391 0.7342
No log 14.3333 430 0.5906 0.6638 0.5906 0.7685
No log 14.4 432 0.5873 0.6004 0.5873 0.7663
No log 14.4667 434 0.5514 0.6611 0.5514 0.7426
No log 14.5333 436 0.5382 0.6611 0.5382 0.7336
No log 14.6 438 0.5306 0.6985 0.5306 0.7284
No log 14.6667 440 0.5319 0.7165 0.5319 0.7293
No log 14.7333 442 0.5471 0.6611 0.5471 0.7397
No log 14.8 444 0.5564 0.6611 0.5564 0.7459
No log 14.8667 446 0.5643 0.6507 0.5643 0.7512
No log 14.9333 448 0.5369 0.6526 0.5369 0.7327
No log 15.0 450 0.5361 0.6690 0.5361 0.7322
No log 15.0667 452 0.5308 0.6919 0.5308 0.7285
No log 15.1333 454 0.5403 0.6312 0.5403 0.7351
No log 15.2 456 0.5343 0.6497 0.5343 0.7309
No log 15.2667 458 0.5231 0.6770 0.5231 0.7233
No log 15.3333 460 0.5421 0.6424 0.5421 0.7363
No log 15.4 462 0.5512 0.6460 0.5512 0.7424
No log 15.4667 464 0.5450 0.6207 0.5450 0.7382
No log 15.5333 466 0.5398 0.6526 0.5398 0.7347
No log 15.6 468 0.5359 0.6940 0.5359 0.7321
No log 15.6667 470 0.5508 0.6962 0.5508 0.7421
No log 15.7333 472 0.5669 0.7057 0.5669 0.7529
No log 15.8 474 0.5647 0.6894 0.5647 0.7515
No log 15.8667 476 0.5906 0.6845 0.5906 0.7685
No log 15.9333 478 0.5799 0.6845 0.5799 0.7615
No log 16.0 480 0.5364 0.6894 0.5364 0.7324
No log 16.0667 482 0.5014 0.7003 0.5014 0.7081
No log 16.1333 484 0.5058 0.6327 0.5058 0.7112
No log 16.2 486 0.5158 0.6327 0.5158 0.7182
No log 16.2667 488 0.5197 0.6327 0.5197 0.7209
No log 16.3333 490 0.5205 0.6437 0.5205 0.7214
No log 16.4 492 0.5219 0.6733 0.5219 0.7224
No log 16.4667 494 0.5176 0.6327 0.5176 0.7194
No log 16.5333 496 0.5207 0.6327 0.5207 0.7216
No log 16.6 498 0.5225 0.6327 0.5225 0.7229
0.2208 16.6667 500 0.5265 0.6327 0.5265 0.7256
0.2208 16.7333 502 0.5252 0.6327 0.5252 0.7247
0.2208 16.8 504 0.5288 0.6427 0.5288 0.7272
0.2208 16.8667 506 0.5166 0.6919 0.5166 0.7188
0.2208 16.9333 508 0.5070 0.6830 0.5070 0.7120
0.2208 17.0 510 0.5032 0.6830 0.5032 0.7093
0.2208 17.0667 512 0.5030 0.6830 0.5030 0.7092
0.2208 17.1333 514 0.5036 0.6830 0.5036 0.7097
0.2208 17.2 516 0.5085 0.6830 0.5085 0.7131
0.2208 17.2667 518 0.5106 0.6729 0.5106 0.7145
0.2208 17.3333 520 0.5241 0.6217 0.5241 0.7240
0.2208 17.4 522 0.5252 0.6619 0.5252 0.7247
0.2208 17.4667 524 0.5256 0.6796 0.5256 0.7250
0.2208 17.5333 526 0.5369 0.7273 0.5369 0.7328
0.2208 17.6 528 0.5663 0.6545 0.5663 0.7525
0.2208 17.6667 530 0.5977 0.5806 0.5977 0.7731
0.2208 17.7333 532 0.5963 0.5581 0.5963 0.7722
0.2208 17.8 534 0.5743 0.6139 0.5743 0.7578
0.2208 17.8667 536 0.5760 0.6460 0.5760 0.7590
0.2208 17.9333 538 0.5889 0.6593 0.5889 0.7674
0.2208 18.0 540 0.5756 0.6780 0.5756 0.7587
0.2208 18.0667 542 0.5682 0.6589 0.5682 0.7538
0.2208 18.1333 544 0.5712 0.6049 0.5712 0.7558
0.2208 18.2 546 0.5819 0.6078 0.5819 0.7628
0.2208 18.2667 548 0.5799 0.6078 0.5799 0.7615
0.2208 18.3333 550 0.5708 0.6049 0.5708 0.7555
0.2208 18.4 552 0.5652 0.6951 0.5652 0.7518
0.2208 18.4667 554 0.5613 0.6680 0.5613 0.7492
0.2208 18.5333 556 0.5620 0.6175 0.5620 0.7497
0.2208 18.6 558 0.5698 0.6076 0.5698 0.7548
0.2208 18.6667 560 0.5667 0.6317 0.5667 0.7528
0.2208 18.7333 562 0.5639 0.6217 0.5639 0.7509
0.2208 18.8 564 0.5574 0.6205 0.5574 0.7466

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • ~0.1B parameters (safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k6_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02.