ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the fine-tuning dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.6472
  • Qwk (quadratic weighted kappa): 0.5894
  • Mse (mean squared error): 0.6472
  • Rmse (root mean squared error): 0.8045

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
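The hyperparameters above correspond to a standard Hugging Face Trainer setup. A sketch of the equivalent TrainingArguments, assuming the default Trainer workflow; the output directory is a hypothetical name, not taken from the card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam settings from the card; these match the optimizer defaults.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```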

Training results

(“No log” in the Training Loss column means no training loss had been logged yet at that step; the first logged value, 0.2459, appears at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1 2 3.7769 0.0069 3.7769 1.9434
No log 0.2 4 1.9099 0.0947 1.9099 1.3820
No log 0.3 6 1.2783 0.0232 1.2783 1.1306
No log 0.4 8 1.1438 0.1304 1.1438 1.0695
No log 0.5 10 1.1731 0.1426 1.1731 1.0831
No log 0.6 12 1.0822 0.1767 1.0822 1.0403
No log 0.7 14 1.1970 0.1805 1.1970 1.0941
No log 0.8 16 0.9984 0.3059 0.9984 0.9992
No log 0.9 18 0.8840 0.3717 0.8840 0.9402
No log 1.0 20 0.8716 0.3974 0.8716 0.9336
No log 1.1 22 0.8280 0.3840 0.8280 0.9099
No log 1.2 24 0.8167 0.4057 0.8167 0.9037
No log 1.3 26 0.9736 0.3284 0.9736 0.9867
No log 1.4 28 0.8766 0.4157 0.8766 0.9363
No log 1.5 30 0.7181 0.5261 0.7181 0.8474
No log 1.6 32 0.8182 0.5340 0.8182 0.9045
No log 1.7 34 0.8326 0.5611 0.8326 0.9125
No log 1.8 36 0.9422 0.5382 0.9422 0.9707
No log 1.9 38 1.4414 0.3795 1.4414 1.2006
No log 2.0 40 1.3435 0.4413 1.3435 1.1591
No log 2.1 42 0.8941 0.5928 0.8941 0.9456
No log 2.2 44 0.7843 0.6121 0.7843 0.8856
No log 2.3 46 0.9402 0.4760 0.9402 0.9696
No log 2.4 48 0.8523 0.5228 0.8523 0.9232
No log 2.5 50 0.7128 0.6108 0.7128 0.8443
No log 2.6 52 0.7413 0.6453 0.7413 0.8610
No log 2.7 54 0.9427 0.5782 0.9427 0.9710
No log 2.8 56 1.0188 0.5781 1.0188 1.0094
No log 2.9 58 0.9963 0.6176 0.9963 0.9981
No log 3.0 60 0.8638 0.6269 0.8638 0.9294
No log 3.1 62 0.6694 0.6744 0.6694 0.8182
No log 3.2 64 0.6703 0.6116 0.6703 0.8187
No log 3.3 66 0.7023 0.5778 0.7023 0.8380
No log 3.4 68 0.7183 0.6627 0.7183 0.8475
No log 3.5 70 0.6212 0.5830 0.6212 0.7881
No log 3.6 72 0.6553 0.6763 0.6553 0.8095
No log 3.7 74 0.8099 0.5536 0.8099 0.9000
No log 3.8 76 0.8452 0.5621 0.8452 0.9194
No log 3.9 78 0.6931 0.6245 0.6931 0.8325
No log 4.0 80 0.6525 0.6835 0.6525 0.8078
No log 4.1 82 0.8345 0.4592 0.8345 0.9135
No log 4.2 84 0.9012 0.4036 0.9012 0.9493
No log 4.3 86 0.8101 0.4273 0.8101 0.9000
No log 4.4 88 0.6962 0.5522 0.6962 0.8344
No log 4.5 90 0.6702 0.5659 0.6702 0.8187
No log 4.6 92 0.7080 0.5782 0.7080 0.8414
No log 4.7 94 0.6887 0.6164 0.6887 0.8299
No log 4.8 96 0.6750 0.6535 0.6750 0.8216
No log 4.9 98 0.7595 0.6353 0.7595 0.8715
No log 5.0 100 0.8984 0.5509 0.8984 0.9478
No log 5.1 102 0.9052 0.5344 0.9052 0.9514
No log 5.2 104 0.8066 0.6002 0.8066 0.8981
No log 5.3 106 0.8284 0.5726 0.8284 0.9102
No log 5.4 108 0.8326 0.6170 0.8326 0.9125
No log 5.5 110 0.7939 0.6220 0.7939 0.8910
No log 5.6 112 0.7821 0.6121 0.7821 0.8844
No log 5.7 114 0.7942 0.5579 0.7942 0.8912
No log 5.8 116 0.7088 0.6157 0.7088 0.8419
No log 5.9 118 0.7639 0.5815 0.7639 0.8740
No log 6.0 120 0.8316 0.5407 0.8316 0.9119
No log 6.1 122 0.8221 0.5407 0.8221 0.9067
No log 6.2 124 0.7255 0.6005 0.7255 0.8518
No log 6.3 126 0.8249 0.5105 0.8249 0.9082
No log 6.4 128 0.9458 0.4693 0.9458 0.9725
No log 6.5 130 0.8734 0.4790 0.8734 0.9346
No log 6.6 132 0.7335 0.6006 0.7335 0.8564
No log 6.7 134 0.7279 0.5726 0.7279 0.8531
No log 6.8 136 0.7399 0.5770 0.7399 0.8602
No log 6.9 138 0.7193 0.6319 0.7193 0.8481
No log 7.0 140 0.7223 0.6352 0.7223 0.8499
No log 7.1 142 0.7312 0.6711 0.7312 0.8551
No log 7.2 144 0.7082 0.6746 0.7082 0.8416
No log 7.3 146 0.6828 0.6746 0.6828 0.8263
No log 7.4 148 0.6904 0.6828 0.6904 0.8309
No log 7.5 150 0.6505 0.6721 0.6505 0.8065
No log 7.6 152 0.6185 0.6642 0.6185 0.7865
No log 7.7 154 0.6277 0.6695 0.6277 0.7923
No log 7.8 156 0.6281 0.6875 0.6281 0.7925
No log 7.9 158 0.6489 0.6893 0.6489 0.8055
No log 8.0 160 0.6865 0.6615 0.6865 0.8286
No log 8.1 162 0.7650 0.6065 0.7650 0.8747
No log 8.2 164 0.7781 0.5808 0.7781 0.8821
No log 8.3 166 0.7112 0.6115 0.7112 0.8433
No log 8.4 168 0.6939 0.5872 0.6939 0.8330
No log 8.5 170 0.6801 0.5963 0.6801 0.8247
No log 8.6 172 0.6844 0.6280 0.6844 0.8273
No log 8.7 174 0.7092 0.6404 0.7092 0.8421
No log 8.8 176 0.7956 0.5998 0.7956 0.8919
No log 8.9 178 0.8018 0.6089 0.8018 0.8954
No log 9.0 180 0.7489 0.6041 0.7489 0.8654
No log 9.1 182 0.7338 0.5756 0.7338 0.8566
No log 9.2 184 0.7464 0.6032 0.7464 0.8640
No log 9.3 186 0.7339 0.5806 0.7339 0.8567
No log 9.4 188 0.7270 0.6254 0.7270 0.8526
No log 9.5 190 0.7027 0.5905 0.7027 0.8382
No log 9.6 192 0.6568 0.6780 0.6568 0.8104
No log 9.7 194 0.6672 0.6284 0.6672 0.8168
No log 9.8 196 0.6910 0.6315 0.6910 0.8312
No log 9.9 198 0.6920 0.6169 0.6920 0.8319
No log 10.0 200 0.6701 0.6196 0.6701 0.8186
No log 10.1 202 0.6502 0.6058 0.6502 0.8063
No log 10.2 204 0.6478 0.6039 0.6478 0.8049
No log 10.3 206 0.6514 0.5971 0.6514 0.8071
No log 10.4 208 0.6470 0.6398 0.6470 0.8043
No log 10.5 210 0.6638 0.6545 0.6638 0.8148
No log 10.6 212 0.6893 0.6050 0.6893 0.8303
No log 10.7 214 0.6826 0.5869 0.6826 0.8262
No log 10.8 216 0.7054 0.5963 0.7054 0.8399
No log 10.9 218 0.7454 0.6008 0.7454 0.8634
No log 11.0 220 0.7279 0.5838 0.7279 0.8532
No log 11.1 222 0.7281 0.6008 0.7281 0.8533
No log 11.2 224 0.6954 0.5869 0.6954 0.8339
No log 11.3 226 0.6924 0.6260 0.6924 0.8321
No log 11.4 228 0.6930 0.6408 0.6930 0.8325
No log 11.5 230 0.7004 0.6315 0.7004 0.8369
No log 11.6 232 0.6767 0.6456 0.6767 0.8226
No log 11.7 234 0.7086 0.5787 0.7086 0.8418
No log 11.8 236 0.7830 0.5370 0.7830 0.8849
No log 11.9 238 0.7646 0.5739 0.7646 0.8744
No log 12.0 240 0.6881 0.6269 0.6881 0.8295
No log 12.1 242 0.6654 0.6288 0.6654 0.8157
No log 12.2 244 0.7529 0.6148 0.7529 0.8677
No log 12.3 246 0.8056 0.5985 0.8056 0.8976
No log 12.4 248 0.7842 0.6177 0.7842 0.8856
No log 12.5 250 0.7480 0.6271 0.7480 0.8649
No log 12.6 252 0.7014 0.6553 0.7014 0.8375
No log 12.7 254 0.6692 0.6500 0.6692 0.8181
No log 12.8 256 0.6926 0.6187 0.6926 0.8322
No log 12.9 258 0.7605 0.5948 0.7605 0.8721
No log 13.0 260 0.7409 0.6331 0.7409 0.8607
No log 13.1 262 0.6828 0.5720 0.6828 0.8263
No log 13.2 264 0.6750 0.5934 0.6750 0.8216
No log 13.3 266 0.6914 0.6060 0.6914 0.8315
No log 13.4 268 0.6862 0.6060 0.6862 0.8284
No log 13.5 270 0.6749 0.6407 0.6749 0.8215
No log 13.6 272 0.7177 0.6066 0.7177 0.8472
No log 13.7 274 0.7378 0.6520 0.7378 0.8589
No log 13.8 276 0.6824 0.6724 0.6824 0.8261
No log 13.9 278 0.6789 0.5729 0.6789 0.8240
No log 14.0 280 0.6593 0.6335 0.6593 0.8120
No log 14.1 282 0.6426 0.6753 0.6426 0.8016
No log 14.2 284 0.6564 0.6610 0.6564 0.8102
No log 14.3 286 0.6839 0.6764 0.6839 0.8270
No log 14.4 288 0.7129 0.5679 0.7129 0.8444
No log 14.5 290 0.6812 0.6176 0.6812 0.8254
No log 14.6 292 0.6489 0.6680 0.6489 0.8055
No log 14.7 294 0.6370 0.6680 0.6370 0.7981
No log 14.8 296 0.6323 0.6622 0.6323 0.7952
No log 14.9 298 0.6302 0.6622 0.6302 0.7939
No log 15.0 300 0.6289 0.6622 0.6289 0.7931
No log 15.1 302 0.6319 0.6447 0.6319 0.7949
No log 15.2 304 0.6398 0.6664 0.6398 0.7999
No log 15.3 306 0.6604 0.6721 0.6604 0.8126
No log 15.4 308 0.6690 0.6813 0.6690 0.8179
No log 15.5 310 0.6502 0.6430 0.6502 0.8063
No log 15.6 312 0.6428 0.6333 0.6428 0.8017
No log 15.7 314 0.6428 0.5763 0.6428 0.8018
No log 15.8 316 0.6502 0.5675 0.6502 0.8063
No log 15.9 318 0.6517 0.5859 0.6517 0.8072
No log 16.0 320 0.6472 0.6142 0.6472 0.8045
No log 16.1 322 0.6696 0.6374 0.6696 0.8183
No log 16.2 324 0.6808 0.6703 0.6808 0.8251
No log 16.3 326 0.6378 0.6775 0.6378 0.7986
No log 16.4 328 0.5859 0.6814 0.5859 0.7654
No log 16.5 330 0.5813 0.6762 0.5813 0.7624
No log 16.6 332 0.5942 0.7049 0.5942 0.7708
No log 16.7 334 0.5959 0.7049 0.5959 0.7720
No log 16.8 336 0.5859 0.6822 0.5859 0.7654
No log 16.9 338 0.6282 0.6731 0.6282 0.7926
No log 17.0 340 0.6685 0.6371 0.6685 0.8176
No log 17.1 342 0.6641 0.6632 0.6641 0.8149
No log 17.2 344 0.6225 0.6672 0.6225 0.7890
No log 17.3 346 0.6103 0.6610 0.6103 0.7812
No log 17.4 348 0.6217 0.6788 0.6217 0.7885
No log 17.5 350 0.6315 0.6333 0.6315 0.7947
No log 17.6 352 0.6682 0.6721 0.6682 0.8174
No log 17.7 354 0.6760 0.6684 0.6760 0.8222
No log 17.8 356 0.6638 0.6888 0.6638 0.8147
No log 17.9 358 0.6596 0.6986 0.6596 0.8121
No log 18.0 360 0.6427 0.6828 0.6427 0.8017
No log 18.1 362 0.6241 0.7110 0.6241 0.7900
No log 18.2 364 0.6084 0.6207 0.6084 0.7800
No log 18.3 366 0.5868 0.6616 0.5868 0.7660
No log 18.4 368 0.5801 0.6575 0.5801 0.7616
No log 18.5 370 0.5655 0.6498 0.5655 0.7520
No log 18.6 372 0.5551 0.6690 0.5551 0.7450
No log 18.7 374 0.5879 0.6955 0.5879 0.7667
No log 18.8 376 0.6148 0.6552 0.6148 0.7841
No log 18.9 378 0.6061 0.6783 0.6061 0.7785
No log 19.0 380 0.6002 0.6835 0.6002 0.7747
No log 19.1 382 0.6106 0.6995 0.6106 0.7814
No log 19.2 384 0.6135 0.6951 0.6135 0.7833
No log 19.3 386 0.6250 0.6356 0.6250 0.7905
No log 19.4 388 0.6459 0.6455 0.6459 0.8037
No log 19.5 390 0.6890 0.6018 0.6890 0.8301
No log 19.6 392 0.6930 0.5905 0.6930 0.8324
No log 19.7 394 0.6688 0.6374 0.6688 0.8178
No log 19.8 396 0.6714 0.5830 0.6714 0.8194
No log 19.9 398 0.6941 0.5327 0.6941 0.8331
No log 20.0 400 0.7056 0.5524 0.7056 0.8400
No log 20.1 402 0.6991 0.6431 0.6991 0.8361
No log 20.2 404 0.7125 0.6239 0.7125 0.8441
No log 20.3 406 0.7126 0.6363 0.7126 0.8442
No log 20.4 408 0.7066 0.6620 0.7066 0.8406
No log 20.5 410 0.7106 0.6249 0.7106 0.8430
No log 20.6 412 0.6810 0.6194 0.6810 0.8252
No log 20.7 414 0.6522 0.6237 0.6522 0.8076
No log 20.8 416 0.6436 0.6398 0.6436 0.8023
No log 20.9 418 0.6384 0.6500 0.6384 0.7990
No log 21.0 420 0.6352 0.6610 0.6352 0.7970
No log 21.1 422 0.6369 0.6196 0.6369 0.7981
No log 21.2 424 0.6414 0.6244 0.6414 0.8009
No log 21.3 426 0.6510 0.6111 0.6510 0.8069
No log 21.4 428 0.6605 0.6206 0.6605 0.8127
No log 21.5 430 0.6436 0.6206 0.6436 0.8022
No log 21.6 432 0.6244 0.6297 0.6244 0.7902
No log 21.7 434 0.6182 0.6364 0.6182 0.7862
No log 21.8 436 0.6207 0.6398 0.6207 0.7878
No log 21.9 438 0.6432 0.6758 0.6432 0.8020
No log 22.0 440 0.6665 0.6664 0.6665 0.8164
No log 22.1 442 0.6791 0.6821 0.6791 0.8241
No log 22.2 444 0.6581 0.6476 0.6581 0.8112
No log 22.3 446 0.6463 0.6476 0.6463 0.8040
No log 22.4 448 0.6386 0.6306 0.6386 0.7991
No log 22.5 450 0.6278 0.6407 0.6278 0.7924
No log 22.6 452 0.6281 0.6720 0.6281 0.7926
No log 22.7 454 0.6432 0.6380 0.6432 0.8020
No log 22.8 456 0.6350 0.5880 0.6350 0.7968
No log 22.9 458 0.6267 0.6197 0.6267 0.7916
No log 23.0 460 0.6225 0.6610 0.6225 0.7890
No log 23.1 462 0.6290 0.6407 0.6290 0.7931
No log 23.2 464 0.6406 0.6482 0.6406 0.8003
No log 23.3 466 0.6544 0.6414 0.6544 0.8089
No log 23.4 468 0.6675 0.6151 0.6675 0.8170
No log 23.5 470 0.6720 0.6352 0.6720 0.8197
No log 23.6 472 0.6654 0.6352 0.6654 0.8157
No log 23.7 474 0.6574 0.6352 0.6574 0.8108
No log 23.8 476 0.6487 0.6215 0.6487 0.8054
No log 23.9 478 0.6379 0.6447 0.6379 0.7987
No log 24.0 480 0.6365 0.6316 0.6365 0.7978
No log 24.1 482 0.6385 0.6325 0.6385 0.7991
No log 24.2 484 0.6427 0.6499 0.6427 0.8017
No log 24.3 486 0.6476 0.6772 0.6476 0.8047
No log 24.4 488 0.6511 0.6869 0.6511 0.8069
No log 24.5 490 0.6505 0.6508 0.6505 0.8066
No log 24.6 492 0.6402 0.6737 0.6402 0.8001
No log 24.7 494 0.6289 0.6364 0.6289 0.7930
No log 24.8 496 0.6267 0.6039 0.6267 0.7916
No log 24.9 498 0.6228 0.6039 0.6228 0.7892
0.2459 25.0 500 0.6231 0.6011 0.6231 0.7894
0.2459 25.1 502 0.6207 0.6039 0.6207 0.7879
0.2459 25.2 504 0.6139 0.6479 0.6139 0.7835
0.2459 25.3 506 0.6108 0.6479 0.6108 0.7815
0.2459 25.4 508 0.6147 0.6479 0.6147 0.7840
0.2459 25.5 510 0.6228 0.6680 0.6228 0.7892
0.2459 25.6 512 0.6265 0.6680 0.6265 0.7915
0.2459 25.7 514 0.6285 0.6680 0.6285 0.7928
0.2459 25.8 516 0.6293 0.6572 0.6293 0.7933
0.2459 25.9 518 0.6193 0.6433 0.6193 0.7870
0.2459 26.0 520 0.6103 0.6584 0.6103 0.7812
0.2459 26.1 522 0.6112 0.6509 0.6112 0.7818
0.2459 26.2 524 0.6237 0.6435 0.6237 0.7898
0.2459 26.3 526 0.6203 0.6969 0.6203 0.7876
0.2459 26.4 528 0.6119 0.6689 0.6119 0.7822
0.2459 26.5 530 0.6046 0.6589 0.6046 0.7775
0.2459 26.6 532 0.5918 0.6689 0.5918 0.7693
0.2459 26.7 534 0.5873 0.6572 0.5873 0.7664
0.2459 26.8 536 0.5925 0.6740 0.5925 0.7698
0.2459 26.9 538 0.5867 0.6572 0.5867 0.7659
0.2459 27.0 540 0.5842 0.6374 0.5842 0.7643
0.2459 27.1 542 0.5896 0.6409 0.5896 0.7679
0.2459 27.2 544 0.5910 0.6407 0.5910 0.7688
0.2459 27.3 546 0.5945 0.6374 0.5945 0.7710
0.2459 27.4 548 0.6050 0.6473 0.6050 0.7778
0.2459 27.5 550 0.6250 0.6736 0.6250 0.7906
0.2459 27.6 552 0.6299 0.6644 0.6299 0.7937
0.2459 27.7 554 0.6251 0.6861 0.6251 0.7906
0.2459 27.8 556 0.6062 0.6861 0.6062 0.7786
0.2459 27.9 558 0.5937 0.6830 0.5937 0.7705
0.2459 28.0 560 0.5946 0.6610 0.5946 0.7711
0.2459 28.1 562 0.6097 0.6430 0.6097 0.7808
0.2459 28.2 564 0.6180 0.6430 0.6180 0.7862
0.2459 28.3 566 0.6049 0.6430 0.6049 0.7777
0.2459 28.4 568 0.5960 0.6610 0.5960 0.7720
0.2459 28.5 570 0.5981 0.6649 0.5981 0.7734
0.2459 28.6 572 0.6008 0.6649 0.6008 0.7751
0.2459 28.7 574 0.6064 0.6610 0.6064 0.7787
0.2459 28.8 576 0.6156 0.6500 0.6156 0.7846
0.2459 28.9 578 0.6186 0.6500 0.6186 0.7865
0.2459 29.0 580 0.6178 0.6424 0.6178 0.7860
0.2459 29.1 582 0.6139 0.6498 0.6139 0.7835
0.2459 29.2 584 0.6210 0.6307 0.6210 0.7880
0.2459 29.3 586 0.6225 0.6087 0.6225 0.7890
0.2459 29.4 588 0.6236 0.5955 0.6236 0.7897
0.2459 29.5 590 0.6425 0.6113 0.6425 0.8015
0.2459 29.6 592 0.6472 0.6301 0.6472 0.8045
0.2459 29.7 594 0.6447 0.6133 0.6447 0.8029
0.2459 29.8 596 0.6347 0.6380 0.6347 0.7967
0.2459 29.9 598 0.6274 0.5871 0.6274 0.7921
0.2459 30.0 600 0.6295 0.5882 0.6295 0.7934
0.2459 30.1 602 0.6333 0.5882 0.6333 0.7958
0.2459 30.2 604 0.6384 0.5894 0.6384 0.7990
0.2459 30.3 606 0.6472 0.5894 0.6472 0.8045

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
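To reproduce this environment, the listed versions can be pinned at install time. A sketch, assuming a CUDA 11.8 setup to match the `+cu118` PyTorch build tag:

```shell
# Pin the library versions from the card.
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
# The cu118 wheel index is an assumption based on the +cu118 tag.
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```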
Model details

  • Model size: 0.1B params (Safetensors, tensor type F32)
  • Full model id: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task5_organization
  • Finetuned from: aubmindlab/bert-base-arabertv02