ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.7691
  • Qwk (quadratic weighted kappa): 0.4681
  • Mse: 0.7691
  • Rmse: 0.8770
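The metrics above can be reproduced from model predictions with scikit-learn: QWK is Cohen's kappa with quadratic weights, and RMSE is the square root of MSE. A minimal sketch — the label and prediction arrays below are hypothetical placeholders, not the actual evaluation data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3, 1])  # gold organization scores (hypothetical)
y_pred = np.array([0, 1, 1, 2, 3, 1])  # model predictions (hypothetical)

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={rmse:.4f}")
```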

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
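The linear scheduler listed above decays the learning rate from 2e-05 toward zero over the course of training (no warmup steps are listed). A self-contained sketch of that schedule; the step counts passed in below are illustrative, since the card does not state the dataset size or total optimizer steps:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Learning rate after `step` optimizer steps under linear decay to zero."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Illustrative calls (total_steps=1000 is hypothetical):
print(linear_lr(0, 1000))     # full base learning rate at the start
print(linear_lr(500, 1000))   # halfway through training
print(linear_lr(1000, 1000))  # decayed to zero at the end
```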

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0909 2 4.1032 -0.0075 4.1032 2.0256
No log 0.1818 4 2.4344 0.0696 2.4344 1.5603
No log 0.2727 6 1.3887 0.0185 1.3887 1.1784
No log 0.3636 8 1.1709 0.2049 1.1709 1.0821
No log 0.4545 10 1.0555 0.2061 1.0555 1.0274
No log 0.5455 12 1.1822 0.1096 1.1822 1.0873
No log 0.6364 14 1.0862 0.1699 1.0862 1.0422
No log 0.7273 16 1.0924 0.1962 1.0924 1.0452
No log 0.8182 18 1.1289 0.1603 1.1289 1.0625
No log 0.9091 20 1.1150 0.2293 1.1150 1.0559
No log 1.0 22 1.0384 0.1713 1.0384 1.0190
No log 1.0909 24 1.0228 0.1981 1.0228 1.0114
No log 1.1818 26 1.0344 0.1981 1.0344 1.0170
No log 1.2727 28 1.0457 0.1516 1.0457 1.0226
No log 1.3636 30 1.0249 0.1981 1.0249 1.0124
No log 1.4545 32 0.9905 0.1783 0.9905 0.9952
No log 1.5455 34 1.0110 0.2391 1.0110 1.0055
No log 1.6364 36 1.0654 0.2880 1.0654 1.0322
No log 1.7273 38 1.0848 0.1738 1.0848 1.0416
No log 1.8182 40 1.0009 0.3396 1.0009 1.0004
No log 1.9091 42 0.9360 0.2466 0.9360 0.9675
No log 2.0 44 0.9911 0.2764 0.9911 0.9955
No log 2.0909 46 1.1444 -0.0477 1.1444 1.0698
No log 2.1818 48 1.1373 -0.0477 1.1373 1.0664
No log 2.2727 50 1.0389 0.1591 1.0389 1.0193
No log 2.3636 52 0.9367 0.3478 0.9367 0.9678
No log 2.4545 54 0.9086 0.3393 0.9086 0.9532
No log 2.5455 56 0.9677 0.2529 0.9677 0.9837
No log 2.6364 58 0.9723 0.2529 0.9723 0.9861
No log 2.7273 60 0.9601 0.2135 0.9601 0.9798
No log 2.8182 62 0.9249 0.2770 0.9249 0.9617
No log 2.9091 64 0.9137 0.2944 0.9137 0.9559
No log 3.0 66 0.9103 0.3432 0.9103 0.9541
No log 3.0909 68 0.9170 0.2849 0.9170 0.9576
No log 3.1818 70 0.9291 0.2390 0.9291 0.9639
No log 3.2727 72 0.8349 0.3281 0.8349 0.9137
No log 3.3636 74 0.8087 0.3410 0.8087 0.8993
No log 3.4545 76 0.8525 0.3001 0.8525 0.9233
No log 3.5455 78 0.8738 0.3153 0.8738 0.9348
No log 3.6364 80 0.9387 0.2698 0.9387 0.9689
No log 3.7273 82 0.9648 0.3290 0.9648 0.9823
No log 3.8182 84 0.8388 0.3250 0.8388 0.9158
No log 3.9091 86 0.8022 0.4086 0.8022 0.8957
No log 4.0 88 0.8454 0.3960 0.8454 0.9194
No log 4.0909 90 0.9016 0.2310 0.9016 0.9495
No log 4.1818 92 0.9034 0.3671 0.9034 0.9505
No log 4.2727 94 0.8000 0.4588 0.8000 0.8944
No log 4.3636 96 0.7415 0.4883 0.7415 0.8611
No log 4.4545 98 0.7527 0.5626 0.7527 0.8676
No log 4.5455 100 0.8603 0.4357 0.8603 0.9275
No log 4.6364 102 0.8078 0.5094 0.8078 0.8988
No log 4.7273 104 0.7655 0.5626 0.7655 0.8750
No log 4.8182 106 0.8071 0.5266 0.8071 0.8984
No log 4.9091 108 0.8694 0.4773 0.8694 0.9324
No log 5.0 110 0.7769 0.4344 0.7769 0.8814
No log 5.0909 112 0.7575 0.5261 0.7575 0.8704
No log 5.1818 114 0.7386 0.5291 0.7386 0.8594
No log 5.2727 116 0.7592 0.5902 0.7592 0.8713
No log 5.3636 118 0.7401 0.5577 0.7401 0.8603
No log 5.4545 120 0.7422 0.5160 0.7422 0.8615
No log 5.5455 122 0.7842 0.4471 0.7842 0.8856
No log 5.6364 124 0.8215 0.4370 0.8215 0.9063
No log 5.7273 126 0.8093 0.5098 0.8093 0.8996
No log 5.8182 128 0.8513 0.5233 0.8513 0.9227
No log 5.9091 130 0.7969 0.5442 0.7969 0.8927
No log 6.0 132 0.7323 0.5176 0.7323 0.8558
No log 6.0909 134 0.7177 0.5042 0.7177 0.8472
No log 6.1818 136 0.7290 0.5633 0.7290 0.8538
No log 6.2727 138 0.7068 0.5725 0.7068 0.8407
No log 6.3636 140 0.7340 0.5199 0.7340 0.8567
No log 6.4545 142 0.8352 0.4244 0.8352 0.9139
No log 6.5455 144 0.7799 0.4350 0.7799 0.8831
No log 6.6364 146 0.7403 0.5010 0.7403 0.8604
No log 6.7273 148 0.7222 0.5301 0.7222 0.8498
No log 6.8182 150 0.7466 0.5326 0.7466 0.8640
No log 6.9091 152 0.6863 0.5261 0.6863 0.8284
No log 7.0 154 0.7665 0.5602 0.7665 0.8755
No log 7.0909 156 0.9401 0.4875 0.9401 0.9696
No log 7.1818 158 0.8722 0.5006 0.8722 0.9339
No log 7.2727 160 0.7097 0.5121 0.7097 0.8425
No log 7.3636 162 0.6968 0.5859 0.6968 0.8348
No log 7.4545 164 0.7224 0.5234 0.7224 0.8499
No log 7.5455 166 0.8017 0.4932 0.8017 0.8954
No log 7.6364 168 0.9371 0.4537 0.9371 0.9681
No log 7.7273 170 0.8613 0.5219 0.8613 0.9281
No log 7.8182 172 0.7000 0.5980 0.7000 0.8366
No log 7.9091 174 0.7174 0.5313 0.7174 0.8470
No log 8.0 176 0.7105 0.5030 0.7105 0.8429
No log 8.0909 178 0.7813 0.5614 0.7813 0.8839
No log 8.1818 180 0.8229 0.5266 0.8229 0.9071
No log 8.2727 182 0.7731 0.4128 0.7731 0.8793
No log 8.3636 184 0.7651 0.4168 0.7651 0.8747
No log 8.4545 186 0.7409 0.4660 0.7409 0.8607
No log 8.5455 188 0.7535 0.5084 0.7535 0.8680
No log 8.6364 190 0.8012 0.4234 0.8012 0.8951
No log 8.7273 192 0.7657 0.4836 0.7657 0.8751
No log 8.8182 194 0.7569 0.4836 0.7569 0.8700
No log 8.9091 196 0.7753 0.4711 0.7753 0.8805
No log 9.0 198 0.8676 0.4781 0.8676 0.9314
No log 9.0909 200 0.9342 0.4982 0.9342 0.9665
No log 9.1818 202 0.8922 0.4779 0.8922 0.9446
No log 9.2727 204 0.7543 0.4815 0.7543 0.8685
No log 9.3636 206 0.6753 0.5260 0.6753 0.8218
No log 9.4545 208 0.6742 0.5412 0.6742 0.8211
No log 9.5455 210 0.6686 0.5131 0.6686 0.8177
No log 9.6364 212 0.7580 0.4695 0.7580 0.8706
No log 9.7273 214 0.8093 0.4907 0.8093 0.8996
No log 9.8182 216 0.8388 0.4907 0.8388 0.9158
No log 9.9091 218 0.8074 0.4810 0.8074 0.8985
No log 10.0 220 0.8475 0.5000 0.8475 0.9206
No log 10.0909 222 0.9907 0.5188 0.9907 0.9953
No log 10.1818 224 1.1854 0.4246 1.1854 1.0888
No log 10.2727 226 1.1498 0.4844 1.1498 1.0723
No log 10.3636 228 0.8987 0.4792 0.8987 0.9480
No log 10.4545 230 0.7004 0.5666 0.7004 0.8369
No log 10.5455 232 0.6738 0.5247 0.6738 0.8209
No log 10.6364 234 0.6663 0.5678 0.6663 0.8163
No log 10.7273 236 0.7374 0.5707 0.7374 0.8587
No log 10.8182 238 0.8359 0.5220 0.8359 0.9143
No log 10.9091 240 0.8024 0.5026 0.8024 0.8958
No log 11.0 242 0.7173 0.5642 0.7173 0.8469
No log 11.0909 244 0.7261 0.5572 0.7261 0.8521
No log 11.1818 246 0.7630 0.5170 0.7630 0.8735
No log 11.2727 248 0.7066 0.5429 0.7066 0.8406
No log 11.3636 250 0.6896 0.5798 0.6896 0.8304
No log 11.4545 252 0.6921 0.5516 0.6921 0.8319
No log 11.5455 254 0.7485 0.5555 0.7485 0.8651
No log 11.6364 256 0.7421 0.4738 0.7421 0.8615
No log 11.7273 258 0.7356 0.4660 0.7356 0.8577
No log 11.8182 260 0.7224 0.5103 0.7224 0.8499
No log 11.9091 262 0.6703 0.5796 0.6703 0.8187
No log 12.0 264 0.6582 0.5166 0.6582 0.8113
No log 12.0909 266 0.6635 0.5536 0.6635 0.8145
No log 12.1818 268 0.6626 0.5735 0.6626 0.8140
No log 12.2727 270 0.7203 0.5370 0.7203 0.8487
No log 12.3636 272 0.7527 0.5242 0.7527 0.8676
No log 12.4545 274 0.7197 0.5799 0.7197 0.8484
No log 12.5455 276 0.7374 0.5666 0.7374 0.8587
No log 12.6364 278 0.7037 0.5822 0.7037 0.8389
No log 12.7273 280 0.7430 0.5666 0.7430 0.8620
No log 12.8182 282 0.6980 0.5953 0.6980 0.8355
No log 12.9091 284 0.6839 0.5964 0.6839 0.8270
No log 13.0 286 0.6673 0.6199 0.6673 0.8169
No log 13.0909 288 0.6760 0.6082 0.6760 0.8222
No log 13.1818 290 0.6596 0.5622 0.6596 0.8121
No log 13.2727 292 0.6731 0.5622 0.6731 0.8204
No log 13.3636 294 0.6882 0.5835 0.6882 0.8296
No log 13.4545 296 0.7078 0.5084 0.7078 0.8413
No log 13.5455 298 0.7548 0.5150 0.7548 0.8688
No log 13.6364 300 0.8182 0.4681 0.8182 0.9045
No log 13.7273 302 0.7516 0.4710 0.7516 0.8669
No log 13.8182 304 0.6922 0.5480 0.6922 0.8320
No log 13.9091 306 0.6960 0.5370 0.6960 0.8343
No log 14.0 308 0.7296 0.5084 0.7296 0.8541
No log 14.0909 310 0.8578 0.4548 0.8578 0.9262
No log 14.1818 312 0.9668 0.4767 0.9668 0.9833
No log 14.2727 314 1.0361 0.4444 1.0361 1.0179
No log 14.3636 316 1.0017 0.4440 1.0017 1.0009
No log 14.4545 318 0.8748 0.4548 0.8748 0.9353
No log 14.5455 320 0.7911 0.4439 0.7911 0.8894
No log 14.6364 322 0.7346 0.5602 0.7346 0.8571
No log 14.7273 324 0.7297 0.5584 0.7297 0.8542
No log 14.8182 326 0.6858 0.6035 0.6858 0.8281
No log 14.9091 328 0.6469 0.6215 0.6469 0.8043
No log 15.0 330 0.6771 0.5625 0.6771 0.8229
No log 15.0909 332 0.6569 0.5955 0.6569 0.8105
No log 15.1818 334 0.6810 0.5505 0.6810 0.8253
No log 15.2727 336 0.8843 0.5000 0.8843 0.9404
No log 15.3636 338 0.9352 0.4570 0.9352 0.9670
No log 15.4545 340 0.8202 0.4907 0.8202 0.9057
No log 15.5455 342 0.7901 0.5343 0.7901 0.8889
No log 15.6364 344 0.7074 0.5708 0.7074 0.8410
No log 15.7273 346 0.6837 0.5821 0.6837 0.8269
No log 15.8182 348 0.6925 0.5729 0.6925 0.8322
No log 15.9091 350 0.6833 0.6063 0.6833 0.8266
No log 16.0 352 0.6984 0.5810 0.6984 0.8357
No log 16.0909 354 0.7384 0.5243 0.7384 0.8593
No log 16.1818 356 0.6976 0.6082 0.6976 0.8352
No log 16.2727 358 0.6555 0.5796 0.6555 0.8096
No log 16.3636 360 0.6761 0.6078 0.6761 0.8223
No log 16.4545 362 0.7496 0.5463 0.7496 0.8658
No log 16.5455 364 0.7171 0.5561 0.7171 0.8468
No log 16.6364 366 0.6630 0.5747 0.6630 0.8142
No log 16.7273 368 0.6780 0.5666 0.6780 0.8234
No log 16.8182 370 0.6676 0.5577 0.6676 0.8170
No log 16.9091 372 0.6677 0.5759 0.6677 0.8171
No log 17.0 374 0.6877 0.5542 0.6877 0.8293
No log 17.0909 376 0.6533 0.5650 0.6533 0.8083
No log 17.1818 378 0.6412 0.5467 0.6412 0.8008
No log 17.2727 380 0.7463 0.5254 0.7463 0.8639
No log 17.3636 382 0.8403 0.4894 0.8403 0.9167
No log 17.4545 384 0.8041 0.4681 0.8041 0.8967
No log 17.5455 386 0.6979 0.5552 0.6979 0.8354
No log 17.6364 388 0.6684 0.5921 0.6684 0.8175
No log 17.7273 390 0.6743 0.5654 0.6743 0.8212
No log 17.8182 392 0.7512 0.5372 0.7512 0.8667
No log 17.9091 394 0.8046 0.4222 0.8046 0.8970
No log 18.0 396 0.7416 0.5470 0.7416 0.8611
No log 18.0909 398 0.6662 0.5923 0.6662 0.8162
No log 18.1818 400 0.6579 0.6249 0.6579 0.8111
No log 18.2727 402 0.6599 0.5751 0.6599 0.8123
No log 18.3636 404 0.6732 0.6025 0.6732 0.8205
No log 18.4545 406 0.6743 0.5938 0.6743 0.8212
No log 18.5455 408 0.6740 0.5938 0.6740 0.8210
No log 18.6364 410 0.6941 0.5822 0.6941 0.8331
No log 18.7273 412 0.6681 0.5883 0.6681 0.8174
No log 18.8182 414 0.6499 0.5949 0.6499 0.8062
No log 18.9091 416 0.6544 0.5823 0.6544 0.8089
No log 19.0 418 0.6948 0.4937 0.6948 0.8335
No log 19.0909 420 0.7169 0.5279 0.7169 0.8467
No log 19.1818 422 0.6921 0.5266 0.6921 0.8319
No log 19.2727 424 0.6943 0.5370 0.6943 0.8332
No log 19.3636 426 0.6627 0.5923 0.6627 0.8141
No log 19.4545 428 0.6127 0.6043 0.6127 0.7828
No log 19.5455 430 0.6170 0.6165 0.6170 0.7855
No log 19.6364 432 0.6627 0.5642 0.6627 0.8140
No log 19.7273 434 0.6987 0.5157 0.6987 0.8359
No log 19.8182 436 0.6693 0.5279 0.6693 0.8181
No log 19.9091 438 0.6268 0.6588 0.6268 0.7917
No log 20.0 440 0.6096 0.6286 0.6096 0.7807
No log 20.0909 442 0.6105 0.6286 0.6105 0.7813
No log 20.1818 444 0.6307 0.6578 0.6307 0.7941
No log 20.2727 446 0.6701 0.5400 0.6701 0.8186
No log 20.3636 448 0.6873 0.4801 0.6873 0.8290
No log 20.4545 450 0.6459 0.5992 0.6459 0.8037
No log 20.5455 452 0.6115 0.6073 0.6115 0.7820
No log 20.6364 454 0.6126 0.6332 0.6126 0.7827
No log 20.7273 456 0.6199 0.5835 0.6199 0.7873
No log 20.8182 458 0.6387 0.5585 0.6387 0.7992
No log 20.9091 460 0.6596 0.5204 0.6596 0.8121
No log 21.0 462 0.6923 0.5266 0.6923 0.8321
No log 21.0909 464 0.7227 0.4681 0.7227 0.8501
No log 21.1818 466 0.7081 0.5027 0.7081 0.8415
No log 21.2727 468 0.6559 0.5751 0.6559 0.8099
No log 21.3636 470 0.6447 0.6063 0.6447 0.8029
No log 21.4545 472 0.6791 0.5147 0.6791 0.8241
No log 21.5455 474 0.7697 0.4898 0.7697 0.8773
No log 21.6364 476 0.8713 0.4554 0.8713 0.9334
No log 21.7273 478 0.8534 0.4333 0.8534 0.9238
No log 21.8182 480 0.7433 0.4560 0.7433 0.8622
No log 21.9091 482 0.6566 0.5943 0.6566 0.8103
No log 22.0 484 0.6658 0.5879 0.6658 0.8160
No log 22.0909 486 0.6812 0.6039 0.6812 0.8253
No log 22.1818 488 0.6452 0.6078 0.6452 0.8032
No log 22.2727 490 0.6213 0.6632 0.6213 0.7882
No log 22.3636 492 0.6765 0.5470 0.6765 0.8225
No log 22.4545 494 0.7443 0.5600 0.7443 0.8628
No log 22.5455 496 0.7648 0.5495 0.7648 0.8745
No log 22.6364 498 0.7290 0.5714 0.7290 0.8538
0.3027 22.7273 500 0.7258 0.5344 0.7258 0.8519
0.3027 22.8182 502 0.7526 0.5543 0.7526 0.8675
0.3027 22.9091 504 0.7474 0.5543 0.7474 0.8646
0.3027 23.0 506 0.7225 0.5358 0.7225 0.8500
0.3027 23.0909 508 0.6772 0.5358 0.6772 0.8229
0.3027 23.1818 510 0.6236 0.5425 0.6236 0.7897
0.3027 23.2727 512 0.6203 0.6063 0.6203 0.7876
0.3027 23.3636 514 0.6798 0.5963 0.6798 0.8245
0.3027 23.4545 516 0.7403 0.5705 0.7403 0.8604
0.3027 23.5455 518 0.7098 0.5832 0.7098 0.8425
0.3027 23.6364 520 0.6854 0.5595 0.6854 0.8279
0.3027 23.7273 522 0.6946 0.5470 0.6946 0.8334
0.3027 23.8182 524 0.6737 0.5833 0.6737 0.8208
0.3027 23.9091 526 0.6537 0.5098 0.6537 0.8085
0.3027 24.0 528 0.6730 0.5186 0.6730 0.8204
0.3027 24.0909 530 0.6705 0.5291 0.6705 0.8189
0.3027 24.1818 532 0.6882 0.5718 0.6882 0.8296
0.3027 24.2727 534 0.7377 0.5666 0.7377 0.8589
0.3027 24.3636 536 0.7337 0.5902 0.7337 0.8566
0.3027 24.4545 538 0.6859 0.5291 0.6859 0.8282
0.3027 24.5455 540 0.6644 0.5208 0.6644 0.8151
0.3027 24.6364 542 0.6952 0.5084 0.6952 0.8338
0.3027 24.7273 544 0.7173 0.5160 0.7173 0.8470
0.3027 24.8182 546 0.7325 0.5458 0.7325 0.8559
0.3027 24.9091 548 0.7129 0.5062 0.7129 0.8443
0.3027 25.0 550 0.6711 0.5208 0.6711 0.8192
0.3027 25.0909 552 0.6562 0.5208 0.6562 0.8101
0.3027 25.1818 554 0.6735 0.5197 0.6735 0.8207
0.3027 25.2727 556 0.7691 0.4681 0.7691 0.8770
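Note that the final checkpoint (epoch 25.27, Qwk 0.4681) is not the strongest row by validation Qwk; for example, the epoch-22.27 row reaches 0.6632. A quick scan over a few (epoch, validation loss, Qwk) rows copied from the table above illustrates picking the best checkpoint by Qwk:

```python
# (epoch, validation_loss, qwk) rows taken from the training-results table.
rows = [
    (20.0, 0.6096, 0.6286),
    (22.2727, 0.6213, 0.6632),
    (25.2727, 0.7691, 0.4681),
]

# Select the row with the highest quadratic weighted kappa.
best_epoch, best_loss, best_qwk = max(rows, key=lambda r: r[2])
print(best_epoch, best_qwk)  # 22.2727 0.6632
```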

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1