ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (reported as "None" in the auto-generated card). It achieves the following results on the evaluation set:

  • Loss: 1.0019
  • Qwk: 0.5176
  • Mse: 1.0019
  • Rmse: 1.0010
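Qwk here is the quadratic weighted Cohen's kappa, the standard agreement metric for ordinal scores such as essay-organization ratings; Mse and Rmse are the squared-error metrics on the same predictions. As a hedged illustration (the function name and toy labels below are mine, not from this repo), the metric can be computed in pure Python as:

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa for integer labels 0..n_classes-1."""
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    true_hist = [sum(row) for row in observed]
    pred_hist = [sum(observed[i][j] for i in range(n_classes))
                 for j in range(n_classes)]
    # Weighted disagreement: observed vs. expected under independence,
    # with quadratic distance weights (i - j)^2.
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2
            num += w * observed[i][j]
            den += w * true_hist[i] * pred_hist[j] / n
    return 1.0 - num / den

# Perfect agreement yields kappa = 1.0.
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # -> 1.0
```

Because the weights grow quadratically with distance, predictions that are off by two score classes are penalized four times as heavily as predictions off by one, which is why Qwk is preferred over plain accuracy for ordinal rating tasks.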

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
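With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward zero over the run. A minimal sketch of that schedule (the helper name is mine, and it assumes no warmup steps, which the card does not list):

```python
BASE_LR = 2e-05  # learning_rate from the hyperparameters above

def linear_lr(step, total_steps, base_lr=BASE_LR):
    """Linear decay from base_lr at step 0 to 0 at total_steps (assumes no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Halfway through training the rate is half the initial value.
print(linear_lr(50, 100))  # -> 1e-05
```

In practice the equivalent schedule would come from the Transformers `get_linear_schedule_with_warmup` helper; this sketch only shows the shape of the decay.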

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1176 2 4.3833 0.0163 4.3833 2.0936
No log 0.2353 4 2.7118 0.0088 2.7118 1.6467
No log 0.3529 6 1.4399 0.0682 1.4399 1.2000
No log 0.4706 8 1.2704 0.1321 1.2704 1.1271
No log 0.5882 10 1.2273 0.2057 1.2273 1.1078
No log 0.7059 12 1.2207 0.1495 1.2207 1.1048
No log 0.8235 14 1.1584 0.1278 1.1584 1.0763
No log 0.9412 16 1.2397 0.1397 1.2397 1.1134
No log 1.0588 18 1.2233 0.1138 1.2233 1.1060
No log 1.1765 20 1.1458 0.1609 1.1458 1.0704
No log 1.2941 22 1.1103 0.2023 1.1103 1.0537
No log 1.4118 24 1.0960 0.3145 1.0960 1.0469
No log 1.5294 26 1.1559 0.1821 1.1559 1.0751
No log 1.6471 28 1.1020 0.2721 1.1020 1.0498
No log 1.7647 30 1.0219 0.3411 1.0219 1.0109
No log 1.8824 32 1.0204 0.2864 1.0204 1.0102
No log 2.0 34 1.2805 0.2239 1.2805 1.1316
No log 2.1176 36 1.7520 0.1223 1.7520 1.3236
No log 2.2353 38 1.4987 0.1800 1.4987 1.2242
No log 2.3529 40 0.9671 0.5283 0.9671 0.9834
No log 2.4706 42 0.9535 0.4293 0.9535 0.9765
No log 2.5882 44 0.9732 0.3452 0.9732 0.9865
No log 2.7059 46 0.8783 0.4108 0.8783 0.9372
No log 2.8235 48 0.9698 0.4696 0.9698 0.9848
No log 2.9412 50 1.1086 0.2019 1.1086 1.0529
No log 3.0588 52 1.0878 0.2959 1.0878 1.0430
No log 3.1765 54 1.0345 0.3601 1.0345 1.0171
No log 3.2941 56 0.8971 0.5387 0.8971 0.9472
No log 3.4118 58 0.8052 0.5082 0.8052 0.8973
No log 3.5294 60 0.8456 0.5736 0.8456 0.9195
No log 3.6471 62 0.8763 0.5743 0.8763 0.9361
No log 3.7647 64 0.8091 0.5294 0.8091 0.8995
No log 3.8824 66 0.8190 0.5217 0.8190 0.9050
No log 4.0 68 0.9558 0.5661 0.9558 0.9777
No log 4.1176 70 0.9473 0.5832 0.9473 0.9733
No log 4.2353 72 0.8981 0.5607 0.8981 0.9477
No log 4.3529 74 0.8328 0.5718 0.8328 0.9126
No log 4.4706 76 1.0068 0.4278 1.0068 1.0034
No log 4.5882 78 1.1856 0.4186 1.1856 1.0888
No log 4.7059 80 1.0433 0.4590 1.0433 1.0214
No log 4.8235 82 0.8522 0.4785 0.8522 0.9232
No log 4.9412 84 0.8178 0.5561 0.8178 0.9043
No log 5.0588 86 0.8018 0.5785 0.8018 0.8954
No log 5.1765 88 0.7999 0.5501 0.7999 0.8944
No log 5.2941 90 0.8438 0.5394 0.8438 0.9186
No log 5.4118 92 0.8252 0.5375 0.8252 0.9084
No log 5.5294 94 0.7601 0.6589 0.7601 0.8719
No log 5.6471 96 0.7816 0.5895 0.7816 0.8841
No log 5.7647 98 0.7817 0.6071 0.7817 0.8842
No log 5.8824 100 0.8106 0.5954 0.8106 0.9003
No log 6.0 102 1.0401 0.5585 1.0401 1.0198
No log 6.1176 104 1.0113 0.5585 1.0113 1.0056
No log 6.2353 106 0.7807 0.6202 0.7807 0.8836
No log 6.3529 108 0.7395 0.5940 0.7395 0.8600
No log 6.4706 110 0.7720 0.5549 0.7720 0.8786
No log 6.5882 112 0.7498 0.5811 0.7498 0.8659
No log 6.7059 114 0.7472 0.6183 0.7472 0.8644
No log 6.8235 116 0.8318 0.5712 0.8318 0.9120
No log 6.9412 118 0.8134 0.5712 0.8134 0.9019
No log 7.0588 120 0.7532 0.6084 0.7532 0.8679
No log 7.1765 122 0.8200 0.5882 0.8200 0.9055
No log 7.2941 124 0.7917 0.5882 0.7917 0.8898
No log 7.4118 126 0.7484 0.6396 0.7484 0.8651
No log 7.5294 128 0.9149 0.5956 0.9149 0.9565
No log 7.6471 130 0.9538 0.5346 0.9538 0.9766
No log 7.7647 132 0.8081 0.6069 0.8081 0.8989
No log 7.8824 134 0.7173 0.6328 0.7173 0.8470
No log 8.0 136 0.7601 0.5618 0.7601 0.8719
No log 8.1176 138 0.7341 0.5988 0.7341 0.8568
No log 8.2353 140 0.7431 0.6537 0.7431 0.8620
No log 8.3529 142 0.8174 0.6026 0.8174 0.9041
No log 8.4706 144 0.8042 0.6163 0.8042 0.8968
No log 8.5882 146 0.7838 0.5686 0.7838 0.8853
No log 8.7059 148 0.8269 0.6215 0.8269 0.9094
No log 8.8235 150 0.9496 0.5125 0.9496 0.9745
No log 8.9412 152 0.9747 0.5493 0.9747 0.9873
No log 9.0588 154 0.8652 0.5434 0.8652 0.9302
No log 9.1765 156 0.8197 0.5256 0.8197 0.9053
No log 9.2941 158 0.8276 0.5324 0.8276 0.9097
No log 9.4118 160 0.8760 0.5235 0.8760 0.9360
No log 9.5294 162 0.8465 0.5539 0.8465 0.9201
No log 9.6471 164 0.8174 0.5365 0.8174 0.9041
No log 9.7647 166 0.8033 0.5202 0.8033 0.8963
No log 9.8824 168 0.7601 0.6035 0.7601 0.8718
No log 10.0 170 0.7533 0.6234 0.7533 0.8679
No log 10.1176 172 0.7689 0.5233 0.7689 0.8769
No log 10.2353 174 0.7951 0.5329 0.7951 0.8917
No log 10.3529 176 0.8180 0.5028 0.8180 0.9044
No log 10.4706 178 0.8255 0.5318 0.8255 0.9085
No log 10.5882 180 0.8131 0.5275 0.8131 0.9017
No log 10.7059 182 0.8379 0.5275 0.8379 0.9154
No log 10.8235 184 0.8280 0.5086 0.8280 0.9100
No log 10.9412 186 0.8431 0.5437 0.8431 0.9182
No log 11.0588 188 0.8637 0.5437 0.8637 0.9293
No log 11.1765 190 0.8666 0.5229 0.8666 0.9309
No log 11.2941 192 0.8764 0.4794 0.8764 0.9362
No log 11.4118 194 0.8715 0.4894 0.8715 0.9335
No log 11.5294 196 0.7939 0.4757 0.7939 0.8910
No log 11.6471 198 0.7787 0.4801 0.7787 0.8824
No log 11.7647 200 0.8163 0.4902 0.8163 0.9035
No log 11.8824 202 0.8232 0.5303 0.8232 0.9073
No log 12.0 204 0.7769 0.5259 0.7769 0.8814
No log 12.1176 206 0.7417 0.5242 0.7417 0.8612
No log 12.2353 208 0.7707 0.5783 0.7707 0.8779
No log 12.3529 210 0.8978 0.5232 0.8978 0.9475
No log 12.4706 212 0.8727 0.5636 0.8727 0.9342
No log 12.5882 214 0.7379 0.5914 0.7379 0.8590
No log 12.7059 216 0.6946 0.6306 0.6946 0.8334
No log 12.8235 218 0.7152 0.5729 0.7152 0.8457
No log 12.9412 220 0.7537 0.5622 0.7537 0.8681
No log 13.0588 222 0.8436 0.5313 0.8436 0.9185
No log 13.1765 224 0.8885 0.5322 0.8885 0.9426
No log 13.2941 226 0.7877 0.5539 0.7877 0.8875
No log 13.4118 228 0.7199 0.5443 0.7199 0.8484
No log 13.5294 230 0.7580 0.5120 0.7580 0.8707
No log 13.6471 232 0.7157 0.5420 0.7157 0.8460
No log 13.7647 234 0.7228 0.6448 0.7228 0.8502
No log 13.8824 236 0.7614 0.5781 0.7614 0.8726
No log 14.0 238 0.7537 0.5701 0.7537 0.8682
No log 14.1176 240 0.7614 0.5884 0.7614 0.8726
No log 14.2353 242 0.7987 0.5495 0.7987 0.8937
No log 14.3529 244 0.9122 0.5370 0.9122 0.9551
No log 14.4706 246 0.9672 0.5216 0.9672 0.9835
No log 14.5882 248 0.9332 0.5080 0.9332 0.9660
No log 14.7059 250 0.8297 0.5495 0.8297 0.9109
No log 14.8235 252 0.7936 0.4962 0.7936 0.8908
No log 14.9412 254 0.8089 0.4715 0.8089 0.8994
No log 15.0588 256 0.8288 0.4181 0.8288 0.9104
No log 15.1765 258 0.8442 0.4334 0.8442 0.9188
No log 15.2941 260 0.8482 0.5114 0.8482 0.9210
No log 15.4118 262 0.8166 0.5409 0.8166 0.9037
No log 15.5294 264 0.7790 0.5698 0.7790 0.8826
No log 15.6471 266 0.7451 0.5777 0.7451 0.8632
No log 15.7647 268 0.7495 0.4879 0.7495 0.8657
No log 15.8824 270 0.7695 0.5245 0.7695 0.8772
No log 16.0 272 0.7423 0.5486 0.7423 0.8616
No log 16.1176 274 0.7775 0.6110 0.7775 0.8817
No log 16.2353 276 0.9285 0.5101 0.9285 0.9636
No log 16.3529 278 0.9828 0.5531 0.9828 0.9914
No log 16.4706 280 0.9018 0.5500 0.9018 0.9496
No log 16.5882 282 0.7751 0.6258 0.7751 0.8804
No log 16.7059 284 0.7585 0.6305 0.7585 0.8709
No log 16.8235 286 0.7984 0.6300 0.7984 0.8935
No log 16.9412 288 0.8186 0.5650 0.8186 0.9048
No log 17.0588 290 0.7823 0.6154 0.7823 0.8845
No log 17.1765 292 0.7595 0.5886 0.7595 0.8715
No log 17.2941 294 0.7645 0.6038 0.7645 0.8744
No log 17.4118 296 0.7552 0.6274 0.7552 0.8690
No log 17.5294 298 0.7394 0.6580 0.7394 0.8599
No log 17.6471 300 0.7366 0.6563 0.7366 0.8582
No log 17.7647 302 0.7309 0.6274 0.7309 0.8549
No log 17.8824 304 0.7312 0.6283 0.7312 0.8551
No log 18.0 306 0.7956 0.5624 0.7956 0.8919
No log 18.1176 308 0.8900 0.5405 0.8900 0.9434
No log 18.2353 310 0.9210 0.5385 0.9210 0.9597
No log 18.3529 312 0.8619 0.5624 0.8619 0.9284
No log 18.4706 314 0.7901 0.5766 0.7901 0.8889
No log 18.5882 316 0.7806 0.5672 0.7806 0.8835
No log 18.7059 318 0.7968 0.5738 0.7968 0.8926
No log 18.8235 320 0.8447 0.5814 0.8447 0.9191
No log 18.9412 322 0.8818 0.5601 0.8818 0.9390
No log 19.0588 324 0.8704 0.5601 0.8704 0.9329
No log 19.1765 326 0.8301 0.5814 0.8301 0.9111
No log 19.2941 328 0.8094 0.5698 0.8094 0.8997
No log 19.4118 330 0.7762 0.5921 0.7762 0.8810
No log 19.5294 332 0.7718 0.5951 0.7718 0.8785
No log 19.6471 334 0.7868 0.5838 0.7868 0.8870
No log 19.7647 336 0.7620 0.5571 0.7620 0.8729
No log 19.8824 338 0.7535 0.5528 0.7535 0.8680
No log 20.0 340 0.7585 0.5267 0.7585 0.8709
No log 20.1176 342 0.7746 0.5267 0.7746 0.8801
No log 20.2353 344 0.7910 0.5041 0.7910 0.8894
No log 20.3529 346 0.7693 0.5528 0.7693 0.8771
No log 20.4706 348 0.7532 0.5167 0.7532 0.8679
No log 20.5882 350 0.7568 0.5207 0.7568 0.8699
No log 20.7059 352 0.7560 0.5555 0.7560 0.8695
No log 20.8235 354 0.7904 0.5159 0.7904 0.8891
No log 20.9412 356 0.8839 0.5014 0.8839 0.9402
No log 21.0588 358 0.9093 0.4685 0.9093 0.9536
No log 21.1765 360 0.8748 0.5175 0.8748 0.9353
No log 21.2941 362 0.8051 0.5194 0.8051 0.8973
No log 21.4118 364 0.7837 0.5486 0.7837 0.8853
No log 21.5294 366 0.7765 0.5420 0.7765 0.8812
No log 21.6471 368 0.7835 0.5997 0.7835 0.8852
No log 21.7647 370 0.8228 0.6163 0.8228 0.9071
No log 21.8824 372 0.8260 0.6077 0.8260 0.9089
No log 22.0 374 0.8278 0.5914 0.8278 0.9099
No log 22.1176 376 0.8308 0.5437 0.8308 0.9115
No log 22.2353 378 0.8073 0.5026 0.8073 0.8985
No log 22.3529 380 0.8213 0.4685 0.8213 0.9062
No log 22.4706 382 0.8514 0.4440 0.8514 0.9227
No log 22.5882 384 0.8853 0.4871 0.8853 0.9409
No log 22.7059 386 0.8994 0.4716 0.8994 0.9484
No log 22.8235 388 0.8436 0.5114 0.8436 0.9185
No log 22.9412 390 0.8004 0.5489 0.8004 0.8947
No log 23.0588 392 0.8336 0.5657 0.8336 0.9130
No log 23.1765 394 0.8381 0.5477 0.8381 0.9155
No log 23.2941 396 0.8072 0.5160 0.8072 0.8984
No log 23.4118 398 0.7970 0.4708 0.7970 0.8927
No log 23.5294 400 0.8057 0.4685 0.8057 0.8976
No log 23.6471 402 0.8015 0.4685 0.8015 0.8953
No log 23.7647 404 0.8063 0.4889 0.8063 0.8980
No log 23.8824 406 0.8016 0.4778 0.8016 0.8953
No log 24.0 408 0.8092 0.5806 0.8092 0.8996
No log 24.1176 410 0.8206 0.5753 0.8206 0.9058
No log 24.2353 412 0.8069 0.6047 0.8069 0.8983
No log 24.3529 414 0.7946 0.6077 0.7946 0.8914
No log 24.4706 416 0.8039 0.5983 0.8039 0.8966
No log 24.5882 418 0.8336 0.5838 0.8336 0.9130
No log 24.7059 420 0.8993 0.5578 0.8993 0.9483
No log 24.8235 422 0.9324 0.5557 0.9324 0.9656
No log 24.9412 424 0.9776 0.5068 0.9776 0.9887
No log 25.0588 426 0.9786 0.5054 0.9786 0.9892
No log 25.1765 428 0.9262 0.5068 0.9263 0.9624
No log 25.2941 430 0.8774 0.5430 0.8774 0.9367
No log 25.4118 432 0.8280 0.6010 0.8280 0.9100
No log 25.5294 434 0.8269 0.5920 0.8269 0.9093
No log 25.6471 436 0.8568 0.5839 0.8568 0.9256
No log 25.7647 438 0.9381 0.5552 0.9381 0.9685
No log 25.8824 440 0.9694 0.5439 0.9694 0.9846
No log 26.0 442 0.9148 0.5240 0.9148 0.9565
No log 26.1176 444 0.8884 0.5326 0.8884 0.9425
No log 26.2353 446 0.8755 0.4998 0.8755 0.9357
No log 26.3529 448 0.8866 0.4805 0.8866 0.9416
No log 26.4706 450 0.8938 0.5318 0.8938 0.9454
No log 26.5882 452 0.9238 0.5414 0.9238 0.9612
No log 26.7059 454 0.9208 0.5414 0.9208 0.9596
No log 26.8235 456 0.8487 0.5706 0.8487 0.9212
No log 26.9412 458 0.7882 0.5501 0.7882 0.8878
No log 27.0588 460 0.7817 0.5501 0.7817 0.8841
No log 27.1765 462 0.7743 0.5523 0.7743 0.8799
No log 27.2941 464 0.7860 0.5501 0.7860 0.8865
No log 27.4118 466 0.8517 0.6026 0.8517 0.9229
No log 27.5294 468 0.9041 0.5227 0.9041 0.9509
No log 27.6471 470 0.8892 0.5470 0.8892 0.9430
No log 27.7647 472 0.8527 0.4820 0.8527 0.9234
No log 27.8824 474 0.8337 0.4142 0.8337 0.9131
No log 28.0 476 0.8130 0.4519 0.8130 0.9017
No log 28.1176 478 0.7914 0.4681 0.7914 0.8896
No log 28.2353 480 0.7828 0.4996 0.7828 0.8848
No log 28.3529 482 0.7808 0.5408 0.7808 0.8836
No log 28.4706 484 0.8057 0.6215 0.8057 0.8976
No log 28.5882 486 0.8317 0.5810 0.8317 0.9120
No log 28.7059 488 0.8151 0.6151 0.8151 0.9028
No log 28.8235 490 0.7708 0.5635 0.7708 0.8780
No log 28.9412 492 0.7539 0.5915 0.7539 0.8683
No log 29.0588 494 0.7668 0.5686 0.7668 0.8757
No log 29.1765 496 0.7744 0.5713 0.7744 0.8800
No log 29.2941 498 0.7875 0.5501 0.7875 0.8874
0.2838 29.4118 500 0.7821 0.5300 0.7821 0.8843
0.2838 29.5294 502 0.7745 0.4656 0.7745 0.8800
0.2838 29.6471 504 0.7783 0.4656 0.7783 0.8822
0.2838 29.7647 506 0.7839 0.5322 0.7839 0.8854
0.2838 29.8824 508 0.7969 0.5892 0.7969 0.8927
0.2838 30.0 510 0.7914 0.5862 0.7914 0.8896
0.2838 30.1176 512 0.7676 0.5713 0.7676 0.8761
0.2838 30.2353 514 0.7601 0.5149 0.7601 0.8718
0.2838 30.3529 516 0.7727 0.4771 0.7727 0.8790
0.2838 30.4706 518 0.7811 0.5042 0.7811 0.8838
0.2838 30.5882 520 0.8018 0.5741 0.8018 0.8954
0.2838 30.7059 522 0.8477 0.4937 0.8477 0.9207
0.2838 30.8235 524 0.9612 0.5276 0.9612 0.9804
0.2838 30.9412 526 1.0758 0.5066 1.0758 1.0372
0.2838 31.0588 528 1.0858 0.5230 1.0858 1.0420
0.2838 31.1765 530 1.0019 0.5176 1.0019 1.0010

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32
Model tree

Published as MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task2_organization, one of 4019 listed fine-tunes of aubmindlab/bert-base-arabertv02.