ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8295
  • QWK (quadratic weighted kappa): 0.2527
  • MSE: 0.8295
  • RMSE: 0.9108
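These metrics are related: RMSE is the square root of MSE (√0.8295 ≈ 0.9108), and QWK measures agreement between predicted and gold ordinal scores, penalizing larger disagreements quadratically. A minimal pure-Python sketch of how these are computed (the 0–4 score range and the example labels below are hypothetical, not taken from this model's data):

```python
import math
from collections import Counter

def qwk(y_true, y_pred, min_rating, max_rating):
    """Quadratic weighted kappa between two integer rating lists."""
    n = max_rating - min_rating + 1
    # Observed confusion matrix (raw counts)
    O = [[0.0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        O[t - min_rating][p - min_rating] += 1
    # Expected matrix from the marginal histograms (outer product)
    hist_t = Counter(t - min_rating for t in y_true)
    hist_p = Counter(p - min_rating for p in y_pred)
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2  # quadratic disagreement penalty
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / total
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical gold vs. predicted organization scores on a 0-4 scale
gold = [0, 1, 2, 2, 3, 4, 1, 2]
pred = [0, 1, 2, 3, 3, 3, 1, 2]
m = mse(gold, pred)
print(round(m, 4), round(math.sqrt(m), 4), round(qwk(gold, pred, 0, 4), 4))
```

scikit-learn's `cohen_kappa_score(..., weights="quadratic")` computes the same QWK statistic if you prefer a library implementation.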

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
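With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward 0 over the total number of optimizer steps. The log below shows epoch 1.0 at step 24, so one epoch is 24 optimizer steps and 100 epochs would total 2,400 steps (the log itself stops near step 558). A sketch of that schedule, assuming zero warmup steps (the warmup count is not stated in this card):

```python
def linear_lr(step, base_lr=2e-05, total_steps=2400, warmup_steps=0):
    """Linear decay schedule, shaped like transformers' get_linear_schedule_with_warmup."""
    if step < warmup_steps:
        # Linear ramp up during warmup
        return base_lr * step / max(1, warmup_steps)
    # Linear decay to zero after warmup
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

steps_per_epoch = 24           # epoch 1.0 corresponds to step 24 in the log
total = 100 * steps_per_epoch  # num_epochs * steps_per_epoch = 2400
print(linear_lr(0, total_steps=total))           # 2e-05 at the start
print(linear_lr(total // 2, total_steps=total))  # 1e-05 halfway through
```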

Training results

In the log below, "No log" in the Training Loss column means the training loss had not yet been logged at that evaluation (the first logged value, 0.3141, appears at step 500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0833 2 2.5562 -0.0449 2.5562 1.5988
No log 0.1667 4 1.2422 0.0726 1.2422 1.1145
No log 0.25 6 0.9587 -0.0970 0.9587 0.9791
No log 0.3333 8 0.8678 0.1010 0.8678 0.9316
No log 0.4167 10 0.7701 0.0688 0.7701 0.8776
No log 0.5 12 0.8958 0.2387 0.8958 0.9465
No log 0.5833 14 0.7668 0.2261 0.7668 0.8757
No log 0.6667 16 0.8024 0.2740 0.8024 0.8957
No log 0.75 18 1.1118 0.0518 1.1118 1.0544
No log 0.8333 20 1.0401 0.2037 1.0401 1.0198
No log 0.9167 22 0.8404 0.1786 0.8404 0.9167
No log 1.0 24 0.8079 0.0937 0.8079 0.8988
No log 1.0833 26 0.8021 0.0 0.8021 0.8956
No log 1.1667 28 0.7711 0.0 0.7711 0.8781
No log 1.25 30 0.7354 0.0 0.7354 0.8576
No log 1.3333 32 0.7161 0.0840 0.7161 0.8462
No log 1.4167 34 0.7131 0.0840 0.7131 0.8444
No log 1.5 36 0.6958 0.1236 0.6958 0.8342
No log 1.5833 38 0.6690 0.3323 0.6690 0.8179
No log 1.6667 40 0.8661 0.3231 0.8661 0.9306
No log 1.75 42 1.0737 0.2510 1.0737 1.0362
No log 1.8333 44 1.1826 -0.0960 1.1826 1.0875
No log 1.9167 46 1.0100 -0.1823 1.0100 1.0050
No log 2.0 48 0.7369 -0.0027 0.7369 0.8585
No log 2.0833 50 0.7957 0.1372 0.7957 0.8920
No log 2.1667 52 0.8417 0.2526 0.8417 0.9175
No log 2.25 54 0.7856 0.2181 0.7856 0.8864
No log 2.3333 56 0.7029 0.0937 0.7029 0.8384
No log 2.4167 58 0.6646 0.0393 0.6646 0.8153
No log 2.5 60 0.7157 0.2817 0.7157 0.8460
No log 2.5833 62 0.6541 0.3789 0.6541 0.8087
No log 2.6667 64 0.5738 0.3416 0.5738 0.7575
No log 2.75 66 0.5736 0.3745 0.5736 0.7574
No log 2.8333 68 0.6010 0.4243 0.6010 0.7753
No log 2.9167 70 0.6401 0.4728 0.6401 0.8001
No log 3.0 72 0.6577 0.4644 0.6577 0.8110
No log 3.0833 74 0.6102 0.4020 0.6102 0.7812
No log 3.1667 76 0.5533 0.4929 0.5533 0.7438
No log 3.25 78 0.5857 0.4259 0.5857 0.7653
No log 3.3333 80 0.5910 0.4259 0.5910 0.7687
No log 3.4167 82 0.5542 0.4161 0.5542 0.7444
No log 3.5 84 0.6259 0.4618 0.6259 0.7911
No log 3.5833 86 0.6814 0.4270 0.6814 0.8255
No log 3.6667 88 0.7034 0.3399 0.7034 0.8387
No log 3.75 90 0.7211 0.3099 0.7211 0.8492
No log 3.8333 92 0.6896 0.2171 0.6896 0.8304
No log 3.9167 94 0.6397 0.2852 0.6397 0.7998
No log 4.0 96 0.6207 0.2783 0.6207 0.7879
No log 4.0833 98 0.7375 0.3712 0.7375 0.8588
No log 4.1667 100 0.8024 0.3782 0.8024 0.8957
No log 4.25 102 0.8846 0.3560 0.8846 0.9405
No log 4.3333 104 0.7926 0.4597 0.7926 0.8903
No log 4.4167 106 0.7748 0.4265 0.7748 0.8802
No log 4.5 108 0.7953 0.4057 0.7953 0.8918
No log 4.5833 110 0.8548 0.4199 0.8548 0.9246
No log 4.6667 112 0.9362 0.3274 0.9362 0.9676
No log 4.75 114 0.9529 0.3274 0.9529 0.9762
No log 4.8333 116 1.0662 0.3007 1.0662 1.0326
No log 4.9167 118 0.9911 0.3174 0.9911 0.9955
No log 5.0 120 0.8298 0.2564 0.8298 0.9109
No log 5.0833 122 0.8274 0.2589 0.8274 0.9096
No log 5.1667 124 1.0497 0.2348 1.0497 1.0246
No log 5.25 126 1.3403 0.2441 1.3403 1.1577
No log 5.3333 128 1.4616 0.2178 1.4616 1.2090
No log 5.4167 130 1.1661 0.2421 1.1661 1.0798
No log 5.5 132 0.9477 0.2706 0.9477 0.9735
No log 5.5833 134 1.1656 0.3211 1.1656 1.0796
No log 5.6667 136 1.6569 0.2552 1.6569 1.2872
No log 5.75 138 1.6522 0.2382 1.6522 1.2854
No log 5.8333 140 1.1307 0.3237 1.1307 1.0633
No log 5.9167 142 0.6350 0.4473 0.6350 0.7969
No log 6.0 144 0.6517 0.3879 0.6517 0.8073
No log 6.0833 146 0.6581 0.3060 0.6581 0.8112
No log 6.1667 148 0.6234 0.5056 0.6234 0.7895
No log 6.25 150 0.7604 0.3869 0.7604 0.8720
No log 6.3333 152 0.9104 0.3029 0.9104 0.9541
No log 6.4167 154 0.8212 0.3251 0.8212 0.9062
No log 6.5 156 0.6999 0.4684 0.6999 0.8366
No log 6.5833 158 0.6664 0.4300 0.6664 0.8163
No log 6.6667 160 0.6854 0.4091 0.6854 0.8279
No log 6.75 162 0.8049 0.3699 0.8049 0.8972
No log 6.8333 164 1.0896 0.2398 1.0896 1.0438
No log 6.9167 166 1.1595 0.2622 1.1595 1.0768
No log 7.0 168 0.9405 0.3019 0.9405 0.9698
No log 7.0833 170 0.8121 0.3409 0.8121 0.9012
No log 7.1667 172 0.7285 0.3341 0.7285 0.8536
No log 7.25 174 0.7399 0.3622 0.7399 0.8601
No log 7.3333 176 0.8476 0.3069 0.8476 0.9206
No log 7.4167 178 0.9483 0.2754 0.9483 0.9738
No log 7.5 180 0.8717 0.2651 0.8717 0.9336
No log 7.5833 182 0.7594 0.3399 0.7594 0.8714
No log 7.6667 184 0.7554 0.3814 0.7554 0.8691
No log 7.75 186 0.8325 0.3043 0.8325 0.9124
No log 7.8333 188 0.8615 0.3194 0.8615 0.9282
No log 7.9167 190 0.9845 0.2297 0.9845 0.9922
No log 8.0 192 0.9940 0.2297 0.9940 0.9970
No log 8.0833 194 0.9340 0.3076 0.9340 0.9664
No log 8.1667 196 0.8769 0.3134 0.8769 0.9364
No log 8.25 198 0.8263 0.2492 0.8263 0.9090
No log 8.3333 200 0.9214 0.3134 0.9214 0.9599
No log 8.4167 202 1.1103 0.2306 1.1103 1.0537
No log 8.5 204 1.2582 0.2020 1.2582 1.1217
No log 8.5833 206 1.1262 0.2168 1.1262 1.0612
No log 8.6667 208 0.8598 0.2905 0.8598 0.9273
No log 8.75 210 0.7314 0.4522 0.7314 0.8552
No log 8.8333 212 0.7014 0.4294 0.7014 0.8375
No log 8.9167 214 0.6956 0.4522 0.6956 0.8340
No log 9.0 216 0.7572 0.3824 0.7572 0.8702
No log 9.0833 218 0.6921 0.3963 0.6921 0.8319
No log 9.1667 220 0.6668 0.3662 0.6668 0.8166
No log 9.25 222 0.6332 0.3622 0.6332 0.7958
No log 9.3333 224 0.6500 0.3545 0.6500 0.8062
No log 9.4167 226 0.6875 0.3127 0.6875 0.8291
No log 9.5 228 0.7963 0.3869 0.7963 0.8923
No log 9.5833 230 0.9294 0.3827 0.9294 0.9640
No log 9.6667 232 1.0154 0.3557 1.0154 1.0077
No log 9.75 234 0.9698 0.3613 0.9698 0.9848
No log 9.8333 236 0.8237 0.3297 0.8237 0.9076
No log 9.9167 238 0.7296 0.3261 0.7296 0.8542
No log 10.0 240 0.7333 0.3261 0.7333 0.8564
No log 10.0833 242 0.7409 0.3099 0.7409 0.8608
No log 10.1667 244 0.8101 0.2904 0.8101 0.9001
No log 10.25 246 0.8293 0.3105 0.8293 0.9107
No log 10.3333 248 0.8240 0.3675 0.8240 0.9077
No log 10.4167 250 0.9447 0.3477 0.9447 0.9720
No log 10.5 252 0.9702 0.3597 0.9702 0.9850
No log 10.5833 254 0.9140 0.4277 0.9140 0.9561
No log 10.6667 256 0.8081 0.3981 0.8081 0.8989
No log 10.75 258 0.6943 0.3157 0.6943 0.8332
No log 10.8333 260 0.7186 0.2932 0.7186 0.8477
No log 10.9167 262 0.8629 0.4255 0.8629 0.9289
No log 11.0 264 1.0266 0.3110 1.0266 1.0132
No log 11.0833 266 0.9507 0.3576 0.9507 0.9750
No log 11.1667 268 0.8436 0.4154 0.8436 0.9185
No log 11.25 270 0.7393 0.3564 0.7393 0.8598
No log 11.3333 272 0.6731 0.2817 0.6731 0.8204
No log 11.4167 274 0.6102 0.3701 0.6102 0.7812
No log 11.5 276 0.6251 0.3127 0.6251 0.7906
No log 11.5833 278 0.7823 0.4067 0.7823 0.8845
No log 11.6667 280 1.0419 0.3059 1.0419 1.0207
No log 11.75 282 1.0811 0.3010 1.0811 1.0398
No log 11.8333 284 0.9620 0.3481 0.9620 0.9808
No log 11.9167 286 0.7539 0.4726 0.7539 0.8683
No log 12.0 288 0.6245 0.3425 0.6245 0.7903
No log 12.0833 290 0.6045 0.3840 0.6045 0.7775
No log 12.1667 292 0.6462 0.3127 0.6462 0.8039
No log 12.25 294 0.7827 0.4307 0.7827 0.8847
No log 12.3333 296 1.0226 0.3247 1.0226 1.0113
No log 12.4167 298 1.1271 0.3404 1.1271 1.0617
No log 12.5 300 1.0097 0.3247 1.0097 1.0048
No log 12.5833 302 0.8532 0.3665 0.8532 0.9237
No log 12.6667 304 0.7428 0.4224 0.7428 0.8619
No log 12.75 306 0.7813 0.4224 0.7813 0.8839
No log 12.8333 308 0.9519 0.3807 0.9519 0.9756
No log 12.9167 310 1.0330 0.3481 1.0330 1.0163
No log 13.0 312 1.0050 0.3807 1.0050 1.0025
No log 13.0833 314 0.9437 0.3657 0.9437 0.9714
No log 13.1667 316 0.9108 0.3160 0.9108 0.9543
No log 13.25 318 0.9107 0.2810 0.9107 0.9543
No log 13.3333 320 0.9039 0.2308 0.9039 0.9507
No log 13.4167 322 1.0208 0.2756 1.0208 1.0104
No log 13.5 324 1.1725 0.1795 1.1725 1.0828
No log 13.5833 326 1.1779 0.2100 1.1779 1.0853
No log 13.6667 328 1.0673 0.2601 1.0673 1.0331
No log 13.75 330 0.9390 0.2358 0.9390 0.9690
No log 13.8333 332 0.9044 0.2076 0.9044 0.9510
No log 13.9167 334 0.9234 0.2267 0.9234 0.9609
No log 14.0 336 1.0790 0.3183 1.0790 1.0388
No log 14.0833 338 1.1867 0.2324 1.1867 1.0894
No log 14.1667 340 1.2175 0.2442 1.2175 1.1034
No log 14.25 342 1.1245 0.3003 1.1245 1.0604
No log 14.3333 344 0.9325 0.3183 0.9325 0.9657
No log 14.4167 346 0.7961 0.3238 0.7961 0.8922
No log 14.5 348 0.7679 0.3238 0.7679 0.8763
No log 14.5833 350 0.8179 0.3099 0.8179 0.9044
No log 14.6667 352 0.8598 0.3042 0.8598 0.9272
No log 14.75 354 0.8918 0.3076 0.8918 0.9443
No log 14.8333 356 0.9471 0.3160 0.9471 0.9732
No log 14.9167 358 0.9460 0.3381 0.9460 0.9726
No log 15.0 360 0.9557 0.3381 0.9557 0.9776
No log 15.0833 362 0.9055 0.3217 0.9055 0.9516
No log 15.1667 364 0.8664 0.2949 0.8664 0.9308
No log 15.25 366 0.8814 0.3869 0.8814 0.9389
No log 15.3333 368 0.9914 0.2677 0.9914 0.9957
No log 15.4167 370 1.0310 0.2252 1.0310 1.0154
No log 15.5 372 1.0052 0.2677 1.0052 1.0026
No log 15.5833 374 0.9410 0.2892 0.9410 0.9700
No log 15.6667 376 0.8445 0.3238 0.8445 0.9189
No log 15.75 378 0.8098 0.2652 0.8098 0.8999
No log 15.8333 380 0.8558 0.3238 0.8558 0.9251
No log 15.9167 382 0.9050 0.3564 0.9050 0.9513
No log 16.0 384 0.8756 0.3564 0.8756 0.9357
No log 16.0833 386 0.8518 0.3564 0.8518 0.9229
No log 16.1667 388 0.7951 0.4134 0.7951 0.8917
No log 16.25 390 0.8523 0.3494 0.8523 0.9232
No log 16.3333 392 0.9154 0.2892 0.9154 0.9567
No log 16.4167 394 0.8979 0.2892 0.8979 0.9476
No log 16.5 396 0.9634 0.3251 0.9634 0.9815
No log 16.5833 398 1.0536 0.2482 1.0536 1.0265
No log 16.6667 400 1.0188 0.2806 1.0188 1.0093
No log 16.75 402 0.9201 0.3433 0.9201 0.9592
No log 16.8333 404 0.7978 0.2932 0.7978 0.8932
No log 16.9167 406 0.7682 0.3060 0.7682 0.8764
No log 17.0 408 0.7912 0.2662 0.7912 0.8895
No log 17.0833 410 0.7934 0.2932 0.7934 0.8908
No log 17.1667 412 0.8267 0.3699 0.8267 0.9092
No log 17.25 414 0.8644 0.3256 0.8644 0.9298
No log 17.3333 416 0.8571 0.3319 0.8571 0.9258
No log 17.4167 418 0.7683 0.3238 0.7683 0.8765
No log 17.5 420 0.7623 0.3238 0.7623 0.8731
No log 17.5833 422 0.8324 0.3494 0.8324 0.9124
No log 17.6667 424 0.8927 0.2892 0.8927 0.9449
No log 17.75 426 1.0051 0.2343 1.0051 1.0025
No log 17.8333 428 1.1151 0.2006 1.1151 1.0560
No log 17.9167 430 1.1106 0.2223 1.1106 1.0539
No log 18.0 432 1.0028 0.3287 1.0028 1.0014
No log 18.0833 434 0.9637 0.2562 0.9637 0.9817
No log 18.1667 436 0.8705 0.3042 0.8705 0.9330
No log 18.25 438 0.7548 0.2754 0.7548 0.8688
No log 18.3333 440 0.7151 0.3196 0.7151 0.8456
No log 18.4167 442 0.7554 0.2995 0.7554 0.8691
No log 18.5 444 0.8869 0.4153 0.8869 0.9417
No log 18.5833 446 1.1093 0.2796 1.1093 1.0532
No log 18.6667 448 1.1685 0.2398 1.1685 1.0810
No log 18.75 450 1.1277 0.2437 1.1277 1.0619
No log 18.8333 452 0.9481 0.2892 0.9481 0.9737
No log 18.9167 454 0.7956 0.3372 0.7956 0.8920
No log 19.0 456 0.7384 0.2883 0.7384 0.8593
No log 19.0833 458 0.7347 0.3238 0.7347 0.8572
No log 19.1667 460 0.7487 0.3238 0.7487 0.8653
No log 19.25 462 0.7787 0.2817 0.7787 0.8825
No log 19.3333 464 0.8092 0.2817 0.8092 0.8995
No log 19.4167 466 0.8281 0.3450 0.8281 0.9100
No log 19.5 468 0.8159 0.3450 0.8159 0.9033
No log 19.5833 470 0.7851 0.3519 0.7851 0.8861
No log 19.6667 472 0.7382 0.2652 0.7382 0.8592
No log 19.75 474 0.7049 0.2407 0.7049 0.8396
No log 19.8333 476 0.7185 0.3238 0.7185 0.8476
No log 19.9167 478 0.7748 0.4251 0.7748 0.8802
No log 20.0 480 0.8510 0.3869 0.8510 0.9225
No log 20.0833 482 0.8695 0.3869 0.8695 0.9325
No log 20.1667 484 0.9016 0.3869 0.9016 0.9495
No log 20.25 486 0.8646 0.4251 0.8646 0.9298
No log 20.3333 488 0.8056 0.4491 0.8056 0.8976
No log 20.4167 490 0.7631 0.2950 0.7631 0.8735
No log 20.5 492 0.7880 0.2950 0.7880 0.8877
No log 20.5833 494 0.8634 0.3637 0.8634 0.9292
No log 20.6667 496 0.9521 0.4085 0.9521 0.9758
No log 20.75 498 1.0831 0.3099 1.0831 1.0407
0.3141 20.8333 500 1.0813 0.3223 1.0813 1.0399
0.3141 20.9167 502 0.9614 0.4085 0.9614 0.9805
0.3141 21.0 504 0.8314 0.3894 0.8314 0.9118
0.3141 21.0833 506 0.7886 0.3637 0.7886 0.8880
0.3141 21.1667 508 0.7623 0.3637 0.7623 0.8731
0.3141 21.25 510 0.7681 0.3637 0.7681 0.8764
0.3141 21.3333 512 0.7924 0.3894 0.7924 0.8902
0.3141 21.4167 514 0.8642 0.4275 0.8642 0.9296
0.3141 21.5 516 0.8800 0.4275 0.8800 0.9381
0.3141 21.5833 518 0.8248 0.4014 0.8248 0.9082
0.3141 21.6667 520 0.8240 0.3894 0.8240 0.9078
0.3141 21.75 522 0.8732 0.4387 0.8732 0.9344
0.3141 21.8333 524 0.9446 0.3731 0.9446 0.9719
0.3141 21.9167 526 0.9944 0.3499 0.9944 0.9972
0.3141 22.0 528 0.9656 0.3799 0.9656 0.9827
0.3141 22.0833 530 0.8753 0.3894 0.8753 0.9356
0.3141 22.1667 532 0.7228 0.2407 0.7228 0.8502
0.3141 22.25 534 0.6559 0.2379 0.6559 0.8099
0.3141 22.3333 536 0.6613 0.2407 0.6613 0.8132
0.3141 22.4167 538 0.7314 0.2407 0.7314 0.8552
0.3141 22.5 540 0.8901 0.3450 0.8901 0.9434
0.3141 22.5833 542 0.9940 0.3560 0.9940 0.9970
0.3141 22.6667 544 1.0104 0.3433 1.0104 1.0052
0.3141 22.75 546 0.9452 0.3433 0.9452 0.9722
0.3141 22.8333 548 0.8650 0.4470 0.8650 0.9300
0.3141 22.9167 550 0.8128 0.2527 0.8128 0.9015
0.3141 23.0 552 0.7887 0.2652 0.7887 0.8881
0.3141 23.0833 554 0.7865 0.2652 0.7865 0.8869
0.3141 23.1667 556 0.7931 0.2652 0.7931 0.8906
0.3141 23.25 558 0.8295 0.2527 0.8295 0.9108

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02