ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8309
  • Qwk (quadratic weighted kappa): 0.2888
  • Mse (mean squared error): 0.8309
  • Rmse (root mean squared error): 0.9115
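Note that Loss and Mse coincide, which suggests the model was trained as a regressor with an MSE objective; Rmse is then simply the square root of Mse. A quick consistency check on the reported numbers:

```python
import math

# Reported evaluation metrics (from the list above)
eval_loss, qwk, mse, rmse = 0.8309, 0.2888, 0.8309, 0.9115

# Loss == Mse is consistent with an MSE regression objective,
# and Rmse equals sqrt(Mse) up to rounding in the fourth decimal.
assert eval_loss == mse
assert math.isclose(rmse, math.sqrt(mse), abs_tol=5e-4)
```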

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
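The hyperparameters above can be collected into a plain config dict (a sketch only; the field names follow transformers.TrainingArguments conventions, and the dataset-size estimate is derived from the training log, not stated anywhere in this card):

```python
# Reported training hyperparameters as a plain dict (sketch; keys follow
# transformers.TrainingArguments naming conventions).
hparams = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_betas": (0.9, 0.999),
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}

# The training log reaches epoch 1.0 at step 22, i.e. 22 optimizer steps
# per epoch. With a batch size of 8 this implies roughly 22 * 8 = 176
# training examples (give or take a partial final batch).
steps_per_epoch = 22
approx_train_examples = steps_per_epoch * hparams["per_device_train_batch_size"]
```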

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0909 2 2.5119 -0.0109 2.5119 1.5849
No log 0.1818 4 1.2360 0.0985 1.2360 1.1118
No log 0.2727 6 0.9804 -0.1408 0.9804 0.9901
No log 0.3636 8 0.8566 0.2440 0.8566 0.9255
No log 0.4545 10 1.0787 0.2100 1.0787 1.0386
No log 0.5455 12 1.5784 0.0257 1.5784 1.2564
No log 0.6364 14 1.7402 -0.0209 1.7402 1.3192
No log 0.7273 16 1.5733 -0.0422 1.5733 1.2543
No log 0.8182 18 1.1913 0.0134 1.1913 1.0914
No log 0.9091 20 1.1650 -0.1013 1.1650 1.0793
No log 1.0 22 1.0620 -0.0146 1.0620 1.0306
No log 1.0909 24 0.9207 0.1444 0.9207 0.9595
No log 1.1818 26 0.9454 0.1822 0.9454 0.9723
No log 1.2727 28 0.9189 0.1962 0.9189 0.9586
No log 1.3636 30 0.8722 0.2817 0.8722 0.9339
No log 1.4545 32 0.8167 0.1686 0.8167 0.9037
No log 1.5455 34 0.7943 0.1136 0.7943 0.8912
No log 1.6364 36 0.7926 0.1519 0.7926 0.8903
No log 1.7273 38 0.7638 0.2193 0.7638 0.8739
No log 1.8182 40 0.7964 0.2558 0.7964 0.8924
No log 1.9091 42 0.9184 0.2692 0.9184 0.9584
No log 2.0 44 0.9222 0.2692 0.9222 0.9603
No log 2.0909 46 0.8586 0.1766 0.8586 0.9266
No log 2.1818 48 0.7782 0.2227 0.7782 0.8821
No log 2.2727 50 0.7401 0.2158 0.7401 0.8603
No log 2.3636 52 0.7460 0.1737 0.7460 0.8637
No log 2.4545 54 0.7704 0.1225 0.7704 0.8777
No log 2.5455 56 0.7848 0.1550 0.7848 0.8859
No log 2.6364 58 0.7854 0.1222 0.7854 0.8862
No log 2.7273 60 0.7807 0.1733 0.7807 0.8836
No log 2.8182 62 0.7643 0.2126 0.7643 0.8742
No log 2.9091 64 0.7631 0.1729 0.7631 0.8735
No log 3.0 66 0.7588 0.2349 0.7588 0.8711
No log 3.0909 68 0.7997 0.3186 0.7997 0.8943
No log 3.1818 70 0.9214 0.4089 0.9214 0.9599
No log 3.2727 72 1.0659 0.3648 1.0659 1.0324
No log 3.3636 74 1.0364 0.3822 1.0364 1.0180
No log 3.4545 76 1.0107 0.3822 1.0107 1.0054
No log 3.5455 78 1.0000 0.3186 1.0000 1.0000
No log 3.6364 80 0.9537 0.3326 0.9537 0.9766
No log 3.7273 82 0.8871 0.3300 0.8871 0.9419
No log 3.8182 84 0.8255 0.3161 0.8255 0.9086
No log 3.9091 86 0.9309 0.3425 0.9309 0.9648
No log 4.0 88 1.0038 0.3409 1.0038 1.0019
No log 4.0909 90 0.9645 0.3169 0.9645 0.9821
No log 4.1818 92 0.8790 0.3234 0.8790 0.9376
No log 4.2727 94 0.8484 0.3372 0.8484 0.9211
No log 4.3636 96 0.8565 0.2988 0.8565 0.9255
No log 4.4545 98 0.8469 0.2594 0.8469 0.9203
No log 4.5455 100 0.8494 0.2934 0.8494 0.9217
No log 4.6364 102 0.8748 0.3117 0.8748 0.9353
No log 4.7273 104 0.8828 0.3141 0.8828 0.9396
No log 4.8182 106 0.8898 0.2388 0.8898 0.9433
No log 4.9091 108 0.9029 0.2492 0.9029 0.9502
No log 5.0 110 0.8918 0.2231 0.8918 0.9444
No log 5.0909 112 0.9600 0.2735 0.9600 0.9798
No log 5.1818 114 0.8233 0.2462 0.8233 0.9073
No log 5.2727 116 0.9371 0.3720 0.9371 0.9680
No log 5.3636 118 1.1231 0.2294 1.1231 1.0598
No log 5.4545 120 1.0881 0.2504 1.0881 1.0431
No log 5.5455 122 0.8777 0.4153 0.8777 0.9368
No log 5.6364 124 0.6938 0.3160 0.6938 0.8329
No log 5.7273 126 0.7076 0.2578 0.7076 0.8412
No log 5.8182 128 0.7059 0.2516 0.7059 0.8402
No log 5.9091 130 0.7978 0.3500 0.7978 0.8932
No log 6.0 132 0.8871 0.3368 0.8871 0.9418
No log 6.0909 134 0.8633 0.4007 0.8633 0.9291
No log 6.1818 136 0.8143 0.4135 0.8143 0.9024
No log 6.2727 138 0.8513 0.3991 0.8513 0.9227
No log 6.3636 140 0.9434 0.3925 0.9434 0.9713
No log 6.4545 142 1.0353 0.3398 1.0353 1.0175
No log 6.5455 144 1.0072 0.3398 1.0072 1.0036
No log 6.6364 146 0.8979 0.3333 0.8979 0.9476
No log 6.7273 148 0.8198 0.3440 0.8198 0.9054
No log 6.8182 150 0.7283 0.3299 0.7283 0.8534
No log 6.9091 152 0.7118 0.2509 0.7118 0.8437
No log 7.0 154 0.7069 0.2574 0.7069 0.8408
No log 7.0909 156 0.7216 0.3267 0.7216 0.8494
No log 7.1818 158 0.7656 0.3615 0.7656 0.8750
No log 7.2727 160 0.9026 0.2958 0.9026 0.9500
No log 7.3636 162 0.9140 0.3425 0.9140 0.9560
No log 7.4545 164 0.8137 0.4212 0.8137 0.9021
No log 7.5455 166 0.7541 0.3840 0.7541 0.8684
No log 7.6364 168 0.7505 0.3575 0.7505 0.8663
No log 7.7273 170 0.8337 0.4037 0.8337 0.9131
No log 7.8182 172 1.1272 0.2464 1.1272 1.0617
No log 7.9091 174 1.5216 0.0504 1.5216 1.2335
No log 8.0 176 1.5947 0.0660 1.5947 1.2628
No log 8.0909 178 1.5878 0.0852 1.5878 1.2601
No log 8.1818 180 1.4747 0.1019 1.4747 1.2144
No log 8.2727 182 1.2116 0.2015 1.2116 1.1007
No log 8.3636 184 0.9928 0.2797 0.9928 0.9964
No log 8.4545 186 0.9385 0.3586 0.9385 0.9688
No log 8.5455 188 0.9376 0.3740 0.9376 0.9683
No log 8.6364 190 0.9957 0.3089 0.9957 0.9978
No log 8.7273 192 1.0370 0.2405 1.0370 1.0184
No log 8.8182 194 0.9316 0.3641 0.9316 0.9652
No log 8.9091 196 0.8506 0.3388 0.8506 0.9223
No log 9.0 198 0.8559 0.3425 0.8559 0.9251
No log 9.0909 200 0.8230 0.3797 0.8230 0.9072
No log 9.1818 202 0.8163 0.3797 0.8163 0.9035
No log 9.2727 204 0.8141 0.3950 0.8141 0.9023
No log 9.3636 206 0.7980 0.4408 0.7980 0.8933
No log 9.4545 208 0.8295 0.4183 0.8295 0.9107
No log 9.5455 210 0.8323 0.3508 0.8323 0.9123
No log 9.6364 212 0.8123 0.3658 0.8123 0.9013
No log 9.7273 214 0.8063 0.3553 0.8063 0.8979
No log 9.8182 216 0.8015 0.4772 0.8015 0.8952
No log 9.9091 218 0.7949 0.4830 0.7949 0.8916
No log 10.0 220 0.7578 0.4660 0.7578 0.8705
No log 10.0909 222 0.7593 0.2624 0.7593 0.8714
No log 10.1818 224 0.7555 0.3409 0.7555 0.8692
No log 10.2727 226 0.7916 0.3401 0.7916 0.8897
No log 10.3636 228 1.0450 0.2934 1.0450 1.0223
No log 10.4545 230 1.1863 0.1774 1.1863 1.0892
No log 10.5455 232 1.1830 0.1696 1.1830 1.0877
No log 10.6364 234 1.0223 0.3043 1.0223 1.0111
No log 10.7273 236 0.8472 0.3590 0.8472 0.9204
No log 10.8182 238 0.7713 0.3440 0.7713 0.8782
No log 10.9091 240 0.7668 0.4157 0.7668 0.8756
No log 11.0 242 0.8100 0.3946 0.8100 0.9000
No log 11.0909 244 0.8885 0.3948 0.8885 0.9426
No log 11.1818 246 0.9007 0.3620 0.9007 0.9491
No log 11.2727 248 0.8466 0.3797 0.8466 0.9201
No log 11.3636 250 0.8460 0.3818 0.8460 0.9198
No log 11.4545 252 0.8264 0.3775 0.8264 0.9091
No log 11.5455 254 0.8002 0.3512 0.8002 0.8945
No log 11.6364 256 0.7934 0.3512 0.7934 0.8907
No log 11.7273 258 0.8125 0.3571 0.8125 0.9014
No log 11.8182 260 0.8032 0.3820 0.8032 0.8962
No log 11.9091 262 0.7748 0.3401 0.7748 0.8802
No log 12.0 264 0.8059 0.3950 0.8059 0.8977
No log 12.0909 266 0.8366 0.3720 0.8366 0.9147
No log 12.1818 268 0.8400 0.4339 0.8400 0.9165
No log 12.2727 270 0.8058 0.3571 0.8058 0.8977
No log 12.3636 272 0.7680 0.3619 0.7680 0.8763
No log 12.4545 274 0.7884 0.3840 0.7884 0.8879
No log 12.5455 276 0.8638 0.3987 0.8638 0.9294
No log 12.6364 278 0.8494 0.3780 0.8494 0.9216
No log 12.7273 280 0.8678 0.3780 0.8678 0.9316
No log 12.8182 282 0.8818 0.3865 0.8818 0.9390
No log 12.9091 284 0.9205 0.3740 0.9205 0.9594
No log 13.0 286 0.9197 0.3526 0.9197 0.9590
No log 13.0909 288 0.9151 0.3432 0.9151 0.9566
No log 13.1818 290 0.8918 0.3690 0.8918 0.9443
No log 13.2727 292 0.8059 0.3775 0.8059 0.8977
No log 13.3636 294 0.7481 0.4200 0.7481 0.8650
No log 13.4545 296 0.7547 0.2879 0.7547 0.8687
No log 13.5455 298 0.7605 0.3928 0.7605 0.8720
No log 13.6364 300 0.7872 0.3101 0.7872 0.8872
No log 13.7273 302 0.8785 0.3930 0.8785 0.9373
No log 13.8182 304 0.9278 0.4353 0.9278 0.9632
No log 13.9091 306 0.9225 0.4031 0.9225 0.9605
No log 14.0 308 0.9783 0.3527 0.9783 0.9891
No log 14.0909 310 1.0042 0.3527 1.0042 1.0021
No log 14.1818 312 1.0351 0.3615 1.0351 1.0174
No log 14.2727 314 1.0064 0.3461 1.0064 1.0032
No log 14.3636 316 0.9273 0.3754 0.9273 0.9630
No log 14.4545 318 0.9175 0.3754 0.9175 0.9579
No log 14.5455 320 0.9176 0.3425 0.9176 0.9579
No log 14.6364 322 0.8984 0.3425 0.8984 0.9479
No log 14.7273 324 0.8698 0.3798 0.8698 0.9326
No log 14.8182 326 0.8174 0.2914 0.8174 0.9041
No log 14.9091 328 0.8103 0.3018 0.8103 0.9002
No log 15.0 330 0.8210 0.2960 0.8210 0.9061
No log 15.0909 332 0.8514 0.3635 0.8514 0.9227
No log 15.1818 334 0.9101 0.3659 0.9101 0.9540
No log 15.2727 336 1.0168 0.3074 1.0168 1.0084
No log 15.3636 338 1.0634 0.2591 1.0634 1.0312
No log 15.4545 340 0.9390 0.3973 0.9390 0.9690
No log 15.5455 342 0.7901 0.2661 0.7901 0.8889
No log 15.6364 344 0.7989 0.2844 0.7989 0.8938
No log 15.7273 346 0.8216 0.2844 0.8216 0.9064
No log 15.8182 348 0.8043 0.3462 0.8043 0.8968
No log 15.9091 350 0.8876 0.4097 0.8876 0.9421
No log 16.0 352 1.0566 0.2916 1.0566 1.0279
No log 16.0909 354 1.2375 0.1144 1.2375 1.1124
No log 16.1818 356 1.3450 0.1497 1.3450 1.1597
No log 16.2727 358 1.2455 0.1715 1.2455 1.1160
No log 16.3636 360 1.0209 0.2323 1.0209 1.0104
No log 16.4545 362 0.8910 0.2907 0.8910 0.9439
No log 16.5455 364 0.8564 0.3279 0.8564 0.9254
No log 16.6364 366 0.8813 0.3425 0.8813 0.9388
No log 16.7273 368 0.9054 0.2943 0.9054 0.9515
No log 16.8182 370 0.9214 0.2590 0.9214 0.9599
No log 16.9091 372 0.9505 0.2442 0.9505 0.9749
No log 17.0 374 1.0009 0.2303 1.0009 1.0005
No log 17.0909 376 0.9964 0.2285 0.9964 0.9982
No log 17.1818 378 0.9148 0.3183 0.9148 0.9564
No log 17.2727 380 0.8743 0.3305 0.8743 0.9350
No log 17.3636 382 0.8546 0.3060 0.8546 0.9244
No log 17.4545 384 0.8388 0.2995 0.8388 0.9159
No log 17.5455 386 0.8520 0.3519 0.8520 0.9230
No log 17.6364 388 0.8783 0.3384 0.8783 0.9372
No log 17.7273 390 0.8693 0.3131 0.8693 0.9324
No log 17.8182 392 0.8391 0.3060 0.8391 0.9160
No log 17.9091 394 0.8396 0.2995 0.8396 0.9163
No log 18.0 396 0.8631 0.3268 0.8631 0.9290
No log 18.0909 398 0.8959 0.3268 0.8959 0.9465
No log 18.1818 400 0.9432 0.3343 0.9432 0.9712
No log 18.2727 402 0.9712 0.3059 0.9712 0.9855
No log 18.3636 404 0.9149 0.3365 0.9149 0.9565
No log 18.4545 406 0.8412 0.3613 0.8412 0.9172
No log 18.5455 408 0.8244 0.3571 0.8244 0.9080
No log 18.6364 410 0.8556 0.4212 0.8556 0.9250
No log 18.7273 412 0.9287 0.4486 0.9287 0.9637
No log 18.8182 414 0.9679 0.3403 0.9679 0.9838
No log 18.9091 416 0.8451 0.4114 0.8451 0.9193
No log 19.0 418 0.7855 0.4114 0.7855 0.8863
No log 19.0909 420 0.7757 0.3888 0.7757 0.8807
No log 19.1818 422 0.8261 0.3914 0.8261 0.9089
No log 19.2727 424 0.8812 0.4173 0.8812 0.9387
No log 19.3636 426 0.9603 0.3973 0.9603 0.9800
No log 19.4545 428 0.9927 0.2278 0.9927 0.9963
No log 19.5455 430 0.8958 0.4051 0.8958 0.9465
No log 19.6364 432 0.8352 0.4665 0.8352 0.9139
No log 19.7273 434 0.7601 0.3613 0.7601 0.8719
No log 19.8182 436 0.7603 0.3616 0.7603 0.8720
No log 19.9091 438 0.7920 0.3396 0.7920 0.8899
No log 20.0 440 0.8835 0.4012 0.8835 0.9400
No log 20.0909 442 1.0085 0.3443 1.0085 1.0043
No log 20.1818 444 1.1422 0.2062 1.1422 1.0687
No log 20.2727 446 1.1230 0.2329 1.1230 1.0597
No log 20.3636 448 1.0272 0.2993 1.0272 1.0135
No log 20.4545 450 0.8815 0.4104 0.8815 0.9389
No log 20.5455 452 0.7952 0.3235 0.7952 0.8917
No log 20.6364 454 0.7576 0.2830 0.7576 0.8704
No log 20.7273 456 0.7535 0.3265 0.7535 0.8680
No log 20.8182 458 0.7406 0.3625 0.7406 0.8606
No log 20.9091 460 0.7617 0.3069 0.7617 0.8727
No log 21.0 462 0.7915 0.2980 0.7915 0.8897
No log 21.0909 464 0.7800 0.2980 0.7800 0.8832
No log 21.1818 466 0.7788 0.2947 0.7788 0.8825
No log 21.2727 468 0.7693 0.3151 0.7693 0.8771
No log 21.3636 470 0.7722 0.3060 0.7722 0.8787
No log 21.4545 472 0.7819 0.3069 0.7819 0.8843
No log 21.5455 474 0.8051 0.3316 0.8051 0.8973
No log 21.6364 476 0.8459 0.3590 0.8459 0.9197
No log 21.7273 478 0.8807 0.3365 0.8807 0.9384
No log 21.8182 480 0.8983 0.3365 0.8983 0.9478
No log 21.9091 482 0.8861 0.3365 0.8861 0.9413
No log 22.0 484 0.8619 0.3365 0.8619 0.9284
No log 22.0909 486 0.8587 0.3231 0.8587 0.9267
No log 22.1818 488 0.8739 0.3268 0.8739 0.9348
No log 22.2727 490 0.9652 0.3194 0.9652 0.9824
No log 22.3636 492 1.1431 0.2134 1.1431 1.0691
No log 22.4545 494 1.1817 0.2443 1.1817 1.0870
No log 22.5455 496 1.1404 0.2134 1.1404 1.0679
No log 22.6364 498 1.0175 0.3097 1.0175 1.0087
0.3742 22.7273 500 0.8745 0.3292 0.8745 0.9352
0.3742 22.8182 502 0.8297 0.2689 0.8297 0.9109
0.3742 22.9091 504 0.8384 0.2914 0.8384 0.9156
0.3742 23.0 506 0.8707 0.3255 0.8707 0.9331
0.3742 23.0909 508 0.8993 0.3388 0.8993 0.9483
0.3742 23.1818 510 0.8585 0.3172 0.8585 0.9266
0.3742 23.2727 512 0.8252 0.2980 0.8252 0.9084
0.3742 23.3636 514 0.8003 0.2980 0.8003 0.8946
0.3742 23.4545 516 0.7984 0.2980 0.7984 0.8935
0.3742 23.5455 518 0.8678 0.3794 0.8678 0.9315
0.3742 23.6364 520 0.9417 0.3068 0.9417 0.9704
0.3742 23.7273 522 0.9542 0.2877 0.9542 0.9768
0.3742 23.8182 524 0.8693 0.3613 0.8693 0.9324
0.3742 23.9091 526 0.8269 0.2888 0.8269 0.9093
0.3742 24.0 528 0.7954 0.2224 0.7954 0.8918
0.3742 24.0909 530 0.7963 0.2256 0.7963 0.8924
0.3742 24.1818 532 0.8037 0.2349 0.8037 0.8965
0.3742 24.2727 534 0.8309 0.2888 0.8309 0.9115
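The Qwk column above is quadratic weighted kappa (Cohen's kappa with quadratic penalty weights), the usual agreement metric for ordinal scoring tasks such as essay-organization grading; the regression outputs are presumably rounded to the nearest score before computing it. A minimal pure-Python sketch of the metric (the example labels below are illustrative, not from the dataset):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    row = [sum(O[i]) for i in range(n_classes)]
    col = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            num += w * O[i][j]                        # observed disagreement
            den += w * row[i] * col[j] / n            # chance disagreement
    return 1.0 - num / den  # 1 = perfect agreement, 0 = chance level

# Illustrative labels (not from the actual evaluation set)
y_true = [0, 1, 2, 2, 3]
y_pred = [0, 2, 2, 1, 3]
kappa = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)  # ≈ 0.808
```

Perfect agreement yields 1.0, chance-level agreement 0.0; the final Qwk of 0.2888 therefore indicates only modest agreement beyond chance.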

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1