ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6456
  • Qwk (quadratic weighted kappa): 0.4314
  • Mse (mean squared error): 0.6456
  • Rmse (root mean squared error): 0.8035
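The card does not include evaluation code, but these metrics are standard for ordinal essay-scoring tasks. Below is a minimal pure-Python sketch of QWK and MSE (sklearn's `cohen_kappa_score(..., weights="quadratic")` computes the same QWK); note that the reported RMSE is simply the square root of the reported MSE:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Agreement between integer ratings, penalizing disagreements quadratically."""
    n = len(y_true)
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# RMSE is the square root of MSE: 0.8035 ~ sqrt(0.6456), as reported above.
assert abs(math.sqrt(0.6456) - 0.8035) < 5e-4
```

Perfect agreement gives a QWK of 1.0; for example, `quadratic_weighted_kappa([0, 1, 2], [0, 2, 1], 3)` returns 0.5.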

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
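With `lr_scheduler_type: linear`, the learning rate decays from 2e-05 toward zero over training. The card does not state a warmup setting, so zero warmup is an assumption in this minimal sketch of the schedule:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """transformers-style linear schedule: optional warmup, then linear decay to zero.

    warmup_steps=0 is an assumption; the card does not specify warmup.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

Halfway through training the rate is half the base rate, e.g. `linear_lr(50, 100)` gives 1e-05, and it reaches 0.0 at the final step.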

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0220 2 2.7054 -0.0262 2.7054 1.6448
No log 0.0440 4 1.4238 0.0991 1.4238 1.1932
No log 0.0659 6 1.2994 -0.1759 1.2994 1.1399
No log 0.0879 8 1.2762 -0.0450 1.2762 1.1297
No log 0.1099 10 1.4131 -0.1138 1.4131 1.1887
No log 0.1319 12 1.5287 -0.2085 1.5287 1.2364
No log 0.1538 14 1.4366 -0.2199 1.4366 1.1986
No log 0.1758 16 1.1283 0.0288 1.1283 1.0622
No log 0.1978 18 0.9138 0.2171 0.9138 0.9559
No log 0.2198 20 0.8521 0.1648 0.8521 0.9231
No log 0.2418 22 0.8358 0.2407 0.8358 0.9142
No log 0.2637 24 0.7521 0.1807 0.7521 0.8672
No log 0.2857 26 0.7322 0.1807 0.7322 0.8557
No log 0.3077 28 0.6997 0.1922 0.6997 0.8365
No log 0.3297 30 0.7805 0.1685 0.7805 0.8835
No log 0.3516 32 0.8539 0.1650 0.8539 0.9241
No log 0.3736 34 0.8294 0.1318 0.8294 0.9107
No log 0.3956 36 0.8031 0.1637 0.8031 0.8961
No log 0.4176 38 0.8017 0.1285 0.8017 0.8954
No log 0.4396 40 0.7894 0.2031 0.7894 0.8885
No log 0.4615 42 0.7546 0.2813 0.7546 0.8687
No log 0.4835 44 0.7235 0.1236 0.7235 0.8506
No log 0.5055 46 0.7231 0.1359 0.7231 0.8504
No log 0.5275 48 0.7635 0.1672 0.7635 0.8738
No log 0.5495 50 0.8035 0.2574 0.8035 0.8964
No log 0.5714 52 0.8246 0.2574 0.8246 0.9081
No log 0.5934 54 0.9177 0.2670 0.9177 0.9580
No log 0.6154 56 0.8956 0.2726 0.8956 0.9464
No log 0.6374 58 0.7919 0.2632 0.7919 0.8899
No log 0.6593 60 0.6808 0.2537 0.6808 0.8251
No log 0.6813 62 0.6905 0.3143 0.6905 0.8310
No log 0.7033 64 0.6945 0.3143 0.6945 0.8334
No log 0.7253 66 0.6401 0.4298 0.6401 0.8000
No log 0.7473 68 0.6361 0.3813 0.6361 0.7976
No log 0.7692 70 0.6640 0.2817 0.6640 0.8149
No log 0.7912 72 0.6410 0.3738 0.6410 0.8006
No log 0.8132 74 0.6396 0.4278 0.6396 0.7998
No log 0.8352 76 0.6888 0.3196 0.6888 0.8299
No log 0.8571 78 0.6464 0.3196 0.6464 0.8040
No log 0.8791 80 0.6184 0.3782 0.6184 0.7864
No log 0.9011 82 0.6900 0.2950 0.6900 0.8307
No log 0.9231 84 0.8287 0.2558 0.8287 0.9103
No log 0.9451 86 0.9470 0.2204 0.9470 0.9731
No log 0.9670 88 0.9585 0.2932 0.9585 0.9790
No log 0.9890 90 0.9567 0.2871 0.9567 0.9781
No log 1.0110 92 0.9154 0.2995 0.9154 0.9568
No log 1.0330 94 0.8053 0.3127 0.8053 0.8974
No log 1.0549 96 0.8192 0.2261 0.8192 0.9051
No log 1.0769 98 0.9556 0.1636 0.9556 0.9776
No log 1.0989 100 1.1248 0.1899 1.1248 1.0606
No log 1.1209 102 1.3542 0.0191 1.3542 1.1637
No log 1.1429 104 1.3922 0.0371 1.3922 1.1799
No log 1.1648 106 1.3202 0.0515 1.3202 1.1490
No log 1.1868 108 1.2134 0.0585 1.2134 1.1016
No log 1.2088 110 1.0666 0.2615 1.0666 1.0328
No log 1.2308 112 0.9342 0.3231 0.9342 0.9665
No log 1.2527 114 0.8017 0.3105 0.8017 0.8954
No log 1.2747 116 0.6869 0.3011 0.6869 0.8288
No log 1.2967 118 0.6969 0.2685 0.6969 0.8348
No log 1.3187 120 0.8037 0.2692 0.8037 0.8965
No log 1.3407 122 0.9450 0.2463 0.9450 0.9721
No log 1.3626 124 0.9989 0.2410 0.9989 0.9995
No log 1.3846 126 0.9935 0.1254 0.9935 0.9968
No log 1.4066 128 0.9939 0.1822 0.9939 0.9969
No log 1.4286 130 1.0008 0.1940 1.0008 1.0004
No log 1.4505 132 0.9715 0.2387 0.9715 0.9857
No log 1.4725 134 0.8647 0.2547 0.8647 0.9299
No log 1.4945 136 0.7577 0.3060 0.7577 0.8705
No log 1.5165 138 0.7418 0.3127 0.7418 0.8613
No log 1.5385 140 0.7315 0.3127 0.7315 0.8553
No log 1.5604 142 0.7378 0.2883 0.7378 0.8589
No log 1.5824 144 0.7867 0.3032 0.7867 0.8870
No log 1.6044 146 0.9767 0.3236 0.9767 0.9883
No log 1.6264 148 1.2361 0.1917 1.2361 1.1118
No log 1.6484 150 1.2493 0.1995 1.2493 1.1177
No log 1.6703 152 1.0143 0.2781 1.0143 1.0071
No log 1.6923 154 0.9498 0.1487 0.9498 0.9746
No log 1.7143 156 0.9742 0.1208 0.9742 0.9870
No log 1.7363 158 0.8815 0.1141 0.8815 0.9389
No log 1.7582 160 0.8447 0.2817 0.8447 0.9191
No log 1.7802 162 0.8483 0.3099 0.8483 0.9210
No log 1.8022 164 0.8040 0.3099 0.8040 0.8966
No log 1.8242 166 0.7407 0.3020 0.7407 0.8606
No log 1.8462 168 0.6873 0.1775 0.6873 0.8291
No log 1.8681 170 0.7217 0.2015 0.7217 0.8495
No log 1.8901 172 0.7492 0.2642 0.7492 0.8656
No log 1.9121 174 0.7049 0.1961 0.7049 0.8396
No log 1.9341 176 0.7283 0.1686 0.7283 0.8534
No log 1.9560 178 0.8714 0.3329 0.8714 0.9335
No log 1.9780 180 0.9402 0.3167 0.9402 0.9696
No log 2.0 182 0.9174 0.1972 0.9174 0.9578
No log 2.0220 184 0.8466 0.1598 0.8466 0.9201
No log 2.0440 186 0.7991 0.1972 0.7991 0.8939
No log 2.0659 188 0.7734 0.2883 0.7734 0.8794
No log 2.0879 190 0.7743 0.3302 0.7743 0.8799
No log 2.1099 192 0.8506 0.3274 0.8506 0.9223
No log 2.1319 194 0.8340 0.3274 0.8340 0.9132
No log 2.1538 196 0.8322 0.4018 0.8322 0.9123
No log 2.1758 198 0.7408 0.4366 0.7408 0.8607
No log 2.1978 200 0.6365 0.4596 0.6365 0.7978
No log 2.2198 202 0.6518 0.3716 0.6518 0.8073
No log 2.2418 204 0.6483 0.3716 0.6483 0.8052
No log 2.2637 206 0.6257 0.4114 0.6257 0.7910
No log 2.2857 208 0.6743 0.4764 0.6743 0.8211
No log 2.3077 210 0.7317 0.4502 0.7317 0.8554
No log 2.3297 212 0.8466 0.4030 0.8466 0.9201
No log 2.3516 214 0.8628 0.3450 0.8628 0.9289
No log 2.3736 216 0.9034 0.3450 0.9034 0.9505
No log 2.3956 218 0.8751 0.3302 0.8751 0.9355
No log 2.4176 220 0.9132 0.4014 0.9132 0.9556
No log 2.4396 222 0.8168 0.4014 0.8168 0.9038
No log 2.4615 224 0.7093 0.4167 0.7093 0.8422
No log 2.4835 226 0.6127 0.4147 0.6127 0.7828
No log 2.5055 228 0.5911 0.5516 0.5911 0.7688
No log 2.5275 230 0.5572 0.4441 0.5572 0.7464
No log 2.5495 232 0.5784 0.5231 0.5784 0.7605
No log 2.5714 234 0.5820 0.5009 0.5820 0.7629
No log 2.5934 236 0.5714 0.5003 0.5714 0.7559
No log 2.6154 238 0.6021 0.5308 0.6021 0.7759
No log 2.6374 240 0.5967 0.5560 0.5967 0.7725
No log 2.6593 242 0.5952 0.5671 0.5952 0.7715
No log 2.6813 244 0.6210 0.4418 0.6210 0.7881
No log 2.7033 246 0.6881 0.4621 0.6881 0.8295
No log 2.7253 248 0.7061 0.4621 0.7061 0.8403
No log 2.7473 250 0.6240 0.4700 0.6240 0.7900
No log 2.7692 252 0.6478 0.4836 0.6478 0.8049
No log 2.7912 254 0.7759 0.3678 0.7759 0.8808
No log 2.8132 256 0.8089 0.3559 0.8089 0.8994
No log 2.8352 258 0.6882 0.4315 0.6882 0.8296
No log 2.8571 260 0.5887 0.5140 0.5887 0.7673
No log 2.8791 262 0.5596 0.5738 0.5596 0.7481
No log 2.9011 264 0.5655 0.5201 0.5655 0.7520
No log 2.9231 266 0.6124 0.4764 0.6124 0.7826
No log 2.9451 268 0.6144 0.4997 0.6144 0.7838
No log 2.9670 270 0.6033 0.4763 0.6033 0.7767
No log 2.9890 272 0.6163 0.4657 0.6163 0.7850
No log 3.0110 274 0.6742 0.3784 0.6742 0.8211
No log 3.0330 276 0.6483 0.4118 0.6483 0.8052
No log 3.0549 278 0.6372 0.4828 0.6372 0.7982
No log 3.0769 280 0.7365 0.4531 0.7365 0.8582
No log 3.0989 282 0.8107 0.3867 0.8107 0.9004
No log 3.1209 284 0.8152 0.3521 0.8152 0.9029
No log 3.1429 286 0.7807 0.3688 0.7807 0.8836
No log 3.1648 288 0.7117 0.4597 0.7117 0.8436
No log 3.1868 290 0.6541 0.3763 0.6541 0.8088
No log 3.2088 292 0.6628 0.3990 0.6628 0.8141
No log 3.2308 294 0.6660 0.3817 0.6660 0.8161
No log 3.2527 296 0.6913 0.4389 0.6913 0.8315
No log 3.2747 298 0.7005 0.4389 0.7005 0.8370
No log 3.2967 300 0.6796 0.4335 0.6796 0.8243
No log 3.3187 302 0.6664 0.3746 0.6664 0.8164
No log 3.3407 304 0.7049 0.3746 0.7049 0.8396
No log 3.3626 306 0.6874 0.3465 0.6874 0.8291
No log 3.3846 308 0.6668 0.4059 0.6668 0.8166
No log 3.4066 310 0.6424 0.4134 0.6424 0.8015
No log 3.4286 312 0.6229 0.3640 0.6229 0.7892
No log 3.4505 314 0.6265 0.4393 0.6265 0.7915
No log 3.4725 316 0.5958 0.4473 0.5958 0.7719
No log 3.4945 318 0.5920 0.4724 0.5920 0.7694
No log 3.5165 320 0.5536 0.4874 0.5536 0.7440
No log 3.5385 322 0.5892 0.5983 0.5892 0.7676
No log 3.5604 324 0.6334 0.5650 0.6334 0.7959
No log 3.5824 326 0.5735 0.5779 0.5735 0.7573
No log 3.6044 328 0.5312 0.4659 0.5312 0.7288
No log 3.6264 330 0.6558 0.4982 0.6558 0.8098
No log 3.6484 332 0.7737 0.4978 0.7737 0.8796
No log 3.6703 334 0.7178 0.4614 0.7178 0.8472
No log 3.6923 336 0.5956 0.4864 0.5956 0.7717
No log 3.7143 338 0.5497 0.4768 0.5497 0.7414
No log 3.7363 340 0.5595 0.4701 0.5595 0.7480
No log 3.7582 342 0.5541 0.4914 0.5541 0.7444
No log 3.7802 344 0.5822 0.4724 0.5822 0.7630
No log 3.8022 346 0.6964 0.4592 0.6964 0.8345
No log 3.8242 348 0.7560 0.4852 0.7560 0.8695
No log 3.8462 350 0.6903 0.4400 0.6903 0.8309
No log 3.8681 352 0.5553 0.5140 0.5553 0.7452
No log 3.8901 354 0.5169 0.6154 0.5169 0.7190
No log 3.9121 356 0.5810 0.5779 0.5810 0.7622
No log 3.9341 358 0.6338 0.4616 0.6338 0.7961
No log 3.9560 360 0.5852 0.5642 0.5852 0.7650
No log 3.9780 362 0.5404 0.5556 0.5404 0.7351
No log 4.0 364 0.5552 0.5053 0.5552 0.7451
No log 4.0220 366 0.6616 0.4531 0.6616 0.8134
No log 4.0440 368 0.7003 0.4295 0.7003 0.8368
No log 4.0659 370 0.6405 0.4295 0.6405 0.8003
No log 4.0879 372 0.5819 0.4828 0.5819 0.7628
No log 4.1099 374 0.5750 0.4701 0.5750 0.7583
No log 4.1319 376 0.5793 0.4681 0.5793 0.7611
No log 4.1538 378 0.5853 0.4044 0.5853 0.7651
No log 4.1758 380 0.5993 0.4513 0.5993 0.7742
No log 4.1978 382 0.6002 0.4247 0.6002 0.7747
No log 4.2198 384 0.5944 0.4020 0.5944 0.7709
No log 4.2418 386 0.5839 0.4397 0.5839 0.7642
No log 4.2637 388 0.5839 0.4914 0.5839 0.7641
No log 4.2857 390 0.5848 0.4901 0.5848 0.7647
No log 4.3077 392 0.6181 0.5418 0.6181 0.7862
No log 4.3297 394 0.6618 0.5233 0.6618 0.8135
No log 4.3516 396 0.6720 0.5185 0.6720 0.8198
No log 4.3736 398 0.6482 0.5459 0.6482 0.8051
No log 4.3956 400 0.5795 0.5632 0.5795 0.7613
No log 4.4176 402 0.5847 0.4858 0.5847 0.7646
No log 4.4396 404 0.6152 0.5200 0.6152 0.7843
No log 4.4615 406 0.5945 0.5406 0.5945 0.7710
No log 4.4835 408 0.5782 0.5061 0.5782 0.7604
No log 4.5055 410 0.5803 0.5107 0.5803 0.7618
No log 4.5275 412 0.5771 0.5061 0.5771 0.7597
No log 4.5495 414 0.5889 0.5265 0.5889 0.7674
No log 4.5714 416 0.5772 0.5265 0.5772 0.7597
No log 4.5934 418 0.5736 0.4701 0.5736 0.7574
No log 4.6154 420 0.6126 0.4428 0.6126 0.7827
No log 4.6374 422 0.6305 0.4212 0.6305 0.7941
No log 4.6593 424 0.6862 0.3723 0.6862 0.8284
No log 4.6813 426 0.8055 0.3782 0.8055 0.8975
No log 4.7033 428 0.8457 0.3782 0.8457 0.9196
No log 4.7253 430 0.7482 0.3913 0.7482 0.8650
No log 4.7473 432 0.6624 0.3914 0.6624 0.8139
No log 4.7692 434 0.5891 0.4212 0.5891 0.7675
No log 4.7912 436 0.5686 0.4782 0.5686 0.7540
No log 4.8132 438 0.5725 0.5084 0.5725 0.7567
No log 4.8352 440 0.5749 0.5003 0.5749 0.7582
No log 4.8571 442 0.5901 0.4912 0.5901 0.7682
No log 4.8791 444 0.6022 0.5433 0.6022 0.7760
No log 4.9011 446 0.5909 0.5368 0.5909 0.7687
No log 4.9231 448 0.5734 0.4928 0.5734 0.7573
No log 4.9451 450 0.5634 0.5142 0.5634 0.7506
No log 4.9670 452 0.5639 0.5084 0.5639 0.7510
No log 4.9890 454 0.5668 0.5133 0.5668 0.7529
No log 5.0110 456 0.6081 0.4883 0.6081 0.7798
No log 5.0330 458 0.7261 0.4952 0.7261 0.8521
No log 5.0549 460 0.7400 0.5294 0.7400 0.8602
No log 5.0769 462 0.7115 0.4513 0.7115 0.8435
No log 5.0989 464 0.6582 0.4954 0.6582 0.8113
No log 5.1209 466 0.6300 0.5206 0.6300 0.7937
No log 5.1429 468 0.6413 0.5285 0.6413 0.8008
No log 5.1648 470 0.6378 0.5285 0.6378 0.7987
No log 5.1868 472 0.6308 0.5661 0.6308 0.7942
No log 5.2088 474 0.6234 0.5533 0.6234 0.7896
No log 5.2308 476 0.6176 0.5547 0.6176 0.7859
No log 5.2527 478 0.6098 0.5159 0.6098 0.7809
No log 5.2747 480 0.6023 0.4619 0.6023 0.7761
No log 5.2967 482 0.6080 0.4660 0.6080 0.7797
No log 5.3187 484 0.6182 0.4639 0.6182 0.7863
No log 5.3407 486 0.6262 0.4555 0.6262 0.7913
No log 5.3626 488 0.6236 0.5053 0.6236 0.7897
No log 5.3846 490 0.6294 0.5032 0.6294 0.7934
No log 5.4066 492 0.6124 0.5218 0.6124 0.7826
No log 5.4286 494 0.6078 0.5318 0.6078 0.7796
No log 5.4505 496 0.6293 0.5406 0.6293 0.7933
No log 5.4725 498 0.6335 0.5185 0.6335 0.7959
0.4144 5.4945 500 0.6574 0.5063 0.6574 0.8108
0.4144 5.5165 502 0.6874 0.5459 0.6874 0.8291
0.4144 5.5385 504 0.6632 0.5473 0.6632 0.8144
0.4144 5.5604 506 0.6236 0.5149 0.6236 0.7897
0.4144 5.5824 508 0.6253 0.5067 0.6253 0.7907
0.4144 5.6044 510 0.6565 0.5013 0.6565 0.8102
0.4144 5.6264 512 0.6707 0.4375 0.6707 0.8190
0.4144 5.6484 514 0.6486 0.5184 0.6486 0.8053
0.4144 5.6703 516 0.6265 0.5254 0.6265 0.7915
0.4144 5.6923 518 0.6529 0.4302 0.6529 0.8080
0.4144 5.7143 520 0.6769 0.4282 0.6769 0.8227
0.4144 5.7363 522 0.6648 0.4724 0.6648 0.8154
0.4144 5.7582 524 0.6542 0.4837 0.6542 0.8088
0.4144 5.7802 526 0.6456 0.4314 0.6456 0.8035
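The metrics reported at the top of this card match the final row of the table, not the best checkpoint: the highest Qwk logged during training is 0.6154 at epoch 3.8901 (validation loss 0.5169). A small sketch of scanning rows for the best-Qwk checkpoint, using `(epoch, validation_loss, qwk)` triples copied from the table above:

```python
# A few (epoch, validation_loss, qwk) rows copied from the training table.
rows = [
    (2.5275, 0.5572, 0.4441),
    (3.8901, 0.5169, 0.6154),
    (5.7802, 0.6456, 0.4314),  # final row: the metrics reported at the top
]

# Pick the checkpoint with the highest Qwk rather than the last one.
best = max(rows, key=lambda r: r[2])
```

Here `best` is the epoch-3.8901 row, suggesting that early stopping or checkpoint selection on Qwk would beat the final model.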

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1