ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6282
  • QWK: 0.6350
  • MSE: 0.6282
  • RMSE: 0.7926
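
For reference, QWK is Cohen's kappa with quadratic weights, which rewards predictions that land close to the true score on an ordinal scale. A minimal sketch of how these metrics can be computed with scikit-learn; `y_true` and `y_pred` are illustrative made-up scores, not the actual evaluation data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative gold and predicted essay-organization scores (not the real eval set)
y_true = [2, 3, 1, 4, 3, 2, 4, 1]
y_pred = [2, 3, 2, 4, 2, 2, 3, 1]

# Quadratic weighted kappa: disagreements are penalized by squared distance
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))  # RMSE is the square root of MSE
print(f"QWK: {qwk:.4f}  MSE: {mse:.4f}  RMSE: {rmse:.4f}")
```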

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
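
The list above maps onto Hugging Face `TrainingArguments` roughly as follows; this is a sketch, and `output_dir` (plus any argument not listed above) is a placeholder rather than a value from the original run:

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; output_dir is hypothetical.
args = TrainingArguments(
    output_dir="arabert-task5-organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```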

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.1176 2 3.9697 0.0013 3.9697 1.9924
No log 0.2353 4 2.2646 0.0215 2.2646 1.5049
No log 0.3529 6 2.3084 -0.0234 2.3084 1.5193
No log 0.4706 8 1.7808 0.0498 1.7808 1.3345
No log 0.5882 10 1.1342 0.1218 1.1342 1.0650
No log 0.7059 12 1.1609 0.0886 1.1609 1.0774
No log 0.8235 14 1.2450 0.0242 1.2450 1.1158
No log 0.9412 16 1.2449 0.0065 1.2449 1.1157
No log 1.0588 18 1.1698 0.0941 1.1698 1.0816
No log 1.1765 20 1.0975 0.2049 1.0975 1.0476
No log 1.2941 22 1.1091 0.2004 1.1091 1.0531
No log 1.4118 24 1.1335 0.1618 1.1335 1.0646
No log 1.5294 26 1.0693 0.2343 1.0693 1.0341
No log 1.6471 28 1.0327 0.2023 1.0327 1.0162
No log 1.7647 30 0.9358 0.2935 0.9358 0.9674
No log 1.8824 32 0.9041 0.3162 0.9041 0.9509
No log 2.0 34 0.9153 0.3156 0.9153 0.9567
No log 2.1176 36 1.0419 0.2593 1.0419 1.0207
No log 2.2353 38 1.2359 0.0820 1.2359 1.1117
No log 2.3529 40 1.4581 0.0627 1.4581 1.2075
No log 2.4706 42 1.6898 0.1109 1.6898 1.2999
No log 2.5882 44 1.6017 0.1110 1.6017 1.2656
No log 2.7059 46 1.2895 0.1138 1.2895 1.1356
No log 2.8235 48 0.9936 0.3026 0.9936 0.9968
No log 2.9412 50 0.8775 0.3562 0.8775 0.9368
No log 3.0588 52 0.9324 0.3001 0.9324 0.9656
No log 3.1765 54 1.0120 0.2249 1.0120 1.0060
No log 3.2941 56 1.0148 0.3385 1.0148 1.0074
No log 3.4118 58 0.9727 0.3063 0.9727 0.9863
No log 3.5294 60 0.9158 0.2897 0.9158 0.9570
No log 3.6471 62 0.9292 0.3229 0.9292 0.9639
No log 3.7647 64 0.9284 0.3229 0.9284 0.9636
No log 3.8824 66 0.9328 0.3499 0.9328 0.9658
No log 4.0 68 1.0212 0.2635 1.0212 1.0105
No log 4.1176 70 1.0720 0.2969 1.0720 1.0354
No log 4.2353 72 1.0179 0.2827 1.0179 1.0089
No log 4.3529 74 0.9296 0.2981 0.9296 0.9642
No log 4.4706 76 0.8652 0.4148 0.8652 0.9302
No log 4.5882 78 0.8682 0.3815 0.8682 0.9318
No log 4.7059 80 0.8796 0.3304 0.8796 0.9379
No log 4.8235 82 0.8739 0.2770 0.8739 0.9348
No log 4.9412 84 0.8458 0.3837 0.8458 0.9197
No log 5.0588 86 0.8459 0.4000 0.8459 0.9197
No log 5.1765 88 0.9192 0.4024 0.9192 0.9587
No log 5.2941 90 0.9950 0.3608 0.9950 0.9975
No log 5.4118 92 1.0091 0.3885 1.0091 1.0045
No log 5.5294 94 1.0118 0.3627 1.0118 1.0059
No log 5.6471 96 0.9435 0.3830 0.9435 0.9714
No log 5.7647 98 0.8433 0.4857 0.8433 0.9183
No log 5.8824 100 0.8067 0.4138 0.8067 0.8982
No log 6.0 102 0.8797 0.3358 0.8797 0.9379
No log 6.1176 104 0.9180 0.3740 0.9180 0.9581
No log 6.2353 106 0.9370 0.3836 0.9370 0.9680
No log 6.3529 108 0.9614 0.4249 0.9614 0.9805
No log 6.4706 110 0.9517 0.4557 0.9517 0.9756
No log 6.5882 112 0.9785 0.4162 0.9785 0.9892
No log 6.7059 114 0.9699 0.4573 0.9699 0.9848
No log 6.8235 116 0.8179 0.4893 0.8179 0.9044
No log 6.9412 118 0.7453 0.5575 0.7453 0.8633
No log 7.0588 120 0.7743 0.4269 0.7743 0.8800
No log 7.1765 122 0.7256 0.4223 0.7256 0.8518
No log 7.2941 124 0.6922 0.4893 0.6922 0.8320
No log 7.4118 126 0.6867 0.5010 0.6867 0.8287
No log 7.5294 128 0.6869 0.5260 0.6869 0.8288
No log 7.6471 130 0.6811 0.5516 0.6811 0.8253
No log 7.7647 132 0.6779 0.5373 0.6779 0.8233
No log 7.8824 134 0.6841 0.5467 0.6841 0.8271
No log 8.0 136 0.6963 0.5894 0.6963 0.8345
No log 8.1176 138 0.6850 0.5434 0.6850 0.8277
No log 8.2353 140 0.6683 0.5450 0.6683 0.8175
No log 8.3529 142 0.6609 0.6417 0.6609 0.8130
No log 8.4706 144 0.6626 0.6409 0.6626 0.8140
No log 8.5882 146 0.6498 0.6409 0.6498 0.8061
No log 8.7059 148 0.7214 0.4924 0.7214 0.8493
No log 8.8235 150 0.9014 0.4654 0.9014 0.9494
No log 8.9412 152 0.8355 0.4340 0.8355 0.9141
No log 9.0588 154 0.8259 0.4340 0.8259 0.9088
No log 9.1765 156 0.7165 0.5046 0.7165 0.8465
No log 9.2941 158 0.6353 0.6407 0.6353 0.7970
No log 9.4118 160 0.6342 0.6262 0.6342 0.7963
No log 9.5294 162 0.6608 0.5785 0.6608 0.8129
No log 9.6471 164 0.6847 0.5342 0.6847 0.8275
No log 9.7647 166 0.6554 0.6113 0.6554 0.8095
No log 9.8824 168 0.6222 0.6460 0.6222 0.7888
No log 10.0 170 0.6474 0.6945 0.6474 0.8046
No log 10.1176 172 0.6556 0.6832 0.6556 0.8097
No log 10.2353 174 0.6394 0.7009 0.6394 0.7996
No log 10.3529 176 0.6608 0.6187 0.6608 0.8129
No log 10.4706 178 0.6802 0.5740 0.6802 0.8247
No log 10.5882 180 0.6627 0.5657 0.6627 0.8141
No log 10.7059 182 0.6645 0.6824 0.6645 0.8151
No log 10.8235 184 0.6776 0.5588 0.6776 0.8231
No log 10.9412 186 0.6713 0.5588 0.6713 0.8193
No log 11.0588 188 0.6527 0.6556 0.6527 0.8079
No log 11.1765 190 0.6518 0.5614 0.6518 0.8074
No log 11.2941 192 0.6396 0.5939 0.6396 0.7998
No log 11.4118 194 0.6254 0.6584 0.6254 0.7909
No log 11.5294 196 0.6101 0.6796 0.6101 0.7811
No log 11.6471 198 0.6080 0.6911 0.6080 0.7797
No log 11.7647 200 0.6058 0.6796 0.6058 0.7784
No log 11.8824 202 0.6061 0.6911 0.6061 0.7785
No log 12.0 204 0.6040 0.7017 0.6040 0.7771
No log 12.1176 206 0.6072 0.6903 0.6072 0.7792
No log 12.2353 208 0.6071 0.6805 0.6071 0.7792
No log 12.3529 210 0.6259 0.6750 0.6259 0.7912
No log 12.4706 212 0.6027 0.7083 0.6027 0.7763
No log 12.5882 214 0.6031 0.6025 0.6031 0.7766
No log 12.7059 216 0.6028 0.6616 0.6028 0.7764
No log 12.8235 218 0.6098 0.6897 0.6098 0.7809
No log 12.9412 220 0.6169 0.7238 0.6169 0.7855
No log 13.0588 222 0.6377 0.6252 0.6377 0.7985
No log 13.1765 224 0.6527 0.5894 0.6527 0.8079
No log 13.2941 226 0.7011 0.5368 0.7011 0.8373
No log 13.4118 228 0.6783 0.5810 0.6783 0.8236
No log 13.5294 230 0.6783 0.6315 0.6783 0.8236
No log 13.6471 232 0.6539 0.6655 0.6539 0.8086
No log 13.7647 234 0.6100 0.7116 0.6100 0.7810
No log 13.8824 236 0.5989 0.6736 0.5989 0.7739
No log 14.0 238 0.5757 0.7179 0.5757 0.7587
No log 14.1176 240 0.5598 0.7089 0.5598 0.7482
No log 14.2353 242 0.5709 0.6905 0.5709 0.7556
No log 14.3529 244 0.5899 0.6720 0.5899 0.7680
No log 14.4706 246 0.5797 0.6598 0.5797 0.7614
No log 14.5882 248 0.6415 0.5410 0.6415 0.8009
No log 14.7059 250 0.7028 0.5160 0.7028 0.8383
No log 14.8235 252 0.6430 0.5317 0.6430 0.8019
No log 14.9412 254 0.5896 0.6353 0.5896 0.7679
No log 15.0588 256 0.6091 0.6287 0.6091 0.7805
No log 15.1765 258 0.5966 0.6676 0.5966 0.7724
No log 15.2941 260 0.5642 0.7245 0.5642 0.7511
No log 15.4118 262 0.5764 0.7066 0.5764 0.7592
No log 15.5294 264 0.5809 0.7291 0.5809 0.7621
No log 15.6471 266 0.5908 0.6667 0.5908 0.7687
No log 15.7647 268 0.6081 0.6676 0.6081 0.7798
No log 15.8824 270 0.6420 0.6720 0.6420 0.8012
No log 16.0 272 0.6523 0.7059 0.6523 0.8077
No log 16.1176 274 0.6596 0.6243 0.6596 0.8122
No log 16.2353 276 0.6560 0.6815 0.6560 0.8099
No log 16.3529 278 0.6431 0.6815 0.6431 0.8019
No log 16.4706 280 0.6244 0.6441 0.6244 0.7902
No log 16.5882 282 0.6084 0.6733 0.6084 0.7800
No log 16.7059 284 0.6042 0.6556 0.6042 0.7773
No log 16.8235 286 0.6066 0.6676 0.6066 0.7788
No log 16.9412 288 0.5989 0.7016 0.5989 0.7739
No log 17.0588 290 0.5974 0.7409 0.5974 0.7729
No log 17.1765 292 0.6050 0.6667 0.6050 0.7778
No log 17.2941 294 0.6021 0.7389 0.6021 0.7759
No log 17.4118 296 0.6012 0.6728 0.6012 0.7754
No log 17.5294 298 0.5829 0.7291 0.5829 0.7634
No log 17.6471 300 0.5989 0.6676 0.5989 0.7739
No log 17.7647 302 0.6133 0.6287 0.6133 0.7831
No log 17.8824 304 0.5963 0.6676 0.5963 0.7722
No log 18.0 306 0.5748 0.7298 0.5748 0.7582
No log 18.1176 308 0.5777 0.6970 0.5777 0.7601
No log 18.2353 310 0.5759 0.6866 0.5759 0.7589
No log 18.3529 312 0.5787 0.7083 0.5787 0.7607
No log 18.4706 314 0.5901 0.6337 0.5901 0.7682
No log 18.5882 316 0.5804 0.6858 0.5804 0.7618
No log 18.7059 318 0.5724 0.7459 0.5724 0.7565
No log 18.8235 320 0.5917 0.7216 0.5917 0.7692
No log 18.9412 322 0.5941 0.7101 0.5941 0.7708
No log 19.0588 324 0.5847 0.7446 0.5847 0.7646
No log 19.1765 326 0.5893 0.6888 0.5893 0.7676
No log 19.2941 328 0.6110 0.6881 0.6110 0.7817
No log 19.4118 330 0.6176 0.6881 0.6176 0.7859
No log 19.5294 332 0.5915 0.7292 0.5915 0.7691
No log 19.6471 334 0.5925 0.7452 0.5925 0.7697
No log 19.7647 336 0.6126 0.7349 0.6126 0.7827
No log 19.8824 338 0.6725 0.5601 0.6725 0.8201
No log 20.0 340 0.7007 0.5601 0.7007 0.8371
No log 20.1176 342 0.6530 0.5588 0.6530 0.8081
No log 20.2353 344 0.6196 0.7066 0.6196 0.7871
No log 20.3529 346 0.6213 0.6053 0.6213 0.7882
No log 20.4706 348 0.6029 0.6954 0.6029 0.7764
No log 20.5882 350 0.5777 0.7508 0.5777 0.7601
No log 20.7059 352 0.5924 0.7579 0.5924 0.7697
No log 20.8235 354 0.6216 0.5684 0.6216 0.7884
No log 20.9412 356 0.6135 0.6312 0.6135 0.7832
No log 21.0588 358 0.5625 0.7520 0.5625 0.7500
No log 21.1765 360 0.5487 0.7508 0.5487 0.7408
No log 21.2941 362 0.5553 0.7508 0.5553 0.7452
No log 21.4118 364 0.5671 0.7514 0.5671 0.7531
No log 21.5294 366 0.5681 0.7573 0.5681 0.7537
No log 21.6471 368 0.5772 0.7573 0.5772 0.7597
No log 21.7647 370 0.5887 0.7179 0.5887 0.7673
No log 21.8824 372 0.6108 0.6946 0.6108 0.7815
No log 22.0 374 0.6114 0.6762 0.6114 0.7819
No log 22.1176 376 0.5952 0.7001 0.5952 0.7715
No log 22.2353 378 0.5858 0.7001 0.5858 0.7653
No log 22.3529 380 0.5867 0.7514 0.5867 0.7660
No log 22.4706 382 0.5938 0.7514 0.5938 0.7706
No log 22.5882 384 0.5915 0.7619 0.5915 0.7691
No log 22.7059 386 0.5950 0.7179 0.5950 0.7714
No log 22.8235 388 0.5996 0.7508 0.5996 0.7743
No log 22.9412 390 0.6091 0.6886 0.6091 0.7805
No log 23.0588 392 0.6075 0.7238 0.6075 0.7794
No log 23.1765 394 0.6065 0.7292 0.6065 0.7788
No log 23.2941 396 0.5977 0.7183 0.5977 0.7731
No log 23.4118 398 0.5840 0.7403 0.5840 0.7642
No log 23.5294 400 0.5747 0.7291 0.5747 0.7581
No log 23.6471 402 0.5776 0.7291 0.5776 0.7600
No log 23.7647 404 0.5756 0.6962 0.5756 0.7587
No log 23.8824 406 0.5589 0.6962 0.5589 0.7476
No log 24.0 408 0.5466 0.7066 0.5466 0.7393
No log 24.1176 410 0.5559 0.7396 0.5559 0.7456
No log 24.2353 412 0.5845 0.6742 0.5845 0.7645
No log 24.3529 414 0.6048 0.6409 0.6048 0.7777
No log 24.4706 416 0.5816 0.7025 0.5816 0.7626
No log 24.5882 418 0.5783 0.7074 0.5783 0.7604
No log 24.7059 420 0.5839 0.7017 0.5839 0.7641
No log 24.8235 422 0.6147 0.6292 0.6147 0.7840
No log 24.9412 424 0.6316 0.6409 0.6316 0.7947
No log 25.0588 426 0.6277 0.6958 0.6277 0.7923
No log 25.1765 428 0.6476 0.5844 0.6476 0.8047
No log 25.2941 430 0.6874 0.5054 0.6874 0.8291
No log 25.4118 432 0.7072 0.5516 0.7072 0.8410
No log 25.5294 434 0.7103 0.5472 0.7103 0.8428
No log 25.6471 436 0.7145 0.5746 0.7145 0.8453
No log 25.7647 438 0.7021 0.5847 0.7021 0.8379
No log 25.8824 440 0.6862 0.5688 0.6862 0.8284
No log 26.0 442 0.6873 0.5654 0.6873 0.8290
No log 26.1176 444 0.7017 0.5033 0.7017 0.8377
No log 26.2353 446 0.6976 0.5033 0.6976 0.8352
No log 26.3529 448 0.6720 0.5728 0.6720 0.8198
No log 26.4706 450 0.6465 0.5858 0.6465 0.8040
No log 26.5882 452 0.6337 0.5774 0.6337 0.7960
No log 26.7059 454 0.6093 0.6588 0.6093 0.7806
No log 26.8235 456 0.5986 0.6954 0.5986 0.7737
No log 26.9412 458 0.5990 0.7051 0.5990 0.7740
No log 27.0588 460 0.5940 0.6719 0.5940 0.7707
No log 27.1765 462 0.5621 0.7059 0.5621 0.7497
No log 27.2941 464 0.5496 0.7403 0.5496 0.7414
No log 27.4118 466 0.5791 0.6139 0.5791 0.7610
No log 27.5294 468 0.6119 0.6172 0.6119 0.7822
No log 27.6471 470 0.6103 0.5962 0.6103 0.7812
No log 27.7647 472 0.5748 0.6447 0.5748 0.7581
No log 27.8824 474 0.5403 0.7291 0.5403 0.7351
No log 28.0 476 0.5470 0.7284 0.5470 0.7396
No log 28.1176 478 0.5614 0.6623 0.5614 0.7493
No log 28.2353 480 0.5602 0.7291 0.5602 0.7484
No log 28.3529 482 0.5685 0.7514 0.5685 0.7540
No log 28.4706 484 0.5759 0.7403 0.5759 0.7589
No log 28.5882 486 0.5710 0.7291 0.5710 0.7557
No log 28.7059 488 0.5695 0.6519 0.5695 0.7547
No log 28.8235 490 0.5691 0.6623 0.5691 0.7544
No log 28.9412 492 0.5800 0.6724 0.5800 0.7615
No log 29.0588 494 0.5885 0.7059 0.5885 0.7671
No log 29.1765 496 0.5781 0.6623 0.5781 0.7603
No log 29.2941 498 0.5782 0.6728 0.5782 0.7604
0.3465 29.4118 500 0.5875 0.6623 0.5875 0.7665
0.3465 29.5294 502 0.5793 0.6954 0.5793 0.7611
0.3465 29.6471 504 0.5752 0.7284 0.5752 0.7584
0.3465 29.7647 506 0.5748 0.6623 0.5748 0.7582
0.3465 29.8824 508 0.5617 0.6623 0.5617 0.7495
0.3465 30.0 510 0.5382 0.7179 0.5382 0.7336
0.3465 30.1176 512 0.5240 0.7403 0.5240 0.7239
0.3465 30.2353 514 0.5237 0.7403 0.5237 0.7237
0.3465 30.3529 516 0.5294 0.7514 0.5294 0.7276
0.3465 30.4706 518 0.5356 0.7625 0.5356 0.7318
0.3465 30.5882 520 0.5352 0.7514 0.5352 0.7316
0.3465 30.7059 522 0.5361 0.7396 0.5361 0.7322
0.3465 30.8235 524 0.5452 0.6833 0.5452 0.7384
0.3465 30.9412 526 0.5470 0.6950 0.5470 0.7396
0.3465 31.0588 528 0.5549 0.7067 0.5549 0.7449
0.3465 31.1765 530 0.5689 0.6485 0.5689 0.7543
0.3465 31.2941 532 0.5566 0.6897 0.5566 0.7460
0.3465 31.4118 534 0.5447 0.6950 0.5447 0.7380
0.3465 31.5294 536 0.5469 0.7183 0.5469 0.7395
0.3465 31.6471 538 0.5613 0.7126 0.5613 0.7492
0.3465 31.7647 540 0.5637 0.7126 0.5637 0.7508
0.3465 31.8824 542 0.5615 0.7396 0.5615 0.7493
0.3465 32.0 544 0.5884 0.6946 0.5884 0.7671
0.3465 32.1176 546 0.6062 0.5948 0.6062 0.7786
0.3465 32.2353 548 0.6102 0.6092 0.6102 0.7812
0.3465 32.3529 550 0.6185 0.6407 0.6185 0.7864
0.3465 32.4706 552 0.6236 0.6407 0.6236 0.7897
0.3465 32.5882 554 0.6282 0.6350 0.6282 0.7926
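
A quick sanity check on the table: the RMSE column is the square root of the MSE column. For the final row, for example:

```python
import math

# The final row reports MSE = 0.6282 and RMSE = 0.7926;
# RMSE is just the square root of MSE.
final_mse = 0.6282
final_rmse = round(math.sqrt(final_mse), 4)
print(final_rmse)  # 0.7926
```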

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02