ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5914
  • Qwk: 0.5944
  • Mse: 0.5914
  • Rmse: 0.7690
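Qwk here is Cohen's kappa with quadratic weights, the usual agreement metric for ordinal essay-scoring labels, and Rmse is simply the square root of the reported Mse (√0.5914 ≈ 0.7690). A minimal pure-Python sketch of the metric (the function name and the toy labels below are illustrative, not from this card):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    # Observed agreement matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1.0
    n = float(len(y_true))
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes))
                 for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2      # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n   # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

# RMSE is the square root of MSE, matching the card's numbers:
assert abs(math.sqrt(0.5914) - 0.7690) < 1e-4
```

In practice the same value comes from scikit-learn's `cohen_kappa_score(y_true, y_pred, weights="quadratic")`.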

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
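With a linear scheduler, the learning rate decays from 2e-05 toward 0 over the full training run. A sketch of that schedule in plain Python (assuming zero warmup steps, since the card does not list any; `transformers` implements this as `get_linear_schedule_with_warmup`):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear LR schedule: ramp up over warmup_steps, then decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)
```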

Training results

("No log" in the first column means the training loss had not yet been logged; the Trainer's logging interval first reports it at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1333 2 3.9280 -0.0239 3.9280 1.9819
No log 0.2667 4 2.2037 0.0902 2.2037 1.4845
No log 0.4 6 1.1941 0.0760 1.1941 1.0928
No log 0.5333 8 1.0088 0.4744 1.0088 1.0044
No log 0.6667 10 0.9524 0.4346 0.9524 0.9759
No log 0.8 12 1.0353 0.2611 1.0353 1.0175
No log 0.9333 14 1.0888 0.1711 1.0888 1.0435
No log 1.0667 16 1.1444 0.1685 1.1444 1.0698
No log 1.2 18 1.1039 0.2293 1.1039 1.0506
No log 1.3333 20 0.9880 0.3856 0.9880 0.9940
No log 1.4667 22 0.8443 0.4022 0.8443 0.9188
No log 1.6 24 0.9134 0.4235 0.9134 0.9557
No log 1.7333 26 0.8235 0.4339 0.8235 0.9075
No log 1.8667 28 0.7720 0.3519 0.7720 0.8786
No log 2.0 30 0.7845 0.4048 0.7845 0.8857
No log 2.1333 32 0.7763 0.5334 0.7763 0.8811
No log 2.2667 34 0.7268 0.5188 0.7268 0.8525
No log 2.4 36 0.7870 0.3734 0.7870 0.8871
No log 2.5333 38 0.8437 0.3814 0.8437 0.9185
No log 2.6667 40 0.7537 0.4955 0.7537 0.8682
No log 2.8 42 0.6467 0.5913 0.6467 0.8042
No log 2.9333 44 0.6443 0.6519 0.6443 0.8027
No log 3.0667 46 0.6746 0.5634 0.6746 0.8213
No log 3.2 48 0.6886 0.5494 0.6886 0.8298
No log 3.3333 50 0.7087 0.6118 0.7087 0.8418
No log 3.4667 52 0.6618 0.6335 0.6618 0.8135
No log 3.6 54 0.6694 0.5843 0.6694 0.8182
No log 3.7333 56 0.6901 0.5569 0.6901 0.8307
No log 3.8667 58 0.7071 0.5695 0.7071 0.8409
No log 4.0 60 0.7162 0.5513 0.7162 0.8463
No log 4.1333 62 0.6969 0.6406 0.6969 0.8348
No log 4.2667 64 0.8402 0.5600 0.8402 0.9166
No log 4.4 66 0.9537 0.4490 0.9537 0.9766
No log 4.5333 68 0.7989 0.5516 0.7989 0.8938
No log 4.6667 70 0.6466 0.5855 0.6466 0.8041
No log 4.8 72 0.5917 0.5995 0.5917 0.7692
No log 4.9333 74 0.6068 0.6578 0.6068 0.7790
No log 5.0667 76 0.6452 0.6606 0.6452 0.8032
No log 5.2 78 0.6309 0.6390 0.6309 0.7943
No log 5.3333 80 0.8142 0.5490 0.8142 0.9023
No log 5.4667 82 0.8069 0.5861 0.8069 0.8983
No log 5.6 84 0.6369 0.6662 0.6369 0.7981
No log 5.7333 86 0.5776 0.6609 0.5776 0.7600
No log 5.8667 88 0.5918 0.5949 0.5918 0.7693
No log 6.0 90 0.6663 0.6385 0.6663 0.8163
No log 6.1333 92 0.6271 0.6545 0.6271 0.7919
No log 6.2667 94 0.5796 0.6820 0.5796 0.7613
No log 6.4 96 0.5818 0.6833 0.5818 0.7628
No log 6.5333 98 0.5870 0.6833 0.5870 0.7662
No log 6.6667 100 0.5726 0.6902 0.5726 0.7567
No log 6.8 102 0.6208 0.6642 0.6208 0.7879
No log 6.9333 104 0.6253 0.6503 0.6253 0.7908
No log 7.0667 106 0.5859 0.6500 0.5859 0.7654
No log 7.2 108 0.6054 0.6334 0.6054 0.7781
No log 7.3333 110 0.5995 0.6642 0.5995 0.7743
No log 7.4667 112 0.7341 0.6097 0.7341 0.8568
No log 7.6 114 0.7804 0.5985 0.7804 0.8834
No log 7.7333 116 0.7067 0.5995 0.7067 0.8407
No log 7.8667 118 0.6362 0.6225 0.6362 0.7976
No log 8.0 120 0.6451 0.5592 0.6451 0.8032
No log 8.1333 122 0.6172 0.6464 0.6172 0.7857
No log 8.2667 124 0.6041 0.6875 0.6041 0.7772
No log 8.4 126 0.6226 0.5940 0.6226 0.7891
No log 8.5333 128 0.7079 0.6401 0.7079 0.8414
No log 8.6667 130 0.6813 0.6497 0.6813 0.8254
No log 8.8 132 0.5662 0.7284 0.5662 0.7525
No log 8.9333 134 0.5723 0.6938 0.5723 0.7565
No log 9.0667 136 0.5920 0.6501 0.5920 0.7694
No log 9.2 138 0.6812 0.5995 0.6812 0.8253
No log 9.3333 140 0.7131 0.5892 0.7131 0.8445
No log 9.4667 142 0.7045 0.5892 0.7045 0.8393
No log 9.6 144 0.6605 0.5973 0.6605 0.8127
No log 9.7333 146 0.6306 0.5844 0.6306 0.7941
No log 9.8667 148 0.5945 0.6266 0.5945 0.7710
No log 10.0 150 0.5972 0.6349 0.5972 0.7728
No log 10.1333 152 0.5979 0.7071 0.5979 0.7733
No log 10.2667 154 0.6074 0.6783 0.6074 0.7794
No log 10.4 156 0.6567 0.6115 0.6567 0.8104
No log 10.5333 158 0.5976 0.6421 0.5976 0.7731
No log 10.6667 160 0.6493 0.6204 0.6493 0.8058
No log 10.8 162 0.6995 0.5909 0.6995 0.8364
No log 10.9333 164 0.6336 0.5554 0.6336 0.7960
No log 11.0667 166 0.5838 0.6311 0.5838 0.7641
No log 11.2 168 0.5671 0.6951 0.5671 0.7530
No log 11.3333 170 0.5566 0.7077 0.5566 0.7460
No log 11.4667 172 0.5660 0.6672 0.5660 0.7524
No log 11.6 174 0.5655 0.6672 0.5655 0.7520
No log 11.7333 176 0.6159 0.6791 0.6159 0.7848
No log 11.8667 178 0.5930 0.6451 0.5930 0.7700
No log 12.0 180 0.5623 0.6491 0.5623 0.7499
No log 12.1333 182 0.5925 0.6555 0.5925 0.7697
No log 12.2667 184 0.5988 0.6405 0.5988 0.7738
No log 12.4 186 0.6068 0.6446 0.6068 0.7790
No log 12.5333 188 0.5916 0.6774 0.5916 0.7692
No log 12.6667 190 0.5964 0.7019 0.5964 0.7722
No log 12.8 192 0.6063 0.6923 0.6063 0.7786
No log 12.9333 194 0.6013 0.6464 0.6013 0.7755
No log 13.0667 196 0.6695 0.6367 0.6695 0.8182
No log 13.2 198 0.6795 0.6225 0.6795 0.8243
No log 13.3333 200 0.6109 0.6427 0.6109 0.7816
No log 13.4667 202 0.5814 0.6439 0.5814 0.7625
No log 13.6 204 0.5764 0.6374 0.5764 0.7592
No log 13.7333 206 0.5743 0.6206 0.5743 0.7578
No log 13.8667 208 0.6084 0.6169 0.6084 0.7800
No log 14.0 210 0.5981 0.6275 0.5981 0.7734
No log 14.1333 212 0.6126 0.6064 0.6126 0.7827
No log 14.2667 214 0.6354 0.5668 0.6354 0.7971
No log 14.4 216 0.5947 0.5855 0.5947 0.7711
No log 14.5333 218 0.5826 0.6745 0.5826 0.7633
No log 14.6667 220 0.5858 0.6745 0.5858 0.7654
No log 14.8 222 0.5999 0.5855 0.5999 0.7746
No log 14.9333 224 0.5852 0.6880 0.5852 0.7650
No log 15.0667 226 0.5916 0.6197 0.5916 0.7691
No log 15.2 228 0.5891 0.7171 0.5891 0.7675
No log 15.3333 230 0.5984 0.6672 0.5984 0.7736
No log 15.4667 232 0.6547 0.6023 0.6547 0.8091
No log 15.6 234 0.6540 0.5921 0.6540 0.8087
No log 15.7333 236 0.6054 0.6110 0.6054 0.7781
No log 15.8667 238 0.5909 0.6256 0.5909 0.7687
No log 16.0 240 0.5875 0.6293 0.5875 0.7665
No log 16.1333 242 0.5861 0.6649 0.5861 0.7656
No log 16.2667 244 0.6253 0.6452 0.6253 0.7908
No log 16.4 246 0.6100 0.6598 0.6100 0.7810
No log 16.5333 248 0.6021 0.6564 0.6021 0.7760
No log 16.6667 250 0.6154 0.5690 0.6154 0.7845
No log 16.8 252 0.6330 0.5819 0.6330 0.7956
No log 16.9333 254 0.6189 0.5840 0.6189 0.7867
No log 17.0667 256 0.6215 0.6289 0.6215 0.7883
No log 17.2 258 0.6598 0.6380 0.6598 0.8123
No log 17.3333 260 0.6967 0.6004 0.6967 0.8347
No log 17.4667 262 0.6733 0.5577 0.6733 0.8206
No log 17.6 264 0.6228 0.5798 0.6228 0.7892
No log 17.7333 266 0.6437 0.5466 0.6437 0.8023
No log 17.8667 268 0.6298 0.5466 0.6298 0.7936
No log 18.0 270 0.6175 0.5688 0.6175 0.7858
No log 18.1333 272 0.5910 0.5809 0.5910 0.7687
No log 18.2667 274 0.5751 0.5987 0.5751 0.7584
No log 18.4 276 0.5639 0.7070 0.5639 0.7509
No log 18.5333 278 0.5773 0.6305 0.5773 0.7598
No log 18.6667 280 0.6510 0.6162 0.6510 0.8068
No log 18.8 282 0.6748 0.5957 0.6748 0.8215
No log 18.9333 284 0.5902 0.6217 0.5902 0.7682
No log 19.0667 286 0.6066 0.6593 0.6066 0.7789
No log 19.2 288 0.6163 0.5747 0.6163 0.7851
No log 19.3333 290 0.6053 0.6424 0.6053 0.7780
No log 19.4667 292 0.6388 0.5690 0.6388 0.7993
No log 19.6 294 0.6145 0.6293 0.6145 0.7839
No log 19.7333 296 0.6027 0.6528 0.6027 0.7764
No log 19.8667 298 0.6215 0.6293 0.6215 0.7884
No log 20.0 300 0.6201 0.6564 0.6201 0.7875
No log 20.1333 302 0.6046 0.6320 0.6046 0.7775
No log 20.2667 304 0.6028 0.6729 0.6028 0.7764
No log 20.4 306 0.5898 0.6602 0.5898 0.7680
No log 20.5333 308 0.6478 0.5885 0.6478 0.8049
No log 20.6667 310 0.6583 0.5830 0.6583 0.8113
No log 20.8 312 0.5819 0.6397 0.5819 0.7628
No log 20.9333 314 0.5854 0.7223 0.5854 0.7651
No log 21.0667 316 0.6569 0.6587 0.6569 0.8105
No log 21.2 318 0.6215 0.6620 0.6215 0.7884
No log 21.3333 320 0.5799 0.6546 0.5799 0.7615
No log 21.4667 322 0.6771 0.5436 0.6771 0.8229
No log 21.6 324 0.7937 0.6119 0.7937 0.8909
No log 21.7333 326 0.7460 0.5769 0.7460 0.8637
No log 21.8667 328 0.6161 0.5860 0.6161 0.7849
No log 22.0 330 0.5735 0.6985 0.5735 0.7573
No log 22.1333 332 0.6020 0.6357 0.6020 0.7759
No log 22.2667 334 0.5879 0.6685 0.5879 0.7667
No log 22.4 336 0.5836 0.5822 0.5836 0.7639
No log 22.5333 338 0.6086 0.5305 0.6086 0.7802
No log 22.6667 340 0.6168 0.5482 0.6168 0.7853
No log 22.8 342 0.6094 0.5798 0.6094 0.7806
No log 22.9333 344 0.5714 0.5626 0.5714 0.7559
No log 23.0667 346 0.5506 0.6537 0.5506 0.7420
No log 23.2 348 0.5453 0.6680 0.5453 0.7385
No log 23.3333 350 0.5716 0.6435 0.5716 0.7560
No log 23.4667 352 0.6023 0.5746 0.6023 0.7761
No log 23.6 354 0.6258 0.6064 0.6258 0.7911
No log 23.7333 356 0.6002 0.6091 0.6002 0.7747
No log 23.8667 358 0.6010 0.5850 0.6010 0.7753
No log 24.0 360 0.5844 0.6179 0.5844 0.7644
No log 24.1333 362 0.5971 0.5923 0.5971 0.7727
No log 24.2667 364 0.5866 0.6100 0.5866 0.7659
No log 24.4 366 0.6028 0.5810 0.6028 0.7764
No log 24.5333 368 0.6861 0.5715 0.6861 0.8283
No log 24.6667 370 0.7403 0.6218 0.7403 0.8604
No log 24.8 372 0.6981 0.5639 0.6981 0.8355
No log 24.9333 374 0.6640 0.5466 0.6640 0.8149
No log 25.0667 376 0.6480 0.5697 0.6480 0.8050
No log 25.2 378 0.6199 0.5740 0.6199 0.7874
No log 25.3333 380 0.6362 0.5740 0.6362 0.7976
No log 25.4667 382 0.6524 0.5719 0.6524 0.8077
No log 25.6 384 0.6415 0.5545 0.6415 0.8009
No log 25.7333 386 0.6213 0.5863 0.6213 0.7882
No log 25.8667 388 0.6080 0.5975 0.6080 0.7798
No log 26.0 390 0.6165 0.6169 0.6165 0.7852
No log 26.1333 392 0.6138 0.6293 0.6138 0.7834
No log 26.2667 394 0.5885 0.6397 0.5885 0.7671
No log 26.4 396 0.5884 0.6397 0.5884 0.7671
No log 26.5333 398 0.5826 0.6536 0.5826 0.7633
No log 26.6667 400 0.5884 0.6452 0.5884 0.7671
No log 26.8 402 0.6294 0.6516 0.6294 0.7933
No log 26.9333 404 0.6034 0.6393 0.6034 0.7768
No log 27.0667 406 0.5891 0.6288 0.5891 0.7675
No log 27.2 408 0.6365 0.6032 0.6365 0.7978
No log 27.3333 410 0.6216 0.6254 0.6216 0.7884
No log 27.4667 412 0.5993 0.6421 0.5993 0.7741
No log 27.6 414 0.5886 0.6619 0.5886 0.7672
No log 27.7333 416 0.5938 0.6509 0.5938 0.7706
No log 27.8667 418 0.6008 0.6320 0.6008 0.7751
No log 28.0 420 0.5974 0.6320 0.5974 0.7729
No log 28.1333 422 0.5936 0.6186 0.5936 0.7705
No log 28.2667 424 0.5946 0.6197 0.5946 0.7711
No log 28.4 426 0.6145 0.6335 0.6145 0.7839
No log 28.5333 428 0.6674 0.6280 0.6674 0.8170
No log 28.6667 430 0.7009 0.6071 0.7009 0.8372
No log 28.8 432 0.6733 0.6172 0.6733 0.8206
No log 28.9333 434 0.6288 0.5909 0.6288 0.7930
No log 29.0667 436 0.5946 0.5905 0.5946 0.7711
No log 29.2 438 0.5806 0.6035 0.5806 0.7619
No log 29.3333 440 0.5858 0.6113 0.5858 0.7654
No log 29.4667 442 0.6233 0.5898 0.6233 0.7895
No log 29.6 444 0.6457 0.5864 0.6457 0.8036
No log 29.7333 446 0.5966 0.6038 0.5966 0.7724
No log 29.8667 448 0.5693 0.6729 0.5693 0.7545
No log 30.0 450 0.5912 0.6402 0.5912 0.7689
No log 30.1333 452 0.5974 0.6288 0.5974 0.7729
No log 30.2667 454 0.5800 0.7049 0.5800 0.7616
No log 30.4 456 0.5809 0.6157 0.5809 0.7622
No log 30.5333 458 0.6153 0.6110 0.6153 0.7844
No log 30.6667 460 0.6906 0.5726 0.6906 0.8310
No log 30.8 462 0.6754 0.5726 0.6754 0.8218
No log 30.9333 464 0.6082 0.6110 0.6082 0.7798
No log 31.0667 466 0.5795 0.6903 0.5795 0.7613
No log 31.2 468 0.6056 0.6368 0.6056 0.7782
No log 31.3333 470 0.5989 0.6368 0.5989 0.7739
No log 31.4667 472 0.5804 0.7010 0.5804 0.7618
No log 31.6 474 0.5964 0.6226 0.5964 0.7723
No log 31.7333 476 0.6576 0.5671 0.6576 0.8109
No log 31.8667 478 0.6838 0.5548 0.6838 0.8269
No log 32.0 480 0.6408 0.5882 0.6408 0.8005
No log 32.1333 482 0.6128 0.6243 0.6128 0.7828
No log 32.2667 484 0.5912 0.6650 0.5912 0.7689
No log 32.4 486 0.5886 0.7064 0.5886 0.7672
No log 32.5333 488 0.5885 0.6650 0.5885 0.7671
No log 32.6667 490 0.5948 0.6256 0.5948 0.7713
No log 32.8 492 0.6033 0.5866 0.6033 0.7767
No log 32.9333 494 0.6122 0.5752 0.6122 0.7825
No log 33.0667 496 0.6070 0.5853 0.6070 0.7791
No log 33.2 498 0.5992 0.5964 0.5992 0.7741
0.221 33.3333 500 0.5942 0.6157 0.5942 0.7708
0.221 33.4667 502 0.6022 0.5964 0.6022 0.7760
0.221 33.6 504 0.6373 0.5798 0.6373 0.7983
0.221 33.7333 506 0.6411 0.5688 0.6411 0.8007
0.221 33.8667 508 0.6085 0.5977 0.6085 0.7801
0.221 34.0 510 0.5821 0.6301 0.5821 0.7630
0.221 34.1333 512 0.5802 0.6197 0.5802 0.7617
0.221 34.2667 514 0.5956 0.6263 0.5956 0.7718
0.221 34.4 516 0.6012 0.5919 0.6012 0.7754
0.221 34.5333 518 0.6349 0.5560 0.6349 0.7968
0.221 34.6667 520 0.6484 0.5658 0.6484 0.8052
0.221 34.8 522 0.6310 0.5658 0.6310 0.7943
0.221 34.9333 524 0.5914 0.5944 0.5914 0.7690
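A sanity check on the log: the step column advances by 2 per 0.1333 epochs (step 150 falls exactly at epoch 10.0), i.e. 15 optimizer steps per epoch, which with train_batch_size 8 and no gradient accumulation (an assumption, since the card lists no accumulation setting) implies roughly 120 training examples:

```python
steps_per_epoch = 150 / 10.0              # step 150 is logged at epoch 10.0
train_batch_size = 8                      # from the hyperparameters above
approx_examples = int(steps_per_epoch * train_batch_size)
```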

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (Safetensors, F32 tensors)

Model tree: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization, finetuned from aubmindlab/bert-base-arabertv02.