ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5623
  • QWK (Quadratic Weighted Kappa): 0.6455
  • MSE: 0.5623
  • RMSE: 0.7499
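QWK here is the Quadratic Weighted Kappa, a standard agreement metric for ordinal scores such as essay-organization ratings. A minimal pure-Python sketch of how these three metrics can be computed is below; the rating range used in `min_rating`/`max_rating` is an assumption for illustration, since the card does not state the label scale:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, min_rating=0, max_rating=4):
    """Quadratic Weighted Kappa between two lists of integer ratings.

    The rating bounds are assumptions; adjust them to the actual label scale.
    """
    n = max_rating - min_rating + 1
    # Observed co-occurrence matrix of (true, predicted) ratings
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1
    num_items = len(y_true)
    # Marginal histograms of true and predicted ratings
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    numerator = 0.0
    denominator = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2            # quadratic disagreement weight
            expected = hist_true[i] * hist_pred[j] / num_items  # chance-level count
            numerator += w * observed[i][j]
            denominator += w * expected
    return 1.0 - numerator / denominator

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root over paired score lists."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Note that RMSE is simply the square root of MSE, which is why the Loss and MSE rows in this card coincide (the model is trained with an MSE regression objective on the scores).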

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
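The hyperparameters above map onto a `transformers` `TrainingArguments` configuration roughly as follows. This is a sketch for reproducibility only: the `output_dir` path is a placeholder, and any warmup or weight-decay settings not listed in the card are left at library defaults.

```python
from transformers import TrainingArguments

# Sketch of the training configuration listed above.
# "output_dir" is a placeholder, not taken from the card.
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```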

Training results

Training loss appears as "No log" until step 500, the first logging interval.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0211 2 3.9455 -0.0257 3.9455 1.9863
No log 0.0421 4 2.1378 0.0628 2.1378 1.4621
No log 0.0632 6 1.4185 0.0 1.4185 1.1910
No log 0.0842 8 1.1640 0.1738 1.1640 1.0789
No log 0.1053 10 1.1887 0.0520 1.1887 1.0903
No log 0.1263 12 1.1105 0.1398 1.1105 1.0538
No log 0.1474 14 0.9870 0.2541 0.9870 0.9935
No log 0.1684 16 0.9019 0.3432 0.9019 0.9497
No log 0.1895 18 0.8888 0.3688 0.8888 0.9428
No log 0.2105 20 0.9008 0.3332 0.9008 0.9491
No log 0.2316 22 0.8553 0.4547 0.8553 0.9248
No log 0.2526 24 0.7936 0.3604 0.7936 0.8909
No log 0.2737 26 0.7916 0.3529 0.7916 0.8897
No log 0.2947 28 0.7595 0.5419 0.7595 0.8715
No log 0.3158 30 0.7590 0.5399 0.7590 0.8712
No log 0.3368 32 0.7530 0.5060 0.7530 0.8678
No log 0.3579 34 0.7966 0.5467 0.7966 0.8925
No log 0.3789 36 1.0234 0.4705 1.0234 1.0116
No log 0.4 38 1.0615 0.4917 1.0615 1.0303
No log 0.4211 40 0.7440 0.6203 0.7440 0.8626
No log 0.4421 42 0.6880 0.6714 0.6880 0.8295
No log 0.4632 44 0.6583 0.6473 0.6583 0.8113
No log 0.4842 46 0.8834 0.5996 0.8834 0.9399
No log 0.5053 48 1.1907 0.4699 1.1907 1.0912
No log 0.5263 50 0.9862 0.5455 0.9862 0.9931
No log 0.5474 52 0.6400 0.6015 0.6400 0.8000
No log 0.5684 54 0.6104 0.5977 0.6104 0.7813
No log 0.5895 56 0.6289 0.6410 0.6289 0.7930
No log 0.6105 58 0.6570 0.6586 0.6570 0.8105
No log 0.6316 60 0.6538 0.6225 0.6538 0.8086
No log 0.6526 62 0.6393 0.6005 0.6393 0.7996
No log 0.6737 64 0.6340 0.5902 0.6340 0.7962
No log 0.6947 66 0.6228 0.5828 0.6228 0.7892
No log 0.7158 68 0.6892 0.6559 0.6892 0.8302
No log 0.7368 70 0.7561 0.6838 0.7561 0.8695
No log 0.7579 72 0.6538 0.5884 0.6538 0.8086
No log 0.7789 74 0.6807 0.5477 0.6807 0.8251
No log 0.8 76 0.8085 0.5643 0.8085 0.8992
No log 0.8211 78 0.6686 0.5944 0.6686 0.8177
No log 0.8421 80 0.5923 0.6491 0.5923 0.7696
No log 0.8632 82 0.6195 0.6176 0.6195 0.7871
No log 0.8842 84 0.5842 0.6983 0.5842 0.7643
No log 0.9053 86 0.5840 0.7027 0.5840 0.7642
No log 0.9263 88 0.5883 0.6927 0.5883 0.7670
No log 0.9474 90 0.5764 0.6775 0.5764 0.7592
No log 0.9684 92 0.5649 0.6751 0.5649 0.7516
No log 0.9895 94 0.5649 0.6377 0.5649 0.7516
No log 1.0105 96 0.6798 0.6887 0.6798 0.8245
No log 1.0316 98 0.6996 0.6697 0.6996 0.8364
No log 1.0526 100 0.6053 0.7348 0.6053 0.7780
No log 1.0737 102 0.6351 0.6932 0.6351 0.7969
No log 1.0947 104 0.6131 0.6314 0.6131 0.7830
No log 1.1158 106 0.6060 0.6370 0.6060 0.7785
No log 1.1368 108 0.6164 0.6593 0.6164 0.7851
No log 1.1579 110 0.6589 0.6550 0.6589 0.8117
No log 1.1789 112 0.6931 0.6774 0.6931 0.8325
No log 1.2 114 0.6570 0.6205 0.6570 0.8105
No log 1.2211 116 0.6556 0.6562 0.6556 0.8097
No log 1.2421 118 0.6513 0.6164 0.6513 0.8070
No log 1.2632 120 0.7564 0.5061 0.7564 0.8697
No log 1.2842 122 0.7879 0.5175 0.7879 0.8877
No log 1.3053 124 0.5890 0.6990 0.5890 0.7675
No log 1.3263 126 0.7976 0.6094 0.7976 0.8931
No log 1.3474 128 0.8679 0.5984 0.8679 0.9316
No log 1.3684 130 0.6639 0.6713 0.6639 0.8148
No log 1.3895 132 0.5800 0.6552 0.5800 0.7616
No log 1.4105 134 0.5750 0.6310 0.5750 0.7583
No log 1.4316 136 0.6188 0.6459 0.6188 0.7866
No log 1.4526 138 0.6616 0.6396 0.6616 0.8134
No log 1.4737 140 0.6312 0.6692 0.6312 0.7945
No log 1.4947 142 0.6053 0.6751 0.6053 0.7780
No log 1.5158 144 0.6325 0.6930 0.6325 0.7953
No log 1.5368 146 0.5984 0.6594 0.5984 0.7736
No log 1.5579 148 0.6138 0.6293 0.6138 0.7835
No log 1.5789 150 0.6684 0.6620 0.6684 0.8175
No log 1.6 152 0.6163 0.6570 0.6163 0.7851
No log 1.6211 154 0.5949 0.6846 0.5949 0.7713
No log 1.6421 156 0.5803 0.6932 0.5803 0.7618
No log 1.6632 158 0.5973 0.6675 0.5973 0.7728
No log 1.6842 160 0.6769 0.6132 0.6769 0.8227
No log 1.7053 162 0.6419 0.6626 0.6419 0.8012
No log 1.7263 164 0.5783 0.7193 0.5783 0.7605
No log 1.7474 166 0.6700 0.6922 0.6700 0.8185
No log 1.7684 168 0.6775 0.6487 0.6775 0.8231
No log 1.7895 170 0.5522 0.6680 0.5522 0.7431
No log 1.8105 172 0.5820 0.6278 0.5820 0.7629
No log 1.8316 174 0.6181 0.6461 0.6181 0.7862
No log 1.8526 176 0.5881 0.6644 0.5881 0.7669
No log 1.8737 178 0.6206 0.6589 0.6206 0.7878
No log 1.8947 180 0.6887 0.6520 0.6887 0.8299
No log 1.9158 182 0.6456 0.7226 0.6456 0.8035
No log 1.9368 184 0.6433 0.6220 0.6433 0.8020
No log 1.9579 186 0.6541 0.7174 0.6541 0.8088
No log 1.9789 188 0.6748 0.6628 0.6748 0.8215
No log 2.0 190 0.6454 0.6868 0.6454 0.8033
No log 2.0211 192 0.6695 0.6231 0.6695 0.8183
No log 2.0421 194 0.6996 0.6461 0.6996 0.8364
No log 2.0632 196 0.7140 0.6348 0.7140 0.8450
No log 2.0842 198 0.8265 0.5304 0.8265 0.9091
No log 2.1053 200 0.8465 0.4829 0.8465 0.9201
No log 2.1263 202 0.7478 0.5708 0.7478 0.8647
No log 2.1474 204 0.6311 0.7012 0.6311 0.7944
No log 2.1684 206 0.6371 0.6869 0.6371 0.7982
No log 2.1895 208 0.6172 0.7012 0.6172 0.7856
No log 2.2105 210 0.6939 0.6289 0.6939 0.8330
No log 2.2316 212 0.7061 0.6615 0.7061 0.8403
No log 2.2526 214 0.6537 0.6728 0.6537 0.8085
No log 2.2737 216 0.6448 0.6430 0.6448 0.8030
No log 2.2947 218 0.6879 0.6476 0.6879 0.8294
No log 2.3158 220 0.6718 0.6305 0.6718 0.8196
No log 2.3368 222 0.6308 0.6123 0.6308 0.7942
No log 2.3579 224 0.6382 0.6770 0.6382 0.7989
No log 2.3789 226 0.6449 0.7064 0.6449 0.8031
No log 2.4 228 0.6717 0.6758 0.6717 0.8196
No log 2.4211 230 0.6875 0.7077 0.6875 0.8292
No log 2.4421 232 0.6536 0.6857 0.6536 0.8085
No log 2.4632 234 0.6187 0.6902 0.6187 0.7866
No log 2.4842 236 0.6066 0.6983 0.6066 0.7789
No log 2.5053 238 0.5857 0.7126 0.5857 0.7653
No log 2.5263 240 0.6272 0.7009 0.6272 0.7919
No log 2.5474 242 0.6439 0.6888 0.6439 0.8024
No log 2.5684 244 0.5953 0.7687 0.5953 0.7715
No log 2.5895 246 0.5694 0.7284 0.5694 0.7546
No log 2.6105 248 0.5786 0.7175 0.5786 0.7606
No log 2.6316 250 0.5666 0.6770 0.5666 0.7527
No log 2.6526 252 0.6055 0.6187 0.6055 0.7781
No log 2.6737 254 0.6692 0.5833 0.6692 0.8180
No log 2.6947 256 0.6912 0.5677 0.6912 0.8314
No log 2.7158 258 0.6354 0.6337 0.6354 0.7971
No log 2.7368 260 0.6018 0.7216 0.6018 0.7757
No log 2.7579 262 0.7222 0.5739 0.7222 0.8498
No log 2.7789 264 0.7305 0.5739 0.7305 0.8547
No log 2.8 266 0.6039 0.7223 0.6039 0.7771
No log 2.8211 268 0.6219 0.6821 0.6219 0.7886
No log 2.8421 270 0.8043 0.5194 0.8043 0.8968
No log 2.8632 272 0.8270 0.5185 0.8270 0.9094
No log 2.8842 274 0.6777 0.6189 0.6777 0.8232
No log 2.9053 276 0.5854 0.6537 0.5854 0.7651
No log 2.9263 278 0.5758 0.6575 0.5758 0.7588
No log 2.9474 280 0.5765 0.6575 0.5765 0.7592
No log 2.9684 282 0.5887 0.6500 0.5887 0.7673
No log 2.9895 284 0.7231 0.6269 0.7231 0.8504
No log 3.0105 286 0.8698 0.6312 0.8698 0.9326
No log 3.0316 288 0.8543 0.6009 0.8543 0.9243
No log 3.0526 290 0.7911 0.6406 0.7911 0.8895
No log 3.0737 292 0.7258 0.6869 0.7258 0.8519
No log 3.0947 294 0.6683 0.6358 0.6683 0.8175
No log 3.1158 296 0.6232 0.6697 0.6232 0.7894
No log 3.1368 298 0.6581 0.6122 0.6581 0.8112
No log 3.1579 300 0.6970 0.5846 0.6970 0.8349
No log 3.1789 302 0.7206 0.5777 0.7206 0.8489
No log 3.2 304 0.6766 0.5983 0.6766 0.8226
No log 3.2211 306 0.6035 0.6219 0.6035 0.7768
No log 3.2421 308 0.5986 0.6555 0.5986 0.7737
No log 3.2632 310 0.5862 0.6470 0.5862 0.7657
No log 3.2842 312 0.6050 0.6510 0.6050 0.7778
No log 3.3053 314 0.7172 0.5222 0.7172 0.8469
No log 3.3263 316 0.7092 0.5636 0.7092 0.8421
No log 3.3474 318 0.6190 0.6188 0.6190 0.7868
No log 3.3684 320 0.5842 0.6699 0.5842 0.7643
No log 3.3895 322 0.5932 0.6854 0.5932 0.7702
No log 3.4105 324 0.5838 0.6978 0.5838 0.7641
No log 3.4316 326 0.6177 0.6528 0.6177 0.7859
No log 3.4526 328 0.6374 0.6821 0.6374 0.7984
No log 3.4737 330 0.6392 0.6914 0.6392 0.7995
No log 3.4947 332 0.6377 0.6914 0.6377 0.7986
No log 3.5158 334 0.6317 0.6228 0.6317 0.7948
No log 3.5368 336 0.6021 0.5905 0.6021 0.7760
No log 3.5579 338 0.5945 0.6518 0.5945 0.7710
No log 3.5789 340 0.5976 0.6219 0.5976 0.7730
No log 3.6 342 0.6155 0.5993 0.6155 0.7846
No log 3.6211 344 0.5973 0.7124 0.5973 0.7728
No log 3.6421 346 0.6166 0.6875 0.6166 0.7852
No log 3.6632 348 0.7269 0.6090 0.7269 0.8526
No log 3.6842 350 0.7832 0.5961 0.7832 0.8850
No log 3.7053 352 0.7071 0.6035 0.7071 0.8409
No log 3.7263 354 0.6252 0.6916 0.6252 0.7907
No log 3.7474 356 0.6650 0.6626 0.6650 0.8155
No log 3.7684 358 0.7168 0.6201 0.7168 0.8466
No log 3.7895 360 0.6952 0.6549 0.6952 0.8338
No log 3.8105 362 0.6241 0.6938 0.6241 0.7900
No log 3.8316 364 0.6292 0.6528 0.6292 0.7932
No log 3.8526 366 0.6738 0.5819 0.6738 0.8208
No log 3.8737 368 0.6383 0.6337 0.6383 0.7989
No log 3.8947 370 0.5866 0.6370 0.5866 0.7659
No log 3.9158 372 0.5925 0.6307 0.5925 0.7697
No log 3.9368 374 0.5968 0.6846 0.5968 0.7725
No log 3.9579 376 0.5948 0.6781 0.5948 0.7712
No log 3.9789 378 0.6029 0.7077 0.6029 0.7765
No log 4.0 380 0.6153 0.6615 0.6153 0.7844
No log 4.0211 382 0.5990 0.7218 0.5990 0.7740
No log 4.0421 384 0.5920 0.7278 0.5920 0.7694
No log 4.0632 386 0.5866 0.7423 0.5866 0.7659
No log 4.0842 388 0.5948 0.6838 0.5948 0.7712
No log 4.1053 390 0.6231 0.6555 0.6231 0.7893
No log 4.1263 392 0.6310 0.6157 0.6310 0.7944
No log 4.1474 394 0.6066 0.6335 0.6066 0.7788
No log 4.1684 396 0.5663 0.6720 0.5663 0.7526
No log 4.1895 398 0.5868 0.7286 0.5868 0.7660
No log 4.2105 400 0.6701 0.6109 0.6701 0.8186
No log 4.2316 402 0.6446 0.6765 0.6446 0.8029
No log 4.2526 404 0.5849 0.6972 0.5849 0.7648
No log 4.2737 406 0.6144 0.7506 0.6144 0.7838
No log 4.2947 408 0.6270 0.7177 0.6270 0.7918
No log 4.3158 410 0.5770 0.7219 0.5770 0.7596
No log 4.3368 412 0.5659 0.7103 0.5659 0.7523
No log 4.3579 414 0.6125 0.6758 0.6125 0.7826
No log 4.3789 416 0.6086 0.6826 0.6086 0.7801
No log 4.4 418 0.5717 0.7085 0.5717 0.7561
No log 4.4211 420 0.5855 0.6853 0.5855 0.7652
No log 4.4421 422 0.6547 0.6142 0.6547 0.8091
No log 4.4632 424 0.6593 0.6047 0.6593 0.8120
No log 4.4842 426 0.6367 0.6179 0.6367 0.7979
No log 4.5053 428 0.5852 0.6455 0.5852 0.7650
No log 4.5263 430 0.5500 0.6830 0.5500 0.7416
No log 4.5474 432 0.5428 0.7216 0.5428 0.7367
No log 4.5684 434 0.5378 0.7077 0.5378 0.7333
No log 4.5895 436 0.5377 0.7122 0.5377 0.7333
No log 4.6105 438 0.5444 0.7122 0.5444 0.7378
No log 4.6316 440 0.5514 0.7056 0.5514 0.7426
No log 4.6526 442 0.5634 0.6307 0.5634 0.7506
No log 4.6737 444 0.5816 0.6296 0.5816 0.7626
No log 4.6947 446 0.5885 0.6325 0.5885 0.7671
No log 4.7158 448 0.6373 0.6189 0.6373 0.7983
No log 4.7368 450 0.6513 0.6471 0.6513 0.8070
No log 4.7579 452 0.6037 0.6845 0.6037 0.7770
No log 4.7789 454 0.5885 0.7411 0.5885 0.7672
No log 4.8 456 0.5891 0.7225 0.5891 0.7675
No log 4.8211 458 0.6162 0.6998 0.6162 0.7850
No log 4.8421 460 0.6469 0.6228 0.6469 0.8043
No log 4.8632 462 0.6082 0.6415 0.6082 0.7799
No log 4.8842 464 0.6019 0.6498 0.6019 0.7758
No log 4.9053 466 0.6214 0.5879 0.6214 0.7883
No log 4.9263 468 0.6013 0.6437 0.6013 0.7755
No log 4.9474 470 0.5787 0.6610 0.5787 0.7607
No log 4.9684 472 0.6352 0.6772 0.6352 0.7970
No log 4.9895 474 0.6676 0.6254 0.6676 0.8170
No log 5.0105 476 0.6489 0.6473 0.6489 0.8055
No log 5.0316 478 0.6344 0.6286 0.6344 0.7965
No log 5.0526 480 0.6292 0.6209 0.6292 0.7932
No log 5.0737 482 0.6317 0.5197 0.6317 0.7948
No log 5.0947 484 0.6094 0.5331 0.6094 0.7806
No log 5.1158 486 0.5813 0.6332 0.5813 0.7624
No log 5.1368 488 0.5700 0.6104 0.5700 0.7550
No log 5.1579 490 0.5757 0.6509 0.5757 0.7588
No log 5.1789 492 0.5747 0.6722 0.5747 0.7581
No log 5.2 494 0.5623 0.6543 0.5623 0.7498
No log 5.2211 496 0.6045 0.7217 0.6045 0.7775
No log 5.2421 498 0.6045 0.6880 0.6045 0.7775
0.274 5.2632 500 0.5470 0.7071 0.5470 0.7396
0.274 5.2842 502 0.5447 0.6959 0.5447 0.7380
0.274 5.3053 504 0.5578 0.7178 0.5578 0.7469
0.274 5.3263 506 0.6041 0.6221 0.6041 0.7773
0.274 5.3474 508 0.6177 0.6319 0.6177 0.7859
0.274 5.3684 510 0.6023 0.7032 0.6023 0.7761
0.274 5.3895 512 0.5465 0.6838 0.5465 0.7393
0.274 5.4105 514 0.5623 0.6455 0.5623 0.7499

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization, fine-tuned from aubmindlab/bert-base-arabertv02.