ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.7129
  • Qwk: 0.4473
  • Mse: 0.7129
  • Rmse: 0.8443
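Note that the reported Loss equals the Mse throughout, which suggests the model was trained as a regression head with an MSE objective, with Qwk (quadratic weighted kappa) computed on rounded integer scores. As a point of reference, here is a self-contained, dependency-free sketch of how these metrics are commonly computed for integer essay scores (the function names are illustrative, not taken from this repository's code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, the usual Qwk metric
    for ordinal essay-scoring tasks."""
    n = len(y_true)
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_t = [sum(1 for t in y_true if t == i) for i in range(n_classes)]
    hist_p = [sum(1 for p in y_pred if p == i) for i in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic penalty
            expected = hist_t[i] * hist_p[j] / n       # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root, matching the Mse/Rmse columns."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Perfect agreement gives a Qwk of 1.0; predictions no better than chance give 0.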

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0208 2 4.0848 0.0130 4.0848 2.0211
No log 0.0417 4 2.2871 0.0450 2.2871 1.5123
No log 0.0625 6 1.5493 0.0185 1.5493 1.2447
No log 0.0833 8 1.2525 0.1140 1.2525 1.1192
No log 0.1042 10 1.1253 0.1725 1.1253 1.0608
No log 0.125 12 1.1769 0.0854 1.1769 1.0848
No log 0.1458 14 1.1494 0.1041 1.1494 1.0721
No log 0.1667 16 1.0727 0.0919 1.0727 1.0357
No log 0.1875 18 0.9977 0.2035 0.9977 0.9988
No log 0.2083 20 1.0318 0.2834 1.0318 1.0158
No log 0.2292 22 1.2011 0.1024 1.2011 1.0959
No log 0.25 24 1.2355 0.0380 1.2355 1.1115
No log 0.2708 26 1.1577 0.1764 1.1577 1.0760
No log 0.2917 28 1.1097 0.2221 1.1097 1.0534
No log 0.3125 30 1.0525 0.1810 1.0525 1.0259
No log 0.3333 32 1.0634 0.2375 1.0634 1.0312
No log 0.3542 34 1.0581 0.1837 1.0581 1.0286
No log 0.375 36 1.1020 0.1603 1.1020 1.0498
No log 0.3958 38 1.0664 0.2023 1.0664 1.0327
No log 0.4167 40 0.9668 0.3817 0.9668 0.9832
No log 0.4375 42 1.0053 0.3021 1.0053 1.0026
No log 0.4583 44 0.9481 0.2671 0.9481 0.9737
No log 0.4792 46 0.9315 0.3236 0.9315 0.9652
No log 0.5 48 0.9663 0.3192 0.9663 0.9830
No log 0.5208 50 0.9506 0.3485 0.9506 0.9750
No log 0.5417 52 0.9201 0.3876 0.9201 0.9592
No log 0.5625 54 0.9651 0.3243 0.9651 0.9824
No log 0.5833 56 0.9644 0.3243 0.9644 0.9820
No log 0.6042 58 0.8917 0.3857 0.8917 0.9443
No log 0.625 60 0.9499 0.3317 0.9499 0.9746
No log 0.6458 62 0.9703 0.2288 0.9703 0.9850
No log 0.6667 64 0.9372 0.3067 0.9372 0.9681
No log 0.6875 66 0.8661 0.3960 0.8661 0.9307
No log 0.7083 68 0.8539 0.3979 0.8539 0.9241
No log 0.7292 70 0.8637 0.3821 0.8637 0.9294
No log 0.75 72 0.9083 0.5167 0.9083 0.9530
No log 0.7708 74 0.8883 0.5070 0.8883 0.9425
No log 0.7917 76 1.0753 0.3584 1.0753 1.0369
No log 0.8125 78 1.0615 0.3658 1.0615 1.0303
No log 0.8333 80 0.9549 0.3169 0.9549 0.9772
No log 0.8542 82 0.9152 0.3604 0.9152 0.9567
No log 0.875 84 0.8208 0.4475 0.8208 0.9060
No log 0.8958 86 0.7805 0.3713 0.7805 0.8834
No log 0.9167 88 0.8474 0.4180 0.8474 0.9206
No log 0.9375 90 0.9358 0.3763 0.9358 0.9673
No log 0.9583 92 1.2043 0.2026 1.2043 1.0974
No log 0.9792 94 1.4977 0.1892 1.4977 1.2238
No log 1.0 96 1.5492 0.1729 1.5492 1.2447
No log 1.0208 98 1.4098 0.1601 1.4098 1.1873
No log 1.0417 100 1.2217 0.1700 1.2217 1.1053
No log 1.0625 102 1.0076 0.2175 1.0076 1.0038
No log 1.0833 104 0.8770 0.3817 0.8770 0.9365
No log 1.1042 106 0.8653 0.4321 0.8653 0.9302
No log 1.125 108 0.8383 0.5195 0.8383 0.9156
No log 1.1458 110 0.8365 0.4892 0.8365 0.9146
No log 1.1667 112 0.8369 0.4192 0.8369 0.9148
No log 1.1875 114 0.8796 0.4162 0.8796 0.9379
No log 1.2083 116 0.8990 0.3902 0.8990 0.9482
No log 1.2292 118 0.9073 0.3902 0.9073 0.9525
No log 1.25 120 0.9214 0.3623 0.9214 0.9599
No log 1.2708 122 0.8346 0.4697 0.8346 0.9135
No log 1.2917 124 0.8520 0.4421 0.8520 0.9230
No log 1.3125 126 0.9452 0.2535 0.9452 0.9722
No log 1.3333 128 1.0431 0.3000 1.0431 1.0213
No log 1.3542 130 0.9808 0.2535 0.9808 0.9903
No log 1.375 132 0.8786 0.4283 0.8786 0.9373
No log 1.3958 134 0.9534 0.3081 0.9534 0.9764
No log 1.4167 136 1.0919 0.2284 1.0919 1.0449
No log 1.4375 138 1.2583 0.2260 1.2583 1.1217
No log 1.4583 140 1.2760 0.1952 1.2760 1.1296
No log 1.4792 142 1.1759 0.1886 1.1759 1.0844
No log 1.5 144 0.9661 0.2108 0.9661 0.9829
No log 1.5208 146 0.8952 0.3733 0.8952 0.9462
No log 1.5417 148 0.8482 0.5044 0.8482 0.9210
No log 1.5625 150 0.8299 0.4110 0.8299 0.9110
No log 1.5833 152 0.8494 0.4290 0.8494 0.9217
No log 1.6042 154 0.8496 0.4281 0.8496 0.9218
No log 1.625 156 0.8020 0.4963 0.8020 0.8955
No log 1.6458 158 0.7638 0.5785 0.7638 0.8740
No log 1.6667 160 0.7521 0.5543 0.7521 0.8672
No log 1.6875 162 0.7423 0.5543 0.7423 0.8616
No log 1.7083 164 0.6660 0.5798 0.6660 0.8161
No log 1.7292 166 0.6290 0.6028 0.6290 0.7931
No log 1.75 168 0.6157 0.6338 0.6157 0.7846
No log 1.7708 170 0.7485 0.5164 0.7485 0.8651
No log 1.7917 172 0.9608 0.3571 0.9608 0.9802
No log 1.8125 174 0.9539 0.3394 0.9539 0.9767
No log 1.8333 176 0.7709 0.4433 0.7709 0.8780
No log 1.8542 178 0.6165 0.5550 0.6165 0.7852
No log 1.875 180 0.6234 0.5550 0.6234 0.7896
No log 1.8958 182 0.6207 0.5303 0.6207 0.7879
No log 1.9167 184 0.6180 0.5303 0.6180 0.7861
No log 1.9375 186 0.6408 0.5936 0.6408 0.8005
No log 1.9583 188 0.6895 0.5925 0.6895 0.8304
No log 1.9792 190 0.6936 0.5898 0.6936 0.8328
No log 2.0 192 0.7035 0.6009 0.7035 0.8388
No log 2.0208 194 0.6982 0.6316 0.6982 0.8356
No log 2.0417 196 0.6425 0.5408 0.6425 0.8016
No log 2.0625 198 0.6970 0.4958 0.6970 0.8348
No log 2.0833 200 0.6808 0.5073 0.6808 0.8251
No log 2.1042 202 0.6589 0.5042 0.6589 0.8118
No log 2.125 204 0.6722 0.5026 0.6722 0.8199
No log 2.1458 206 0.6565 0.5138 0.6565 0.8103
No log 2.1667 208 0.6461 0.5680 0.6461 0.8038
No log 2.1875 210 0.6981 0.5854 0.6981 0.8355
No log 2.2083 212 0.7378 0.5893 0.7378 0.8589
No log 2.2292 214 0.7697 0.5857 0.7697 0.8773
No log 2.25 216 0.7048 0.6062 0.7048 0.8395
No log 2.2708 218 0.6161 0.6128 0.6161 0.7849
No log 2.2917 220 0.6281 0.5969 0.6281 0.7925
No log 2.3125 222 0.6288 0.5880 0.6288 0.7930
No log 2.3333 224 0.7002 0.5348 0.7002 0.8368
No log 2.3542 226 0.7378 0.5003 0.7378 0.8589
No log 2.375 228 0.6696 0.5192 0.6696 0.8183
No log 2.3958 230 0.6758 0.5123 0.6758 0.8221
No log 2.4167 232 0.7198 0.5576 0.7198 0.8484
No log 2.4375 234 0.6326 0.6322 0.6326 0.7954
No log 2.4583 236 0.5696 0.6419 0.5696 0.7547
No log 2.4792 238 0.6130 0.5996 0.6130 0.7829
No log 2.5 240 0.7865 0.4244 0.7865 0.8868
No log 2.5208 242 0.8245 0.3979 0.8245 0.9080
No log 2.5417 244 0.7583 0.5313 0.7583 0.8708
No log 2.5625 246 0.8144 0.5713 0.8144 0.9024
No log 2.5833 248 0.8734 0.5780 0.8734 0.9346
No log 2.6042 250 0.7716 0.6274 0.7716 0.8784
No log 2.625 252 0.6150 0.6194 0.6150 0.7842
No log 2.6458 254 0.6131 0.6371 0.6131 0.7830
No log 2.6667 256 0.7643 0.5305 0.7643 0.8743
No log 2.6875 258 0.8992 0.4301 0.8992 0.9483
No log 2.7083 260 0.8709 0.4186 0.8709 0.9332
No log 2.7292 262 0.7868 0.4485 0.7868 0.8870
No log 2.75 264 0.6983 0.5180 0.6983 0.8356
No log 2.7708 266 0.6130 0.6044 0.6130 0.7829
No log 2.7917 268 0.5623 0.6173 0.5623 0.7498
No log 2.8125 270 0.5644 0.6311 0.5644 0.7513
No log 2.8333 272 0.5548 0.6311 0.5548 0.7449
No log 2.8542 274 0.5475 0.6407 0.5475 0.7399
No log 2.875 276 0.5582 0.6107 0.5582 0.7471
No log 2.8958 278 0.5728 0.6230 0.5728 0.7568
No log 2.9167 280 0.5776 0.6569 0.5776 0.7600
No log 2.9375 282 0.5683 0.6219 0.5683 0.7538
No log 2.9583 284 0.5872 0.5988 0.5872 0.7663
No log 2.9792 286 0.5696 0.6456 0.5696 0.7547
No log 3.0 288 0.5708 0.6139 0.5708 0.7555
No log 3.0208 290 0.6329 0.5733 0.6329 0.7955
No log 3.0417 292 0.6498 0.5244 0.6498 0.8061
No log 3.0625 294 0.5818 0.5380 0.5818 0.7627
No log 3.0833 296 0.5656 0.5943 0.5656 0.7520
No log 3.1042 298 0.5891 0.5564 0.5891 0.7675
No log 3.125 300 0.6250 0.5758 0.6250 0.7906
No log 3.1458 302 0.6465 0.5217 0.6465 0.8041
No log 3.1667 304 0.6489 0.5195 0.6489 0.8055
No log 3.1875 306 0.7715 0.4829 0.7715 0.8784
No log 3.2083 308 0.8156 0.4444 0.8156 0.9031
No log 3.2292 310 0.6998 0.5637 0.6998 0.8365
No log 3.25 312 0.6644 0.5302 0.6644 0.8151
No log 3.2708 314 0.7176 0.5555 0.7176 0.8471
No log 3.2917 316 0.7167 0.5434 0.7167 0.8466
No log 3.3125 318 0.6657 0.5542 0.6657 0.8159
No log 3.3333 320 0.6326 0.5419 0.6326 0.7954
No log 3.3542 322 0.6207 0.6020 0.6207 0.7878
No log 3.375 324 0.6737 0.5637 0.6737 0.8208
No log 3.3958 326 0.7539 0.4815 0.7539 0.8683
No log 3.4167 328 0.6760 0.4862 0.6760 0.8222
No log 3.4375 330 0.6241 0.5550 0.6241 0.7900
No log 3.4583 332 0.6727 0.5464 0.6727 0.8202
No log 3.4792 334 0.6965 0.5588 0.6965 0.8345
No log 3.5 336 0.7334 0.4990 0.7334 0.8564
No log 3.5208 338 0.6644 0.5316 0.6644 0.8151
No log 3.5417 340 0.6191 0.5529 0.6191 0.7868
No log 3.5625 342 0.6195 0.5011 0.6195 0.7871
No log 3.5833 344 0.6848 0.4831 0.6848 0.8275
No log 3.6042 346 0.6779 0.4835 0.6779 0.8233
No log 3.625 348 0.6015 0.5515 0.6015 0.7756
No log 3.6458 350 0.5635 0.5943 0.5635 0.7507
No log 3.6667 352 0.5603 0.6173 0.5603 0.7485
No log 3.6875 354 0.5634 0.6269 0.5634 0.7506
No log 3.7083 356 0.6289 0.4847 0.6289 0.7930
No log 3.7292 358 0.7211 0.5320 0.7211 0.8492
No log 3.75 360 0.7216 0.4946 0.7216 0.8495
No log 3.7708 362 0.6564 0.5686 0.6564 0.8102
No log 3.7917 364 0.6154 0.6301 0.6154 0.7845
No log 3.8125 366 0.6210 0.5363 0.6210 0.7881
No log 3.8333 368 0.6009 0.5906 0.6009 0.7752
No log 3.8542 370 0.5856 0.5666 0.5856 0.7652
No log 3.875 372 0.5935 0.5784 0.5935 0.7704
No log 3.8958 374 0.6351 0.5572 0.6351 0.7969
No log 3.9167 376 0.6789 0.5181 0.6789 0.8239
No log 3.9375 378 0.6856 0.5192 0.6856 0.8280
No log 3.9583 380 0.6835 0.4958 0.6835 0.8267
No log 3.9792 382 0.6372 0.5380 0.6372 0.7983
No log 4.0 384 0.6289 0.6052 0.6289 0.7931
No log 4.0208 386 0.6306 0.5663 0.6306 0.7941
No log 4.0417 388 0.6514 0.5380 0.6514 0.8071
No log 4.0625 390 0.6447 0.5837 0.6447 0.8029
No log 4.0833 392 0.5957 0.6311 0.5957 0.7718
No log 4.1042 394 0.5853 0.6150 0.5853 0.7651
No log 4.125 396 0.5954 0.6052 0.5954 0.7716
No log 4.1458 398 0.6367 0.5593 0.6367 0.7979
No log 4.1667 400 0.6584 0.5593 0.6584 0.8114
No log 4.1875 402 0.6344 0.5003 0.6344 0.7965
No log 4.2083 404 0.6016 0.5784 0.6016 0.7756
No log 4.2292 406 0.5860 0.5918 0.5860 0.7655
No log 4.25 408 0.5825 0.5784 0.5825 0.7632
No log 4.2708 410 0.5941 0.5966 0.5941 0.7708
No log 4.2917 412 0.6148 0.6039 0.6148 0.7841
No log 4.3125 414 0.6028 0.6444 0.6028 0.7764
No log 4.3333 416 0.5832 0.5784 0.5832 0.7637
No log 4.3542 418 0.5915 0.6020 0.5915 0.7691
No log 4.375 420 0.6015 0.6020 0.6015 0.7755
No log 4.3958 422 0.6094 0.5889 0.6094 0.7806
No log 4.4167 424 0.6045 0.6008 0.6045 0.7775
No log 4.4375 426 0.6028 0.6062 0.6028 0.7764
No log 4.4583 428 0.5840 0.6824 0.5840 0.7642
No log 4.4792 430 0.5922 0.6451 0.5922 0.7696
No log 4.5 432 0.5876 0.6065 0.5876 0.7666
No log 4.5208 434 0.6256 0.6094 0.6256 0.7910
No log 4.5417 436 0.6798 0.5231 0.6798 0.8245
No log 4.5625 438 0.6799 0.5231 0.6799 0.8246
No log 4.5833 440 0.6465 0.5245 0.6465 0.8040
No log 4.6042 442 0.6537 0.5721 0.6537 0.8085
No log 4.625 444 0.6537 0.5698 0.6537 0.8085
No log 4.6458 446 0.6755 0.5333 0.6755 0.8219
No log 4.6667 448 0.7180 0.5709 0.7180 0.8473
No log 4.6875 450 0.6741 0.4862 0.6741 0.8210
No log 4.7083 452 0.6276 0.5640 0.6276 0.7922
No log 4.7292 454 0.6317 0.5640 0.6317 0.7948
No log 4.75 456 0.6356 0.5640 0.6356 0.7973
No log 4.7708 458 0.6261 0.5640 0.6261 0.7912
No log 4.7917 460 0.6303 0.5640 0.6303 0.7939
No log 4.8125 462 0.6442 0.5363 0.6442 0.8026
No log 4.8333 464 0.6281 0.5348 0.6281 0.7925
No log 4.8542 466 0.6010 0.6588 0.6010 0.7753
No log 4.875 468 0.6045 0.6122 0.6045 0.7775
No log 4.8958 470 0.6279 0.5763 0.6279 0.7924
No log 4.9167 472 0.6279 0.5975 0.6279 0.7924
No log 4.9375 474 0.5984 0.6018 0.5984 0.7736
No log 4.9583 476 0.6839 0.6028 0.6839 0.8270
No log 4.9792 478 0.7875 0.5246 0.7875 0.8874
No log 5.0 480 0.8078 0.4460 0.8078 0.8988
No log 5.0208 482 0.7508 0.5353 0.7508 0.8665
No log 5.0417 484 0.6581 0.4903 0.6581 0.8112
No log 5.0625 486 0.7242 0.5559 0.7242 0.8510
No log 5.0833 488 0.7773 0.5266 0.7773 0.8816
No log 5.1042 490 0.7356 0.5065 0.7356 0.8577
No log 5.125 492 0.6751 0.5712 0.6751 0.8216
No log 5.1458 494 0.6761 0.5650 0.6761 0.8223
No log 5.1667 496 0.7189 0.5876 0.7189 0.8479
No log 5.1875 498 0.6872 0.5894 0.6872 0.8290
0.4034 5.2083 500 0.6629 0.5316 0.6629 0.8142
0.4034 5.2292 502 0.6943 0.4608 0.6943 0.8332
0.4034 5.25 504 0.7294 0.4697 0.7294 0.8540
0.4034 5.2708 506 0.7290 0.4318 0.7290 0.8538
0.4034 5.2917 508 0.7295 0.5065 0.7295 0.8541
0.4034 5.3125 510 0.7170 0.5446 0.7170 0.8467
0.4034 5.3333 512 0.6524 0.5394 0.6524 0.8077
0.4034 5.3542 514 0.6460 0.5394 0.6460 0.8038
0.4034 5.375 516 0.6651 0.5259 0.6651 0.8156
0.4034 5.3958 518 0.6760 0.5259 0.6760 0.8222
0.4034 5.4167 520 0.6981 0.4473 0.6981 0.8355
0.4034 5.4375 522 0.7297 0.4473 0.7297 0.8543
0.4034 5.4583 524 0.7129 0.4473 0.7129 0.8443
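Validation Qwk in the table above peaks mid-training (0.6824 at epoch 4.4583, step 428) and falls to 0.4473 by the final step, so selecting a checkpoint by best Qwk rather than taking the last one may be worthwhile. A small sketch of scanning rows in the whitespace-separated format rendered above (the helper name is illustrative):

```python
def best_by_qwk(rows):
    """rows: lines like 'No log 4.4583 428 0.5840 0.6824 0.5840 0.7642'
    with columns Training Loss, Epoch, Step, Validation Loss, Qwk, Mse, Rmse.
    Returns (epoch, step, qwk) for the row with the highest Qwk."""
    best = None
    for line in rows:
        parts = line.split()
        # the training-loss column is either 'No log' (two tokens) or a number
        parts = parts[2:] if parts[0] == "No" else parts[1:]
        epoch, step, qwk = float(parts[0]), int(parts[1]), float(parts[3])
        if best is None or qwk > best[2]:
            best = (epoch, step, qwk)
    return best
```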

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

0.1B params (F32 tensors, Safetensors format)

Model tree

Fine-tuned from aubmindlab/bert-base-arabertv02 as MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization.