ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k16_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are typically computed follows the list):

  • Loss: 0.5928
  • QWK (Quadratic Weighted Kappa): 0.5882
  • MSE: 0.5928
  • RMSE: 0.7699
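
As a minimal sketch of how such figures are typically produced for ordinal essay scoring (the rounding step and the use of scikit-learn are assumptions; the card does not state the exact evaluation code):

```python
# Hedged sketch: computing QWK, MSE, and RMSE from raw model outputs.
# Rounding predictions to integer scores before QWK is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    # QWK needs discrete categories, so round the regression outputs first.
    qwk = cohen_kappa_score(
        labels.astype(int), np.rint(preds).astype(int), weights="quadratic"
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

print(compute_metrics(np.array([2.1, 3.4, 1.0]), np.array([2, 3, 1])))
```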

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
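
The same settings can be expressed as a Hugging Face TrainingArguments configuration. This is a minimal sketch: the output_dir is hypothetical, and the Trainer's default AdamW optimizer is assumed (it matches the logged Adam betas and epsilon):

```python
from transformers import TrainingArguments

# Hedged sketch of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="arabert-task5-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,    # matches the logged Adam betas
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```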

Training results

("No log" in the training-loss column means the loss had not yet been logged; the first logged value appears at step 500.)
Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.025 2 3.7775 0.0118 3.7775 1.9436
No log 0.05 4 1.9927 0.1039 1.9927 1.4116
No log 0.075 6 1.4398 0.0513 1.4398 1.1999
No log 0.1 8 1.0719 0.3117 1.0719 1.0353
No log 0.125 10 1.0893 0.2239 1.0893 1.0437
No log 0.15 12 1.1761 0.1091 1.1761 1.0845
No log 0.175 14 1.0507 0.2639 1.0507 1.0250
No log 0.2 16 1.0033 0.2787 1.0033 1.0016
No log 0.225 18 1.0689 0.1977 1.0689 1.0339
No log 0.25 20 0.8448 0.3310 0.8448 0.9192
No log 0.275 22 0.7871 0.4391 0.7871 0.8872
No log 0.3 24 0.8664 0.3518 0.8664 0.9308
No log 0.325 26 1.1082 0.3893 1.1082 1.0527
No log 0.35 28 0.9664 0.5273 0.9664 0.9830
No log 0.375 30 0.6528 0.5451 0.6528 0.8080
No log 0.4 32 0.6725 0.7019 0.6725 0.8201
No log 0.425 34 0.6507 0.6324 0.6507 0.8067
No log 0.45 36 0.9353 0.4522 0.9353 0.9671
No log 0.475 38 0.8264 0.5813 0.8264 0.9091
No log 0.5 40 0.6788 0.6345 0.6788 0.8239
No log 0.525 42 0.6901 0.6293 0.6901 0.8307
No log 0.55 44 0.6904 0.6075 0.6904 0.8309
No log 0.575 46 0.7535 0.6289 0.7535 0.8681
No log 0.6 48 0.6776 0.6324 0.6776 0.8232
No log 0.625 50 0.6522 0.6157 0.6522 0.8076
No log 0.65 52 0.6477 0.5861 0.6477 0.8048
No log 0.675 54 0.6645 0.6528 0.6645 0.8152
No log 0.7 56 0.6459 0.6157 0.6459 0.8037
No log 0.725 58 0.7128 0.6148 0.7128 0.8443
No log 0.75 60 1.0557 0.5273 1.0557 1.0275
No log 0.775 62 1.1976 0.4630 1.1976 1.0943
No log 0.8 64 0.9668 0.5643 0.9668 0.9833
No log 0.825 66 0.8287 0.5843 0.8287 0.9104
No log 0.85 68 0.6936 0.5912 0.6936 0.8328
No log 0.875 70 0.7132 0.5610 0.7132 0.8445
No log 0.9 72 0.7244 0.5833 0.7244 0.8511
No log 0.925 74 0.6781 0.6148 0.6781 0.8235
No log 0.95 76 0.6852 0.6148 0.6852 0.8278
No log 0.975 78 0.7386 0.5898 0.7386 0.8594
No log 1.0 80 0.7920 0.5429 0.7920 0.8899
No log 1.025 82 0.7083 0.6025 0.7083 0.8416
No log 1.05 84 0.6730 0.6439 0.6730 0.8203
No log 1.075 86 0.6796 0.6078 0.6796 0.8244
No log 1.1 88 0.9892 0.5387 0.9892 0.9946
No log 1.125 90 1.1044 0.4846 1.1044 1.0509
No log 1.15 92 0.7995 0.5034 0.7995 0.8942
No log 1.175 94 0.6738 0.6068 0.6738 0.8209
No log 1.2 96 0.6828 0.6928 0.6828 0.8263
No log 1.225 98 0.6821 0.6014 0.6821 0.8259
No log 1.25 100 0.8143 0.6472 0.8143 0.9024
No log 1.275 102 0.7387 0.6580 0.7387 0.8595
No log 1.3 104 0.7305 0.5955 0.7305 0.8547
No log 1.325 106 0.7050 0.6565 0.7050 0.8396
No log 1.35 108 0.7117 0.5981 0.7117 0.8436
No log 1.375 110 0.7579 0.6115 0.7579 0.8706
No log 1.4 112 0.6994 0.6441 0.6994 0.8363
No log 1.425 114 0.6429 0.6473 0.6429 0.8018
No log 1.45 116 0.6662 0.6493 0.6662 0.8162
No log 1.475 118 0.6638 0.6493 0.6638 0.8148
No log 1.5 120 0.6479 0.6067 0.6479 0.8049
No log 1.525 122 0.7102 0.5653 0.7102 0.8427
No log 1.55 124 0.7641 0.5142 0.7641 0.8742
No log 1.575 126 0.7395 0.6091 0.7395 0.8599
No log 1.6 128 0.6507 0.6354 0.6507 0.8066
No log 1.625 130 0.6426 0.6760 0.6426 0.8016
No log 1.65 132 0.6413 0.6692 0.6413 0.8008
No log 1.675 134 0.6245 0.7063 0.6245 0.7902
No log 1.7 136 0.5724 0.7095 0.5724 0.7565
No log 1.725 138 0.5889 0.6828 0.5889 0.7674
No log 1.75 140 0.6862 0.6356 0.6862 0.8284
No log 1.775 142 0.7210 0.5821 0.7210 0.8491
No log 1.8 144 0.6421 0.6686 0.6421 0.8013
No log 1.825 146 0.5618 0.7129 0.5618 0.7495
No log 1.85 148 0.5936 0.7198 0.5936 0.7704
No log 1.875 150 0.7723 0.5438 0.7723 0.8788
No log 1.9 152 0.7134 0.6092 0.7134 0.8446
No log 1.925 154 0.6367 0.6909 0.6367 0.7979
No log 1.95 156 0.6930 0.6554 0.6930 0.8325
No log 1.975 158 0.6370 0.7056 0.6370 0.7981
No log 2.0 160 0.7330 0.5739 0.7330 0.8561
No log 2.025 162 0.7925 0.5170 0.7925 0.8902
No log 2.05 164 0.6669 0.5902 0.6669 0.8166
No log 2.075 166 0.6361 0.6415 0.6361 0.7976
No log 2.1 168 0.7405 0.5998 0.7405 0.8605
No log 2.125 170 0.6981 0.6340 0.6981 0.8355
No log 2.15 172 0.6040 0.6610 0.6040 0.7772
No log 2.175 174 0.7090 0.4686 0.7090 0.8420
No log 2.2 176 0.7559 0.4808 0.7559 0.8694
No log 2.225 178 0.6850 0.5442 0.6850 0.8276
No log 2.25 180 0.6427 0.6973 0.6427 0.8017
No log 2.275 182 0.6603 0.6674 0.6603 0.8126
No log 2.3 184 0.8979 0.4930 0.8979 0.9476
No log 2.325 186 0.9966 0.4870 0.9966 0.9983
No log 2.35 188 0.7170 0.5918 0.7170 0.8468
No log 2.375 190 0.6465 0.6716 0.6465 0.8040
No log 2.4 192 0.6234 0.5845 0.6234 0.7895
No log 2.425 194 0.6087 0.6196 0.6087 0.7802
No log 2.45 196 0.5887 0.6526 0.5887 0.7673
No log 2.475 198 0.6039 0.6470 0.6039 0.7771
No log 2.5 200 0.5884 0.6602 0.5884 0.7671
No log 2.525 202 0.6144 0.6679 0.6144 0.7838
No log 2.55 204 0.6056 0.6679 0.6056 0.7782
No log 2.575 206 0.6050 0.7221 0.6050 0.7778
No log 2.6 208 0.5768 0.7158 0.5768 0.7595
No log 2.625 210 0.5712 0.7042 0.5712 0.7558
No log 2.65 212 0.5906 0.6616 0.5906 0.7685
No log 2.675 214 0.6071 0.5771 0.6071 0.7792
No log 2.7 216 0.6164 0.5999 0.6164 0.7851
No log 2.725 218 0.6631 0.5759 0.6631 0.8143
No log 2.75 220 0.6483 0.6653 0.6483 0.8052
No log 2.775 222 0.6164 0.6909 0.6164 0.7851
No log 2.8 224 0.7833 0.6293 0.7833 0.8850
No log 2.825 226 0.7839 0.6293 0.7839 0.8854
No log 2.85 228 0.6260 0.6955 0.6260 0.7912
No log 2.875 230 0.7126 0.5728 0.7126 0.8442
No log 2.9 232 0.9470 0.5027 0.9470 0.9732
No log 2.925 234 0.8651 0.4787 0.8651 0.9301
No log 2.95 236 0.6460 0.6080 0.6460 0.8038
No log 2.975 238 0.5841 0.6380 0.5841 0.7643
No log 3.0 240 0.5880 0.6704 0.5880 0.7668
No log 3.025 242 0.5853 0.6491 0.5853 0.7651
No log 3.05 244 0.5954 0.6575 0.5954 0.7716
No log 3.075 246 0.6118 0.6206 0.6118 0.7822
No log 3.1 248 0.6217 0.6377 0.6217 0.7885
No log 3.125 250 0.6500 0.6374 0.6500 0.8062
No log 3.15 252 0.6249 0.6662 0.6249 0.7905
No log 3.175 254 0.6015 0.6274 0.6015 0.7756
No log 3.2 256 0.5582 0.6689 0.5582 0.7471
No log 3.225 258 0.5799 0.6830 0.5799 0.7615
No log 3.25 260 0.6466 0.6845 0.6466 0.8041
No log 3.275 262 0.6347 0.6845 0.6347 0.7967
No log 3.3 264 0.5678 0.7005 0.5678 0.7535
No log 3.325 266 0.6026 0.6772 0.6026 0.7763
No log 3.35 268 0.6526 0.6507 0.6526 0.8078
No log 3.375 270 0.6361 0.6973 0.6361 0.7975
No log 3.4 272 0.6161 0.6537 0.6161 0.7849
No log 3.425 274 0.6195 0.6830 0.6195 0.7871
No log 3.45 276 0.6407 0.6716 0.6407 0.8004
No log 3.475 278 0.6623 0.6774 0.6623 0.8138
No log 3.5 280 0.6087 0.6589 0.6087 0.7802
No log 3.525 282 0.5741 0.7005 0.5741 0.7577
No log 3.55 284 0.5620 0.6748 0.5620 0.7497
No log 3.575 286 0.5476 0.6846 0.5476 0.7400
No log 3.6 288 0.5380 0.6846 0.5380 0.7335
No log 3.625 290 0.5307 0.6853 0.5307 0.7285
No log 3.65 292 0.5267 0.7012 0.5267 0.7257
No log 3.675 294 0.5246 0.6896 0.5246 0.7243
No log 3.7 296 0.5320 0.6896 0.5320 0.7294
No log 3.725 298 0.5321 0.6995 0.5321 0.7294
No log 3.75 300 0.5887 0.6962 0.5887 0.7673
No log 3.775 302 0.6877 0.6388 0.6877 0.8293
No log 3.8 304 0.6859 0.6476 0.6859 0.8282
No log 3.825 306 0.6238 0.6197 0.6238 0.7898
No log 3.85 308 0.6071 0.6625 0.6071 0.7792
No log 3.875 310 0.6137 0.6006 0.6137 0.7834
No log 3.9 312 0.6009 0.6118 0.6009 0.7752
No log 3.925 314 0.5835 0.6575 0.5835 0.7639
No log 3.95 316 0.6028 0.6446 0.6028 0.7764
No log 3.975 318 0.5925 0.6555 0.5925 0.7698
No log 4.0 320 0.5599 0.6888 0.5599 0.7483
No log 4.025 322 0.5434 0.6888 0.5434 0.7372
No log 4.05 324 0.5396 0.6689 0.5396 0.7346
No log 4.075 326 0.5295 0.6888 0.5295 0.7277
No log 4.1 328 0.5279 0.6944 0.5279 0.7265
No log 4.125 330 0.5435 0.6672 0.5435 0.7372
No log 4.15 332 0.5507 0.6288 0.5507 0.7421
No log 4.175 334 0.5533 0.6882 0.5533 0.7438
No log 4.2 336 0.6084 0.6899 0.6084 0.7800
No log 4.225 338 0.6113 0.7018 0.6113 0.7819
No log 4.25 340 0.5841 0.7080 0.5841 0.7642
No log 4.275 342 0.5940 0.6939 0.5940 0.7707
No log 4.3 344 0.5843 0.6903 0.5843 0.7644
No log 4.325 346 0.5949 0.7266 0.5949 0.7713
No log 4.35 348 0.6331 0.6602 0.6331 0.7956
No log 4.375 350 0.6920 0.5798 0.6920 0.8318
No log 4.4 352 0.6589 0.6073 0.6589 0.8118
No log 4.425 354 0.6241 0.6087 0.6241 0.7900
No log 4.45 356 0.6208 0.6154 0.6208 0.7879
No log 4.475 358 0.6285 0.6435 0.6285 0.7928
No log 4.5 360 0.6322 0.6009 0.6322 0.7951
No log 4.525 362 0.6070 0.6828 0.6070 0.7791
No log 4.55 364 0.6025 0.6667 0.6025 0.7762
No log 4.575 366 0.6091 0.5961 0.6091 0.7804
No log 4.6 368 0.6101 0.5747 0.6101 0.7811
No log 4.625 370 0.6174 0.6067 0.6174 0.7858
No log 4.65 372 0.6210 0.6932 0.6210 0.7880
No log 4.675 374 0.6581 0.6860 0.6581 0.8113
No log 4.7 376 0.7392 0.5672 0.7392 0.8598
No log 4.725 378 0.7522 0.5672 0.7522 0.8673
No log 4.75 380 0.6975 0.6160 0.6975 0.8351
No log 4.775 382 0.6213 0.6133 0.6213 0.7883
No log 4.8 384 0.6397 0.6207 0.6397 0.7998
No log 4.825 386 0.6542 0.6097 0.6542 0.8088
No log 4.85 388 0.6205 0.6207 0.6205 0.7877
No log 4.875 390 0.6587 0.5821 0.6587 0.8116
No log 4.9 392 0.7730 0.5318 0.7730 0.8792
No log 4.925 394 0.7749 0.5208 0.7749 0.8803
No log 4.95 396 0.6924 0.5677 0.6924 0.8321
No log 4.975 398 0.6251 0.6753 0.6251 0.7907
No log 5.0 400 0.6204 0.6259 0.6204 0.7877
No log 5.025 402 0.6227 0.6067 0.6227 0.7891
No log 5.05 404 0.6226 0.6057 0.6226 0.7890
No log 5.075 406 0.6426 0.5582 0.6426 0.8016
No log 5.1 408 0.6540 0.5570 0.6540 0.8087
No log 5.125 410 0.6327 0.5582 0.6327 0.7954
No log 5.15 412 0.6201 0.6164 0.6201 0.7874
No log 5.175 414 0.6168 0.6057 0.6168 0.7853
No log 5.2 416 0.6309 0.6256 0.6309 0.7943
No log 5.225 418 0.6765 0.5908 0.6765 0.8225
No log 5.25 420 0.6872 0.6561 0.6872 0.8290
No log 5.275 422 0.6585 0.6452 0.6585 0.8115
No log 5.3 424 0.6224 0.6174 0.6224 0.7889
No log 5.325 426 0.6151 0.6658 0.6151 0.7843
No log 5.35 428 0.6076 0.6658 0.6076 0.7795
No log 5.375 430 0.6065 0.6259 0.6065 0.7788
No log 5.4 432 0.6123 0.6658 0.6123 0.7825
No log 5.425 434 0.6176 0.6470 0.6176 0.7859
No log 5.45 436 0.6279 0.6354 0.6279 0.7924
No log 5.475 438 0.6248 0.5759 0.6248 0.7904
No log 5.5 440 0.6310 0.5759 0.6310 0.7943
No log 5.525 442 0.6397 0.5966 0.6397 0.7998
No log 5.55 444 0.6449 0.5939 0.6449 0.8031
No log 5.575 446 0.6468 0.5759 0.6468 0.8042
No log 5.6 448 0.6448 0.5759 0.6448 0.8030
No log 5.625 450 0.6377 0.5517 0.6377 0.7986
No log 5.65 452 0.6250 0.6354 0.6250 0.7906
No log 5.675 454 0.6082 0.6500 0.6082 0.7799
No log 5.7 456 0.5929 0.6518 0.5929 0.7700
No log 5.725 458 0.5913 0.6545 0.5913 0.7690
No log 5.75 460 0.5986 0.6780 0.5986 0.7737
No log 5.775 462 0.5934 0.6649 0.5934 0.7704
No log 5.8 464 0.5908 0.6057 0.5908 0.7686
No log 5.825 466 0.5939 0.6556 0.5939 0.7707
No log 5.85 468 0.6098 0.6119 0.6098 0.7809
No log 5.875 470 0.6021 0.6144 0.6021 0.7759
No log 5.9 472 0.5811 0.6237 0.5811 0.7623
No log 5.925 474 0.5807 0.6584 0.5807 0.7620
No log 5.95 476 0.5763 0.6584 0.5763 0.7592
No log 5.975 478 0.5780 0.6237 0.5780 0.7603
No log 6.0 480 0.5729 0.6584 0.5729 0.7569
No log 6.025 482 0.5767 0.6470 0.5767 0.7594
No log 6.05 484 0.5773 0.6470 0.5773 0.7598
No log 6.075 486 0.5853 0.6437 0.5853 0.7651
No log 6.1 488 0.5901 0.6164 0.5901 0.7682
No log 6.125 490 0.5984 0.6046 0.5984 0.7736
No log 6.15 492 0.5980 0.6018 0.5980 0.7733
No log 6.175 494 0.5893 0.6249 0.5893 0.7677
No log 6.2 496 0.5974 0.6237 0.5974 0.7729
No log 6.225 498 0.5947 0.6237 0.5947 0.7712
0.2677 6.25 500 0.5899 0.6049 0.5899 0.7680
0.2677 6.275 502 0.5676 0.6690 0.5676 0.7534
0.2677 6.3 504 0.5801 0.6415 0.5801 0.7617
0.2677 6.325 506 0.6064 0.6647 0.6064 0.7787
0.2677 6.35 508 0.5967 0.6510 0.5967 0.7724
0.2677 6.375 510 0.5680 0.6528 0.5680 0.7537
0.2677 6.4 512 0.5724 0.6207 0.5724 0.7566
0.2677 6.425 514 0.6374 0.5987 0.6374 0.7984
0.2677 6.45 516 0.6557 0.6388 0.6557 0.8097
0.2677 6.475 518 0.6068 0.7172 0.6068 0.7790
0.2677 6.5 520 0.5733 0.6561 0.5733 0.7572
0.2677 6.525 522 0.6295 0.6664 0.6295 0.7934
0.2677 6.55 524 0.6662 0.6677 0.6662 0.8162
0.2677 6.575 526 0.6414 0.6528 0.6414 0.8009
0.2677 6.6 528 0.6133 0.5966 0.6133 0.7831
0.2677 6.625 530 0.6249 0.5425 0.6249 0.7905
0.2677 6.65 532 0.6468 0.5202 0.6468 0.8042
0.2677 6.675 534 0.6354 0.5446 0.6354 0.7971
0.2677 6.7 536 0.5928 0.5882 0.5928 0.7699

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
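
A minimal usage sketch, assuming the checkpoint is hosted under the repo id shown on this card and carries a single-output regression head (inferred from the MSE/RMSE metrics above, not stated explicitly):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k16_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # "the essay text here"
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumed organization score
print(score)
```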