ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4975
  • QWK (quadratic weighted kappa): 0.4795
  • MSE: 0.4975
  • RMSE: 0.7053

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
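
With lr_scheduler_type: linear and no warmup reported, the learning rate decays linearly from 2e-05 toward 0 over the scheduled training horizon. A sketch of that schedule; the total of 4000 optimizer steps is an inference (the eval log below shows about 40 steps per epoch, times the scheduled 100 epochs), not a reported value:

```python
BASE_LR = 2e-05
TOTAL_STEPS = 4000  # inferred: ~40 steps/epoch x 100 scheduled epochs

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Learning rate at a given optimizer step under a linear decay to zero."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

For example, halfway through the scheduled run (step 2000) the learning rate has halved to 1e-05.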

Training results

Training loss is logged every 500 steps, so rows before step 500 read "No log". Although num_epochs was set to 100, the log ends at epoch 13.6, so training appears to have stopped early.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.05 2 2.4010 -0.0495 2.4010 1.5495
No log 0.1 4 0.9716 0.2483 0.9716 0.9857
No log 0.15 6 0.6828 0.1646 0.6828 0.8263
No log 0.2 8 0.7133 0.3267 0.7133 0.8446
No log 0.25 10 0.6301 0.3081 0.6301 0.7938
No log 0.3 12 0.6152 0.4539 0.6152 0.7844
No log 0.35 14 0.5672 0.4338 0.5672 0.7531
No log 0.4 16 0.6164 0.4749 0.6164 0.7851
No log 0.45 18 0.5518 0.4494 0.5518 0.7428
No log 0.5 20 0.5961 0.3953 0.5961 0.7721
No log 0.55 22 0.4984 0.4698 0.4984 0.7060
No log 0.6 24 0.4724 0.5672 0.4724 0.6873
No log 0.65 26 0.4561 0.5797 0.4561 0.6753
No log 0.7 28 0.4763 0.6182 0.4763 0.6902
No log 0.75 30 0.7590 0.4764 0.7590 0.8712
No log 0.8 32 1.0096 0.4528 1.0096 1.0048
No log 0.85 34 0.9166 0.4804 0.9166 0.9574
No log 0.9 36 0.7614 0.4666 0.7614 0.8726
No log 0.95 38 0.5328 0.6032 0.5328 0.7299
No log 1.0 40 0.4914 0.6909 0.4914 0.7010
No log 1.05 42 0.4639 0.6750 0.4639 0.6811
No log 1.1 44 0.4501 0.7147 0.4501 0.6709
No log 1.15 46 0.4626 0.7147 0.4626 0.6802
No log 1.2 48 0.4493 0.6778 0.4493 0.6703
No log 1.25 50 0.4474 0.6561 0.4474 0.6689
No log 1.3 52 0.4821 0.6692 0.4821 0.6944
No log 1.35 54 0.5732 0.4811 0.5732 0.7571
No log 1.4 56 0.4486 0.6849 0.4486 0.6698
No log 1.45 58 0.4969 0.6121 0.4969 0.7049
No log 1.5 60 0.4447 0.6637 0.4447 0.6668
No log 1.55 62 0.5441 0.5831 0.5441 0.7376
No log 1.6 64 0.4880 0.6470 0.4880 0.6985
No log 1.65 66 0.4778 0.6788 0.4778 0.6912
No log 1.7 68 0.5101 0.6934 0.5101 0.7142
No log 1.75 70 0.7538 0.4573 0.7538 0.8682
No log 1.8 72 0.7475 0.5077 0.7475 0.8646
No log 1.85 74 0.5191 0.7179 0.5191 0.7205
No log 1.9 76 0.5394 0.6025 0.5394 0.7345
No log 1.95 78 0.4912 0.6539 0.4912 0.7008
No log 2.0 80 0.4902 0.6292 0.4902 0.7002
No log 2.05 82 0.7207 0.4748 0.7207 0.8489
No log 2.1 84 0.7836 0.4520 0.7836 0.8852
No log 2.15 86 0.5778 0.5340 0.5778 0.7602
No log 2.2 88 0.4806 0.6683 0.4806 0.6932
No log 2.25 90 0.4783 0.6572 0.4783 0.6916
No log 2.3 92 0.5325 0.5667 0.5325 0.7298
No log 2.35 94 0.5402 0.5667 0.5402 0.7350
No log 2.4 96 0.6240 0.5655 0.6240 0.7899
No log 2.45 98 0.6222 0.5625 0.6222 0.7888
No log 2.5 100 0.5411 0.5696 0.5411 0.7356
No log 2.55 102 0.4788 0.6368 0.4788 0.6919
No log 2.6 104 0.4839 0.5919 0.4839 0.6956
No log 2.65 106 0.4590 0.6503 0.4590 0.6775
No log 2.7 108 0.4360 0.5947 0.4360 0.6603
No log 2.75 110 0.5050 0.5986 0.5050 0.7106
No log 2.8 112 0.4896 0.5966 0.4896 0.6997
No log 2.85 114 0.4463 0.6389 0.4463 0.6680
No log 2.9 116 0.8296 0.4779 0.8296 0.9108
No log 2.95 118 1.0387 0.3126 1.0387 1.0192
No log 3.0 120 0.8371 0.4429 0.8371 0.9149
No log 3.05 122 0.5099 0.5956 0.5099 0.7141
No log 3.1 124 0.6519 0.5471 0.6519 0.8074
No log 3.15 126 0.9738 0.5144 0.9738 0.9868
No log 3.2 128 0.8575 0.5530 0.8575 0.9260
No log 3.25 130 0.5297 0.5639 0.5297 0.7278
No log 3.3 132 0.5038 0.5983 0.5038 0.7098
No log 3.35 134 0.7016 0.4748 0.7016 0.8376
No log 3.4 136 0.7132 0.4699 0.7132 0.8445
No log 3.45 138 0.5398 0.6314 0.5398 0.7347
No log 3.5 140 0.4805 0.5580 0.4805 0.6932
No log 3.55 142 0.5608 0.5223 0.5608 0.7489
No log 3.6 144 0.5415 0.5166 0.5415 0.7359
No log 3.65 146 0.4708 0.6624 0.4708 0.6862
No log 3.7 148 0.5404 0.5101 0.5404 0.7351
No log 3.75 150 0.5616 0.5486 0.5616 0.7494
No log 3.8 152 0.5043 0.6431 0.5043 0.7101
No log 3.85 154 0.4645 0.6383 0.4645 0.6815
No log 3.9 156 0.4839 0.5538 0.4839 0.6956
No log 3.95 158 0.4811 0.5701 0.4811 0.6936
No log 4.0 160 0.4437 0.6115 0.4437 0.6661
No log 4.05 162 0.4269 0.6115 0.4269 0.6534
No log 4.1 164 0.4219 0.6242 0.4219 0.6495
No log 4.15 166 0.4218 0.6330 0.4218 0.6495
No log 4.2 168 0.4364 0.6115 0.4364 0.6606
No log 4.25 170 0.4358 0.6052 0.4358 0.6602
No log 4.3 172 0.4402 0.6341 0.4402 0.6635
No log 4.35 174 0.4267 0.6254 0.4267 0.6532
No log 4.4 176 0.4444 0.6414 0.4444 0.6666
No log 4.45 178 0.4321 0.6282 0.4321 0.6573
No log 4.5 180 0.4221 0.5941 0.4221 0.6497
No log 4.55 182 0.4323 0.6132 0.4323 0.6575
No log 4.6 184 0.4253 0.5926 0.4253 0.6521
No log 4.65 186 0.4561 0.6013 0.4561 0.6754
No log 4.7 188 0.5174 0.6074 0.5174 0.7193
No log 4.75 190 0.4741 0.5933 0.4741 0.6886
No log 4.8 192 0.4492 0.5816 0.4492 0.6703
No log 4.85 194 0.6767 0.5650 0.6767 0.8226
No log 4.9 196 0.7614 0.5254 0.7614 0.8726
No log 4.95 198 0.5958 0.5827 0.5958 0.7719
No log 5.0 200 0.4551 0.5476 0.4551 0.6746
No log 5.05 202 0.5223 0.5997 0.5223 0.7227
No log 5.1 204 0.5480 0.6010 0.5480 0.7403
No log 5.15 206 0.4955 0.6537 0.4955 0.7039
No log 5.2 208 0.4566 0.5993 0.4566 0.6757
No log 5.25 210 0.4513 0.5888 0.4513 0.6718
No log 5.3 212 0.4528 0.5357 0.4528 0.6729
No log 5.35 214 0.4474 0.5288 0.4474 0.6688
No log 5.4 216 0.4496 0.5617 0.4496 0.6705
No log 5.45 218 0.4728 0.5947 0.4728 0.6876
No log 5.5 220 0.4802 0.6354 0.4802 0.6930
No log 5.55 222 0.4609 0.6402 0.4609 0.6789
No log 5.6 224 0.4627 0.6494 0.4627 0.6802
No log 5.65 226 0.4762 0.5447 0.4762 0.6901
No log 5.7 228 0.5043 0.5524 0.5043 0.7102
No log 5.75 230 0.5024 0.5484 0.5024 0.7088
No log 5.8 232 0.4818 0.5597 0.4818 0.6941
No log 5.85 234 0.4816 0.6234 0.4816 0.6940
No log 5.9 236 0.5025 0.5906 0.5025 0.7088
No log 5.95 238 0.4758 0.6222 0.4758 0.6898
No log 6.0 240 0.4645 0.6330 0.4645 0.6815
No log 6.05 242 0.4707 0.5411 0.4707 0.6861
No log 6.1 244 0.4476 0.5995 0.4476 0.6690
No log 6.15 246 0.4536 0.6269 0.4536 0.6735
No log 6.2 248 0.4838 0.6442 0.4838 0.6956
No log 6.25 250 0.4883 0.6947 0.4883 0.6988
No log 6.3 252 0.4973 0.5888 0.4973 0.7052
No log 6.35 254 0.5307 0.5992 0.5307 0.7285
No log 6.4 256 0.5157 0.5909 0.5157 0.7181
No log 6.45 258 0.5019 0.5954 0.5019 0.7084
No log 6.5 260 0.4750 0.5941 0.4750 0.6892
No log 6.55 262 0.4844 0.5231 0.4844 0.6960
No log 6.6 264 0.4812 0.5042 0.4812 0.6937
No log 6.65 266 0.5027 0.5955 0.5027 0.7090
No log 6.7 268 0.5903 0.4554 0.5903 0.7683
No log 6.75 270 0.6380 0.4387 0.6380 0.7987
No log 6.8 272 0.5737 0.4554 0.5737 0.7575
No log 6.85 274 0.4975 0.5111 0.4975 0.7053
No log 6.9 276 0.4671 0.5042 0.4671 0.6835
No log 6.95 278 0.4706 0.5687 0.4706 0.6860
No log 7.0 280 0.5009 0.6637 0.5009 0.7077
No log 7.05 282 0.4868 0.6620 0.4868 0.6977
No log 7.1 284 0.4861 0.6617 0.4861 0.6972
No log 7.15 286 0.4848 0.7075 0.4848 0.6963
No log 7.2 288 0.4897 0.6690 0.4897 0.6998
No log 7.25 290 0.4942 0.6775 0.4942 0.7030
No log 7.3 292 0.4873 0.6356 0.4873 0.6981
No log 7.35 294 0.4828 0.6034 0.4828 0.6948
No log 7.4 296 0.4621 0.5522 0.4621 0.6798
No log 7.45 298 0.4655 0.5404 0.4655 0.6823
No log 7.5 300 0.4883 0.4875 0.4883 0.6988
No log 7.55 302 0.4827 0.5404 0.4827 0.6948
No log 7.6 304 0.4766 0.5503 0.4766 0.6903
No log 7.65 306 0.4690 0.5503 0.4690 0.6848
No log 7.7 308 0.4739 0.4895 0.4739 0.6884
No log 7.75 310 0.4559 0.5141 0.4559 0.6752
No log 7.8 312 0.4540 0.5574 0.4540 0.6738
No log 7.85 314 0.4608 0.5272 0.4608 0.6788
No log 7.9 316 0.4442 0.5692 0.4442 0.6665
No log 7.95 318 0.4661 0.5438 0.4661 0.6827
No log 8.0 320 0.4456 0.5904 0.4456 0.6676
No log 8.05 322 0.4320 0.5846 0.4320 0.6573
No log 8.1 324 0.4980 0.5166 0.4980 0.7057
No log 8.15 326 0.4991 0.5512 0.4991 0.7065
No log 8.2 328 0.4769 0.5190 0.4769 0.6905
No log 8.25 330 0.4614 0.5711 0.4614 0.6792
No log 8.3 332 0.4656 0.5742 0.4656 0.6824
No log 8.35 334 0.4507 0.5860 0.4507 0.6714
No log 8.4 336 0.4455 0.6125 0.4455 0.6675
No log 8.45 338 0.4444 0.5986 0.4444 0.6666
No log 8.5 340 0.4571 0.5550 0.4571 0.6761
No log 8.55 342 0.4595 0.5852 0.4595 0.6778
No log 8.6 344 0.4443 0.5521 0.4443 0.6666
No log 8.65 346 0.4470 0.5044 0.4470 0.6686
No log 8.7 348 0.4516 0.4677 0.4516 0.6720
No log 8.75 350 0.4656 0.5748 0.4656 0.6823
No log 8.8 352 0.4527 0.6158 0.4527 0.6729
No log 8.85 354 0.4761 0.6550 0.4761 0.6900
No log 8.9 356 0.4744 0.6730 0.4744 0.6888
No log 8.95 358 0.4671 0.7077 0.4671 0.6834
No log 9.0 360 0.4574 0.7077 0.4574 0.6763
No log 9.05 362 0.4459 0.6747 0.4459 0.6678
No log 9.1 364 0.4360 0.7119 0.4360 0.6603
No log 9.15 366 0.4325 0.6514 0.4325 0.6577
No log 9.2 368 0.4529 0.6330 0.4529 0.6730
No log 9.25 370 0.4655 0.5718 0.4655 0.6823
No log 9.3 372 0.4609 0.5098 0.4609 0.6789
No log 9.35 374 0.4496 0.4590 0.4496 0.6706
No log 9.4 376 0.4714 0.5411 0.4714 0.6866
No log 9.45 378 0.4790 0.5882 0.4790 0.6921
No log 9.5 380 0.4721 0.4752 0.4721 0.6871
No log 9.55 382 0.4673 0.5422 0.4673 0.6836
No log 9.6 384 0.4737 0.5444 0.4737 0.6883
No log 9.65 386 0.4790 0.5357 0.4790 0.6921
No log 9.7 388 0.4897 0.5373 0.4897 0.6998
No log 9.75 390 0.4723 0.5961 0.4723 0.6872
No log 9.8 392 0.4598 0.5009 0.4598 0.6781
No log 9.85 394 0.4943 0.5528 0.4943 0.7030
No log 9.9 396 0.4831 0.5300 0.4831 0.6951
No log 9.95 398 0.4522 0.4703 0.4522 0.6724
No log 10.0 400 0.4519 0.5648 0.4519 0.6722
No log 10.05 402 0.4538 0.5286 0.4538 0.6736
No log 10.1 404 0.4546 0.4703 0.4546 0.6742
No log 10.15 406 0.4580 0.4448 0.4580 0.6768
No log 10.2 408 0.4588 0.4538 0.4588 0.6773
No log 10.25 410 0.4699 0.4767 0.4699 0.6855
No log 10.3 412 0.4953 0.5153 0.4953 0.7038
No log 10.35 414 0.5060 0.5390 0.5060 0.7114
No log 10.4 416 0.4695 0.4494 0.4695 0.6852
No log 10.45 418 0.4620 0.5437 0.4620 0.6797
No log 10.5 420 0.4760 0.5601 0.4760 0.6899
No log 10.55 422 0.4552 0.5593 0.4552 0.6747
No log 10.6 424 0.4587 0.5736 0.4587 0.6773
No log 10.65 426 0.4733 0.6066 0.4733 0.6880
No log 10.7 428 0.4546 0.5611 0.4546 0.6743
No log 10.75 430 0.4513 0.5672 0.4513 0.6718
No log 10.8 432 0.4525 0.5970 0.4525 0.6727
No log 10.85 434 0.4607 0.5784 0.4607 0.6788
No log 10.9 436 0.4681 0.6298 0.4681 0.6842
No log 10.95 438 0.4785 0.5847 0.4785 0.6917
No log 11.0 440 0.4623 0.6083 0.4623 0.6799
No log 11.05 442 0.4466 0.5507 0.4466 0.6683
No log 11.1 444 0.4651 0.4888 0.4651 0.6820
No log 11.15 446 0.4821 0.4653 0.4821 0.6944
No log 11.2 448 0.4820 0.4517 0.4820 0.6943
No log 11.25 450 0.4676 0.3916 0.4676 0.6838
No log 11.3 452 0.4646 0.4224 0.4646 0.6816
No log 11.35 454 0.4733 0.5071 0.4733 0.6879
No log 11.4 456 0.4796 0.5133 0.4796 0.6925
No log 11.45 458 0.4792 0.4847 0.4792 0.6922
No log 11.5 460 0.4884 0.3958 0.4884 0.6989
No log 11.55 462 0.4848 0.3780 0.4848 0.6963
No log 11.6 464 0.4861 0.4591 0.4861 0.6972
No log 11.65 466 0.4802 0.4837 0.4802 0.6929
No log 11.7 468 0.4700 0.4752 0.4700 0.6856
No log 11.75 470 0.4698 0.4908 0.4698 0.6854
No log 11.8 472 0.4648 0.4908 0.4648 0.6817
No log 11.85 474 0.4521 0.5379 0.4521 0.6724
No log 11.9 476 0.4733 0.5300 0.4733 0.6879
No log 11.95 478 0.4979 0.5687 0.4979 0.7056
No log 12.0 480 0.4891 0.5317 0.4891 0.6993
No log 12.05 482 0.4578 0.4746 0.4578 0.6766
No log 12.1 484 0.4451 0.5521 0.4451 0.6671
No log 12.15 486 0.4647 0.5698 0.4647 0.6817
No log 12.2 488 0.4553 0.4908 0.4553 0.6747
No log 12.25 490 0.4518 0.5521 0.4518 0.6722
No log 12.3 492 0.4669 0.5057 0.4669 0.6833
No log 12.35 494 0.4723 0.5057 0.4723 0.6872
No log 12.4 496 0.4666 0.5235 0.4666 0.6831
No log 12.45 498 0.4778 0.5617 0.4778 0.6912
0.281 12.5 500 0.5350 0.5190 0.5350 0.7315
0.281 12.55 502 0.5587 0.4175 0.5587 0.7475
0.281 12.6 504 0.5170 0.4753 0.5170 0.7190
0.281 12.65 506 0.4794 0.4825 0.4794 0.6924
0.281 12.7 508 0.4751 0.4298 0.4751 0.6893
0.281 12.75 510 0.5082 0.5763 0.5082 0.7129
0.281 12.8 512 0.5029 0.5897 0.5029 0.7091
0.281 12.85 514 0.4672 0.5509 0.4672 0.6835
0.281 12.9 516 0.4813 0.5437 0.4813 0.6938
0.281 12.95 518 0.5067 0.5814 0.5067 0.7118
0.281 13.0 520 0.4923 0.5705 0.4923 0.7016
0.281 13.05 522 0.4626 0.4895 0.4626 0.6802
0.281 13.1 524 0.4543 0.5208 0.4543 0.6740
0.281 13.15 526 0.4623 0.5208 0.4623 0.6799
0.281 13.2 528 0.4612 0.5208 0.4612 0.6791
0.281 13.25 530 0.4611 0.5356 0.4611 0.6791
0.281 13.3 532 0.4593 0.5143 0.4593 0.6777
0.281 13.35 534 0.4857 0.5341 0.4857 0.6970
0.281 13.4 536 0.5225 0.4788 0.5225 0.7229
0.281 13.45 538 0.5507 0.4437 0.5507 0.7421
0.281 13.5 540 0.5325 0.4524 0.5325 0.7297
0.281 13.55 542 0.5086 0.4524 0.5086 0.7131
0.281 13.6 544 0.4975 0.4795 0.4975 0.7053

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.