ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4986
  • Qwk: 0.6406
  • Mse: 0.4986
  • Rmse: 0.7061
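Note that the reported validation Loss equals the Mse, which suggests the model was trained with a mean-squared-error objective (a regression-style head), and Rmse is simply the square root of Mse. A quick consistency check on the final numbers:

```python
import math

# Rmse in this card is the square root of Mse; verify against the
# final evaluation numbers reported above.
mse = 0.4986
rmse = math.sqrt(mse)
assert round(rmse, 4) == 0.7061  # matches the reported eval Rmse
```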

Model description

More information needed

Intended uses & limitations

More information needed
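Pending proper documentation, here is a hypothetical inference sketch. The card does not document the task head, label space, or preprocessing, so everything below is an assumption: it assumes a single-output regression head that scores essay organization, loaded via the standard transformers auto classes.

```python
# Hypothetical usage sketch -- head type and output semantics are assumptions,
# not documented by this model card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
            "FineTuningAraBERT_run3_AugV5_k3_task7_organization")

def score_essay(text: str, tokenizer, model) -> float:
    """Return the raw model output for one essay (assumed to be a regression score)."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().item()

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    print(score_essay("نص المقال هنا", tokenizer, model))
```

If the head is actually a discrete classifier over rubric bands, an `argmax` over the logits would replace the raw-score readout.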

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
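The schedule implied by these settings (lr_scheduler_type: linear, learning_rate 2e-05, no warmup listed) decays the learning rate linearly from the base value to zero over the total number of optimizer steps. A minimal sketch, with the step count inferred from the results table below (15 steps per epoch):

```python
# Linear decay with no warmup, as implied by the hyperparameters above.
def linear_lr(step, total_steps, base_lr=2e-05):
    """base_lr at step 0, decaying linearly to zero at total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# The results table logs 15 steps per epoch, so 100 epochs ~= 1500 steps
# (illustrative; training here stopped early at step 512).
total_steps = 1500
print(linear_lr(0, total_steps))    # base lr at the start
print(linear_lr(750, total_steps))  # half the base lr at the midpoint
```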

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1333 2 2.5872 -0.0646 2.5872 1.6085
No log 0.2667 4 1.0676 0.2374 1.0676 1.0332
No log 0.4 6 0.6452 0.0893 0.6452 0.8032
No log 0.5333 8 0.8733 0.3280 0.8733 0.9345
No log 0.6667 10 0.8828 0.3114 0.8828 0.9396
No log 0.8 12 0.7179 0.0 0.7179 0.8473
No log 0.9333 14 0.8414 0.1786 0.8414 0.9173
No log 1.0667 16 0.7967 0.1786 0.7967 0.8926
No log 1.2 18 0.6667 0.2087 0.6667 0.8165
No log 1.3333 20 0.6240 0.2285 0.6240 0.7899
No log 1.4667 22 0.6197 0.3099 0.6197 0.7872
No log 1.6 24 0.6511 0.3099 0.6511 0.8069
No log 1.7333 26 0.6016 0.3688 0.6016 0.7756
No log 1.8667 28 0.5727 0.4378 0.5727 0.7568
No log 2.0 30 0.5979 0.2540 0.5979 0.7732
No log 2.1333 32 0.6200 0.2471 0.6200 0.7874
No log 2.2667 34 0.6475 0.2857 0.6475 0.8047
No log 2.4 36 0.6231 0.3444 0.6231 0.7893
No log 2.5333 38 0.5776 0.4247 0.5776 0.7600
No log 2.6667 40 0.5607 0.4925 0.5607 0.7488
No log 2.8 42 0.5399 0.5544 0.5399 0.7348
No log 2.9333 44 0.4682 0.5801 0.4682 0.6842
No log 3.0667 46 0.4761 0.5214 0.4761 0.6900
No log 3.2 48 0.5308 0.5289 0.5308 0.7285
No log 3.3333 50 0.4600 0.5883 0.4600 0.6782
No log 3.4667 52 0.5571 0.5528 0.5571 0.7464
No log 3.6 54 0.8190 0.4418 0.8190 0.9050
No log 3.7333 56 0.7961 0.4562 0.7961 0.8922
No log 3.8667 58 0.4865 0.6901 0.4865 0.6975
No log 4.0 60 0.4960 0.6547 0.4960 0.7043
No log 4.1333 62 0.4840 0.6260 0.4840 0.6957
No log 4.2667 64 0.5050 0.6784 0.5050 0.7107
No log 4.4 66 0.5717 0.6009 0.5717 0.7561
No log 4.5333 68 0.6542 0.4789 0.6542 0.8088
No log 4.6667 70 0.6046 0.5328 0.6046 0.7775
No log 4.8 72 0.4775 0.6455 0.4775 0.6910
No log 4.9333 74 0.5553 0.5416 0.5553 0.7452
No log 5.0667 76 0.6275 0.5204 0.6275 0.7922
No log 5.2 78 0.5635 0.5831 0.5635 0.7506
No log 5.3333 80 0.5200 0.5597 0.5200 0.7211
No log 5.4667 82 0.5239 0.5306 0.5239 0.7238
No log 5.6 84 0.5582 0.5034 0.5582 0.7471
No log 5.7333 86 0.5425 0.5218 0.5425 0.7366
No log 5.8667 88 0.5256 0.5444 0.5256 0.7250
No log 6.0 90 0.5267 0.5961 0.5267 0.7257
No log 6.1333 92 0.5256 0.6307 0.5256 0.7250
No log 6.2667 94 0.5514 0.6003 0.5514 0.7425
No log 6.4 96 0.5398 0.6405 0.5398 0.7347
No log 6.5333 98 0.5035 0.6733 0.5035 0.7095
No log 6.6667 100 0.4997 0.6444 0.4997 0.7069
No log 6.8 102 0.5055 0.5919 0.5055 0.7110
No log 6.9333 104 0.4954 0.6574 0.4954 0.7038
No log 7.0667 106 0.4908 0.6599 0.4908 0.7006
No log 7.2 108 0.5320 0.5688 0.5320 0.7294
No log 7.3333 110 0.5853 0.5742 0.5853 0.7651
No log 7.4667 112 0.5117 0.6285 0.5117 0.7154
No log 7.6 114 0.4473 0.6243 0.4473 0.6688
No log 7.7333 116 0.4361 0.6617 0.4361 0.6604
No log 7.8667 118 0.4854 0.6594 0.4854 0.6967
No log 8.0 120 0.6472 0.5894 0.6472 0.8045
No log 8.1333 122 0.6541 0.5574 0.6541 0.8088
No log 8.2667 124 0.5118 0.6690 0.5118 0.7154
No log 8.4 126 0.4888 0.6682 0.4888 0.6992
No log 8.5333 128 0.4938 0.6494 0.4938 0.7027
No log 8.6667 130 0.5442 0.5983 0.5442 0.7377
No log 8.8 132 0.5983 0.5574 0.5983 0.7735
No log 8.9333 134 0.5372 0.6465 0.5372 0.7329
No log 9.0667 136 0.5157 0.5938 0.5157 0.7181
No log 9.2 138 0.5119 0.5987 0.5119 0.7155
No log 9.3333 140 0.5049 0.5753 0.5049 0.7106
No log 9.4667 142 0.5036 0.5784 0.5036 0.7096
No log 9.6 144 0.5117 0.6438 0.5117 0.7153
No log 9.7333 146 0.5492 0.6107 0.5492 0.7411
No log 9.8667 148 0.6097 0.6016 0.6097 0.7808
No log 10.0 150 0.5492 0.6022 0.5492 0.7411
No log 10.1333 152 0.5002 0.6472 0.5002 0.7072
No log 10.2667 154 0.4964 0.6197 0.4964 0.7045
No log 10.4 156 0.5066 0.5996 0.5066 0.7117
No log 10.5333 158 0.5240 0.6272 0.5240 0.7239
No log 10.6667 160 0.5146 0.5958 0.5146 0.7174
No log 10.8 162 0.5181 0.5889 0.5181 0.7198
No log 10.9333 164 0.5248 0.6207 0.5248 0.7244
No log 11.0667 166 0.5332 0.6207 0.5332 0.7302
No log 11.2 168 0.5238 0.6692 0.5238 0.7238
No log 11.3333 170 0.5046 0.6515 0.5046 0.7104
No log 11.4667 172 0.4986 0.6329 0.4986 0.7061
No log 11.6 174 0.4901 0.6172 0.4901 0.7000
No log 11.7333 176 0.4799 0.6667 0.4799 0.6927
No log 11.8667 178 0.4961 0.6096 0.4961 0.7044
No log 12.0 180 0.5171 0.5845 0.5171 0.7191
No log 12.1333 182 0.5293 0.5897 0.5293 0.7275
No log 12.2667 184 0.5733 0.5468 0.5733 0.7572
No log 12.4 186 0.5635 0.5601 0.5635 0.7507
No log 12.5333 188 0.4802 0.5974 0.4802 0.6930
No log 12.6667 190 0.4739 0.6292 0.4739 0.6884
No log 12.8 192 0.4816 0.6698 0.4816 0.6940
No log 12.9333 194 0.4624 0.6298 0.4624 0.6800
No log 13.0667 196 0.5257 0.6825 0.5257 0.7251
No log 13.2 198 0.6743 0.6127 0.6743 0.8212
No log 13.3333 200 0.6578 0.6127 0.6578 0.8110
No log 13.4667 202 0.5233 0.6684 0.5233 0.7234
No log 13.6 204 0.4727 0.6657 0.4727 0.6875
No log 13.7333 206 0.4810 0.6766 0.4810 0.6936
No log 13.8667 208 0.5014 0.6526 0.5014 0.7081
No log 14.0 210 0.5076 0.6295 0.5076 0.7125
No log 14.1333 212 0.5007 0.6357 0.5007 0.7076
No log 14.2667 214 0.4941 0.6515 0.4941 0.7029
No log 14.4 216 0.4922 0.5783 0.4922 0.7016
No log 14.5333 218 0.4969 0.5923 0.4969 0.7049
No log 14.6667 220 0.5110 0.6186 0.5110 0.7148
No log 14.8 222 0.5331 0.6174 0.5331 0.7301
No log 14.9333 224 0.5105 0.6398 0.5105 0.7145
No log 15.0667 226 0.5003 0.6173 0.5003 0.7073
No log 15.2 228 0.5016 0.6322 0.5016 0.7082
No log 15.3333 230 0.5122 0.6371 0.5122 0.7157
No log 15.4667 232 0.5584 0.6003 0.5584 0.7473
No log 15.6 234 0.5864 0.5954 0.5864 0.7658
No log 15.7333 236 0.5437 0.6080 0.5437 0.7374
No log 15.8667 238 0.5556 0.5861 0.5556 0.7454
No log 16.0 240 0.5609 0.5373 0.5609 0.7490
No log 16.1333 242 0.5379 0.5991 0.5379 0.7334
No log 16.2667 244 0.5187 0.5868 0.5187 0.7202
No log 16.4 246 0.5063 0.5965 0.5063 0.7115
No log 16.5333 248 0.5555 0.6045 0.5555 0.7453
No log 16.6667 250 0.6581 0.5204 0.6581 0.8113
No log 16.8 252 0.6821 0.5132 0.6821 0.8259
No log 16.9333 254 0.6394 0.5170 0.6394 0.7996
No log 17.0667 256 0.6045 0.5748 0.6045 0.7775
No log 17.2 258 0.5653 0.6279 0.5653 0.7519
No log 17.3333 260 0.5344 0.6612 0.5344 0.7311
No log 17.4667 262 0.5134 0.6286 0.5134 0.7166
No log 17.6 264 0.5122 0.5929 0.5122 0.7157
No log 17.7333 266 0.5017 0.5979 0.5017 0.7083
No log 17.8667 268 0.5232 0.6609 0.5232 0.7233
No log 18.0 270 0.5713 0.6356 0.5713 0.7559
No log 18.1333 272 0.5588 0.6674 0.5588 0.7475
No log 18.2667 274 0.5064 0.6566 0.5064 0.7116
No log 18.4 276 0.5273 0.6108 0.5273 0.7262
No log 18.5333 278 0.5486 0.5683 0.5486 0.7407
No log 18.6667 280 0.5026 0.6108 0.5026 0.7090
No log 18.8 282 0.4707 0.6088 0.4707 0.6861
No log 18.9333 284 0.4647 0.5866 0.4647 0.6817
No log 19.0667 286 0.4730 0.6452 0.4730 0.6878
No log 19.2 288 0.4675 0.6382 0.4675 0.6837
No log 19.3333 290 0.4708 0.6304 0.4708 0.6862
No log 19.4667 292 0.4925 0.6539 0.4925 0.7018
No log 19.6 294 0.5039 0.6608 0.5039 0.7098
No log 19.7333 296 0.4937 0.6611 0.4937 0.7026
No log 19.8667 298 0.4615 0.6136 0.4615 0.6793
No log 20.0 300 0.4774 0.5636 0.4774 0.6909
No log 20.1333 302 0.4995 0.5836 0.4995 0.7068
No log 20.2667 304 0.4716 0.5873 0.4716 0.6867
No log 20.4 306 0.4691 0.6479 0.4691 0.6849
No log 20.5333 308 0.5134 0.6608 0.5134 0.7165
No log 20.6667 310 0.5024 0.6539 0.5024 0.7088
No log 20.8 312 0.4686 0.6876 0.4686 0.6845
No log 20.9333 314 0.4833 0.6308 0.4833 0.6952
No log 21.0667 316 0.4682 0.6207 0.4682 0.6843
No log 21.2 318 0.4438 0.6745 0.4438 0.6662
No log 21.3333 320 0.4371 0.6388 0.4371 0.6611
No log 21.4667 322 0.4509 0.6787 0.4509 0.6715
No log 21.6 324 0.4663 0.6519 0.4663 0.6829
No log 21.7333 326 0.4708 0.6786 0.4708 0.6861
No log 21.8667 328 0.4420 0.6712 0.4420 0.6648
No log 22.0 330 0.4410 0.6483 0.4410 0.6641
No log 22.1333 332 0.4600 0.6071 0.4600 0.6782
No log 22.2667 334 0.4684 0.5987 0.4684 0.6844
No log 22.4 336 0.4755 0.6336 0.4755 0.6896
No log 22.5333 338 0.4890 0.6467 0.4890 0.6993
No log 22.6667 340 0.5075 0.6693 0.5075 0.7124
No log 22.8 342 0.5415 0.6611 0.5415 0.7358
No log 22.9333 344 0.5334 0.6684 0.5334 0.7303
No log 23.0667 346 0.4739 0.6293 0.4739 0.6884
No log 23.2 348 0.4392 0.6377 0.4392 0.6627
No log 23.3333 350 0.4471 0.6322 0.4471 0.6686
No log 23.4667 352 0.4726 0.6449 0.4726 0.6875
No log 23.6 354 0.4764 0.6207 0.4764 0.6903
No log 23.7333 356 0.4718 0.6526 0.4718 0.6869
No log 23.8667 358 0.4652 0.5904 0.4652 0.6821
No log 24.0 360 0.4601 0.5634 0.4601 0.6783
No log 24.1333 362 0.4614 0.5810 0.4614 0.6792
No log 24.2667 364 0.4603 0.5878 0.4603 0.6785
No log 24.4 366 0.4580 0.6059 0.4580 0.6767
No log 24.5333 368 0.4744 0.6603 0.4744 0.6887
No log 24.6667 370 0.4879 0.6218 0.4879 0.6985
No log 24.8 372 0.4822 0.6696 0.4822 0.6944
No log 24.9333 374 0.4743 0.6403 0.4743 0.6887
No log 25.0667 376 0.4732 0.6549 0.4732 0.6879
No log 25.2 378 0.4603 0.7196 0.4603 0.6784
No log 25.3333 380 0.4580 0.7073 0.4580 0.6767
No log 25.4667 382 0.4567 0.6730 0.4567 0.6758
No log 25.6 384 0.4531 0.6818 0.4531 0.6731
No log 25.7333 386 0.4488 0.6563 0.4488 0.6699
No log 25.8667 388 0.4510 0.6307 0.4510 0.6715
No log 26.0 390 0.4552 0.6121 0.4552 0.6747
No log 26.1333 392 0.4538 0.6121 0.4538 0.6737
No log 26.2667 394 0.4554 0.5939 0.4554 0.6748
No log 26.4 396 0.4700 0.5578 0.4700 0.6855
No log 26.5333 398 0.4839 0.5953 0.4839 0.6956
No log 26.6667 400 0.4857 0.6135 0.4857 0.6969
No log 26.8 402 0.4752 0.6194 0.4752 0.6894
No log 26.9333 404 0.4544 0.5422 0.4544 0.6741
No log 27.0667 406 0.4324 0.6210 0.4324 0.6576
No log 27.2 408 0.4485 0.7053 0.4485 0.6697
No log 27.3333 410 0.4754 0.7122 0.4754 0.6895
No log 27.4667 412 0.5150 0.6226 0.5150 0.7176
No log 27.6 414 0.5062 0.6404 0.5062 0.7115
No log 27.7333 416 0.4871 0.6590 0.4871 0.6979
No log 27.8667 418 0.4612 0.6359 0.4612 0.6791
No log 28.0 420 0.4554 0.5611 0.4554 0.6748
No log 28.1333 422 0.4605 0.5826 0.4605 0.6786
No log 28.2667 424 0.4617 0.5781 0.4617 0.6795
No log 28.4 426 0.4647 0.5809 0.4647 0.6817
No log 28.5333 428 0.4591 0.5994 0.4591 0.6776
No log 28.6667 430 0.4518 0.6827 0.4518 0.6722
No log 28.8 432 0.4426 0.6197 0.4426 0.6653
No log 28.9333 434 0.4370 0.6053 0.4370 0.6610
No log 29.0667 436 0.4327 0.6053 0.4327 0.6578
No log 29.2 438 0.4365 0.6435 0.4365 0.6607
No log 29.3333 440 0.4550 0.6721 0.4550 0.6745
No log 29.4667 442 0.4871 0.5882 0.4871 0.6979
No log 29.6 444 0.4914 0.5965 0.4914 0.7010
No log 29.7333 446 0.4733 0.6289 0.4733 0.6879
No log 29.8667 448 0.4467 0.6200 0.4467 0.6684
No log 30.0 450 0.4400 0.6448 0.4400 0.6633
No log 30.1333 452 0.4400 0.6555 0.4400 0.6633
No log 30.2667 454 0.4460 0.7297 0.4460 0.6678
No log 30.4 456 0.4666 0.6709 0.4666 0.6831
No log 30.5333 458 0.5084 0.5758 0.5084 0.7131
No log 30.6667 460 0.5072 0.6249 0.5072 0.7122
No log 30.8 462 0.4801 0.6698 0.4801 0.6929
No log 30.9333 464 0.4671 0.6643 0.4671 0.6835
No log 31.0667 466 0.4623 0.6643 0.4623 0.6800
No log 31.2 468 0.4700 0.6797 0.4700 0.6856
No log 31.3333 470 0.5040 0.6680 0.5040 0.7099
No log 31.4667 472 0.4991 0.6752 0.4991 0.7065
No log 31.6 474 0.4651 0.6938 0.4651 0.6820
No log 31.7333 476 0.4387 0.6806 0.4387 0.6624
No log 31.8667 478 0.4380 0.6492 0.4380 0.6618
No log 32.0 480 0.4365 0.6301 0.4365 0.6607
No log 32.1333 482 0.4375 0.6839 0.4375 0.6614
No log 32.2667 484 0.4456 0.6706 0.4456 0.6675
No log 32.4 486 0.4629 0.6944 0.4629 0.6804
No log 32.5333 488 0.4725 0.6787 0.4725 0.6874
No log 32.6667 490 0.4716 0.6417 0.4716 0.6867
No log 32.8 492 0.4797 0.5769 0.4797 0.6926
No log 32.9333 494 0.4716 0.5795 0.4716 0.6867
No log 33.0667 496 0.4630 0.4918 0.4630 0.6805
No log 33.2 498 0.4542 0.5352 0.4542 0.6740
0.2489 33.3333 500 0.4437 0.6467 0.4437 0.6661
0.2489 33.4667 502 0.4514 0.7287 0.4514 0.6719
0.2489 33.6 504 0.5000 0.6337 0.5000 0.7071
0.2489 33.7333 506 0.5664 0.4966 0.5664 0.7526
0.2489 33.8667 508 0.5802 0.5354 0.5802 0.7617
0.2489 34.0 510 0.5434 0.5591 0.5434 0.7372
0.2489 34.1333 512 0.4986 0.6406 0.4986 0.7061
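Qwk in the tables above is quadratic weighted kappa, which measures agreement between predicted and reference ordinal scores while penalizing large disagreements more heavily. A minimal pure-Python sketch of the standard definition (not the evaluation script actually used here):

```python
def quadratic_weighted_kappa(y_true, y_pred):
    """Quadratic weighted kappa for integer ratings (standard definition)."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n = hi - lo + 1
    total = len(y_true)
    # Observed agreement matrix
    O = [[0.0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        O[t - lo][p - lo] += 1
    # Marginal histograms; expected matrix assumes independent raters
    hist_t = [sum(O[i]) for i in range(n)]
    hist_p = [sum(O[i][j] for i in range(n)) for j in range(n)]
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2)  # quadratic disagreement weight
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / total
    return 1.0 - num / den

print(quadratic_weighted_kappa([0, 1, 2, 0], [0, 1, 2, 0]))  # perfect agreement -> 1.0
```

A kappa of 1.0 is perfect agreement, 0.0 is chance-level, and negative values indicate systematic disagreement; the final eval Qwk of 0.6406 indicates moderate-to-substantial agreement.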

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task7_organization
