ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.4898
  • QWK: 0.5317
  • MSE: 0.4898
  • RMSE: 0.6998
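QWK here is quadratic weighted kappa (Cohen's kappa with quadratic weights), the standard agreement metric for essay scoring, and RMSE is the square root of MSE. Note that the Loss and MSE columns are identical in every row, which suggests the model is trained with a plain MSE regression loss. A minimal pure-Python sketch of these metrics (the helper names are illustrative, not taken from the training code):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(rater_a, rater_b):
    """Cohen's kappa with quadratic weights between two integer rating lists."""
    assert len(rater_a) == len(rater_b)
    min_r = min(min(rater_a), min(rater_b))
    max_r = max(max(rater_a), max(rater_b))
    n_ratings = max_r - min_r + 1
    n = len(rater_a)
    # Observed confusion matrix over the rating scale.
    conf = [[0] * n_ratings for _ in range(n_ratings)]
    for a, b in zip(rater_a, rater_b):
        conf[a - min_r][b - min_r] += 1
    # Marginal histograms give the expected (chance) agreement.
    hist_a = Counter(a - min_r for a in rater_a)
    hist_b = Counter(b - min_r for b in rater_b)
    numerator = 0.0
    denominator = 0.0
    for i in range(n_ratings):
        for j in range(n_ratings):
            w = ((i - j) ** 2) / ((n_ratings - 1) ** 2)  # quadratic weight
            expected = hist_a[i] * hist_b[j] / n
            numerator += w * conf[i][j]
            denominator += w * expected
    return 1.0 - numerator / denominator

def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error: just the square root of MSE."""
    return math.sqrt(mse(y_true, y_pred))
```

For example, `rmse` applied to the reported evaluation MSE of 0.4898 gives about 0.6999, matching the reported RMSE of 0.6998 up to rounding of the MSE itself.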

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
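The linear scheduler decays the learning rate from 2e-05 to zero over the course of training. A pure-Python sketch of that schedule, mirroring the formula used by transformers' `get_linear_schedule_with_warmup` (the `warmup_steps` parameter is included for generality; this card does not report a warmup setting, so it defaults to 0 here):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule.

    Linearly ramps up over warmup_steps (if any), then decays linearly
    from base_lr to 0 at total_steps.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

With no warmup, the rate is simply `base_lr * (1 - step / total_steps)`: at the halfway point of training it has fallen to 1e-05, and it reaches zero at the final step.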

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0444 2 2.4974 0.0052 2.4974 1.5803
No log 0.0889 4 1.1075 0.1262 1.1075 1.0524
No log 0.1333 6 0.7284 0.0937 0.7284 0.8535
No log 0.1778 8 0.7190 0.3127 0.7190 0.8480
No log 0.2222 10 0.6075 0.3494 0.6075 0.7794
No log 0.2667 12 0.6213 0.3435 0.6213 0.7882
No log 0.3111 14 0.6089 0.3746 0.6089 0.7803
No log 0.3556 16 0.5116 0.4494 0.5116 0.7153
No log 0.4 18 0.5053 0.5608 0.5053 0.7109
No log 0.4444 20 0.6118 0.5262 0.6118 0.7822
No log 0.4889 22 0.4658 0.5591 0.4658 0.6825
No log 0.5333 24 0.4505 0.5665 0.4505 0.6712
No log 0.5778 26 0.4671 0.5918 0.4671 0.6835
No log 0.6222 28 0.6404 0.4939 0.6404 0.8003
No log 0.6667 30 0.6649 0.5146 0.6649 0.8154
No log 0.7111 32 0.5097 0.5471 0.5097 0.7139
No log 0.7556 34 0.4552 0.6434 0.4552 0.6746
No log 0.8 36 0.4397 0.6753 0.4397 0.6631
No log 0.8444 38 0.5269 0.5298 0.5269 0.7259
No log 0.8889 40 0.4855 0.5826 0.4855 0.6967
No log 0.9333 42 0.4002 0.6229 0.4002 0.6326
No log 0.9778 44 0.3879 0.6321 0.3879 0.6228
No log 1.0222 46 0.5969 0.5069 0.5969 0.7726
No log 1.0667 48 0.9484 0.3567 0.9484 0.9738
No log 1.1111 50 0.7802 0.4447 0.7802 0.8833
No log 1.1556 52 0.4355 0.6688 0.4355 0.6599
No log 1.2 54 0.4170 0.6730 0.4170 0.6458
No log 1.2444 56 0.4512 0.6411 0.4512 0.6717
No log 1.2889 58 0.6627 0.5706 0.6627 0.8141
No log 1.3333 60 0.6207 0.5575 0.6207 0.7878
No log 1.3778 62 0.4879 0.5736 0.4879 0.6985
No log 1.4222 64 0.5329 0.6404 0.5329 0.7300
No log 1.4667 66 0.6318 0.5815 0.6318 0.7949
No log 1.5111 68 0.5628 0.5915 0.5628 0.7502
No log 1.5556 70 0.4320 0.6966 0.4320 0.6572
No log 1.6 72 0.5460 0.6074 0.5460 0.7389
No log 1.6444 74 0.7077 0.5706 0.7077 0.8412
No log 1.6889 76 0.6660 0.5961 0.6660 0.8161
No log 1.7333 78 0.4722 0.5970 0.4722 0.6872
No log 1.7778 80 0.5050 0.6241 0.5050 0.7107
No log 1.8222 82 0.8333 0.5903 0.8333 0.9128
No log 1.8667 84 0.7741 0.6277 0.7741 0.8798
No log 1.9111 86 0.5412 0.6945 0.5412 0.7357
No log 1.9556 88 0.7361 0.4799 0.7361 0.8579
No log 2.0 90 0.8419 0.4764 0.8419 0.9175
No log 2.0444 92 0.6971 0.5281 0.6971 0.8349
No log 2.0889 94 0.4814 0.6526 0.4814 0.6939
No log 2.1333 96 0.4238 0.6360 0.4238 0.6510
No log 2.1778 98 0.4133 0.6243 0.4133 0.6429
No log 2.2222 100 0.4907 0.5171 0.4907 0.7005
No log 2.2667 102 0.6141 0.5021 0.6141 0.7836
No log 2.3111 104 0.5432 0.5051 0.5432 0.7370
No log 2.3556 106 0.4234 0.6168 0.4234 0.6507
No log 2.4 108 0.4352 0.6434 0.4352 0.6597
No log 2.4444 110 0.4704 0.6675 0.4704 0.6858
No log 2.4889 112 0.4255 0.6648 0.4255 0.6523
No log 2.5333 114 0.4688 0.6623 0.4688 0.6847
No log 2.5778 116 0.6119 0.5281 0.6119 0.7822
No log 2.6222 118 0.8301 0.4729 0.8301 0.9111
No log 2.6667 120 0.8707 0.4779 0.8707 0.9331
No log 2.7111 122 0.6363 0.5504 0.6363 0.7977
No log 2.7556 124 0.4202 0.5010 0.4202 0.6482
No log 2.8 126 0.4763 0.5622 0.4763 0.6902
No log 2.8444 128 0.5255 0.5678 0.5255 0.7249
No log 2.8889 130 0.4253 0.7122 0.4253 0.6521
No log 2.9333 132 0.4505 0.6677 0.4505 0.6712
No log 2.9778 134 0.4719 0.6368 0.4719 0.6869
No log 3.0222 136 0.4196 0.6222 0.4196 0.6478
No log 3.0667 138 0.4544 0.5978 0.4544 0.6741
No log 3.1111 140 0.5003 0.5774 0.5003 0.7073
No log 3.1556 142 0.4476 0.6309 0.4476 0.6690
No log 3.2 144 0.4193 0.6229 0.4193 0.6475
No log 3.2444 146 0.4255 0.5782 0.4255 0.6523
No log 3.2889 148 0.4171 0.5782 0.4171 0.6459
No log 3.3333 150 0.4572 0.6118 0.4572 0.6762
No log 3.3778 152 0.4716 0.5802 0.4716 0.6867
No log 3.4222 154 0.4086 0.5681 0.4086 0.6392
No log 3.4667 156 0.4258 0.5571 0.4258 0.6525
No log 3.5111 158 0.4813 0.5869 0.4813 0.6938
No log 3.5556 160 0.4939 0.5657 0.4939 0.7028
No log 3.6 162 0.4435 0.5405 0.4435 0.6659
No log 3.6444 164 0.4508 0.5623 0.4508 0.6714
No log 3.6889 166 0.4471 0.5623 0.4471 0.6686
No log 3.7333 168 0.4358 0.5846 0.4358 0.6602
No log 3.7778 170 0.4589 0.5904 0.4589 0.6774
No log 3.8222 172 0.5628 0.6042 0.5628 0.7502
No log 3.8667 174 0.5913 0.6042 0.5913 0.7690
No log 3.9111 176 0.4877 0.5869 0.4877 0.6983
No log 3.9556 178 0.4181 0.6641 0.4181 0.6466
No log 4.0 180 0.4343 0.6201 0.4343 0.6590
No log 4.0444 182 0.4338 0.6615 0.4338 0.6587
No log 4.0889 184 0.4441 0.6122 0.4441 0.6664
No log 4.1333 186 0.4954 0.4528 0.4954 0.7039
No log 4.1778 188 0.4889 0.4979 0.4889 0.6992
No log 4.2222 190 0.4522 0.5812 0.4522 0.6724
No log 4.2667 192 0.4634 0.5937 0.4634 0.6807
No log 4.3111 194 0.4778 0.6418 0.4778 0.6912
No log 4.3556 196 0.4433 0.5913 0.4433 0.6658
No log 4.4 198 0.4248 0.5421 0.4248 0.6518
No log 4.4444 200 0.4126 0.5563 0.4126 0.6423
No log 4.4889 202 0.4046 0.6818 0.4046 0.6360
No log 4.5333 204 0.4232 0.7002 0.4232 0.6506
No log 4.5778 206 0.4101 0.6832 0.4101 0.6404
No log 4.6222 208 0.4081 0.7002 0.4081 0.6388
No log 4.6667 210 0.4149 0.5826 0.4149 0.6441
No log 4.7111 212 0.4820 0.6303 0.4820 0.6942
No log 4.7556 214 0.5199 0.6311 0.5199 0.7210
No log 4.8 216 0.5115 0.6389 0.5115 0.7152
No log 4.8444 218 0.4536 0.5872 0.4536 0.6735
No log 4.8889 220 0.4449 0.6719 0.4449 0.6670
No log 4.9333 222 0.4500 0.6518 0.4500 0.6708
No log 4.9778 224 0.4572 0.6418 0.4572 0.6762
No log 5.0222 226 0.5255 0.5131 0.5255 0.7249
No log 5.0667 228 0.5446 0.5027 0.5446 0.7380
No log 5.1111 230 0.4950 0.5560 0.4950 0.7036
No log 5.1556 232 0.4455 0.6001 0.4455 0.6675
No log 5.2 234 0.5357 0.4713 0.5357 0.7319
No log 5.2444 236 0.5836 0.5558 0.5836 0.7639
No log 5.2889 238 0.4992 0.4868 0.4992 0.7066
No log 5.3333 240 0.4388 0.5414 0.4388 0.6624
No log 5.3778 242 0.4420 0.6251 0.4420 0.6648
No log 5.4222 244 0.4439 0.6261 0.4439 0.6662
No log 5.4667 246 0.4314 0.6365 0.4314 0.6568
No log 5.5111 248 0.4360 0.5710 0.4360 0.6603
No log 5.5556 250 0.4554 0.5569 0.4554 0.6748
No log 5.6 252 0.4640 0.5584 0.4640 0.6812
No log 5.6444 254 0.4253 0.5414 0.4253 0.6522
No log 5.6889 256 0.4513 0.6808 0.4513 0.6718
No log 5.7333 258 0.4882 0.6973 0.4882 0.6987
No log 5.7778 260 0.4516 0.6820 0.4516 0.6720
No log 5.8222 262 0.4154 0.6229 0.4154 0.6445
No log 5.8667 264 0.4396 0.5855 0.4396 0.6630
No log 5.9111 266 0.4624 0.6642 0.4624 0.6800
No log 5.9556 268 0.4444 0.6260 0.4444 0.6666
No log 6.0 270 0.4386 0.6342 0.4386 0.6623
No log 6.0444 272 0.4044 0.6053 0.4044 0.6360
No log 6.0889 274 0.3996 0.6542 0.3996 0.6321
No log 6.1333 276 0.4086 0.6068 0.4086 0.6392
No log 6.1778 278 0.4200 0.5877 0.4200 0.6481
No log 6.2222 280 0.4440 0.5718 0.4440 0.6664
No log 6.2667 282 0.4633 0.5438 0.4633 0.6806
No log 6.3111 284 0.4373 0.5877 0.4373 0.6613
No log 6.3556 286 0.4303 0.5698 0.4303 0.6559
No log 6.4 288 0.4359 0.5698 0.4359 0.6602
No log 6.4444 290 0.4442 0.5574 0.4442 0.6665
No log 6.4889 292 0.4492 0.5574 0.4492 0.6703
No log 6.5333 294 0.4533 0.5574 0.4533 0.6733
No log 6.5778 296 0.4592 0.5208 0.4592 0.6776
No log 6.6222 298 0.4543 0.5208 0.4543 0.6740
No log 6.6667 300 0.4456 0.5493 0.4456 0.6676
No log 6.7111 302 0.4887 0.6252 0.4887 0.6991
No log 6.7556 304 0.5272 0.5991 0.5272 0.7261
No log 6.8 306 0.5126 0.5970 0.5126 0.7159
No log 6.8444 308 0.4597 0.5831 0.4597 0.6780
No log 6.8889 310 0.4423 0.5432 0.4423 0.6650
No log 6.9333 312 0.4461 0.5432 0.4461 0.6679
No log 6.9778 314 0.4373 0.5960 0.4373 0.6613
No log 7.0222 316 0.4430 0.6032 0.4430 0.6656
No log 7.0667 318 0.4424 0.5945 0.4424 0.6651
No log 7.1111 320 0.4491 0.5796 0.4491 0.6701
No log 7.1556 322 0.4741 0.5283 0.4741 0.6886
No log 7.2 324 0.4955 0.5300 0.4955 0.7039
No log 7.2444 326 0.4700 0.5796 0.4700 0.6856
No log 7.2889 328 0.4680 0.5815 0.4680 0.6841
No log 7.3333 330 0.4813 0.5265 0.4813 0.6938
No log 7.3778 332 0.5005 0.4788 0.5005 0.7075
No log 7.4222 334 0.4982 0.4788 0.4982 0.7058
No log 7.4667 336 0.4863 0.4767 0.4863 0.6974
No log 7.5111 338 0.4668 0.4938 0.4668 0.6832
No log 7.5556 340 0.4559 0.5493 0.4559 0.6752
No log 7.6 342 0.4659 0.6078 0.4659 0.6825
No log 7.6444 344 0.4604 0.6060 0.4604 0.6785
No log 7.6889 346 0.4574 0.5798 0.4574 0.6763
No log 7.7333 348 0.4570 0.5010 0.4570 0.6760
No log 7.7778 350 0.4724 0.4938 0.4724 0.6873
No log 7.8222 352 0.4778 0.4938 0.4778 0.6913
No log 7.8667 354 0.4576 0.4938 0.4576 0.6765
No log 7.9111 356 0.4518 0.6017 0.4518 0.6722
No log 7.9556 358 0.4511 0.6017 0.4511 0.6717
No log 8.0 360 0.4437 0.6017 0.4437 0.6661
No log 8.0444 362 0.4453 0.5846 0.4453 0.6673
No log 8.0889 364 0.4509 0.5877 0.4509 0.6715
No log 8.1333 366 0.4460 0.5493 0.4460 0.6678
No log 8.1778 368 0.4406 0.5658 0.4406 0.6638
No log 8.2222 370 0.4338 0.6106 0.4338 0.6586
No log 8.2667 372 0.4326 0.6733 0.4326 0.6577
No log 8.3111 374 0.4309 0.5930 0.4309 0.6564
No log 8.3556 376 0.4792 0.5657 0.4792 0.6923
No log 8.4 378 0.5638 0.5897 0.5638 0.7509
No log 8.4444 380 0.5637 0.6100 0.5637 0.7508
No log 8.4889 382 0.5069 0.5883 0.5069 0.7120
No log 8.5333 384 0.4313 0.5438 0.4313 0.6567
No log 8.5778 386 0.4230 0.6275 0.4230 0.6504
No log 8.6222 388 0.4312 0.6797 0.4312 0.6567
No log 8.6667 390 0.4162 0.6265 0.4162 0.6452
No log 8.7111 392 0.4151 0.6370 0.4151 0.6443
No log 8.7556 394 0.4294 0.5528 0.4294 0.6553
No log 8.8 396 0.4362 0.5682 0.4362 0.6605
No log 8.8444 398 0.4500 0.5528 0.4500 0.6708
No log 8.8889 400 0.4392 0.5438 0.4392 0.6628
No log 8.9333 402 0.4385 0.5841 0.4385 0.6622
No log 8.9778 404 0.4446 0.5841 0.4446 0.6668
No log 9.0222 406 0.4426 0.5796 0.4426 0.6653
No log 9.0667 408 0.4440 0.5493 0.4440 0.6663
No log 9.1111 410 0.4297 0.5475 0.4297 0.6555
No log 9.1556 412 0.4318 0.5475 0.4318 0.6571
No log 9.2 414 0.4336 0.5941 0.4336 0.6585
No log 9.2444 416 0.4385 0.5941 0.4385 0.6622
No log 9.2889 418 0.4328 0.5734 0.4328 0.6579
No log 9.3333 420 0.4271 0.5875 0.4271 0.6535
No log 9.3778 422 0.4248 0.5985 0.4248 0.6518
No log 9.4222 424 0.4250 0.5666 0.4250 0.6519
No log 9.4667 426 0.4257 0.5571 0.4257 0.6525
No log 9.5111 428 0.4283 0.5698 0.4283 0.6545
No log 9.5556 430 0.4299 0.5930 0.4299 0.6557
No log 9.6 432 0.4449 0.5356 0.4449 0.6670
No log 9.6444 434 0.4747 0.5135 0.4747 0.6890
No log 9.6889 436 0.4856 0.5135 0.4856 0.6968
No log 9.7333 438 0.5027 0.5466 0.5027 0.7090
No log 9.7778 440 0.4913 0.5373 0.4913 0.7010
No log 9.8222 442 0.4635 0.5373 0.4635 0.6808
No log 9.8667 444 0.4534 0.5135 0.4534 0.6733
No log 9.9111 446 0.4587 0.5208 0.4587 0.6773
No log 9.9556 448 0.4596 0.5030 0.4596 0.6779
No log 10.0 450 0.4592 0.5246 0.4592 0.6777
No log 10.0444 452 0.4564 0.5265 0.4564 0.6755
No log 10.0889 454 0.4476 0.5117 0.4476 0.6690
No log 10.1333 456 0.4453 0.5339 0.4453 0.6673
No log 10.1778 458 0.4448 0.6171 0.4448 0.6669
No log 10.2222 460 0.4410 0.5611 0.4410 0.6641
No log 10.2667 462 0.4483 0.5733 0.4483 0.6696
No log 10.3111 464 0.4538 0.5733 0.4538 0.6736
No log 10.3556 466 0.4448 0.6223 0.4448 0.6669
No log 10.4 468 0.4385 0.6313 0.4385 0.6622
No log 10.4444 470 0.4431 0.6313 0.4431 0.6656
No log 10.4889 472 0.4552 0.6503 0.4552 0.6747
No log 10.5333 474 0.4490 0.6503 0.4490 0.6700
No log 10.5778 476 0.4628 0.6223 0.4628 0.6803
No log 10.6222 478 0.4611 0.6013 0.4611 0.6790
No log 10.6667 480 0.4875 0.5300 0.4875 0.6982
No log 10.7111 482 0.4827 0.5300 0.4827 0.6948
No log 10.7556 484 0.4579 0.6096 0.4579 0.6767
No log 10.8 486 0.4410 0.6073 0.4410 0.6641
No log 10.8444 488 0.4437 0.6073 0.4437 0.6661
No log 10.8889 490 0.4639 0.5587 0.4639 0.6811
No log 10.9333 492 0.5051 0.5619 0.5051 0.7107
No log 10.9778 494 0.5001 0.5390 0.5001 0.7072
No log 11.0222 496 0.4633 0.5189 0.4633 0.6807
No log 11.0667 498 0.4547 0.5010 0.4547 0.6743
0.2741 11.1111 500 0.4509 0.5556 0.4509 0.6715
0.2741 11.1556 502 0.4480 0.5475 0.4480 0.6693
0.2741 11.2 504 0.4555 0.5283 0.4555 0.6749
0.2741 11.2444 506 0.4723 0.5390 0.4723 0.6872
0.2741 11.2889 508 0.4971 0.5317 0.4971 0.7050
0.2741 11.3333 510 0.4898 0.5317 0.4898 0.6998

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
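To reproduce this environment, the listed versions can be pinned as follows (a sketch: the `+cu118` PyTorch build assumes a Linux x86_64 machine with CUDA 11.8; adjust the index URL for other platforms):

```shell
# Pin the framework versions listed above.
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
# The CUDA 11.8 PyTorch wheel comes from the dedicated PyTorch index.
pip install "torch==2.4.0+cu118" --index-url https://download.pytorch.org/whl/cu118
```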

Model size

  • Parameters: 0.1B
  • Tensor type: F32
  • Format: Safetensors

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02