ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4622
  • QWK (quadratic weighted kappa): 0.4585
  • MSE: 0.4622
  • RMSE: 0.6798
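Here QWK is the quadratic weighted kappa between predicted and gold scores, and RMSE is the square root of the MSE (√0.4622 ≈ 0.6798). A minimal, dependency-free sketch of both metrics (function names are illustrative, not taken from the training code; in practice `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` computes QWK):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred):
    """Agreement between two integer ratings, penalizing disagreement
    by the squared distance between categories."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n = hi - lo + 1
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - lo][p - lo] += 1
    hist_t = Counter(t - lo for t in y_true)
    hist_p = Counter(p - lo for p in y_pred)
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2 if n > 1 else 0.0
            num += w * observed[i][j]                      # weighted observed
            den += w * hist_t[i] * hist_p[j] / total      # weighted expected
    return 1.0 - num / den if den else 1.0

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

QWK is the standard metric for ordinal essay-scoring tasks like this one: it rewards predictions that land close to the true score rather than requiring exact matches.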

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
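These settings map directly onto `transformers.TrainingArguments` (learning_rate=2e-05, per_device_train_batch_size=8, seed=42, lr_scheduler_type="linear", num_train_epochs=100). The linear scheduler decays the learning rate from 2e-05 toward zero over the planned training steps; a dependency-free sketch, assuming no warmup (the card does not list warmup steps), with the step count inferred from the log below (12 optimizer steps per epoch × 100 epochs):

```python
BASE_LR = 2e-05          # learning_rate from the card
TOTAL_STEPS = 12 * 100   # steps per epoch (from the training log) * num_epochs

def linear_lr(step, total_steps=TOTAL_STEPS, base_lr=BASE_LR, warmup=0):
    """Linear LR schedule: optional linear warmup, then linear decay to 0."""
    if warmup and step < warmup:
        return base_lr * step / warmup
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup))
```

Note that the log below stops at step 518 (epoch ≈ 43), well short of the scheduled 100 epochs, so the run appears to have ended before the schedule completed.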

Training results

Evaluation was run every 2 steps; "No log" in the first column means the running training loss had not yet been reported (its first logged value, 0.2543, appears at step 500).

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.1667 2 2.6453 -0.0262 2.6453 1.6264
No log 0.3333 4 1.2157 0.0726 1.2157 1.1026
No log 0.5 6 0.9296 -0.0660 0.9296 0.9641
No log 0.6667 8 0.8656 0.1815 0.8656 0.9304
No log 0.8333 10 0.6652 0.2783 0.6652 0.8156
No log 1.0 12 0.6556 0.2498 0.6556 0.8097
No log 1.1667 14 0.5973 0.4494 0.5973 0.7729
No log 1.3333 16 0.5980 0.4698 0.5980 0.7733
No log 1.5 18 0.5847 0.4881 0.5847 0.7646
No log 1.6667 20 0.5772 0.5398 0.5772 0.7597
No log 1.8333 22 0.6048 0.5262 0.6048 0.7777
No log 2.0 24 1.1644 0.2169 1.1644 1.0791
No log 2.1667 26 1.2713 0.1670 1.2713 1.1275
No log 2.3333 28 1.0665 0.2348 1.0665 1.0327
No log 2.5 30 0.7311 0.5232 0.7311 0.8550
No log 2.6667 32 0.5563 0.5907 0.5563 0.7459
No log 2.8333 34 0.5208 0.4375 0.5208 0.7216
No log 3.0 36 0.6763 0.4930 0.6763 0.8224
No log 3.1667 38 0.5506 0.4473 0.5506 0.7421
No log 3.3333 40 0.5137 0.4147 0.5137 0.7167
No log 3.5 42 0.5191 0.5171 0.5191 0.7205
No log 3.6667 44 0.5894 0.4633 0.5894 0.7677
No log 3.8333 46 0.6436 0.5184 0.6436 0.8022
No log 4.0 48 0.7869 0.4250 0.7869 0.8871
No log 4.1667 50 0.6076 0.4654 0.6076 0.7795
No log 4.3333 52 0.5584 0.4848 0.5584 0.7473
No log 4.5 54 0.8347 0.3928 0.8347 0.9136
No log 4.6667 56 0.7997 0.4161 0.7997 0.8942
No log 4.8333 58 0.5424 0.4991 0.5424 0.7365
No log 5.0 60 0.6020 0.4087 0.6020 0.7759
No log 5.1667 62 0.5437 0.5625 0.5437 0.7374
No log 5.3333 64 0.8220 0.2810 0.8220 0.9067
No log 5.5 66 1.0019 0.2659 1.0019 1.0010
No log 5.6667 68 0.7099 0.4008 0.7099 0.8425
No log 5.8333 70 0.5306 0.5248 0.5306 0.7284
No log 6.0 72 0.6811 0.5129 0.6811 0.8253
No log 6.1667 74 0.5825 0.5779 0.5825 0.7632
No log 6.3333 76 0.5087 0.5647 0.5087 0.7132
No log 6.5 78 0.5115 0.5647 0.5115 0.7152
No log 6.6667 80 0.5216 0.5831 0.5216 0.7222
No log 6.8333 82 0.5201 0.5831 0.5201 0.7212
No log 7.0 84 0.4936 0.5753 0.4936 0.7026
No log 7.1667 86 0.5412 0.5483 0.5412 0.7357
No log 7.3333 88 0.6694 0.4427 0.6694 0.8181
No log 7.5 90 0.5364 0.5621 0.5364 0.7324
No log 7.6667 92 0.5099 0.6038 0.5099 0.7141
No log 7.8333 94 0.5135 0.6038 0.5135 0.7166
No log 8.0 96 0.4829 0.6589 0.4829 0.6949
No log 8.1667 98 0.7746 0.4552 0.7746 0.8801
No log 8.3333 100 0.9075 0.4250 0.9075 0.9526
No log 8.5 102 0.6439 0.5428 0.6439 0.8025
No log 8.6667 104 0.4844 0.5888 0.4844 0.6960
No log 8.8333 106 0.6234 0.4284 0.6234 0.7896
No log 9.0 108 0.6089 0.4353 0.6089 0.7803
No log 9.1667 110 0.4869 0.5888 0.4869 0.6978
No log 9.3333 112 0.5174 0.6053 0.5174 0.7193
No log 9.5 114 0.5412 0.6337 0.5412 0.7356
No log 9.6667 116 0.5158 0.6007 0.5158 0.7182
No log 9.8333 118 0.5113 0.5913 0.5113 0.7151
No log 10.0 120 0.5176 0.6334 0.5176 0.7194
No log 10.1667 122 0.5078 0.5797 0.5078 0.7126
No log 10.3333 124 0.5562 0.5841 0.5562 0.7458
No log 10.5 126 0.5825 0.5233 0.5825 0.7632
No log 10.6667 128 0.5863 0.5233 0.5863 0.7657
No log 10.8333 130 0.5995 0.4580 0.5995 0.7743
No log 11.0 132 0.5389 0.5895 0.5389 0.7341
No log 11.1667 134 0.4934 0.6130 0.4934 0.7024
No log 11.3333 136 0.4987 0.6146 0.4987 0.7062
No log 11.5 138 0.4808 0.6148 0.4808 0.6934
No log 11.6667 140 0.4670 0.6096 0.4670 0.6834
No log 11.8333 142 0.4708 0.6147 0.4708 0.6861
No log 12.0 144 0.4410 0.6377 0.4410 0.6641
No log 12.1667 146 0.5296 0.5983 0.5296 0.7277
No log 12.3333 148 0.6026 0.4593 0.6026 0.7763
No log 12.5 150 0.5087 0.5603 0.5087 0.7132
No log 12.6667 152 0.4442 0.6215 0.4442 0.6665
No log 12.8333 154 0.4617 0.5955 0.4617 0.6795
No log 13.0 156 0.4545 0.5539 0.4545 0.6741
No log 13.1667 158 0.6532 0.4760 0.6532 0.8082
No log 13.3333 160 0.7741 0.4212 0.7741 0.8798
No log 13.5 162 0.6267 0.4926 0.6267 0.7916
No log 13.6667 164 0.4731 0.6289 0.4731 0.6878
No log 13.8333 166 0.5514 0.5561 0.5514 0.7426
No log 14.0 168 0.6739 0.4993 0.6739 0.8209
No log 14.1667 170 0.5527 0.5459 0.5527 0.7435
No log 14.3333 172 0.4833 0.6071 0.4833 0.6952
No log 14.5 174 0.7364 0.5007 0.7364 0.8581
No log 14.6667 176 0.8397 0.4429 0.8397 0.9163
No log 14.8333 178 0.6767 0.5103 0.6767 0.8226
No log 15.0 180 0.4820 0.6310 0.4820 0.6943
No log 15.1667 182 0.6549 0.5076 0.6549 0.8093
No log 15.3333 184 0.9922 0.4977 0.9922 0.9961
No log 15.5 186 1.0054 0.4712 1.0054 1.0027
No log 15.6667 188 0.7631 0.4444 0.7631 0.8735
No log 15.8333 190 0.5149 0.5420 0.5149 0.7176
No log 16.0 192 0.4820 0.6154 0.4820 0.6943
No log 16.1667 194 0.5253 0.5512 0.5253 0.7248
No log 16.3333 196 0.4850 0.6096 0.4850 0.6964
No log 16.5 198 0.4584 0.5784 0.4584 0.6771
No log 16.6667 200 0.5295 0.5357 0.5295 0.7277
No log 16.8333 202 0.5159 0.5403 0.5159 0.7183
No log 17.0 204 0.4350 0.6632 0.4350 0.6595
No log 17.1667 206 0.4789 0.5811 0.4789 0.6920
No log 17.3333 208 0.5853 0.5048 0.5853 0.7650
No log 17.5 210 0.6203 0.4341 0.6203 0.7876
No log 17.6667 212 0.5363 0.4841 0.5363 0.7323
No log 17.8333 214 0.4794 0.5246 0.4794 0.6924
No log 18.0 216 0.4741 0.5819 0.4741 0.6886
No log 18.1667 218 0.5039 0.5551 0.5039 0.7099
No log 18.3333 220 0.4876 0.5803 0.4876 0.6983
No log 18.5 222 0.4511 0.5970 0.4511 0.6716
No log 18.6667 224 0.4347 0.6330 0.4347 0.6593
No log 18.8333 226 0.4616 0.6235 0.4616 0.6794
No log 19.0 228 0.5275 0.5735 0.5275 0.7263
No log 19.1667 230 0.5850 0.6055 0.5850 0.7649
No log 19.3333 232 0.5229 0.5735 0.5229 0.7231
No log 19.5 234 0.4268 0.6584 0.4268 0.6533
No log 19.6667 236 0.4676 0.5859 0.4676 0.6838
No log 19.8333 238 0.5535 0.5484 0.5535 0.7439
No log 20.0 240 0.5033 0.5845 0.5033 0.7094
No log 20.1667 242 0.4249 0.6321 0.4249 0.6519
No log 20.3333 244 0.4413 0.6101 0.4413 0.6643
No log 20.5 246 0.4684 0.4979 0.4684 0.6844
No log 20.6667 248 0.4552 0.5556 0.4552 0.6747
No log 20.8333 250 0.4510 0.5897 0.4510 0.6716
No log 21.0 252 0.4776 0.6201 0.4776 0.6911
No log 21.1667 254 0.5488 0.5787 0.5488 0.7408
No log 21.3333 256 0.5329 0.5975 0.5329 0.7300
No log 21.5 258 0.4678 0.6118 0.4678 0.6840
No log 21.6667 260 0.4496 0.5800 0.4496 0.6705
No log 21.8333 262 0.4562 0.4809 0.4562 0.6754
No log 22.0 264 0.4691 0.4802 0.4691 0.6849
No log 22.1667 266 0.4593 0.5213 0.4593 0.6777
No log 22.3333 268 0.4641 0.6132 0.4641 0.6812
No log 22.5 270 0.5005 0.6296 0.5005 0.7074
No log 22.6667 272 0.5593 0.5712 0.5593 0.7479
No log 22.8333 274 0.5994 0.4980 0.5994 0.7742
No log 23.0 276 0.6165 0.4457 0.6165 0.7852
No log 23.1667 278 0.5493 0.5867 0.5493 0.7412
No log 23.3333 280 0.4909 0.5970 0.4909 0.7006
No log 23.5 282 0.4904 0.4569 0.4904 0.7003
No log 23.6667 284 0.4863 0.4656 0.4863 0.6973
No log 23.8333 286 0.4668 0.4809 0.4668 0.6833
No log 24.0 288 0.4482 0.6186 0.4482 0.6695
No log 24.1667 290 0.4654 0.5957 0.4654 0.6822
No log 24.3333 292 0.4557 0.5957 0.4557 0.6751
No log 24.5 294 0.4434 0.5133 0.4434 0.6659
No log 24.6667 296 0.5103 0.4821 0.5103 0.7143
No log 24.8333 298 0.5367 0.4005 0.5367 0.7326
No log 25.0 300 0.5112 0.4632 0.5112 0.7150
No log 25.1667 302 0.4758 0.4273 0.4758 0.6898
No log 25.3333 304 0.4661 0.5687 0.4661 0.6827
No log 25.5 306 0.5206 0.6174 0.5206 0.7216
No log 25.6667 308 0.5363 0.5961 0.5363 0.7323
No log 25.8333 310 0.4844 0.5975 0.4844 0.6960
No log 26.0 312 0.4527 0.5521 0.4527 0.6729
No log 26.1667 314 0.4736 0.4888 0.4736 0.6882
No log 26.3333 316 0.4753 0.4997 0.4753 0.6894
No log 26.5 318 0.4400 0.6053 0.4400 0.6633
No log 26.6667 320 0.4331 0.6145 0.4331 0.6581
No log 26.8333 322 0.4563 0.6038 0.4563 0.6755
No log 27.0 324 0.4653 0.5817 0.4653 0.6821
No log 27.1667 326 0.4401 0.6418 0.4401 0.6634
No log 27.3333 328 0.4341 0.6140 0.4341 0.6589
No log 27.5 330 0.4449 0.5765 0.4449 0.6670
No log 27.6667 332 0.4460 0.5883 0.4460 0.6679
No log 27.8333 334 0.4563 0.6045 0.4563 0.6755
No log 28.0 336 0.4654 0.5939 0.4654 0.6822
No log 28.1667 338 0.4743 0.5939 0.4743 0.6887
No log 28.3333 340 0.4832 0.6173 0.4832 0.6952
No log 28.5 342 0.4587 0.5801 0.4587 0.6773
No log 28.6667 344 0.4323 0.6242 0.4323 0.6575
No log 28.8333 346 0.4251 0.6566 0.4251 0.6520
No log 29.0 348 0.4383 0.6047 0.4383 0.6620
No log 29.1667 350 0.4322 0.5841 0.4322 0.6574
No log 29.3333 352 0.4303 0.6317 0.4303 0.6560
No log 29.5 354 0.4176 0.6105 0.4176 0.6462
No log 29.6667 356 0.4119 0.6255 0.4119 0.6418
No log 29.8333 358 0.4088 0.6087 0.4088 0.6394
No log 30.0 360 0.4034 0.6242 0.4034 0.6351
No log 30.1667 362 0.4006 0.6330 0.4006 0.6329
No log 30.3333 364 0.3980 0.6530 0.3980 0.6309
No log 30.5 366 0.4039 0.6530 0.4039 0.6356
No log 30.6667 368 0.3987 0.6439 0.3987 0.6314
No log 30.8333 370 0.4168 0.6235 0.4168 0.6456
No log 31.0 372 0.4243 0.5569 0.4243 0.6514
No log 31.1667 374 0.4130 0.6344 0.4130 0.6427
No log 31.3333 376 0.4278 0.5868 0.4278 0.6540
No log 31.5 378 0.4492 0.6187 0.4492 0.6702
No log 31.6667 380 0.4457 0.5955 0.4457 0.6676
No log 31.8333 382 0.4352 0.5868 0.4352 0.6597
No log 32.0 384 0.4441 0.5305 0.4441 0.6664
No log 32.1667 386 0.4455 0.5305 0.4455 0.6675
No log 32.3333 388 0.4441 0.5521 0.4441 0.6664
No log 32.5 390 0.4575 0.6201 0.4575 0.6764
No log 32.6667 392 0.4997 0.6073 0.4997 0.7069
No log 32.8333 394 0.5060 0.5712 0.5060 0.7113
No log 33.0 396 0.4589 0.6201 0.4589 0.6775
No log 33.1667 398 0.4196 0.5937 0.4196 0.6477
No log 33.3333 400 0.4360 0.5584 0.4360 0.6603
No log 33.5 402 0.5218 0.5735 0.5218 0.7224
No log 33.6667 404 0.5333 0.5678 0.5333 0.7303
No log 33.8333 406 0.4690 0.5528 0.4690 0.6849
No log 34.0 408 0.4122 0.5815 0.4122 0.6421
No log 34.1667 410 0.4430 0.6201 0.4430 0.6656
No log 34.3333 412 0.5079 0.5712 0.5079 0.7126
No log 34.5 414 0.5146 0.5975 0.5146 0.7174
No log 34.6667 416 0.4771 0.6073 0.4771 0.6907
No log 34.8333 418 0.4511 0.5927 0.4511 0.6716
No log 35.0 420 0.4538 0.4273 0.4538 0.6736
No log 35.1667 422 0.4746 0.3885 0.4746 0.6889
No log 35.3333 424 0.4699 0.4116 0.4699 0.6855
No log 35.5 426 0.4408 0.4677 0.4408 0.6639
No log 35.6667 428 0.4209 0.5800 0.4209 0.6488
No log 35.8333 430 0.4554 0.6201 0.4554 0.6748
No log 36.0 432 0.4852 0.6188 0.4852 0.6966
No log 36.1667 434 0.4776 0.6188 0.4776 0.6911
No log 36.3333 436 0.4551 0.6201 0.4551 0.6746
No log 36.5 438 0.4277 0.5868 0.4277 0.6540
No log 36.6667 440 0.4218 0.5765 0.4218 0.6494
No log 36.8333 442 0.4303 0.5765 0.4303 0.6560
No log 37.0 444 0.4366 0.4970 0.4366 0.6608
No log 37.1667 446 0.4365 0.4970 0.4365 0.6607
No log 37.3333 448 0.4320 0.4970 0.4320 0.6573
No log 37.5 450 0.4311 0.5648 0.4311 0.6566
No log 37.6667 452 0.4446 0.5955 0.4446 0.6668
No log 37.8333 454 0.4428 0.5648 0.4428 0.6655
No log 38.0 456 0.4411 0.4703 0.4411 0.6642
No log 38.1667 458 0.4444 0.4448 0.4444 0.6667
No log 38.3333 460 0.4401 0.4448 0.4401 0.6634
No log 38.5 462 0.4329 0.4448 0.4329 0.6579
No log 38.6667 464 0.4284 0.5286 0.4284 0.6545
No log 38.8333 466 0.4314 0.5521 0.4314 0.6568
No log 39.0 468 0.4308 0.5749 0.4308 0.6564
No log 39.1667 470 0.4314 0.5749 0.4314 0.6568
No log 39.3333 472 0.4337 0.5749 0.4337 0.6586
No log 39.5 474 0.4403 0.4448 0.4403 0.6636
No log 39.6667 476 0.4457 0.4448 0.4457 0.6676
No log 39.8333 478 0.4458 0.5022 0.4458 0.6677
No log 40.0 480 0.4488 0.5379 0.4488 0.6699
No log 40.1667 482 0.4540 0.6087 0.4540 0.6738
No log 40.3333 484 0.4532 0.6187 0.4532 0.6732
No log 40.5 486 0.4501 0.4774 0.4501 0.6709
No log 40.6667 488 0.4644 0.4538 0.4644 0.6815
No log 40.8333 490 0.4799 0.3636 0.4799 0.6928
No log 41.0 492 0.4803 0.3636 0.4803 0.6931
No log 41.1667 494 0.4685 0.4538 0.4685 0.6845
No log 41.3333 496 0.4475 0.4538 0.4475 0.6690
No log 41.5 498 0.4343 0.5286 0.4343 0.6590
0.2543 41.6667 500 0.4338 0.5923 0.4338 0.6586
0.2543 41.8333 502 0.4482 0.6408 0.4482 0.6695
0.2543 42.0 504 0.4719 0.6408 0.4719 0.6869
0.2543 42.1667 506 0.4840 0.6755 0.4840 0.6957
0.2543 42.3333 508 0.4584 0.6590 0.4584 0.6770
0.2543 42.5 510 0.4484 0.5159 0.4484 0.6697
0.2543 42.6667 512 0.4539 0.4448 0.4539 0.6737
0.2543 42.8333 514 0.4583 0.4538 0.4583 0.6770
0.2543 43.0 516 0.4584 0.4538 0.4584 0.6770
0.2543 43.1667 518 0.4622 0.4585 0.4622 0.6798
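Training ran well past its best point: the lowest validation loss above is 0.3980 (epoch ≈ 30.3, step 364) and the highest QWK is 0.6755 (epoch ≈ 42.2, step 506), while the final checkpoint reports 0.4622 / 0.4585. Selecting the best row from such a log is a one-liner; a sketch over three rows copied from the table (the tuple layout is illustrative):

```python
# (epoch, step, val_loss, qwk) — rows copied from the training log above
rows = [
    (30.3333, 364, 0.3980, 0.6530),
    (42.1667, 506, 0.4840, 0.6755),
    (43.1667, 518, 0.4622, 0.4585),
]

best_by_loss = min(rows, key=lambda r: r[2])  # checkpoint with lowest validation loss
best_by_qwk = max(rows, key=lambda r: r[3])   # checkpoint with highest QWK
```

If training had used `load_best_model_at_end=True` with `metric_for_best_model`, the Trainer would perform this selection automatically; the card does not say whether that was enabled.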

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.