ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.4376
  • Qwk (quadratic weighted kappa): 0.5611
  • Mse (mean squared error): 0.4376
  • Rmse (root mean squared error): 0.6615
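Note that Rmse is simply the square root of Mse (√0.4376 ≈ 0.6615), and here Loss equals Mse, which suggests a regression-style objective. The exact evaluation code is not included in this card; the following is a minimal pure-Python sketch (function names assumed) of how Mse and quadratic weighted kappa could be computed for integer-scored labels:

```python
import math
from collections import Counter

def mse(y_true, y_pred):
    """Mean squared error between two equal-length score sequences."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights.

    Labels are assumed to be integers in [0, n_classes).
    """
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under chance agreement (outer product of marginals).
    ht, hp = Counter(y_true), Counter(y_pred)
    expected = [[ht[i] * hp[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic penalty grows with the squared distance between scores.
    weights = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
               for i in range(n_classes)]
    num = sum(weights[i][j] * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(weights[i][j] * expected[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

# Sanity check against the card's numbers: RMSE is sqrt(MSE).
# math.sqrt(0.4376) ≈ 0.6615
```

Perfect agreement yields a kappa of 1.0 and chance-level agreement yields 0.0, which is why QWK is a common headline metric for ordinal essay-scoring tasks like this one.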

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
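The training script itself is not included in this card. As a hypothetical reconstruction, the listed values map onto `transformers.TrainingArguments` keyword arguments roughly as follows (argument names assumed from the Transformers API; the dict below only mirrors the card's reported values):

```python
# Hypothetical mapping of the reported hyperparameters to
# transformers.TrainingArguments keyword arguments. This is a sketch,
# not the author's actual training configuration.
training_kwargs = dict(
    learning_rate=2e-5,                 # learning_rate: 2e-05
    per_device_train_batch_size=8,      # train_batch_size: 8
    per_device_eval_batch_size=8,       # eval_batch_size: 8
    seed=42,                            # seed: 42
    adam_beta1=0.9,                     # optimizer: Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                  # epsilon: 1e-08
    lr_scheduler_type="linear",         # lr_scheduler_type: linear
    num_train_epochs=100,               # num_epochs: 100
)
```

With Transformers installed, these kwargs could be passed as `TrainingArguments(output_dir="out", **training_kwargs)`; Adam with these betas/epsilon matches the library's default AdamW settings.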

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0667 2 2.5318 -0.0593 2.5318 1.5912
No log 0.1333 4 1.2792 0.1241 1.2792 1.1310
No log 0.2 6 0.6645 0.0444 0.6645 0.8152
No log 0.2667 8 0.7863 0.2904 0.7863 0.8867
No log 0.3333 10 0.7916 0.3579 0.7916 0.8897
No log 0.4 12 0.6562 0.0393 0.6562 0.8101
No log 0.4667 14 0.7151 0.0937 0.7151 0.8456
No log 0.5333 16 0.6709 0.0393 0.6709 0.8191
No log 0.6 18 0.6805 0.3243 0.6805 0.8249
No log 0.6667 20 0.7325 0.3789 0.7325 0.8558
No log 0.7333 22 0.6876 0.3894 0.6876 0.8292
No log 0.8 24 0.5771 0.4253 0.5771 0.7597
No log 0.8667 26 0.7401 0.5119 0.7401 0.8603
No log 0.9333 28 0.8544 0.3638 0.8544 0.9243
No log 1.0 30 0.5946 0.5129 0.5945 0.7711
No log 1.0667 32 0.4936 0.6247 0.4936 0.7025
No log 1.1333 34 0.5039 0.5902 0.5039 0.7098
No log 1.2 36 0.5152 0.6515 0.5152 0.7178
No log 1.2667 38 0.5428 0.5047 0.5428 0.7367
No log 1.3333 40 0.5389 0.5674 0.5389 0.7341
No log 1.4 42 0.5891 0.5822 0.5891 0.7675
No log 1.4667 44 0.9433 0.3405 0.9433 0.9712
No log 1.5333 46 0.8988 0.3506 0.8988 0.9480
No log 1.6 48 0.5647 0.5031 0.5647 0.7514
No log 1.6667 50 0.5360 0.4933 0.5360 0.7321
No log 1.7333 52 0.6211 0.4592 0.6211 0.7881
No log 1.8 54 0.4579 0.5752 0.4579 0.6766
No log 1.8667 56 0.5328 0.5178 0.5328 0.7299
No log 1.9333 58 0.5507 0.5212 0.5507 0.7421
No log 2.0 60 0.4237 0.6830 0.4237 0.6509
No log 2.0667 62 0.5900 0.5402 0.5900 0.7681
No log 2.1333 64 0.5568 0.5085 0.5568 0.7462
No log 2.2 66 0.4159 0.7217 0.4159 0.6449
No log 2.2667 68 0.6333 0.4484 0.6333 0.7958
No log 2.3333 70 0.6807 0.4228 0.6807 0.8251
No log 2.4 72 0.4940 0.5614 0.4940 0.7028
No log 2.4667 74 0.4359 0.5933 0.4359 0.6603
No log 2.5333 76 0.5443 0.5948 0.5443 0.7377
No log 2.6 78 0.7173 0.5605 0.7173 0.8469
No log 2.6667 80 0.7278 0.5415 0.7278 0.8531
No log 2.7333 82 0.7758 0.4479 0.7758 0.8808
No log 2.8 84 0.5468 0.5552 0.5468 0.7395
No log 2.8667 86 0.5060 0.6485 0.5060 0.7113
No log 2.9333 88 0.5361 0.5612 0.5361 0.7322
No log 3.0 90 0.5392 0.5744 0.5392 0.7343
No log 3.0667 92 0.4750 0.6132 0.4750 0.6892
No log 3.1333 94 0.4656 0.6604 0.4656 0.6823
No log 3.2 96 0.4784 0.5639 0.4784 0.6917
No log 3.2667 98 0.4691 0.6364 0.4691 0.6849
No log 3.3333 100 0.5863 0.4501 0.5863 0.7657
No log 3.4 102 0.5846 0.4868 0.5846 0.7646
No log 3.4667 104 0.4967 0.6357 0.4967 0.7048
No log 3.5333 106 0.5262 0.6449 0.5262 0.7254
No log 3.6 108 0.5421 0.6876 0.5421 0.7363
No log 3.6667 110 0.5480 0.6309 0.5480 0.7402
No log 3.7333 112 0.6842 0.5441 0.6842 0.8272
No log 3.8 114 0.7051 0.5401 0.7051 0.8397
No log 3.8667 116 0.5596 0.6228 0.5596 0.7481
No log 3.9333 118 0.5227 0.6254 0.5227 0.7230
No log 4.0 120 0.5043 0.6492 0.5043 0.7101
No log 4.0667 122 0.5423 0.5721 0.5423 0.7364
No log 4.1333 124 0.5046 0.5513 0.5046 0.7104
No log 4.2 126 0.4808 0.5779 0.4808 0.6934
No log 4.2667 128 0.4823 0.5569 0.4823 0.6945
No log 4.3333 130 0.5057 0.6074 0.5057 0.7111
No log 4.4 132 0.4877 0.5880 0.4877 0.6983
No log 4.4667 134 0.4864 0.6124 0.4864 0.6975
No log 4.5333 136 0.4982 0.5621 0.4982 0.7058
No log 4.6 138 0.4982 0.5744 0.4982 0.7058
No log 4.6667 140 0.5541 0.5683 0.5541 0.7444
No log 4.7333 142 0.5360 0.5683 0.5360 0.7321
No log 4.8 144 0.4520 0.6762 0.4520 0.6723
No log 4.8667 146 0.4436 0.6481 0.4436 0.6660
No log 4.9333 148 0.4364 0.6762 0.4364 0.6606
No log 5.0 150 0.4990 0.6030 0.4990 0.7064
No log 5.0667 152 0.4661 0.6716 0.4661 0.6827
No log 5.1333 154 0.4365 0.6587 0.4365 0.6607
No log 5.2 156 0.4318 0.6672 0.4318 0.6571
No log 5.2667 158 0.4473 0.6705 0.4473 0.6688
No log 5.3333 160 0.5295 0.5266 0.5295 0.7277
No log 5.4 162 0.5849 0.5464 0.5849 0.7648
No log 5.4667 164 0.4594 0.6537 0.4594 0.6778
No log 5.5333 166 0.4396 0.5852 0.4396 0.6630
No log 5.6 168 0.5285 0.5802 0.5285 0.7270
No log 5.6667 170 0.4426 0.6162 0.4426 0.6653
No log 5.7333 172 0.3916 0.6841 0.3916 0.6258
No log 5.8 174 0.3954 0.7042 0.3954 0.6288
No log 5.8667 176 0.4260 0.7328 0.4260 0.6527
No log 5.9333 178 0.4878 0.6465 0.4878 0.6984
No log 6.0 180 0.5200 0.6303 0.5200 0.7211
No log 6.0667 182 0.5499 0.5779 0.5499 0.7416
No log 6.1333 184 0.4774 0.7448 0.4774 0.6910
No log 6.2 186 0.4660 0.6371 0.4660 0.6826
No log 6.2667 188 0.4675 0.5902 0.4675 0.6837
No log 6.3333 190 0.4699 0.5725 0.4699 0.6855
No log 6.4 192 0.5020 0.6366 0.5020 0.7085
No log 6.4667 194 0.5086 0.6034 0.5086 0.7132
No log 6.5333 196 0.4671 0.6185 0.4671 0.6834
No log 6.6 198 0.4844 0.6515 0.4844 0.6960
No log 6.6667 200 0.5070 0.6698 0.5070 0.7120
No log 6.7333 202 0.4883 0.6515 0.4883 0.6988
No log 6.8 204 0.4713 0.6083 0.4713 0.6865
No log 6.8667 206 0.5061 0.5724 0.5061 0.7114
No log 6.9333 208 0.5720 0.5939 0.5720 0.7563
No log 7.0 210 0.4735 0.5601 0.4735 0.6881
No log 7.0667 212 0.4435 0.7041 0.4435 0.6660
No log 7.1333 214 0.6530 0.5516 0.6530 0.8081
No log 7.2 216 0.7932 0.4683 0.7932 0.8906
No log 7.2667 218 0.6878 0.5516 0.6878 0.8294
No log 7.3333 220 0.5055 0.6235 0.5055 0.7110
No log 7.4 222 0.5230 0.5459 0.5230 0.7232
No log 7.4667 224 0.5808 0.5382 0.5808 0.7621
No log 7.5333 226 0.5204 0.5177 0.5204 0.7214
No log 7.6 228 0.4818 0.5550 0.4818 0.6941
No log 7.6667 230 0.5216 0.5248 0.5216 0.7222
No log 7.7333 232 0.5246 0.5046 0.5246 0.7243
No log 7.8 234 0.4690 0.6255 0.4690 0.6848
No log 7.8667 236 0.5098 0.5653 0.5098 0.7140
No log 7.9333 238 0.6723 0.5550 0.6723 0.8200
No log 8.0 240 0.6767 0.5481 0.6767 0.8226
No log 8.0667 242 0.5436 0.5639 0.5436 0.7373
No log 8.1333 244 0.4457 0.6439 0.4457 0.6676
No log 8.2 246 0.4297 0.6452 0.4297 0.6555
No log 8.2667 248 0.4187 0.6439 0.4187 0.6470
No log 8.3333 250 0.4187 0.6046 0.4187 0.6471
No log 8.4 252 0.4343 0.6434 0.4343 0.6590
No log 8.4667 254 0.4435 0.6447 0.4435 0.6659
No log 8.5333 256 0.4582 0.7730 0.4582 0.6769
No log 8.6 258 0.4947 0.6379 0.4947 0.7034
No log 8.6667 260 0.5449 0.6206 0.5449 0.7382
No log 8.7333 262 0.4940 0.6631 0.4940 0.7028
No log 8.8 264 0.4471 0.6895 0.4471 0.6686
No log 8.8667 266 0.4509 0.6100 0.4509 0.6715
No log 8.9333 268 0.4401 0.6091 0.4401 0.6634
No log 9.0 270 0.4461 0.6184 0.4461 0.6679
No log 9.0667 272 0.4680 0.5214 0.4680 0.6841
No log 9.1333 274 0.4944 0.4948 0.4944 0.7031
No log 9.2 276 0.4838 0.5098 0.4838 0.6956
No log 9.2667 278 0.4621 0.5756 0.4621 0.6798
No log 9.3333 280 0.4642 0.6173 0.4642 0.6813
No log 9.4 282 0.4553 0.6365 0.4553 0.6748
No log 9.4667 284 0.4413 0.6130 0.4413 0.6643
No log 9.5333 286 0.4381 0.6127 0.4381 0.6619
No log 9.6 288 0.4104 0.6555 0.4104 0.6407
No log 9.6667 290 0.4134 0.6431 0.4134 0.6430
No log 9.7333 292 0.4320 0.6313 0.4320 0.6573
No log 9.8 294 0.4232 0.6032 0.4232 0.6506
No log 9.8667 296 0.4314 0.5816 0.4314 0.6568
No log 9.9333 298 0.4469 0.5593 0.4469 0.6685
No log 10.0 300 0.4398 0.5567 0.4398 0.6632
No log 10.0667 302 0.4466 0.5344 0.4466 0.6683
No log 10.1333 304 0.4650 0.5698 0.4650 0.6819
No log 10.2 306 0.4845 0.5817 0.4845 0.6960
No log 10.2667 308 0.4922 0.5719 0.4922 0.7015
No log 10.3333 310 0.4601 0.6434 0.4601 0.6783
No log 10.4 312 0.4458 0.6503 0.4458 0.6677
No log 10.4667 314 0.4666 0.6442 0.4666 0.6831
No log 10.5333 316 0.4458 0.6598 0.4458 0.6677
No log 10.6 318 0.4321 0.6761 0.4321 0.6573
No log 10.6667 320 0.4324 0.6351 0.4324 0.6576
No log 10.7333 322 0.4339 0.6448 0.4339 0.6587
No log 10.8 324 0.4344 0.6462 0.4344 0.6591
No log 10.8667 326 0.4298 0.6229 0.4298 0.6556
No log 10.9333 328 0.4254 0.6448 0.4254 0.6522
No log 11.0 330 0.4239 0.6448 0.4239 0.6511
No log 11.0667 332 0.4243 0.6661 0.4243 0.6514
No log 11.1333 334 0.4360 0.6295 0.4360 0.6603
No log 11.2 336 0.4530 0.5811 0.4530 0.6730
No log 11.2667 338 0.4374 0.6661 0.4374 0.6613
No log 11.3333 340 0.4436 0.5379 0.4436 0.6661
No log 11.4 342 0.4756 0.5528 0.4756 0.6897
No log 11.4667 344 0.4851 0.5528 0.4851 0.6965
No log 11.5333 346 0.4657 0.5437 0.4657 0.6824
No log 11.6 348 0.4394 0.5567 0.4394 0.6629
No log 11.6667 350 0.4274 0.5941 0.4274 0.6538
No log 11.7333 352 0.4235 0.5941 0.4235 0.6508
No log 11.8 354 0.4238 0.5941 0.4238 0.6510
No log 11.8667 356 0.4327 0.6960 0.4327 0.6578
No log 11.9333 358 0.4333 0.6771 0.4333 0.6583
No log 12.0 360 0.4376 0.6158 0.4376 0.6615
No log 12.0667 362 0.4649 0.6330 0.4649 0.6819
No log 12.1333 364 0.4704 0.5736 0.4704 0.6859
No log 12.2 366 0.4415 0.5784 0.4415 0.6645
No log 12.2667 368 0.4284 0.6010 0.4284 0.6545
No log 12.3333 370 0.4300 0.6228 0.4300 0.6557
No log 12.4 372 0.4289 0.6060 0.4289 0.6549
No log 12.4667 374 0.4388 0.6503 0.4388 0.6624
No log 12.5333 376 0.4496 0.5918 0.4496 0.6705
No log 12.6 378 0.4403 0.6183 0.4403 0.6636
No log 12.6667 380 0.4429 0.5625 0.4429 0.6655
No log 12.7333 382 0.4753 0.5965 0.4753 0.6894
No log 12.8 384 0.4749 0.5386 0.4749 0.6891
No log 12.8667 386 0.4655 0.5386 0.4655 0.6823
No log 12.9333 388 0.4450 0.6307 0.4450 0.6671
No log 13.0 390 0.4320 0.6422 0.4320 0.6573
No log 13.0667 392 0.4239 0.6724 0.4239 0.6511
No log 13.1333 394 0.4158 0.6566 0.4158 0.6448
No log 13.2 396 0.4240 0.6317 0.4240 0.6511
No log 13.2667 398 0.4284 0.6127 0.4284 0.6545
No log 13.3333 400 0.4174 0.6020 0.4174 0.6460
No log 13.4 402 0.4146 0.7032 0.4146 0.6439
No log 13.4667 404 0.4227 0.6716 0.4227 0.6501
No log 13.5333 406 0.4242 0.6636 0.4242 0.6513
No log 13.6 408 0.4332 0.5698 0.4332 0.6582
No log 13.6667 410 0.4792 0.5230 0.4792 0.6922
No log 13.7333 412 0.4741 0.5300 0.4741 0.6886
No log 13.8 414 0.4284 0.6229 0.4284 0.6546
No log 13.8667 416 0.4681 0.5327 0.4681 0.6841
No log 13.9333 418 0.5228 0.5237 0.5228 0.7230
No log 14.0 420 0.5262 0.4911 0.5262 0.7254
No log 14.0667 422 0.4892 0.5177 0.4892 0.6995
No log 14.1333 424 0.4334 0.6724 0.4334 0.6583
No log 14.2 426 0.4302 0.6060 0.4302 0.6559
No log 14.2667 428 0.4534 0.6773 0.4534 0.6734
No log 14.3333 430 0.4864 0.6864 0.4864 0.6975
No log 14.4 432 0.4790 0.6765 0.4790 0.6921
No log 14.4667 434 0.4651 0.7229 0.4651 0.6820
No log 14.5333 436 0.4481 0.6723 0.4481 0.6694
No log 14.6 438 0.4309 0.6632 0.4309 0.6564
No log 14.6667 440 0.4316 0.6517 0.4316 0.6569
No log 14.7333 442 0.4408 0.6503 0.4408 0.6639
No log 14.8 444 0.4618 0.5271 0.4618 0.6796
No log 14.8667 446 0.5002 0.4925 0.5002 0.7072
No log 14.9333 448 0.4956 0.5327 0.4956 0.7040
No log 15.0 450 0.4632 0.5708 0.4632 0.6806
No log 15.0667 452 0.4413 0.6627 0.4413 0.6643
No log 15.1333 454 0.4801 0.5883 0.4801 0.6929
No log 15.2 456 0.5202 0.6087 0.5202 0.7212
No log 15.2667 458 0.5394 0.5735 0.5394 0.7344
No log 15.3333 460 0.4818 0.5883 0.4818 0.6941
No log 15.4 462 0.4431 0.5846 0.4431 0.6657
No log 15.4667 464 0.4363 0.6018 0.4363 0.6605
No log 15.5333 466 0.4535 0.5195 0.4535 0.6734
No log 15.6 468 0.4668 0.5271 0.4668 0.6832
No log 15.6667 470 0.4598 0.5614 0.4598 0.6781
No log 15.7333 472 0.4470 0.5614 0.4470 0.6685
No log 15.8 474 0.4364 0.6344 0.4364 0.6606
No log 15.8667 476 0.4456 0.5617 0.4456 0.6675
No log 15.9333 478 0.4567 0.5617 0.4567 0.6758
No log 16.0 480 0.4610 0.5600 0.4610 0.6790
No log 16.0667 482 0.4705 0.4681 0.4705 0.6859
No log 16.1333 484 0.4826 0.4705 0.4826 0.6947
No log 16.2 486 0.5051 0.5017 0.5051 0.7107
No log 16.2667 488 0.5100 0.4925 0.5100 0.7141
No log 16.3333 490 0.4937 0.4925 0.4937 0.7027
No log 16.4 492 0.4879 0.5177 0.4879 0.6985
No log 16.4667 494 0.4806 0.5939 0.4806 0.6932
No log 16.5333 496 0.4663 0.5413 0.4663 0.6828
No log 16.6 498 0.4623 0.5171 0.4623 0.6799
0.2658 16.6667 500 0.4597 0.5831 0.4597 0.6780
0.2658 16.7333 502 0.4564 0.5640 0.4564 0.6756
0.2658 16.8 504 0.4501 0.6636 0.4501 0.6709
0.2658 16.8667 506 0.4402 0.6542 0.4402 0.6635
0.2658 16.9333 508 0.4359 0.5985 0.4359 0.6602
0.2658 17.0 510 0.4480 0.5283 0.4480 0.6694
0.2658 17.0667 512 0.4483 0.5512 0.4483 0.6696
0.2658 17.1333 514 0.4551 0.5657 0.4551 0.6746
0.2658 17.2 516 0.4595 0.5657 0.4595 0.6779
0.2658 17.2667 518 0.4376 0.5611 0.4376 0.6615

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.