ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are typically computed follows the list):

  • Loss: 0.4387
  • QWK: 0.5152
  • MSE: 0.4387
  • RMSE: 0.6623
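Quadratic Weighted Kappa (QWK), MSE, and RMSE are the usual metrics for ordinal essay-scoring tasks. The evaluation script is not included in this card, so the snippet below is only a minimal sketch of how such figures are typically computed with scikit-learn; the rounding step and the use of `cohen_kappa_score` are assumptions, not the author's code.

```python
# Sketch only: the card does not ship its evaluation code, so the label
# range and the rounding of continuous predictions are assumptions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def score(y_true, y_pred):
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    # QWK requires discrete labels, so continuous predictions are rounded.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

print(score([1, 2, 3, 2], [1.2, 1.8, 2.6, 2.1]))
```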

Model description

More information needed

Intended uses & limitations

More information needed
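No usage details are documented, but the checkpoint can be loaded with the standard transformers API. The sketch below assumes a single-output regression head (the reported loss equals the MSE, which suggests num_labels=1); check the model config before relying on it.

```python
# Minimal inference sketch, assuming a regression-style sequence
# classification head; not the author's documented usage.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k10_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # placeholder: an Arabic essay to be scored for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    organization_score = model(**inputs).logits.squeeze().item()
print(organization_score)
```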

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
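For reference, these hyperparameters map onto the transformers `TrainingArguments` roughly as shown below; `output_dir` is a placeholder, and the dataset and metric plumbing are omitted because they are not documented in this card.

```python
# Sketch reconstructed from the hyperparameter list above; not the
# author's original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",          # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the
    # Trainer's default AdamW settings.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```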

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0377 2 2.6047 -0.0449 2.6047 1.6139
No log 0.0755 4 1.3906 0.0715 1.3906 1.1792
No log 0.1132 6 1.1783 -0.0927 1.1783 1.0855
No log 0.1509 8 0.8559 0.0 0.8559 0.9251
No log 0.1887 10 0.7907 0.1264 0.7907 0.8892
No log 0.2264 12 0.7761 0.2319 0.7761 0.8809
No log 0.2642 14 0.8291 0.2142 0.8291 0.9106
No log 0.3019 16 0.7797 0.2096 0.7797 0.8830
No log 0.3396 18 0.7840 0.2897 0.7840 0.8854
No log 0.3774 20 0.7608 0.2522 0.7608 0.8722
No log 0.4151 22 0.7034 0.3135 0.7034 0.8387
No log 0.4528 24 0.7460 0.2948 0.7460 0.8637
No log 0.4906 26 0.6036 0.3396 0.6036 0.7769
No log 0.5283 28 0.5845 0.3274 0.5845 0.7645
No log 0.5660 30 0.5976 0.1617 0.5976 0.7730
No log 0.6038 32 0.6149 0.1617 0.6149 0.7842
No log 0.6415 34 0.6069 0.1617 0.6069 0.7790
No log 0.6792 36 0.6107 0.1674 0.6107 0.7815
No log 0.7170 38 0.5985 0.1617 0.5985 0.7736
No log 0.7547 40 0.5838 0.1604 0.5838 0.7641
No log 0.7925 42 0.5750 0.3092 0.5750 0.7583
No log 0.8302 44 0.6312 0.3564 0.6312 0.7945
No log 0.8679 46 0.7453 0.4308 0.7453 0.8633
No log 0.9057 48 0.6864 0.4230 0.6864 0.8285
No log 0.9434 50 0.6583 0.4482 0.6583 0.8113
No log 0.9811 52 0.5775 0.4182 0.5775 0.7599
No log 1.0189 54 0.5555 0.4052 0.5555 0.7453
No log 1.0566 56 0.5447 0.5208 0.5447 0.7381
No log 1.0943 58 0.5290 0.5414 0.5290 0.7273
No log 1.1321 60 0.5434 0.3961 0.5434 0.7371
No log 1.1698 62 0.6550 0.4568 0.6550 0.8094
No log 1.2075 64 0.5290 0.4201 0.5290 0.7273
No log 1.2453 66 0.4900 0.5781 0.4900 0.7000
No log 1.2830 68 0.4812 0.5133 0.4812 0.6937
No log 1.3208 70 0.5118 0.4576 0.5118 0.7154
No log 1.3585 72 0.5800 0.4845 0.5800 0.7616
No log 1.3962 74 0.6687 0.4444 0.6687 0.8177
No log 1.4340 76 0.5690 0.4336 0.5690 0.7543
No log 1.4717 78 0.4530 0.5945 0.4530 0.6731
No log 1.5094 80 0.4522 0.5975 0.4522 0.6724
No log 1.5472 82 0.4449 0.5326 0.4449 0.6670
No log 1.5849 84 0.7012 0.4419 0.7012 0.8374
No log 1.6226 86 0.8134 0.4484 0.8134 0.9019
No log 1.6604 88 0.5296 0.4502 0.5296 0.7278
No log 1.6981 90 0.4237 0.5782 0.4237 0.6509
No log 1.7358 92 0.4406 0.5975 0.4406 0.6638
No log 1.7736 94 0.4147 0.6257 0.4147 0.6440
No log 1.8113 96 0.4754 0.6041 0.4754 0.6895
No log 1.8491 98 0.4534 0.6249 0.4534 0.6734
No log 1.8868 100 0.4162 0.6495 0.4162 0.6452
No log 1.9245 102 0.5547 0.5612 0.5547 0.7448
No log 1.9623 104 0.5609 0.5308 0.5609 0.7489
No log 2.0 106 0.4395 0.6096 0.4395 0.6630
No log 2.0377 108 0.4520 0.6115 0.4520 0.6723
No log 2.0755 110 0.5617 0.4864 0.5617 0.7494
No log 2.1132 112 0.4993 0.5455 0.4993 0.7066
No log 2.1509 114 0.4941 0.5970 0.4941 0.7029
No log 2.1887 116 0.4966 0.6082 0.4966 0.7047
No log 2.2264 118 0.4967 0.5779 0.4967 0.7048
No log 2.2642 120 0.4777 0.6655 0.4777 0.6912
No log 2.3019 122 0.5528 0.4889 0.5528 0.7435
No log 2.3396 124 0.5415 0.5473 0.5415 0.7359
No log 2.3774 126 0.4793 0.5899 0.4793 0.6923
No log 2.4151 128 0.4869 0.6052 0.4869 0.6978
No log 2.4528 130 0.4923 0.6263 0.4923 0.7017
No log 2.4906 132 0.5411 0.5701 0.5411 0.7356
No log 2.5283 134 0.5840 0.5538 0.5840 0.7642
No log 2.5660 136 0.4909 0.6346 0.4909 0.7007
No log 2.6038 138 0.5061 0.5642 0.5061 0.7114
No log 2.6415 140 0.4836 0.5160 0.4836 0.6954
No log 2.6792 142 0.4989 0.5455 0.4989 0.7063
No log 2.7170 144 0.5270 0.5081 0.5270 0.7259
No log 2.7547 146 0.6668 0.4877 0.6668 0.8166
No log 2.7925 148 0.5989 0.5096 0.5989 0.7739
No log 2.8302 150 0.4954 0.4829 0.4954 0.7039
No log 2.8679 152 0.5890 0.5687 0.5890 0.7675
No log 2.9057 154 0.5087 0.5248 0.5087 0.7132
No log 2.9434 156 0.5160 0.5093 0.5160 0.7184
No log 2.9811 158 0.4970 0.5235 0.4970 0.7050
No log 3.0189 160 0.4998 0.6377 0.4998 0.7070
No log 3.0566 162 0.4917 0.5941 0.4917 0.7012
No log 3.0943 164 0.5873 0.4783 0.5873 0.7664
No log 3.1321 166 0.6026 0.4615 0.6026 0.7763
No log 3.1698 168 0.5038 0.6589 0.5038 0.7098
No log 3.2075 170 0.4810 0.5488 0.4810 0.6935
No log 3.2453 172 0.4980 0.6383 0.4980 0.7057
No log 3.2830 174 0.5209 0.6074 0.5209 0.7217
No log 3.3208 176 0.4817 0.6082 0.4817 0.6941
No log 3.3585 178 0.4749 0.4949 0.4749 0.6892
No log 3.3962 180 0.5154 0.5184 0.5154 0.7179
No log 3.4340 182 0.4835 0.4914 0.4835 0.6954
No log 3.4717 184 0.4616 0.5475 0.4616 0.6794
No log 3.5094 186 0.4686 0.5475 0.4686 0.6845
No log 3.5472 188 0.5456 0.5149 0.5456 0.7386
No log 3.5849 190 0.7550 0.5147 0.7550 0.8689
No log 3.6226 192 0.6994 0.4444 0.6994 0.8363
No log 3.6604 194 0.5977 0.4212 0.5977 0.7731
No log 3.6981 196 0.5431 0.3396 0.5431 0.7369
No log 3.7358 198 0.6347 0.3956 0.6347 0.7967
No log 3.7736 200 0.7609 0.3847 0.7609 0.8723
No log 3.8113 202 0.7212 0.4400 0.7212 0.8492
No log 3.8491 204 0.5420 0.3882 0.5420 0.7362
No log 3.8868 206 0.5070 0.4608 0.5070 0.7120
No log 3.9245 208 0.5087 0.4698 0.5087 0.7132
No log 3.9623 210 0.5672 0.3931 0.5672 0.7531
No log 4.0 212 0.5153 0.5208 0.5153 0.7179
No log 4.0377 214 0.5018 0.5432 0.5018 0.7084
No log 4.0755 216 0.4837 0.5665 0.4837 0.6955
No log 4.1132 218 0.4870 0.6082 0.4870 0.6978
No log 4.1509 220 0.5430 0.4197 0.5430 0.7369
No log 4.1887 222 0.5604 0.4197 0.5604 0.7486
No log 4.2264 224 0.4920 0.5452 0.4920 0.7014
No log 4.2642 226 0.4305 0.5875 0.4305 0.6562
No log 4.3019 228 0.4656 0.6196 0.4656 0.6823
No log 4.3396 230 0.4261 0.6389 0.4261 0.6528
No log 4.3774 232 0.4164 0.6983 0.4164 0.6453
No log 4.4151 234 0.4940 0.5870 0.4940 0.7028
No log 4.4528 236 0.4493 0.6019 0.4493 0.6703
No log 4.4906 238 0.4157 0.6357 0.4157 0.6448
No log 4.5283 240 0.4444 0.5649 0.4444 0.6666
No log 4.5660 242 0.4365 0.6257 0.4365 0.6607
No log 4.6038 244 0.4728 0.5123 0.4728 0.6876
No log 4.6415 246 0.4622 0.5123 0.4622 0.6799
No log 4.6792 248 0.4356 0.6257 0.4356 0.6600
No log 4.7170 250 0.5230 0.5911 0.5230 0.7232
No log 4.7547 252 0.5049 0.6041 0.5049 0.7105
No log 4.7925 254 0.4658 0.5432 0.4658 0.6825
No log 4.8302 256 0.4471 0.6269 0.4471 0.6687
No log 4.8679 258 0.4993 0.5733 0.4993 0.7066
No log 4.9057 260 0.5071 0.5168 0.5071 0.7121
No log 4.9434 262 0.5953 0.5358 0.5953 0.7715
No log 4.9811 264 0.5493 0.5464 0.5493 0.7411
No log 5.0189 266 0.4932 0.5765 0.4932 0.7023
No log 5.0566 268 0.4915 0.5157 0.4915 0.7010
No log 5.0943 270 0.6050 0.5569 0.6050 0.7778
No log 5.1321 272 0.5512 0.5170 0.5512 0.7424
No log 5.1698 274 0.4407 0.5941 0.4407 0.6639
No log 5.2075 276 0.4394 0.6698 0.4394 0.6629
No log 5.2453 278 0.4216 0.6950 0.4216 0.6493
No log 5.2830 280 0.4528 0.5894 0.4528 0.6729
No log 5.3208 282 0.4878 0.5291 0.4878 0.6984
No log 5.3585 284 0.4622 0.5677 0.4622 0.6799
No log 5.3962 286 0.4487 0.5677 0.4487 0.6698
No log 5.4340 288 0.4214 0.6067 0.4214 0.6492
No log 5.4717 290 0.4430 0.5836 0.4430 0.6656
No log 5.5094 292 0.4753 0.5498 0.4753 0.6894
No log 5.5472 294 0.4313 0.5503 0.4313 0.6567
No log 5.5849 296 0.4282 0.6017 0.4282 0.6544
No log 5.6226 298 0.4402 0.5861 0.4402 0.6635
No log 5.6604 300 0.4387 0.5861 0.4387 0.6624
No log 5.6981 302 0.5044 0.6236 0.5044 0.7102
No log 5.7358 304 0.7297 0.4063 0.7297 0.8542
No log 5.7736 306 0.7754 0.4275 0.7754 0.8806
No log 5.8113 308 0.5901 0.5664 0.5901 0.7682
No log 5.8491 310 0.4448 0.6370 0.4448 0.6669
No log 5.8868 312 0.4439 0.5604 0.4439 0.6663
No log 5.9245 314 0.4507 0.7085 0.4507 0.6714
No log 5.9623 316 0.6678 0.4430 0.6678 0.8172
No log 6.0 318 0.8338 0.3979 0.8338 0.9131
No log 6.0377 320 0.8555 0.3979 0.8555 0.9249
No log 6.0755 322 0.6070 0.4593 0.6070 0.7791
No log 6.1132 324 0.4639 0.4659 0.4639 0.6811
No log 6.1509 326 0.5553 0.4684 0.5553 0.7452
No log 6.1887 328 0.5820 0.4684 0.5820 0.7629
No log 6.2264 330 0.5517 0.4614 0.5517 0.7428
No log 6.2642 332 0.5175 0.4825 0.5175 0.7194
No log 6.3019 334 0.5154 0.4517 0.5154 0.7179
No log 6.3396 336 0.5006 0.5030 0.5006 0.7075
No log 6.3774 338 0.4772 0.5960 0.4772 0.6908
No log 6.4151 340 0.5000 0.4966 0.5000 0.7071
No log 6.4528 342 0.5874 0.4916 0.5874 0.7664
No log 6.4906 344 0.5413 0.4933 0.5413 0.7357
No log 6.5283 346 0.4620 0.5151 0.4620 0.6797
No log 6.5660 348 0.4638 0.5228 0.4638 0.6810
No log 6.6038 350 0.4932 0.5309 0.4932 0.7023
No log 6.6415 352 0.5425 0.4855 0.5425 0.7365
No log 6.6792 354 0.5624 0.4997 0.5624 0.7500
No log 6.7170 356 0.5159 0.4769 0.5159 0.7183
No log 6.7547 358 0.5145 0.4769 0.5145 0.7173
No log 6.7925 360 0.4642 0.5819 0.4642 0.6813
No log 6.8302 362 0.4655 0.5819 0.4655 0.6823
No log 6.8679 364 0.4654 0.5675 0.4654 0.6822
No log 6.9057 366 0.4577 0.6106 0.4577 0.6766
No log 6.9434 368 0.4421 0.6357 0.4421 0.6649
No log 6.9811 370 0.4314 0.6364 0.4314 0.6568
No log 7.0189 372 0.4417 0.6978 0.4417 0.6646
No log 7.0566 374 0.4313 0.6771 0.4313 0.6567
No log 7.0943 376 0.4738 0.5324 0.4738 0.6883
No log 7.1321 378 0.5498 0.5773 0.5498 0.7415
No log 7.1698 380 0.5578 0.5435 0.5578 0.7468
No log 7.2075 382 0.4957 0.4855 0.4957 0.7040
No log 7.2453 384 0.4473 0.5768 0.4473 0.6688
No log 7.2830 386 0.4530 0.5768 0.4530 0.6730
No log 7.3208 388 0.5036 0.5086 0.5036 0.7096
No log 7.3585 390 0.4896 0.5345 0.4896 0.6997
No log 7.3962 392 0.4299 0.6364 0.4299 0.6557
No log 7.4340 394 0.4325 0.6402 0.4325 0.6577
No log 7.4717 396 0.4333 0.6402 0.4333 0.6582
No log 7.5094 398 0.4324 0.6197 0.4324 0.6576
No log 7.5472 400 0.4276 0.6305 0.4276 0.6539
No log 7.5849 402 0.4232 0.6389 0.4232 0.6505
No log 7.6226 404 0.4193 0.6269 0.4193 0.6475
No log 7.6604 406 0.4223 0.6269 0.4223 0.6498
No log 7.6981 408 0.4327 0.6282 0.4327 0.6578
No log 7.7358 410 0.4398 0.6489 0.4398 0.6631
No log 7.7736 412 0.4330 0.6489 0.4330 0.6580
No log 7.8113 414 0.4295 0.6402 0.4295 0.6553
No log 7.8491 416 0.4309 0.6458 0.4309 0.6564
No log 7.8868 418 0.5333 0.6210 0.5333 0.7303
No log 7.9245 420 0.5130 0.6221 0.5130 0.7162
No log 7.9623 422 0.4374 0.6551 0.4374 0.6613
No log 8.0 424 0.4582 0.6890 0.4582 0.6769
No log 8.0377 426 0.5687 0.5205 0.5687 0.7541
No log 8.0755 428 0.5766 0.5205 0.5766 0.7593
No log 8.1132 430 0.4766 0.6716 0.4766 0.6904
No log 8.1509 432 0.4257 0.6612 0.4257 0.6525
No log 8.1887 434 0.4157 0.6282 0.4157 0.6447
No log 8.2264 436 0.4159 0.6295 0.4159 0.6449
No log 8.2642 438 0.4121 0.6295 0.4121 0.6419
No log 8.3019 440 0.4038 0.6282 0.4038 0.6354
No log 8.3396 442 0.3991 0.6489 0.3991 0.6317
No log 8.3774 444 0.4027 0.6678 0.4027 0.6346
No log 8.4151 446 0.4071 0.6837 0.4071 0.6380
No log 8.4528 448 0.4072 0.6489 0.4072 0.6381
No log 8.4906 450 0.4120 0.6489 0.4120 0.6419
No log 8.5283 452 0.4224 0.6489 0.4224 0.6499
No log 8.5660 454 0.4423 0.6295 0.4423 0.6650
No log 8.6038 456 0.4527 0.6082 0.4527 0.6728
No log 8.6415 458 0.4603 0.6168 0.4603 0.6784
No log 8.6792 460 0.4497 0.6154 0.4497 0.6706
No log 8.7170 462 0.4376 0.6370 0.4376 0.6615
No log 8.7547 464 0.4335 0.6491 0.4335 0.6584
No log 8.7925 466 0.4347 0.6601 0.4347 0.6593
No log 8.8302 468 0.4377 0.6601 0.4377 0.6616
No log 8.8679 470 0.4354 0.6282 0.4354 0.6599
No log 8.9057 472 0.4569 0.5361 0.4569 0.6760
No log 8.9434 474 0.4962 0.4729 0.4962 0.7044
No log 8.9811 476 0.5172 0.4148 0.5172 0.7192
No log 9.0189 478 0.5075 0.4902 0.5075 0.7124
No log 9.0566 480 0.5065 0.4902 0.5065 0.7117
No log 9.0943 482 0.4854 0.5101 0.4854 0.6967
No log 9.1321 484 0.4896 0.5182 0.4896 0.6997
No log 9.1698 486 0.4893 0.4729 0.4893 0.6995
No log 9.2075 488 0.4604 0.5600 0.4604 0.6785
No log 9.2453 490 0.4498 0.6491 0.4498 0.6707
No log 9.2830 492 0.4661 0.6793 0.4661 0.6827
No log 9.3208 494 0.4613 0.6783 0.4613 0.6792
No log 9.3585 496 0.4490 0.6313 0.4490 0.6700
No log 9.3962 498 0.4752 0.6346 0.4752 0.6893
0.361 9.4340 500 0.5368 0.5353 0.5368 0.7327
0.361 9.4717 502 0.5054 0.5633 0.5054 0.7109
0.361 9.5094 504 0.4438 0.6222 0.4438 0.6662
0.361 9.5472 506 0.5099 0.6074 0.5099 0.7141
0.361 9.5849 508 0.6621 0.4632 0.6621 0.8137
0.361 9.6226 510 0.6392 0.5153 0.6392 0.7995
0.361 9.6604 512 0.5157 0.6154 0.5157 0.7181
0.361 9.6981 514 0.4574 0.6479 0.4574 0.6763
0.361 9.7358 516 0.4629 0.5996 0.4629 0.6804
0.361 9.7736 518 0.4914 0.5665 0.4914 0.7010
0.361 9.8113 520 0.4643 0.6186 0.4643 0.6814
0.361 9.8491 522 0.4736 0.6943 0.4736 0.6882
0.361 9.8868 524 0.4864 0.6688 0.4864 0.6975
0.361 9.9245 526 0.5095 0.5855 0.5095 0.7138
0.361 9.9623 528 0.5064 0.5947 0.5064 0.7116
0.361 10.0 530 0.4620 0.5152 0.4620 0.6797
0.361 10.0377 532 0.4620 0.4774 0.4620 0.6797
0.361 10.0755 534 0.4580 0.4774 0.4580 0.6767
0.361 10.1132 536 0.4547 0.4938 0.4547 0.6743
0.361 10.1509 538 0.4387 0.5152 0.4387 0.6623

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
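A quick way to confirm that a local environment matches the versions listed above when reproducing the run:

```python
# Prints the installed library versions; the comments show the versions
# this model was trained with.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # 4.44.2
print("PyTorch:", torch.__version__)              # 2.4.0+cu118
print("Datasets:", datasets.__version__)          # 2.21.0
print("Tokenizers:", tokenizers.__version__)      # 0.19.1
```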