ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k5_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4289
  • Qwk: 0.6267
  • Mse: 0.4289
  • Rmse: 0.6549
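The card does not publish the evaluation labels, but the three metrics above are related in a fixed way (Rmse is the square root of Mse in every row of the results table below, and Qwk is quadratically weighted Cohen's kappa, the standard metric for ordinal essay scoring). A minimal sketch of how they are typically computed, using hypothetical gold/predicted organization scores (the 0–4 scale is an assumption, not from the card):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted organization scores (0-4 scale assumed);
# the actual evaluation set is not published with this card.
y_true = np.array([3, 2, 4, 1, 3, 2, 0, 4])
y_pred = np.array([3, 2, 3, 1, 2, 2, 1, 4])

# Qwk: Cohen's kappa with quadratic weights, which penalizes
# predictions more the further they fall from the gold score.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE and RMSE on the same predictions; RMSE = sqrt(MSE), which is
# why Rmse**2 equals Mse in every logged row.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={rmse:.4f}")
```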

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
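A minimal sketch of what lr_scheduler_type: linear does under these settings, assuming zero warmup steps (the transformers default). The step counts are inferred from the results table below, where step 500 falls at epoch 20.0, i.e. 25 optimizer steps per epoch and roughly 2500 total steps over 100 epochs:

```python
# Linear decay from the base learning rate to 0 over training.
BASE_LR = 2e-05      # learning_rate from the hyperparameter list
TOTAL_STEPS = 2500   # ~25 steps/epoch * 100 epochs, inferred from the log

def linear_lr(step: int, total_steps: int = TOTAL_STEPS,
              base_lr: float = BASE_LR) -> float:
    """Learning rate after `step` optimizer steps, no warmup assumed."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))      # full base LR at the start of training
print(linear_lr(1250))   # half the base LR at the halfway point
print(linear_lr(2500))   # decayed to 0 at the end of training
```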

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.08 2 2.3816 -0.0336 2.3816 1.5432
No log 0.16 4 1.0762 0.1808 1.0762 1.0374
No log 0.24 6 0.6441 0.0810 0.6441 0.8025
No log 0.32 8 0.8186 0.2460 0.8186 0.9048
No log 0.4 10 0.6622 0.3755 0.6622 0.8138
No log 0.48 12 0.6951 0.2464 0.6951 0.8337
No log 0.56 14 0.9361 0.3141 0.9361 0.9675
No log 0.64 16 0.8680 0.2483 0.8680 0.9317
No log 0.72 18 0.6102 0.4000 0.6102 0.7811
No log 0.8 20 0.6651 0.3518 0.6651 0.8155
No log 0.88 22 0.7077 0.3444 0.7077 0.8413
No log 0.96 24 0.6167 0.4238 0.6167 0.7853
No log 1.04 26 0.5864 0.2779 0.5864 0.7658
No log 1.12 28 0.6436 0.3173 0.6436 0.8023
No log 1.2 30 0.6418 0.3131 0.6418 0.8011
No log 1.28 32 0.5934 0.2992 0.5934 0.7703
No log 1.36 34 0.5688 0.4116 0.5688 0.7542
No log 1.44 36 0.5830 0.4824 0.5830 0.7636
No log 1.52 38 0.5966 0.5341 0.5966 0.7724
No log 1.6 40 0.6120 0.5988 0.6120 0.7823
No log 1.68 42 0.7397 0.4997 0.7397 0.8601
No log 1.76 44 0.8357 0.4903 0.8357 0.9142
No log 1.84 46 0.6673 0.5276 0.6673 0.8169
No log 1.92 48 0.5687 0.6298 0.5687 0.7541
No log 2.0 50 0.5926 0.5283 0.5926 0.7698
No log 2.08 52 0.5631 0.5843 0.5631 0.7504
No log 2.16 54 0.5284 0.5823 0.5284 0.7269
No log 2.24 56 0.5416 0.4992 0.5416 0.7359
No log 2.32 58 0.5768 0.5357 0.5768 0.7595
No log 2.4 60 0.5136 0.5753 0.5136 0.7166
No log 2.48 62 0.5012 0.5455 0.5012 0.7080
No log 2.56 64 0.5747 0.5184 0.5747 0.7581
No log 2.64 66 0.6338 0.5003 0.6338 0.7961
No log 2.72 68 0.5945 0.5754 0.5945 0.7710
No log 2.8 70 0.6044 0.5144 0.6044 0.7774
No log 2.88 72 0.5491 0.5947 0.5491 0.7410
No log 2.96 74 0.6865 0.6169 0.6865 0.8285
No log 3.04 76 0.6218 0.6106 0.6218 0.7886
No log 3.12 78 0.5378 0.6267 0.5378 0.7333
No log 3.2 80 0.5399 0.6438 0.5399 0.7348
No log 3.28 82 0.4921 0.6112 0.4921 0.7015
No log 3.36 84 0.4859 0.5983 0.4859 0.6971
No log 3.44 86 0.4487 0.6458 0.4487 0.6699
No log 3.52 88 0.4619 0.6223 0.4619 0.6796
No log 3.6 90 0.4637 0.6645 0.4637 0.6809
No log 3.68 92 0.4953 0.6025 0.4953 0.7038
No log 3.76 94 0.4762 0.6406 0.4762 0.6901
No log 3.84 96 0.4713 0.6623 0.4713 0.6865
No log 3.92 98 0.5253 0.6105 0.5253 0.7247
No log 4.0 100 0.5302 0.5497 0.5302 0.7282
No log 4.08 102 0.5004 0.6309 0.5004 0.7074
No log 4.16 104 0.4662 0.6337 0.4662 0.6828
No log 4.24 106 0.6295 0.5131 0.6295 0.7934
No log 4.32 108 0.5757 0.5436 0.5757 0.7588
No log 4.4 110 0.4812 0.6232 0.4812 0.6937
No log 4.48 112 0.5237 0.6434 0.5237 0.7236
No log 4.56 114 0.4826 0.6261 0.4826 0.6947
No log 4.64 116 0.4250 0.6185 0.4250 0.6519
No log 4.72 118 0.4185 0.6479 0.4185 0.6470
No log 4.8 120 0.4502 0.6349 0.4502 0.6710
No log 4.88 122 0.4213 0.6762 0.4213 0.6491
No log 4.96 124 0.4144 0.6827 0.4144 0.6438
No log 5.04 126 0.4462 0.6252 0.4462 0.6680
No log 5.12 128 0.5119 0.6032 0.5119 0.7155
No log 5.2 130 0.4998 0.6372 0.4998 0.7070
No log 5.28 132 0.4926 0.6467 0.4926 0.7018
No log 5.36 134 0.4908 0.6467 0.4908 0.7005
No log 5.44 136 0.4932 0.5442 0.4932 0.7023
No log 5.52 138 0.4633 0.5840 0.4633 0.6807
No log 5.6 140 0.4532 0.6418 0.4532 0.6732
No log 5.68 142 0.4769 0.5692 0.4769 0.6906
No log 5.76 144 0.5088 0.5802 0.5088 0.7133
No log 5.84 146 0.4956 0.5859 0.4956 0.7040
No log 5.92 148 0.5062 0.6059 0.5062 0.7115
No log 6.0 150 0.5090 0.6198 0.5090 0.7134
No log 6.08 152 0.4941 0.5574 0.4941 0.7029
No log 6.16 154 0.4841 0.6018 0.4841 0.6958
No log 6.24 156 0.4778 0.5306 0.4778 0.6912
No log 6.32 158 0.4882 0.5563 0.4882 0.6987
No log 6.4 160 0.4838 0.5517 0.4838 0.6955
No log 6.48 162 0.4864 0.5517 0.4864 0.6974
No log 6.56 164 0.4885 0.5379 0.4885 0.6989
No log 6.64 166 0.4925 0.5867 0.4925 0.7018
No log 6.72 168 0.5926 0.4562 0.5926 0.7698
No log 6.8 170 0.6488 0.4328 0.6488 0.8055
No log 6.88 172 0.5242 0.5403 0.5242 0.7240
No log 6.96 174 0.4416 0.5926 0.4416 0.6645
No log 7.04 176 0.4384 0.6575 0.4384 0.6621
No log 7.12 178 0.4439 0.6762 0.4439 0.6662
No log 7.2 180 0.4432 0.6414 0.4432 0.6657
No log 7.28 182 0.5039 0.5327 0.5039 0.7099
No log 7.36 184 0.5171 0.5327 0.5171 0.7191
No log 7.44 186 0.4761 0.5752 0.4761 0.6900
No log 7.52 188 0.4451 0.5902 0.4451 0.6672
No log 7.6 190 0.4453 0.6481 0.4453 0.6673
No log 7.68 192 0.4591 0.6295 0.4591 0.6776
No log 7.76 194 0.5359 0.5758 0.5359 0.7321
No log 7.84 196 0.5279 0.5342 0.5279 0.7266
No log 7.92 198 0.4493 0.6406 0.4493 0.6703
No log 8.0 200 0.4282 0.6636 0.4282 0.6544
No log 8.08 202 0.4398 0.5379 0.4398 0.6632
No log 8.16 204 0.4540 0.5736 0.4540 0.6738
No log 8.24 206 0.5081 0.5362 0.5081 0.7128
No log 8.32 208 0.5242 0.4979 0.5242 0.7240
No log 8.4 210 0.4689 0.5288 0.4689 0.6848
No log 8.48 212 0.4658 0.5304 0.4658 0.6825
No log 8.56 214 0.4823 0.5653 0.4823 0.6945
No log 8.64 216 0.4864 0.6025 0.4864 0.6974
No log 8.72 218 0.4376 0.6608 0.4376 0.6615
No log 8.8 220 0.4104 0.6828 0.4104 0.6406
No log 8.88 222 0.4232 0.6950 0.4232 0.6505
No log 8.96 224 0.4121 0.6923 0.4121 0.6419
No log 9.04 226 0.4183 0.6518 0.4183 0.6468
No log 9.12 228 0.4197 0.6716 0.4197 0.6478
No log 9.2 230 0.4059 0.6819 0.4059 0.6371
No log 9.28 232 0.4029 0.7012 0.4029 0.6347
No log 9.36 234 0.4181 0.6269 0.4181 0.6466
No log 9.44 236 0.4039 0.6364 0.4039 0.6355
No log 9.52 238 0.4166 0.6712 0.4166 0.6455
No log 9.6 240 0.4484 0.6709 0.4484 0.6696
No log 9.68 242 0.4263 0.6904 0.4263 0.6529
No log 9.76 244 0.4316 0.6606 0.4316 0.6570
No log 9.84 246 0.4460 0.6526 0.4460 0.6678
No log 9.92 248 0.4195 0.6073 0.4195 0.6477
No log 10.0 250 0.4376 0.6505 0.4376 0.6615
No log 10.08 252 0.4792 0.5894 0.4792 0.6923
No log 10.16 254 0.5181 0.5998 0.5181 0.7198
No log 10.24 256 0.4798 0.5947 0.4798 0.6927
No log 10.32 258 0.4501 0.5827 0.4501 0.6709
No log 10.4 260 0.4386 0.5979 0.4386 0.6623
No log 10.48 262 0.4360 0.5609 0.4360 0.6603
No log 10.56 264 0.4879 0.5677 0.4879 0.6985
No log 10.64 266 0.5706 0.4892 0.5706 0.7554
No log 10.72 268 0.5855 0.4815 0.5855 0.7652
No log 10.8 270 0.5193 0.5349 0.5193 0.7206
No log 10.88 272 0.4504 0.6076 0.4504 0.6711
No log 10.96 274 0.4621 0.5438 0.4621 0.6798
No log 11.04 276 0.5038 0.6087 0.5038 0.7098
No log 11.12 278 0.4863 0.6181 0.4863 0.6973
No log 11.2 280 0.4684 0.7191 0.4684 0.6844
No log 11.28 282 0.4706 0.6885 0.4706 0.6860
No log 11.36 284 0.4538 0.5961 0.4538 0.6737
No log 11.44 286 0.4388 0.5604 0.4388 0.6624
No log 11.52 288 0.4352 0.5640 0.4352 0.6597
No log 11.6 290 0.4382 0.6517 0.4382 0.6620
No log 11.68 292 0.4340 0.6724 0.4340 0.6588
No log 11.76 294 0.4296 0.6158 0.4296 0.6554
No log 11.84 296 0.4371 0.6158 0.4371 0.6612
No log 11.92 298 0.4684 0.6606 0.4684 0.6844
No log 12.0 300 0.4740 0.6519 0.4740 0.6885
No log 12.08 302 0.4357 0.6158 0.4357 0.6601
No log 12.16 304 0.4259 0.6317 0.4259 0.6526
No log 12.24 306 0.4339 0.6317 0.4339 0.6587
No log 12.32 308 0.4170 0.6852 0.4170 0.6457
No log 12.4 310 0.4195 0.6832 0.4195 0.6477
No log 12.48 312 0.4131 0.7033 0.4131 0.6427
No log 12.56 314 0.4203 0.6140 0.4203 0.6483
No log 12.64 316 0.4425 0.6096 0.4425 0.6652
No log 12.72 318 0.4222 0.5930 0.4222 0.6498
No log 12.8 320 0.4230 0.6228 0.4230 0.6504
No log 12.88 322 0.4386 0.6241 0.4386 0.6622
No log 12.96 324 0.4428 0.6530 0.4428 0.6654
No log 13.04 326 0.4416 0.5854 0.4416 0.6645
No log 13.12 328 0.4544 0.5627 0.4544 0.6741
No log 13.2 330 0.4445 0.5231 0.4445 0.6667
No log 13.28 332 0.4512 0.5868 0.4512 0.6717
No log 13.36 334 0.4751 0.5516 0.4751 0.6893
No log 13.44 336 0.4754 0.5516 0.4754 0.6895
No log 13.52 338 0.4908 0.5091 0.4908 0.7006
No log 13.6 340 0.5075 0.5091 0.5075 0.7124
No log 13.68 342 0.5117 0.5362 0.5117 0.7153
No log 13.76 344 0.4672 0.5698 0.4672 0.6835
No log 13.84 346 0.4327 0.5422 0.4327 0.6578
No log 13.92 348 0.4550 0.5283 0.4550 0.6746
No log 14.0 350 0.4597 0.5212 0.4597 0.6780
No log 14.08 352 0.4406 0.5782 0.4406 0.6638
No log 14.16 354 0.4749 0.5970 0.4749 0.6892
No log 14.24 356 0.5660 0.5388 0.5660 0.7524
No log 14.32 358 0.5551 0.4980 0.5551 0.7450
No log 14.4 360 0.4888 0.5516 0.4888 0.6991
No log 14.48 362 0.4609 0.4660 0.4609 0.6789
No log 14.56 364 0.4535 0.4660 0.4535 0.6734
No log 14.64 366 0.4917 0.5786 0.4917 0.7012
No log 14.72 368 0.5242 0.5388 0.5242 0.7240
No log 14.8 370 0.5172 0.5059 0.5172 0.7192
No log 14.88 372 0.4599 0.5752 0.4599 0.6782
No log 14.96 374 0.4456 0.6278 0.4456 0.6675
No log 15.04 376 0.4484 0.6278 0.4484 0.6696
No log 15.12 378 0.4804 0.5016 0.4804 0.6931
No log 15.2 380 0.5162 0.5086 0.5162 0.7184
No log 15.28 382 0.4825 0.5081 0.4825 0.6947
No log 15.36 384 0.4446 0.6426 0.4446 0.6668
No log 15.44 386 0.4491 0.5993 0.4491 0.6702
No log 15.52 388 0.4494 0.5640 0.4494 0.6703
No log 15.6 390 0.4612 0.6004 0.4612 0.6791
No log 15.68 392 0.5039 0.5081 0.5039 0.7098
No log 15.76 394 0.5028 0.5081 0.5028 0.7091
No log 15.84 396 0.4492 0.6401 0.4492 0.6702
No log 15.92 398 0.4471 0.5890 0.4471 0.6687
No log 16.0 400 0.5089 0.5897 0.5089 0.7134
No log 16.08 402 0.4996 0.6167 0.4996 0.7068
No log 16.16 404 0.4477 0.5985 0.4477 0.6691
No log 16.24 406 0.4456 0.6171 0.4456 0.6675
No log 16.32 408 0.4683 0.6067 0.4683 0.6843
No log 16.4 410 0.4438 0.5784 0.4438 0.6662
No log 16.48 412 0.4626 0.5999 0.4626 0.6801
No log 16.56 414 0.4817 0.5796 0.4817 0.6940
No log 16.64 416 0.4659 0.5633 0.4659 0.6826
No log 16.72 418 0.4861 0.5232 0.4861 0.6972
No log 16.8 420 0.5355 0.4473 0.5355 0.7318
No log 16.88 422 0.5546 0.4979 0.5546 0.7447
No log 16.96 424 0.4985 0.4983 0.4985 0.7060
No log 17.04 426 0.4491 0.5095 0.4491 0.6701
No log 17.12 428 0.4641 0.6068 0.4641 0.6813
No log 17.2 430 0.4691 0.6389 0.4691 0.6849
No log 17.28 432 0.4734 0.6170 0.4734 0.6881
No log 17.36 434 0.5143 0.5109 0.5143 0.7172
No log 17.44 436 0.5724 0.4979 0.5724 0.7566
No log 17.52 438 0.5763 0.4827 0.5763 0.7592
No log 17.6 440 0.5366 0.5177 0.5366 0.7326
No log 17.68 442 0.5073 0.5289 0.5073 0.7123
No log 17.76 444 0.4960 0.5268 0.4960 0.7043
No log 17.84 446 0.5022 0.5890 0.5022 0.7087
No log 17.92 448 0.4992 0.5178 0.4992 0.7065
No log 18.0 450 0.4844 0.4904 0.4844 0.6960
No log 18.08 452 0.4946 0.5697 0.4946 0.7033
No log 18.16 454 0.5044 0.5468 0.5044 0.7102
No log 18.24 456 0.5151 0.5468 0.5151 0.7177
No log 18.32 458 0.4825 0.5104 0.4825 0.6946
No log 18.4 460 0.4574 0.5467 0.4574 0.6763
No log 18.48 462 0.4474 0.5715 0.4474 0.6689
No log 18.56 464 0.4375 0.6542 0.4375 0.6614
No log 18.64 466 0.4344 0.6170 0.4344 0.6591
No log 18.72 468 0.4268 0.6351 0.4268 0.6533
No log 18.8 470 0.4281 0.6351 0.4281 0.6543
No log 18.88 472 0.4267 0.6228 0.4267 0.6532
No log 18.96 474 0.4273 0.5784 0.4273 0.6537
No log 19.04 476 0.4422 0.5786 0.4422 0.6650
No log 19.12 478 0.4772 0.5533 0.4772 0.6908
No log 19.2 480 0.5191 0.4978 0.5191 0.7205
No log 19.28 482 0.5285 0.4664 0.5285 0.7269
No log 19.36 484 0.4837 0.5403 0.4837 0.6955
No log 19.44 486 0.4372 0.6010 0.4372 0.6612
No log 19.52 488 0.4451 0.5826 0.4451 0.6671
No log 19.6 490 0.5063 0.5599 0.5063 0.7115
No log 19.68 492 0.5200 0.5735 0.5200 0.7211
No log 19.76 494 0.4986 0.5528 0.4986 0.7061
No log 19.84 496 0.4793 0.5748 0.4793 0.6923
No log 19.92 498 0.4479 0.6389 0.4479 0.6693
0.26 20.0 500 0.4450 0.6389 0.4450 0.6671
0.26 20.08 502 0.4612 0.5702 0.4612 0.6791
0.26 20.16 504 0.4921 0.5603 0.4921 0.7015
0.26 20.24 506 0.4841 0.5718 0.4841 0.6958
0.26 20.32 508 0.4610 0.5781 0.4610 0.6790
0.26 20.4 510 0.4480 0.6469 0.4480 0.6693
0.26 20.48 512 0.4688 0.6716 0.4688 0.6847
0.26 20.56 514 0.5400 0.5895 0.5400 0.7348
0.26 20.64 516 0.5751 0.5080 0.5751 0.7584
0.26 20.72 518 0.5220 0.5326 0.5220 0.7225
0.26 20.8 520 0.4511 0.5923 0.4511 0.6716
0.26 20.88 522 0.4289 0.6267 0.4289 0.6549
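The result reported at the top (Qwk 0.6267) is simply the final logged checkpoint; the best validation Qwk in the table, 0.7191 at epoch 11.2, occurs much earlier. A small sketch of scanning a log like this for the best checkpoint, using a few (epoch, validation Qwk) pairs copied from the table above:

```python
# A handful of (epoch, validation Qwk) pairs taken from the table above.
log = [
    (8.8,   0.6828),
    (9.28,  0.7012),
    (11.2,  0.7191),
    (12.48, 0.7033),
    (20.88, 0.6267),  # final checkpoint; the result reported at the top
]

# Select the checkpoint with the highest validation Qwk instead of
# simply keeping the last one (load_best_model_at_end-style selection).
best_epoch, best_qwk = max(log, key=lambda pair: pair[1])
print(best_epoch, best_qwk)  # -> 11.2 0.7191
```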

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k5_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02