ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card's dataset field was left empty). It achieves the following results on the evaluation set:

  • Loss: 0.4791
  • Qwk: 0.4182
  • Mse: 0.4791
  • Rmse: 0.6921
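
The reported Loss and Mse coincide because mean squared error serves as the validation loss, and Rmse is simply its square root. A quick sanity check on the rounded figures above:

```python
import math

# The card's Rmse should be the square root of its Mse.
# Both values are the rounded evaluation figures quoted above,
# so the check uses a small tolerance rather than exact equality.
mse = 0.4791
rmse = math.sqrt(mse)
assert abs(rmse - 0.6921) < 1e-3
```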

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
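
The list above can be sketched as a configuration dict. Field names are modeled on Hugging Face's TrainingArguments, but this is a hypothetical reconstruction: the actual training script is not part of this card. The final two lines derive an estimate of the training-set size from the log table (2 steps per 0.0667 epochs, i.e. about 30 steps per epoch).

```python
# Hypothetical reconstruction of the hyperparameters listed above;
# names mirror Hugging Face TrainingArguments fields, but the real
# training script is not included in this model card.
training_config = {
    "learning_rate": 2e-05,
    "train_batch_size": 8,
    "eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}

# Derived estimate: the log table advances 2 steps per 0.0667 epochs,
# i.e. ~30 optimizer steps per epoch; at batch size 8 that suggests
# roughly 240 training examples.
steps_per_epoch = round(2 / 0.0667)
approx_train_examples = steps_per_epoch * training_config["train_batch_size"]
```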

Training results

The training loss is reported as "No log" until it is first logged at step 500. Although num_epochs was set to 100, the logged results end at epoch 20.8 (step 624).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0667 2 2.3694 -0.0449 2.3694 1.5393
No log 0.1333 4 1.0954 0.1573 1.0954 1.0466
No log 0.2 6 0.8392 0.0937 0.8392 0.9161
No log 0.2667 8 0.7724 0.2615 0.7724 0.8789
No log 0.3333 10 0.6134 0.2413 0.6134 0.7832
No log 0.4 12 0.9111 0.3560 0.9111 0.9545
No log 0.4667 14 0.9461 0.3719 0.9461 0.9727
No log 0.5333 16 0.5647 0.4496 0.5647 0.7515
No log 0.6 18 0.5349 0.4173 0.5349 0.7314
No log 0.6667 20 0.5181 0.5386 0.5181 0.7198
No log 0.7333 22 0.6932 0.3347 0.6932 0.8326
No log 0.8 24 0.8813 0.3948 0.8813 0.9388
No log 0.8667 26 0.6673 0.4361 0.6673 0.8169
No log 0.9333 28 0.5124 0.5715 0.5124 0.7158
No log 1.0 30 0.6703 0.4089 0.6703 0.8187
No log 1.0667 32 0.5640 0.5228 0.5640 0.7510
No log 1.1333 34 0.5569 0.4997 0.5569 0.7463
No log 1.2 36 0.6599 0.4725 0.6599 0.8123
No log 1.2667 38 0.5680 0.4795 0.5680 0.7537
No log 1.3333 40 0.5117 0.4740 0.5117 0.7154
No log 1.4 42 0.4813 0.5340 0.4813 0.6937
No log 1.4667 44 0.4773 0.5875 0.4773 0.6908
No log 1.5333 46 0.4825 0.5781 0.4825 0.6946
No log 1.6 48 0.4947 0.5918 0.4947 0.7033
No log 1.6667 50 0.5605 0.5112 0.5605 0.7487
No log 1.7333 52 0.6102 0.5200 0.6102 0.7812
No log 1.8 54 0.5748 0.4825 0.5748 0.7582
No log 1.8667 56 0.4942 0.5205 0.4942 0.7030
No log 1.9333 58 0.5340 0.6080 0.5340 0.7308
No log 2.0 60 0.5313 0.6355 0.5313 0.7289
No log 2.0667 62 0.5638 0.5656 0.5638 0.7509
No log 2.1333 64 0.6020 0.5236 0.6020 0.7759
No log 2.2 66 0.4982 0.6289 0.4982 0.7058
No log 2.2667 68 0.4989 0.6741 0.4989 0.7063
No log 2.3333 70 0.5043 0.6279 0.5043 0.7102
No log 2.4 72 0.7486 0.3563 0.7486 0.8652
No log 2.4667 74 0.7718 0.3786 0.7718 0.8785
No log 2.5333 76 0.5871 0.4778 0.5871 0.7663
No log 2.6 78 0.4945 0.6574 0.4945 0.7032
No log 2.6667 80 0.5576 0.5714 0.5576 0.7467
No log 2.7333 82 0.8354 0.4151 0.8354 0.9140
No log 2.8 84 0.7406 0.4601 0.7406 0.8606
No log 2.8667 86 0.5545 0.6146 0.5545 0.7446
No log 2.9333 88 0.5429 0.5919 0.5429 0.7368
No log 3.0 90 0.5814 0.5664 0.5814 0.7625
No log 3.0667 92 0.4995 0.5868 0.4995 0.7067
No log 3.1333 94 0.4851 0.6409 0.4851 0.6965
No log 3.2 96 0.5004 0.6011 0.5004 0.7074
No log 3.2667 98 0.4940 0.6235 0.4940 0.7029
No log 3.3333 100 0.4933 0.6130 0.4933 0.7024
No log 3.4 102 0.5175 0.5920 0.5175 0.7194
No log 3.4667 104 0.5484 0.6010 0.5484 0.7405
No log 3.5333 106 0.4823 0.6738 0.4823 0.6945
No log 3.6 108 0.5063 0.6616 0.5063 0.7116
No log 3.6667 110 0.4660 0.6914 0.4660 0.6826
No log 3.7333 112 0.4808 0.6419 0.4808 0.6934
No log 3.8 114 0.4877 0.5779 0.4877 0.6984
No log 3.8667 116 0.4637 0.5841 0.4637 0.6809
No log 3.9333 118 0.4185 0.6566 0.4185 0.6470
No log 4.0 120 0.4056 0.6830 0.4056 0.6368
No log 4.0667 122 0.4238 0.5904 0.4238 0.6510
No log 4.1333 124 0.4433 0.5811 0.4433 0.6658
No log 4.2 126 0.3910 0.6946 0.3910 0.6253
No log 4.2667 128 0.4040 0.6863 0.4040 0.6356
No log 4.3333 130 0.4627 0.6010 0.4627 0.6802
No log 4.4 132 0.4869 0.6158 0.4869 0.6977
No log 4.4667 134 0.4471 0.7213 0.4471 0.6686
No log 4.5333 136 0.4906 0.6529 0.4906 0.7004
No log 4.6 138 0.4548 0.6703 0.4548 0.6744
No log 4.6667 140 0.4691 0.6287 0.4691 0.6849
No log 4.7333 142 0.4682 0.6111 0.4682 0.6843
No log 4.8 144 0.4299 0.7198 0.4299 0.6557
No log 4.8667 146 0.4500 0.6282 0.4500 0.6708
No log 4.9333 148 0.4640 0.5761 0.4640 0.6812
No log 5.0 150 0.4211 0.6371 0.4211 0.6489
No log 5.0667 152 0.4345 0.6431 0.4345 0.6591
No log 5.1333 154 0.4359 0.6235 0.4359 0.6602
No log 5.2 156 0.4303 0.7041 0.4303 0.6559
No log 5.2667 158 0.4384 0.6040 0.4384 0.6621
No log 5.3333 160 0.4395 0.6563 0.4395 0.6630
No log 5.4 162 0.4691 0.6526 0.4691 0.6849
No log 5.4667 164 0.6065 0.5464 0.6065 0.7788
No log 5.5333 166 0.5763 0.6042 0.5763 0.7591
No log 5.6 168 0.4525 0.6593 0.4525 0.6727
No log 5.6667 170 0.4830 0.6401 0.4830 0.6950
No log 5.7333 172 0.4934 0.6498 0.4934 0.7025
No log 5.8 174 0.4868 0.6498 0.4868 0.6977
No log 5.8667 176 0.4589 0.6120 0.4589 0.6774
No log 5.9333 178 0.4722 0.6259 0.4722 0.6872
No log 6.0 180 0.4571 0.6034 0.4571 0.6761
No log 6.0667 182 0.4646 0.6034 0.4646 0.6816
No log 6.1333 184 0.4604 0.6034 0.4604 0.6786
No log 6.2 186 0.4382 0.6184 0.4382 0.6620
No log 6.2667 188 0.4425 0.6645 0.4425 0.6652
No log 6.3333 190 0.4509 0.5479 0.4509 0.6715
No log 6.4 192 0.4541 0.5479 0.4541 0.6739
No log 6.4667 194 0.4698 0.5168 0.4698 0.6855
No log 6.5333 196 0.4602 0.5357 0.4602 0.6784
No log 6.6 198 0.4763 0.5388 0.4763 0.6901
No log 6.6667 200 0.4695 0.5421 0.4695 0.6852
No log 6.7333 202 0.5336 0.5438 0.5336 0.7305
No log 6.8 204 0.5244 0.5627 0.5244 0.7242
No log 6.8667 206 0.4861 0.4719 0.4861 0.6972
No log 6.9333 208 0.5055 0.4743 0.5055 0.7110
No log 7.0 210 0.4861 0.4856 0.4861 0.6972
No log 7.0667 212 0.5033 0.5929 0.5033 0.7095
No log 7.1333 214 0.5097 0.5966 0.5097 0.7139
No log 7.2 216 0.4995 0.5634 0.4995 0.7068
No log 7.2667 218 0.4995 0.5634 0.4995 0.7068
No log 7.3333 220 0.4939 0.5201 0.4939 0.7028
No log 7.4 222 0.5009 0.4934 0.5009 0.7077
No log 7.4667 224 0.5288 0.4534 0.5288 0.7272
No log 7.5333 226 0.5149 0.5098 0.5149 0.7176
No log 7.6 228 0.4812 0.4973 0.4812 0.6937
No log 7.6667 230 0.5536 0.5935 0.5536 0.7441
No log 7.7333 232 0.5581 0.5935 0.5581 0.7471
No log 7.8 234 0.4773 0.6184 0.4773 0.6909
No log 7.8667 236 0.4661 0.6486 0.4661 0.6827
No log 7.9333 238 0.4453 0.6472 0.4453 0.6673
No log 8.0 240 0.4453 0.6515 0.4453 0.6673
No log 8.0667 242 0.5037 0.6104 0.5037 0.7097
No log 8.1333 244 0.5569 0.5935 0.5569 0.7462
No log 8.2 246 0.4952 0.6457 0.4952 0.7037
No log 8.2667 248 0.4549 0.5816 0.4549 0.6744
No log 8.3333 250 0.5487 0.5645 0.5487 0.7407
No log 8.4 252 0.5784 0.5815 0.5784 0.7605
No log 8.4667 254 0.5168 0.5950 0.5168 0.7189
No log 8.5333 256 0.4758 0.5505 0.4758 0.6898
No log 8.6 258 0.4702 0.5421 0.4702 0.6857
No log 8.6667 260 0.4795 0.5723 0.4795 0.6925
No log 8.7333 262 0.5166 0.5560 0.5166 0.7188
No log 8.8 264 0.5195 0.5568 0.5195 0.7207
No log 8.8667 266 0.4872 0.5958 0.4872 0.6980
No log 8.9333 268 0.4656 0.6146 0.4656 0.6824
No log 9.0 270 0.5132 0.6468 0.5132 0.7164
No log 9.0667 272 0.5116 0.5706 0.5116 0.7153
No log 9.1333 274 0.5030 0.5569 0.5030 0.7092
No log 9.2 276 0.4655 0.5634 0.4655 0.6822
No log 9.2667 278 0.4933 0.5428 0.4933 0.7023
No log 9.3333 280 0.5655 0.5624 0.5655 0.7520
No log 9.4 282 0.5369 0.5712 0.5369 0.7328
No log 9.4667 284 0.4695 0.5866 0.4695 0.6852
No log 9.5333 286 0.4749 0.5953 0.4749 0.6891
No log 9.6 288 0.4747 0.6146 0.4747 0.6890
No log 9.6667 290 0.5110 0.5721 0.5110 0.7149
No log 9.7333 292 0.5082 0.5721 0.5082 0.7128
No log 9.8 294 0.4593 0.5669 0.4593 0.6777
No log 9.8667 296 0.5311 0.5468 0.5311 0.7287
No log 9.9333 298 0.6339 0.4860 0.6339 0.7962
No log 10.0 300 0.5840 0.5042 0.5840 0.7642
No log 10.0667 302 0.4718 0.5923 0.4718 0.6869
No log 10.1333 304 0.4543 0.5840 0.4543 0.6740
No log 10.2 306 0.4646 0.6047 0.4646 0.6816
No log 10.2667 308 0.4444 0.5634 0.4444 0.6666
No log 10.3333 310 0.4700 0.5801 0.4700 0.6856
No log 10.4 312 0.4991 0.5485 0.4991 0.7064
No log 10.4667 314 0.4646 0.5882 0.4646 0.6816
No log 10.5333 316 0.4394 0.6053 0.4394 0.6629
No log 10.6 318 0.4519 0.5702 0.4519 0.6722
No log 10.6667 320 0.4376 0.5904 0.4376 0.6615
No log 10.7333 322 0.4388 0.6389 0.4388 0.6624
No log 10.8 324 0.4716 0.5584 0.4716 0.6867
No log 10.8667 326 0.4854 0.5584 0.4854 0.6967
No log 10.9333 328 0.4462 0.5904 0.4462 0.6680
No log 11.0 330 0.4384 0.6344 0.4384 0.6621
No log 11.0667 332 0.4415 0.6615 0.4415 0.6645
No log 11.1333 334 0.4460 0.6615 0.4460 0.6678
No log 11.2 336 0.4602 0.5614 0.4602 0.6784
No log 11.2667 338 0.4606 0.5698 0.4606 0.6787
No log 11.3333 340 0.4488 0.6403 0.4488 0.6699
No log 11.4 342 0.4385 0.5930 0.4385 0.6622
No log 11.4667 344 0.4468 0.5929 0.4468 0.6684
No log 11.5333 346 0.4918 0.6292 0.4918 0.7013
No log 11.6 348 0.4770 0.6333 0.4770 0.6907
No log 11.6667 350 0.4558 0.6716 0.4558 0.6751
No log 11.7333 352 0.5182 0.5787 0.5182 0.7199
No log 11.8 354 0.5157 0.5787 0.5157 0.7181
No log 11.8667 356 0.4726 0.6411 0.4726 0.6874
No log 11.9333 358 0.4535 0.5797 0.4535 0.6734
No log 12.0 360 0.5067 0.5455 0.5067 0.7118
No log 12.0667 362 0.5496 0.6023 0.5496 0.7414
No log 12.1333 364 0.5058 0.5567 0.5058 0.7112
No log 12.2 366 0.4535 0.6150 0.4535 0.6734
No log 12.2667 368 0.5204 0.6016 0.5204 0.7214
No log 12.3333 370 0.5390 0.5763 0.5390 0.7342
No log 12.4 372 0.4794 0.5934 0.4794 0.6924
No log 12.4667 374 0.4347 0.6111 0.4347 0.6593
No log 12.5333 376 0.4479 0.5714 0.4479 0.6692
No log 12.6 378 0.4727 0.5135 0.4727 0.6875
No log 12.6667 380 0.4527 0.5781 0.4527 0.6728
No log 12.7333 382 0.4313 0.6215 0.4313 0.6567
No log 12.8 384 0.4796 0.5978 0.4796 0.6925
No log 12.8667 386 0.4956 0.5875 0.4956 0.7040
No log 12.9333 388 0.4469 0.6709 0.4469 0.6685
No log 13.0 390 0.4573 0.6431 0.4573 0.6762
No log 13.0667 392 0.5377 0.6129 0.5377 0.7333
No log 13.1333 394 0.5769 0.5982 0.5769 0.7595
No log 13.2 396 0.5332 0.6129 0.5332 0.7302
No log 13.2667 398 0.4714 0.6169 0.4714 0.6866
No log 13.3333 400 0.4283 0.6114 0.4283 0.6545
No log 13.4 402 0.4208 0.6736 0.4208 0.6487
No log 13.4667 404 0.4319 0.6820 0.4319 0.6572
No log 13.5333 406 0.4243 0.6931 0.4243 0.6514
No log 13.6 408 0.4177 0.6667 0.4177 0.6463
No log 13.6667 410 0.4740 0.5161 0.4740 0.6885
No log 13.7333 412 0.5182 0.5808 0.5182 0.7199
No log 13.8 414 0.4814 0.5230 0.4814 0.6938
No log 13.8667 416 0.4267 0.5765 0.4267 0.6532
No log 13.9333 418 0.4411 0.6503 0.4411 0.6641
No log 14.0 420 0.4895 0.5597 0.4895 0.6996
No log 14.0667 422 0.5329 0.5310 0.5329 0.7300
No log 14.1333 424 0.5081 0.5349 0.5081 0.7128
No log 14.2 426 0.4596 0.6082 0.4596 0.6780
No log 14.2667 428 0.4347 0.6448 0.4347 0.6593
No log 14.3333 430 0.4556 0.5642 0.4556 0.6750
No log 14.4 432 0.4797 0.5983 0.4797 0.6926
No log 14.4667 434 0.4608 0.6349 0.4608 0.6788
No log 14.5333 436 0.4667 0.6480 0.4667 0.6831
No log 14.6 438 0.4632 0.6482 0.4632 0.6806
No log 14.6667 440 0.4488 0.6469 0.4488 0.6699
No log 14.7333 442 0.4391 0.6359 0.4391 0.6627
No log 14.8 444 0.4500 0.6018 0.4500 0.6708
No log 14.8667 446 0.4775 0.5692 0.4775 0.6910
No log 14.9333 448 0.4918 0.5468 0.4918 0.7013
No log 15.0 450 0.4651 0.5692 0.4651 0.6820
No log 15.0667 452 0.4347 0.6241 0.4347 0.6593
No log 15.1333 454 0.4252 0.6277 0.4252 0.6521
No log 15.2 456 0.4129 0.6479 0.4129 0.6426
No log 15.2667 458 0.4290 0.6305 0.4290 0.6550
No log 15.3333 460 0.4551 0.6361 0.4551 0.6746
No log 15.4 462 0.4561 0.6181 0.4561 0.6753
No log 15.4667 464 0.4258 0.6210 0.4258 0.6525
No log 15.5333 466 0.4463 0.5705 0.4463 0.6681
No log 15.6 468 0.5006 0.6210 0.5006 0.7075
No log 15.6667 470 0.4863 0.6188 0.4863 0.6973
No log 15.7333 472 0.4419 0.5889 0.4419 0.6648
No log 15.8 474 0.4257 0.6839 0.4257 0.6525
No log 15.8667 476 0.4213 0.5915 0.4213 0.6490
No log 15.9333 478 0.4207 0.5915 0.4207 0.6486
No log 16.0 480 0.4184 0.5915 0.4184 0.6468
No log 16.0667 482 0.4170 0.5915 0.4170 0.6457
No log 16.1333 484 0.4122 0.6383 0.4122 0.6420
No log 16.2 486 0.4421 0.6141 0.4421 0.6649
No log 16.2667 488 0.4441 0.6452 0.4441 0.6664
No log 16.3333 490 0.4195 0.6365 0.4195 0.6477
No log 16.4 492 0.4112 0.6365 0.4112 0.6412
No log 16.4667 494 0.4195 0.6887 0.4195 0.6477
No log 16.5333 496 0.4345 0.6171 0.4345 0.6592
No log 16.6 498 0.4365 0.6278 0.4365 0.6607
0.2522 16.6667 500 0.4329 0.5665 0.4329 0.6579
0.2522 16.7333 502 0.4249 0.5765 0.4249 0.6519
0.2522 16.8 504 0.4214 0.6032 0.4214 0.6491
0.2522 16.8667 506 0.4102 0.5719 0.4102 0.6405
0.2522 16.9333 508 0.4035 0.6140 0.4035 0.6352
0.2522 17.0 510 0.4017 0.6053 0.4017 0.6338
0.2522 17.0667 512 0.4007 0.6053 0.4007 0.6330
0.2522 17.1333 514 0.4013 0.6101 0.4013 0.6335
0.2522 17.2 516 0.4063 0.6305 0.4063 0.6374
0.2522 17.2667 518 0.4023 0.6305 0.4023 0.6343
0.2522 17.3333 520 0.3955 0.5831 0.3955 0.6289
0.2522 17.4 522 0.4062 0.7011 0.4062 0.6373
0.2522 17.4667 524 0.4231 0.6887 0.4231 0.6504
0.2522 17.5333 526 0.4175 0.6518 0.4175 0.6462
0.2522 17.6 528 0.4132 0.6620 0.4132 0.6428
0.2522 17.6667 530 0.4117 0.6129 0.4117 0.6417
0.2522 17.7333 532 0.4262 0.5904 0.4262 0.6529
0.2522 17.8 534 0.4708 0.6181 0.4708 0.6862
0.2522 17.8667 536 0.4629 0.6047 0.4629 0.6804
0.2522 17.9333 538 0.4471 0.6020 0.4471 0.6687
0.2522 18.0 540 0.4455 0.6020 0.4455 0.6675
0.2522 18.0667 542 0.4374 0.5915 0.4374 0.6613
0.2522 18.1333 544 0.4347 0.6339 0.4347 0.6594
0.2522 18.2 546 0.4420 0.6201 0.4420 0.6648
0.2522 18.2667 548 0.4458 0.6201 0.4458 0.6677
0.2522 18.3333 550 0.4318 0.6317 0.4318 0.6571
0.2522 18.4 552 0.4263 0.6007 0.4263 0.6529
0.2522 18.4667 554 0.4359 0.6020 0.4359 0.6602
0.2522 18.5333 556 0.4370 0.6034 0.4370 0.6611
0.2522 18.6 558 0.4228 0.6101 0.4228 0.6502
0.2522 18.6667 560 0.4359 0.6096 0.4359 0.6602
0.2522 18.7333 562 0.4888 0.6150 0.4888 0.6991
0.2522 18.8 564 0.4949 0.6032 0.4949 0.7035
0.2522 18.8667 566 0.4657 0.5617 0.4657 0.6824
0.2522 18.9333 568 0.4286 0.6096 0.4286 0.6547
0.2522 19.0 570 0.4176 0.5665 0.4176 0.6463
0.2522 19.0667 572 0.4199 0.5665 0.4199 0.6480
0.2522 19.1333 574 0.4336 0.6200 0.4336 0.6585
0.2522 19.2 576 0.4486 0.6490 0.4486 0.6698
0.2522 19.2667 578 0.4523 0.6395 0.4523 0.6725
0.2522 19.3333 580 0.4546 0.6401 0.4546 0.6742
0.2522 19.4 582 0.4269 0.6712 0.4269 0.6533
0.2522 19.4667 584 0.4077 0.5765 0.4077 0.6385
0.2522 19.5333 586 0.4122 0.5539 0.4122 0.6420
0.2522 19.6 588 0.4180 0.5305 0.4180 0.6466
0.2522 19.6667 590 0.4233 0.5305 0.4233 0.6506
0.2522 19.7333 592 0.4292 0.6307 0.4292 0.6551
0.2522 19.8 594 0.4619 0.6029 0.4619 0.6797
0.2522 19.8667 596 0.4848 0.5349 0.4848 0.6963
0.2522 19.9333 598 0.4742 0.5597 0.4742 0.6886
0.2522 20.0 600 0.4506 0.6076 0.4506 0.6712
0.2522 20.0667 602 0.4405 0.5765 0.4405 0.6637
0.2522 20.1333 604 0.4494 0.4869 0.4494 0.6704
0.2522 20.2 606 0.4491 0.5339 0.4491 0.6701
0.2522 20.2667 608 0.4410 0.5538 0.4410 0.6641
0.2522 20.3333 610 0.4382 0.5915 0.4382 0.6620
0.2522 20.4 612 0.4375 0.5681 0.4375 0.6615
0.2522 20.4667 614 0.4425 0.5765 0.4425 0.6652
0.2522 20.5333 616 0.4686 0.5214 0.4686 0.6845
0.2522 20.6 618 0.5070 0.4997 0.5070 0.7120
0.2522 20.6667 620 0.5395 0.4997 0.5395 0.7345
0.2522 20.7333 622 0.5229 0.4925 0.5229 0.7231
0.2522 20.8 624 0.4791 0.4182 0.4791 0.6921
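
The evaluation script is not included in this card, but the Qwk column (quadratic weighted kappa) can be sketched in pure Python for integer ordinal labels. This is an illustrative implementation, not the author's code:

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, for integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms for the expected (chance-agreement) matrix.
    hist_true = [y_true.count(i) for i in range(n_classes)]
    hist_pred = [y_pred.count(i) for i in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            # Quadratic disagreement weight: 0 on the diagonal,
            # growing with the squared distance between labels.
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

# Perfect agreement yields kappa = 1.0.
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))  # -> 1.0
```

An equivalent result is available from `sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights="quadratic")`.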

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree

  • Full model ID: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k6_task7_organization
  • Fine-tuned from: aubmindlab/bert-base-arabertv02