ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset field was left unfilled by the training script). It achieves the following results on the evaluation set (a minimal loading/inference sketch follows the list):

  • Loss: 0.4863
  • Qwk (quadratic weighted kappa): 0.5166
  • Mse (mean squared error): 0.4863
  • Rmse (root mean squared error): 0.6973
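The card does not include usage code, so the following is a minimal sketch of how the checkpoint could be loaded for inference. It assumes the model carries a single-output (regression-style) sequence-classification head, which is consistent with the reported Loss equalling the MSE; check `config.num_labels` on the actual checkpoint before relying on this.

```python
# Minimal sketch, assuming a single-output regression head for essay
# organization scoring. The repo id below is the one this card is published under.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic essay (or essay segment) to score
inputs = tokenizer(text, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels)

# Assumes num_labels == 1 (regression); adapt if the head is multi-class.
score = logits.squeeze().item()
print(f"Predicted organization score: {score:.3f}")
```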

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
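
The original training script is not included in the card; the sketch below shows how these hyperparameters map onto `transformers.TrainingArguments`. The `output_dir`, dataset names, and the evaluation/logging cadence are assumptions: the log table below evaluates every 2 steps and reports "No log" for training loss until step 500, which suggests `eval_steps=2` and `logging_steps=500`. The Adam betas and epsilon listed above match the Trainer defaults, so they are not set explicitly.

```python
# Sketch only: not the original training script. output_dir, datasets, and
# compute_metrics are placeholders.
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="./arabert_task7_organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,        # inferred from the evaluation log below
    logging_steps=500,   # training loss first appears at step 500 ("No log" before that)
)

# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_dataset,   # placeholder
#     eval_dataset=eval_dataset,     # placeholder
#     compute_metrics=compute_metrics,
# )
# trainer.train()
```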

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0267 2 2.6891 -0.0262 2.6891 1.6399
No log 0.0533 4 1.9603 0.0987 1.9603 1.4001
No log 0.08 6 1.0422 0.1271 1.0422 1.0209
No log 0.1067 8 0.8651 -0.0483 0.8651 0.9301
No log 0.1333 10 0.9843 -0.0079 0.9843 0.9921
No log 0.16 12 0.9665 0.0208 0.9665 0.9831
No log 0.1867 14 0.8735 0.2527 0.8735 0.9346
No log 0.2133 16 0.8319 0.2345 0.8319 0.9121
No log 0.24 18 0.8274 0.2171 0.8274 0.9096
No log 0.2667 20 0.8108 0.4624 0.8108 0.9004
No log 0.2933 22 0.7509 0.3972 0.7509 0.8666
No log 0.32 24 0.7332 0.3312 0.7332 0.8563
No log 0.3467 26 0.7208 0.2027 0.7208 0.8490
No log 0.3733 28 0.8265 -0.0099 0.8265 0.9091
No log 0.4 30 0.6655 0.2270 0.6655 0.8158
No log 0.4267 32 0.5795 0.2751 0.5795 0.7613
No log 0.4533 34 0.5708 0.2751 0.5708 0.7555
No log 0.48 36 0.5362 0.3092 0.5362 0.7323
No log 0.5067 38 0.6120 0.4930 0.6120 0.7823
No log 0.5333 40 0.7870 0.4702 0.7870 0.8871
No log 0.56 42 0.9655 0.3706 0.9655 0.9826
No log 0.5867 44 0.7912 0.4209 0.7912 0.8895
No log 0.6133 46 0.5007 0.6113 0.5007 0.7076
No log 0.64 48 0.5049 0.5657 0.5049 0.7106
No log 0.6667 50 0.6485 0.5093 0.6485 0.8053
No log 0.6933 52 0.6583 0.4687 0.6583 0.8114
No log 0.72 54 0.5484 0.4713 0.5484 0.7405
No log 0.7467 56 0.5590 0.4354 0.5590 0.7477
No log 0.7733 58 0.6996 0.4307 0.6996 0.8364
No log 0.8 60 0.7437 0.4328 0.7437 0.8624
No log 0.8267 62 0.6392 0.4030 0.6392 0.7995
No log 0.8533 64 0.4864 0.4983 0.4864 0.6974
No log 0.88 66 0.4858 0.4838 0.4858 0.6970
No log 0.9067 68 0.5019 0.4838 0.5019 0.7084
No log 0.9333 70 0.5000 0.4052 0.5000 0.7071
No log 0.96 72 0.5463 0.5131 0.5463 0.7391
No log 0.9867 74 0.6580 0.5342 0.6580 0.8112
No log 1.0133 76 0.6283 0.5294 0.6283 0.7927
No log 1.04 78 0.5385 0.5131 0.5385 0.7338
No log 1.0667 80 0.4826 0.5254 0.4826 0.6947
No log 1.0933 82 0.4665 0.6411 0.4665 0.6830
No log 1.12 84 0.4876 0.6441 0.4876 0.6983
No log 1.1467 86 0.5176 0.6430 0.5176 0.7194
No log 1.1733 88 0.5433 0.6499 0.5433 0.7371
No log 1.2 90 0.5803 0.6683 0.5803 0.7618
No log 1.2267 92 0.6653 0.5655 0.6653 0.8157
No log 1.2533 94 0.8024 0.4851 0.8024 0.8957
No log 1.28 96 0.8181 0.4548 0.8181 0.9045
No log 1.3067 98 0.7155 0.5372 0.7155 0.8459
No log 1.3333 100 0.6336 0.5742 0.6336 0.7960
No log 1.3600 102 0.5543 0.6060 0.5543 0.7445
No log 1.3867 104 0.5920 0.6075 0.5920 0.7694
No log 1.4133 106 0.6968 0.5747 0.6968 0.8347
No log 1.44 108 0.6611 0.5579 0.6611 0.8131
No log 1.4667 110 0.5488 0.5898 0.5488 0.7408
No log 1.4933 112 0.4963 0.6241 0.4963 0.7045
No log 1.52 114 0.4648 0.6339 0.4648 0.6818
No log 1.5467 116 0.4903 0.6141 0.4903 0.7002
No log 1.5733 118 0.4991 0.6282 0.4991 0.7065
No log 1.6 120 0.4762 0.6443 0.4762 0.6901
No log 1.6267 122 0.4793 0.6458 0.4793 0.6923
No log 1.6533 124 0.5093 0.5884 0.5093 0.7136
No log 1.6800 126 0.5132 0.5568 0.5132 0.7164
No log 1.7067 128 0.5545 0.5179 0.5545 0.7447
No log 1.7333 130 0.5678 0.5388 0.5678 0.7535
No log 1.76 132 0.5799 0.5839 0.5799 0.7615
No log 1.7867 134 0.6042 0.5518 0.6042 0.7773
No log 1.8133 136 0.6802 0.5903 0.6802 0.8248
No log 1.8400 138 0.6655 0.5903 0.6655 0.8158
No log 1.8667 140 0.5588 0.5810 0.5588 0.7476
No log 1.8933 142 0.4953 0.6505 0.4953 0.7038
No log 1.92 144 0.5054 0.5754 0.5054 0.7109
No log 1.9467 146 0.4865 0.6185 0.4865 0.6975
No log 1.9733 148 0.6066 0.5382 0.6066 0.7788
No log 2.0 150 0.7646 0.4831 0.7646 0.8744
No log 2.0267 152 0.6587 0.4979 0.6587 0.8116
No log 2.0533 154 0.5019 0.6063 0.5019 0.7084
No log 2.08 156 0.4758 0.6145 0.4758 0.6898
No log 2.1067 158 0.4823 0.6239 0.4823 0.6945
No log 2.1333 160 0.5159 0.6124 0.5159 0.7183
No log 2.16 162 0.5473 0.5644 0.5473 0.7398
No log 2.1867 164 0.5227 0.6016 0.5227 0.7230
No log 2.2133 166 0.4904 0.6199 0.4904 0.7003
No log 2.24 168 0.4690 0.5742 0.4690 0.6848
No log 2.2667 170 0.4630 0.6365 0.4630 0.6804
No log 2.2933 172 0.4947 0.5799 0.4947 0.7034
No log 2.32 174 0.5929 0.5947 0.5929 0.7700
No log 2.3467 176 0.5905 0.5813 0.5905 0.7684
No log 2.3733 178 0.5360 0.5511 0.5360 0.7321
No log 2.4 180 0.4778 0.6021 0.4778 0.6912
No log 2.4267 182 0.4802 0.6827 0.4802 0.6929
No log 2.4533 184 0.5215 0.6187 0.5215 0.7221
No log 2.48 186 0.6997 0.4588 0.6997 0.8365
No log 2.5067 188 0.8259 0.4321 0.8259 0.9088
No log 2.5333 190 0.9421 0.4657 0.9421 0.9706
No log 2.56 192 0.8685 0.4427 0.8685 0.9319
No log 2.5867 194 0.7315 0.4687 0.7315 0.8553
No log 2.6133 196 0.5836 0.4036 0.5836 0.7640
No log 2.64 198 0.5192 0.4352 0.5192 0.7206
No log 2.6667 200 0.4763 0.5569 0.4763 0.6901
No log 2.6933 202 0.4885 0.5884 0.4885 0.6989
No log 2.7200 204 0.5813 0.5392 0.5813 0.7624
No log 2.7467 206 0.6155 0.5664 0.6155 0.7845
No log 2.7733 208 0.5905 0.5361 0.5905 0.7684
No log 2.8 210 0.5109 0.5935 0.5109 0.7148
No log 2.8267 212 0.4576 0.5939 0.4576 0.6764
No log 2.8533 214 0.4567 0.5912 0.4567 0.6758
No log 2.88 216 0.4786 0.5577 0.4786 0.6918
No log 2.9067 218 0.5312 0.5497 0.5312 0.7288
No log 2.9333 220 0.6459 0.5460 0.6459 0.8037
No log 2.96 222 0.7384 0.5481 0.7384 0.8593
No log 2.9867 224 0.6496 0.5152 0.6496 0.8060
No log 3.0133 226 0.5770 0.4904 0.5770 0.7596
No log 3.04 228 0.5088 0.6082 0.5088 0.7133
No log 3.0667 230 0.5071 0.4962 0.5071 0.7121
No log 3.0933 232 0.5594 0.4602 0.5594 0.7479
No log 3.12 234 0.7298 0.4784 0.7298 0.8543
No log 3.1467 236 0.7831 0.4533 0.7831 0.8849
No log 3.1733 238 0.6298 0.5008 0.6298 0.7936
No log 3.2 240 0.5233 0.5223 0.5233 0.7234
No log 3.2267 242 0.4843 0.5951 0.4843 0.6959
No log 3.2533 244 0.4879 0.6173 0.4879 0.6985
No log 3.2800 246 0.5193 0.6025 0.5193 0.7206
No log 3.3067 248 0.5696 0.4916 0.5696 0.7547
No log 3.3333 250 0.6272 0.5179 0.6272 0.7920
No log 3.36 252 0.7808 0.5002 0.7808 0.8836
No log 3.3867 254 0.8536 0.4442 0.8536 0.9239
No log 3.4133 256 0.7518 0.4779 0.7518 0.8670
No log 3.44 258 0.5946 0.5293 0.5946 0.7711
No log 3.4667 260 0.5620 0.6038 0.5620 0.7497
No log 3.4933 262 0.5511 0.6404 0.5511 0.7424
No log 3.52 264 0.5696 0.5658 0.5696 0.7547
No log 3.5467 266 0.6460 0.5167 0.6460 0.8038
No log 3.5733 268 0.7539 0.4716 0.7539 0.8683
No log 3.6 270 0.7422 0.4716 0.7422 0.8615
No log 3.6267 272 0.6939 0.4768 0.6939 0.8330
No log 3.6533 274 0.5856 0.5631 0.5856 0.7652
No log 3.68 276 0.5371 0.5758 0.5371 0.7329
No log 3.7067 278 0.5573 0.5845 0.5573 0.7465
No log 3.7333 280 0.6477 0.5167 0.6477 0.8048
No log 3.76 282 0.7650 0.5157 0.7650 0.8747
No log 3.7867 284 0.7203 0.4837 0.7203 0.8487
No log 3.8133 286 0.6087 0.5325 0.6087 0.7802
No log 3.84 288 0.5101 0.5566 0.5101 0.7142
No log 3.8667 290 0.4883 0.6552 0.4883 0.6988
No log 3.8933 292 0.4811 0.5926 0.4811 0.6936
No log 3.92 294 0.4959 0.5574 0.4959 0.7042
No log 3.9467 296 0.5705 0.4980 0.5705 0.7553
No log 3.9733 298 0.6443 0.5387 0.6443 0.8027
No log 4.0 300 0.6070 0.4741 0.6070 0.7791
No log 4.0267 302 0.5201 0.5468 0.5201 0.7212
No log 4.0533 304 0.4797 0.5631 0.4797 0.6926
No log 4.08 306 0.4649 0.6076 0.4649 0.6819
No log 4.1067 308 0.4704 0.5868 0.4704 0.6858
No log 4.1333 310 0.5336 0.5059 0.5336 0.7305
No log 4.16 312 0.6037 0.5139 0.6037 0.7770
No log 4.1867 314 0.5673 0.5350 0.5673 0.7532
No log 4.2133 316 0.4880 0.6210 0.4880 0.6986
No log 4.24 318 0.4788 0.6388 0.4788 0.6919
No log 4.2667 320 0.4865 0.6388 0.4865 0.6975
No log 4.2933 322 0.5103 0.6314 0.5103 0.7143
No log 4.32 324 0.5257 0.6382 0.5257 0.7250
No log 4.3467 326 0.4993 0.6550 0.4993 0.7066
No log 4.3733 328 0.4672 0.6286 0.4672 0.6836
No log 4.4 330 0.4790 0.6100 0.4790 0.6921
No log 4.4267 332 0.5236 0.5484 0.5236 0.7236
No log 4.4533 334 0.5576 0.5123 0.5576 0.7467
No log 4.48 336 0.5086 0.5059 0.5086 0.7132
No log 4.5067 338 0.4685 0.5741 0.4685 0.6845
No log 4.5333 340 0.4389 0.6553 0.4389 0.6625
No log 4.5600 342 0.4394 0.6645 0.4394 0.6629
No log 4.5867 344 0.4681 0.5646 0.4681 0.6842
No log 4.6133 346 0.4980 0.5679 0.4980 0.7057
No log 4.64 348 0.4763 0.5679 0.4763 0.6901
No log 4.6667 350 0.4717 0.5865 0.4717 0.6868
No log 4.6933 352 0.4603 0.5528 0.4603 0.6784
No log 4.72 354 0.4622 0.5559 0.4622 0.6798
No log 4.7467 356 0.4998 0.5395 0.4998 0.7070
No log 4.7733 358 0.5718 0.4887 0.5718 0.7562
No log 4.8 360 0.6507 0.5050 0.6507 0.8066
No log 4.8267 362 0.7498 0.5073 0.7498 0.8659
No log 4.8533 364 0.7048 0.4756 0.7048 0.8396
No log 4.88 366 0.5864 0.3384 0.5864 0.7658
No log 4.9067 368 0.5312 0.4081 0.5312 0.7288
No log 4.9333 370 0.5175 0.4081 0.5175 0.7194
No log 4.96 372 0.5322 0.4408 0.5322 0.7295
No log 4.9867 374 0.6140 0.5007 0.6140 0.7836
No log 5.0133 376 0.7230 0.5242 0.7230 0.8503
No log 5.04 378 0.7063 0.4773 0.7063 0.8404
No log 5.0667 380 0.5851 0.4964 0.5851 0.7649
No log 5.0933 382 0.4866 0.5428 0.4866 0.6976
No log 5.12 384 0.4636 0.5533 0.4636 0.6809
No log 5.1467 386 0.4809 0.5715 0.4809 0.6935
No log 5.1733 388 0.6198 0.5457 0.6198 0.7873
No log 5.2 390 0.9038 0.4755 0.9038 0.9507
No log 5.2267 392 0.9654 0.4699 0.9654 0.9825
No log 5.2533 394 0.8437 0.4699 0.8437 0.9186
No log 5.28 396 0.6111 0.4592 0.6111 0.7817
No log 5.3067 398 0.4881 0.5517 0.4881 0.6986
No log 5.3333 400 0.4756 0.5875 0.4756 0.6896
No log 5.36 402 0.4773 0.6024 0.4773 0.6909
No log 5.3867 404 0.5194 0.5223 0.5194 0.7207
No log 5.4133 406 0.6801 0.5050 0.6801 0.8247
No log 5.44 408 0.8000 0.4655 0.8000 0.8944
No log 5.4667 410 0.7429 0.5139 0.7429 0.8619
No log 5.4933 412 0.6148 0.5008 0.6148 0.7841
No log 5.52 414 0.5814 0.5243 0.5814 0.7625
No log 5.5467 416 0.5813 0.5243 0.5813 0.7624
No log 5.5733 418 0.5459 0.5858 0.5459 0.7389
No log 5.6 420 0.4971 0.5514 0.4971 0.7051
No log 5.6267 422 0.4860 0.5646 0.4860 0.6971
No log 5.6533 424 0.4977 0.5715 0.4977 0.7055
No log 5.68 426 0.5371 0.5957 0.5371 0.7329
No log 5.7067 428 0.5455 0.5524 0.5455 0.7386
No log 5.7333 430 0.5152 0.5223 0.5152 0.7178
No log 5.76 432 0.4912 0.4966 0.4912 0.7009
No log 5.7867 434 0.4925 0.4966 0.4925 0.7018
No log 5.8133 436 0.5125 0.5098 0.5125 0.7159
No log 5.84 438 0.5402 0.5098 0.5402 0.7350
No log 5.8667 440 0.5344 0.5357 0.5344 0.7310
No log 5.8933 442 0.4912 0.5528 0.4912 0.7008
No log 5.92 444 0.4682 0.5404 0.4682 0.6843
No log 5.9467 446 0.4685 0.5404 0.4685 0.6845
No log 5.9733 448 0.4771 0.5840 0.4771 0.6907
No log 6.0 450 0.4829 0.5958 0.4829 0.6949
No log 6.0267 452 0.5044 0.6069 0.5044 0.7102
No log 6.0533 454 0.5333 0.5932 0.5333 0.7303
No log 6.08 456 0.5314 0.6221 0.5314 0.7290
No log 6.1067 458 0.4930 0.5920 0.4930 0.7022
No log 6.1333 460 0.4778 0.5632 0.4778 0.6912
No log 6.16 462 0.4727 0.5485 0.4727 0.6875
No log 6.1867 464 0.4700 0.5868 0.4700 0.6855
No log 6.2133 466 0.4774 0.5437 0.4774 0.6909
No log 6.24 468 0.5052 0.5237 0.5052 0.7108
No log 6.2667 470 0.5889 0.5326 0.5889 0.7674
No log 6.2933 472 0.6190 0.5523 0.6190 0.7868
No log 6.32 474 0.5694 0.5259 0.5694 0.7546
No log 6.3467 476 0.5116 0.5418 0.5116 0.7153
No log 6.3733 478 0.4576 0.5729 0.4576 0.6765
No log 6.4 480 0.4561 0.6092 0.4561 0.6753
No log 6.4267 482 0.4750 0.6434 0.4750 0.6892
No log 6.4533 484 0.4793 0.6092 0.4793 0.6923
No log 6.48 486 0.4933 0.6392 0.4933 0.7024
No log 6.5067 488 0.5013 0.6457 0.5013 0.7081
No log 6.5333 490 0.4785 0.5557 0.4785 0.6917
No log 6.5600 492 0.4607 0.4809 0.4607 0.6788
No log 6.5867 494 0.4589 0.4703 0.4589 0.6774
No log 6.6133 496 0.4751 0.4459 0.4751 0.6893
No log 6.64 498 0.5118 0.5452 0.5118 0.7154
0.3061 6.6667 500 0.5524 0.5362 0.5524 0.7432
0.3061 6.6933 502 0.5520 0.5275 0.5520 0.7430
0.3061 6.72 504 0.4986 0.5601 0.4986 0.7061
0.3061 6.7467 506 0.4568 0.5567 0.4568 0.6759
0.3061 6.7733 508 0.4532 0.5549 0.4532 0.6732
0.3061 6.8 510 0.4622 0.5662 0.4622 0.6799
0.3061 6.8267 512 0.4781 0.5404 0.4781 0.6915
0.3061 6.8533 514 0.5192 0.6063 0.5192 0.7206
0.3061 6.88 516 0.5012 0.5418 0.5012 0.7080
0.3061 6.9067 518 0.4664 0.5166 0.4664 0.6829
0.3061 6.9333 520 0.4624 0.5327 0.4624 0.6800
0.3061 6.96 522 0.4645 0.5166 0.4645 0.6816
0.3061 6.9867 524 0.4623 0.5254 0.4623 0.6799
0.3061 7.0133 526 0.4863 0.5166 0.4863 0.6973
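
The exact `compute_metrics` function used for this run is not part of the card. For illustration, the reported metrics (QWK, MSE, RMSE) can be computed from regression predictions as follows; rounding before the kappa computation is an assumption, since QWK compares discrete ratings.

```python
# Illustrative sketch of a compute_metrics function producing QWK, MSE, and RMSE.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    preds = np.asarray(predictions).squeeze()
    labels = np.asarray(labels).squeeze()

    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))

    # QWK is defined over discrete ratings, so continuous outputs are rounded first.
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```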

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1