ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k15_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5615
  • Qwk (quadratic weighted kappa): 0.4375
  • Mse (mean squared error): 0.5615
  • Rmse (root mean squared error): 0.7493
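The reported Loss and Mse coincide (0.5615), consistent with an MSE training objective, and Rmse is simply the square root of Mse. A minimal sketch of how these metrics relate, assuming scikit-learn and hypothetical integer organization scores (not values from this model):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical rounded model outputs vs. reference scores
y_true = [1, 2, 3, 2, 4, 3]
y_pred = [1, 2, 2, 2, 4, 4]

# Quadratic weighted kappa: agreement on ordinal labels,
# penalizing large disagreements quadratically
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))  # RMSE is the square root of MSE
```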

Model description

More information needed

Intended uses & limitations

More information needed
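While the card leaves usage unspecified, the metrics (MSE/RMSE alongside QWK) suggest a single-output regression head that scores essay organization. A minimal, hypothetical loading sketch, assuming the repository id from the model tree below and a regression-style sequence-classification head:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = (
    "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
    "FineTuningAraBERT_run1_AugV5_k15_task7_organization"
)

def score_essay(text: str) -> float:
    """Return the model's raw organization score for an Arabic essay.

    Assumes a single-logit regression head; this is a sketch, not a
    documented interface for this checkpoint.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().item()
```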

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0260 2 2.4921 -0.1089 2.4921 1.5786
No log 0.0519 4 1.3043 0.1941 1.3043 1.1421
No log 0.0779 6 0.9492 -0.0228 0.9492 0.9743
No log 0.1039 8 0.8763 0.0532 0.8763 0.9361
No log 0.1299 10 0.9368 0.1454 0.9368 0.9679
No log 0.1558 12 0.7553 0.1050 0.7553 0.8691
No log 0.1818 14 0.7803 0.3455 0.7803 0.8833
No log 0.2078 16 0.7829 0.3031 0.7829 0.8848
No log 0.2338 18 0.7452 0.3112 0.7452 0.8633
No log 0.2597 20 0.6839 0.2923 0.6839 0.8270
No log 0.2857 22 0.6786 0.2923 0.6786 0.8238
No log 0.3117 24 0.7227 0.3127 0.7227 0.8501
No log 0.3377 26 0.7494 0.3444 0.7494 0.8657
No log 0.3636 28 0.7123 0.3737 0.7123 0.8440
No log 0.3896 30 0.6874 0.3123 0.6874 0.8291
No log 0.4156 32 0.6637 0.1604 0.6637 0.8147
No log 0.4416 34 0.6739 0.0893 0.6739 0.8209
No log 0.4675 36 0.7139 0.2156 0.7139 0.8449
No log 0.4935 38 0.6763 0.1729 0.6763 0.8224
No log 0.5195 40 0.6460 0.2041 0.6460 0.8037
No log 0.5455 42 0.6783 0.3312 0.6783 0.8236
No log 0.5714 44 0.6322 0.3633 0.6322 0.7951
No log 0.5974 46 0.6146 0.3661 0.6146 0.7840
No log 0.6234 48 0.5962 0.3336 0.5962 0.7721
No log 0.6494 50 0.6424 0.3701 0.6424 0.8015
No log 0.6753 52 0.8158 0.3110 0.8158 0.9032
No log 0.7013 54 0.8842 0.3228 0.8842 0.9403
No log 0.7273 56 0.7169 0.3770 0.7169 0.8467
No log 0.7532 58 0.8060 0.3960 0.8060 0.8978
No log 0.7792 60 0.7659 0.4175 0.7659 0.8752
No log 0.8052 62 0.5478 0.3575 0.5478 0.7401
No log 0.8312 64 0.5249 0.4631 0.5249 0.7245
No log 0.8571 66 0.5145 0.4891 0.5145 0.7173
No log 0.8831 68 0.5560 0.4740 0.5560 0.7457
No log 0.9091 70 0.5207 0.5527 0.5207 0.7216
No log 0.9351 72 0.6150 0.4550 0.6150 0.7842
No log 0.9610 74 0.8037 0.3887 0.8037 0.8965
No log 0.9870 76 0.5522 0.4212 0.5522 0.7431
No log 1.0130 78 0.4743 0.5133 0.4743 0.6887
No log 1.0390 80 0.4780 0.5267 0.4780 0.6914
No log 1.0649 82 0.7318 0.5124 0.7318 0.8555
No log 1.0909 84 1.2494 0.2732 1.2494 1.1178
No log 1.1169 86 1.0045 0.3902 1.0045 1.0022
No log 1.1429 88 0.5199 0.3990 0.5199 0.7210
No log 1.1688 90 0.5127 0.4380 0.5127 0.7161
No log 1.1948 92 0.5985 0.3078 0.5985 0.7737
No log 1.2208 94 0.8422 0.3060 0.8422 0.9177
No log 1.2468 96 0.6450 0.3665 0.6450 0.8031
No log 1.2727 98 0.5501 0.4013 0.5501 0.7417
No log 1.2987 100 0.5855 0.3841 0.5855 0.7652
No log 1.3247 102 0.6662 0.4602 0.6662 0.8162
No log 1.3506 104 0.6280 0.4315 0.6280 0.7925
No log 1.3766 106 0.5491 0.4617 0.5491 0.7410
No log 1.4026 108 0.5563 0.4535 0.5563 0.7459
No log 1.4286 110 0.6980 0.4745 0.6980 0.8355
No log 1.4545 112 0.7862 0.4947 0.7862 0.8867
No log 1.4805 114 0.6487 0.4294 0.6487 0.8054
No log 1.5065 116 0.5657 0.4495 0.5657 0.7521
No log 1.5325 118 0.5733 0.3839 0.5733 0.7572
No log 1.5584 120 0.6775 0.4745 0.6775 0.8231
No log 1.5844 122 0.6929 0.4531 0.6929 0.8324
No log 1.6104 124 0.5728 0.3792 0.5728 0.7568
No log 1.6364 126 0.5652 0.3792 0.5652 0.7518
No log 1.6623 128 0.6583 0.4081 0.6583 0.8114
No log 1.6883 130 0.6919 0.4522 0.6919 0.8318
No log 1.7143 132 0.6321 0.4642 0.6321 0.7950
No log 1.7403 134 0.5602 0.3769 0.5602 0.7485
No log 1.7662 136 0.5590 0.3769 0.5590 0.7477
No log 1.7922 138 0.6194 0.4642 0.6194 0.7870
No log 1.8182 140 0.7543 0.5090 0.7543 0.8685
No log 1.8442 142 0.6560 0.4371 0.6560 0.8099
No log 1.8701 144 0.5408 0.3769 0.5408 0.7354
No log 1.8961 146 0.5390 0.5812 0.5390 0.7342
No log 1.9221 148 0.5564 0.5991 0.5564 0.7459
No log 1.9481 150 0.6239 0.5408 0.6239 0.7899
No log 1.9740 152 0.5767 0.5836 0.5767 0.7594
No log 2.0 154 0.5435 0.4640 0.5435 0.7372
No log 2.0260 156 0.5468 0.5150 0.5468 0.7395
No log 2.0519 158 0.5464 0.5405 0.5464 0.7392
No log 2.0779 160 0.5468 0.4782 0.5468 0.7394
No log 2.1039 162 0.5436 0.5929 0.5436 0.7373
No log 2.1299 164 0.5333 0.5286 0.5333 0.7303
No log 2.1558 166 0.5584 0.5133 0.5584 0.7473
No log 2.1818 168 0.6849 0.4163 0.6849 0.8276
No log 2.2078 170 0.6143 0.4496 0.6143 0.7838
No log 2.2338 172 0.6085 0.5953 0.6085 0.7800
No log 2.2597 174 0.7517 0.4321 0.7517 0.8670
No log 2.2857 176 0.7831 0.4321 0.7831 0.8849
No log 2.3117 178 0.5883 0.5650 0.5883 0.7670
No log 2.3377 180 0.6728 0.4574 0.6728 0.8202
No log 2.3636 182 0.7974 0.4735 0.7974 0.8930
No log 2.3896 184 0.6572 0.4371 0.6572 0.8107
No log 2.4156 186 0.5986 0.5200 0.5986 0.7737
No log 2.4416 188 0.6886 0.5474 0.6886 0.8298
No log 2.4675 190 0.6193 0.5317 0.6193 0.7869
No log 2.4935 192 0.5722 0.3599 0.5722 0.7564
No log 2.5195 194 0.7849 0.4153 0.7849 0.8860
No log 2.5455 196 0.8867 0.4297 0.8867 0.9417
No log 2.5714 198 0.7511 0.3822 0.7511 0.8666
No log 2.5974 200 0.5824 0.4124 0.5824 0.7631
No log 2.6234 202 0.5718 0.3478 0.5718 0.7562
No log 2.6494 204 0.6254 0.4371 0.6254 0.7908
No log 2.6753 206 0.7012 0.3157 0.7012 0.8374
No log 2.7013 208 0.8254 0.4153 0.8254 0.9085
No log 2.7273 210 0.7795 0.3480 0.7795 0.8829
No log 2.7532 212 0.6436 0.4451 0.6436 0.8023
No log 2.7792 214 0.5850 0.3198 0.5850 0.7649
No log 2.8052 216 0.5882 0.3964 0.5882 0.7669
No log 2.8312 218 0.5941 0.4059 0.5941 0.7708
No log 2.8571 220 0.6574 0.4294 0.6574 0.8108
No log 2.8831 222 0.7238 0.4295 0.7238 0.8507
No log 2.9091 224 0.6604 0.4315 0.6604 0.8126
No log 2.9351 226 0.6054 0.5133 0.6054 0.7781
No log 2.9610 228 0.6133 0.5136 0.6133 0.7832
No log 2.9870 230 0.6965 0.5077 0.6965 0.8345
No log 3.0130 232 0.6892 0.5077 0.6892 0.8302
No log 3.0390 234 0.5862 0.5087 0.5862 0.7656
No log 3.0649 236 0.6352 0.4389 0.6352 0.7970
No log 3.0909 238 0.7824 0.4650 0.7824 0.8846
No log 3.1169 240 0.7231 0.4512 0.7231 0.8504
No log 3.1429 242 0.6395 0.3985 0.6395 0.7997
No log 3.1688 244 0.6752 0.3520 0.6752 0.8217
No log 3.1948 246 0.8034 0.4511 0.8034 0.8963
No log 3.2208 248 0.7495 0.4580 0.7495 0.8657
No log 3.2468 250 0.5886 0.4576 0.5886 0.7672
No log 3.2727 252 0.5370 0.5184 0.5370 0.7328
No log 3.2987 254 0.5326 0.5533 0.5326 0.7298
No log 3.3247 256 0.5779 0.4234 0.5779 0.7602
No log 3.3506 258 0.6159 0.3261 0.6159 0.7848
No log 3.3766 260 0.6942 0.4251 0.6942 0.8332
No log 3.4026 262 0.6551 0.3519 0.6551 0.8094
No log 3.4286 264 0.5399 0.5071 0.5399 0.7348
No log 3.4545 266 0.5208 0.5039 0.5208 0.7216
No log 3.4805 268 0.5250 0.5201 0.5250 0.7246
No log 3.5065 270 0.5657 0.5404 0.5657 0.7521
No log 3.5325 272 0.6746 0.5211 0.6746 0.8213
No log 3.5584 274 0.7059 0.4134 0.7059 0.8402
No log 3.5844 276 0.5906 0.5357 0.5906 0.7685
No log 3.6104 278 0.5209 0.6092 0.5209 0.7218
No log 3.6364 280 0.5247 0.5765 0.5247 0.7244
No log 3.6623 282 0.5099 0.5841 0.5099 0.7141
No log 3.6883 284 0.5073 0.5413 0.5073 0.7122
No log 3.7143 286 0.5956 0.3737 0.5956 0.7717
No log 3.7403 288 0.6382 0.3737 0.6382 0.7989
No log 3.7662 290 0.5784 0.4375 0.5784 0.7605
No log 3.7922 292 0.4922 0.5488 0.4922 0.7016
No log 3.8182 294 0.4848 0.5702 0.4848 0.6963
No log 3.8442 296 0.4794 0.5751 0.4794 0.6924
No log 3.8701 298 0.5156 0.5404 0.5156 0.7181
No log 3.8961 300 0.6094 0.4740 0.6094 0.7806
No log 3.9221 302 0.6149 0.4212 0.6149 0.7842
No log 3.9481 304 0.5594 0.4661 0.5594 0.7479
No log 3.9740 306 0.5151 0.5703 0.5151 0.7177
No log 4.0 308 0.4878 0.5522 0.4878 0.6985
No log 4.0260 310 0.5552 0.5841 0.5552 0.7451
No log 4.0519 312 0.5486 0.5841 0.5486 0.7407
No log 4.0779 314 0.5121 0.6114 0.5121 0.7156
No log 4.1039 316 0.6354 0.5179 0.6354 0.7971
No log 4.1299 318 0.7295 0.5105 0.7295 0.8541
No log 4.1558 320 0.6136 0.5337 0.6136 0.7833
No log 4.1818 322 0.5221 0.5436 0.5221 0.7226
No log 4.2078 324 0.5120 0.5075 0.5120 0.7155
No log 4.2338 326 0.5523 0.5951 0.5523 0.7432
No log 4.2597 328 0.6273 0.4007 0.6273 0.7920
No log 4.2857 330 0.6670 0.3475 0.6670 0.8167
No log 4.3117 332 0.6133 0.3950 0.6133 0.7831
No log 4.3377 334 0.5373 0.3995 0.5373 0.7330
No log 4.3636 336 0.5304 0.4869 0.5304 0.7283
No log 4.3896 338 0.5178 0.5061 0.5178 0.7196
No log 4.4156 340 0.5849 0.4243 0.5849 0.7648
No log 4.4416 342 0.8432 0.4297 0.8432 0.9182
No log 4.4675 344 0.9480 0.3890 0.9480 0.9737
No log 4.4935 346 0.8637 0.4024 0.8637 0.9294
No log 4.5195 348 0.6850 0.3940 0.6850 0.8276
No log 4.5455 350 0.5550 0.4985 0.5550 0.7450
No log 4.5714 352 0.5486 0.5452 0.5486 0.7407
No log 4.5974 354 0.6209 0.4890 0.6209 0.7880
No log 4.6234 356 0.6168 0.5096 0.6168 0.7853
No log 4.6494 358 0.5917 0.6244 0.5917 0.7692
No log 4.6753 360 0.6941 0.5027 0.6941 0.8332
No log 4.7013 362 0.7614 0.4293 0.7614 0.8726
No log 4.7273 364 0.6700 0.4609 0.6700 0.8185
No log 4.7532 366 0.5863 0.5133 0.5863 0.7657
No log 4.7792 368 0.5859 0.4901 0.5859 0.7654
No log 4.8052 370 0.6789 0.3746 0.6789 0.8240
No log 4.8312 372 0.7622 0.4102 0.7622 0.8731
No log 4.8571 374 0.7868 0.4175 0.7868 0.8870
No log 4.8831 376 0.8106 0.3754 0.8106 0.9003
No log 4.9091 378 0.8830 0.3782 0.8830 0.9397
No log 4.9351 380 0.8737 0.3274 0.8737 0.9347
No log 4.9610 382 0.8212 0.3333 0.8212 0.9062
No log 4.9870 384 0.7216 0.3590 0.7216 0.8495
No log 5.0130 386 0.6269 0.4352 0.6269 0.7918
No log 5.0390 388 0.6397 0.4352 0.6397 0.7998
No log 5.0649 390 0.6821 0.3867 0.6821 0.8259
No log 5.0909 392 0.6217 0.4292 0.6217 0.7885
No log 5.1169 394 0.5919 0.4808 0.5919 0.7694
No log 5.1429 396 0.5474 0.5357 0.5474 0.7399
No log 5.1688 398 0.5136 0.5965 0.5136 0.7167
No log 5.1948 400 0.5320 0.5574 0.5320 0.7294
No log 5.2208 402 0.6212 0.4625 0.6212 0.7882
No log 5.2468 404 0.6344 0.4389 0.6344 0.7965
No log 5.2727 406 0.5146 0.5071 0.5146 0.7174
No log 5.2987 408 0.4610 0.6279 0.4610 0.6790
No log 5.3247 410 0.4547 0.5890 0.4547 0.6743
No log 5.3506 412 0.4826 0.4914 0.4826 0.6947
No log 5.3766 414 0.5741 0.3763 0.5741 0.7577
No log 5.4026 416 0.6663 0.4076 0.6663 0.8163
No log 5.4286 418 0.6194 0.3434 0.6194 0.7870
No log 5.4545 420 0.5579 0.5057 0.5579 0.7469
No log 5.4805 422 0.5956 0.4776 0.5956 0.7717
No log 5.5065 424 0.7075 0.4986 0.7075 0.8411
No log 5.5325 426 0.8438 0.3563 0.8438 0.9186
No log 5.5584 428 0.8072 0.3909 0.8072 0.8984
No log 5.5844 430 0.6734 0.4434 0.6734 0.8206
No log 5.6104 432 0.5689 0.5874 0.5689 0.7543
No log 5.6364 434 0.5549 0.6434 0.5549 0.7449
No log 5.6623 436 0.6171 0.4845 0.6171 0.7856
No log 5.6883 438 0.6996 0.4522 0.6996 0.8364
No log 5.7143 440 0.6560 0.5063 0.6560 0.8099
No log 5.7403 442 0.5673 0.5860 0.5673 0.7532
No log 5.7662 444 0.5279 0.6377 0.5279 0.7266
No log 5.7922 446 0.5239 0.5926 0.5239 0.7238
No log 5.8182 448 0.5354 0.5926 0.5354 0.7317
No log 5.8442 450 0.5333 0.5235 0.5333 0.7302
No log 5.8701 452 0.5412 0.5003 0.5412 0.7357
No log 5.8961 454 0.5495 0.5003 0.5495 0.7413
No log 5.9221 456 0.5417 0.5021 0.5417 0.7360
No log 5.9481 458 0.5692 0.5235 0.5692 0.7545
No log 5.9740 460 0.5432 0.5151 0.5432 0.7370
No log 6.0 462 0.5489 0.5642 0.5489 0.7409
No log 6.0260 464 0.5576 0.5554 0.5576 0.7467
No log 6.0519 466 0.5658 0.5744 0.5658 0.7522
No log 6.0779 468 0.5600 0.5533 0.5600 0.7483
No log 6.1039 470 0.5570 0.5412 0.5570 0.7463
No log 6.1299 472 0.6015 0.5291 0.6015 0.7756
No log 6.1558 474 0.6419 0.4315 0.6419 0.8012
No log 6.1818 476 0.6072 0.4625 0.6072 0.7792
No log 6.2078 478 0.6033 0.4782 0.6033 0.7767
No log 6.2338 480 0.5844 0.4260 0.5844 0.7645
No log 6.2597 482 0.5557 0.5022 0.5557 0.7455
No log 6.2857 484 0.5754 0.4681 0.5754 0.7586
No log 6.3117 486 0.6493 0.4486 0.6493 0.8058
No log 6.3377 488 0.7090 0.3906 0.7090 0.8420
No log 6.3636 490 0.6621 0.4887 0.6621 0.8137
No log 6.3896 492 0.5637 0.5115 0.5637 0.7508
No log 6.4156 494 0.5312 0.5580 0.5312 0.7288
No log 6.4416 496 0.5366 0.5422 0.5366 0.7326
No log 6.4675 498 0.5528 0.5438 0.5528 0.7435
0.3701 6.4935 500 0.5319 0.5028 0.5319 0.7293
0.3701 6.5195 502 0.5275 0.5195 0.5275 0.7263
0.3701 6.5455 504 0.5471 0.4828 0.5471 0.7397
0.3701 6.5714 506 0.5822 0.4864 0.5822 0.7630
0.3701 6.5974 508 0.6080 0.4112 0.6080 0.7798
0.3701 6.6234 510 0.6137 0.4112 0.6137 0.7834
0.3701 6.6494 512 0.5615 0.4375 0.5615 0.7493

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32 tensors, Safetensors format)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k15_task7_organization

This model is a fine-tune of aubmindlab/bert-base-arabertv02.