ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7578
  • QWK (quadratic weighted kappa): 0.5421
  • MSE: 0.7578
  • RMSE: 0.8705
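The reported metrics can be reproduced from ordinal predictions with standard tooling. A minimal sketch using scikit-learn (the toy labels below are illustrative, not from the model's evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Toy ordinal essay scores (illustrative only).
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 1, 3, 2, 0]

# Quadratic weighted kappa: agreement on ordinal labels,
# penalizing larger disagreements quadratically.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE/RMSE treat the ordinal labels as numbers.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```

Note that in this card MSE equals the reported loss, which suggests the model was trained with a regression (MSE) objective on the ordinal scores.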

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
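The linear scheduler decays the learning rate from its initial value to zero over the course of training. A minimal sketch of that decay, assuming zero warmup steps (the function name `linear_lr` is illustrative, not part of the training code):

```python
def linear_lr(step, total_steps, base_lr=2e-5):
    """Linearly decay the learning rate from base_lr to 0 over total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# At the start of training the full base LR is used,
# halfway through it has decayed to half, and at the end it reaches 0.
lr_start = linear_lr(0, 100)    # 2e-05
lr_mid = linear_lr(50, 100)     # 1e-05
lr_end = linear_lr(100, 100)    # 0.0
```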

Training results

("No log" in the first column means the training loss had not yet been recorded; it first appears at step 500.)

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0294 2 4.1384 0.0018 4.1384 2.0343
No log 0.0588 4 2.3997 0.0856 2.3997 1.5491
No log 0.0882 6 1.3894 0.0682 1.3894 1.1787
No log 0.1176 8 1.1529 0.1918 1.1529 1.0737
No log 0.1471 10 1.2190 0.1689 1.2190 1.1041
No log 0.1765 12 1.2260 0.1370 1.2260 1.1073
No log 0.2059 14 1.3362 0.0803 1.3362 1.1559
No log 0.2353 16 1.2822 0.1354 1.2822 1.1324
No log 0.2647 18 1.2154 0.0020 1.2154 1.1024
No log 0.2941 20 1.2240 0.0464 1.2240 1.1063
No log 0.3235 22 1.2592 0.0464 1.2592 1.1221
No log 0.3529 24 1.3336 0.0454 1.3336 1.1548
No log 0.3824 26 1.4115 0.1243 1.4115 1.1881
No log 0.4118 28 1.4089 0.1559 1.4089 1.1870
No log 0.4412 30 1.3836 0.2549 1.3836 1.1763
No log 0.4706 32 1.3199 0.2678 1.3199 1.1489
No log 0.5 34 1.3199 0.2739 1.3199 1.1489
No log 0.5294 36 1.2923 0.2949 1.2923 1.1368
No log 0.5588 38 1.1690 0.3696 1.1690 1.0812
No log 0.5882 40 1.1861 0.4079 1.1861 1.0891
No log 0.6176 42 1.0882 0.3997 1.0882 1.0432
No log 0.6471 44 1.0994 0.3964 1.0994 1.0485
No log 0.6765 46 1.2229 0.3392 1.2229 1.1058
No log 0.7059 48 1.2570 0.3613 1.2570 1.1211
No log 0.7353 50 1.0686 0.4555 1.0686 1.0337
No log 0.7647 52 0.9097 0.5275 0.9097 0.9538
No log 0.7941 54 0.9005 0.4783 0.9005 0.9489
No log 0.8235 56 0.8424 0.4774 0.8424 0.9178
No log 0.8529 58 0.8104 0.5169 0.8104 0.9002
No log 0.8824 60 0.8499 0.6234 0.8499 0.9219
No log 0.9118 62 0.9074 0.5298 0.9074 0.9526
No log 0.9412 64 1.0260 0.3726 1.0260 1.0129
No log 0.9706 66 1.1407 0.3527 1.1407 1.0681
No log 1.0 68 1.0254 0.4611 1.0254 1.0126
No log 1.0294 70 0.7653 0.6570 0.7653 0.8748
No log 1.0588 72 0.7683 0.6040 0.7683 0.8765
No log 1.0882 74 0.8226 0.6021 0.8226 0.9070
No log 1.1176 76 0.9402 0.5959 0.9402 0.9696
No log 1.1471 78 0.9059 0.5531 0.9059 0.9518
No log 1.1765 80 0.9441 0.5004 0.9441 0.9717
No log 1.2059 82 0.7925 0.6199 0.7925 0.8902
No log 1.2353 84 0.8677 0.6337 0.8677 0.9315
No log 1.2647 86 0.8962 0.6436 0.8962 0.9467
No log 1.2941 88 0.7607 0.6404 0.7607 0.8722
No log 1.3235 90 0.8917 0.6133 0.8917 0.9443
No log 1.3529 92 0.8821 0.6226 0.8821 0.9392
No log 1.3824 94 0.7829 0.6423 0.7829 0.8848
No log 1.4118 96 0.7112 0.6078 0.7112 0.8433
No log 1.4412 98 0.7765 0.5968 0.7765 0.8812
No log 1.4706 100 0.8027 0.5865 0.8027 0.8959
No log 1.5 102 0.7508 0.5509 0.7508 0.8665
No log 1.5294 104 0.7493 0.5534 0.7493 0.8656
No log 1.5588 106 0.7575 0.6274 0.7575 0.8703
No log 1.5882 108 0.7496 0.6178 0.7496 0.8658
No log 1.6176 110 0.7160 0.6685 0.7160 0.8462
No log 1.6471 112 0.7755 0.6208 0.7755 0.8806
No log 1.6765 114 0.8284 0.6124 0.8284 0.9102
No log 1.7059 116 0.7063 0.6916 0.7063 0.8404
No log 1.7353 118 0.7790 0.6337 0.7790 0.8826
No log 1.7647 120 0.7415 0.6731 0.7415 0.8611
No log 1.7941 122 0.6971 0.6773 0.6971 0.8349
No log 1.8235 124 0.6935 0.6967 0.6935 0.8328
No log 1.8529 126 0.7286 0.5526 0.7286 0.8536
No log 1.8824 128 0.9293 0.5019 0.9293 0.9640
No log 1.9118 130 0.8581 0.4861 0.8581 0.9263
No log 1.9412 132 0.7381 0.5648 0.7381 0.8591
No log 1.9706 134 0.9820 0.5280 0.9820 0.9910
No log 2.0 136 1.3094 0.4632 1.3094 1.1443
No log 2.0294 138 1.2250 0.4632 1.2250 1.1068
No log 2.0588 140 0.8508 0.5637 0.8508 0.9224
No log 2.0882 142 0.7029 0.6198 0.7029 0.8384
No log 2.1176 144 0.9039 0.4556 0.9039 0.9507
No log 2.1471 146 0.9322 0.4556 0.9322 0.9655
No log 2.1765 148 0.7828 0.4641 0.7828 0.8848
No log 2.2059 150 0.7496 0.5161 0.7496 0.8658
No log 2.2353 152 0.8476 0.5374 0.8476 0.9207
No log 2.2647 154 0.8374 0.5313 0.8374 0.9151
No log 2.2941 156 0.7543 0.5346 0.7543 0.8685
No log 2.3235 158 0.7161 0.5381 0.7161 0.8462
No log 2.3529 160 0.7065 0.5517 0.7065 0.8405
No log 2.3824 162 0.6840 0.6840 0.6840 0.8270
No log 2.4118 164 0.7150 0.6379 0.7150 0.8456
No log 2.4412 166 0.8283 0.6197 0.8283 0.9101
No log 2.4706 168 0.8624 0.6049 0.8624 0.9287
No log 2.5 170 0.8243 0.5920 0.8243 0.9079
No log 2.5294 172 0.8017 0.5704 0.8017 0.8954
No log 2.5588 174 0.7874 0.6025 0.7874 0.8873
No log 2.5882 176 0.7926 0.6025 0.7926 0.8903
No log 2.6176 178 0.8319 0.5624 0.8319 0.9121
No log 2.6471 180 0.8584 0.5926 0.8584 0.9265
No log 2.6765 182 0.7907 0.5996 0.7907 0.8892
No log 2.7059 184 0.7781 0.6442 0.7781 0.8821
No log 2.7353 186 0.7667 0.6642 0.7667 0.8756
No log 2.7647 188 0.7596 0.5939 0.7596 0.8716
No log 2.7941 190 0.7816 0.5144 0.7816 0.8841
No log 2.8235 192 0.7913 0.5025 0.7913 0.8895
No log 2.8529 194 0.8012 0.5076 0.8012 0.8951
No log 2.8824 196 0.8025 0.4617 0.8025 0.8958
No log 2.9118 198 0.8435 0.4009 0.8435 0.9184
No log 2.9412 200 0.8703 0.5353 0.8703 0.9329
No log 2.9706 202 0.9488 0.4902 0.9488 0.9741
No log 3.0 204 0.9161 0.4918 0.9161 0.9571
No log 3.0294 206 0.8179 0.4293 0.8179 0.9044
No log 3.0588 208 0.8414 0.5087 0.8414 0.9173
No log 3.0882 210 0.8394 0.4771 0.8394 0.9162
No log 3.1176 212 0.8246 0.4731 0.8246 0.9081
No log 3.1471 214 0.9414 0.5431 0.9414 0.9702
No log 3.1765 216 1.1759 0.4396 1.1759 1.0844
No log 3.2059 218 1.1724 0.4216 1.1724 1.0828
No log 3.2353 220 0.9607 0.5216 0.9607 0.9801
No log 3.2647 222 0.7945 0.5131 0.7945 0.8914
No log 3.2941 224 0.7655 0.5779 0.7655 0.8749
No log 3.3235 226 0.7699 0.5607 0.7699 0.8775
No log 3.3529 228 0.7470 0.6095 0.7470 0.8643
No log 3.3824 230 0.7191 0.5993 0.7191 0.8480
No log 3.4118 232 0.7172 0.6377 0.7172 0.8469
No log 3.4412 234 0.7927 0.5245 0.7927 0.8904
No log 3.4706 236 0.8940 0.5250 0.8940 0.9455
No log 3.5 238 0.8749 0.5102 0.8749 0.9353
No log 3.5294 240 0.7883 0.5062 0.7883 0.8879
No log 3.5588 242 0.7537 0.5479 0.7537 0.8682
No log 3.5882 244 0.7568 0.5552 0.7568 0.8699
No log 3.6176 246 0.7429 0.5766 0.7429 0.8619
No log 3.6471 248 0.7416 0.5261 0.7416 0.8611
No log 3.6765 250 0.7494 0.5607 0.7494 0.8657
No log 3.7059 252 0.7692 0.5951 0.7692 0.8770
No log 3.7353 254 0.8445 0.5680 0.8445 0.9189
No log 3.7647 256 0.9703 0.5233 0.9703 0.9850
No log 3.7941 258 1.0517 0.4974 1.0517 1.0255
No log 3.8235 260 0.9896 0.4601 0.9896 0.9948
No log 3.8529 262 0.8726 0.4553 0.8726 0.9341
No log 3.8824 264 0.7850 0.4548 0.7850 0.8860
No log 3.9118 266 0.7490 0.5253 0.7490 0.8655
No log 3.9412 268 0.7237 0.6011 0.7237 0.8507
No log 3.9706 270 0.7281 0.6088 0.7281 0.8533
No log 4.0 272 0.7986 0.5578 0.7986 0.8936
No log 4.0294 274 0.8355 0.5160 0.8355 0.9141
No log 4.0588 276 0.8774 0.5379 0.8774 0.9367
No log 4.0882 278 0.8587 0.5306 0.8587 0.9267
No log 4.1176 280 0.8336 0.4666 0.8336 0.9130
No log 4.1471 282 0.8664 0.4526 0.8664 0.9308
No log 4.1765 284 0.9009 0.4792 0.9009 0.9492
No log 4.2059 286 0.8414 0.5147 0.8414 0.9173
No log 4.2353 288 0.7513 0.5606 0.7513 0.8668
No log 4.2647 290 0.7458 0.5534 0.7458 0.8636
No log 4.2941 292 0.7426 0.5357 0.7426 0.8617
No log 4.3235 294 0.7283 0.5948 0.7283 0.8534
No log 4.3529 296 0.7276 0.6825 0.7276 0.8530
No log 4.3824 298 0.7487 0.6714 0.7487 0.8653
No log 4.4118 300 0.7676 0.6252 0.7676 0.8761
No log 4.4412 302 0.7674 0.6714 0.7674 0.8760
No log 4.4706 304 0.8278 0.5975 0.8278 0.9098
No log 4.5 306 0.9201 0.5532 0.9201 0.9592
No log 4.5294 308 0.9335 0.5340 0.9335 0.9662
No log 4.5588 310 0.9651 0.4874 0.9651 0.9824
No log 4.5882 312 0.9057 0.4940 0.9057 0.9517
No log 4.6176 314 0.8995 0.5130 0.8995 0.9484
No log 4.6471 316 0.8878 0.5130 0.8878 0.9422
No log 4.6765 318 0.8504 0.5130 0.8504 0.9221
No log 4.7059 320 0.8033 0.5311 0.8033 0.8963
No log 4.7353 322 0.7590 0.5498 0.7590 0.8712
No log 4.7647 324 0.7628 0.5311 0.7628 0.8734
No log 4.7941 326 0.7766 0.5311 0.7766 0.8813
No log 4.8235 328 0.7545 0.5259 0.7545 0.8686
No log 4.8529 330 0.7496 0.6285 0.7496 0.8658
No log 4.8824 332 0.7501 0.6227 0.7501 0.8661
No log 4.9118 334 0.7600 0.5898 0.7600 0.8718
No log 4.9412 336 0.7991 0.4785 0.7991 0.8939
No log 4.9706 338 0.7889 0.5303 0.7889 0.8882
No log 5.0 340 0.7765 0.5785 0.7765 0.8812
No log 5.0294 342 0.8713 0.5166 0.8713 0.9334
No log 5.0588 344 0.9373 0.5184 0.9373 0.9681
No log 5.0882 346 0.8773 0.5322 0.8773 0.9366
No log 5.1176 348 0.8545 0.5183 0.8545 0.9244
No log 5.1471 350 0.8033 0.4425 0.8033 0.8963
No log 5.1765 352 0.7692 0.5759 0.7692 0.8771
No log 5.2059 354 0.7920 0.4923 0.7920 0.8899
No log 5.2353 356 0.8021 0.4681 0.8021 0.8956
No log 5.2647 358 0.8887 0.4833 0.8887 0.9427
No log 5.2941 360 1.0827 0.4414 1.0827 1.0405
No log 5.3235 362 1.1036 0.4580 1.1036 1.0505
No log 5.3529 364 0.9578 0.5056 0.9578 0.9787
No log 5.3824 366 0.8396 0.5014 0.8396 0.9163
No log 5.4118 368 0.8097 0.4993 0.8097 0.8999
No log 5.4412 370 0.7765 0.5053 0.7765 0.8812
No log 5.4706 372 0.7650 0.5006 0.7650 0.8746
No log 5.5 374 0.7595 0.4825 0.7595 0.8715
No log 5.5294 376 0.7468 0.5455 0.7468 0.8641
No log 5.5588 378 0.7692 0.4832 0.7692 0.8771
No log 5.5882 380 0.8251 0.5324 0.8251 0.9083
No log 5.6176 382 0.9196 0.5553 0.9196 0.9590
No log 5.6471 384 0.9638 0.5340 0.9638 0.9817
No log 5.6765 386 0.9138 0.5576 0.9138 0.9559
No log 5.7059 388 0.8986 0.4845 0.8986 0.9480
No log 5.7353 390 0.8685 0.5090 0.8685 0.9319
No log 5.7647 392 0.8575 0.4681 0.8575 0.9260
No log 5.7941 394 0.8577 0.5283 0.8577 0.9261
No log 5.8235 396 0.8405 0.5905 0.8405 0.9168
No log 5.8529 398 0.9273 0.5661 0.9273 0.9630
No log 5.8824 400 1.0609 0.5168 1.0609 1.0300
No log 5.9118 402 1.0464 0.5184 1.0464 1.0230
No log 5.9412 404 0.9292 0.5576 0.9292 0.9639
No log 5.9706 406 0.8338 0.4920 0.8338 0.9131
No log 6.0 408 0.7977 0.4980 0.7977 0.8931
No log 6.0294 410 0.7840 0.5165 0.7840 0.8855
No log 6.0588 412 0.8630 0.5823 0.8630 0.9290
No log 6.0882 414 0.9318 0.5490 0.9318 0.9653
No log 6.1176 416 0.9024 0.5490 0.9024 0.9500
No log 6.1471 418 0.8778 0.5712 0.8778 0.9369
No log 6.1765 420 0.7903 0.5451 0.7903 0.8890
No log 6.2059 422 0.7572 0.4834 0.7572 0.8702
No log 6.2353 424 0.7576 0.5327 0.7576 0.8704
No log 6.2647 426 0.7524 0.4979 0.7524 0.8674
No log 6.2941 428 0.7680 0.4540 0.7680 0.8763
No log 6.3235 430 0.8084 0.5250 0.8084 0.8991
No log 6.3529 432 0.8302 0.4979 0.8302 0.9112
No log 6.3824 434 0.8690 0.4711 0.8690 0.9322
No log 6.4118 436 0.8735 0.4102 0.8735 0.9346
No log 6.4412 438 0.8691 0.3744 0.8691 0.9322
No log 6.4706 440 0.8532 0.3652 0.8532 0.9237
No log 6.5 442 0.8470 0.4180 0.8470 0.9203
No log 6.5294 444 0.8362 0.4418 0.8362 0.9144
No log 6.5588 446 0.8222 0.5125 0.8222 0.9068
No log 6.5882 448 0.8050 0.5552 0.8050 0.8972
No log 6.6176 450 0.8155 0.5781 0.8155 0.9031
No log 6.6471 452 0.8347 0.5428 0.8347 0.9136
No log 6.6765 454 0.8646 0.5311 0.8646 0.9298
No log 6.7059 456 0.9331 0.5044 0.9331 0.9660
No log 6.7353 458 0.9685 0.4792 0.9685 0.9841
No log 6.7647 460 0.9624 0.4792 0.9624 0.9810
No log 6.7941 462 0.9049 0.5044 0.9049 0.9513
No log 6.8235 464 0.8405 0.4373 0.8405 0.9168
No log 6.8529 466 0.8346 0.4820 0.8346 0.9136
No log 6.8824 468 0.8651 0.5201 0.8651 0.9301
No log 6.9118 470 0.8950 0.4858 0.8950 0.9460
No log 6.9412 472 0.9095 0.4938 0.9095 0.9537
No log 6.9706 474 0.8733 0.5060 0.8733 0.9345
No log 7.0 476 0.8036 0.5094 0.8036 0.8965
No log 7.0294 478 0.7719 0.5061 0.7719 0.8786
No log 7.0588 480 0.7584 0.5061 0.7584 0.8708
No log 7.0882 482 0.7464 0.5732 0.7464 0.8640
No log 7.1176 484 0.7543 0.5569 0.7543 0.8685
No log 7.1471 486 0.7819 0.5517 0.7819 0.8842
No log 7.1765 488 0.7725 0.5517 0.7725 0.8789
No log 7.2059 490 0.7659 0.5649 0.7659 0.8752
No log 7.2353 492 0.7595 0.4860 0.7595 0.8715
No log 7.2647 494 0.7529 0.5169 0.7529 0.8677
No log 7.2941 496 0.7528 0.5085 0.7528 0.8676
No log 7.3235 498 0.7364 0.5387 0.7364 0.8581
0.3214 7.3529 500 0.7306 0.6437 0.7306 0.8547
0.3214 7.3824 502 0.7388 0.6218 0.7388 0.8595
0.3214 7.4118 504 0.7831 0.5892 0.7831 0.8849
0.3214 7.4412 506 0.8143 0.5272 0.8143 0.9024
0.3214 7.4706 508 0.8048 0.5107 0.8048 0.8971
0.3214 7.5 510 0.7578 0.5421 0.7578 0.8705
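The final checkpoint (epoch 7.5, QWK 0.5421) is not the best row in the table by either validation loss or QWK. A minimal sketch of selecting a checkpoint programmatically, using a few rows excerpted from the table above:

```python
# (epoch, validation loss, QWK) rows excerpted from the results table.
rows = [
    (1.7059, 0.7063, 0.6916),
    (1.8235, 0.6935, 0.6967),
    (2.3824, 0.6840, 0.6840),
    (7.5000, 0.7578, 0.5421),
]

# Highest QWK favors ordinal agreement; lowest loss favors the MSE objective.
best_by_qwk = max(rows, key=lambda r: r[2])
best_by_loss = min(rows, key=lambda r: r[1])
```

In this excerpt the two criteria pick different epochs (1.8235 by QWK, 2.3824 by loss), which is why the choice of selection metric matters for ordinal scoring tasks.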

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task2_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4,019 fine-tunes of that base model).