ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k15_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8046
  • Qwk: 0.4851
  • Mse: 0.8046
  • Rmse: 0.8970
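Qwk is quadratic weighted kappa, and Mse/Rmse are (root) mean squared error; when the task is ordinal scoring, Mse equals the squared Rmse, as the numbers above confirm (0.8970² ≈ 0.8046). A minimal sketch of how these metrics can be computed with scikit-learn, using toy labels rather than outputs from this model:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and predictions on an ordinal scale (illustrative only)
y_true = np.array([0, 1, 2, 3, 4, 2, 1, 3])
y_pred = np.array([0, 1, 2, 2, 4, 3, 1, 3])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```

The `weights="quadratic"` option penalizes disagreements by the squared distance between scores, which is why Qwk is the standard agreement metric for essay-scoring rubrics.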

Model description

More information needed

Intended uses & limitations

More information needed
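The card does not document the output format. If the model is used as a regressor for essay-organization scores, the raw output would typically be rounded and clipped to the rubric's range; a sketch of that postprocessing step, where the 0–4 scale is a placeholder assumption (the actual rubric is not documented here):

```python
def output_to_score(value: float, low: int = 0, high: int = 4) -> int:
    """Map a raw regression output to an integer score on a closed range.

    The 0-4 range is hypothetical; the real score scale for this task
    is not stated on the model card.
    """
    # Round to the nearest integer, then clamp into [low, high]
    return int(min(max(round(value), low), high))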

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
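With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 2e-05 to 0 over the training run. A small helper illustrating that schedule (my own sketch, not taken from the training code):

```python
def linear_lr(step: int, total_steps: int,
              base_lr: float = 2e-05, warmup_steps: int = 0) -> float:
    """Learning rate at a given step under linear warmup + linear decay."""
    if warmup_steps and step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup
        return base_lr * step / warmup_steps
    # Decay linearly from base_lr to 0 over the remaining steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, halfway through training the learning rate is half the initial value, and it reaches exactly 0 at the final step.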

Training results

The training loss is logged every 500 steps, so rows before step 500 show "No log" in the first column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0235 2 4.6910 0.0042 4.6910 2.1659
No log 0.0471 4 3.1175 -0.0274 3.1175 1.7657
No log 0.0706 6 1.6282 0.0372 1.6282 1.2760
No log 0.0941 8 1.4815 -0.0046 1.4815 1.2172
No log 0.1176 10 1.6961 -0.0061 1.6961 1.3023
No log 0.1412 12 1.2889 0.0914 1.2889 1.1353
No log 0.1647 14 1.1952 0.1814 1.1952 1.0933
No log 0.1882 16 1.1958 0.0872 1.1958 1.0935
No log 0.2118 18 1.2224 0.2155 1.2224 1.1056
No log 0.2353 20 1.2283 0.3243 1.2283 1.1083
No log 0.2588 22 1.1258 0.3602 1.1258 1.0610
No log 0.2824 24 1.0471 0.2643 1.0471 1.0233
No log 0.3059 26 1.0586 0.3681 1.0586 1.0289
No log 0.3294 28 1.1548 0.2918 1.1548 1.0746
No log 0.3529 30 1.1011 0.2673 1.1011 1.0493
No log 0.3765 32 1.0955 0.2490 1.0955 1.0467
No log 0.4 34 1.0604 0.2681 1.0604 1.0298
No log 0.4235 36 1.1563 0.3491 1.1563 1.0753
No log 0.4471 38 1.1041 0.2786 1.1041 1.0507
No log 0.4706 40 1.0590 0.4444 1.0590 1.0291
No log 0.4941 42 0.9744 0.3404 0.9744 0.9871
No log 0.5176 44 0.9540 0.3771 0.9540 0.9767
No log 0.5412 46 0.9354 0.3771 0.9354 0.9672
No log 0.5647 48 0.9126 0.4737 0.9126 0.9553
No log 0.5882 50 0.9086 0.4945 0.9086 0.9532
No log 0.6118 52 0.8954 0.4930 0.8954 0.9462
No log 0.6353 54 0.8925 0.4604 0.8925 0.9447
No log 0.6588 56 0.9874 0.4490 0.9874 0.9937
No log 0.6824 58 1.0950 0.4705 1.0950 1.0464
No log 0.7059 60 1.0086 0.5150 1.0086 1.0043
No log 0.7294 62 0.8665 0.4694 0.8665 0.9309
No log 0.7529 64 0.8241 0.5102 0.8241 0.9078
No log 0.7765 66 0.8182 0.4978 0.8182 0.9046
No log 0.8 68 0.8288 0.5216 0.8288 0.9104
No log 0.8235 70 0.7672 0.5054 0.7672 0.8759
No log 0.8471 72 0.7460 0.6044 0.7460 0.8637
No log 0.8706 74 0.7454 0.6343 0.7454 0.8634
No log 0.8941 76 0.7849 0.6131 0.7849 0.8860
No log 0.9176 78 0.9287 0.5605 0.9287 0.9637
No log 0.9412 80 0.9326 0.5754 0.9326 0.9657
No log 0.9647 82 0.9130 0.5605 0.9130 0.9555
No log 0.9882 84 0.9981 0.5659 0.9981 0.9991
No log 1.0118 86 0.8794 0.5702 0.8794 0.9378
No log 1.0353 88 0.8329 0.5712 0.8329 0.9127
No log 1.0588 90 0.7927 0.5648 0.7927 0.8903
No log 1.0824 92 0.7685 0.5646 0.7685 0.8766
No log 1.1059 94 0.7853 0.4853 0.7853 0.8862
No log 1.1294 96 0.7785 0.5026 0.7785 0.8823
No log 1.1529 98 0.7490 0.5735 0.7490 0.8654
No log 1.1765 100 0.7467 0.5571 0.7467 0.8641
No log 1.2 102 0.9169 0.5899 0.9169 0.9576
No log 1.2235 104 1.0799 0.5273 1.0799 1.0392
No log 1.2471 106 1.1364 0.5273 1.1364 1.0660
No log 1.2706 108 0.8970 0.5769 0.8970 0.9471
No log 1.2941 110 0.7581 0.6038 0.7581 0.8707
No log 1.3176 112 0.7452 0.5632 0.7452 0.8632
No log 1.3412 114 0.7426 0.5699 0.7426 0.8618
No log 1.3647 116 0.7647 0.5545 0.7647 0.8744
No log 1.3882 118 0.7852 0.5498 0.7852 0.8861
No log 1.4118 120 0.7808 0.5572 0.7808 0.8836
No log 1.4353 122 0.8102 0.6012 0.8102 0.9001
No log 1.4588 124 1.0506 0.5710 1.0506 1.0250
No log 1.4824 126 1.0456 0.5729 1.0456 1.0226
No log 1.5059 128 0.8289 0.5580 0.8289 0.9104
No log 1.5294 130 0.8726 0.5730 0.8726 0.9341
No log 1.5529 132 0.8489 0.5631 0.8489 0.9214
No log 1.5765 134 0.7600 0.5550 0.7600 0.8718
No log 1.6 136 1.2128 0.5336 1.2128 1.1013
No log 1.6235 138 1.6312 0.3265 1.6312 1.2772
No log 1.6471 140 1.5043 0.3058 1.5043 1.2265
No log 1.6706 142 1.1280 0.4654 1.1280 1.0621
No log 1.6941 144 0.7438 0.5862 0.7438 0.8625
No log 1.7176 146 0.7437 0.5909 0.7437 0.8624
No log 1.7412 148 0.7080 0.6605 0.7080 0.8414
No log 1.7647 150 0.7329 0.5070 0.7329 0.8561
No log 1.7882 152 0.9075 0.5976 0.9075 0.9526
No log 1.8118 154 1.0260 0.5763 1.0260 1.0129
No log 1.8353 156 0.8291 0.5826 0.8291 0.9105
No log 1.8588 158 0.6786 0.5570 0.6786 0.8238
No log 1.8824 160 0.6564 0.6756 0.6564 0.8102
No log 1.9059 162 0.6666 0.6015 0.6666 0.8165
No log 1.9294 164 0.7700 0.5708 0.7700 0.8775
No log 1.9529 166 0.9759 0.4939 0.9759 0.9879
No log 1.9765 168 1.0079 0.4477 1.0079 1.0040
No log 2.0 170 0.8872 0.5485 0.8872 0.9419
No log 2.0235 172 0.7282 0.6196 0.7282 0.8533
No log 2.0471 174 0.6490 0.6769 0.6490 0.8056
No log 2.0706 176 0.6155 0.7080 0.6155 0.7845
No log 2.0941 178 0.6158 0.7157 0.6158 0.7847
No log 2.1176 180 0.6225 0.6460 0.6225 0.7890
No log 2.1412 182 0.7116 0.6369 0.7116 0.8436
No log 2.1647 184 0.7057 0.6195 0.7057 0.8400
No log 2.1882 186 0.6527 0.5622 0.6527 0.8079
No log 2.2118 188 0.6634 0.6017 0.6634 0.8145
No log 2.2353 190 0.6354 0.6561 0.6354 0.7971
No log 2.2588 192 0.6593 0.6356 0.6593 0.8120
No log 2.2824 194 0.7774 0.5719 0.7774 0.8817
No log 2.3059 196 0.9931 0.5392 0.9931 0.9966
No log 2.3294 198 1.0199 0.5583 1.0199 1.0099
No log 2.3529 200 0.8153 0.5719 0.8153 0.9029
No log 2.3765 202 0.6666 0.6154 0.6666 0.8165
No log 2.4 204 0.6639 0.6319 0.6639 0.8148
No log 2.4235 206 0.8097 0.5614 0.8097 0.8998
No log 2.4471 208 0.9692 0.5583 0.9692 0.9845
No log 2.4706 210 0.9481 0.5486 0.9481 0.9737
No log 2.4941 212 0.9027 0.5486 0.9027 0.9501
No log 2.5176 214 0.7229 0.6228 0.7229 0.8502
No log 2.5412 216 0.6752 0.6559 0.6752 0.8217
No log 2.5647 218 0.6995 0.6518 0.6995 0.8363
No log 2.5882 220 0.8144 0.5322 0.8144 0.9025
No log 2.6118 222 1.0301 0.5206 1.0301 1.0150
No log 2.6353 224 0.9696 0.5055 0.9696 0.9847
No log 2.6588 226 0.7473 0.5426 0.7473 0.8645
No log 2.6824 228 0.7015 0.5987 0.7015 0.8376
No log 2.7059 230 0.7394 0.5515 0.7394 0.8599
No log 2.7294 232 0.9075 0.4883 0.9075 0.9526
No log 2.7529 234 1.1942 0.4968 1.1942 1.0928
No log 2.7765 236 1.1217 0.5065 1.1217 1.0591
No log 2.8 238 0.8189 0.5408 0.8189 0.9049
No log 2.8235 240 0.6503 0.6709 0.6503 0.8064
No log 2.8471 242 0.6618 0.6540 0.6618 0.8135
No log 2.8706 244 0.7004 0.6319 0.7004 0.8369
No log 2.8941 246 0.8550 0.4824 0.8550 0.9247
No log 2.9176 248 1.0334 0.5016 1.0334 1.0166
No log 2.9412 250 1.0454 0.5016 1.0454 1.0224
No log 2.9647 252 0.9152 0.4603 0.9152 0.9567
No log 2.9882 254 0.8013 0.5769 0.8013 0.8952
No log 3.0118 256 0.8000 0.5769 0.8000 0.8944
No log 3.0353 258 0.9289 0.5384 0.9289 0.9638
No log 3.0588 260 1.1426 0.4724 1.1426 1.0689
No log 3.0824 262 1.1315 0.5191 1.1315 1.0637
No log 3.1059 264 0.9209 0.5591 0.9209 0.9597
No log 3.1294 266 0.8149 0.5800 0.8149 0.9027
No log 3.1529 268 0.7846 0.5182 0.7846 0.8858
No log 3.1765 270 0.7834 0.5153 0.7834 0.8851
No log 3.2 272 0.7728 0.5510 0.7728 0.8791
No log 3.2235 274 0.8061 0.5444 0.8061 0.8978
No log 3.2471 276 0.7865 0.5531 0.7865 0.8868
No log 3.2706 278 0.6609 0.6217 0.6609 0.8130
No log 3.2941 280 0.6281 0.6163 0.6281 0.7925
No log 3.3176 282 0.6213 0.5997 0.6213 0.7882
No log 3.3412 284 0.6658 0.6388 0.6658 0.8160
No log 3.3647 286 0.8315 0.5748 0.8315 0.9118
No log 3.3882 288 0.9602 0.5214 0.9602 0.9799
No log 3.4118 290 0.9038 0.5548 0.9038 0.9507
No log 3.4353 292 0.8030 0.5487 0.8030 0.8961
No log 3.4588 294 0.7313 0.5264 0.7313 0.8551
No log 3.4824 296 0.7656 0.5118 0.7656 0.8750
No log 3.5059 298 0.8188 0.5306 0.8188 0.9049
No log 3.5294 300 0.7826 0.5326 0.7826 0.8846
No log 3.5529 302 0.7083 0.5495 0.7083 0.8416
No log 3.5765 304 0.6844 0.6131 0.6844 0.8273
No log 3.6 306 0.7483 0.5600 0.7483 0.8651
No log 3.6235 308 0.9821 0.5703 0.9821 0.9910
No log 3.6471 310 1.0223 0.5681 1.0223 1.0111
No log 3.6706 312 0.9174 0.5626 0.9174 0.9578
No log 3.6941 314 0.9011 0.5548 0.9011 0.9492
No log 3.7176 316 0.7846 0.5276 0.7846 0.8858
No log 3.7412 318 0.7479 0.5585 0.7479 0.8648
No log 3.7647 320 0.7769 0.4954 0.7769 0.8814
No log 3.7882 322 0.7898 0.5111 0.7898 0.8887
No log 3.8118 324 0.7845 0.5044 0.7845 0.8857
No log 3.8353 326 0.8243 0.5400 0.8243 0.9079
No log 3.8588 328 0.9730 0.5347 0.9730 0.9864
No log 3.8824 330 1.0326 0.5402 1.0326 1.0162
No log 3.9059 332 0.9584 0.5262 0.9584 0.9790
No log 3.9294 334 0.8859 0.5056 0.8859 0.9412
No log 3.9529 336 0.8365 0.5056 0.8365 0.9146
No log 3.9765 338 0.7043 0.5424 0.7043 0.8392
No log 4.0 340 0.6610 0.6006 0.6610 0.8130
No log 4.0235 342 0.6859 0.5847 0.6859 0.8282
No log 4.0471 344 0.7678 0.5710 0.7678 0.8762
No log 4.0706 346 0.7283 0.5710 0.7283 0.8534
No log 4.0941 348 0.6768 0.6057 0.6768 0.8227
No log 4.1176 350 0.7046 0.6414 0.7046 0.8394
No log 4.1412 352 0.6817 0.6753 0.6817 0.8257
No log 4.1647 354 0.7221 0.5470 0.7221 0.8498
No log 4.1882 356 0.8107 0.5490 0.8107 0.9004
No log 4.2118 358 0.8468 0.4742 0.8468 0.9202
No log 4.2353 360 0.8897 0.4918 0.8897 0.9433
No log 4.2588 362 0.9287 0.4625 0.9287 0.9637
No log 4.2824 364 0.8917 0.4663 0.8917 0.9443
No log 4.3059 366 0.9437 0.5083 0.9437 0.9715
No log 4.3294 368 1.0006 0.4963 1.0006 1.0003
No log 4.3529 370 0.9684 0.5083 0.9684 0.9841
No log 4.3765 372 0.8429 0.5537 0.8429 0.9181
No log 4.4 374 0.7973 0.4808 0.7973 0.8929
No log 4.4235 376 0.8152 0.5045 0.8152 0.9029
No log 4.4471 378 0.8459 0.4826 0.8459 0.9197
No log 4.4706 380 0.9271 0.4545 0.9271 0.9629
No log 4.4941 382 0.9016 0.4545 0.9016 0.9495
No log 4.5176 384 0.8756 0.4655 0.8756 0.9357
No log 4.5412 386 0.8840 0.4972 0.8840 0.9402
No log 4.5647 388 0.9005 0.4655 0.9005 0.9490
No log 4.5882 390 0.8763 0.5000 0.8763 0.9361
No log 4.6118 392 0.8547 0.4754 0.8547 0.9245
No log 4.6353 394 0.8789 0.4519 0.8789 0.9375
No log 4.6588 396 0.8745 0.4291 0.8745 0.9352
No log 4.6824 398 0.8258 0.4871 0.8258 0.9087
No log 4.7059 400 0.8238 0.4764 0.8238 0.9076
No log 4.7294 402 0.8675 0.4378 0.8675 0.9314
No log 4.7529 404 0.8358 0.4416 0.8358 0.9142
No log 4.7765 406 0.8426 0.4507 0.8426 0.9179
No log 4.8 408 0.9389 0.4958 0.9389 0.9689
No log 4.8235 410 1.1025 0.4950 1.1025 1.0500
No log 4.8471 412 1.1524 0.4723 1.1524 1.0735
No log 4.8706 414 1.0093 0.5334 1.0093 1.0046
No log 4.8941 416 0.8377 0.4911 0.8377 0.9152
No log 4.9176 418 0.7389 0.5606 0.7389 0.8596
No log 4.9412 420 0.7305 0.5790 0.7305 0.8547
No log 4.9647 422 0.7283 0.5930 0.7283 0.8534
No log 4.9882 424 0.8457 0.4911 0.8457 0.9196
No log 5.0118 426 1.0981 0.4829 1.0981 1.0479
No log 5.0353 428 1.1947 0.4234 1.1947 1.0930
No log 5.0588 430 1.1108 0.4235 1.1108 1.0539
No log 5.0824 432 1.0177 0.4275 1.0177 1.0088
No log 5.1059 434 0.9027 0.4552 0.9027 0.9501
No log 5.1294 436 0.8592 0.4803 0.8592 0.9269
No log 5.1529 438 0.8769 0.4986 0.8769 0.9364
No log 5.1765 440 0.9939 0.5465 0.9939 0.9969
No log 5.2 442 1.0071 0.5066 1.0071 1.0035
No log 5.2235 444 0.8521 0.5851 0.8521 0.9231
No log 5.2471 446 0.7670 0.5823 0.7670 0.8758
No log 5.2706 448 0.6836 0.6546 0.6836 0.8268
No log 5.2941 450 0.6992 0.5875 0.6992 0.8362
No log 5.3176 452 0.8364 0.5403 0.8364 0.9146
No log 5.3412 454 0.9052 0.5080 0.9052 0.9514
No log 5.3647 456 0.8968 0.5207 0.8968 0.9470
No log 5.3882 458 0.8474 0.5334 0.8474 0.9206
No log 5.4118 460 0.7769 0.5470 0.7769 0.8814
No log 5.4353 462 0.7429 0.5663 0.7429 0.8619
No log 5.4588 464 0.7382 0.5585 0.7382 0.8592
No log 5.4824 466 0.7962 0.5577 0.7962 0.8923
No log 5.5059 468 0.8061 0.5577 0.8061 0.8978
No log 5.5294 470 0.8421 0.5719 0.8421 0.9177
No log 5.5529 472 0.7972 0.5685 0.7972 0.8929
No log 5.5765 474 0.7422 0.5495 0.7422 0.8615
No log 5.6 476 0.7514 0.5649 0.7514 0.8668
No log 5.6235 478 0.8242 0.5980 0.8242 0.9079
No log 5.6471 480 0.8348 0.5980 0.8348 0.9137
No log 5.6706 482 0.7750 0.5537 0.7750 0.8803
No log 5.6941 484 0.7393 0.5787 0.7393 0.8598
No log 5.7176 486 0.7727 0.5561 0.7727 0.8791
No log 5.7412 488 0.9142 0.5614 0.9142 0.9561
No log 5.7647 490 1.0086 0.5174 1.0086 1.0043
No log 5.7882 492 1.0133 0.5002 1.0133 1.0066
No log 5.8118 494 0.9341 0.4663 0.9341 0.9665
No log 5.8353 496 0.8876 0.4986 0.8876 0.9421
No log 5.8588 498 0.8392 0.5326 0.8392 0.9161
0.3698 5.8824 500 0.8506 0.5326 0.8506 0.9223
0.3698 5.9059 502 0.8755 0.5306 0.8755 0.9357
0.3698 5.9294 504 0.8705 0.5306 0.8705 0.9330
0.3698 5.9529 506 0.8321 0.5347 0.8321 0.9122
0.3698 5.9765 508 0.8228 0.5014 0.8228 0.9071
0.3698 6.0 510 0.8334 0.4952 0.8334 0.9129
0.3698 6.0235 512 0.7862 0.5317 0.7862 0.8867
0.3698 6.0471 514 0.7443 0.5585 0.7443 0.8627
0.3698 6.0706 516 0.7473 0.5537 0.7473 0.8644
0.3698 6.0941 518 0.8602 0.6083 0.8602 0.9275
0.3698 6.1176 520 1.0116 0.5105 1.0116 1.0058
0.3698 6.1412 522 1.1008 0.5343 1.1008 1.0492
0.3698 6.1647 524 1.0441 0.5398 1.0441 1.0218
0.3698 6.1882 526 0.8421 0.5156 0.8421 0.9176
0.3698 6.2118 528 0.7724 0.5368 0.7724 0.8789
0.3698 6.2353 530 0.7763 0.5368 0.7763 0.8811
0.3698 6.2588 532 0.8404 0.5156 0.8404 0.9167
0.3698 6.2824 534 0.9524 0.5165 0.9524 0.9759
0.3698 6.3059 536 0.9552 0.4685 0.9552 0.9773
0.3698 6.3294 538 0.8561 0.5000 0.8561 0.9253
0.3698 6.3529 540 0.8046 0.4851 0.8046 0.8970

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
  • Model size: 0.1B params (Safetensors, F32 tensors)
