ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a reproduction sketch follows the list):

  • Loss: 0.7187
  • QWK (quadratic weighted kappa): 0.3341
  • MSE (mean squared error): 0.7187
  • RMSE (root mean squared error): 0.8478
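
These metrics can be sanity-checked with scikit-learn, as in the minimal sketch below; the arrays are placeholders, since the evaluation data is not published, and QWK is read as Cohen's kappa with quadratic weights, the usual meaning in essay-scoring cards. Note that Loss and MSE coincide, consistent with an MSE regression objective.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Placeholder labels/predictions -- the real evaluation data is not published.
y_true = np.array([2, 3, 1, 2, 3])
y_pred = np.array([2.2, 2.8, 1.4, 1.9, 2.5])

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK is defined over discrete ratings, so continuous outputs are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"MSE: {mse:.4f}  RMSE: {rmse:.4f}  QWK: {qwk:.4f}")
```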

Model description

More information needed

Intended uses & limitations

More information needed
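
In the meantime, a minimal inference sketch is shown below. It assumes (the card does not confirm this) that the checkpoint loads as a single-logit AutoModelForSequenceClassification regressor scoring the organization of an Arabic essay, which is consistent with the MSE-based evaluation above.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: the fine-tuned head is a single regression logit.
model_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"Predicted organization score: {score:.2f}")
```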

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
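
The hyperparameters above map onto Hugging Face TrainingArguments roughly as follows; this is a sketch, not the original training script, and the output directory and evaluation cadence are assumptions (the results table suggests evaluation every 2 steps).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",           # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",            # assumed from the per-2-step eval log
    eval_steps=2,
)
```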

Training results

Evaluation was run every 2 training steps (17 steps per epoch). "No log" in the first column means the running training loss had not yet been reported; with the Trainer's default logging interval of 500 steps, the first logged value (0.3134) appears at step 500. The log ends at step 542 (epoch ~31.9 of the configured 100), where the metrics match those reported at the top of this card, which suggests training was stopped early.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.1176 2 2.5757 -0.0924 2.5757 1.6049
No log 0.2353 4 1.3587 0.0994 1.3587 1.1656
No log 0.3529 6 1.1844 -0.2292 1.1844 1.0883
No log 0.4706 8 0.9977 -0.0426 0.9977 0.9988
No log 0.5882 10 0.9364 0.1007 0.9364 0.9677
No log 0.7059 12 0.8715 0.1648 0.8715 0.9335
No log 0.8235 14 0.8307 -0.0103 0.8307 0.9114
No log 0.9412 16 0.8269 -0.0483 0.8269 0.9094
No log 1.0588 18 0.9257 0.0495 0.9257 0.9621
No log 1.1765 20 0.9539 -0.0700 0.9539 0.9767
No log 1.2941 22 0.8720 0.0027 0.8720 0.9338
No log 1.4118 24 0.8314 0.0 0.8314 0.9118
No log 1.5294 26 0.8248 0.0 0.8248 0.9082
No log 1.6471 28 0.8307 0.1236 0.8307 0.9114
No log 1.7647 30 0.7798 0.0 0.7798 0.8831
No log 1.8824 32 0.7549 0.0 0.7549 0.8688
No log 2.0 34 0.7683 0.0 0.7683 0.8765
No log 2.1176 36 0.8061 0.0481 0.8061 0.8978
No log 2.2353 38 0.9177 0.2526 0.9177 0.9579
No log 2.3529 40 0.9241 0.3444 0.9241 0.9613
No log 2.4706 42 0.8782 0.3173 0.8782 0.9371
No log 2.5882 44 0.7992 0.1372 0.7992 0.8940
No log 2.7059 46 0.7477 0.0937 0.7477 0.8647
No log 2.8235 48 0.7057 0.0428 0.7057 0.8400
No log 2.9412 50 0.7487 0.3243 0.7487 0.8653
No log 3.0588 52 0.8144 0.1648 0.8144 0.9025
No log 3.1765 54 0.8334 0.1699 0.8334 0.9129
No log 3.2941 56 0.8386 0.1094 0.8386 0.9157
No log 3.4118 58 0.8747 -0.0027 0.8747 0.9353
No log 3.5294 60 0.9345 -0.1275 0.9345 0.9667
No log 3.6471 62 0.8860 -0.0444 0.8860 0.9413
No log 3.7647 64 0.7937 0.0 0.7937 0.8909
No log 3.8824 66 0.7158 0.0889 0.7158 0.8460
No log 4.0 68 0.7009 0.0393 0.7009 0.8372
No log 4.1176 70 0.7320 0.0359 0.7320 0.8556
No log 4.2353 72 0.7952 -0.0051 0.7952 0.8917
No log 4.3529 74 0.8104 0.0265 0.8104 0.9002
No log 4.4706 76 0.8378 0.0927 0.8378 0.9153
No log 4.5882 78 0.8915 0.0966 0.8915 0.9442
No log 4.7059 80 0.9184 0.1699 0.9184 0.9583
No log 4.8235 82 0.9273 0.2171 0.9273 0.9630
No log 4.9412 84 0.9053 0.1972 0.9053 0.9515
No log 5.0588 86 0.9132 0.0245 0.9132 0.9556
No log 5.1765 88 0.8966 0.0968 0.8966 0.9469
No log 5.2941 90 0.9317 0.1303 0.9317 0.9652
No log 5.4118 92 0.9620 0.2063 0.9620 0.9808
No log 5.5294 94 0.9345 0.2632 0.9345 0.9667
No log 5.6471 96 0.8433 0.3238 0.8433 0.9183
No log 5.7647 98 0.8240 0.2007 0.8240 0.9078
No log 5.8824 100 0.8275 -0.0070 0.8275 0.9096
No log 6.0 102 0.8620 0.0362 0.8620 0.9284
No log 6.1176 104 0.8316 0.0697 0.8316 0.9119
No log 6.2353 106 0.8263 0.2345 0.8263 0.9090
No log 6.3529 108 0.8755 0.2604 0.8755 0.9357
No log 6.4706 110 0.8392 0.2171 0.8392 0.9161
No log 6.5882 112 0.8389 0.2063 0.8389 0.9159
No log 6.7059 114 0.8407 0.0 0.8407 0.9169
No log 6.8235 116 0.7234 0.1829 0.7234 0.8505
No log 6.9412 118 0.6556 0.2819 0.6556 0.8097
No log 7.0588 120 0.6881 0.3950 0.6881 0.8295
No log 7.1765 122 0.8563 0.3499 0.8563 0.9254
No log 7.2941 124 0.9866 0.2921 0.9866 0.9933
No log 7.4118 126 0.9981 0.2464 0.9981 0.9991
No log 7.5294 128 1.0640 0.1354 1.0640 1.0315
No log 7.6471 130 1.5347 0.1007 1.5347 1.2388
No log 7.7647 132 1.5733 0.0790 1.5733 1.2543
No log 7.8824 134 1.2396 0.1332 1.2396 1.1134
No log 8.0 136 0.9817 0.0801 0.9817 0.9908
No log 8.1176 138 0.9364 0.2832 0.9364 0.9677
No log 8.2353 140 0.8926 0.3183 0.8926 0.9448
No log 8.3529 142 0.8516 0.3221 0.8516 0.9228
No log 8.4706 144 0.8222 0.2414 0.8222 0.9068
No log 8.5882 146 0.8010 0.2813 0.8010 0.8950
No log 8.7059 148 0.7963 0.2784 0.7963 0.8924
No log 8.8235 150 0.8079 0.3372 0.8079 0.8989
No log 8.9412 152 0.8412 0.3699 0.8412 0.9171
No log 9.0588 154 0.7978 0.3637 0.7978 0.8932
No log 9.1765 156 0.7614 0.3099 0.7614 0.8726
No log 9.2941 158 0.7245 0.1699 0.7245 0.8512
No log 9.4118 160 0.7185 0.1807 0.7185 0.8477
No log 9.5294 162 0.7381 0.1268 0.7381 0.8592
No log 9.6471 164 0.7843 0.2171 0.7843 0.8856
No log 9.7647 166 0.8367 0.2328 0.8367 0.9147
No log 9.8824 168 0.8363 0.1995 0.8363 0.9145
No log 10.0 170 0.8047 0.2589 0.8047 0.8970
No log 10.1176 172 0.8180 0.0652 0.8180 0.9044
No log 10.2353 174 0.8233 0.0652 0.8233 0.9074
No log 10.3529 176 0.7997 0.2027 0.7997 0.8942
No log 10.4706 178 0.8016 0.3372 0.8016 0.8953
No log 10.5882 180 0.7845 0.3819 0.7845 0.8857
No log 10.7059 182 0.6968 0.3032 0.6968 0.8347
No log 10.8235 184 0.6736 0.3425 0.6736 0.8208
No log 10.9412 186 0.7071 0.3127 0.7071 0.8409
No log 11.0588 188 0.7794 0.3372 0.7794 0.8828
No log 11.1765 190 0.8896 0.3782 0.8896 0.9432
No log 11.2941 192 0.8603 0.4251 0.8603 0.9275
No log 11.4118 194 0.7622 0.2784 0.7622 0.8730
No log 11.5294 196 0.7462 0.2683 0.7462 0.8638
No log 11.6471 198 0.7544 0.2683 0.7544 0.8686
No log 11.7647 200 0.7524 0.2319 0.7524 0.8674
No log 11.8824 202 0.8204 0.4089 0.8204 0.9058
No log 12.0 204 0.8332 0.3590 0.8332 0.9128
No log 12.1176 206 0.8053 0.1264 0.8053 0.8974
No log 12.2353 208 0.8030 0.2043 0.8030 0.8961
No log 12.3529 210 0.8035 0.1051 0.8035 0.8964
No log 12.4706 212 0.7790 0.2158 0.7790 0.8826
No log 12.5882 214 0.7441 0.2319 0.7441 0.8626
No log 12.7059 216 0.7776 0.3894 0.7776 0.8818
No log 12.8235 218 0.7996 0.4014 0.7996 0.8942
No log 12.9412 220 0.7439 0.4052 0.7439 0.8625
No log 13.0588 222 0.7111 0.3471 0.7111 0.8433
No log 13.1765 224 0.6985 0.3341 0.6985 0.8357
No log 13.2941 226 0.7284 0.3545 0.7284 0.8535
No log 13.4118 228 0.8148 0.3372 0.8148 0.9027
No log 13.5294 230 0.9035 0.3519 0.9035 0.9505
No log 13.6471 232 0.8974 0.2754 0.8974 0.9473
No log 13.7647 234 0.8592 0.2847 0.8592 0.9269
No log 13.8824 236 0.8327 0.3737 0.8327 0.9125
No log 14.0 238 0.8219 0.3918 0.8219 0.9066
No log 14.1176 240 0.7738 0.3032 0.7738 0.8797
No log 14.2353 242 0.7455 0.3712 0.7455 0.8634
No log 14.3529 244 0.7875 0.3302 0.7875 0.8874
No log 14.4706 246 0.8593 0.3869 0.8593 0.9270
No log 14.5882 248 0.9272 0.3825 0.9272 0.9629
No log 14.7059 250 0.8637 0.3538 0.8637 0.9294
No log 14.8235 252 0.7736 0.3894 0.7736 0.8795
No log 14.9412 254 0.7035 0.3594 0.7035 0.8387
No log 15.0588 256 0.6780 0.4001 0.6780 0.8234
No log 15.1765 258 0.6778 0.4291 0.6778 0.8233
No log 15.2941 260 0.6805 0.4001 0.6805 0.8249
No log 15.4118 262 0.7081 0.4158 0.7081 0.8415
No log 15.5294 264 0.7434 0.3868 0.7434 0.8622
No log 15.6471 266 0.8074 0.3746 0.8074 0.8986
No log 15.7647 268 0.8077 0.3699 0.8077 0.8987
No log 15.8824 270 0.8001 0.4014 0.8001 0.8945
No log 16.0 272 0.7637 0.3770 0.7637 0.8739
No log 16.1176 274 0.7132 0.3518 0.7132 0.8445
No log 16.2353 276 0.7116 0.3238 0.7116 0.8436
No log 16.3529 278 0.7267 0.3238 0.7267 0.8525
No log 16.4706 280 0.7366 0.3099 0.7366 0.8583
No log 16.5882 282 0.7447 0.3712 0.7447 0.8630
No log 16.7059 284 0.7767 0.3712 0.7767 0.8813
No log 16.8235 286 0.8651 0.3675 0.8651 0.9301
No log 16.9412 288 0.8676 0.3606 0.8676 0.9315
No log 17.0588 290 0.8148 0.3819 0.8148 0.9027
No log 17.1765 292 0.8137 0.3637 0.8137 0.9021
No log 17.2941 294 0.7826 0.3737 0.7826 0.8847
No log 17.4118 296 0.7781 0.2847 0.7781 0.8821
No log 17.5294 298 0.7970 0.2319 0.7970 0.8927
No log 17.6471 300 0.8518 0.1918 0.8518 0.9230
No log 17.7647 302 0.9251 0.1866 0.9251 0.9618
No log 17.8824 304 0.9407 0.1866 0.9407 0.9699
No log 18.0 306 0.8990 0.1918 0.8990 0.9482
No log 18.1176 308 0.8168 0.2847 0.8168 0.9038
No log 18.2353 310 0.7374 0.3919 0.7374 0.8587
No log 18.3529 312 0.7206 0.3919 0.7206 0.8489
No log 18.4706 314 0.7563 0.4684 0.7563 0.8696
No log 18.5882 316 0.7759 0.4684 0.7759 0.8809
No log 18.7059 318 0.7929 0.4684 0.7929 0.8905
No log 18.8235 320 0.7879 0.3572 0.7879 0.8876
No log 18.9412 322 0.7850 0.3572 0.7850 0.8860
No log 19.0588 324 0.7964 0.4270 0.7964 0.8924
No log 19.1765 326 0.8016 0.3996 0.8016 0.8953
No log 19.2941 328 0.8174 0.3590 0.8174 0.9041
No log 19.4118 330 0.7876 0.4247 0.7876 0.8874
No log 19.5294 332 0.8040 0.3770 0.8040 0.8966
No log 19.6471 334 0.7851 0.3770 0.7851 0.8861
No log 19.7647 336 0.7121 0.4592 0.7121 0.8439
No log 19.8824 338 0.6655 0.2819 0.6655 0.8158
No log 20.0 340 0.6671 0.2819 0.6671 0.8167
No log 20.1176 342 0.6997 0.3782 0.6997 0.8365
No log 20.2353 344 0.7596 0.4076 0.7596 0.8715
No log 20.3529 346 0.8016 0.3372 0.8016 0.8953
No log 20.4706 348 0.8415 0.3372 0.8415 0.9173
No log 20.5882 350 0.8207 0.3519 0.8207 0.9059
No log 20.7059 352 0.8039 0.3519 0.8039 0.8966
No log 20.8235 354 0.7602 0.3544 0.7602 0.8719
No log 20.9412 356 0.7442 0.4190 0.7442 0.8626
No log 21.0588 358 0.7384 0.4167 0.7384 0.8593
No log 21.1765 360 0.7168 0.4479 0.7168 0.8467
No log 21.2941 362 0.7191 0.4479 0.7191 0.8480
No log 21.4118 364 0.7384 0.4576 0.7384 0.8593
No log 21.5294 366 0.7592 0.4167 0.7592 0.8713
No log 21.6471 368 0.7503 0.4576 0.7503 0.8662
No log 21.7647 370 0.7116 0.4576 0.7116 0.8436
No log 21.8824 372 0.6735 0.3755 0.6735 0.8207
No log 22.0 374 0.6573 0.3123 0.6573 0.8107
No log 22.1176 376 0.6633 0.3976 0.6633 0.8144
No log 22.2353 378 0.7019 0.4753 0.7019 0.8378
No log 22.3529 380 0.7806 0.4167 0.7806 0.8835
No log 22.4706 382 0.9162 0.3913 0.9162 0.9572
No log 22.5882 384 0.9838 0.3128 0.9838 0.9919
No log 22.7059 386 0.9348 0.3012 0.9348 0.9668
No log 22.8235 388 0.8444 0.2883 0.8444 0.9189
No log 22.9412 390 0.7869 0.2145 0.7869 0.8871
No log 23.0588 392 0.7715 0.1863 0.7715 0.8783
No log 23.1765 394 0.7584 0.2206 0.7584 0.8709
No log 23.2941 396 0.7867 0.3020 0.7867 0.8870
No log 23.4118 398 0.8898 0.3972 0.8898 0.9433
No log 23.5294 400 0.9919 0.3473 0.9919 0.9960
No log 23.6471 402 1.0320 0.2781 1.0320 1.0159
No log 23.7647 404 1.0073 0.2387 1.0073 1.0036
No log 23.8824 406 0.9323 0.3060 0.9323 0.9656
No log 24.0 408 0.9247 0.2532 0.9247 0.9616
No log 24.1176 410 0.9271 0.2838 0.9271 0.9629
No log 24.2353 412 0.9283 0.3106 0.9283 0.9635
No log 24.3529 414 0.9059 0.2813 0.9059 0.9518
No log 24.4706 416 0.9277 0.3344 0.9277 0.9632
No log 24.5882 418 0.9200 0.3918 0.9200 0.9591
No log 24.7059 420 0.8989 0.3991 0.8989 0.9481
No log 24.8235 422 0.8600 0.3894 0.8600 0.9274
No log 24.9412 424 0.8149 0.4392 0.8149 0.9027
No log 25.0588 426 0.7731 0.4243 0.7731 0.8793
No log 25.1765 428 0.7624 0.3976 0.7624 0.8731
No log 25.2941 430 0.7936 0.4243 0.7936 0.8909
No log 25.4118 432 0.8036 0.3622 0.8036 0.8964
No log 25.5294 434 0.8172 0.3224 0.8172 0.9040
No log 25.6471 436 0.7936 0.3224 0.7936 0.8908
No log 25.7647 438 0.7833 0.3498 0.7833 0.8851
No log 25.8824 440 0.7797 0.4663 0.7797 0.8830
No log 26.0 442 0.8268 0.3843 0.8268 0.9093
No log 26.1176 444 0.8310 0.3843 0.8310 0.9116
No log 26.2353 446 0.7962 0.4753 0.7962 0.8923
No log 26.3529 448 0.7693 0.4753 0.7693 0.8771
No log 26.4706 450 0.7771 0.4479 0.7771 0.8816
No log 26.5882 452 0.8233 0.4052 0.8233 0.9074
No log 26.7059 454 0.8303 0.3637 0.8303 0.9112
No log 26.8235 456 0.7922 0.4479 0.7922 0.8901
No log 26.9412 458 0.7423 0.3622 0.7423 0.8616
No log 27.0588 460 0.7357 0.2981 0.7357 0.8577
No log 27.1765 462 0.7645 0.3594 0.7645 0.8744
No log 27.2941 464 0.8427 0.3972 0.8427 0.9180
No log 27.4118 466 0.9057 0.3675 0.9057 0.9517
No log 27.5294 468 0.9028 0.3675 0.9028 0.9501
No log 27.6471 470 0.8554 0.3894 0.8554 0.9249
No log 27.7647 472 0.8067 0.4247 0.8067 0.8982
No log 27.8824 474 0.8030 0.4247 0.8030 0.8961
No log 28.0 476 0.7978 0.4247 0.7978 0.8932
No log 28.1176 478 0.8022 0.4052 0.8022 0.8957
No log 28.2353 480 0.8193 0.3972 0.8193 0.9051
No log 28.3529 482 0.8121 0.4479 0.8121 0.9012
No log 28.4706 484 0.8070 0.4479 0.8070 0.8984
No log 28.5882 486 0.7850 0.4219 0.7850 0.8860
No log 28.7059 488 0.7744 0.4502 0.7744 0.8800
No log 28.8235 490 0.7633 0.4502 0.7633 0.8737
No log 28.9412 492 0.7753 0.4774 0.7753 0.8805
No log 29.0588 494 0.8084 0.4076 0.8084 0.8991
No log 29.1765 496 0.8381 0.4330 0.8381 0.9155
No log 29.2941 498 0.8586 0.4409 0.8586 0.9266
0.3134 29.4118 500 0.8490 0.4224 0.8490 0.9214
0.3134 29.5294 502 0.8322 0.3972 0.8322 0.9123
0.3134 29.6471 504 0.7995 0.4479 0.7995 0.8942
0.3134 29.7647 506 0.7992 0.4845 0.7992 0.8940
0.3134 29.8824 508 0.8180 0.4414 0.8180 0.9044
0.3134 30.0 510 0.8297 0.4414 0.8297 0.9109
0.3134 30.1176 512 0.8231 0.3649 0.8231 0.9072
0.3134 30.2353 514 0.8386 0.3737 0.8386 0.9158
0.3134 30.3529 516 0.8447 0.3737 0.8447 0.9191
0.3134 30.4706 518 0.8344 0.3471 0.8344 0.9135
0.3134 30.5882 520 0.7964 0.4330 0.7964 0.8924
0.3134 30.7059 522 0.7581 0.3782 0.7581 0.8707
0.3134 30.8235 524 0.7483 0.3782 0.7483 0.8651
0.3134 30.9412 526 0.7706 0.4414 0.7706 0.8779
0.3134 31.0588 528 0.8221 0.3972 0.8221 0.9067
0.3134 31.1765 530 0.8522 0.3972 0.8522 0.9231
0.3134 31.2941 532 0.8332 0.4392 0.8332 0.9128
0.3134 31.4118 534 0.7971 0.4219 0.7971 0.8928
0.3134 31.5294 536 0.7689 0.3950 0.7689 0.8768
0.3134 31.6471 538 0.7299 0.3976 0.7299 0.8543
0.3134 31.7647 540 0.7136 0.3976 0.7136 0.8447
0.3134 31.8824 542 0.7187 0.3341 0.7187 0.8478
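
The per-step QWK/MSE/RMSE columns are consistent with a Trainer compute_metrics callback along these lines; this is a hypothetical reconstruction, not the authors' code.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    """Hypothetical callback reproducing the metric columns above."""
    predictions, labels = eval_pred
    preds = np.asarray(predictions).squeeze()  # one regression logit per example
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int), np.rint(preds).astype(int), weights="quadratic"
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```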

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

0.1B parameters (F32, safetensors)
