ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7478
  • Qwk: 0.1327
  • Mse: 0.7478
  • Rmse: 0.8648

Model description

More information needed

Intended uses & limitations

More information needed
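The card provides no usage example. A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub under the full repo id and exposes a single regression head (the identical Loss and Mse values suggest a regression objective); `score_organization` is a hypothetical helper name, not part of the release:

```python
MODEL_ID = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task3_organization"

def score_organization(essay: str) -> float:
    """Return the model's raw organization score for an Arabic essay.

    Requires `transformers` and `torch`; downloads the checkpoint on first use.
    """
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    model.eval()
    inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().item()
```

The output scale depends on how the organization labels were encoded during fine-tuning, which the card does not document.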

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
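The log cadence in the results table lets us back out the training-set size, even though the card does not state it: each row advances the step counter by 2 and the epoch counter by roughly 0.0952, implying about 21 optimizer steps per epoch and thus roughly 168 training examples at batch size 8. This is an inference from the logs, not a documented figure:

```python
# Back out the training-set size from the log cadence (an inference, not stated in the card).
steps_per_eval = 2        # "Step" column advances by 2 per row
epoch_per_eval = 0.0952   # "Epoch" column advances by ~0.0952 per row
train_batch_size = 8      # from the hyperparameters above

steps_per_epoch = round(steps_per_eval / epoch_per_eval)
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)  # -> 21 168
```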

Training results

"No log" in the Training Loss column means the loss had not yet been logged; the first logged value appears at step 500. Although num_epochs was set to 100, the log ends at epoch ~24.3 (step 510), consistent with early stopping.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0952 2 4.0130 0.0 4.0130 2.0033
No log 0.1905 4 2.9845 0.0024 2.9845 1.7276
No log 0.2857 6 1.3693 0.0 1.3693 1.1702
No log 0.3810 8 0.9624 -0.0067 0.9624 0.9810
No log 0.4762 10 0.8139 0.1605 0.8139 0.9022
No log 0.5714 12 0.6796 0.0 0.6796 0.8244
No log 0.6667 14 0.6969 0.0 0.6969 0.8348
No log 0.7619 16 0.8251 -0.0079 0.8251 0.9084
No log 0.8571 18 1.2680 0.0100 1.2680 1.1260
No log 0.9524 20 2.0575 0.0153 2.0575 1.4344
No log 1.0476 22 1.5562 0.0032 1.5562 1.2475
No log 1.1429 24 0.8112 0.0129 0.8112 0.9007
No log 1.2381 26 0.7424 0.0460 0.7424 0.8616
No log 1.3333 28 0.7816 -0.0215 0.7816 0.8841
No log 1.4286 30 0.7374 -0.0131 0.7374 0.8587
No log 1.5238 32 0.7584 -0.0571 0.7584 0.8708
No log 1.6190 34 0.7279 -0.0609 0.7279 0.8532
No log 1.7143 36 0.8489 0.0409 0.8489 0.9214
No log 1.8095 38 1.0365 0.0147 1.0365 1.0181
No log 1.9048 40 0.8905 -0.0320 0.8905 0.9437
No log 2.0 42 0.9694 -0.0585 0.9694 0.9846
No log 2.0952 44 0.9837 0.0457 0.9837 0.9918
No log 2.1905 46 1.0426 0.0255 1.0426 1.0211
No log 2.2857 48 1.1590 0.0121 1.1590 1.0766
No log 2.3810 50 1.0406 -0.0105 1.0406 1.0201
No log 2.4762 52 1.0348 -0.0561 1.0348 1.0173
No log 2.5714 54 1.1152 -0.0245 1.1152 1.0561
No log 2.6667 56 0.9932 -0.0522 0.9932 0.9966
No log 2.7619 58 0.8969 0.0087 0.8969 0.9470
No log 2.8571 60 0.9786 -0.0175 0.9786 0.9892
No log 2.9524 62 0.8948 -0.0339 0.8948 0.9459
No log 3.0476 64 1.3300 0.0574 1.3300 1.1533
No log 3.1429 66 1.1920 0.0531 1.1920 1.0918
No log 3.2381 68 0.8996 -0.0563 0.8996 0.9485
No log 3.3333 70 0.9644 -0.0468 0.9644 0.9820
No log 3.4286 72 0.8960 -0.0923 0.8960 0.9466
No log 3.5238 74 1.1231 0.0808 1.1231 1.0597
No log 3.6190 76 1.1401 0.0541 1.1401 1.0677
No log 3.7143 78 0.9094 0.0861 0.9094 0.9536
No log 3.8095 80 1.0502 0.0537 1.0502 1.0248
No log 3.9048 82 0.8942 0.0438 0.8942 0.9456
No log 4.0 84 0.8894 0.0378 0.8894 0.9431
No log 4.0952 86 1.0768 0.0531 1.0768 1.0377
No log 4.1905 88 0.8523 -0.0150 0.8523 0.9232
No log 4.2857 90 0.7291 0.0 0.7291 0.8539
No log 4.3810 92 0.8603 -0.0056 0.8603 0.9275
No log 4.4762 94 0.8240 0.0476 0.8240 0.9077
No log 4.5714 96 0.7713 0.0471 0.7713 0.8782
No log 4.6667 98 0.7674 0.0432 0.7674 0.8760
No log 4.7619 100 0.8507 0.0718 0.8507 0.9223
No log 4.8571 102 0.8547 -0.0066 0.8547 0.9245
No log 4.9524 104 0.7504 0.2209 0.7504 0.8663
No log 5.0476 106 0.7477 0.1815 0.7477 0.8647
No log 5.1429 108 0.7648 0.1453 0.7648 0.8746
No log 5.2381 110 0.7858 0.1387 0.7858 0.8865
No log 5.3333 112 0.8562 0.0628 0.8562 0.9253
No log 5.4286 114 0.8941 0.0628 0.8941 0.9456
No log 5.5238 116 0.8181 0.0545 0.8181 0.9045
No log 5.6190 118 0.8012 0.0709 0.8012 0.8951
No log 5.7143 120 0.7944 0.0583 0.7944 0.8913
No log 5.8095 122 1.0729 -0.0211 1.0729 1.0358
No log 5.9048 124 0.9733 -0.0477 0.9733 0.9866
No log 6.0 126 0.7397 0.0454 0.7397 0.8600
No log 6.0952 128 0.8071 0.0129 0.8071 0.8984
No log 6.1905 130 0.7768 0.0236 0.7768 0.8813
No log 6.2857 132 0.8897 -0.0173 0.8897 0.9432
No log 6.3810 134 0.9245 -0.0116 0.9245 0.9615
No log 6.4762 136 0.8587 0.1095 0.8587 0.9267
No log 6.5714 138 0.9623 -0.0440 0.9623 0.9810
No log 6.6667 140 0.8535 0.0118 0.8535 0.9238
No log 6.7619 142 0.9507 0.0378 0.9507 0.9750
No log 6.8571 144 1.0123 0.0833 1.0123 1.0061
No log 6.9524 146 0.8302 0.0606 0.8302 0.9112
No log 7.0476 148 0.7653 0.1387 0.7653 0.8748
No log 7.1429 150 0.7564 0.1354 0.7564 0.8697
No log 7.2381 152 0.7722 0.0863 0.7722 0.8787
No log 7.3333 154 0.7810 0.0879 0.7810 0.8837
No log 7.4286 156 0.8104 0.0532 0.8104 0.9002
No log 7.5238 158 0.7849 0.0879 0.7849 0.8859
No log 7.6190 160 0.7632 0.0879 0.7632 0.8736
No log 7.7143 162 0.7681 0.1443 0.7681 0.8764
No log 7.8095 164 0.7245 0.1304 0.7245 0.8512
No log 7.9048 166 0.7699 0.1146 0.7699 0.8774
No log 8.0 168 0.7262 0.1599 0.7262 0.8522
No log 8.0952 170 0.7513 0.1490 0.7513 0.8668
No log 8.1905 172 0.7727 0.0622 0.7727 0.8790
No log 8.2857 174 0.6873 0.1423 0.6873 0.8290
No log 8.3810 176 0.7065 0.0768 0.7065 0.8405
No log 8.4762 178 0.8009 0.1231 0.8009 0.8949
No log 8.5714 180 0.7858 0.1800 0.7858 0.8865
No log 8.6667 182 0.7401 0.2707 0.7401 0.8603
No log 8.7619 184 0.8671 0.0365 0.8671 0.9312
No log 8.8571 186 0.9027 0.0680 0.9027 0.9501
No log 8.9524 188 0.8089 0.1379 0.8089 0.8994
No log 9.0476 190 0.8123 0.1001 0.8123 0.9013
No log 9.1429 192 0.7966 0.0956 0.7966 0.8925
No log 9.2381 194 0.7461 0.0414 0.7461 0.8638
No log 9.3333 196 0.7844 0.1434 0.7844 0.8857
No log 9.4286 198 0.7458 0.0454 0.7458 0.8636
No log 9.5238 200 0.7456 0.1440 0.7456 0.8635
No log 9.6190 202 0.8460 0.1147 0.8460 0.9198
No log 9.7143 204 0.8045 0.0913 0.8045 0.8970
No log 9.8095 206 0.7831 0.1674 0.7831 0.8849
No log 9.9048 208 0.8252 0.0875 0.8252 0.9084
No log 10.0 210 0.8107 0.1259 0.8107 0.9004
No log 10.0952 212 0.7691 0.1187 0.7691 0.8770
No log 10.1905 214 0.7234 0.0723 0.7234 0.8505
No log 10.2857 216 0.7028 0.0863 0.7028 0.8383
No log 10.3810 218 0.7092 0.1146 0.7092 0.8421
No log 10.4762 220 0.7298 -0.0145 0.7298 0.8543
No log 10.5714 222 0.8068 0.0106 0.8068 0.8982
No log 10.6667 224 0.8735 -0.0608 0.8735 0.9346
No log 10.7619 226 0.9236 0.1235 0.9236 0.9610
No log 10.8571 228 1.0126 0.0651 1.0126 1.0063
No log 10.9524 230 0.8875 0.0964 0.8875 0.9421
No log 11.0476 232 0.8396 0.0784 0.8396 0.9163
No log 11.1429 234 0.8258 0.0129 0.8258 0.9087
No log 11.2381 236 0.7429 0.0869 0.7429 0.8619
No log 11.3333 238 0.7268 0.1199 0.7268 0.8525
No log 11.4286 240 0.7271 0.1740 0.7271 0.8527
No log 11.5238 242 0.7340 0.1740 0.7340 0.8568
No log 11.6190 244 0.7505 0.1740 0.7505 0.8663
No log 11.7143 246 0.7669 0.1740 0.7669 0.8757
No log 11.8095 248 0.7845 0.1292 0.7845 0.8857
No log 11.9048 250 0.7902 0.1689 0.7902 0.8889
No log 12.0 252 0.7897 0.1660 0.7897 0.8886
No log 12.0952 254 0.7654 0.2195 0.7654 0.8749
No log 12.1905 256 0.7531 0.1740 0.7531 0.8678
No log 12.2857 258 0.7712 0.0562 0.7712 0.8782
No log 12.3810 260 0.7388 0.1146 0.7388 0.8595
No log 12.4762 262 0.6909 0.0863 0.6909 0.8312
No log 12.5714 264 0.6941 0.0914 0.6941 0.8331
No log 12.6667 266 0.6998 0.0914 0.6998 0.8365
No log 12.7619 268 0.7151 0.0863 0.7151 0.8456
No log 12.8571 270 0.7155 0.0863 0.7155 0.8459
No log 12.9524 272 0.7194 0.0863 0.7194 0.8482
No log 13.0476 274 0.7215 0.2078 0.7215 0.8494
No log 13.1429 276 0.7595 0.0956 0.7595 0.8715
No log 13.2381 278 0.7202 0.1675 0.7202 0.8487
No log 13.3333 280 0.7242 0.1675 0.7242 0.8510
No log 13.4286 282 0.7435 0.1612 0.7435 0.8623
No log 13.5238 284 0.7496 0.1612 0.7496 0.8658
No log 13.6190 286 0.7510 0.1612 0.7510 0.8666
No log 13.7143 288 0.7581 0.1096 0.7581 0.8707
No log 13.8095 290 0.7549 0.0783 0.7549 0.8688
No log 13.9048 292 0.7529 0.1144 0.7529 0.8677
No log 14.0 294 0.7699 0.0562 0.7699 0.8774
No log 14.0952 296 0.7779 0.0999 0.7779 0.8820
No log 14.1905 298 0.7696 0.0999 0.7696 0.8773
No log 14.2857 300 0.7235 0.1199 0.7235 0.8506
No log 14.3810 302 0.7754 -0.0329 0.7754 0.8806
No log 14.4762 304 0.7683 0.0518 0.7683 0.8765
No log 14.5714 306 0.7473 0.0723 0.7473 0.8645
No log 14.6667 308 0.8756 0.0719 0.8756 0.9357
No log 14.7619 310 0.8693 -0.0031 0.8693 0.9324
No log 14.8571 312 0.7862 0.1585 0.7862 0.8867
No log 14.9524 314 0.7765 0.2515 0.7765 0.8812
No log 15.0476 316 0.7485 0.1144 0.7485 0.8652
No log 15.1429 318 0.7579 0.0490 0.7579 0.8706
No log 15.2381 320 0.7270 0.0723 0.7270 0.8526
No log 15.3333 322 0.7257 0.0723 0.7257 0.8519
No log 15.4286 324 0.7419 0.0723 0.7419 0.8613
No log 15.5238 326 0.8053 0.0956 0.8053 0.8974
No log 15.6190 328 0.8517 0.0456 0.8517 0.9229
No log 15.7143 330 0.7720 0.0562 0.7720 0.8786
No log 15.8095 332 0.7383 0.1249 0.7383 0.8593
No log 15.9048 334 0.7351 0.0496 0.7351 0.8574
No log 16.0 336 0.7192 0.0395 0.7192 0.8481
No log 16.0952 338 0.7154 0.0680 0.7154 0.8458
No log 16.1905 340 0.7665 0.0909 0.7665 0.8755
No log 16.2857 342 0.7370 0.0600 0.7370 0.8585
No log 16.3810 344 0.7308 0.1787 0.7308 0.8548
No log 16.4762 346 0.7813 0.0981 0.7813 0.8839
No log 16.5714 348 0.7811 0.1327 0.7811 0.8838
No log 16.6667 350 0.7658 0.1146 0.7658 0.8751
No log 16.7619 352 0.7879 0.1001 0.7879 0.8877
No log 16.8571 354 0.7754 0.0600 0.7754 0.8806
No log 16.9524 356 0.7366 0.1675 0.7366 0.8583
No log 17.0476 358 0.7415 0.1734 0.7415 0.8611
No log 17.1429 360 0.7370 0.0828 0.7370 0.8585
No log 17.2381 362 0.7558 0.1612 0.7558 0.8694
No log 17.3333 364 0.7750 0.0562 0.7750 0.8804
No log 17.4286 366 0.7370 0.0814 0.7370 0.8585
No log 17.5238 368 0.7237 0.0814 0.7237 0.8507
No log 17.6190 370 0.7183 0.0814 0.7183 0.8475
No log 17.7143 372 0.7265 0.1254 0.7265 0.8524
No log 17.8095 374 0.7214 0.0863 0.7214 0.8494
No log 17.9048 376 0.7323 0.0394 0.7323 0.8557
No log 18.0 378 0.7414 0.0814 0.7414 0.8610
No log 18.0952 380 0.7668 0.0956 0.7668 0.8757
No log 18.1905 382 0.7437 0.2009 0.7437 0.8624
No log 18.2857 384 0.7242 0.0768 0.7242 0.8510
No log 18.3810 386 0.7116 -0.0062 0.7116 0.8436
No log 18.4762 388 0.7181 -0.0062 0.7181 0.8474
No log 18.5714 390 0.7312 0.0355 0.7312 0.8551
No log 18.6667 392 0.7428 0.1612 0.7428 0.8619
No log 18.7619 394 0.7547 0.1612 0.7547 0.8687
No log 18.8571 396 0.7312 0.1612 0.7312 0.8551
No log 18.9524 398 0.7373 0.0978 0.7373 0.8586
No log 19.0476 400 0.8179 0.0249 0.8179 0.9044
No log 19.1429 402 0.7778 -0.0329 0.7778 0.8819
No log 19.2381 404 0.7106 0.0863 0.7106 0.8429
No log 19.3333 406 0.7921 0.0831 0.7921 0.8900
No log 19.4286 408 0.9214 0.1025 0.9214 0.9599
No log 19.5238 410 0.8508 0.1493 0.8508 0.9224
No log 19.6190 412 0.7703 0.0956 0.7703 0.8777
No log 19.7143 414 0.7398 0.1199 0.7398 0.8601
No log 19.8095 416 0.7424 0.0863 0.7424 0.8616
No log 19.9048 418 0.7376 0.1199 0.7376 0.8588
No log 20.0 420 0.7517 0.1096 0.7517 0.8670
No log 20.0952 422 0.8124 0.0831 0.8124 0.9013
No log 20.1905 424 0.8397 0.0831 0.8397 0.9163
No log 20.2857 426 0.7696 0.1879 0.7696 0.8772
No log 20.3810 428 0.7416 0.0282 0.7416 0.8611
No log 20.4762 430 0.7375 0.0814 0.7375 0.8588
No log 20.5714 432 0.7270 0.0282 0.7270 0.8526
No log 20.6667 434 0.7478 0.1146 0.7478 0.8647
No log 20.7619 436 0.7348 0.0282 0.7348 0.8572
No log 20.8571 438 0.7330 0.0814 0.7330 0.8562
No log 20.9524 440 0.7323 0.0814 0.7323 0.8557
No log 21.0476 442 0.7321 0.1146 0.7321 0.8557
No log 21.1429 444 0.7429 0.1553 0.7429 0.8619
No log 21.2381 446 0.7192 0.1675 0.7192 0.8480
No log 21.3333 448 0.7464 0.0922 0.7464 0.8640
No log 21.4286 450 0.8632 -0.0066 0.8632 0.9291
No log 21.5238 452 0.8788 -0.0016 0.8788 0.9374
No log 21.6190 454 0.7704 0.0081 0.7704 0.8778
No log 21.7143 456 0.7459 0.1612 0.7459 0.8637
No log 21.8095 458 0.8610 0.0867 0.8610 0.9279
No log 21.9048 460 0.8617 0.0442 0.8617 0.9283
No log 22.0 462 0.7821 0.1495 0.7821 0.8844
No log 22.0952 464 0.7249 0.1675 0.7249 0.8514
No log 22.1905 466 0.7014 0.1254 0.7014 0.8375
No log 22.2857 468 0.6924 0.1612 0.6924 0.8321
No log 22.3810 470 0.7032 0.1612 0.7032 0.8386
No log 22.4762 472 0.6977 0.1612 0.6977 0.8353
No log 22.5714 474 0.6973 0.1254 0.6973 0.8350
No log 22.6667 476 0.7095 0.0814 0.7095 0.8423
No log 22.7619 478 0.7108 0.0814 0.7108 0.8431
No log 22.8571 480 0.7109 0.0395 0.7109 0.8431
No log 22.9524 482 0.7034 0.1675 0.7034 0.8387
No log 23.0476 484 0.7515 0.2009 0.7515 0.8669
No log 23.1429 486 0.7882 0.1449 0.7882 0.8878
No log 23.2381 488 0.7517 0.1965 0.7517 0.8670
No log 23.3333 490 0.6922 0.2009 0.6922 0.8320
No log 23.4286 492 0.6737 0.0355 0.6737 0.8208
No log 23.5238 494 0.6902 0.0355 0.6902 0.8308
No log 23.6190 496 0.6877 0.0355 0.6877 0.8293
No log 23.7143 498 0.6801 0.2584 0.6801 0.8247
0.2382 23.8095 500 0.7283 0.2502 0.7283 0.8534
0.2382 23.9048 502 0.7538 0.0909 0.7538 0.8682
0.2382 24.0 504 0.7319 0.2502 0.7319 0.8555
0.2382 24.0952 506 0.7197 0.1659 0.7197 0.8483
0.2382 24.1905 508 0.7399 0.0889 0.7399 0.8602
0.2382 24.2857 510 0.7478 0.1327 0.7478 0.8648

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32, Safetensors)
Model tree

Full repo id: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task3_organization, fine-tuned from aubmindlab/bert-base-arabertv02.