ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k13_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the fine-tuning dataset is not specified). It achieves the following results on the evaluation set:

  • Loss: 0.8945
  • Qwk (quadratic weighted kappa): 0.6706
  • Mse (mean squared error): 0.8945
  • Rmse (root mean squared error): 0.9458

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
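The linear scheduler above decays the learning rate from its initial value to zero over the course of training. The sketch below mirrors that schedule in plain Python; the warmup handling and the step count are assumptions (the results table implies roughly 97 optimizer steps per epoch, and no warmup steps are listed), not values read from a training config.

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linearly decaying learning-rate schedule with optional linear warmup.

    Mirrors the shape of the Trainer's 'linear' scheduler; warmup_steps=0
    is an assumption, since no warmup is listed in the hyperparameters.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# ~97 steps/epoch (step 194 at epoch 2.0 in the table) over 100 epochs:
total = 97 * 100
start_lr = linear_lr(0, total)        # 2e-5 at the first step
end_lr = linear_lr(total, total)      # 0.0 at the final step
```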

Training results

Evaluation ran every 2 steps. The training loss was only logged every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0206 2 6.8852 0.0179 6.8852 2.6240
No log 0.0412 4 4.7207 0.0843 4.7207 2.1727
No log 0.0619 6 2.8590 0.0848 2.8590 1.6909
No log 0.0825 8 2.6033 0.0267 2.6033 1.6135
No log 0.1031 10 2.4716 0.0972 2.4716 1.5721
No log 0.1237 12 1.7569 0.2037 1.7569 1.3255
No log 0.1443 14 1.7980 0.1165 1.7980 1.3409
No log 0.1649 16 1.7958 0.1538 1.7958 1.3401
No log 0.1856 18 1.7352 0.1538 1.7352 1.3173
No log 0.2062 20 1.6263 0.0784 1.6263 1.2753
No log 0.2268 22 1.4605 0.1165 1.4605 1.2085
No log 0.2474 24 1.4655 0.2727 1.4655 1.2106
No log 0.2680 26 1.5568 0.3590 1.5568 1.2477
No log 0.2887 28 1.3756 0.3009 1.3756 1.1729
No log 0.3093 30 1.2007 0.3826 1.2007 1.0958
No log 0.3299 32 1.1530 0.55 1.1530 1.0738
No log 0.3505 34 1.3016 0.4878 1.3016 1.1409
No log 0.3711 36 1.4496 0.4194 1.4496 1.2040
No log 0.3918 38 1.3023 0.4878 1.3023 1.1412
No log 0.4124 40 1.0180 0.5645 1.0180 1.0089
No log 0.4330 42 0.8132 0.6562 0.8132 0.9018
No log 0.4536 44 0.7507 0.7101 0.7507 0.8664
No log 0.4742 46 0.7285 0.7429 0.7285 0.8535
No log 0.4948 48 0.7430 0.7042 0.7430 0.8620
No log 0.5155 50 0.7011 0.7310 0.7011 0.8373
No log 0.5361 52 0.6999 0.7310 0.6999 0.8366
No log 0.5567 54 0.7873 0.7164 0.7873 0.8873
No log 0.5773 56 0.8391 0.6567 0.8391 0.9160
No log 0.5979 58 0.7256 0.7153 0.7256 0.8518
No log 0.6186 60 0.6882 0.7153 0.6882 0.8296
No log 0.6392 62 0.6822 0.7391 0.6822 0.8259
No log 0.6598 64 0.7835 0.6849 0.7835 0.8852
No log 0.6804 66 1.3663 0.4857 1.3663 1.1689
No log 0.7010 68 1.9435 0.2979 1.9435 1.3941
No log 0.7216 70 1.8723 0.2877 1.8723 1.3683
No log 0.7423 72 1.2213 0.6145 1.2213 1.1051
No log 0.7629 74 0.7604 0.7117 0.7604 0.8720
No log 0.7835 76 0.5766 0.7778 0.5766 0.7593
No log 0.8041 78 0.6507 0.7468 0.6507 0.8066
No log 0.8247 80 0.8644 0.6184 0.8644 0.9297
No log 0.8454 82 0.7602 0.6623 0.7602 0.8719
No log 0.8660 84 0.6642 0.7639 0.6642 0.8150
No log 0.8866 86 0.7074 0.6761 0.7074 0.8410
No log 0.9072 88 0.8075 0.6861 0.8075 0.8986
No log 0.9278 90 1.0323 0.5758 1.0323 1.0160
No log 0.9485 92 1.1303 0.5986 1.1303 1.0632
No log 0.9691 94 0.8850 0.6154 0.8850 0.9407
No log 0.9897 96 0.7488 0.6957 0.7488 0.8654
No log 1.0103 98 0.9771 0.6301 0.9771 0.9885
No log 1.0309 100 0.8974 0.6713 0.8974 0.9473
No log 1.0515 102 0.7046 0.7083 0.7046 0.8394
No log 1.0722 104 0.9736 0.7027 0.9736 0.9867
No log 1.0928 106 1.3832 0.5517 1.3832 1.1761
No log 1.1134 108 1.4779 0.5341 1.4779 1.2157
No log 1.1340 110 1.3241 0.6467 1.3241 1.1507
No log 1.1546 112 0.9866 0.6497 0.9866 0.9933
No log 1.1753 114 0.7099 0.6933 0.7099 0.8425
No log 1.1959 116 0.6300 0.7712 0.6300 0.7937
No log 1.2165 118 0.6711 0.7320 0.6711 0.8192
No log 1.2371 120 0.7398 0.72 0.7398 0.8601
No log 1.2577 122 0.7519 0.6980 0.7519 0.8671
No log 1.2784 124 0.8061 0.6755 0.8061 0.8978
No log 1.2990 126 0.7592 0.7407 0.7592 0.8713
No log 1.3196 128 0.6085 0.8208 0.6085 0.7800
No log 1.3402 130 0.6107 0.8023 0.6107 0.7815
No log 1.3608 132 0.7120 0.7326 0.7120 0.8438
No log 1.3814 134 0.6430 0.7950 0.6430 0.8018
No log 1.4021 136 0.6105 0.8052 0.6105 0.7814
No log 1.4227 138 0.6097 0.7862 0.6097 0.7808
No log 1.4433 140 0.6385 0.7550 0.6385 0.7991
No log 1.4639 142 0.6268 0.7552 0.6268 0.7917
No log 1.4845 144 0.8024 0.6712 0.8024 0.8958
No log 1.5052 146 1.2254 0.6104 1.2254 1.1070
No log 1.5258 148 1.2852 0.5734 1.2852 1.1337
No log 1.5464 150 1.1015 0.5775 1.1015 1.0495
No log 1.5670 152 0.8269 0.7075 0.8269 0.9093
No log 1.5876 154 0.7039 0.7105 0.7039 0.8390
No log 1.6082 156 0.6848 0.7516 0.6848 0.8275
No log 1.6289 158 0.8455 0.6585 0.8455 0.9195
No log 1.6495 160 0.8162 0.6747 0.8162 0.9034
No log 1.6701 162 0.6335 0.7613 0.6335 0.7959
No log 1.6907 164 0.6937 0.7413 0.6937 0.8329
No log 1.7113 166 0.7213 0.7183 0.7213 0.8493
No log 1.7320 168 0.6803 0.7376 0.6803 0.8248
No log 1.7526 170 0.9466 0.6358 0.9466 0.9729
No log 1.7732 172 1.0263 0.6503 1.0263 1.0131
No log 1.7938 174 0.8404 0.6667 0.8404 0.9168
No log 1.8144 176 0.6659 0.75 0.6659 0.8160
No log 1.8351 178 0.6837 0.75 0.6837 0.8269
No log 1.8557 180 0.8542 0.6875 0.8542 0.9242
No log 1.8763 182 0.8930 0.6988 0.8930 0.9450
No log 1.8969 184 0.7906 0.7273 0.7906 0.8892
No log 1.9175 186 0.7726 0.7381 0.7726 0.8790
No log 1.9381 188 0.7684 0.7273 0.7684 0.8766
No log 1.9588 190 0.6303 0.7683 0.6303 0.7939
No log 1.9794 192 0.5759 0.8171 0.5759 0.7589
No log 2.0 194 0.6532 0.7403 0.6532 0.8082
No log 2.0206 196 0.7391 0.7226 0.7391 0.8597
No log 2.0412 198 0.8955 0.6800 0.8955 0.9463
No log 2.0619 200 0.9762 0.6486 0.9762 0.9880
No log 2.0825 202 0.9520 0.6222 0.9520 0.9757
No log 2.1031 204 0.8528 0.6567 0.8528 0.9235
No log 2.1237 206 0.6985 0.7248 0.6985 0.8358
No log 2.1443 208 0.6294 0.7632 0.6294 0.7934
No log 2.1649 210 0.6529 0.7467 0.6529 0.8080
No log 2.1856 212 0.6634 0.7383 0.6634 0.8145
No log 2.2062 214 0.6280 0.7973 0.6280 0.7925
No log 2.2268 216 0.6711 0.76 0.6711 0.8192
No log 2.2474 218 0.7484 0.6968 0.7484 0.8651
No log 2.2680 220 0.8352 0.6711 0.8352 0.9139
No log 2.2887 222 0.8674 0.6711 0.8674 0.9313
No log 2.3093 224 0.8873 0.6490 0.8873 0.9419
No log 2.3299 226 0.8336 0.6331 0.8336 0.9130
No log 2.3505 228 0.7620 0.6861 0.7620 0.8729
No log 2.3711 230 0.7469 0.6944 0.7469 0.8642
No log 2.3918 232 0.7689 0.7013 0.7689 0.8769
No log 2.4124 234 0.8896 0.7013 0.8896 0.9432
No log 2.4330 236 0.8770 0.6575 0.8770 0.9365
No log 2.4536 238 0.8390 0.6383 0.8390 0.9160
No log 2.4742 240 0.7627 0.6667 0.7627 0.8734
No log 2.4948 242 0.7912 0.6803 0.7912 0.8895
No log 2.5155 244 0.8463 0.6525 0.8463 0.9200
No log 2.5361 246 0.8663 0.6483 0.8663 0.9308
No log 2.5567 248 0.9128 0.6301 0.9128 0.9554
No log 2.5773 250 0.8945 0.6753 0.8945 0.9458
No log 2.5979 252 0.8971 0.6533 0.8971 0.9471
No log 2.6186 254 0.9648 0.6225 0.9648 0.9823
No log 2.6392 256 1.0201 0.6069 1.0201 1.0100
No log 2.6598 258 0.9790 0.6222 0.9790 0.9894
No log 2.6804 260 0.9701 0.6519 0.9701 0.9849
No log 2.7010 262 0.8897 0.6522 0.8897 0.9432
No log 2.7216 264 0.7531 0.7101 0.7531 0.8678
No log 2.7423 266 0.8084 0.6812 0.8084 0.8991
No log 2.7629 268 0.9772 0.6423 0.9772 0.9886
No log 2.7835 270 1.0764 0.5899 1.0764 1.0375
No log 2.8041 272 1.1134 0.6027 1.1134 1.0552
No log 2.8247 274 0.9527 0.7081 0.9527 0.9760
No log 2.8454 276 0.7514 0.7355 0.7514 0.8669
No log 2.8660 278 0.7757 0.6944 0.7757 0.8807
No log 2.8866 280 0.8222 0.6809 0.8222 0.9068
No log 2.9072 282 0.7806 0.6993 0.7806 0.8835
No log 2.9278 284 0.7026 0.6853 0.7026 0.8382
No log 2.9485 286 0.7019 0.6714 0.7019 0.8378
No log 2.9691 288 0.7778 0.6429 0.7778 0.8819
No log 2.9897 290 0.9127 0.5985 0.9127 0.9554
No log 3.0103 292 0.9291 0.5821 0.9291 0.9639
No log 3.0309 294 0.9849 0.5970 0.9849 0.9924
No log 3.0515 296 0.9463 0.6165 0.9463 0.9728
No log 3.0722 298 0.9212 0.6131 0.9212 0.9598
No log 3.0928 300 0.9442 0.6623 0.9442 0.9717
No log 3.1134 302 0.9953 0.6429 0.9953 0.9977
No log 3.1340 304 0.9266 0.6667 0.9266 0.9626
No log 3.1546 306 0.7899 0.7089 0.7899 0.8888
No log 3.1753 308 0.7222 0.6667 0.7222 0.8498
No log 3.1959 310 0.6903 0.7194 0.6903 0.8309
No log 3.2165 312 0.7284 0.6866 0.7284 0.8535
No log 3.2371 314 0.7758 0.6618 0.7758 0.8808
No log 3.2577 316 0.8046 0.6533 0.8046 0.8970
No log 3.2784 318 0.7627 0.7170 0.7627 0.8733
No log 3.2990 320 0.7536 0.7314 0.7536 0.8681
No log 3.3196 322 0.6812 0.7667 0.6812 0.8253
No log 3.3402 324 0.8465 0.7225 0.8465 0.9201
No log 3.3608 326 1.1116 0.6593 1.1116 1.0543
No log 3.3814 328 1.0628 0.6310 1.0628 1.0309
No log 3.4021 330 0.8442 0.6619 0.8442 0.9188
No log 3.4227 332 0.7641 0.6412 0.7641 0.8741
No log 3.4433 334 0.7356 0.6912 0.7356 0.8577
No log 3.4639 336 0.6750 0.7517 0.6750 0.8216
No log 3.4845 338 0.6612 0.7453 0.6612 0.8131
No log 3.5052 340 0.7624 0.7186 0.7624 0.8732
No log 3.5258 342 0.8533 0.6982 0.8533 0.9237
No log 3.5464 344 0.9326 0.6503 0.9326 0.9657
No log 3.5670 346 0.9019 0.6883 0.9019 0.9497
No log 3.5876 348 0.8855 0.6571 0.8854 0.9410
No log 3.6082 350 0.9375 0.6232 0.9375 0.9683
No log 3.6289 352 0.9751 0.6176 0.9751 0.9875
No log 3.6495 354 0.9879 0.6176 0.9879 0.9939
No log 3.6701 356 0.9015 0.6131 0.9015 0.9495
No log 3.6907 358 0.8758 0.6711 0.8758 0.9358
No log 3.7113 360 0.9645 0.6792 0.9645 0.9821
No log 3.7320 362 1.2166 0.6272 1.2166 1.1030
No log 3.7526 364 1.2393 0.5890 1.2393 1.1132
No log 3.7732 366 0.9880 0.6027 0.9880 0.9940
No log 3.7938 368 0.8394 0.6528 0.8394 0.9162
No log 3.8144 370 0.7993 0.6531 0.7993 0.8940
No log 3.8351 372 0.7156 0.7075 0.7156 0.8459
No log 3.8557 374 0.7407 0.7105 0.7407 0.8606
No log 3.8763 376 0.7133 0.6980 0.7133 0.8446
No log 3.8969 378 0.6705 0.6980 0.6705 0.8188
No log 3.9175 380 0.6089 0.7383 0.6089 0.7803
No log 3.9381 382 0.6259 0.7162 0.6259 0.7911
No log 3.9588 384 0.6930 0.7273 0.6930 0.8325
No log 3.9794 386 0.7536 0.7134 0.7536 0.8681
No log 4.0 388 0.8130 0.6923 0.8130 0.9017
No log 4.0206 390 0.8605 0.6711 0.8605 0.9276
No log 4.0412 392 0.8374 0.6571 0.8374 0.9151
No log 4.0619 394 0.8302 0.6569 0.8302 0.9112
No log 4.0825 396 0.7476 0.6569 0.7476 0.8646
No log 4.1031 398 0.7033 0.6569 0.7033 0.8386
No log 4.1237 400 0.7548 0.6853 0.7548 0.8688
No log 4.1443 402 0.7527 0.6923 0.7527 0.8676
No log 4.1649 404 0.6829 0.7273 0.6829 0.8264
No log 4.1856 406 0.7406 0.7296 0.7406 0.8606
No log 4.2062 408 0.7712 0.7205 0.7712 0.8782
No log 4.2268 410 0.7381 0.7059 0.7381 0.8591
No log 4.2474 412 0.6907 0.7333 0.6907 0.8311
No log 4.2680 414 0.8285 0.6905 0.8285 0.9102
No log 4.2887 416 0.8375 0.6971 0.8375 0.9151
No log 4.3093 418 0.6151 0.7799 0.6151 0.7843
No log 4.3299 420 0.5975 0.7922 0.5975 0.7730
No log 4.3505 422 0.6705 0.7662 0.6705 0.8188
No log 4.3711 424 0.7628 0.6579 0.7628 0.8734
No log 4.3918 426 0.7437 0.6797 0.7437 0.8624
No log 4.4124 428 0.7496 0.6759 0.7496 0.8658
No log 4.4330 430 0.6840 0.7114 0.6840 0.8271
No log 4.4536 432 0.6372 0.7632 0.6372 0.7982
No log 4.4742 434 0.6662 0.72 0.6662 0.8162
No log 4.4948 436 0.8441 0.6579 0.8441 0.9188
No log 4.5155 438 0.9620 0.6197 0.9620 0.9808
No log 4.5361 440 0.9912 0.5909 0.9912 0.9956
No log 4.5567 442 0.9416 0.5909 0.9416 0.9704
No log 4.5773 444 0.8037 0.6618 0.8037 0.8965
No log 4.5979 446 0.7063 0.6618 0.7063 0.8404
No log 4.6186 448 0.6887 0.6712 0.6887 0.8299
No log 4.6392 450 0.6988 0.6933 0.6988 0.8359
No log 4.6598 452 0.6636 0.6712 0.6636 0.8146
No log 4.6804 454 0.6145 0.7758 0.6145 0.7839
No log 4.7010 456 0.6510 0.7170 0.6510 0.8069
No log 4.7216 458 0.7232 0.6879 0.7232 0.8504
No log 4.7423 460 0.7560 0.6879 0.7560 0.8695
No log 4.7629 462 0.7423 0.6753 0.7423 0.8616
No log 4.7835 464 0.7298 0.6806 0.7298 0.8543
No log 4.8041 466 0.6842 0.6906 0.6842 0.8272
No log 4.8247 468 0.6373 0.7299 0.6373 0.7983
No log 4.8454 470 0.6408 0.7286 0.6408 0.8005
No log 4.8660 472 0.7056 0.6812 0.7056 0.8400
No log 4.8866 474 0.7499 0.6757 0.7499 0.8660
No log 4.9072 476 0.7400 0.6757 0.7400 0.8602
No log 4.9278 478 0.7138 0.6883 0.7138 0.8449
No log 4.9485 480 0.6415 0.7453 0.6415 0.8009
No log 4.9691 482 0.5949 0.7826 0.5949 0.7713
No log 4.9897 484 0.5684 0.7722 0.5684 0.7539
No log 5.0103 486 0.5678 0.7692 0.5678 0.7536
No log 5.0309 488 0.6132 0.7143 0.6132 0.7831
No log 5.0515 490 0.6915 0.7027 0.6915 0.8316
No log 5.0722 492 0.6982 0.6761 0.6982 0.8356
No log 5.0928 494 0.6593 0.6993 0.6593 0.8120
No log 5.1134 496 0.6140 0.7246 0.6140 0.7836
No log 5.1340 498 0.6210 0.7297 0.6210 0.7880
0.4264 5.1546 500 0.7273 0.6994 0.7273 0.8528
0.4264 5.1753 502 0.7696 0.6901 0.7696 0.8772
0.4264 5.1959 504 0.6663 0.7296 0.6663 0.8163
0.4264 5.2165 506 0.6387 0.7211 0.6387 0.7992
0.4264 5.2371 508 0.6239 0.7413 0.6239 0.7899
0.4264 5.2577 510 0.6302 0.7183 0.6302 0.7939
0.4264 5.2784 512 0.6702 0.7162 0.6702 0.8187
0.4264 5.2990 514 0.7281 0.7248 0.7281 0.8533
0.4264 5.3196 516 0.6972 0.7133 0.6972 0.8350
0.4264 5.3402 518 0.6613 0.7429 0.6613 0.8132
0.4264 5.3608 520 0.6643 0.7429 0.6643 0.8150
0.4264 5.3814 522 0.6909 0.7092 0.6909 0.8312
0.4264 5.4021 524 0.8238 0.6883 0.8238 0.9077
0.4264 5.4227 526 0.8416 0.6883 0.8416 0.9174
0.4264 5.4433 528 0.7900 0.6986 0.7900 0.8888
0.4264 5.4639 530 0.6934 0.6906 0.6934 0.8327
0.4264 5.4845 532 0.6659 0.7273 0.6659 0.8160
0.4264 5.5052 534 0.6975 0.6993 0.6975 0.8352
0.4264 5.5258 536 0.7249 0.7123 0.7249 0.8514
0.4264 5.5464 538 0.6890 0.7027 0.6890 0.8300
0.4264 5.5670 540 0.6970 0.7027 0.6970 0.8349
0.4264 5.5876 542 0.6504 0.7451 0.6504 0.8065
0.4264 5.6082 544 0.5643 0.7974 0.5643 0.7512
0.4264 5.6289 546 0.5885 0.7815 0.5885 0.7672
0.4264 5.6495 548 0.6662 0.7248 0.6662 0.8162
0.4264 5.6701 550 0.6758 0.7172 0.6758 0.8221
0.4264 5.6907 552 0.6429 0.7613 0.6429 0.8018
0.4264 5.7113 554 0.6335 0.7702 0.6335 0.7959
0.4264 5.7320 556 0.6178 0.7805 0.6178 0.7860
0.4264 5.7526 558 0.6327 0.7389 0.6327 0.7954
0.4264 5.7732 560 0.7323 0.7308 0.7323 0.8557
0.4264 5.7938 562 0.7648 0.7067 0.7648 0.8745
0.4264 5.8144 564 0.7091 0.6667 0.7091 0.8421
0.4264 5.8351 566 0.7294 0.6944 0.7294 0.8540
0.4264 5.8557 568 0.7605 0.7067 0.7605 0.8721
0.4264 5.8763 570 0.6679 0.7067 0.6679 0.8173
0.4264 5.8969 572 0.5382 0.7922 0.5382 0.7336
0.4264 5.9175 574 0.4781 0.85 0.4781 0.6915
0.4264 5.9381 576 0.4713 0.8606 0.4713 0.6865
0.4264 5.9588 578 0.5287 0.8047 0.5287 0.7271
0.4264 5.9794 580 0.6317 0.7683 0.6317 0.7948
0.4264 6.0 582 0.6459 0.7403 0.6459 0.8037
0.4264 6.0206 584 0.6400 0.7286 0.6400 0.8000
0.4264 6.0412 586 0.6828 0.6957 0.6828 0.8263
0.4264 6.0619 588 0.6774 0.6861 0.6774 0.8230
0.4264 6.0825 590 0.7262 0.6667 0.7262 0.8522
0.4264 6.1031 592 0.8368 0.6667 0.8368 0.9148
0.4264 6.1237 594 0.8945 0.6706 0.8945 0.9458

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1