ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.5897
  • QWK (quadratic weighted kappa): 0.5977
  • MSE: 0.5897
  • RMSE: 0.7679
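
The card itself does not say how these metrics were computed. Below is a minimal sketch of how QWK, MSE, and RMSE are typically obtained with scikit-learn for essay-scoring models; the label and prediction arrays are illustrative placeholders, not values from this evaluation.

```python
# Illustrative sketch (not from this card): computing QWK, MSE, and RMSE.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])            # hypothetical gold organization scores
y_pred = np.array([2.7, 2.1, 3.6, 1.4, 3.2])  # hypothetical raw regression outputs

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

# Quadratic weighted kappa needs discrete labels, so the continuous
# predictions are rounded to the nearest score first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```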

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
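
A minimal sketch of how these hyperparameters map onto Hugging Face TrainingArguments; the output directory, the single-label regression head, and the Trainer wiring are assumptions and not taken from this card.

```python
# Sketch only: mapping the listed hyperparameters onto TrainingArguments.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model,
    num_labels=1,  # single regression output, inferred from the MSE/RMSE metrics
)

args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets not specified in the card
# trainer.train()
```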

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0235 2 4.0783 0.0118 4.0783 2.0195
No log 0.0471 4 2.0976 0.0631 2.0976 1.4483
No log 0.0706 6 1.2009 0.1142 1.2009 1.0958
No log 0.0941 8 1.0805 0.2243 1.0805 1.0395
No log 0.1176 10 1.0666 0.2841 1.0666 1.0327
No log 0.1412 12 1.0230 0.2094 1.0230 1.0115
No log 0.1647 14 1.0983 0.2163 1.0983 1.0480
No log 0.1882 16 1.1487 0.2686 1.1487 1.0718
No log 0.2118 18 1.1547 0.3043 1.1547 1.0745
No log 0.2353 20 0.7348 0.4571 0.7348 0.8572
No log 0.2588 22 0.9233 0.3597 0.9233 0.9609
No log 0.2824 24 0.8873 0.4045 0.8873 0.9420
No log 0.3059 26 0.7381 0.5746 0.7381 0.8591
No log 0.3294 28 0.7052 0.5905 0.7052 0.8398
No log 0.3529 30 0.8542 0.5248 0.8542 0.9242
No log 0.3765 32 0.7796 0.5023 0.7796 0.8829
No log 0.4 34 0.6689 0.5596 0.6689 0.8179
No log 0.4235 36 0.6720 0.5819 0.6720 0.8198
No log 0.4471 38 0.6733 0.5988 0.6733 0.8205
No log 0.4706 40 0.6680 0.5894 0.6680 0.8173
No log 0.4941 42 0.6231 0.6894 0.6231 0.7893
No log 0.5176 44 0.6115 0.6345 0.6115 0.7820
No log 0.5412 46 0.6070 0.6229 0.6070 0.7791
No log 0.5647 48 0.6219 0.6526 0.6219 0.7886
No log 0.5882 50 0.6488 0.6426 0.6488 0.8055
No log 0.6118 52 0.7671 0.6032 0.7671 0.8758
No log 0.6353 54 1.0700 0.4538 1.0700 1.0344
No log 0.6588 56 0.9660 0.5098 0.9660 0.9828
No log 0.6824 58 0.7153 0.6367 0.7153 0.8457
No log 0.7059 60 0.6784 0.6726 0.6784 0.8237
No log 0.7294 62 0.6960 0.6726 0.6960 0.8342
No log 0.7529 64 0.7131 0.6259 0.7131 0.8445
No log 0.7765 66 0.8302 0.6864 0.8302 0.9111
No log 0.8 68 0.8339 0.6864 0.8339 0.9132
No log 0.8235 70 0.6797 0.6509 0.6797 0.8245
No log 0.8471 72 0.7000 0.6243 0.7000 0.8367
No log 0.8706 74 0.6697 0.6560 0.6697 0.8183
No log 0.8941 76 0.7502 0.6439 0.7502 0.8661
No log 0.9176 78 0.7470 0.6148 0.7470 0.8643
No log 0.9412 80 0.6680 0.6254 0.6680 0.8173
No log 0.9647 82 0.7217 0.5798 0.7217 0.8495
No log 0.9882 84 0.7088 0.6132 0.7088 0.8419
No log 1.0118 86 0.6453 0.6112 0.6453 0.8033
No log 1.0353 88 0.7832 0.5978 0.7832 0.8850
No log 1.0588 90 0.6977 0.6817 0.6977 0.8353
No log 1.0824 92 0.6563 0.5968 0.6563 0.8101
No log 1.1059 94 0.6611 0.5766 0.6611 0.8131
No log 1.1294 96 0.6897 0.6599 0.6897 0.8305
No log 1.1529 98 0.7553 0.6524 0.7553 0.8691
No log 1.1765 100 0.6836 0.6442 0.6836 0.8268
No log 1.2 102 0.6691 0.6525 0.6691 0.8180
No log 1.2235 104 0.9058 0.5343 0.9058 0.9518
No log 1.2471 106 1.0071 0.5184 1.0071 1.0035
No log 1.2706 108 0.8754 0.6311 0.8754 0.9356
No log 1.2941 110 0.7790 0.6768 0.7790 0.8826
No log 1.3176 112 0.6866 0.6714 0.6866 0.8286
No log 1.3412 114 0.7803 0.5304 0.7803 0.8833
No log 1.3647 116 0.7759 0.5338 0.7759 0.8808
No log 1.3882 118 0.6794 0.6226 0.6794 0.8243
No log 1.4118 120 0.6929 0.6290 0.6929 0.8324
No log 1.4353 122 0.7500 0.5625 0.7500 0.8660
No log 1.4588 124 0.7202 0.6082 0.7202 0.8487
No log 1.4824 126 0.6983 0.6826 0.6983 0.8357
No log 1.5059 128 0.7583 0.6886 0.7583 0.8708
No log 1.5294 130 0.7717 0.6706 0.7717 0.8785
No log 1.5529 132 0.7484 0.6873 0.7484 0.8651
No log 1.5765 134 0.7025 0.6541 0.7025 0.8381
No log 1.6 136 0.7129 0.5893 0.7129 0.8443
No log 1.6235 138 0.7620 0.5989 0.7620 0.8729
No log 1.6471 140 0.6886 0.6555 0.6886 0.8298
No log 1.6706 142 0.7634 0.4376 0.7634 0.8737
No log 1.6941 144 0.8173 0.4751 0.8173 0.9040
No log 1.7176 146 0.7185 0.5224 0.7185 0.8476
No log 1.7412 148 0.7333 0.5547 0.7333 0.8563
No log 1.7647 150 0.7730 0.6164 0.7730 0.8792
No log 1.7882 152 0.7888 0.6195 0.7888 0.8881
No log 1.8118 154 0.7814 0.6138 0.7814 0.8840
No log 1.8353 156 0.7641 0.6605 0.7641 0.8741
No log 1.8588 158 0.7207 0.6404 0.7207 0.8489
No log 1.8824 160 0.6726 0.5972 0.6726 0.8201
No log 1.9059 162 0.6465 0.6095 0.6465 0.8041
No log 1.9294 164 0.6402 0.5806 0.6402 0.8001
No log 1.9529 166 0.6322 0.5843 0.6322 0.7951
No log 1.9765 168 0.6391 0.5945 0.6391 0.7994
No log 2.0 170 0.6588 0.6787 0.6588 0.8117
No log 2.0235 172 0.6719 0.6911 0.6719 0.8197
No log 2.0471 174 0.6553 0.6639 0.6553 0.8095
No log 2.0706 176 0.6809 0.6833 0.6809 0.8252
No log 2.0941 178 0.7747 0.6089 0.7747 0.8802
No log 2.1176 180 0.9436 0.5254 0.9436 0.9714
No log 2.1412 182 0.9463 0.5165 0.9463 0.9728
No log 2.1647 184 0.7444 0.6301 0.7444 0.8628
No log 2.1882 186 0.6692 0.6350 0.6692 0.8181
No log 2.2118 188 0.6416 0.6011 0.6416 0.8010
No log 2.2353 190 0.6402 0.6316 0.6402 0.8001
No log 2.2588 192 0.6307 0.6316 0.6307 0.7942
No log 2.2824 194 0.6151 0.7218 0.6151 0.7843
No log 2.3059 196 0.6187 0.6347 0.6187 0.7866
No log 2.3294 198 0.6794 0.5902 0.6794 0.8243
No log 2.3529 200 0.6092 0.6259 0.6092 0.7805
No log 2.3765 202 0.5742 0.6966 0.5742 0.7577
No log 2.4 204 0.6501 0.5837 0.6501 0.8063
No log 2.4235 206 0.7032 0.6262 0.7032 0.8385
No log 2.4471 208 0.6671 0.5912 0.6671 0.8168
No log 2.4706 210 0.6491 0.6882 0.6491 0.8057
No log 2.4941 212 0.6537 0.6672 0.6537 0.8085
No log 2.5176 214 0.6253 0.6857 0.6253 0.7908
No log 2.5412 216 0.6496 0.5994 0.6496 0.8060
No log 2.5647 218 0.6258 0.6418 0.6258 0.7911
No log 2.5882 220 0.6262 0.6214 0.6262 0.7913
No log 2.6118 222 0.6272 0.6440 0.6272 0.7920
No log 2.6353 224 0.6375 0.6344 0.6375 0.7985
No log 2.6588 226 0.6960 0.6244 0.6960 0.8343
No log 2.6824 228 0.6791 0.6344 0.6791 0.8241
No log 2.7059 230 0.6707 0.6234 0.6707 0.8190
No log 2.7294 232 0.7085 0.6019 0.7085 0.8417
No log 2.7529 234 0.6887 0.5994 0.6887 0.8299
No log 2.7765 236 0.6633 0.6695 0.6633 0.8144
No log 2.8 238 0.6549 0.6902 0.6549 0.8092
No log 2.8235 240 0.6399 0.6511 0.6399 0.7999
No log 2.8471 242 0.6447 0.6188 0.6447 0.8029
No log 2.8706 244 0.6402 0.6052 0.6402 0.8002
No log 2.8941 246 0.7524 0.4949 0.7524 0.8674
No log 2.9176 248 0.8432 0.5658 0.8432 0.9183
No log 2.9412 250 0.7069 0.6184 0.7069 0.8408
No log 2.9647 252 0.6671 0.6672 0.6671 0.8168
No log 2.9882 254 0.8183 0.5461 0.8183 0.9046
No log 3.0118 256 0.8890 0.5384 0.8890 0.9429
No log 3.0353 258 0.7434 0.5660 0.7434 0.8622
No log 3.0588 260 0.6695 0.5726 0.6695 0.8182
No log 3.0824 262 0.6724 0.6070 0.6724 0.8200
No log 3.1059 264 0.6189 0.5934 0.6189 0.7867
No log 3.1294 266 0.6524 0.5898 0.6524 0.8077
No log 3.1529 268 0.6930 0.5266 0.6930 0.8325
No log 3.1765 270 0.6352 0.5981 0.6352 0.7970
No log 3.2 272 0.5783 0.6021 0.5783 0.7605
No log 3.2235 274 0.5357 0.7154 0.5357 0.7319
No log 3.2471 276 0.5279 0.7330 0.5279 0.7266
No log 3.2706 278 0.5469 0.7251 0.5469 0.7395
No log 3.2941 280 0.5468 0.7423 0.5468 0.7395
No log 3.3176 282 0.5855 0.7161 0.5855 0.7652
No log 3.3412 284 0.6062 0.6371 0.6062 0.7786
No log 3.3647 286 0.6170 0.5797 0.6170 0.7855
No log 3.3882 288 0.6323 0.5510 0.6323 0.7951
No log 3.4118 290 0.6670 0.5873 0.6670 0.8167
No log 3.4353 292 0.6541 0.6062 0.6541 0.8088
No log 3.4588 294 0.6156 0.5923 0.6156 0.7846
No log 3.4824 296 0.6532 0.6388 0.6532 0.8082
No log 3.5059 298 0.7622 0.6629 0.7622 0.8730
No log 3.5294 300 0.7628 0.6387 0.7628 0.8734
No log 3.5529 302 0.6895 0.6293 0.6895 0.8304
No log 3.5765 304 0.6589 0.6642 0.6589 0.8117
No log 3.6 306 0.6532 0.6227 0.6532 0.8082
No log 3.6235 308 0.6416 0.6320 0.6416 0.8010
No log 3.6471 310 0.6314 0.6320 0.6314 0.7946
No log 3.6706 312 0.6166 0.6464 0.6166 0.7852
No log 3.6941 314 0.5957 0.6689 0.5957 0.7718
No log 3.7176 316 0.5880 0.6737 0.5880 0.7668
No log 3.7412 318 0.5991 0.6721 0.5991 0.7740
No log 3.7647 320 0.6468 0.6249 0.6468 0.8042
No log 3.7882 322 0.6346 0.6654 0.6346 0.7966
No log 3.8118 324 0.6020 0.6990 0.6020 0.7759
No log 3.8353 326 0.6588 0.6972 0.6588 0.8117
No log 3.8588 328 0.7476 0.5741 0.7476 0.8647
No log 3.8824 330 0.6930 0.6236 0.6930 0.8325
No log 3.9059 332 0.6179 0.5740 0.6179 0.7861
No log 3.9294 334 0.5885 0.5402 0.5885 0.7671
No log 3.9529 336 0.6051 0.5171 0.6051 0.7779
No log 3.9765 338 0.6070 0.5402 0.6070 0.7791
No log 4.0 340 0.6083 0.5712 0.6083 0.7799
No log 4.0235 342 0.6023 0.5602 0.6023 0.7761
No log 4.0471 344 0.5872 0.6383 0.5872 0.7663
No log 4.0706 346 0.6127 0.6184 0.6127 0.7828
No log 4.0941 348 0.6008 0.6302 0.6008 0.7751
No log 4.1176 350 0.5828 0.6598 0.5828 0.7634
No log 4.1412 352 0.5736 0.6124 0.5736 0.7574
No log 4.1647 354 0.5947 0.6099 0.5947 0.7712
No log 4.1882 356 0.5796 0.5988 0.5796 0.7613
No log 4.2118 358 0.5683 0.6039 0.5683 0.7539
No log 4.2353 360 0.6341 0.6493 0.6341 0.7963
No log 4.2588 362 0.6269 0.6664 0.6269 0.7918
No log 4.2824 364 0.5768 0.6976 0.5768 0.7595
No log 4.3059 366 0.6097 0.6091 0.6097 0.7808
No log 4.3294 368 0.6988 0.5638 0.6988 0.8359
No log 4.3529 370 0.6692 0.6230 0.6692 0.8181
No log 4.3765 372 0.5943 0.6253 0.5943 0.7709
No log 4.4 374 0.5567 0.6909 0.5567 0.7461
No log 4.4235 376 0.6129 0.7106 0.6129 0.7829
No log 4.4471 378 0.6626 0.6761 0.6626 0.8140
No log 4.4706 380 0.6202 0.6815 0.6202 0.7876
No log 4.4941 382 0.5572 0.6717 0.5572 0.7465
No log 4.5176 384 0.5870 0.6174 0.5870 0.7661
No log 4.5412 386 0.5988 0.5770 0.5988 0.7738
No log 4.5647 388 0.5741 0.6032 0.5741 0.7577
No log 4.5882 390 0.5638 0.6224 0.5638 0.7509
No log 4.6118 392 0.5798 0.6909 0.5798 0.7614
No log 4.6353 394 0.5968 0.6819 0.5968 0.7725
No log 4.6588 396 0.5980 0.6819 0.5980 0.7733
No log 4.6824 398 0.5985 0.7096 0.5985 0.7736
No log 4.7059 400 0.5990 0.6543 0.5990 0.7739
No log 4.7294 402 0.6082 0.6410 0.6082 0.7799
No log 4.7529 404 0.5891 0.6491 0.5891 0.7675
No log 4.7765 406 0.6080 0.6509 0.6080 0.7797
No log 4.8 408 0.6366 0.6123 0.6366 0.7979
No log 4.8235 410 0.6450 0.6123 0.6450 0.8031
No log 4.8471 412 0.6535 0.5657 0.6535 0.8084
No log 4.8706 414 0.6715 0.5862 0.6715 0.8194
No log 4.8941 416 0.6779 0.6160 0.6779 0.8233
No log 4.9176 418 0.6529 0.6451 0.6529 0.8080
No log 4.9412 420 0.6290 0.6305 0.6290 0.7931
No log 4.9647 422 0.6188 0.6128 0.6188 0.7866
No log 4.9882 424 0.6094 0.6123 0.6094 0.7806
No log 5.0118 426 0.6041 0.6320 0.6041 0.7772
No log 5.0353 428 0.5975 0.6518 0.5975 0.7730
No log 5.0588 430 0.6013 0.6482 0.6013 0.7754
No log 5.0824 432 0.6186 0.6406 0.6186 0.7865
No log 5.1059 434 0.6379 0.6406 0.6379 0.7987
No log 5.1294 436 0.6760 0.6018 0.6760 0.8222
No log 5.1529 438 0.7218 0.5658 0.7218 0.8496
No log 5.1765 440 0.7340 0.5491 0.7340 0.8567
No log 5.2 442 0.6824 0.5463 0.6824 0.8261
No log 5.2235 444 0.6376 0.4908 0.6376 0.7985
No log 5.2471 446 0.6318 0.5905 0.6318 0.7948
No log 5.2706 448 0.6031 0.6419 0.6031 0.7766
No log 5.2941 450 0.5777 0.7064 0.5777 0.7600
No log 5.3176 452 0.5983 0.7219 0.5983 0.7735
No log 5.3412 454 0.6132 0.7325 0.6132 0.7831
No log 5.3647 456 0.5854 0.7232 0.5854 0.7651
No log 5.3882 458 0.5603 0.6942 0.5603 0.7485
No log 5.4118 460 0.5642 0.7064 0.5642 0.7511
No log 5.4353 462 0.5835 0.6545 0.5835 0.7639
No log 5.4588 464 0.6153 0.6397 0.6153 0.7844
No log 5.4824 466 0.6395 0.6552 0.6395 0.7997
No log 5.5059 468 0.6411 0.6651 0.6411 0.8007
No log 5.5294 470 0.6606 0.6857 0.6606 0.8128
No log 5.5529 472 0.6592 0.6826 0.6592 0.8119
No log 5.5765 474 0.6577 0.6840 0.6577 0.8110
No log 5.6 476 0.6413 0.6694 0.6413 0.8008
No log 5.6235 478 0.6292 0.6251 0.6292 0.7932
No log 5.6471 480 0.6140 0.6473 0.6140 0.7836
No log 5.6706 482 0.6250 0.6895 0.6250 0.7906
No log 5.6941 484 0.7191 0.6233 0.7191 0.8480
No log 5.7176 486 0.7369 0.6263 0.7369 0.8584
No log 5.7412 488 0.7132 0.6742 0.7132 0.8445
No log 5.7647 490 0.6410 0.6797 0.6410 0.8006
No log 5.7882 492 0.6150 0.7014 0.6150 0.7842
No log 5.8118 494 0.5915 0.6284 0.5915 0.7691
No log 5.8353 496 0.5813 0.6447 0.5813 0.7624
No log 5.8588 498 0.5870 0.6736 0.5870 0.7661
0.2548 5.8824 500 0.6026 0.6263 0.6026 0.7763
0.2548 5.9059 502 0.6105 0.6388 0.6105 0.7813
0.2548 5.9294 504 0.6071 0.6388 0.6071 0.7792
0.2548 5.9529 506 0.6340 0.6151 0.6340 0.7962
0.2548 5.9765 508 0.6114 0.6501 0.6114 0.7819
0.2548 6.0 510 0.5940 0.6796 0.5940 0.7707
0.2548 6.0235 512 0.6194 0.6444 0.6194 0.7870
0.2548 6.0471 514 0.6244 0.6444 0.6244 0.7902
0.2548 6.0706 516 0.6020 0.6876 0.6020 0.7759
0.2548 6.0941 518 0.6120 0.6333 0.6120 0.7823
0.2548 6.1176 520 0.6802 0.5930 0.6802 0.8247
0.2548 6.1412 522 0.7218 0.6260 0.7218 0.8496
0.2548 6.1647 524 0.7174 0.6395 0.7174 0.8470
0.2548 6.1882 526 0.6903 0.6528 0.6903 0.8308
0.2548 6.2118 528 0.6256 0.6412 0.6256 0.7910
0.2548 6.2353 530 0.5819 0.6909 0.5819 0.7628
0.2548 6.2588 532 0.5751 0.6923 0.5751 0.7583
0.2548 6.2824 534 0.5810 0.6642 0.5810 0.7622
0.2548 6.3059 536 0.6170 0.6293 0.6170 0.7855
0.2548 6.3294 538 0.6374 0.6357 0.6374 0.7984
0.2548 6.3529 540 0.6613 0.6189 0.6613 0.8132
0.2548 6.3765 542 0.6390 0.6128 0.6390 0.7993
0.2548 6.4 544 0.6037 0.5853 0.6037 0.7770
0.2548 6.4235 546 0.5821 0.5759 0.5821 0.7629
0.2548 6.4471 548 0.5897 0.5977 0.5897 0.7679
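
The card provides no usage code. Below is a minimal inference sketch, assuming the checkpoint is published under the repository name above and exposes a single-output regression head (suggested by the MSE/RMSE metrics).

```python
# Minimal inference sketch (assumed usage, not from the card): load the
# fine-tuned checkpoint and predict an organization score for one essay.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay to score (placeholder)
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"Predicted organization score: {score:.2f}")
```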

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B parameters (F32, Safetensors)