ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7175
  • Qwk (quadratic weighted kappa): 0.5017
  • Mse (mean squared error): 0.7175
  • Rmse (root mean squared error): 0.8471
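The evaluation metrics above can be reproduced with standard tooling: QWK is `cohen_kappa_score` with quadratic weights, and RMSE is the square root of the MSE. A minimal sketch on hypothetical gold and predicted essay-organization scores (the score values below are made up for illustration):

```python
from math import sqrt
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted organization scores for six essays.
y_true = [0, 1, 2, 2, 3, 1]
y_pred = [0, 1, 1, 2, 3, 2]

# Quadratic weighted kappa, as reported in the "Qwk" column.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
# Mean squared error and its square root, as in the "Mse"/"Rmse" columns.
mse = mean_squared_error(y_true, y_pred)
rmse = sqrt(mse)
print(qwk, mse, rmse)
```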

Model description

More information needed

Intended uses & limitations

More information needed
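Although the intended uses are not documented, the checkpoint can be loaded like any Hugging Face sequence-classification model. A sketch, assuming the model head outputs a single regression score for essay organization (the Arabic input text here is a placeholder):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k17_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Placeholder essay text; replace with a real Arabic essay.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```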

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
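The hyperparameters above map directly onto `transformers.TrainingArguments` keyword arguments. A sketch of that mapping (argument names follow the TrainingArguments API; any settings not listed above, such as the output directory, are not reproduced here):

```python
# Hyperparameters from the list above, expressed as keyword arguments
# that could be passed to transformers.TrainingArguments.
training_kwargs = dict(
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```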

Training results

The table below reports validation metrics every 2 training steps. "No log" in the training-loss column means the Trainer had not yet reached its first logging interval; the first logged training loss (0.2697) appears at step 500. Note that validation Qwk peaks at 0.6947 around epoch 1.2 (step 102) and falls to 0.5017 by the final step.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0235 2 3.7414 -0.0347 3.7414 1.9343
No log 0.0471 4 1.9167 0.0123 1.9167 1.3844
No log 0.0706 6 1.1725 0.0996 1.1725 1.0828
No log 0.0941 8 1.0441 0.2865 1.0441 1.0218
No log 0.1176 10 1.1342 0.1561 1.1342 1.0650
No log 0.1412 12 1.4953 -0.0394 1.4953 1.2228
No log 0.1647 14 1.4905 -0.0328 1.4905 1.2209
No log 0.1882 16 1.1992 0.1148 1.1992 1.0951
No log 0.2118 18 1.0063 0.3071 1.0063 1.0031
No log 0.2353 20 0.9757 0.3175 0.9757 0.9878
No log 0.2588 22 0.8963 0.3562 0.8963 0.9467
No log 0.2824 24 0.9584 0.3026 0.9584 0.9790
No log 0.3059 26 0.8766 0.3851 0.8766 0.9363
No log 0.3294 28 0.7364 0.4533 0.7364 0.8581
No log 0.3529 30 0.9499 0.2569 0.9499 0.9746
No log 0.3765 32 0.9736 0.2849 0.9736 0.9867
No log 0.4 34 0.7433 0.5057 0.7433 0.8621
No log 0.4235 36 0.8537 0.4356 0.8537 0.9240
No log 0.4471 38 1.0734 0.4236 1.0734 1.0360
No log 0.4706 40 1.0008 0.4706 1.0008 1.0004
No log 0.4941 42 0.7130 0.5483 0.7130 0.8444
No log 0.5176 44 0.6132 0.6673 0.6132 0.7831
No log 0.5412 46 0.6252 0.6349 0.6252 0.7907
No log 0.5647 48 0.6690 0.5633 0.6690 0.8180
No log 0.5882 50 0.9505 0.5181 0.9505 0.9750
No log 0.6118 52 0.7894 0.6101 0.7894 0.8885
No log 0.6353 54 0.6546 0.6094 0.6546 0.8091
No log 0.6588 56 0.6752 0.6168 0.6752 0.8217
No log 0.6824 58 0.8463 0.5988 0.8463 0.9200
No log 0.7059 60 0.7419 0.6176 0.7419 0.8613
No log 0.7294 62 0.8354 0.6064 0.8354 0.9140
No log 0.7529 64 0.8874 0.5414 0.8874 0.9420
No log 0.7765 66 0.7877 0.6596 0.7877 0.8875
No log 0.8 68 0.6779 0.6595 0.6779 0.8234
No log 0.8235 70 0.6568 0.5940 0.6568 0.8104
No log 0.8471 72 0.6851 0.6194 0.6851 0.8277
No log 0.8706 74 0.6973 0.6194 0.6973 0.8350
No log 0.8941 76 0.6355 0.5841 0.6355 0.7972
No log 0.9176 78 0.7045 0.4888 0.7045 0.8393
No log 0.9412 80 0.8287 0.5283 0.8287 0.9103
No log 0.9647 82 0.8444 0.5073 0.8444 0.9189
No log 0.9882 84 0.7743 0.5711 0.7743 0.8800
No log 1.0118 86 0.6818 0.6492 0.6818 0.8257
No log 1.0353 88 0.6929 0.6571 0.6929 0.8324
No log 1.0588 90 0.7368 0.6511 0.7368 0.8583
No log 1.0824 92 0.9423 0.5614 0.9423 0.9707
No log 1.1059 94 1.1541 0.4632 1.1541 1.0743
No log 1.1294 96 0.9107 0.5536 0.9107 0.9543
No log 1.1529 98 0.6433 0.6811 0.6433 0.8021
No log 1.1765 100 0.5977 0.6630 0.5977 0.7731
No log 1.2 102 0.5967 0.6947 0.5967 0.7725
No log 1.2235 104 0.6216 0.6911 0.6216 0.7884
No log 1.2471 106 0.6598 0.6884 0.6598 0.8123
No log 1.2706 108 0.7820 0.6157 0.7820 0.8843
No log 1.2941 110 0.8741 0.5410 0.8741 0.9349
No log 1.3176 112 0.7636 0.6291 0.7636 0.8738
No log 1.3412 114 0.7184 0.6285 0.7184 0.8476
No log 1.3647 116 0.7906 0.6199 0.7906 0.8892
No log 1.3882 118 0.7364 0.5850 0.7364 0.8581
No log 1.4118 120 0.7148 0.5752 0.7148 0.8455
No log 1.4353 122 0.7430 0.5641 0.7430 0.8620
No log 1.4588 124 0.7060 0.5659 0.7060 0.8402
No log 1.4824 126 0.7005 0.5425 0.7005 0.8370
No log 1.5059 128 0.6667 0.5977 0.6667 0.8165
No log 1.5294 130 0.6438 0.6104 0.6438 0.8024
No log 1.5529 132 0.6344 0.6433 0.6344 0.7965
No log 1.5765 134 0.6261 0.6546 0.6261 0.7913
No log 1.6 136 0.6158 0.6546 0.6158 0.7847
No log 1.6235 138 0.6231 0.6243 0.6231 0.7894
No log 1.6471 140 0.6130 0.6546 0.6130 0.7830
No log 1.6706 142 0.6917 0.5674 0.6917 0.8317
No log 1.6941 144 0.7099 0.5566 0.7099 0.8426
No log 1.7176 146 0.6520 0.6433 0.6520 0.8074
No log 1.7412 148 0.7096 0.5848 0.7096 0.8424
No log 1.7647 150 0.6916 0.6642 0.6916 0.8317
No log 1.7882 152 0.7046 0.5868 0.7046 0.8394
No log 1.8118 154 0.7030 0.5868 0.7030 0.8385
No log 1.8353 156 0.6501 0.6364 0.6501 0.8063
No log 1.8588 158 0.6394 0.6186 0.6394 0.7996
No log 1.8824 160 0.6248 0.6720 0.6248 0.7905
No log 1.9059 162 0.6397 0.6438 0.6397 0.7998
No log 1.9294 164 0.6100 0.6488 0.6100 0.7810
No log 1.9529 166 0.6034 0.6756 0.6034 0.7768
No log 1.9765 168 0.5783 0.6546 0.5783 0.7605
No log 2.0 170 0.5781 0.6400 0.5781 0.7603
No log 2.0235 172 0.6030 0.5905 0.6030 0.7765
No log 2.0471 174 0.6294 0.6003 0.6294 0.7934
No log 2.0706 176 0.5889 0.6566 0.5889 0.7674
No log 2.0941 178 0.6170 0.6364 0.6170 0.7855
No log 2.1176 180 0.6373 0.6330 0.6373 0.7983
No log 2.1412 182 0.6724 0.6254 0.6724 0.8200
No log 2.1647 184 0.6480 0.6254 0.6480 0.8050
No log 2.1882 186 0.6342 0.6284 0.6342 0.7964
No log 2.2118 188 0.6126 0.6772 0.6126 0.7827
No log 2.2353 190 0.5869 0.6433 0.5869 0.7661
No log 2.2588 192 0.5982 0.6667 0.5982 0.7735
No log 2.2824 194 0.6092 0.6598 0.6092 0.7805
No log 2.3059 196 0.6489 0.6656 0.6489 0.8056
No log 2.3294 198 0.6942 0.6586 0.6942 0.8332
No log 2.3529 200 0.6698 0.6751 0.6698 0.8184
No log 2.3765 202 0.6765 0.6410 0.6765 0.8225
No log 2.4 204 0.6526 0.5995 0.6526 0.8078
No log 2.4235 206 0.6725 0.6157 0.6725 0.8200
No log 2.4471 208 0.6697 0.6206 0.6697 0.8184
No log 2.4706 210 0.7135 0.5636 0.7135 0.8447
No log 2.4941 212 0.7094 0.5439 0.7094 0.8422
No log 2.5176 214 0.6880 0.6114 0.6880 0.8295
No log 2.5412 216 0.7291 0.5720 0.7291 0.8539
No log 2.5647 218 0.7333 0.5545 0.7333 0.8563
No log 2.5882 220 0.7357 0.5545 0.7357 0.8577
No log 2.6118 222 0.7264 0.5944 0.7264 0.8523
No log 2.6353 224 0.7089 0.6254 0.7089 0.8419
No log 2.6588 226 0.6615 0.6473 0.6615 0.8134
No log 2.6824 228 0.6724 0.5622 0.6724 0.8200
No log 2.7059 230 0.6718 0.6015 0.6718 0.8196
No log 2.7294 232 0.6500 0.6553 0.6500 0.8063
No log 2.7529 234 0.7057 0.6692 0.7057 0.8401
No log 2.7765 236 0.8346 0.5879 0.8346 0.9136
No log 2.8 238 1.0029 0.5534 1.0029 1.0014
No log 2.8235 240 0.9000 0.5426 0.9000 0.9487
No log 2.8471 242 0.6736 0.6160 0.6736 0.8207
No log 2.8706 244 0.6489 0.6049 0.6489 0.8056
No log 2.8941 246 0.6700 0.6089 0.6700 0.8186
No log 2.9176 248 0.6532 0.6196 0.6532 0.8082
No log 2.9412 250 0.7161 0.5588 0.7161 0.8462
No log 2.9647 252 0.8304 0.5830 0.8304 0.9113
No log 2.9882 254 0.8390 0.5830 0.8390 0.9160
No log 3.0118 256 0.7597 0.5934 0.7597 0.8716
No log 3.0353 258 0.6643 0.6610 0.6643 0.8150
No log 3.0588 260 0.6642 0.6667 0.6642 0.8150
No log 3.0824 262 0.6548 0.6649 0.6548 0.8092
No log 3.1059 264 0.7111 0.5599 0.7111 0.8433
No log 3.1294 266 0.7578 0.6151 0.7578 0.8705
No log 3.1529 268 0.7237 0.6064 0.7237 0.8507
No log 3.1765 270 0.6429 0.6634 0.6429 0.8018
No log 3.2 272 0.7028 0.5815 0.7028 0.8383
No log 3.2235 274 0.7766 0.5228 0.7766 0.8813
No log 3.2471 276 0.7878 0.4934 0.7878 0.8876
No log 3.2706 278 0.7006 0.5412 0.7006 0.8370
No log 3.2941 280 0.6708 0.6335 0.6708 0.8190
No log 3.3176 282 0.7709 0.5163 0.7709 0.8780
No log 3.3412 284 0.8356 0.5455 0.8356 0.9141
No log 3.3647 286 0.7938 0.5566 0.7938 0.8910
No log 3.3882 288 0.7004 0.5830 0.7004 0.8369
No log 3.4118 290 0.6749 0.6272 0.6749 0.8215
No log 3.4353 292 0.7514 0.6346 0.7514 0.8668
No log 3.4588 294 0.7290 0.6346 0.7290 0.8538
No log 3.4824 296 0.6738 0.5981 0.6738 0.8209
No log 3.5059 298 0.6536 0.6244 0.6536 0.8085
No log 3.5294 300 0.6884 0.5645 0.6884 0.8297
No log 3.5529 302 0.7205 0.5079 0.7205 0.8488
No log 3.5765 304 0.7356 0.5178 0.7356 0.8577
No log 3.6 306 0.7775 0.5358 0.7775 0.8818
No log 3.6235 308 0.7811 0.5414 0.7811 0.8838
No log 3.6471 310 0.7008 0.5968 0.7008 0.8371
No log 3.6706 312 0.6465 0.6528 0.6465 0.8041
No log 3.6941 314 0.6305 0.6788 0.6305 0.7941
No log 3.7176 316 0.6398 0.6045 0.6398 0.7999
No log 3.7412 318 0.7140 0.5746 0.7140 0.8450
No log 3.7647 320 0.7623 0.5943 0.7623 0.8731
No log 3.7882 322 0.7282 0.5658 0.7282 0.8533
No log 3.8118 324 0.6656 0.5540 0.6656 0.8159
No log 3.8353 326 0.6564 0.5528 0.6564 0.8102
No log 3.8588 328 0.6331 0.5558 0.6331 0.7957
No log 3.8824 330 0.6036 0.6262 0.6036 0.7769
No log 3.9059 332 0.5802 0.6470 0.5802 0.7617
No log 3.9294 334 0.5547 0.6451 0.5547 0.7448
No log 3.9529 336 0.5476 0.6441 0.5476 0.7400
No log 3.9765 338 0.5438 0.6441 0.5438 0.7374
No log 4.0 340 0.5405 0.6451 0.5405 0.7352
No log 4.0235 342 0.5557 0.5771 0.5557 0.7454
No log 4.0471 344 0.6065 0.5491 0.6065 0.7788
No log 4.0706 346 0.6563 0.5005 0.6563 0.8101
No log 4.0941 348 0.6136 0.5464 0.6136 0.7833
No log 4.1176 350 0.6135 0.5156 0.6135 0.7832
No log 4.1412 352 0.6785 0.6547 0.6785 0.8237
No log 4.1647 354 0.7776 0.5889 0.7776 0.8818
No log 4.1882 356 0.7592 0.6503 0.7592 0.8713
No log 4.2118 358 0.7276 0.6275 0.7276 0.8530
No log 4.2353 360 0.6824 0.6371 0.6824 0.8261
No log 4.2588 362 0.6539 0.6229 0.6539 0.8086
No log 4.2824 364 0.6564 0.6054 0.6564 0.8102
No log 4.3059 366 0.6781 0.5540 0.6781 0.8235
No log 4.3294 368 0.6861 0.5885 0.6861 0.8283
No log 4.3529 370 0.6534 0.5944 0.6534 0.8083
No log 4.3765 372 0.6382 0.6114 0.6382 0.7989
No log 4.4 374 0.6356 0.6266 0.6356 0.7972
No log 4.4235 376 0.6675 0.6091 0.6675 0.8170
No log 4.4471 378 0.7400 0.6029 0.7400 0.8603
No log 4.4706 380 0.7622 0.6322 0.7622 0.8731
No log 4.4941 382 0.6966 0.5729 0.6966 0.8346
No log 4.5176 384 0.6271 0.6537 0.6271 0.7919
No log 4.5412 386 0.6188 0.6078 0.6188 0.7867
No log 4.5647 388 0.6070 0.6461 0.6070 0.7791
No log 4.5882 390 0.5981 0.6720 0.5981 0.7734
No log 4.6118 392 0.6403 0.6547 0.6403 0.8002
No log 4.6353 394 0.6764 0.6322 0.6764 0.8224
No log 4.6588 396 0.6391 0.6025 0.6391 0.7994
No log 4.6824 398 0.5871 0.6441 0.5871 0.7662
No log 4.7059 400 0.5763 0.6013 0.5763 0.7591
No log 4.7294 402 0.5615 0.6733 0.5615 0.7493
No log 4.7529 404 0.5547 0.6861 0.5547 0.7448
No log 4.7765 406 0.5974 0.6699 0.5974 0.7729
No log 4.8 408 0.6147 0.6699 0.6147 0.7840
No log 4.8235 410 0.6440 0.6528 0.6440 0.8025
No log 4.8471 412 0.6730 0.6209 0.6730 0.8203
No log 4.8706 414 0.6686 0.5472 0.6686 0.8177
No log 4.8941 416 0.6429 0.5746 0.6429 0.8018
No log 4.9176 418 0.6398 0.5656 0.6398 0.7999
No log 4.9412 420 0.6259 0.5656 0.6259 0.7912
No log 4.9647 422 0.6114 0.6335 0.6114 0.7819
No log 4.9882 424 0.7285 0.6468 0.7285 0.8535
No log 5.0118 426 0.8412 0.5714 0.8412 0.9172
No log 5.0353 428 0.8156 0.5789 0.8156 0.9031
No log 5.0588 430 0.6941 0.6573 0.6941 0.8331
No log 5.0824 432 0.6200 0.6177 0.6200 0.7874
No log 5.1059 434 0.6089 0.6409 0.6089 0.7803
No log 5.1294 436 0.6355 0.6217 0.6355 0.7972
No log 5.1529 438 0.6306 0.6024 0.6306 0.7941
No log 5.1765 440 0.6258 0.5747 0.6258 0.7911
No log 5.2 442 0.6657 0.6397 0.6657 0.8159
No log 5.2235 444 0.7229 0.5595 0.7229 0.8503
No log 5.2471 446 0.7313 0.5498 0.7313 0.8552
No log 5.2706 448 0.6771 0.6360 0.6771 0.8229
No log 5.2941 450 0.6219 0.5622 0.6219 0.7886
No log 5.3176 452 0.6135 0.6207 0.6135 0.7832
No log 5.3412 454 0.6165 0.6164 0.6165 0.7851
No log 5.3647 456 0.6084 0.5961 0.6084 0.7800
No log 5.3882 458 0.5982 0.6470 0.5982 0.7734
No log 5.4118 460 0.6052 0.6018 0.6052 0.7779
No log 5.4353 462 0.6607 0.6228 0.6607 0.8129
No log 5.4588 464 0.7161 0.6199 0.7161 0.8462
No log 5.4824 466 0.7002 0.5938 0.7002 0.8368
No log 5.5059 468 0.6715 0.5948 0.6715 0.8194
No log 5.5294 470 0.6275 0.6335 0.6275 0.7921
No log 5.5529 472 0.6124 0.6297 0.6124 0.7825
No log 5.5765 474 0.6257 0.6275 0.6257 0.7910
No log 5.6 476 0.6666 0.6224 0.6666 0.8164
No log 5.6235 478 0.6993 0.6474 0.6993 0.8362
No log 5.6471 480 0.6777 0.6197 0.6777 0.8232
No log 5.6706 482 0.6273 0.6414 0.6273 0.7920
No log 5.6941 484 0.6277 0.5841 0.6277 0.7923
No log 5.7176 486 0.6616 0.5641 0.6616 0.8134
No log 5.7412 488 0.6588 0.5300 0.6588 0.8117
No log 5.7647 490 0.6359 0.5759 0.6359 0.7974
No log 5.7882 492 0.6271 0.6566 0.6271 0.7919
No log 5.8118 494 0.6515 0.6554 0.6515 0.8072
No log 5.8353 496 0.6489 0.6684 0.6489 0.8056
No log 5.8588 498 0.6159 0.6343 0.6159 0.7848
0.2697 5.8824 500 0.5883 0.6260 0.5883 0.7670
0.2697 5.9059 502 0.5821 0.6627 0.5821 0.7629
0.2697 5.9294 504 0.5815 0.6330 0.5815 0.7626
0.2697 5.9529 506 0.5897 0.6345 0.5897 0.7679
0.2697 5.9765 508 0.5804 0.6217 0.5804 0.7619
0.2697 6.0 510 0.5846 0.6217 0.5846 0.7646
0.2697 6.0235 512 0.6016 0.6564 0.6016 0.7756
0.2697 6.0471 514 0.5896 0.5925 0.5896 0.7679
0.2697 6.0706 516 0.5831 0.6102 0.5831 0.7636
0.2697 6.0941 518 0.5708 0.5925 0.5708 0.7555
0.2697 6.1176 520 0.5554 0.6330 0.5554 0.7453
0.2697 6.1412 522 0.5516 0.6658 0.5516 0.7427
0.2697 6.1647 524 0.5476 0.6658 0.5476 0.7400
0.2697 6.1882 526 0.5653 0.6195 0.5653 0.7518
0.2697 6.2118 528 0.5688 0.6217 0.5688 0.7542
0.2697 6.2353 530 0.5640 0.6217 0.5640 0.7510
0.2697 6.2588 532 0.5661 0.6364 0.5661 0.7524
0.2697 6.2824 534 0.5771 0.6364 0.5771 0.7597
0.2697 6.3059 536 0.5925 0.6537 0.5925 0.7698
0.2697 6.3294 538 0.6075 0.6311 0.6075 0.7794
0.2697 6.3529 540 0.6230 0.6370 0.6230 0.7893
0.2697 6.3765 542 0.6356 0.6407 0.6356 0.7972
0.2697 6.4 544 0.6519 0.5759 0.6519 0.8074
0.2697 6.4235 546 0.6495 0.5536 0.6495 0.8059
0.2697 6.4471 548 0.6358 0.5432 0.6358 0.7974
0.2697 6.4706 550 0.6560 0.5770 0.6560 0.8100
0.2697 6.4941 552 0.7175 0.5017 0.7175 0.8471

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1