ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6497
  • Qwk: 0.5173
  • Mse: 0.6497
  • Rmse: 0.8061
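Qwk here is Quadratic Weighted Kappa, the standard agreement metric for ordinal scoring tasks such as essay trait grading; note that Loss equals Mse throughout, which suggests the model is trained as a regressor with an MSE objective. A minimal plain-Python sketch of how these metrics can be computed (a reimplementation for illustration, not the exact evaluation code used for this model):

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            # Quadratic disagreement weight.
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            # Expected count under chance agreement (independent marginals).
            expected = hist_true[i] * hist_pred[j] / n
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, mse ** 0.5
```

Perfect agreement yields a kappa of 1.0, chance-level agreement yields 0.0, and RMSE is always the square root of MSE, which is why the Mse and Rmse columns in the results below move in lockstep.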

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
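Under a standard Hugging Face Trainer setup, these settings correspond roughly to the following TrainingArguments. This is a hedged reconstruction: output_dir is a hypothetical name, and eval_steps/logging_steps are inferred from the results table (validation every 2 steps; training loss first logged at step 500), not stated explicitly in the card.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of this run's configuration; the actual
# training script and dataset are not published with the card.
training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    eval_strategy="steps",   # validation metrics every 2 steps, per the table
    eval_steps=2,
    logging_steps=500,       # training loss appears as "No log" before step 500
)
```

The Adam settings listed above (betas=(0.9, 0.999), epsilon=1e-08) are the Trainer defaults, so they need no explicit arguments here.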

Training results

Validation metrics were computed every 2 training steps. The training loss was logged every 500 steps, so it appears as "No log" until step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0571 2 3.9969 -0.0062 3.9969 1.9992
No log 0.1143 4 2.2105 0.0890 2.2105 1.4868
No log 0.1714 6 1.2801 0.0380 1.2801 1.1314
No log 0.2286 8 1.0062 0.2611 1.0062 1.0031
No log 0.2857 10 0.9266 0.3876 0.9266 0.9626
No log 0.3429 12 0.8824 0.4237 0.8824 0.9394
No log 0.4 14 0.9008 0.4565 0.9008 0.9491
No log 0.4571 16 0.9350 0.3184 0.9350 0.9669
No log 0.5143 18 1.0550 0.1616 1.0550 1.0271
No log 0.5714 20 1.0138 0.2541 1.0138 1.0069
No log 0.6286 22 0.8709 0.3519 0.8709 0.9332
No log 0.6857 24 0.8091 0.4781 0.8091 0.8995
No log 0.7429 26 0.8079 0.4490 0.8079 0.8989
No log 0.8 28 0.8474 0.3815 0.8474 0.9205
No log 0.8571 30 0.8820 0.4293 0.8820 0.9392
No log 0.9143 32 0.7675 0.4537 0.7675 0.8760
No log 0.9714 34 0.7216 0.5188 0.7216 0.8495
No log 1.0286 36 0.8064 0.4399 0.8064 0.8980
No log 1.0857 38 0.8122 0.5027 0.8122 0.9012
No log 1.1429 40 0.7666 0.5242 0.7666 0.8756
No log 1.2 42 0.6858 0.5582 0.6858 0.8281
No log 1.2571 44 0.8489 0.4563 0.8489 0.9214
No log 1.3143 46 0.9007 0.4600 0.9007 0.9490
No log 1.3714 48 0.8004 0.4749 0.8004 0.8947
No log 1.4286 50 0.7625 0.4996 0.7625 0.8732
No log 1.4857 52 0.7229 0.5431 0.7229 0.8503
No log 1.5429 54 0.6929 0.6337 0.6929 0.8324
No log 1.6 56 0.6737 0.6138 0.6737 0.8208
No log 1.6571 58 0.6670 0.6011 0.6670 0.8167
No log 1.7143 60 0.6227 0.6690 0.6227 0.7891
No log 1.7714 62 0.6048 0.6886 0.6048 0.7777
No log 1.8286 64 0.6587 0.6337 0.6587 0.8116
No log 1.8857 66 0.7505 0.5192 0.7505 0.8663
No log 1.9429 68 0.8062 0.4838 0.8062 0.8979
No log 2.0 70 0.9028 0.5232 0.9028 0.9502
No log 2.0571 72 0.8858 0.4976 0.8858 0.9412
No log 2.1143 74 0.8634 0.5387 0.8634 0.9292
No log 2.1714 76 0.8563 0.5532 0.8563 0.9254
No log 2.2286 78 0.7845 0.5625 0.7845 0.8857
No log 2.2857 80 0.7218 0.5871 0.7218 0.8496
No log 2.3429 82 0.6614 0.6306 0.6614 0.8132
No log 2.4 84 0.6307 0.5806 0.6307 0.7942
No log 2.4571 86 0.6498 0.5817 0.6498 0.8061
No log 2.5143 88 0.6576 0.6206 0.6576 0.8109
No log 2.5714 90 0.6771 0.5964 0.6771 0.8228
No log 2.6286 92 0.7641 0.5885 0.7641 0.8741
No log 2.6857 94 0.8053 0.5538 0.8053 0.8974
No log 2.7429 96 0.7011 0.6045 0.7011 0.8373
No log 2.8 98 0.6966 0.6306 0.6966 0.8346
No log 2.8571 100 0.7707 0.5506 0.7707 0.8779
No log 2.9143 102 0.7348 0.5587 0.7348 0.8572
No log 2.9714 104 0.7005 0.5853 0.7005 0.8370
No log 3.0286 106 0.7236 0.5832 0.7236 0.8506
No log 3.0857 108 0.7181 0.5741 0.7181 0.8474
No log 3.1429 110 0.7166 0.5819 0.7166 0.8465
No log 3.2 112 0.7508 0.5988 0.7508 0.8665
No log 3.2571 114 0.7155 0.5808 0.7155 0.8459
No log 3.3143 116 0.7712 0.5318 0.7712 0.8782
No log 3.3714 118 0.7677 0.5928 0.7677 0.8762
No log 3.4286 120 0.6672 0.6220 0.6672 0.8168
No log 3.4857 122 0.6793 0.4960 0.6793 0.8242
No log 3.5429 124 0.7555 0.4389 0.7555 0.8692
No log 3.6 126 0.7104 0.4738 0.7104 0.8428
No log 3.6571 128 0.6990 0.6371 0.6990 0.8361
No log 3.7143 130 0.8131 0.5600 0.8131 0.9017
No log 3.7714 132 0.7668 0.5995 0.7668 0.8757
No log 3.8286 134 0.7167 0.5766 0.7167 0.8466
No log 3.8857 136 0.6393 0.5407 0.6393 0.7995
No log 3.9429 138 0.6279 0.6018 0.6279 0.7924
No log 4.0 140 0.6348 0.6227 0.6348 0.7968
No log 4.0571 142 0.6599 0.5516 0.6599 0.8123
No log 4.1143 144 0.7337 0.5756 0.7337 0.8565
No log 4.1714 146 0.6853 0.5978 0.6853 0.8278
No log 4.2286 148 0.5856 0.6259 0.5856 0.7653
No log 4.2857 150 0.5725 0.6903 0.5725 0.7566
No log 4.3429 152 0.5909 0.6499 0.5909 0.7687
No log 4.4 154 0.6284 0.6228 0.6284 0.7927
No log 4.4571 156 0.8189 0.5538 0.8189 0.9049
No log 4.5143 158 1.0421 0.5455 1.0421 1.0208
No log 4.5714 160 0.9516 0.5376 0.9516 0.9755
No log 4.6286 162 0.7777 0.5745 0.7777 0.8819
No log 4.6857 164 0.6350 0.5626 0.6350 0.7968
No log 4.7429 166 0.6510 0.5345 0.6510 0.8068
No log 4.8 168 0.7387 0.5292 0.7387 0.8595
No log 4.8571 170 0.8820 0.3619 0.8820 0.9391
No log 4.9143 172 0.9993 0.3782 0.9993 0.9997
No log 4.9714 174 0.9257 0.4012 0.9257 0.9621
No log 5.0286 176 0.8150 0.5370 0.8150 0.9028
No log 5.0857 178 0.7128 0.5855 0.7128 0.8443
No log 5.1429 180 0.6887 0.5833 0.6887 0.8299
No log 5.2 182 0.6706 0.6322 0.6706 0.8189
No log 5.2571 184 0.6440 0.6208 0.6440 0.8025
No log 5.3143 186 0.6577 0.6029 0.6577 0.8110
No log 5.3714 188 0.6495 0.5855 0.6495 0.8059
No log 5.4286 190 0.6514 0.5654 0.6514 0.8071
No log 5.4857 192 0.7076 0.5763 0.7076 0.8412
No log 5.5429 194 0.7972 0.5385 0.7972 0.8929
No log 5.6 196 0.9166 0.4091 0.9166 0.9574
No log 5.6571 198 0.9309 0.4695 0.9309 0.9648
No log 5.7143 200 0.9216 0.4478 0.9216 0.9600
No log 5.7714 202 0.8138 0.4920 0.8138 0.9021
No log 5.8286 204 0.6838 0.5774 0.6838 0.8269
No log 5.8857 206 0.6481 0.6207 0.6481 0.8050
No log 5.9429 208 0.6549 0.6035 0.6549 0.8092
No log 6.0 210 0.7207 0.5566 0.7207 0.8490
No log 6.0571 212 0.7117 0.5885 0.7117 0.8436
No log 6.1143 214 0.6564 0.6311 0.6564 0.8102
No log 6.1714 216 0.6549 0.6197 0.6549 0.8093
No log 6.2286 218 0.6780 0.6228 0.6780 0.8234
No log 6.2857 220 0.6830 0.6028 0.6830 0.8264
No log 6.3429 222 0.6932 0.6028 0.6932 0.8326
No log 6.4 224 0.7365 0.5549 0.7365 0.8582
No log 6.4571 226 0.7480 0.5561 0.7480 0.8649
No log 6.5143 228 0.6694 0.5886 0.6694 0.8182
No log 6.5714 230 0.6410 0.6117 0.6410 0.8006
No log 6.6286 232 0.6499 0.6128 0.6499 0.8061
No log 6.6857 234 0.6602 0.6133 0.6602 0.8125
No log 6.7429 236 0.7084 0.5668 0.7084 0.8417
No log 6.8 238 0.7327 0.5766 0.7327 0.8560
No log 6.8571 240 0.7218 0.5833 0.7218 0.8496
No log 6.9143 242 0.7099 0.5375 0.7099 0.8426
No log 6.9714 244 0.7396 0.4817 0.7396 0.8600
No log 7.0286 246 0.7309 0.4908 0.7309 0.8550
No log 7.0857 248 0.7678 0.5466 0.7678 0.8762
No log 7.1429 250 0.7840 0.5355 0.7840 0.8855
No log 7.2 252 0.7110 0.5699 0.7110 0.8432
No log 7.2571 254 0.6759 0.5594 0.6759 0.8221
No log 7.3143 256 0.6763 0.5591 0.6763 0.8224
No log 7.3714 258 0.6608 0.5693 0.6608 0.8129
No log 7.4286 260 0.6443 0.6764 0.6443 0.8027
No log 7.4857 262 0.7846 0.5422 0.7846 0.8858
No log 7.5429 264 0.9160 0.4177 0.9160 0.9571
No log 7.6 266 0.8794 0.4135 0.8794 0.9378
No log 7.6571 268 0.7642 0.4745 0.7642 0.8742
No log 7.7143 270 0.7041 0.5722 0.7041 0.8391
No log 7.7714 272 0.6681 0.5880 0.6681 0.8174
No log 7.8286 274 0.6582 0.5833 0.6582 0.8113
No log 7.8857 276 0.6924 0.6079 0.6924 0.8321
No log 7.9429 278 0.7673 0.5515 0.7673 0.8760
No log 8.0 280 0.6941 0.6052 0.6941 0.8331
No log 8.0571 282 0.5800 0.6567 0.5800 0.7616
No log 8.1143 284 0.5529 0.6217 0.5529 0.7436
No log 8.1714 286 0.5643 0.6217 0.5643 0.7512
No log 8.2286 288 0.5935 0.6806 0.5935 0.7704
No log 8.2857 290 0.6886 0.5153 0.6886 0.8298
No log 8.3429 292 0.8636 0.4987 0.8636 0.9293
No log 8.4 294 0.9395 0.4987 0.9395 0.9693
No log 8.4571 296 0.8683 0.5086 0.8683 0.9319
No log 8.5143 298 0.7856 0.5014 0.7856 0.8863
No log 8.5714 300 0.7145 0.5319 0.7145 0.8453
No log 8.6286 302 0.6598 0.5288 0.6598 0.8123
No log 8.6857 304 0.6504 0.4813 0.6504 0.8065
No log 8.7429 306 0.6353 0.5042 0.6353 0.7971
No log 8.8 308 0.6710 0.5413 0.6710 0.8191
No log 8.8571 310 0.7788 0.5106 0.7788 0.8825
No log 8.9143 312 0.7840 0.5106 0.7840 0.8854
No log 8.9714 314 0.6614 0.5380 0.6614 0.8133
No log 9.0286 316 0.5929 0.6632 0.5929 0.7700
No log 9.0857 318 0.5821 0.6673 0.5821 0.7630
No log 9.1429 320 0.5891 0.6405 0.5891 0.7675
No log 9.2 322 0.6304 0.5877 0.6304 0.7940
No log 9.2571 324 0.6285 0.5777 0.6285 0.7928
No log 9.3143 326 0.6137 0.5898 0.6137 0.7834
No log 9.3714 328 0.5973 0.6209 0.5973 0.7729
No log 9.4286 330 0.6288 0.5998 0.6288 0.7929
No log 9.4857 332 0.6739 0.5777 0.6739 0.8209
No log 9.5429 334 0.7179 0.5766 0.7179 0.8473
No log 9.6 336 0.7311 0.6004 0.7311 0.8551
No log 9.6571 338 0.6896 0.5811 0.6896 0.8304
No log 9.7143 340 0.6462 0.5931 0.6462 0.8039
No log 9.7714 342 0.6595 0.5835 0.6595 0.8121
No log 9.8286 344 0.6612 0.5956 0.6612 0.8132
No log 9.8857 346 0.6632 0.5333 0.6632 0.8143
No log 9.9429 348 0.7651 0.5219 0.7651 0.8747
No log 10.0 350 0.9101 0.5295 0.9101 0.9540
No log 10.0571 352 0.8427 0.5295 0.8427 0.9180
No log 10.1143 354 0.6771 0.6343 0.6771 0.8229
No log 10.1714 356 0.5974 0.5317 0.5974 0.7729
No log 10.2286 358 0.5941 0.6175 0.5941 0.7708
No log 10.2857 360 0.5996 0.6065 0.5996 0.7744
No log 10.3429 362 0.6050 0.5678 0.6050 0.7778
No log 10.4 364 0.6750 0.6147 0.6750 0.8216
No log 10.4571 366 0.8149 0.5207 0.8149 0.9027
No log 10.5143 368 0.8449 0.5086 0.8449 0.9192
No log 10.5714 370 0.7494 0.5867 0.7494 0.8657
No log 10.6286 372 0.6334 0.6101 0.6334 0.7959
No log 10.6857 374 0.6144 0.6569 0.6144 0.7838
No log 10.7429 376 0.6253 0.6620 0.6253 0.7908
No log 10.8 378 0.6364 0.6350 0.6364 0.7977
No log 10.8571 380 0.6380 0.6231 0.6380 0.7988
No log 10.9143 382 0.6450 0.5858 0.6450 0.8031
No log 10.9714 384 0.6306 0.5751 0.6306 0.7941
No log 11.0286 386 0.6213 0.5858 0.6213 0.7882
No log 11.0857 388 0.6130 0.5718 0.6130 0.7830
No log 11.1429 390 0.6415 0.5811 0.6415 0.8009
No log 11.2 392 0.6761 0.5788 0.6761 0.8222
No log 11.2571 394 0.6836 0.6004 0.6836 0.8268
No log 11.3143 396 0.6549 0.6004 0.6549 0.8092
No log 11.3714 398 0.6075 0.5923 0.6075 0.7794
No log 11.4286 400 0.5928 0.6209 0.5928 0.7699
No log 11.4857 402 0.5996 0.6539 0.5996 0.7743
No log 11.5429 404 0.6392 0.6209 0.6392 0.7995
No log 11.6 406 0.6446 0.5888 0.6446 0.8029
No log 11.6571 408 0.6635 0.5912 0.6635 0.8146
No log 11.7143 410 0.6583 0.6042 0.6583 0.8113
No log 11.7714 412 0.6616 0.6071 0.6616 0.8134
No log 11.8286 414 0.6316 0.5823 0.6316 0.7947
No log 11.8857 416 0.6382 0.6187 0.6382 0.7989
No log 11.9429 418 0.6738 0.5602 0.6738 0.8209
No log 12.0 420 0.7422 0.5254 0.7422 0.8615
No log 12.0571 422 0.7496 0.5663 0.7496 0.8658
No log 12.1143 424 0.7034 0.5370 0.7034 0.8387
No log 12.1714 426 0.6943 0.5385 0.6943 0.8332
No log 12.2286 428 0.6662 0.5400 0.6662 0.8162
No log 12.2857 430 0.6496 0.5429 0.6496 0.8060
No log 12.3429 432 0.6406 0.5932 0.6406 0.8004
No log 12.4 434 0.6598 0.5700 0.6598 0.8123
No log 12.4571 436 0.6955 0.5894 0.6955 0.8340
No log 12.5143 438 0.7850 0.4911 0.7850 0.8860
No log 12.5714 440 0.8822 0.4796 0.8822 0.9393
No log 12.6286 442 0.9831 0.4359 0.9831 0.9915
No log 12.6857 444 0.9163 0.4860 0.9163 0.9572
No log 12.7429 446 0.7275 0.5766 0.7275 0.8529
No log 12.8 448 0.5959 0.6003 0.5959 0.7719
No log 12.8571 450 0.5766 0.6360 0.5766 0.7594
No log 12.9143 452 0.5695 0.6335 0.5695 0.7546
No log 12.9714 454 0.5758 0.6548 0.5758 0.7588
No log 13.0286 456 0.5886 0.6231 0.5886 0.7672
No log 13.0857 458 0.6165 0.6491 0.6165 0.7852
No log 13.1429 460 0.6028 0.6305 0.6028 0.7764
No log 13.2 462 0.5758 0.6252 0.5758 0.7588
No log 13.2571 464 0.5870 0.6175 0.5870 0.7662
No log 13.3143 466 0.5995 0.5921 0.5995 0.7743
No log 13.3714 468 0.6515 0.6147 0.6515 0.8072
No log 13.4286 470 0.7342 0.5756 0.7342 0.8568
No log 13.4857 472 0.7694 0.5644 0.7694 0.8771
No log 13.5429 474 0.7195 0.5684 0.7195 0.8482
No log 13.6 476 0.6687 0.5429 0.6687 0.8177
No log 13.6571 478 0.6697 0.5429 0.6697 0.8184
No log 13.7143 480 0.7065 0.5497 0.7065 0.8405
No log 13.7714 482 0.8062 0.5614 0.8062 0.8979
No log 13.8286 484 0.8533 0.5073 0.8533 0.9237
No log 13.8857 486 0.7986 0.5633 0.7986 0.8936
No log 13.9429 488 0.7004 0.5598 0.7004 0.8369
No log 14.0 490 0.6176 0.5585 0.6176 0.7859
No log 14.0571 492 0.5986 0.6020 0.5986 0.7737
No log 14.1143 494 0.5925 0.6198 0.5925 0.7697
No log 14.1714 496 0.6067 0.5425 0.6067 0.7789
No log 14.2286 498 0.6821 0.5854 0.6821 0.8259
0.2298 14.2857 500 0.7368 0.5636 0.7368 0.8584
0.2298 14.3429 502 0.7105 0.5745 0.7105 0.8429
0.2298 14.4 504 0.6680 0.5756 0.6680 0.8173
0.2298 14.4571 506 0.6381 0.5953 0.6381 0.7988
0.2298 14.5143 508 0.6431 0.5953 0.6431 0.8019
0.2298 14.5714 510 0.6652 0.5618 0.6652 0.8156
0.2298 14.6286 512 0.7033 0.5916 0.7033 0.8386
0.2298 14.6857 514 0.7415 0.5734 0.7415 0.8611
0.2298 14.7429 516 0.7307 0.5734 0.7307 0.8548
0.2298 14.8 518 0.6583 0.5666 0.6583 0.8114
0.2298 14.8571 520 0.6085 0.5858 0.6085 0.7801
0.2298 14.9143 522 0.6131 0.5953 0.6131 0.7830
0.2298 14.9714 524 0.6515 0.5811 0.6515 0.8071
0.2298 15.0286 526 0.7097 0.5684 0.7097 0.8425
0.2298 15.0857 528 0.7051 0.5567 0.7051 0.8397
0.2298 15.1429 530 0.6553 0.6165 0.6553 0.8095
0.2298 15.2 532 0.6186 0.5127 0.6186 0.7865
0.2298 15.2571 534 0.6006 0.5771 0.6006 0.7750
0.2298 15.3143 536 0.6000 0.5989 0.6000 0.7746
0.2298 15.3714 538 0.6306 0.5654 0.6306 0.7941
0.2298 15.4286 540 0.6433 0.5516 0.6433 0.8021
0.2298 15.4857 542 0.6488 0.5279 0.6488 0.8055
0.2298 15.5429 544 0.6580 0.5041 0.6580 0.8111
0.2298 15.6 546 0.6514 0.5173 0.6514 0.8071
0.2298 15.6571 548 0.6497 0.5173 0.6497 0.8061

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32 tensors)
