ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6720
  • Qwk (quadratic weighted kappa): 0.5809
  • Mse (mean squared error): 0.6720
  • Rmse (root mean squared error): 0.8198
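The metrics above can be reproduced with standard formulas. The sketch below is pure Python (the card does not include the actual evaluation code, so `quadratic_weighted_kappa` is a hypothetical helper): QWK compares observed versus chance-expected rating agreement with quadratic distance weights, and RMSE is simply the square root of MSE, which is why the reported Loss/Mse and Rmse values agree.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes=None):
    """Cohen's kappa with quadratic (i - j)^2 disagreement weights."""
    if n_classes is None:
        n_classes = max(max(y_true), max(y_pred)) + 1
    n = len(y_true)
    # Observed rating matrix O[i][j]: count of (true = i, predicted = j)
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    hist_true = [sum(row) for row in O]
    hist_pred = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2       # quadratic weight
            num += w * O[i][j]                            # observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n    # expected disagreement
    return 1.0 - num / den

# RMSE is the square root of MSE, matching the reported pair above:
rmse = round(math.sqrt(0.6720), 4)  # → 0.8198
```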

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
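For reference, a linear lr_scheduler decays the learning rate from its initial value to zero over training, optionally after a warmup ramp. A minimal pure-Python sketch, assuming zero warmup steps since none are listed in the hyperparameters:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear LR schedule: ramp up over warmup_steps, then decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining

# With no warmup, the rate simply decays linearly over training:
print(linear_lr(0, 1000))     # 2e-05 at the start
print(linear_lr(500, 1000))   # 1e-05 halfway through
print(linear_lr(1000, 1000))  # 0.0 at the end
```

This mirrors the behavior of `get_linear_schedule_with_warmup` in Transformers, which the Trainer uses when `lr_scheduler_type` is `linear`.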

Training results

In the table below, "No log" means the running training loss had not yet been logged at that step; the first logged value (0.2534) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1176 2 3.6098 -0.0302 3.6098 1.8999
No log 0.2353 4 2.1496 -0.0007 2.1496 1.4662
No log 0.3529 6 1.4897 -0.0078 1.4897 1.2205
No log 0.4706 8 1.1870 0.0731 1.1870 1.0895
No log 0.5882 10 1.2493 0.0153 1.2493 1.1177
No log 0.7059 12 1.4266 -0.0613 1.4266 1.1944
No log 0.8235 14 1.3606 0.0671 1.3606 1.1665
No log 0.9412 16 1.1623 0.1713 1.1623 1.0781
No log 1.0588 18 1.2217 0.2217 1.2217 1.1053
No log 1.1765 20 1.3274 0.0395 1.3274 1.1521
No log 1.2941 22 1.2306 0.1262 1.2306 1.1093
No log 1.4118 24 1.3071 0.1085 1.3071 1.1433
No log 1.5294 26 1.5518 0.1419 1.5518 1.2457
No log 1.6471 28 1.4969 0.1453 1.4969 1.2235
No log 1.7647 30 1.4156 0.1817 1.4156 1.1898
No log 1.8824 32 1.1531 0.1814 1.1531 1.0738
No log 2.0 34 0.8512 0.4857 0.8512 0.9226
No log 2.1176 36 0.8396 0.4318 0.8396 0.9163
No log 2.2353 38 0.9829 0.3470 0.9829 0.9914
No log 2.3529 40 0.7998 0.3826 0.7998 0.8943
No log 2.4706 42 0.8240 0.4976 0.8240 0.9077
No log 2.5882 44 1.1707 0.3864 1.1707 1.0820
No log 2.7059 46 1.0589 0.4042 1.0589 1.0290
No log 2.8235 48 0.7253 0.5923 0.7253 0.8517
No log 2.9412 50 0.7899 0.5135 0.7899 0.8888
No log 3.0588 52 0.9612 0.4773 0.9612 0.9804
No log 3.1765 54 0.8878 0.4978 0.8878 0.9422
No log 3.2941 56 0.7190 0.5763 0.7190 0.8479
No log 3.4118 58 0.8749 0.5825 0.8749 0.9354
No log 3.5294 60 0.9852 0.4952 0.9852 0.9925
No log 3.6471 62 0.8177 0.5412 0.8177 0.9043
No log 3.7647 64 0.7314 0.5776 0.7314 0.8552
No log 3.8824 66 0.9178 0.4255 0.9178 0.9580
No log 4.0 68 0.9445 0.4565 0.9445 0.9719
No log 4.1176 70 0.8176 0.5046 0.8176 0.9042
No log 4.2353 72 0.7877 0.4691 0.7877 0.8875
No log 4.3529 74 0.8403 0.5434 0.8403 0.9167
No log 4.4706 76 0.8309 0.5300 0.8309 0.9116
No log 4.5882 78 0.8245 0.5501 0.8245 0.9080
No log 4.7059 80 0.7821 0.5416 0.7821 0.8844
No log 4.8235 82 0.7782 0.5787 0.7782 0.8822
No log 4.9412 84 0.7641 0.5654 0.7641 0.8741
No log 5.0588 86 0.7555 0.5969 0.7555 0.8692
No log 5.1765 88 0.7364 0.6002 0.7364 0.8581
No log 5.2941 90 0.7318 0.5291 0.7318 0.8555
No log 5.4118 92 0.7204 0.5495 0.7204 0.8488
No log 5.5294 94 0.7410 0.6060 0.7410 0.8608
No log 5.6471 96 0.7031 0.5796 0.7031 0.8385
No log 5.7647 98 0.7130 0.5522 0.7130 0.8444
No log 5.8824 100 0.7224 0.5719 0.7224 0.8499
No log 6.0 102 0.7643 0.5677 0.7643 0.8743
No log 6.1176 104 0.7175 0.5304 0.7175 0.8471
No log 6.2353 106 0.7068 0.5770 0.7068 0.8407
No log 6.3529 108 0.7711 0.5364 0.7711 0.8781
No log 6.4706 110 0.7020 0.5759 0.7020 0.8379
No log 6.5882 112 0.7006 0.5785 0.7006 0.8370
No log 6.7059 114 0.7100 0.5463 0.7100 0.8426
No log 6.8235 116 0.6975 0.6364 0.6975 0.8351
No log 6.9412 118 0.6917 0.5759 0.6917 0.8317
No log 7.0588 120 0.7280 0.4657 0.7280 0.8532
No log 7.1765 122 0.7644 0.5257 0.7644 0.8743
No log 7.2941 124 0.6877 0.5438 0.6877 0.8293
No log 7.4118 126 0.6778 0.5833 0.6778 0.8233
No log 7.5294 128 0.6827 0.5722 0.6827 0.8262
No log 7.6471 130 0.6691 0.5894 0.6691 0.8180
No log 7.7647 132 0.8262 0.5720 0.8262 0.9090
No log 7.8824 134 0.7774 0.5829 0.7774 0.8817
No log 8.0 136 0.6568 0.6039 0.6568 0.8104
No log 8.1176 138 0.7855 0.5202 0.7855 0.8863
No log 8.2353 140 0.8902 0.5495 0.8902 0.9435
No log 8.3529 142 0.7966 0.4476 0.7966 0.8925
No log 8.4706 144 0.6718 0.5943 0.6718 0.8196
No log 8.5882 146 0.6767 0.5843 0.6767 0.8226
No log 8.7059 148 0.6873 0.5843 0.6873 0.8291
No log 8.8235 150 0.6880 0.5809 0.6880 0.8295
No log 8.9412 152 0.7526 0.5909 0.7526 0.8675
No log 9.0588 154 0.7507 0.5558 0.7507 0.8664
No log 9.1765 156 0.6865 0.5972 0.6865 0.8285
No log 9.2941 158 0.7164 0.5111 0.7164 0.8464
No log 9.4118 160 0.7728 0.5362 0.7728 0.8791
No log 9.5294 162 0.7543 0.5245 0.7543 0.8685
No log 9.6471 164 0.6899 0.5510 0.6899 0.8306
No log 9.7647 166 0.7182 0.5972 0.7182 0.8475
No log 9.8824 168 0.7187 0.5943 0.7187 0.8477
No log 10.0 170 0.7025 0.5943 0.7025 0.8381
No log 10.1176 172 0.7013 0.5943 0.7013 0.8374
No log 10.2353 174 0.7032 0.5943 0.7032 0.8386
No log 10.3529 176 0.7065 0.5288 0.7065 0.8406
No log 10.4706 178 0.7392 0.4968 0.7392 0.8597
No log 10.5882 180 0.7340 0.4498 0.7340 0.8567
No log 10.7059 182 0.7062 0.5510 0.7062 0.8404
No log 10.8235 184 0.7249 0.5785 0.7249 0.8514
No log 10.9412 186 0.7156 0.5785 0.7156 0.8459
No log 11.0588 188 0.6937 0.5785 0.6937 0.8329
No log 11.1765 190 0.6841 0.5044 0.6841 0.8271
No log 11.2941 192 0.6814 0.5300 0.6814 0.8255
No log 11.4118 194 0.7106 0.5098 0.7106 0.8430
No log 11.5294 196 0.7340 0.5547 0.7340 0.8568
No log 11.6471 198 0.6916 0.4965 0.6916 0.8316
No log 11.7647 200 0.7052 0.5866 0.7052 0.8398
No log 11.8824 202 0.7868 0.5595 0.7868 0.8870
No log 12.0 204 0.7869 0.5482 0.7869 0.8871
No log 12.1176 206 0.7261 0.5317 0.7261 0.8521
No log 12.2353 208 0.6952 0.4968 0.6952 0.8338
No log 12.3529 210 0.7544 0.5024 0.7544 0.8686
No log 12.4706 212 0.7637 0.5037 0.7637 0.8739
No log 12.5882 214 0.6901 0.5312 0.6901 0.8307
No log 12.7059 216 0.7011 0.6167 0.7011 0.8373
No log 12.8235 218 0.7320 0.6187 0.7320 0.8556
No log 12.9412 220 0.7028 0.5742 0.7028 0.8383
No log 13.0588 222 0.7042 0.5856 0.7042 0.8392
No log 13.1765 224 0.7075 0.5174 0.7075 0.8411
No log 13.2941 226 0.7115 0.5370 0.7115 0.8435
No log 13.4118 228 0.7191 0.5149 0.7191 0.8480
No log 13.5294 230 0.7122 0.5146 0.7122 0.8439
No log 13.6471 232 0.6963 0.5146 0.6963 0.8344
No log 13.7647 234 0.6834 0.5822 0.6834 0.8267
No log 13.8824 236 0.6877 0.5797 0.6877 0.8293
No log 14.0 238 0.6498 0.5996 0.6498 0.8061
No log 14.1176 240 0.6279 0.5726 0.6279 0.7924
No log 14.2353 242 0.6556 0.5548 0.6556 0.8097
No log 14.3529 244 0.6481 0.6429 0.6481 0.8050
No log 14.4706 246 0.6223 0.6437 0.6223 0.7888
No log 14.5882 248 0.6292 0.6460 0.6292 0.7933
No log 14.7059 250 0.6853 0.6228 0.6853 0.8279
No log 14.8235 252 0.6809 0.6138 0.6809 0.8252
No log 14.9412 254 0.6469 0.6427 0.6469 0.8043
No log 15.0588 256 0.6972 0.5222 0.6972 0.8350
No log 15.1765 258 0.7364 0.5678 0.7364 0.8582
No log 15.2941 260 0.6823 0.5926 0.6823 0.8260
No log 15.4118 262 0.6320 0.6518 0.6320 0.7950
No log 15.5294 264 0.6389 0.6380 0.6389 0.7993
No log 15.6471 266 0.6202 0.6814 0.6202 0.7875
No log 15.7647 268 0.6115 0.5892 0.6115 0.7820
No log 15.8824 270 0.6541 0.6533 0.6541 0.8088
No log 16.0 272 0.6408 0.6214 0.6408 0.8005
No log 16.1176 274 0.6608 0.6583 0.6608 0.8129
No log 16.2353 276 0.6839 0.5870 0.6839 0.8270
No log 16.3529 278 0.6788 0.6232 0.6788 0.8239
No log 16.4706 280 0.6080 0.6164 0.6080 0.7797
No log 16.5882 282 0.6021 0.6641 0.6021 0.7760
No log 16.7059 284 0.6285 0.6038 0.6285 0.7928
No log 16.8235 286 0.6102 0.6389 0.6102 0.7811
No log 16.9412 288 0.6084 0.6266 0.6084 0.7800
No log 17.0588 290 0.6202 0.6266 0.6202 0.7875
No log 17.1765 292 0.6243 0.6256 0.6243 0.7901
No log 17.2941 294 0.6494 0.6520 0.6494 0.8058
No log 17.4118 296 0.6209 0.6066 0.6209 0.7880
No log 17.5294 298 0.5960 0.6470 0.5960 0.7720
No log 17.6471 300 0.5998 0.6175 0.5998 0.7745
No log 17.7647 302 0.6049 0.6175 0.6049 0.7778
No log 17.8824 304 0.6062 0.6175 0.6062 0.7786
No log 18.0 306 0.6090 0.6546 0.6090 0.7804
No log 18.1176 308 0.6133 0.6407 0.6133 0.7831
No log 18.2353 310 0.6195 0.6167 0.6195 0.7871
No log 18.3529 312 0.6124 0.6227 0.6124 0.7825
No log 18.4706 314 0.6155 0.6227 0.6155 0.7845
No log 18.5882 316 0.6309 0.5831 0.6309 0.7943
No log 18.7059 318 0.6502 0.5833 0.6502 0.8064
No log 18.8235 320 0.6619 0.5160 0.6619 0.8136
No log 18.9412 322 0.6855 0.5548 0.6855 0.8279
No log 19.0588 324 0.7176 0.5450 0.7176 0.8471
No log 19.1765 326 0.7041 0.5450 0.7041 0.8391
No log 19.2941 328 0.6714 0.5069 0.6714 0.8194
No log 19.4118 330 0.6768 0.6165 0.6768 0.8226
No log 19.5294 332 0.7598 0.5810 0.7598 0.8717
No log 19.6471 334 0.7751 0.5566 0.7751 0.8804
No log 19.7647 336 0.7275 0.6064 0.7275 0.8530
No log 19.8824 338 0.6477 0.6291 0.6477 0.8048
No log 20.0 340 0.6122 0.6113 0.6122 0.7825
No log 20.1176 342 0.6115 0.6076 0.6115 0.7820
No log 20.2353 344 0.6192 0.6282 0.6192 0.7869
No log 20.3529 346 0.6127 0.6345 0.6127 0.7828
No log 20.4706 348 0.6368 0.6045 0.6368 0.7980
No log 20.5882 350 0.6411 0.6018 0.6411 0.8007
No log 20.7059 352 0.6154 0.6641 0.6154 0.7845
No log 20.8235 354 0.6507 0.6125 0.6507 0.8067
No log 20.9412 356 0.7120 0.5259 0.7120 0.8438
No log 21.0588 358 0.7271 0.5570 0.7271 0.8527
No log 21.1765 360 0.6685 0.6344 0.6685 0.8176
No log 21.2941 362 0.6260 0.6316 0.6260 0.7912
No log 21.4118 364 0.6258 0.6350 0.6258 0.7911
No log 21.5294 366 0.6372 0.6254 0.6372 0.7983
No log 21.6471 368 0.6689 0.5783 0.6689 0.8179
No log 21.7647 370 0.6675 0.5794 0.6675 0.8170
No log 21.8824 372 0.6451 0.6589 0.6451 0.8032
No log 22.0 374 0.6821 0.6118 0.6821 0.8259
No log 22.1176 376 0.6962 0.6091 0.6962 0.8344
No log 22.2353 378 0.6717 0.6073 0.6717 0.8196
No log 22.3529 380 0.6298 0.6256 0.6298 0.7936
No log 22.4706 382 0.6172 0.6619 0.6172 0.7856
No log 22.5882 384 0.6200 0.6537 0.6200 0.7874
No log 22.7059 386 0.6329 0.6325 0.6329 0.7955
No log 22.8235 388 0.6471 0.6325 0.6471 0.8044
No log 22.9412 390 0.6382 0.6325 0.6382 0.7989
No log 23.0588 392 0.6143 0.6460 0.6143 0.7838
No log 23.1765 394 0.6144 0.6564 0.6144 0.7838
No log 23.2941 396 0.6137 0.6526 0.6137 0.7834
No log 23.4118 398 0.6191 0.6564 0.6191 0.7868
No log 23.5294 400 0.6118 0.6564 0.6118 0.7822
No log 23.6471 402 0.6236 0.6060 0.6236 0.7897
No log 23.7647 404 0.6139 0.6164 0.6139 0.7835
No log 23.8824 406 0.5977 0.6720 0.5977 0.7731
No log 24.0 408 0.6121 0.6066 0.6121 0.7824
No log 24.1176 410 0.6123 0.5964 0.6123 0.7825
No log 24.2353 412 0.6014 0.6697 0.6014 0.7755
No log 24.3529 414 0.6461 0.6035 0.6461 0.8038
No log 24.4706 416 0.6723 0.6362 0.6723 0.8199
No log 24.5882 418 0.6489 0.6035 0.6489 0.8055
No log 24.7059 420 0.6180 0.6683 0.6180 0.7861
No log 24.8235 422 0.5980 0.6186 0.5980 0.7733
No log 24.9412 424 0.6419 0.6347 0.6419 0.8012
No log 25.0588 426 0.7506 0.6371 0.7506 0.8664
No log 25.1765 428 0.7587 0.6097 0.7587 0.8710
No log 25.2941 430 0.6981 0.6340 0.6981 0.8355
No log 25.4118 432 0.6235 0.6380 0.6235 0.7896
No log 25.5294 434 0.6078 0.6460 0.6078 0.7796
No log 25.6471 436 0.6259 0.5871 0.6259 0.7911
No log 25.7647 438 0.6448 0.5656 0.6448 0.8030
No log 25.8824 440 0.6504 0.5886 0.6504 0.8065
No log 26.0 442 0.6489 0.6096 0.6489 0.8055
No log 26.1176 444 0.6476 0.6335 0.6476 0.8047
No log 26.2353 446 0.6565 0.6325 0.6565 0.8102
No log 26.3529 448 0.6536 0.6187 0.6536 0.8085
No log 26.4706 450 0.6357 0.6177 0.6357 0.7973
No log 26.5882 452 0.6130 0.6697 0.6130 0.7829
No log 26.7059 454 0.6338 0.6281 0.6338 0.7961
No log 26.8235 456 0.6320 0.6107 0.6320 0.7950
No log 26.9412 458 0.6276 0.6237 0.6276 0.7922
No log 27.0588 460 0.6442 0.6479 0.6442 0.8026
No log 27.1765 462 0.6621 0.6561 0.6621 0.8137
No log 27.2941 464 0.6543 0.6858 0.6543 0.8089
No log 27.4118 466 0.6230 0.6440 0.6230 0.7893
No log 27.5294 468 0.6156 0.6233 0.6156 0.7846
No log 27.6471 470 0.6046 0.6368 0.6046 0.7776
No log 27.7647 472 0.5987 0.6368 0.5987 0.7738
No log 27.8824 474 0.5972 0.6452 0.5972 0.7728
No log 28.0 476 0.5980 0.6813 0.5980 0.7733
No log 28.1176 478 0.5965 0.6164 0.5965 0.7723
No log 28.2353 480 0.6064 0.6154 0.6064 0.7787
No log 28.3529 482 0.6242 0.6035 0.6242 0.7901
No log 28.4706 484 0.6561 0.6241 0.6561 0.8100
No log 28.5882 486 0.6451 0.6133 0.6451 0.8032
No log 28.7059 488 0.6277 0.6143 0.6277 0.7923
No log 28.8235 490 0.6069 0.6154 0.6069 0.7790
No log 28.9412 492 0.6030 0.6272 0.6030 0.7765
No log 29.0588 494 0.6216 0.6229 0.6216 0.7884
No log 29.1765 496 0.6418 0.6102 0.6418 0.8011
No log 29.2941 498 0.6434 0.6102 0.6434 0.8021
0.2534 29.4118 500 0.6267 0.6229 0.6267 0.7916
0.2534 29.5294 502 0.6171 0.6133 0.6171 0.7856
0.2534 29.6471 504 0.6123 0.6452 0.6123 0.7825
0.2534 29.7647 506 0.6148 0.6745 0.6148 0.7841
0.2534 29.8824 508 0.6258 0.6572 0.6258 0.7911
0.2534 30.0 510 0.6584 0.5578 0.6584 0.8114
0.2534 30.1176 512 0.6749 0.5578 0.6749 0.8215
0.2534 30.2353 514 0.6460 0.5928 0.6460 0.8038
0.2534 30.3529 516 0.5904 0.6584 0.5904 0.7684
0.2534 30.4706 518 0.5851 0.6123 0.5851 0.7649
0.2534 30.5882 520 0.5801 0.6518 0.5801 0.7616
0.2534 30.7059 522 0.5757 0.6737 0.5757 0.7588
0.2534 30.8235 524 0.5961 0.6535 0.5961 0.7721
0.2534 30.9412 526 0.6020 0.6508 0.6020 0.7759
0.2534 31.0588 528 0.6187 0.6508 0.6187 0.7866
0.2534 31.1765 530 0.6009 0.6401 0.6009 0.7752
0.2534 31.2941 532 0.6009 0.6297 0.6009 0.7752
0.2534 31.4118 534 0.6136 0.6186 0.6136 0.7833
0.2534 31.5294 536 0.6133 0.6186 0.6133 0.7831
0.2534 31.6471 538 0.5893 0.6442 0.5893 0.7677
0.2534 31.7647 540 0.5909 0.6401 0.5909 0.7687
0.2534 31.8824 542 0.5906 0.6401 0.5906 0.7685
0.2534 32.0 544 0.5909 0.6401 0.5909 0.7687
0.2534 32.1176 546 0.5879 0.6499 0.5879 0.7668
0.2534 32.2353 548 0.6012 0.6508 0.6012 0.7754
0.2534 32.3529 550 0.6185 0.6334 0.6185 0.7864
0.2534 32.4706 552 0.6053 0.6401 0.6053 0.7780
0.2534 32.5882 554 0.6081 0.6667 0.6081 0.7798
0.2534 32.7059 556 0.6057 0.6667 0.6057 0.7783
0.2534 32.8235 558 0.6084 0.6526 0.6084 0.7800
0.2534 32.9412 560 0.6110 0.6553 0.6110 0.7817
0.2534 33.0588 562 0.6271 0.6581 0.6271 0.7919
0.2534 33.1765 564 0.6500 0.6634 0.6500 0.8062
0.2534 33.2941 566 0.6570 0.6464 0.6570 0.8105
0.2534 33.4118 568 0.6571 0.6207 0.6571 0.8106
0.2534 33.5294 570 0.6651 0.5678 0.6651 0.8155
0.2534 33.6471 572 0.6517 0.6370 0.6517 0.8073
0.2534 33.7647 574 0.6409 0.5833 0.6409 0.8005
0.2534 33.8824 576 0.6335 0.5529 0.6335 0.7959
0.2534 34.0 578 0.6285 0.5950 0.6285 0.7928
0.2534 34.1176 580 0.6346 0.6433 0.6346 0.7966
0.2534 34.2353 582 0.6475 0.6104 0.6475 0.8046
0.2534 34.3529 584 0.6450 0.6297 0.6450 0.8031
0.2534 34.4706 586 0.6468 0.6297 0.6468 0.8042
0.2534 34.5882 588 0.6393 0.6297 0.6393 0.7995
0.2534 34.7059 590 0.6385 0.5808 0.6385 0.7990
0.2534 34.8235 592 0.6475 0.5396 0.6475 0.8047
0.2534 34.9412 594 0.6609 0.5492 0.6609 0.8129
0.2534 35.0588 596 0.6667 0.5492 0.6667 0.8165
0.2534 35.1765 598 0.6720 0.5809 0.6720 0.8198
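Note that the final checkpoint (epoch ~35.2, QWK 0.5809) is not the best in the table: validation QWK peaks at 0.6858 around epoch 27.3. When deploying, selecting the checkpoint with the highest validation QWK is usually preferable. A sketch using a few rows copied from the table above:

```python
# (epoch, validation QWK) pairs taken from the training results table
checkpoints = [
    (14.7059, 0.6228),
    (15.6471, 0.6814),
    (27.2941, 0.6858),
    (35.1765, 0.5809),  # final row reported above
]

# Pick the checkpoint with the highest validation QWK
best_epoch, best_qwk = max(checkpoints, key=lambda row: row[1])
print(best_epoch, best_qwk)  # → 27.2941 0.6858
```

The Transformers Trainer can do this automatically via `load_best_model_at_end=True` together with `metric_for_best_model` in `TrainingArguments`.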

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)
