ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k14_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):

  • Loss: 0.6254
  • Qwk (quadratic weighted kappa): 0.5357
  • Mse (mean squared error): 0.6254
  • Rmse (root mean squared error): 0.7908
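The card does not document how these metrics were computed. QWK, MSE, and RMSE are commonly derived from the predicted and reference scores with scikit-learn; the sketch below is an assumption about such a setup, not the actual training code behind this card (the `compute_metrics` name and the rounding of predictions to integer score levels are illustrative).

```python
# Minimal sketch of a QWK / MSE / RMSE metrics function (assumed, not from the original training code).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    # Assumes a single-output regression head; squeeze to 1-D score vectors.
    preds = np.asarray(predictions).squeeze()
    labels = np.asarray(labels).squeeze()
    mse = mean_squared_error(labels, preds)
    # QWK needs discrete classes, so round the continuous scores (assumption).
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": np.sqrt(mse)}
```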

Model description

More information needed

Intended uses & limitations

More information needed
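Usage is not documented, but as a standard Transformers checkpoint the model can presumably be loaded with the Auto classes. A minimal sketch follows; the `AutoModelForSequenceClassification` head and the single-value regression output are assumptions inferred from the MSE/QWK metrics above, not confirmed by the card.

```python
# Minimal loading sketch (head type and output interpretation are assumptions).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k14_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

essay = "..."  # an Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumed single-value organization score
print(score)
```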

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
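These values map directly onto the standard `TrainingArguments` fields; a minimal sketch of an equivalent configuration is shown below (the `output_dir` and the evaluation/saving strategy are assumptions, since the card does not record them).

```python
# Sketch of TrainingArguments matching the listed hyperparameters
# (output_dir and evaluation/save strategy are assumptions).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-organization-scoring",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam betas/epsilon below are the library defaults listed on the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```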

Training results

The columns below report training loss (shown as "No log" until the first logging step at step 500), epoch, step, validation loss, QWK, MSE, and RMSE.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0286 2 4.0774 0.0132 4.0774 2.0193
No log 0.0571 4 2.2546 0.0970 2.2546 1.5015
No log 0.0857 6 1.5157 -0.0078 1.5157 1.2311
No log 0.1143 8 1.2634 0.0075 1.2634 1.1240
No log 0.1429 10 1.0750 0.1699 1.0750 1.0368
No log 0.1714 12 1.1212 0.1846 1.1212 1.0589
No log 0.2 14 1.0149 0.2492 1.0149 1.0074
No log 0.2286 16 0.9297 0.3014 0.9297 0.9642
No log 0.2571 18 1.0748 0.2777 1.0748 1.0367
No log 0.2857 20 1.2843 0.2908 1.2843 1.1333
No log 0.3143 22 0.9240 0.3450 0.9240 0.9613
No log 0.3429 24 0.8120 0.4024 0.8120 0.9011
No log 0.3714 26 0.7135 0.5183 0.7135 0.8447
No log 0.4 28 0.8133 0.3883 0.8133 0.9018
No log 0.4286 30 0.7425 0.6143 0.7425 0.8617
No log 0.4571 32 0.7008 0.5830 0.7008 0.8371
No log 0.4857 34 0.7905 0.4911 0.7905 0.8891
No log 0.5143 36 0.9017 0.5250 0.9017 0.9496
No log 0.5429 38 0.8920 0.5987 0.8920 0.9445
No log 0.5714 40 0.6676 0.6025 0.6676 0.8171
No log 0.6 42 0.8419 0.5090 0.8419 0.9176
No log 0.6286 44 0.7917 0.5532 0.7917 0.8897
No log 0.6571 46 0.6399 0.6619 0.6399 0.7999
No log 0.6857 48 0.7599 0.5536 0.7599 0.8717
No log 0.7143 50 0.8194 0.4964 0.8194 0.9052
No log 0.7429 52 0.7696 0.5823 0.7696 0.8773
No log 0.7714 54 0.6524 0.5805 0.6524 0.8077
No log 0.8 56 0.6589 0.6204 0.6589 0.8118
No log 0.8286 58 0.6442 0.6015 0.6442 0.8026
No log 0.8571 60 0.7078 0.5432 0.7078 0.8413
No log 0.8857 62 0.6873 0.5232 0.6873 0.8290
No log 0.9143 64 0.6025 0.6553 0.6025 0.7762
No log 0.9429 66 0.6118 0.6456 0.6118 0.7822
No log 0.9714 68 0.6711 0.6249 0.6711 0.8192
No log 1.0 70 0.9790 0.5531 0.9790 0.9894
No log 1.0286 72 1.1007 0.4882 1.1007 1.0491
No log 1.0571 74 0.7716 0.7007 0.7716 0.8784
No log 1.0857 76 0.6266 0.6474 0.6266 0.7916
No log 1.1143 78 0.5743 0.6348 0.5743 0.7578
No log 1.1429 80 0.5504 0.6602 0.5504 0.7419
No log 1.1714 82 0.5428 0.6697 0.5428 0.7367
No log 1.2 84 0.5572 0.7095 0.5572 0.7465
No log 1.2286 86 0.5734 0.6697 0.5734 0.7573
No log 1.2571 88 0.6532 0.6357 0.6532 0.8082
No log 1.2857 90 0.7152 0.6683 0.7152 0.8457
No log 1.3143 92 0.6598 0.6874 0.6598 0.8123
No log 1.3429 94 0.6755 0.6874 0.6755 0.8219
No log 1.3714 96 0.6865 0.6669 0.6865 0.8286
No log 1.4 98 0.6990 0.5908 0.6990 0.8361
No log 1.4286 100 0.6973 0.5542 0.6973 0.8350
No log 1.4571 102 0.7439 0.5483 0.7439 0.8625
No log 1.4857 104 0.7549 0.5833 0.7549 0.8688
No log 1.5143 106 0.6326 0.6350 0.6326 0.7953
No log 1.5429 108 0.6517 0.5879 0.6517 0.8073
No log 1.5714 110 0.6313 0.6233 0.6313 0.7945
No log 1.6 112 0.6208 0.6254 0.6208 0.7879
No log 1.6286 114 0.6080 0.6619 0.6080 0.7797
No log 1.6571 116 0.6194 0.6207 0.6194 0.7870
No log 1.6857 118 0.6395 0.6006 0.6395 0.7997
No log 1.7143 120 0.6716 0.5340 0.6716 0.8195
No log 1.7429 122 0.8372 0.5451 0.8372 0.9150
No log 1.7714 124 0.8038 0.6076 0.8038 0.8966
No log 1.8 126 0.6538 0.6674 0.6538 0.8086
No log 1.8286 128 0.6980 0.6301 0.6980 0.8355
No log 1.8571 130 0.7348 0.5845 0.7348 0.8572
No log 1.8857 132 0.6589 0.6581 0.6589 0.8118
No log 1.9143 134 0.6037 0.6932 0.6037 0.7770
No log 1.9429 136 0.6150 0.6470 0.6150 0.7842
No log 1.9714 138 0.6103 0.7071 0.6103 0.7812
No log 2.0 140 0.6119 0.6570 0.6119 0.7822
No log 2.0286 142 0.6602 0.6225 0.6602 0.8125
No log 2.0571 144 0.6055 0.6969 0.6055 0.7782
No log 2.0857 146 0.6147 0.6766 0.6147 0.7840
No log 2.1143 148 0.6096 0.6766 0.6096 0.7808
No log 2.1429 150 0.6079 0.7158 0.6079 0.7797
No log 2.1714 152 0.6542 0.6240 0.6542 0.8088
No log 2.2 154 0.6283 0.6082 0.6283 0.7927
No log 2.2286 156 0.6143 0.6623 0.6143 0.7838
No log 2.2571 158 0.5988 0.6488 0.5988 0.7738
No log 2.2857 160 0.6107 0.6135 0.6107 0.7815
No log 2.3143 162 0.6139 0.6628 0.6139 0.7835
No log 2.3429 164 0.6076 0.6557 0.6076 0.7795
No log 2.3714 166 0.5976 0.6884 0.5976 0.7730
No log 2.4 168 0.5859 0.6916 0.5859 0.7654
No log 2.4286 170 0.6348 0.5351 0.6348 0.7968
No log 2.4571 172 0.6772 0.5134 0.6772 0.8229
No log 2.4857 174 0.6672 0.5355 0.6672 0.8168
No log 2.5143 176 0.6419 0.5891 0.6419 0.8012
No log 2.5429 178 0.6667 0.6520 0.6667 0.8165
No log 2.5714 180 0.6661 0.6430 0.6661 0.8161
No log 2.6 182 0.6751 0.6818 0.6751 0.8216
No log 2.6286 184 0.6701 0.6692 0.6701 0.8186
No log 2.6571 186 0.6398 0.6796 0.6398 0.7999
No log 2.6857 188 0.6424 0.6298 0.6424 0.8015
No log 2.7143 190 0.6215 0.5988 0.6215 0.7883
No log 2.7429 192 0.6263 0.5626 0.6263 0.7914
No log 2.7714 194 0.6118 0.5868 0.6118 0.7822
No log 2.8 196 0.5978 0.6316 0.5978 0.7732
No log 2.8286 198 0.5933 0.6176 0.5933 0.7703
No log 2.8571 200 0.6141 0.6324 0.6141 0.7836
No log 2.8857 202 0.7117 0.6469 0.7117 0.8436
No log 2.9143 204 0.6459 0.6113 0.6459 0.8037
No log 2.9429 206 0.6437 0.6139 0.6437 0.8023
No log 2.9714 208 0.7706 0.5855 0.7706 0.8778
No log 3.0 210 0.7520 0.5670 0.7520 0.8672
No log 3.0286 212 0.6457 0.5770 0.6457 0.8035
No log 3.0571 214 0.6303 0.5954 0.6303 0.7939
No log 3.0857 216 0.6354 0.5671 0.6354 0.7971
No log 3.1143 218 0.6490 0.5770 0.6490 0.8056
No log 3.1429 220 0.6769 0.5876 0.6769 0.8228
No log 3.1714 222 0.6264 0.5971 0.6264 0.7914
No log 3.2 224 0.6254 0.6553 0.6254 0.7909
No log 3.2286 226 0.6277 0.6553 0.6277 0.7923
No log 3.2571 228 0.6340 0.6377 0.6340 0.7963
No log 3.2857 230 0.7262 0.5224 0.7262 0.8522
No log 3.3143 232 0.7484 0.5224 0.7484 0.8651
No log 3.3429 234 0.6622 0.6157 0.6622 0.8138
No log 3.3714 236 0.6498 0.6714 0.6498 0.8061
No log 3.4 238 0.6449 0.6560 0.6449 0.8031
No log 3.4286 240 0.6562 0.6345 0.6562 0.8101
No log 3.4571 242 0.6993 0.5793 0.6993 0.8362
No log 3.4857 244 0.7777 0.5802 0.7777 0.8819
No log 3.5143 246 0.6881 0.5636 0.6881 0.8295
No log 3.5429 248 0.6452 0.6288 0.6452 0.8032
No log 3.5714 250 0.7408 0.6301 0.7408 0.8607
No log 3.6 252 0.7265 0.6189 0.7265 0.8524
No log 3.6286 254 0.6349 0.6196 0.6349 0.7968
No log 3.6571 256 0.6641 0.5826 0.6641 0.8149
No log 3.6857 258 0.7126 0.5965 0.7126 0.8442
No log 3.7143 260 0.6807 0.6400 0.6807 0.8250
No log 3.7429 262 0.6913 0.7147 0.6913 0.8314
No log 3.7714 264 0.6685 0.6828 0.6685 0.8176
No log 3.8 266 0.6479 0.6363 0.6479 0.8050
No log 3.8286 268 0.7791 0.5812 0.7791 0.8827
No log 3.8571 270 0.7787 0.5632 0.7787 0.8825
No log 3.8857 272 0.6540 0.5865 0.6540 0.8087
No log 3.9143 274 0.6265 0.6207 0.6265 0.7915
No log 3.9429 276 0.6764 0.6272 0.6764 0.8224
No log 3.9714 278 0.7046 0.6262 0.7046 0.8394
No log 4.0 280 0.6525 0.6177 0.6525 0.8078
No log 4.0286 282 0.6053 0.6390 0.6053 0.7780
No log 4.0571 284 0.5990 0.6349 0.5990 0.7740
No log 4.0857 286 0.5933 0.6284 0.5933 0.7702
No log 4.1143 288 0.5942 0.6650 0.5942 0.7709
No log 4.1429 290 0.6049 0.6614 0.6049 0.7777
No log 4.1714 292 0.6513 0.5940 0.6513 0.8070
No log 4.2 294 0.6330 0.6197 0.6330 0.7956
No log 4.2286 296 0.6061 0.6333 0.6061 0.7785
No log 4.2571 298 0.6126 0.6038 0.6126 0.7827
No log 4.2857 300 0.6420 0.6361 0.6420 0.8013
No log 4.3143 302 0.7089 0.5820 0.7089 0.8420
No log 4.3429 304 0.6663 0.6564 0.6663 0.8163
No log 4.3714 306 0.5891 0.6634 0.5891 0.7675
No log 4.4 308 0.5720 0.6564 0.5720 0.7563
No log 4.4286 310 0.5965 0.5852 0.5965 0.7724
No log 4.4571 312 0.5887 0.6401 0.5887 0.7673
No log 4.4857 314 0.5759 0.6667 0.5759 0.7589
No log 4.5143 316 0.5810 0.6667 0.5810 0.7622
No log 4.5429 318 0.6172 0.5816 0.6172 0.7856
No log 4.5714 320 0.7613 0.5729 0.7613 0.8725
No log 4.6 322 0.7950 0.5832 0.7950 0.8916
No log 4.6286 324 0.6570 0.5922 0.6570 0.8105
No log 4.6571 326 0.6284 0.6365 0.6284 0.7927
No log 4.6857 328 0.7373 0.6280 0.7373 0.8587
No log 4.7143 330 0.7183 0.6115 0.7183 0.8475
No log 4.7429 332 0.6329 0.6197 0.6329 0.7956
No log 4.7714 334 0.6354 0.6070 0.6354 0.7971
No log 4.8 336 0.6459 0.5839 0.6459 0.8037
No log 4.8286 338 0.6275 0.6764 0.6275 0.7921
No log 4.8571 340 0.6434 0.6119 0.6434 0.8021
No log 4.8857 342 0.6600 0.6254 0.6600 0.8124
No log 4.9143 344 0.6256 0.6314 0.6256 0.7909
No log 4.9429 346 0.5945 0.6347 0.5945 0.7710
No log 4.9714 348 0.5683 0.6641 0.5683 0.7538
No log 5.0 350 0.5540 0.6712 0.5540 0.7443
No log 5.0286 352 0.5565 0.6664 0.5565 0.7460
No log 5.0571 354 0.6198 0.6990 0.6198 0.7873
No log 5.0857 356 0.5869 0.6900 0.5869 0.7661
No log 5.1143 358 0.5552 0.6956 0.5552 0.7451
No log 5.1429 360 0.5862 0.6573 0.5862 0.7657
No log 5.1714 362 0.5822 0.6599 0.5822 0.7630
No log 5.2 364 0.5612 0.6820 0.5612 0.7491
No log 5.2286 366 0.6558 0.6552 0.6558 0.8098
No log 5.2571 368 0.7028 0.6417 0.7028 0.8383
No log 5.2857 370 0.6427 0.6012 0.6427 0.8017
No log 5.3143 372 0.5756 0.6642 0.5756 0.7587
No log 5.3429 374 0.5979 0.6080 0.5979 0.7732
No log 5.3714 376 0.6376 0.5922 0.6376 0.7985
No log 5.4 378 0.6172 0.6281 0.6172 0.7856
No log 5.4286 380 0.5851 0.6642 0.5851 0.7649
No log 5.4571 382 0.6125 0.6617 0.6125 0.7827
No log 5.4857 384 0.6367 0.6127 0.6367 0.7979
No log 5.5143 386 0.6613 0.5822 0.6613 0.8132
No log 5.5429 388 0.6682 0.5872 0.6682 0.8174
No log 5.5714 390 0.6756 0.5835 0.6756 0.8220
No log 5.6 392 0.6642 0.5938 0.6642 0.8150
No log 5.6286 394 0.6259 0.5938 0.6259 0.7911
No log 5.6571 396 0.6044 0.6361 0.6044 0.7774
No log 5.6857 398 0.5906 0.6439 0.5906 0.7685
No log 5.7143 400 0.5982 0.6748 0.5982 0.7734
No log 5.7429 402 0.6030 0.6788 0.6030 0.7765
No log 5.7714 404 0.6109 0.5808 0.6109 0.7816
No log 5.8 406 0.6365 0.5797 0.6365 0.7978
No log 5.8286 408 0.6660 0.5832 0.6660 0.8161
No log 5.8571 410 0.6302 0.6217 0.6302 0.7938
No log 5.8857 412 0.6061 0.6365 0.6061 0.7785
No log 5.9143 414 0.5903 0.6292 0.5903 0.7683
No log 5.9429 416 0.5769 0.6302 0.5769 0.7595
No log 5.9714 418 0.5719 0.6097 0.5719 0.7562
No log 6.0 420 0.5765 0.5955 0.5765 0.7593
No log 6.0286 422 0.5801 0.6335 0.5801 0.7617
No log 6.0571 424 0.5595 0.6345 0.5595 0.7480
No log 6.0857 426 0.5592 0.6345 0.5592 0.7478
No log 6.1143 428 0.5736 0.5994 0.5736 0.7574
No log 6.1429 430 0.5889 0.6470 0.5889 0.7674
No log 6.1714 432 0.6277 0.6157 0.6277 0.7922
No log 6.2 434 0.6937 0.6053 0.6937 0.8329
No log 6.2286 436 0.6930 0.5729 0.6930 0.8325
No log 6.2571 438 0.6511 0.5582 0.6511 0.8069
No log 6.2857 440 0.6394 0.5046 0.6394 0.7996
No log 6.3143 442 0.6280 0.5492 0.6280 0.7925
No log 6.3429 444 0.6174 0.5713 0.6174 0.7858
No log 6.3714 446 0.6111 0.6196 0.6111 0.7817
No log 6.4 448 0.6184 0.5784 0.6184 0.7864
No log 6.4286 450 0.6217 0.6501 0.6217 0.7885
No log 6.4571 452 0.6147 0.6536 0.6147 0.7840
No log 6.4857 454 0.6006 0.6581 0.6006 0.7750
No log 6.5143 456 0.5935 0.6581 0.5935 0.7704
No log 6.5429 458 0.5911 0.6619 0.5911 0.7688
No log 6.5714 460 0.5961 0.6347 0.5961 0.7721
No log 6.6 462 0.5940 0.6649 0.5940 0.7707
No log 6.6286 464 0.5981 0.6292 0.5981 0.7734
No log 6.6571 466 0.6154 0.6005 0.6154 0.7845
No log 6.6857 468 0.6359 0.6080 0.6359 0.7974
No log 6.7143 470 0.6423 0.6416 0.6423 0.8014
No log 6.7429 472 0.6210 0.6667 0.6210 0.7881
No log 6.7714 474 0.5965 0.6788 0.5965 0.7724
No log 6.8 476 0.6083 0.6119 0.6083 0.7799
No log 6.8286 478 0.6456 0.5846 0.6456 0.8035
No log 6.8571 480 0.6381 0.5872 0.6381 0.7988
No log 6.8857 482 0.6366 0.5872 0.6366 0.7978
No log 6.9143 484 0.6307 0.5504 0.6307 0.7942
No log 6.9429 486 0.6053 0.5969 0.6053 0.7780
No log 6.9714 488 0.5839 0.6528 0.5839 0.7642
No log 7.0 490 0.5775 0.6762 0.5775 0.7599
No log 7.0286 492 0.5855 0.6602 0.5855 0.7652
No log 7.0571 494 0.6207 0.6217 0.6207 0.7878
No log 7.0857 496 0.6439 0.6237 0.6439 0.8024
No log 7.1143 498 0.7355 0.5543 0.7355 0.8576
0.2558 7.1429 500 0.7668 0.5766 0.7668 0.8757
0.2558 7.1714 502 0.7038 0.5173 0.7038 0.8389
0.2558 7.2 504 0.6575 0.5131 0.6575 0.8108
0.2558 7.2286 506 0.6434 0.5247 0.6434 0.8021
0.2558 7.2571 508 0.6310 0.5343 0.6310 0.7944
0.2558 7.2857 510 0.6254 0.5357 0.6254 0.7908

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • ~0.1B parameters (safetensors, F32 tensors)