ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k14_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card does not name it). It achieves the following results on the evaluation set:

  • Loss: 0.6650
  • Qwk: 0.5747
  • Mse: 0.6650
  • Rmse: 0.8155
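Loss and Mse coincide here because evaluation uses a mean-squared-error objective, and Rmse is simply its square root. Qwk is quadratic weighted kappa, a chance-corrected agreement metric commonly used for ordinal essay scores. A self-contained sketch of how both metrics are computed (pure Python; the label range passed in is an illustrative assumption, not taken from this card):

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Quadratic weighted kappa: chance-corrected agreement on ordinal labels."""
    k = max_rating - min_rating + 1
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0] * k for _ in range(k)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1
    # Expected counts under independence (outer product of the marginals)
    hist_true = Counter(t - min_rating for t in y_true)
    hist_pred = Counter(p - min_rating for p in y_pred)
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = (i - j) ** 2 / (k - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error: the square root of the Mse column above."""
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5
```

Perfect agreement gives a kappa of 1.0; any disagreement pushes it below 1.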

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
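The hyperparameters above map directly onto a standard transformers Trainer setup. A minimal sketch, with two caveats: num_labels=1 (a regression head) is an assumption consistent with the MSE/QWK metrics reported in this card, and the dataset placeholders cannot be filled in because the training data is not released:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Base checkpoint named in this card; num_labels=1 assumes a regression head.
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1
)
tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")

args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the log below evaluates every 2 steps
    eval_steps=2,
)

# train_dataset / eval_dataset are not published with this card
trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# trainer.train()
```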

Training results

In the table below, "No log" means the training loss had not yet been logged; the first logged training loss (0.261) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0286 2 3.7800 -0.0134 3.7800 1.9442
No log 0.0571 4 1.7482 0.0057 1.7482 1.3222
No log 0.0857 6 1.0656 0.1805 1.0656 1.0323
No log 0.1143 8 0.9972 0.3175 0.9972 0.9986
No log 0.1429 10 0.9967 0.2262 0.9967 0.9984
No log 0.1714 12 0.9876 0.2135 0.9876 0.9938
No log 0.2 14 1.0738 0.1398 1.0738 1.0362
No log 0.2286 16 1.1049 0.1603 1.1049 1.0512
No log 0.2571 18 1.0389 0.2187 1.0389 1.0193
No log 0.2857 20 1.1291 0.2359 1.1291 1.0626
No log 0.3143 22 1.0807 0.2179 1.0807 1.0396
No log 0.3429 24 0.9911 0.3465 0.9911 0.9956
No log 0.3714 26 0.9648 0.3992 0.9648 0.9823
No log 0.4 28 0.9086 0.4244 0.9086 0.9532
No log 0.4286 30 0.9257 0.4119 0.9257 0.9621
No log 0.4571 32 1.0159 0.1927 1.0159 1.0079
No log 0.4857 34 1.1140 0.1160 1.1140 1.0555
No log 0.5143 36 1.0959 0.1341 1.0959 1.0469
No log 0.5429 38 0.8307 0.4388 0.8307 0.9114
No log 0.5714 40 0.9219 0.5370 0.9219 0.9602
No log 0.6 42 1.0032 0.3961 1.0032 1.0016
No log 0.6286 44 0.7787 0.5288 0.7787 0.8824
No log 0.6571 46 0.7717 0.4619 0.7717 0.8785
No log 0.6857 48 0.7687 0.4854 0.7687 0.8767
No log 0.7143 50 0.7228 0.5820 0.7228 0.8502
No log 0.7429 52 0.7533 0.5512 0.7533 0.8679
No log 0.7714 54 0.7560 0.5909 0.7560 0.8695
No log 0.8 56 0.7525 0.5773 0.7525 0.8675
No log 0.8286 58 0.7874 0.5356 0.7874 0.8874
No log 0.8571 60 0.9508 0.4783 0.9508 0.9751
No log 0.8857 62 0.8717 0.4721 0.8717 0.9336
No log 0.9143 64 0.8068 0.5228 0.8068 0.8982
No log 0.9429 66 0.8236 0.5632 0.8236 0.9075
No log 0.9714 68 0.8947 0.4850 0.8947 0.9459
No log 1.0 70 0.9081 0.4856 0.9081 0.9529
No log 1.0286 72 0.8781 0.5069 0.8781 0.9371
No log 1.0571 74 0.8775 0.4086 0.8775 0.9368
No log 1.0857 76 0.8177 0.5887 0.8177 0.9043
No log 1.1143 78 0.7977 0.5923 0.7977 0.8931
No log 1.1429 80 0.8253 0.5386 0.8253 0.9084
No log 1.1714 82 0.7594 0.5334 0.7594 0.8714
No log 1.2 84 0.8524 0.4917 0.8524 0.9233
No log 1.2286 86 0.8990 0.4503 0.8990 0.9482
No log 1.2571 88 0.7760 0.6003 0.7760 0.8809
No log 1.2857 90 0.7139 0.5843 0.7139 0.8449
No log 1.3143 92 0.7276 0.5982 0.7276 0.8530
No log 1.3429 94 0.7007 0.5830 0.7007 0.8371
No log 1.3714 96 0.6705 0.6464 0.6705 0.8189
No log 1.4 98 0.6709 0.6147 0.6709 0.8191
No log 1.4286 100 0.6746 0.6001 0.6746 0.8213
No log 1.4571 102 0.6940 0.5934 0.6940 0.8331
No log 1.4857 104 0.6992 0.5934 0.6992 0.8362
No log 1.5143 106 0.6790 0.6147 0.6790 0.8240
No log 1.5429 108 0.6651 0.6482 0.6651 0.8155
No log 1.5714 110 0.6556 0.5626 0.6556 0.8097
No log 1.6 112 0.7183 0.5794 0.7183 0.8475
No log 1.6286 114 0.6633 0.5876 0.6633 0.8144
No log 1.6571 116 0.6036 0.6389 0.6036 0.7769
No log 1.6857 118 0.6186 0.6869 0.6186 0.7865
No log 1.7143 120 0.6036 0.6272 0.6036 0.7769
No log 1.7429 122 0.6141 0.6164 0.6141 0.7836
No log 1.7714 124 0.6274 0.5966 0.6274 0.7921
No log 1.8 126 0.6270 0.5972 0.6270 0.7918
No log 1.8286 128 0.6540 0.6265 0.6540 0.8087
No log 1.8571 130 0.7180 0.5927 0.7180 0.8473
No log 1.8857 132 0.6471 0.6510 0.6471 0.8044
No log 1.9143 134 0.6289 0.5966 0.6289 0.7930
No log 1.9429 136 0.6496 0.6049 0.6496 0.8060
No log 1.9714 138 0.6430 0.6239 0.6430 0.8019
No log 2.0 140 0.6860 0.6055 0.6860 0.8283
No log 2.0286 142 0.6830 0.5902 0.6830 0.8265
No log 2.0571 144 0.6852 0.5902 0.6852 0.8278
No log 2.0857 146 0.7131 0.5632 0.7131 0.8445
No log 2.1143 148 0.7791 0.5569 0.7791 0.8827
No log 2.1429 150 0.8075 0.5086 0.8075 0.8986
No log 2.1714 152 0.7960 0.4907 0.7960 0.8922
No log 2.2 154 0.7925 0.5261 0.7925 0.8902
No log 2.2286 156 0.7699 0.5343 0.7699 0.8774
No log 2.2571 158 0.7391 0.5135 0.7391 0.8597
No log 2.2857 160 0.7440 0.5287 0.7440 0.8625
No log 2.3143 162 0.7303 0.5557 0.7303 0.8546
No log 2.3429 164 0.7979 0.6118 0.7979 0.8933
No log 2.3714 166 0.8619 0.4923 0.8619 0.9284
No log 2.4 168 0.7768 0.5657 0.7768 0.8813
No log 2.4286 170 0.7745 0.4955 0.7745 0.8800
No log 2.4571 172 0.8667 0.4671 0.8667 0.9309
No log 2.4857 174 0.8495 0.5005 0.8495 0.9217
No log 2.5143 176 0.7740 0.4804 0.7740 0.8798
No log 2.5429 178 0.7713 0.5690 0.7713 0.8782
No log 2.5714 180 0.8125 0.4982 0.8125 0.9014
No log 2.6 182 0.9238 0.4369 0.9238 0.9611
No log 2.6286 184 1.0326 0.3848 1.0326 1.0162
No log 2.6571 186 1.0058 0.4216 1.0058 1.0029
No log 2.6857 188 0.8601 0.4455 0.8601 0.9274
No log 2.7143 190 0.8093 0.5331 0.8093 0.8996
No log 2.7429 192 0.7986 0.5835 0.7986 0.8936
No log 2.7714 194 0.8127 0.5810 0.8127 0.9015
No log 2.8 196 0.8679 0.5165 0.8679 0.9316
No log 2.8286 198 0.8459 0.5666 0.8459 0.9197
No log 2.8571 200 0.8109 0.6003 0.8109 0.9005
No log 2.8857 202 0.7732 0.5822 0.7732 0.8793
No log 2.9143 204 0.7761 0.4594 0.7761 0.8809
No log 2.9429 206 0.7980 0.4889 0.7980 0.8933
No log 2.9714 208 0.8082 0.5311 0.8082 0.8990
No log 3.0 210 0.7699 0.5716 0.7699 0.8774
No log 3.0286 212 0.7746 0.6102 0.7746 0.8801
No log 3.0571 214 0.7938 0.5413 0.7938 0.8910
No log 3.0857 216 0.7804 0.5098 0.7804 0.8834
No log 3.1143 218 0.7567 0.5746 0.7567 0.8699
No log 3.1429 220 0.7706 0.5771 0.7706 0.8778
No log 3.1714 222 0.7864 0.5516 0.7864 0.8868
No log 3.2 224 0.8651 0.5054 0.8651 0.9301
No log 3.2286 226 0.9532 0.4826 0.9532 0.9763
No log 3.2571 228 0.9154 0.4521 0.9154 0.9568
No log 3.2857 230 0.8679 0.5136 0.8679 0.9316
No log 3.3143 232 0.8405 0.5381 0.8405 0.9168
No log 3.3429 234 0.8243 0.5487 0.8243 0.9079
No log 3.3714 236 0.8255 0.5016 0.8255 0.9086
No log 3.4 238 0.8076 0.5552 0.8076 0.8987
No log 3.4286 240 0.7914 0.4869 0.7914 0.8896
No log 3.4571 242 0.7536 0.5886 0.7536 0.8681
No log 3.4857 244 0.7651 0.5428 0.7651 0.8747
No log 3.5143 246 0.7583 0.5428 0.7583 0.8708
No log 3.5429 248 0.7395 0.5902 0.7395 0.8599
No log 3.5714 250 0.7839 0.5072 0.7839 0.8854
No log 3.6 252 0.7765 0.5072 0.7765 0.8812
No log 3.6286 254 0.7477 0.5932 0.7477 0.8647
No log 3.6571 256 0.7389 0.5183 0.7389 0.8596
No log 3.6857 258 0.7414 0.5183 0.7414 0.8611
No log 3.7143 260 0.7626 0.5810 0.7626 0.8733
No log 3.7429 262 0.8679 0.4588 0.8679 0.9316
No log 3.7714 264 0.9001 0.4696 0.9001 0.9487
No log 3.8 266 0.8480 0.4824 0.8480 0.9209
No log 3.8286 268 0.8520 0.4824 0.8520 0.9230
No log 3.8571 270 0.8396 0.5093 0.8396 0.9163
No log 3.8857 272 0.8636 0.4974 0.8636 0.9293
No log 3.9143 274 0.8511 0.4217 0.8511 0.9226
No log 3.9429 276 0.8283 0.5366 0.8283 0.9101
No log 3.9714 278 0.7976 0.5928 0.7976 0.8931
No log 4.0 280 0.7710 0.5331 0.7710 0.8781
No log 4.0286 282 0.7540 0.5552 0.7540 0.8683
No log 4.0571 284 0.7327 0.5909 0.7327 0.8560
No log 4.0857 286 0.6891 0.6073 0.6891 0.8301
No log 4.1143 288 0.6809 0.6322 0.6809 0.8252
No log 4.1429 290 0.6955 0.5747 0.6955 0.8339
No log 4.1714 292 0.7159 0.5822 0.7159 0.8461
No log 4.2 294 0.7690 0.5404 0.7690 0.8769
No log 4.2286 296 0.7549 0.5969 0.7549 0.8688
No log 4.2571 298 0.7417 0.5640 0.7417 0.8612
No log 4.2857 300 0.7323 0.5640 0.7323 0.8557
No log 4.3143 302 0.7309 0.4878 0.7309 0.8550
No log 4.3429 304 0.7532 0.4216 0.7532 0.8679
No log 4.3714 306 0.8179 0.4946 0.8179 0.9044
No log 4.4 308 0.8687 0.4815 0.8687 0.9320
No log 4.4286 310 0.8400 0.4952 0.8400 0.9165
No log 4.4571 312 0.8127 0.4450 0.8127 0.9015
No log 4.4857 314 0.7630 0.4873 0.7630 0.8735
No log 4.5143 316 0.7147 0.5835 0.7147 0.8454
No log 4.5429 318 0.7040 0.6209 0.7040 0.8391
No log 4.5714 320 0.7126 0.6198 0.7126 0.8441
No log 4.6 322 0.7440 0.5637 0.7440 0.8625
No log 4.6286 324 0.8185 0.5455 0.8185 0.9047
No log 4.6571 326 0.9218 0.5306 0.9218 0.9601
No log 4.6857 328 0.8823 0.4799 0.8823 0.9393
No log 4.7143 330 0.7771 0.5433 0.7771 0.8815
No log 4.7429 332 0.7291 0.5142 0.7291 0.8539
No log 4.7714 334 0.7416 0.5063 0.7416 0.8612
No log 4.8 336 0.7366 0.5063 0.7366 0.8583
No log 4.8286 338 0.7077 0.5419 0.7077 0.8412
No log 4.8571 340 0.7025 0.6054 0.7025 0.8381
No log 4.8857 342 0.7325 0.5666 0.7325 0.8559
No log 4.9143 344 0.7987 0.5998 0.7987 0.8937
No log 4.9429 346 0.7804 0.5810 0.7804 0.8834
No log 4.9714 348 0.7495 0.5993 0.7495 0.8657
No log 5.0 350 0.7150 0.6076 0.7150 0.8456
No log 5.0286 352 0.6928 0.6227 0.6928 0.8323
No log 5.0571 354 0.6788 0.6339 0.6788 0.8239
No log 5.0857 356 0.6709 0.6035 0.6709 0.8191
No log 5.1143 358 0.6948 0.5678 0.6948 0.8335
No log 5.1429 360 0.7318 0.6429 0.7318 0.8554
No log 5.1714 362 0.7057 0.6118 0.7057 0.8400
No log 5.2 364 0.6550 0.6341 0.6550 0.8093
No log 5.2286 366 0.6563 0.6306 0.6563 0.8101
No log 5.2571 368 0.6721 0.6374 0.6721 0.8198
No log 5.2857 370 0.6871 0.6057 0.6871 0.8289
No log 5.3143 372 0.7344 0.5763 0.7344 0.8570
No log 5.3429 374 0.7791 0.5292 0.7791 0.8827
No log 5.3714 376 0.7886 0.5319 0.7886 0.8880
No log 5.4 378 0.7515 0.5446 0.7515 0.8669
No log 5.4286 380 0.7060 0.6165 0.7060 0.8402
No log 5.4571 382 0.7113 0.5524 0.7113 0.8434
No log 5.4857 384 0.7378 0.5513 0.7378 0.8590
No log 5.5143 386 0.7131 0.5784 0.7131 0.8445
No log 5.5429 388 0.7015 0.6537 0.7015 0.8376
No log 5.5714 390 0.7300 0.5637 0.7300 0.8544
No log 5.6 392 0.7215 0.5810 0.7215 0.8494
No log 5.6286 394 0.7249 0.5375 0.7249 0.8514
No log 5.6571 396 0.7375 0.4277 0.7375 0.8588
No log 5.6857 398 0.7231 0.5042 0.7231 0.8503
No log 5.7143 400 0.7188 0.5168 0.7188 0.8478
No log 5.7429 402 0.7323 0.5425 0.7323 0.8557
No log 5.7714 404 0.7293 0.5868 0.7293 0.8540
No log 5.8 406 0.7384 0.5774 0.7384 0.8593
No log 5.8286 408 0.7848 0.5067 0.7848 0.8859
No log 5.8571 410 0.8166 0.4840 0.8166 0.9037
No log 5.8857 412 0.7517 0.5528 0.7517 0.8670
No log 5.9143 414 0.7021 0.5631 0.7021 0.8379
No log 5.9429 416 0.6922 0.5644 0.6922 0.8320
No log 5.9714 418 0.7196 0.5467 0.7196 0.8483
No log 6.0 420 0.7413 0.5346 0.7413 0.8610
No log 6.0286 422 0.7526 0.5442 0.7526 0.8675
No log 6.0571 424 0.7370 0.5434 0.7370 0.8585
No log 6.0857 426 0.7270 0.5570 0.7270 0.8526
No log 6.1143 428 0.7201 0.5590 0.7201 0.8486
No log 6.1429 430 0.7173 0.5822 0.7173 0.8469
No log 6.1714 432 0.7133 0.6154 0.7133 0.8445
No log 6.2 434 0.7136 0.6154 0.7136 0.8447
No log 6.2286 436 0.7168 0.5678 0.7168 0.8466
No log 6.2571 438 0.7213 0.5763 0.7213 0.8493
No log 6.2857 440 0.7356 0.5752 0.7356 0.8577
No log 6.3143 442 0.7613 0.5752 0.7613 0.8726
No log 6.3429 444 0.7843 0.6138 0.7843 0.8856
No log 6.3714 446 0.7950 0.6237 0.7950 0.8916
No log 6.4 448 0.7882 0.5891 0.7882 0.8878
No log 6.4286 450 0.7861 0.5917 0.7861 0.8866
No log 6.4571 452 0.8134 0.4974 0.8134 0.9019
No log 6.4857 454 0.8308 0.5062 0.8308 0.9115
No log 6.5143 456 0.8070 0.5534 0.8070 0.8983
No log 6.5429 458 0.8026 0.5487 0.8026 0.8959
No log 6.5714 460 0.7544 0.5067 0.7544 0.8686
No log 6.6 462 0.6942 0.5657 0.6941 0.8332
No log 6.6286 464 0.6786 0.5735 0.6786 0.8238
No log 6.6571 466 0.6744 0.5626 0.6744 0.8212
No log 6.6857 468 0.6755 0.5626 0.6755 0.8219
No log 6.7143 470 0.6795 0.6018 0.6795 0.8243
No log 6.7429 472 0.6951 0.6007 0.6951 0.8337
No log 6.7714 474 0.7233 0.5510 0.7233 0.8505
No log 6.8 476 0.7372 0.5395 0.7372 0.8586
No log 6.8286 478 0.7422 0.5498 0.7422 0.8615
No log 6.8571 480 0.7018 0.5093 0.7018 0.8378
No log 6.8857 482 0.6736 0.5509 0.6736 0.8207
No log 6.9143 484 0.6594 0.5771 0.6594 0.8120
No log 6.9429 486 0.6443 0.5771 0.6443 0.8027
No log 6.9714 488 0.6319 0.6096 0.6319 0.7949
No log 7.0 490 0.6344 0.6256 0.6344 0.7965
No log 7.0286 492 0.6359 0.5959 0.6359 0.7974
No log 7.0571 494 0.6326 0.6154 0.6326 0.7954
No log 7.0857 496 0.6251 0.5882 0.6251 0.7906
No log 7.1143 498 0.6248 0.5759 0.6248 0.7904
0.261 7.1429 500 0.6726 0.6133 0.6726 0.8201
0.261 7.1714 502 0.7365 0.5981 0.7365 0.8582
0.261 7.2 504 0.7312 0.6082 0.7312 0.8551
0.261 7.2286 506 0.6831 0.5710 0.6831 0.8265
0.261 7.2571 508 0.6727 0.6096 0.6727 0.8202
0.261 7.2857 510 0.6608 0.6175 0.6608 0.8129
0.261 7.3143 512 0.6624 0.6113 0.6624 0.8139
0.261 7.3429 514 0.6921 0.6035 0.6921 0.8319
0.261 7.3714 516 0.6786 0.5869 0.6786 0.8238
0.261 7.4 518 0.6604 0.6096 0.6604 0.8127
0.261 7.4286 520 0.6650 0.5747 0.6650 0.8155
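Note that the final checkpoint (Qwk 0.5747 at step 520, reported at the top of this card) is not the best by Qwk: validation Qwk peaks at 0.6869 at step 118 (epoch 1.69). A small sketch of selecting the best row from such a log, using a few rows excerpted from the table above:

```python
# (epoch, step, validation_loss, qwk) rows excerpted from the log above
rows = [
    (1.6857, 118, 0.6186, 0.6869),  # best Qwk in the full log
    (5.2000, 364, 0.6550, 0.6341),
    (7.4286, 520, 0.6650, 0.5747),  # final checkpoint reported at the top
]

# Pick the checkpoint with the highest validation Qwk
best = max(rows, key=lambda r: r[3])
print(best)  # → (1.6857, 118, 0.6186, 0.6869)
```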

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
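Assuming a CUDA 11.8 environment (implied by the `+cu118` build tag), the pinned versions above can be installed roughly as follows; the index URL is PyTorch's standard cu118 wheel index:

```shell
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
```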
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k14_task5_organization

  • Finetuned from aubmindlab/bert-base-arabertv02