ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7185
  • Qwk: 0.5359
  • Mse: 0.7185
  • Rmse: 0.8476
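Qwk is quadratic weighted kappa, a standard agreement metric for ordinal labels such as essay scores; Mse is mean squared error and Rmse its square root (note that Loss equals Mse, so MSE appears to be the training objective). As a minimal pure-Python sketch of how these metrics are computed (not the exact evaluation code behind this card):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed label-pair counts and the marginal histograms.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    hist_true = [0] * n_classes
    hist_pred = [0] * n_classes
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
        hist_true[t] += 1
        hist_pred[p] += 1
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2     # quadratic disagreement penalty
            expected = hist_true[i] * hist_pred[j] / n  # chance-level agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error between true and predicted scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement gives a kappa of 1.0 and chance-level agreement gives 0; a value around 0.54, as reported above, indicates moderate agreement with the reference scores.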

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
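The linear lr_scheduler_type means the learning rate decays linearly from 2e-05 at the start of training to 0 at the final step. A minimal sketch of that schedule (the total step count is an assumption for illustration, since the card does not state it; the standard Transformers linear schedule also supports a warmup phase, 0 by default):

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear decay schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr (unused here: warmup_steps=0).
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at the last step.
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)
```

For example, halfway through training the learning rate has decayed to 1e-05, and it reaches 0 exactly at the final step.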

Training results

Training loss was logged every 500 steps, so rows before step 500 show "No log"; the model was evaluated on the validation set every 2 training steps.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0952 2 3.9038 0.0069 3.9038 1.9758
No log 0.1905 4 2.0805 0.0727 2.0805 1.4424
No log 0.2857 6 1.4980 -0.0078 1.4980 1.2239
No log 0.3810 8 1.5295 0.0416 1.5295 1.2367
No log 0.4762 10 1.2917 0.0 1.2917 1.1366
No log 0.5714 12 1.0846 0.2416 1.0846 1.0414
No log 0.6667 14 1.0337 0.3375 1.0337 1.0167
No log 0.7619 16 1.1522 0.0462 1.1522 1.0734
No log 0.8571 18 1.1922 0.1233 1.1922 1.0919
No log 0.9524 20 1.2788 0.0 1.2788 1.1308
No log 1.0476 22 1.5256 0.0 1.5256 1.2352
No log 1.1429 24 1.5035 0.0 1.5035 1.2262
No log 1.2381 26 1.3471 0.0 1.3471 1.1606
No log 1.3333 28 1.1811 0.1142 1.1811 1.0868
No log 1.4286 30 1.1167 0.1564 1.1167 1.0567
No log 1.5238 32 1.0004 0.3117 1.0004 1.0002
No log 1.6190 34 0.9750 0.3876 0.9750 0.9874
No log 1.7143 36 1.0063 0.3272 1.0063 1.0031
No log 1.8095 38 1.0309 0.3830 1.0309 1.0153
No log 1.9048 40 1.1358 0.2804 1.1358 1.0657
No log 2.0 42 1.4653 0.1216 1.4653 1.2105
No log 2.0952 44 1.5783 0.1501 1.5783 1.2563
No log 2.1905 46 1.2586 0.2030 1.2586 1.1219
No log 2.2857 48 0.9887 0.3332 0.9887 0.9944
No log 2.3810 50 1.0389 0.3014 1.0389 1.0193
No log 2.4762 52 1.0146 0.2991 1.0146 1.0073
No log 2.5714 54 1.0121 0.3499 1.0121 1.0060
No log 2.6667 56 1.0788 0.2416 1.0788 1.0386
No log 2.7619 58 1.3065 0.2326 1.3065 1.1430
No log 2.8571 60 1.2783 0.1756 1.2783 1.1306
No log 2.9524 62 1.1695 0.2062 1.1695 1.0814
No log 3.0476 64 1.1142 0.1952 1.1142 1.0555
No log 3.1429 66 1.0067 0.2850 1.0067 1.0034
No log 3.2381 68 0.9335 0.4338 0.9335 0.9662
No log 3.3333 70 0.9222 0.4826 0.9222 0.9603
No log 3.4286 72 0.9431 0.4152 0.9431 0.9711
No log 3.5238 74 0.9813 0.4123 0.9813 0.9906
No log 3.6190 76 0.9432 0.4545 0.9432 0.9712
No log 3.7143 78 0.8565 0.4817 0.8565 0.9255
No log 3.8095 80 0.8627 0.4817 0.8627 0.9288
No log 3.9048 82 0.7929 0.5244 0.7929 0.8905
No log 4.0 84 0.7517 0.4507 0.7517 0.8670
No log 4.0952 86 0.7752 0.4115 0.7752 0.8804
No log 4.1905 88 0.7534 0.3693 0.7534 0.8680
No log 4.2857 90 0.7662 0.4861 0.7662 0.8753
No log 4.3810 92 0.7608 0.4872 0.7608 0.8723
No log 4.4762 94 0.7561 0.4271 0.7561 0.8695
No log 4.5714 96 0.8939 0.4078 0.8939 0.9455
No log 4.6667 98 0.8163 0.4613 0.8163 0.9035
No log 4.7619 100 0.7652 0.5898 0.7652 0.8748
No log 4.8571 102 0.7814 0.5898 0.7814 0.8840
No log 4.9524 104 0.7525 0.6491 0.7525 0.8675
No log 5.0476 106 0.7626 0.5985 0.7626 0.8733
No log 5.1429 108 0.7776 0.5302 0.7776 0.8818
No log 5.2381 110 0.7843 0.6094 0.7843 0.8856
No log 5.3333 112 0.8797 0.5032 0.8797 0.9379
No log 5.4286 114 1.3366 0.3936 1.3366 1.1561
No log 5.5238 116 1.3920 0.3437 1.3920 1.1798
No log 5.6190 118 0.9973 0.4805 0.9973 0.9987
No log 5.7143 120 0.7519 0.4832 0.7519 0.8671
No log 5.8095 122 0.8152 0.4480 0.8152 0.9029
No log 5.9048 124 0.8426 0.4224 0.8426 0.9180
No log 6.0 126 0.7353 0.4774 0.7353 0.8575
No log 6.0952 128 0.7610 0.5688 0.7610 0.8723
No log 6.1905 130 0.8788 0.5061 0.8788 0.9374
No log 6.2857 132 0.7920 0.5455 0.7920 0.8899
No log 6.3810 134 0.6835 0.6352 0.6835 0.8267
No log 6.4762 136 0.7174 0.6125 0.7174 0.8470
No log 6.5714 138 0.7036 0.6659 0.7036 0.8388
No log 6.6667 140 0.7337 0.6170 0.7337 0.8566
No log 6.7619 142 0.7668 0.6239 0.7668 0.8757
No log 6.8571 144 0.9946 0.5160 0.9946 0.9973
No log 6.9524 146 1.0781 0.4681 1.0781 1.0383
No log 7.0476 148 0.9800 0.4270 0.9800 0.9900
No log 7.1429 150 0.7927 0.4995 0.7927 0.8903
No log 7.2381 152 0.7563 0.5895 0.7563 0.8697
No log 7.3333 154 0.7590 0.5471 0.7590 0.8712
No log 7.4286 156 0.7901 0.4982 0.7901 0.8889
No log 7.5238 158 0.8238 0.4821 0.8238 0.9076
No log 7.6190 160 0.8021 0.5064 0.8021 0.8956
No log 7.7143 162 0.7467 0.5523 0.7467 0.8641
No log 7.8095 164 0.7458 0.5518 0.7458 0.8636
No log 7.9048 166 0.8288 0.5340 0.8288 0.9104
No log 8.0 168 0.8864 0.4930 0.8864 0.9415
No log 8.0952 170 0.8390 0.5709 0.8390 0.9160
No log 8.1905 172 0.7972 0.6055 0.7972 0.8929
No log 8.2857 174 0.8460 0.6067 0.8460 0.9198
No log 8.3810 176 0.7433 0.5960 0.7433 0.8621
No log 8.4762 178 0.7351 0.5304 0.7351 0.8574
No log 8.5714 180 0.7298 0.4861 0.7298 0.8543
No log 8.6667 182 0.7414 0.5017 0.7414 0.8610
No log 8.7619 184 0.7797 0.4660 0.7797 0.8830
No log 8.8571 186 0.8112 0.4065 0.8112 0.9007
No log 8.9524 188 0.8086 0.4810 0.8086 0.8992
No log 9.0476 190 0.8489 0.4522 0.8489 0.9214
No log 9.1429 192 0.9504 0.4592 0.9504 0.9749
No log 9.2381 194 0.8677 0.5062 0.8677 0.9315
No log 9.3333 196 0.7908 0.4867 0.7908 0.8893
No log 9.4286 198 0.7428 0.5648 0.7428 0.8619
No log 9.5238 200 0.7385 0.5648 0.7385 0.8593
No log 9.6190 202 0.7693 0.5257 0.7693 0.8771
No log 9.7143 204 0.7493 0.6460 0.7493 0.8656
No log 9.8095 206 0.7741 0.5981 0.7741 0.8798
No log 9.9048 208 0.7985 0.5534 0.7985 0.8936
No log 10.0 210 0.7572 0.6460 0.7572 0.8702
No log 10.0952 212 0.7767 0.5062 0.7767 0.8813
No log 10.1905 214 0.7531 0.5291 0.7531 0.8678
No log 10.2857 216 0.7744 0.5487 0.7744 0.8800
No log 10.3810 218 0.7924 0.5268 0.7924 0.8902
No log 10.4762 220 0.7838 0.5472 0.7838 0.8853
No log 10.5714 222 0.8649 0.5030 0.8649 0.9300
No log 10.6667 224 1.0247 0.4592 1.0247 1.0123
No log 10.7619 226 0.9558 0.5183 0.9558 0.9776
No log 10.8571 228 0.8680 0.5462 0.8680 0.9317
No log 10.9524 230 0.8295 0.5216 0.8295 0.9108
No log 11.0476 232 0.8600 0.5215 0.8600 0.9274
No log 11.1429 234 0.9050 0.4725 0.9050 0.9513
No log 11.2381 236 0.9026 0.5082 0.9026 0.9500
No log 11.3333 238 0.8641 0.5338 0.8641 0.9296
No log 11.4286 240 0.8869 0.5598 0.8869 0.9418
No log 11.5238 242 0.8788 0.5557 0.8788 0.9375
No log 11.6190 244 0.9058 0.5355 0.9058 0.9517
No log 11.7143 246 0.9201 0.5444 0.9201 0.9592
No log 11.8095 248 0.8595 0.5640 0.8595 0.9271
No log 11.9048 250 0.8330 0.5738 0.8330 0.9127
No log 12.0 252 0.8178 0.5345 0.8178 0.9043
No log 12.0952 254 0.8193 0.5160 0.8193 0.9051
No log 12.1905 256 0.7843 0.6068 0.7843 0.8856
No log 12.2857 258 0.7690 0.5489 0.7690 0.8769
No log 12.3810 260 0.7640 0.5378 0.7640 0.8741
No log 12.4762 262 0.7248 0.6167 0.7248 0.8514
No log 12.5714 264 0.7151 0.6051 0.7151 0.8456
No log 12.6667 266 0.7173 0.6414 0.7173 0.8470
No log 12.7619 268 0.7791 0.5553 0.7791 0.8827
No log 12.8571 270 0.9180 0.5374 0.9180 0.9581
No log 12.9524 272 0.9069 0.5181 0.9069 0.9523
No log 13.0476 274 0.7752 0.5827 0.7752 0.8805
No log 13.1429 276 0.7091 0.6025 0.7091 0.8421
No log 13.2381 278 0.7062 0.5923 0.7062 0.8404
No log 13.3333 280 0.7149 0.5786 0.7149 0.8455
No log 13.4286 282 0.7323 0.5751 0.7323 0.8558
No log 13.5238 284 0.7516 0.5397 0.7516 0.8669
No log 13.6190 286 0.7893 0.5425 0.7893 0.8884
No log 13.7143 288 0.8152 0.5211 0.8152 0.9029
No log 13.8095 290 0.7889 0.6180 0.7889 0.8882
No log 13.9048 292 0.7655 0.5846 0.7655 0.8749
No log 14.0 294 0.7636 0.5867 0.7636 0.8739
No log 14.0952 296 0.7712 0.5867 0.7712 0.8782
No log 14.1905 298 0.7776 0.5741 0.7776 0.8818
No log 14.2857 300 0.7781 0.5629 0.7781 0.8821
No log 14.3810 302 0.7664 0.5455 0.7664 0.8755
No log 14.4762 304 0.7822 0.5070 0.7822 0.8844
No log 14.5714 306 0.7642 0.5552 0.7642 0.8742
No log 14.6667 308 0.7673 0.5413 0.7673 0.8760
No log 14.7619 310 0.8214 0.4694 0.8214 0.9063
No log 14.8571 312 0.8566 0.4885 0.8566 0.9255
No log 14.9524 314 0.7904 0.4828 0.7904 0.8890
No log 15.0476 316 0.7075 0.5692 0.7075 0.8411
No log 15.1429 318 0.7067 0.6215 0.7067 0.8407
No log 15.2381 320 0.7278 0.6260 0.7278 0.8531
No log 15.3333 322 0.8152 0.5579 0.8152 0.9029
No log 15.4286 324 0.9586 0.4803 0.9586 0.9791
No log 15.5238 326 0.9655 0.4794 0.9655 0.9826
No log 15.6190 328 0.8349 0.5034 0.8349 0.9137
No log 15.7143 330 0.7529 0.6142 0.7529 0.8677
No log 15.8095 332 0.7640 0.6097 0.7640 0.8741
No log 15.9048 334 0.7494 0.5773 0.7494 0.8657
No log 16.0 336 0.7681 0.5683 0.7681 0.8764
No log 16.0952 338 0.8011 0.5728 0.8011 0.8951
No log 16.1905 340 0.8041 0.6142 0.8041 0.8967
No log 16.2857 342 0.7993 0.6269 0.7993 0.8940
No log 16.3810 344 0.7798 0.6010 0.7798 0.8831
No log 16.4762 346 0.7679 0.5567 0.7679 0.8763
No log 16.5714 348 0.7570 0.5363 0.7570 0.8701
No log 16.6667 350 0.7619 0.5474 0.7619 0.8729
No log 16.7619 352 0.7713 0.5363 0.7713 0.8783
No log 16.8571 354 0.8154 0.5360 0.8154 0.9030
No log 16.9524 356 0.8356 0.5247 0.8356 0.9141
No log 17.0476 358 0.8325 0.5247 0.8325 0.9124
No log 17.1429 360 0.8078 0.4990 0.8078 0.8988
No log 17.2381 362 0.8421 0.5247 0.8421 0.9177
No log 17.3333 364 0.8253 0.5175 0.8253 0.9085
No log 17.4286 366 0.8183 0.4865 0.8183 0.9046
No log 17.5238 368 0.8447 0.5227 0.8447 0.9191
No log 17.6190 370 0.8153 0.5683 0.8153 0.9030
No log 17.7143 372 0.8053 0.5797 0.8053 0.8974
No log 17.8095 374 0.8722 0.5217 0.8722 0.9339
No log 17.9048 376 0.9372 0.4910 0.9372 0.9681
No log 18.0 378 0.9023 0.4910 0.9023 0.9499
No log 18.0952 380 0.8526 0.5216 0.8525 0.9233
No log 18.1905 382 0.8031 0.5532 0.8031 0.8962
No log 18.2857 384 0.7579 0.5958 0.7579 0.8706
No log 18.3810 386 0.7606 0.6125 0.7606 0.8721
No log 18.4762 388 0.8282 0.5314 0.8282 0.9101
No log 18.5714 390 0.9300 0.4805 0.9300 0.9644
No log 18.6667 392 0.9249 0.4890 0.9249 0.9617
No log 18.7619 394 0.8188 0.5129 0.8188 0.9049
No log 18.8571 396 0.7336 0.5935 0.7336 0.8565
No log 18.9524 398 0.7315 0.6120 0.7315 0.8553
No log 19.0476 400 0.7359 0.5625 0.7359 0.8579
No log 19.1429 402 0.7826 0.5652 0.7826 0.8847
No log 19.2381 404 0.9503 0.4898 0.9503 0.9748
No log 19.3333 406 1.0863 0.4866 1.0863 1.0422
No log 19.4286 408 1.0672 0.4970 1.0672 1.0331
No log 19.5238 410 0.9319 0.5177 0.9319 0.9654
No log 19.6190 412 0.8260 0.5235 0.8260 0.9088
No log 19.7143 414 0.7649 0.6117 0.7649 0.8746
No log 19.8095 416 0.7659 0.6437 0.7659 0.8752
No log 19.9048 418 0.7837 0.6247 0.7837 0.8853
No log 20.0 420 0.8038 0.5517 0.8038 0.8966
No log 20.0952 422 0.8203 0.5245 0.8203 0.9057
No log 20.1905 424 0.8006 0.4836 0.8006 0.8948
No log 20.2857 426 0.7556 0.5085 0.7556 0.8692
No log 20.3810 428 0.7222 0.4927 0.7222 0.8498
No log 20.4762 430 0.7378 0.4691 0.7378 0.8589
No log 20.5714 432 0.7626 0.4417 0.7626 0.8733
No log 20.6667 434 0.8061 0.4738 0.8061 0.8978
No log 20.7619 436 0.8551 0.4593 0.8551 0.9247
No log 20.8571 438 0.8169 0.4823 0.8169 0.9038
No log 20.9524 440 0.7643 0.5390 0.7643 0.8743
No log 21.0476 442 0.7361 0.5818 0.7361 0.8580
No log 21.1429 444 0.7434 0.5654 0.7434 0.8622
No log 21.2381 446 0.7556 0.5861 0.7556 0.8693
No log 21.3333 448 0.7811 0.5472 0.7811 0.8838
No log 21.4286 450 0.8343 0.5621 0.8343 0.9134
No log 21.5238 452 0.8720 0.5336 0.8720 0.9338
No log 21.6190 454 0.8890 0.5337 0.8890 0.9429
No log 21.7143 456 0.8110 0.5719 0.8110 0.9005
No log 21.8095 458 0.7682 0.5899 0.7682 0.8765
No log 21.9048 460 0.7619 0.6085 0.7619 0.8729
No log 22.0 462 0.7806 0.5826 0.7806 0.8835
No log 22.0952 464 0.8911 0.4826 0.8911 0.9440
No log 22.1905 466 0.9531 0.4534 0.9531 0.9763
No log 22.2857 468 0.8847 0.5012 0.8847 0.9406
No log 22.3810 470 0.7613 0.5350 0.7613 0.8725
No log 22.4762 472 0.7159 0.5763 0.7159 0.8461
No log 22.5714 474 0.7076 0.4912 0.7076 0.8412
No log 22.6667 476 0.7270 0.4903 0.7270 0.8526
No log 22.7619 478 0.7665 0.4888 0.7665 0.8755
No log 22.8571 480 0.8306 0.4836 0.8306 0.9114
No log 22.9524 482 0.8566 0.4924 0.8566 0.9255
No log 23.0476 484 0.8952 0.5122 0.8952 0.9461
No log 23.1429 486 0.8699 0.5234 0.8699 0.9327
No log 23.2381 488 0.7960 0.5360 0.7960 0.8922
No log 23.3333 490 0.7627 0.5925 0.7627 0.8733
No log 23.4286 492 0.7547 0.5902 0.7547 0.8687
No log 23.5238 494 0.7617 0.5857 0.7617 0.8727
No log 23.6190 496 0.7757 0.5756 0.7757 0.8808
No log 23.7143 498 0.8188 0.5211 0.8188 0.9049
0.2839 23.8095 500 0.7906 0.5462 0.7906 0.8891
0.2839 23.9048 502 0.7173 0.5552 0.7173 0.8470
0.2839 24.0 504 0.6925 0.5248 0.6925 0.8322
0.2839 24.0952 506 0.6897 0.5396 0.6897 0.8305
0.2839 24.1905 508 0.6881 0.5301 0.6881 0.8295
0.2839 24.2857 510 0.6878 0.5139 0.6878 0.8293
0.2839 24.3810 512 0.7185 0.5359 0.7185 0.8476

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task5_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02 (4,019 fine-tunes of this base)