ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are computed follows the list):

  • Loss: 0.6488
  • QWK (quadratic weighted kappa): 0.5594
  • MSE (mean squared error): 0.6488
  • RMSE (root mean squared error): 0.8055
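
The QWK, MSE, and RMSE figures above (and in the training table below) are standard metrics for ordinal scoring tasks. The snippet below is a minimal sketch of how they are commonly computed with scikit-learn and NumPy; the score arrays are illustrative placeholders, not outputs of this run.

```python
# Minimal sketch of the evaluation metrics reported above.
# y_true / y_pred are illustrative placeholders, not predictions from this model.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])  # reference organization scores (example)
y_pred = np.array([2, 2, 1, 4, 3])  # model predictions (example)

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(f"QWK: {qwk:.4f}  MSE: {mse:.4f}  RMSE: {rmse:.4f}")
```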

Model description

More information needed

Intended uses & limitations

More information needed
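
The card does not document the task head or label scale, so the snippet below is only a hedged sketch of how the checkpoint might be loaded for inference, assuming a standard sequence-classification head hosted on the Hugging Face Hub under the repository id in the title; the example essay text is a placeholder.

```python
# Hedged usage sketch: assumes a standard AutoModelForSequenceClassification head.
# Verify the number of labels and the score scale before relying on the outputs.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = (
    "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
    "FineTuningAraBERT_run2_AugV5_k4_task7_organization"
)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # placeholder: an Arabic essay to be scored for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Interpretation depends on the head: with a single-logit regression head the raw
# value is the predicted score; with a classification head take argmax over logits.
print(logits)
```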

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
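
For reference, the hyperparameters above map onto Hugging Face TrainingArguments roughly as in the hedged sketch below. The output directory, label count, evaluation/logging cadence, and Trainer wiring are assumptions inferred from this card (the results table suggests evaluation every 2 steps, with the first logged training loss at step 500), not documented settings.

```python
# Hedged reproduction sketch of the listed hyperparameters (Transformers 4.44.x API).
# output_dir, num_labels, eval/logging cadence, and dataset wiring are assumptions.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

training_args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # assumption: the results table logs eval every 2 steps
    eval_steps=2,
    logging_steps=500,      # assumption: first training loss appears at step 500
)

tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # assumption: single regression-style score head (MSE/QWK metrics)
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=..., eval_dataset=..., tokenizer=tokenizer)
# trainer.train()
```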

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.1 2 2.5182 -0.0788 2.5182 1.5869
No log 0.2 4 1.0823 0.1549 1.0823 1.0403
No log 0.3 6 0.7961 0.0937 0.7961 0.8923
No log 0.4 8 0.7667 0.2751 0.7667 0.8756
No log 0.5 10 0.7429 0.3069 0.7429 0.8619
No log 0.6 12 0.7226 0.2784 0.7226 0.8501
No log 0.7 14 0.7534 0.0856 0.7534 0.8680
No log 0.8 16 0.6957 0.2308 0.6957 0.8341
No log 0.9 18 0.6880 0.3465 0.6880 0.8294
No log 1.0 20 0.7341 0.1807 0.7341 0.8568
No log 1.1 22 0.7349 0.0757 0.7349 0.8572
No log 1.2 24 0.7443 0.1561 0.7443 0.8627
No log 1.3 26 0.7251 0.2270 0.7251 0.8515
No log 1.4 28 0.6860 0.3405 0.6860 0.8283
No log 1.5 30 0.6688 -0.0054 0.6688 0.8178
No log 1.6 32 0.6800 0.0481 0.6800 0.8246
No log 1.7 34 0.6493 0.1498 0.6493 0.8058
No log 1.8 36 0.6746 0.3465 0.6746 0.8214
No log 1.9 38 0.7718 0.3302 0.7718 0.8785
No log 2.0 40 0.7761 0.3302 0.7761 0.8810
No log 2.1 42 0.6721 0.2883 0.6721 0.8198
No log 2.2 44 0.5983 0.3166 0.5983 0.7735
No log 2.3 46 0.5717 0.3081 0.5717 0.7561
No log 2.4 48 0.6313 0.4052 0.6313 0.7946
No log 2.5 50 0.6195 0.4052 0.6195 0.7871
No log 2.6 52 0.5305 0.3661 0.5305 0.7284
No log 2.7 54 0.6386 0.4491 0.6386 0.7991
No log 2.8 56 0.9080 0.3697 0.9080 0.9529
No log 2.9 58 0.8929 0.3846 0.8929 0.9449
No log 3.0 60 0.8572 0.3520 0.8572 0.9258
No log 3.1 62 0.7875 0.3544 0.7875 0.8874
No log 3.2 64 0.7215 0.3996 0.7215 0.8494
No log 3.3 66 0.7274 0.4491 0.7274 0.8529
No log 3.4 68 0.6349 0.3060 0.6349 0.7968
No log 3.5 70 0.5865 0.3662 0.5865 0.7659
No log 3.6 72 0.6720 0.4801 0.6720 0.8198
No log 3.7 74 0.6399 0.4726 0.6399 0.7999
No log 3.8 76 0.5940 0.4916 0.5940 0.7707
No log 3.9 78 0.5213 0.5167 0.5213 0.7220
No log 4.0 80 0.5139 0.5110 0.5139 0.7169
No log 4.1 82 0.5283 0.4642 0.5283 0.7268
No log 4.2 84 0.7048 0.5368 0.7048 0.8395
No log 4.3 86 0.7551 0.4295 0.7551 0.8690
No log 4.4 88 0.7399 0.4295 0.7399 0.8602
No log 4.5 90 0.7456 0.5139 0.7456 0.8635
No log 4.6 92 0.7641 0.5443 0.7641 0.8741
No log 4.7 94 0.8859 0.4906 0.8859 0.9412
No log 4.8 96 0.7957 0.5175 0.7957 0.8920
No log 4.9 98 0.6406 0.5470 0.6406 0.8003
No log 5.0 100 0.6186 0.5098 0.6186 0.7865
No log 5.1 102 0.6095 0.4883 0.6095 0.7807
No log 5.2 104 0.6013 0.3762 0.6013 0.7755
No log 5.3 106 0.6086 0.3887 0.6086 0.7802
No log 5.4 108 0.6487 0.5308 0.6487 0.8054
No log 5.5 110 0.6547 0.5195 0.6547 0.8091
No log 5.6 112 0.6246 0.4377 0.6246 0.7903
No log 5.7 114 0.6299 0.4268 0.6299 0.7936
No log 5.8 116 0.6838 0.4481 0.6838 0.8269
No log 5.9 118 0.9286 0.4519 0.9286 0.9636
No log 6.0 120 0.9845 0.4332 0.9845 0.9922
No log 6.1 122 0.8228 0.4733 0.8228 0.9071
No log 6.2 124 0.6809 0.4853 0.6809 0.8252
No log 6.3 126 0.6686 0.5323 0.6686 0.8177
No log 6.4 128 0.7091 0.5431 0.7091 0.8421
No log 6.5 130 0.8004 0.4815 0.8004 0.8946
No log 6.6 132 0.8852 0.4336 0.8852 0.9408
No log 6.7 134 0.7816 0.4303 0.7816 0.8841
No log 6.8 136 0.6670 0.4872 0.6670 0.8167
No log 6.9 138 0.6105 0.4819 0.6105 0.7814
No log 7.0 140 0.5932 0.5373 0.5932 0.7702
No log 7.1 142 0.6678 0.5457 0.6678 0.8172
No log 7.2 144 0.6556 0.5789 0.6556 0.8097
No log 7.3 146 0.7200 0.4937 0.7200 0.8485
No log 7.4 148 0.6845 0.4284 0.6845 0.8274
No log 7.5 150 0.6806 0.3909 0.6806 0.8250
No log 7.6 152 0.6712 0.4284 0.6712 0.8193
No log 7.7 154 0.6923 0.4369 0.6923 0.8320
No log 7.8 156 0.7414 0.4805 0.7414 0.8610
No log 7.9 158 0.6181 0.5323 0.6181 0.7862
No log 8.0 160 0.5328 0.5324 0.5328 0.7299
No log 8.1 162 0.5554 0.4116 0.5554 0.7453
No log 8.2 164 0.5526 0.5167 0.5526 0.7433
No log 8.3 166 0.6182 0.4719 0.6182 0.7863
No log 8.4 168 0.6377 0.4719 0.6377 0.7986
No log 8.5 170 0.6079 0.5093 0.6079 0.7797
No log 8.6 172 0.5836 0.5286 0.5836 0.7640
No log 8.7 174 0.6067 0.4656 0.6067 0.7789
No log 8.8 176 0.5898 0.4656 0.5898 0.7680
No log 8.9 178 0.5747 0.5373 0.5747 0.7581
No log 9.0 180 0.6342 0.5639 0.6342 0.7963
No log 9.1 182 0.7626 0.4993 0.7626 0.8733
No log 9.2 184 0.8069 0.4993 0.8069 0.8983
No log 9.3 186 0.6966 0.5423 0.6966 0.8346
No log 9.4 188 0.6396 0.5003 0.6396 0.7997
No log 9.5 190 0.6518 0.4987 0.6518 0.8074
No log 9.6 192 0.6496 0.5229 0.6496 0.8060
No log 9.7 194 0.6865 0.4648 0.6865 0.8285
No log 9.8 196 0.6837 0.4684 0.6837 0.8268
No log 9.9 198 0.6597 0.4315 0.6597 0.8122
No log 10.0 200 0.6120 0.4314 0.6120 0.7823
No log 10.1 202 0.6009 0.3738 0.6009 0.7752
No log 10.2 204 0.6213 0.4352 0.6213 0.7882
No log 10.3 206 0.7179 0.3799 0.7179 0.8473
No log 10.4 208 0.7139 0.4203 0.7139 0.8449
No log 10.5 210 0.6560 0.4982 0.6560 0.8099
No log 10.6 212 0.6183 0.4954 0.6183 0.7863
No log 10.7 214 0.6070 0.4954 0.6070 0.7791
No log 10.8 216 0.5904 0.5164 0.5904 0.7684
No log 10.9 218 0.5667 0.5109 0.5667 0.7528
No log 11.0 220 0.5401 0.4801 0.5401 0.7349
No log 11.1 222 0.5226 0.4914 0.5226 0.7229
No log 11.2 224 0.5323 0.5560 0.5323 0.7296
No log 11.3 226 0.5725 0.5140 0.5725 0.7566
No log 11.4 228 0.5628 0.5395 0.5628 0.7502
No log 11.5 230 0.5400 0.5723 0.5400 0.7349
No log 11.6 232 0.5341 0.6446 0.5341 0.7308
No log 11.7 234 0.5574 0.6187 0.5574 0.7466
No log 11.8 236 0.5774 0.5633 0.5774 0.7599
No log 11.9 238 0.5785 0.5179 0.5785 0.7606
No log 12.0 240 0.5768 0.5179 0.5768 0.7595
No log 12.1 242 0.5750 0.5633 0.5750 0.7583
No log 12.2 244 0.6075 0.5918 0.6075 0.7794
No log 12.3 246 0.6743 0.5827 0.6743 0.8211
No log 12.4 248 0.6038 0.5719 0.6038 0.7771
No log 12.5 250 0.5317 0.6441 0.5317 0.7292
No log 12.6 252 0.5311 0.5125 0.5311 0.7288
No log 12.7 254 0.5407 0.4904 0.5407 0.7353
No log 12.8 256 0.5568 0.5289 0.5568 0.7462
No log 12.9 258 0.6268 0.5524 0.6268 0.7917
No log 13.0 260 0.8499 0.4910 0.8499 0.9219
No log 13.1 262 0.9158 0.4604 0.9158 0.9570
No log 13.2 264 0.8215 0.5047 0.8215 0.9064
No log 13.3 266 0.6716 0.4997 0.6716 0.8195
No log 13.4 268 0.5989 0.4719 0.5989 0.7739
No log 13.5 270 0.5901 0.4910 0.5901 0.7682
No log 13.6 272 0.5693 0.4831 0.5693 0.7545
No log 13.7 274 0.5607 0.5337 0.5607 0.7488
No log 13.8 276 0.5875 0.5748 0.5875 0.7665
No log 13.9 278 0.6008 0.5735 0.6008 0.7751
No log 14.0 280 0.6063 0.5293 0.6063 0.7786
No log 14.1 282 0.6267 0.5231 0.6267 0.7917
No log 14.2 284 0.5986 0.4980 0.5986 0.7737
No log 14.3 286 0.5276 0.4639 0.5276 0.7264
No log 14.4 288 0.5200 0.4211 0.5200 0.7211
No log 14.5 290 0.5365 0.4174 0.5365 0.7324
No log 14.6 292 0.5148 0.4677 0.5148 0.7175
No log 14.7 294 0.5191 0.5428 0.5191 0.7205
No log 14.8 296 0.5781 0.5388 0.5781 0.7603
No log 14.9 298 0.6735 0.5325 0.6735 0.8207
No log 15.0 300 0.6649 0.4812 0.6649 0.8154
No log 15.1 302 0.5933 0.4782 0.5933 0.7703
No log 15.2 304 0.5514 0.4901 0.5514 0.7425
No log 15.3 306 0.5700 0.4901 0.5700 0.7550
No log 15.4 308 0.6332 0.5259 0.6332 0.7957
No log 15.5 310 0.7766 0.4717 0.7766 0.8813
No log 15.6 312 0.7984 0.4717 0.7984 0.8935
No log 15.7 314 0.6747 0.5190 0.6747 0.8214
No log 15.8 316 0.5372 0.6025 0.5372 0.7329
No log 15.9 318 0.5121 0.4955 0.5121 0.7156
No log 16.0 320 0.5025 0.5719 0.5025 0.7089
No log 16.1 322 0.4979 0.5756 0.4979 0.7056
No log 16.2 324 0.4963 0.5756 0.4963 0.7045
No log 16.3 326 0.4827 0.5800 0.4827 0.6947
No log 16.4 328 0.4786 0.6024 0.4786 0.6918
No log 16.5 330 0.4805 0.5286 0.4805 0.6932
No log 16.6 332 0.4794 0.6024 0.4794 0.6924
No log 16.7 334 0.5669 0.4979 0.5669 0.7530
No log 16.8 336 0.6367 0.5402 0.6367 0.7979
No log 16.9 338 0.6001 0.4887 0.6001 0.7747
No log 17.0 340 0.5432 0.4929 0.5432 0.7370
No log 17.1 342 0.5426 0.5223 0.5426 0.7366
No log 17.2 344 0.5929 0.4829 0.5929 0.7700
No log 17.3 346 0.6005 0.5354 0.6005 0.7749
No log 17.4 348 0.5452 0.5605 0.5452 0.7384
No log 17.5 350 0.5271 0.5687 0.5271 0.7260
No log 17.6 352 0.5193 0.5457 0.5193 0.7206
No log 17.7 354 0.5186 0.5306 0.5186 0.7202
No log 17.8 356 0.5279 0.4830 0.5279 0.7266
No log 17.9 358 0.5315 0.4467 0.5315 0.7291
No log 18.0 360 0.5313 0.4155 0.5313 0.7289
No log 18.1 362 0.5313 0.4103 0.5313 0.7289
No log 18.2 364 0.5697 0.4916 0.5697 0.7548
No log 18.3 366 0.6011 0.4911 0.6011 0.7753
No log 18.4 368 0.5794 0.4845 0.5794 0.7612
No log 18.5 370 0.5543 0.4642 0.5543 0.7445
No log 18.6 372 0.5524 0.4013 0.5524 0.7433
No log 18.7 374 0.5598 0.4562 0.5598 0.7482
No log 18.8 376 0.5888 0.5042 0.5888 0.7674
No log 18.9 378 0.6421 0.5042 0.6421 0.8013
No log 19.0 380 0.6526 0.5293 0.6526 0.8079
No log 19.1 382 0.6058 0.5373 0.6058 0.7783
No log 19.2 384 0.5600 0.4721 0.5600 0.7483
No log 19.3 386 0.5515 0.4523 0.5515 0.7426
No log 19.4 388 0.5795 0.4542 0.5795 0.7612
No log 19.5 390 0.6578 0.4542 0.6578 0.8110
No log 19.6 392 0.7640 0.3333 0.7640 0.8741
No log 19.7 394 0.7630 0.3333 0.7630 0.8735
No log 19.8 396 0.6932 0.4036 0.6932 0.8326
No log 19.9 398 0.6295 0.4855 0.6295 0.7934
No log 20.0 400 0.6156 0.4855 0.6156 0.7846
No log 20.1 402 0.5915 0.5272 0.5915 0.7691
No log 20.2 404 0.5800 0.5817 0.5800 0.7616
No log 20.3 406 0.5495 0.5428 0.5495 0.7413
No log 20.4 408 0.5337 0.5057 0.5337 0.7305
No log 20.5 410 0.5238 0.4776 0.5238 0.7237
No log 20.6 412 0.5139 0.5201 0.5139 0.7169
No log 20.7 414 0.5157 0.5845 0.5157 0.7181
No log 20.8 416 0.5550 0.6063 0.5550 0.7450
No log 20.9 418 0.6099 0.5595 0.6099 0.7810
No log 21.0 420 0.6493 0.5388 0.6493 0.8058
No log 21.1 422 0.6218 0.5470 0.6218 0.7885
No log 21.2 424 0.5655 0.5617 0.5655 0.7520
No log 21.3 426 0.5460 0.5577 0.5460 0.7389
No log 21.4 428 0.5381 0.5617 0.5381 0.7335
No log 21.5 430 0.5511 0.5442 0.5511 0.7424
No log 21.6 432 0.5540 0.4997 0.5540 0.7443
No log 21.7 434 0.5495 0.5388 0.5495 0.7413
No log 21.8 436 0.5349 0.5470 0.5349 0.7314
No log 21.9 438 0.5185 0.5920 0.5185 0.7201
No log 22.0 440 0.5005 0.5560 0.5005 0.7074
No log 22.1 442 0.4940 0.6067 0.4940 0.7029
No log 22.2 444 0.4751 0.6001 0.4751 0.6893
No log 22.3 446 0.4786 0.5782 0.4786 0.6918
No log 22.4 448 0.4866 0.5404 0.4866 0.6975
No log 22.5 450 0.5009 0.5801 0.5009 0.7077
No log 22.6 452 0.5414 0.5617 0.5414 0.7358
No log 22.7 454 0.5753 0.5617 0.5753 0.7585
No log 22.8 456 0.5792 0.5237 0.5792 0.7611
No log 22.9 458 0.5986 0.4911 0.5986 0.7737
No log 23.0 460 0.6100 0.5063 0.6100 0.7811
No log 23.1 462 0.6184 0.5497 0.6184 0.7864
No log 23.2 464 0.5782 0.5639 0.5782 0.7604
No log 23.3 466 0.5676 0.5639 0.5676 0.7534
No log 23.4 468 0.5449 0.5442 0.5449 0.7381
No log 23.5 470 0.5265 0.5583 0.5265 0.7256
No log 23.6 472 0.5134 0.6145 0.5134 0.7166
No log 23.7 474 0.5188 0.5995 0.5188 0.7203
No log 23.8 476 0.5154 0.5995 0.5154 0.7179
No log 23.9 478 0.5141 0.5923 0.5141 0.7170
No log 24.0 480 0.5017 0.5937 0.5017 0.7083
No log 24.1 482 0.5010 0.6038 0.5010 0.7078
No log 24.2 484 0.4906 0.6038 0.4906 0.7004
No log 24.3 486 0.4949 0.6506 0.4949 0.7035
No log 24.4 488 0.5101 0.5831 0.5101 0.7142
No log 24.5 490 0.5265 0.5468 0.5265 0.7256
No log 24.6 492 0.5114 0.5655 0.5114 0.7151
No log 24.7 494 0.4950 0.5307 0.4950 0.7036
No log 24.8 496 0.4792 0.5422 0.4792 0.6923
No log 24.9 498 0.4727 0.5703 0.4727 0.6875
0.2832 25.0 500 0.4727 0.6214 0.4727 0.6875
0.2832 25.1 502 0.4758 0.6705 0.4758 0.6898
0.2832 25.2 504 0.4742 0.6434 0.4742 0.6886
0.2832 25.3 506 0.4811 0.6240 0.4811 0.6936
0.2832 25.4 508 0.4871 0.6506 0.4871 0.6979
0.2832 25.5 510 0.4864 0.6101 0.4864 0.6974
0.2832 25.6 512 0.4820 0.6313 0.4820 0.6943
0.2832 25.7 514 0.4932 0.5708 0.4932 0.7023
0.2832 25.8 516 0.5111 0.5831 0.5111 0.7149
0.2832 25.9 518 0.5231 0.5831 0.5231 0.7232
0.2832 26.0 520 0.5000 0.5758 0.5000 0.7071
0.2832 26.1 522 0.4657 0.6506 0.4657 0.6824
0.2832 26.2 524 0.4811 0.5831 0.4811 0.6936
0.2832 26.3 526 0.5156 0.6581 0.5156 0.7181
0.2832 26.4 528 0.5666 0.6189 0.5666 0.7527
0.2832 26.5 530 0.5985 0.5892 0.5985 0.7736
0.2832 26.6 532 0.6343 0.5465 0.6343 0.7964
0.2832 26.7 534 0.6603 0.5508 0.6603 0.8126
0.2832 26.8 536 0.6488 0.5594 0.6488 0.8055

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1