ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5840
  • Qwk: 0.3524
  • Mse: 0.5840
  • Rmse: 0.7642

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
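The hyperparameters above map directly onto a Transformers `TrainingArguments` object. A hedged sketch of that configuration (the output directory is a placeholder; dataset loading and the `Trainer` wiring are omitted, and this is a reconstruction from the listed values, not the authors' script):

```python
from transformers import TrainingArguments

# Reconstruction of the listed hyperparameters (Transformers 4.44.2 API).
args = TrainingArguments(
    output_dir="out",                 # placeholder, not from the card
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```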

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0476 2 2.8360 -0.0627 2.8360 1.6841
No log 0.0952 4 1.6383 0.0261 1.6383 1.2800
No log 0.1429 6 1.2038 -0.1514 1.2038 1.0972
No log 0.1905 8 0.9158 0.0968 0.9158 0.9570
No log 0.2381 10 0.9623 0.1332 0.9623 0.9810
No log 0.2857 12 0.8605 0.1007 0.8605 0.9276
No log 0.3333 14 0.8787 0.0717 0.8787 0.9374
No log 0.3810 16 0.8975 -0.0079 0.8975 0.9473
No log 0.4286 18 0.8247 0.0393 0.8247 0.9081
No log 0.4762 20 0.7550 0.0027 0.7550 0.8689
No log 0.5238 22 0.7564 0.0495 0.7564 0.8697
No log 0.5714 24 0.7249 0.0 0.7249 0.8514
No log 0.6190 26 0.8592 0.1766 0.8592 0.9270
No log 0.6667 28 0.9644 0.2012 0.9644 0.9821
No log 0.7143 30 1.1450 0.1501 1.1450 1.0700
No log 0.7619 32 1.1038 0.1542 1.1038 1.0506
No log 0.8095 34 0.9923 0.2012 0.9923 0.9961
No log 0.8571 36 0.9557 0.0448 0.9557 0.9776
No log 0.9048 38 0.9692 0.0851 0.9692 0.9845
No log 0.9524 40 1.0380 0.0584 1.0380 1.0188
No log 1.0 42 1.0891 0.1264 1.0891 1.0436
No log 1.0476 44 1.0669 0.1362 1.0669 1.0329
No log 1.0952 46 1.0229 0.1718 1.0229 1.0114
No log 1.1429 48 0.9118 0.1504 0.9118 0.9549
No log 1.1905 50 0.8214 -0.0127 0.8214 0.9063
No log 1.2381 52 0.8166 -0.0483 0.8166 0.9037
No log 1.2857 54 0.8501 0.0208 0.8501 0.9220
No log 1.3333 56 0.8582 0.1050 0.8582 0.9264
No log 1.3810 58 0.8269 0.1850 0.8269 0.9093
No log 1.4286 60 0.8017 0.1633 0.8017 0.8954
No log 1.4762 62 0.8080 0.1700 0.8080 0.8989
No log 1.5238 64 0.7868 0.1310 0.7868 0.8870
No log 1.5714 66 0.7587 0.1479 0.7587 0.8710
No log 1.6190 68 0.7754 0.1353 0.7754 0.8805
No log 1.6667 70 0.8520 0.1550 0.8520 0.9230
No log 1.7143 72 0.9532 0.1293 0.9532 0.9763
No log 1.7619 74 1.0015 0.2702 1.0015 1.0007
No log 1.8095 76 1.1290 0.2443 1.1290 1.0625
No log 1.8571 78 1.4203 0.2810 1.4203 1.1918
No log 1.9048 80 1.3892 0.2601 1.3892 1.1786
No log 1.9524 82 0.9387 0.3280 0.9387 0.9689
No log 2.0 84 0.8713 0.2328 0.8713 0.9334
No log 2.0476 86 0.8464 0.2045 0.8464 0.9200
No log 2.0952 88 0.8686 0.1850 0.8686 0.9320
No log 2.1429 90 0.8289 0.1347 0.8289 0.9105
No log 2.1905 92 0.8693 0.2516 0.8693 0.9324
No log 2.2381 94 0.9231 0.1846 0.9231 0.9608
No log 2.2857 96 0.8946 0.2777 0.8946 0.9458
No log 2.3333 98 0.9172 0.2291 0.9172 0.9577
No log 2.3810 100 0.9208 0.2221 0.9208 0.9596
No log 2.4286 102 0.9154 0.2888 0.9154 0.9568
No log 2.4762 104 0.9202 0.2555 0.9202 0.9593
No log 2.5238 106 0.9269 0.2201 0.9269 0.9627
No log 2.5714 108 0.8702 0.2528 0.8702 0.9329
No log 2.6190 110 0.8127 0.3161 0.8127 0.9015
No log 2.6667 112 0.8233 0.2400 0.8233 0.9073
No log 2.7143 114 0.8911 0.2605 0.8911 0.9440
No log 2.7619 116 0.8922 0.2140 0.8922 0.9446
No log 2.8095 118 0.8934 0.1773 0.8934 0.9452
No log 2.8571 120 0.8713 0.0787 0.8713 0.9334
No log 2.9048 122 0.8995 0.1103 0.8995 0.9484
No log 2.9524 124 0.9379 0.1362 0.9379 0.9685
No log 3.0 126 0.9176 0.1683 0.9176 0.9579
No log 3.0476 128 0.8777 0.1260 0.8777 0.9369
No log 3.0952 130 0.8467 0.1009 0.8467 0.9201
No log 3.1429 132 0.8517 0.1303 0.8517 0.9229
No log 3.1905 134 0.9440 0.1403 0.9440 0.9716
No log 3.2381 136 1.0252 0.1623 1.0252 1.0125
No log 3.2857 138 1.0165 0.2220 1.0165 1.0082
No log 3.3333 140 0.9558 0.2328 0.9558 0.9777
No log 3.3810 142 0.9678 0.2328 0.9678 0.9838
No log 3.4286 144 0.9745 0.1534 0.9745 0.9872
No log 3.4762 146 0.9714 0.2387 0.9714 0.9856
No log 3.5238 148 0.9398 0.2754 0.9398 0.9694
No log 3.5714 150 0.9437 0.2832 0.9437 0.9714
No log 3.6190 152 0.9002 0.2547 0.9002 0.9488
No log 3.6667 154 0.8907 0.2547 0.8907 0.9438
No log 3.7143 156 0.9227 0.3417 0.9227 0.9606
No log 3.7619 158 0.8940 0.1694 0.8940 0.9455
No log 3.8095 160 0.9198 0.1403 0.9198 0.9591
No log 3.8571 162 0.9420 0.1640 0.9420 0.9706
No log 3.9048 164 0.9917 0.2912 0.9917 0.9958
No log 3.9524 166 1.0126 0.2294 1.0126 1.0063
No log 4.0 168 0.9348 0.2669 0.9348 0.9669
No log 4.0476 170 0.8756 0.1213 0.8756 0.9357
No log 4.0952 172 0.8633 0.1733 0.8633 0.9291
No log 4.1429 174 0.8497 0.1331 0.8497 0.9218
No log 4.1905 176 0.8471 0.1219 0.8471 0.9204
No log 4.2381 178 0.8845 0.2183 0.8845 0.9405
No log 4.2857 180 0.9553 0.3183 0.9553 0.9774
No log 4.3333 182 0.9164 0.2751 0.9164 0.9573
No log 4.3810 184 0.8172 0.2440 0.8172 0.9040
No log 4.4286 186 0.7147 0.1918 0.7147 0.8454
No log 4.4762 188 0.7236 0.2227 0.7236 0.8507
No log 4.5238 190 0.7713 0.2383 0.7713 0.8782
No log 4.5714 192 0.9549 0.3417 0.9549 0.9772
No log 4.6190 194 1.0363 0.2578 1.0363 1.0180
No log 4.6667 196 0.9384 0.3012 0.9384 0.9687
No log 4.7143 198 0.7913 0.2691 0.7913 0.8895
No log 4.7619 200 0.7156 0.2043 0.7156 0.8459
No log 4.8095 202 0.7091 0.2043 0.7091 0.8421
No log 4.8571 204 0.7469 0.2389 0.7469 0.8642
No log 4.9048 206 0.8214 0.3544 0.8214 0.9063
No log 4.9524 208 0.8407 0.3909 0.8407 0.9169
No log 5.0 210 0.8227 0.3909 0.8227 0.9070
No log 5.0476 212 0.7761 0.3525 0.7761 0.8810
No log 5.0952 214 0.7804 0.1331 0.7804 0.8834
No log 5.1429 216 0.7907 0.1289 0.7907 0.8892
No log 5.1905 218 0.7938 0.2526 0.7938 0.8910
No log 5.2381 220 0.7920 0.2577 0.7920 0.8899
No log 5.2857 222 0.7873 0.2577 0.7873 0.8873
No log 5.3333 224 0.8002 0.2577 0.8002 0.8945
No log 5.3810 226 0.8206 0.3393 0.8206 0.9059
No log 5.4286 228 0.8058 0.2835 0.8058 0.8976
No log 5.4762 230 0.8114 0.2835 0.8114 0.9008
No log 5.5238 232 0.7921 0.2109 0.7921 0.8900
No log 5.5714 234 0.7738 0.1367 0.7738 0.8796
No log 5.6190 236 0.7900 0.1741 0.7900 0.8888
No log 5.6667 238 0.7683 0.1803 0.7683 0.8765
No log 5.7143 240 0.7361 0.2746 0.7361 0.8580
No log 5.7619 242 0.7626 0.3085 0.7626 0.8733
No log 5.8095 244 0.8650 0.3889 0.8650 0.9301
No log 5.8571 246 0.8537 0.3909 0.8537 0.9239
No log 5.9048 248 0.7644 0.3798 0.7644 0.8743
No log 5.9524 250 0.6858 0.3569 0.6858 0.8281
No log 6.0 252 0.6763 0.3569 0.6763 0.8223
No log 6.0476 254 0.6661 0.3355 0.6661 0.8161
No log 6.0952 256 0.7016 0.3131 0.7016 0.8376
No log 6.1429 258 0.7766 0.3562 0.7766 0.8813
No log 6.1905 260 0.7546 0.3699 0.7546 0.8687
No log 6.2381 262 0.7220 0.2440 0.7220 0.8497
No log 6.2857 264 0.7747 0.3384 0.7747 0.8802
No log 6.3333 266 0.7792 0.3450 0.7792 0.8827
No log 6.3810 268 0.7122 0.2817 0.7122 0.8439
No log 6.4286 270 0.7049 0.3060 0.7049 0.8396
No log 6.4762 272 0.7197 0.3918 0.7197 0.8483
No log 6.5238 274 0.7403 0.4072 0.7403 0.8604
No log 6.5714 276 0.7387 0.4072 0.7387 0.8595
No log 6.6190 278 0.6772 0.3985 0.6772 0.8229
No log 6.6667 280 0.6093 0.4206 0.6093 0.7806
No log 6.7143 282 0.6078 0.3837 0.6078 0.7796
No log 6.7619 284 0.6442 0.3841 0.6442 0.8026
No log 6.8095 286 0.7361 0.4023 0.7361 0.8579
No log 6.8571 288 0.6914 0.3866 0.6914 0.8315
No log 6.9048 290 0.6334 0.3788 0.6334 0.7959
No log 6.9524 292 0.6535 0.3498 0.6535 0.8084
No log 7.0 294 0.7721 0.4243 0.7721 0.8787
No log 7.0476 296 0.8117 0.4243 0.8117 0.9009
No log 7.0952 298 0.7343 0.4522 0.7343 0.8569
No log 7.1429 300 0.6608 0.3224 0.6608 0.8129
No log 7.1905 302 0.6725 0.3498 0.6725 0.8201
No log 7.2381 304 0.7414 0.3746 0.7414 0.8610
No log 7.2857 306 0.9149 0.3521 0.9149 0.9565
No log 7.3333 308 0.9353 0.3012 0.9353 0.9671
No log 7.3810 310 0.8170 0.3653 0.8170 0.9039
No log 7.4286 312 0.7363 0.3088 0.7363 0.8581
No log 7.4762 314 0.7426 0.3088 0.7426 0.8617
No log 7.5238 316 0.7760 0.3368 0.7760 0.8809
No log 7.5714 318 0.7436 0.3209 0.7436 0.8623
No log 7.6190 320 0.7220 0.3352 0.7220 0.8497
No log 7.6667 322 0.7385 0.2958 0.7385 0.8593
No log 7.7143 324 0.7569 0.3225 0.7569 0.8700
No log 7.7619 326 0.7665 0.2973 0.7665 0.8755
No log 7.8095 328 0.7640 0.2973 0.7640 0.8741
No log 7.8571 330 0.7742 0.1356 0.7742 0.8799
No log 7.9048 332 0.7654 0.2519 0.7654 0.8748
No log 7.9524 334 0.7392 0.2138 0.7392 0.8598
No log 8.0 336 0.7196 0.2484 0.7196 0.8483
No log 8.0476 338 0.7011 0.2715 0.7011 0.8373
No log 8.0952 340 0.6938 0.3235 0.6938 0.8329
No log 8.1429 342 0.7145 0.3183 0.7145 0.8453
No log 8.1905 344 0.7003 0.3221 0.7003 0.8368
No log 8.2381 346 0.6726 0.3498 0.6726 0.8201
No log 8.2857 348 0.6946 0.3590 0.6946 0.8334
No log 8.3333 350 0.6799 0.2943 0.6799 0.8246
No log 8.3810 352 0.6958 0.3942 0.6958 0.8341
No log 8.4286 354 0.7004 0.3662 0.7004 0.8369
No log 8.4762 356 0.6541 0.2943 0.6541 0.8088
No log 8.5238 358 0.6378 0.2936 0.6378 0.7986
No log 8.5714 360 0.6206 0.3651 0.6206 0.7878
No log 8.6190 362 0.6353 0.3399 0.6353 0.7970
No log 8.6667 364 0.6698 0.4197 0.6698 0.8184
No log 8.7143 366 0.6446 0.4089 0.6446 0.8028
No log 8.7619 368 0.5811 0.4270 0.5811 0.7623
No log 8.8095 370 0.5966 0.4270 0.5966 0.7724
No log 8.8571 372 0.7146 0.4296 0.7146 0.8453
No log 8.9048 374 0.8070 0.4142 0.8070 0.8984
No log 8.9524 376 0.7164 0.4539 0.7164 0.8464
No log 9.0 378 0.6707 0.4144 0.6707 0.8190
No log 9.0476 380 0.6080 0.4464 0.6080 0.7798
No log 9.0952 382 0.5842 0.4855 0.5842 0.7643
No log 9.1429 384 0.6173 0.3843 0.6173 0.7857
No log 9.1905 386 0.7038 0.4349 0.7038 0.8389
No log 9.2381 388 0.8150 0.4494 0.8150 0.9028
No log 9.2857 390 0.8992 0.4168 0.8992 0.9482
No log 9.3333 392 0.7898 0.4161 0.7898 0.8887
No log 9.3810 394 0.6617 0.4502 0.6617 0.8135
No log 9.4286 396 0.6102 0.4020 0.6102 0.7812
No log 9.4762 398 0.6221 0.4020 0.6221 0.7887
No log 9.5238 400 0.6886 0.3822 0.6886 0.8298
No log 9.5714 402 0.6776 0.4072 0.6776 0.8232
No log 9.6190 404 0.6184 0.4597 0.6184 0.7864
No log 9.6667 406 0.5666 0.4044 0.5666 0.7527
No log 9.7143 408 0.5666 0.3599 0.5666 0.7527
No log 9.7619 410 0.5568 0.3675 0.5568 0.7462
No log 9.8095 412 0.5740 0.4044 0.5740 0.7577
No log 9.8571 414 0.6143 0.3662 0.6143 0.7837
No log 9.9048 416 0.6133 0.3866 0.6133 0.7831
No log 9.9524 418 0.5848 0.3866 0.5848 0.7647
No log 10.0 420 0.5921 0.3866 0.5921 0.7695
No log 10.0476 422 0.5764 0.3817 0.5764 0.7592
No log 10.0952 424 0.5739 0.4451 0.5739 0.7576
No log 10.1429 426 0.5701 0.5123 0.5701 0.7550
No log 10.1905 428 0.5535 0.4681 0.5535 0.7439
No log 10.2381 430 0.5455 0.4820 0.5455 0.7386
No log 10.2857 432 0.5444 0.4820 0.5444 0.7378
No log 10.3333 434 0.5573 0.5141 0.5573 0.7465
No log 10.3810 436 0.5754 0.4724 0.5754 0.7585
No log 10.4286 438 0.5553 0.5195 0.5553 0.7452
No log 10.4762 440 0.5802 0.4788 0.5802 0.7617
No log 10.5238 442 0.6273 0.3699 0.6273 0.7920
No log 10.5714 444 0.6996 0.3456 0.6996 0.8364
No log 10.6190 446 0.6590 0.3699 0.6590 0.8118
No log 10.6667 448 0.5819 0.4867 0.5819 0.7628
No log 10.7143 450 0.5776 0.4867 0.5776 0.7600
No log 10.7619 452 0.5702 0.4867 0.5702 0.7551
No log 10.8095 454 0.5799 0.4330 0.5799 0.7615
No log 10.8571 456 0.5881 0.4158 0.5881 0.7669
No log 10.9048 458 0.6232 0.3302 0.6232 0.7894
No log 10.9524 460 0.6780 0.3940 0.6780 0.8234
No log 11.0 462 0.7066 0.3754 0.7066 0.8406
No log 11.0476 464 0.6904 0.3754 0.6904 0.8309
No log 11.0952 466 0.6271 0.3329 0.6271 0.7919
No log 11.1429 468 0.5999 0.3572 0.5999 0.7746
No log 11.1905 470 0.5981 0.3865 0.5981 0.7733
No log 11.2381 472 0.6162 0.3688 0.6162 0.7850
No log 11.2857 474 0.6162 0.3688 0.6162 0.7850
No log 11.3333 476 0.6011 0.3788 0.6011 0.7753
No log 11.3810 478 0.5981 0.3914 0.5981 0.7734
No log 11.4286 480 0.5947 0.3914 0.5947 0.7712
No log 11.4762 482 0.5706 0.4763 0.5706 0.7554
No log 11.5238 484 0.5618 0.4441 0.5618 0.7495
No log 11.5714 486 0.5527 0.4441 0.5527 0.7434
No log 11.6190 488 0.5489 0.4966 0.5489 0.7409
No log 11.6667 490 0.5444 0.4740 0.5444 0.7378
No log 11.7143 492 0.5496 0.5373 0.5496 0.7414
No log 11.7619 494 0.5758 0.4702 0.5758 0.7588
No log 11.8095 496 0.6342 0.4556 0.6342 0.7963
No log 11.8571 498 0.6732 0.4610 0.6732 0.8205
0.3557 11.9048 500 0.6677 0.4614 0.6677 0.8171
0.3557 11.9524 502 0.5782 0.4165 0.5782 0.7604
0.3557 12.0 504 0.5256 0.5208 0.5256 0.7250
0.3557 12.0476 506 0.5342 0.5457 0.5342 0.7309
0.3557 12.0952 508 0.5514 0.5617 0.5514 0.7425
0.3557 12.1429 510 0.5660 0.4955 0.5660 0.7523
0.3557 12.1905 512 0.5919 0.3613 0.5919 0.7693
0.3557 12.2381 514 0.6137 0.3353 0.6137 0.7834
0.3557 12.2857 516 0.6553 0.4385 0.6553 0.8095
0.3557 12.3333 518 0.6285 0.3955 0.6285 0.7928
0.3557 12.3810 520 0.5840 0.3524 0.5840 0.7642
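The Loss/Qwk/Mse/Rmse reported at the top of this card match the final row above (epoch 12.3810, step 520). Note that the best validation loss (0.5256, epoch 12.0) and the best Qwk (0.5617, epoch 12.0952) occur a few steps earlier. A small sketch of checkpoint selection over rows like these (values copied from the table; the tuple layout is illustrative):

```python
# Three (epoch, validation_loss, qwk) rows copied from the table above.
rows = [
    (12.0,    0.5256, 0.5208),
    (12.0952, 0.5514, 0.5617),
    (12.3810, 0.5840, 0.3524),  # final step; the metrics reported at the top
]
best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_qwk = max(rows, key=lambda r: r[2])   # highest agreement with raters
```

Depending on the downstream use, selecting by Qwk rather than loss may be preferable for an ordinal scoring task like this one.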

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model stats

  • Model size: 0.1B params (Safetensors)
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02.