ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7210
  • Qwk: 0.5261
  • Mse: 0.7210
  • Rmse: 0.8491
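
Qwk here is quadratic weighted Cohen's kappa, the standard agreement metric for ordinal essay-scoring tasks; Mse matches the loss (the model is trained as a regressor with a mean-squared-error objective) and Rmse is its square root. A minimal, dependency-free sketch of the metric (the function name and class count are illustrative, not from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa for ordinal labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal histograms give the expected matrix under independence
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            E = hist_t[i] * hist_p[j] / n
            num += w * O[i][j]
            den += w * E
    return 1.0 - num / den

# Sanity check on the reported numbers: Rmse is the square root of Mse.
assert abs(math.sqrt(0.7210) - 0.8491) < 1e-3
```

Perfect agreement yields a kappa of 1.0 and chance-level agreement yields 0.0, so the reported 0.5261 indicates moderate agreement with the reference scores.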

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
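
With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from 2e-05 to zero over the full run. The results table implies 22 optimizer steps per epoch (epoch 0.0909 at step 2), i.e. roughly 2200 total steps over 100 epochs; both that step count and the function name below are inferred for illustration:

```python
def linear_lr(step, base_lr=2e-5, total_steps=2200, warmup_steps=0):
    """Linear schedule as used by Transformers' `lr_scheduler_type: linear`:
    ramp up over warmup_steps, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# LR starts at the configured 2e-05, halves midway, and reaches 0 at the end.
assert linear_lr(0) == 2e-5
assert abs(linear_lr(1100) - 1e-5) < 1e-12
assert linear_lr(2200) == 0.0
```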

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0909 2 3.9873 -0.0090 3.9873 1.9968
No log 0.1818 4 2.3459 0.0357 2.3459 1.5316
No log 0.2727 6 2.2862 0.0798 2.2862 1.5120
No log 0.3636 8 2.0269 0.0727 2.0269 1.4237
No log 0.4545 10 1.1329 0.2639 1.1329 1.0644
No log 0.5455 12 1.1631 0.2166 1.1631 1.0785
No log 0.6364 14 1.2078 0.1284 1.2078 1.0990
No log 0.7273 16 1.1354 0.2265 1.1354 1.0656
No log 0.8182 18 1.1106 0.2740 1.1106 1.0539
No log 0.9091 20 1.0509 0.2243 1.0509 1.0251
No log 1.0 22 1.0423 0.2834 1.0423 1.0209
No log 1.0909 24 1.0415 0.2981 1.0415 1.0205
No log 1.1818 26 1.0644 0.2074 1.0644 1.0317
No log 1.2727 28 1.1345 0.1979 1.1345 1.0651
No log 1.3636 30 1.3183 0.0998 1.3183 1.1482
No log 1.4545 32 1.4048 0.1500 1.4048 1.1852
No log 1.5455 34 1.2542 0.0760 1.2542 1.1199
No log 1.6364 36 1.1935 0.0878 1.1935 1.0925
No log 1.7273 38 1.0518 0.2271 1.0518 1.0256
No log 1.8182 40 0.9348 0.1848 0.9348 0.9668
No log 1.9091 42 0.9419 0.1794 0.9419 0.9705
No log 2.0 44 0.9384 0.2615 0.9384 0.9687
No log 2.0909 46 0.9590 0.1848 0.9590 0.9793
No log 2.1818 48 0.9480 0.1971 0.9480 0.9737
No log 2.2727 50 1.1768 0.1030 1.1768 1.0848
No log 2.3636 52 1.2178 0.0654 1.2178 1.1036
No log 2.4545 54 0.9975 0.1876 0.9975 0.9987
No log 2.5455 56 0.9023 0.2981 0.9023 0.9499
No log 2.6364 58 0.8606 0.3059 0.8606 0.9277
No log 2.7273 60 0.8845 0.2616 0.8845 0.9405
No log 2.8182 62 0.8713 0.3243 0.8713 0.9334
No log 2.9091 64 0.8617 0.3795 0.8617 0.9283
No log 3.0 66 0.8409 0.4071 0.8409 0.9170
No log 3.0909 68 0.8571 0.4365 0.8571 0.9258
No log 3.1818 70 0.9384 0.2572 0.9384 0.9687
No log 3.2727 72 1.0190 0.3163 1.0190 1.0095
No log 3.3636 74 1.0665 0.3033 1.0665 1.0327
No log 3.4545 76 0.8830 0.3680 0.8830 0.9397
No log 3.5455 78 0.9342 0.2748 0.9342 0.9665
No log 3.6364 80 1.0496 0.1643 1.0496 1.0245
No log 3.7273 82 1.0367 0.2449 1.0367 1.0182
No log 3.8182 84 0.9629 0.4180 0.9629 0.9813
No log 3.9091 86 0.8648 0.4966 0.8648 0.9299
No log 4.0 88 0.8026 0.4405 0.8026 0.8959
No log 4.0909 90 0.7712 0.4810 0.7712 0.8782
No log 4.1818 92 0.7663 0.5069 0.7663 0.8754
No log 4.2727 94 0.7681 0.5301 0.7681 0.8764
No log 4.3636 96 0.7952 0.5301 0.7952 0.8917
No log 4.4545 98 0.7691 0.5274 0.7691 0.8770
No log 4.5455 100 0.7770 0.5485 0.7770 0.8815
No log 4.6364 102 0.7763 0.4507 0.7763 0.8811
No log 4.7273 104 0.7778 0.4898 0.7778 0.8820
No log 4.8182 106 0.7784 0.5432 0.7784 0.8823
No log 4.9091 108 0.9984 0.3779 0.9984 0.9992
No log 5.0 110 1.1700 0.4118 1.1700 1.0817
No log 5.0909 112 1.1323 0.3677 1.1323 1.0641
No log 5.1818 114 1.1209 0.3753 1.1209 1.0587
No log 5.2727 116 0.9025 0.3744 0.9025 0.9500
No log 5.3636 118 0.8083 0.5089 0.8083 0.8991
No log 5.4545 120 0.8413 0.4749 0.8413 0.9172
No log 5.5455 122 0.8671 0.4761 0.8671 0.9312
No log 5.6364 124 0.9053 0.4902 0.9053 0.9515
No log 5.7273 126 0.9543 0.4596 0.9543 0.9769
No log 5.8182 128 0.8725 0.4789 0.8725 0.9341
No log 5.9091 130 0.8587 0.4550 0.8587 0.9267
No log 6.0 132 0.8111 0.4565 0.8111 0.9006
No log 6.0909 134 0.8441 0.4522 0.8441 0.9187
No log 6.1818 136 0.8948 0.3782 0.8948 0.9460
No log 6.2727 138 0.9119 0.3637 0.9119 0.9549
No log 6.3636 140 0.8800 0.3780 0.8800 0.9381
No log 6.4545 142 0.8724 0.4318 0.8724 0.9340
No log 6.5455 144 0.8513 0.3994 0.8513 0.9227
No log 6.6364 146 0.8412 0.4192 0.8412 0.9172
No log 6.7273 148 0.8438 0.3737 0.8438 0.9186
No log 6.8182 150 0.8404 0.4626 0.8404 0.9167
No log 6.9091 152 0.8276 0.4869 0.8276 0.9097
No log 7.0 154 0.8155 0.4792 0.8155 0.9030
No log 7.0909 156 0.8008 0.5459 0.8008 0.8949
No log 7.1818 158 0.7900 0.5729 0.7900 0.8888
No log 7.2727 160 0.7794 0.4615 0.7794 0.8828
No log 7.3636 162 0.7640 0.4988 0.7640 0.8741
No log 7.4545 164 0.7334 0.5002 0.7334 0.8564
No log 7.5455 166 0.7169 0.4644 0.7169 0.8467
No log 7.6364 168 0.7191 0.4927 0.7191 0.8480
No log 7.7273 170 0.7423 0.5066 0.7423 0.8616
No log 7.8182 172 0.7392 0.5176 0.7392 0.8598
No log 7.9091 174 0.7353 0.4234 0.7352 0.8575
No log 8.0 176 0.7982 0.4839 0.7982 0.8934
No log 8.0909 178 0.8718 0.4796 0.8718 0.9337
No log 8.1818 180 0.9230 0.3679 0.9230 0.9607
No log 8.2727 182 0.8513 0.4489 0.8513 0.9227
No log 8.3636 184 0.7694 0.5546 0.7694 0.8772
No log 8.4545 186 0.7518 0.5046 0.7518 0.8671
No log 8.5455 188 0.7617 0.4932 0.7617 0.8727
No log 8.6364 190 0.7634 0.5176 0.7634 0.8737
No log 8.7273 192 0.8186 0.4971 0.8186 0.9048
No log 8.8182 194 0.8573 0.4570 0.8573 0.9259
No log 8.9091 196 0.7915 0.4963 0.7915 0.8897
No log 9.0 198 0.7648 0.6001 0.7648 0.8745
No log 9.0909 200 0.7764 0.5016 0.7764 0.8811
No log 9.1818 202 0.7861 0.4888 0.7861 0.8866
No log 9.2727 204 0.7896 0.4888 0.7896 0.8886
No log 9.3636 206 0.8071 0.4490 0.8071 0.8984
No log 9.4545 208 0.8216 0.4048 0.8216 0.9064
No log 9.5455 210 0.8496 0.4048 0.8496 0.9217
No log 9.6364 212 0.8718 0.3799 0.8718 0.9337
No log 9.7273 214 0.8995 0.4197 0.8995 0.9484
No log 9.8182 216 0.8380 0.4965 0.8380 0.9154
No log 9.9091 218 0.7867 0.5260 0.7867 0.8869
No log 10.0 220 0.8079 0.4225 0.8079 0.8989
No log 10.0909 222 0.9364 0.3694 0.9364 0.9677
No log 10.1818 224 1.0114 0.4069 1.0114 1.0057
No log 10.2727 226 0.9619 0.3731 0.9619 0.9808
No log 10.3636 228 0.8053 0.4377 0.8053 0.8974
No log 10.4545 230 0.7721 0.5343 0.7721 0.8787
No log 10.5455 232 0.7686 0.5570 0.7686 0.8767
No log 10.6364 234 0.7733 0.4595 0.7733 0.8794
No log 10.7273 236 0.7876 0.4595 0.7876 0.8875
No log 10.8182 238 0.7920 0.4958 0.7920 0.8899
No log 10.9091 240 0.7889 0.5333 0.7889 0.8882
No log 11.0 242 0.7728 0.5024 0.7728 0.8791
No log 11.0909 244 0.7836 0.4912 0.7836 0.8852
No log 11.1818 246 0.8121 0.4819 0.8121 0.9011
No log 11.2727 248 0.8043 0.4691 0.8043 0.8969
No log 11.3636 250 0.7895 0.4912 0.7895 0.8885
No log 11.4545 252 0.7926 0.5342 0.7926 0.8903
No log 11.5455 254 0.7801 0.5287 0.7801 0.8832
No log 11.6364 256 0.7809 0.4949 0.7809 0.8837
No log 11.7273 258 0.7749 0.5052 0.7749 0.8803
No log 11.8182 260 0.7651 0.5386 0.7651 0.8747
No log 11.9091 262 0.7655 0.5139 0.7655 0.8749
No log 12.0 264 0.7738 0.5139 0.7738 0.8796
No log 12.0909 266 0.7878 0.4772 0.7878 0.8876
No log 12.1818 268 0.8291 0.3879 0.8291 0.9106
No log 12.2727 270 0.8037 0.4141 0.8037 0.8965
No log 12.3636 272 0.7828 0.4271 0.7828 0.8847
No log 12.4545 274 0.7794 0.4395 0.7794 0.8828
No log 12.5455 276 0.7781 0.4395 0.7781 0.8821
No log 12.6364 278 0.7877 0.4630 0.7877 0.8875
No log 12.7273 280 0.8353 0.4709 0.8353 0.9140
No log 12.8182 282 0.8785 0.4910 0.8785 0.9373
No log 12.9091 284 0.8248 0.4616 0.8248 0.9082
No log 13.0 286 0.7844 0.4873 0.7844 0.8857
No log 13.0909 288 0.7756 0.5747 0.7756 0.8807
No log 13.1818 290 0.7695 0.5902 0.7695 0.8772
No log 13.2727 292 0.7731 0.5563 0.7731 0.8792
No log 13.3636 294 0.7848 0.5112 0.7848 0.8859
No log 13.4545 296 0.8334 0.5064 0.8334 0.9129
No log 13.5455 298 0.9080 0.4579 0.9080 0.9529
No log 13.6364 300 0.9762 0.4119 0.9762 0.9880
No log 13.7273 302 0.8986 0.4570 0.8986 0.9480
No log 13.8182 304 0.8103 0.4839 0.8103 0.9002
No log 13.9091 306 0.7499 0.5678 0.7499 0.8660
No log 14.0 308 0.7214 0.5712 0.7214 0.8494
No log 14.0909 310 0.7356 0.5300 0.7356 0.8577
No log 14.1818 312 0.7404 0.5300 0.7404 0.8604
No log 14.2727 314 0.7343 0.5618 0.7343 0.8569
No log 14.3636 316 0.7480 0.6198 0.7480 0.8649
No log 14.4545 318 0.8325 0.5934 0.8325 0.9124
No log 14.5455 320 0.8668 0.5255 0.8668 0.9310
No log 14.6364 322 0.8654 0.5455 0.8654 0.9302
No log 14.7273 324 0.8353 0.5934 0.8353 0.9140
No log 14.8182 326 0.7747 0.5983 0.7747 0.8802
No log 14.9091 328 0.7319 0.6113 0.7319 0.8555
No log 15.0 330 0.7275 0.6035 0.7275 0.8529
No log 15.0909 332 0.7325 0.5917 0.7325 0.8559
No log 15.1818 334 0.7476 0.5969 0.7476 0.8646
No log 15.2727 336 0.7537 0.5842 0.7537 0.8682
No log 15.3636 338 0.7442 0.5842 0.7442 0.8626
No log 15.4545 340 0.7314 0.5475 0.7314 0.8552
No log 15.5455 342 0.7556 0.5622 0.7556 0.8693
No log 15.6364 344 0.7517 0.5610 0.7517 0.8670
No log 15.7273 346 0.7576 0.5459 0.7576 0.8704
No log 15.8182 348 0.8375 0.5315 0.8375 0.9152
No log 15.9091 350 0.8979 0.5393 0.8979 0.9476
No log 16.0 352 0.9846 0.4694 0.9846 0.9923
No log 16.0909 354 1.0120 0.4587 1.0120 1.0060
No log 16.1818 356 0.9503 0.4146 0.9503 0.9748
No log 16.2727 358 0.9107 0.3844 0.9107 0.9543
No log 16.3636 360 0.9115 0.3844 0.9115 0.9547
No log 16.4545 362 0.8580 0.4712 0.8580 0.9263
No log 16.5455 364 0.7647 0.5010 0.7647 0.8745
No log 16.6364 366 0.7422 0.5713 0.7422 0.8615
No log 16.7273 368 0.7399 0.5819 0.7399 0.8602
No log 16.8182 370 0.7568 0.5496 0.7568 0.8699
No log 16.9091 372 0.7439 0.5805 0.7439 0.8625
No log 17.0 374 0.7183 0.5919 0.7183 0.8475
No log 17.0909 376 0.6987 0.5923 0.6987 0.8359
No log 17.1818 378 0.7020 0.5632 0.7020 0.8379
No log 17.2727 380 0.7900 0.5777 0.7900 0.8888
No log 17.3636 382 0.8839 0.5194 0.8839 0.9402
No log 17.4545 384 0.8544 0.4720 0.8544 0.9244
No log 17.5455 386 0.8197 0.5383 0.8197 0.9053
No log 17.6364 388 0.8318 0.5173 0.8318 0.9120
No log 17.7273 390 0.8465 0.5153 0.8465 0.9200
No log 17.8182 392 0.8584 0.5135 0.8584 0.9265
No log 17.9091 394 0.7937 0.5898 0.7937 0.8909
No log 18.0 396 0.7066 0.5434 0.7066 0.8406
No log 18.0909 398 0.6906 0.5843 0.6906 0.8310
No log 18.1818 400 0.7462 0.5219 0.7462 0.8638
No log 18.2727 402 0.7584 0.4884 0.7584 0.8708
No log 18.3636 404 0.7207 0.4970 0.7207 0.8489
No log 18.4545 406 0.6971 0.5622 0.6971 0.8349
No log 18.5455 408 0.6994 0.5622 0.6994 0.8363
No log 18.6364 410 0.6947 0.5795 0.6947 0.8335
No log 18.7273 412 0.6986 0.6341 0.6986 0.8358
No log 18.8182 414 0.7045 0.6157 0.7045 0.8393
No log 18.9091 416 0.7120 0.5573 0.7120 0.8438
No log 19.0 418 0.7036 0.6085 0.7036 0.8388
No log 19.0909 420 0.7639 0.5645 0.7639 0.8740
No log 19.1818 422 0.8753 0.4681 0.8753 0.9356
No log 19.2727 424 1.0060 0.4202 1.0060 1.0030
No log 19.3636 426 1.1376 0.3286 1.1376 1.0666
No log 19.4545 428 1.1444 0.2113 1.1444 1.0698
No log 19.5455 430 1.1229 0.1461 1.1229 1.0597
No log 19.6364 432 1.1225 0.1461 1.1225 1.0595
No log 19.7273 434 1.0495 0.2748 1.0495 1.0244
No log 19.8182 436 1.0410 0.3565 1.0410 1.0203
No log 19.9091 438 1.0669 0.3182 1.0669 1.0329
No log 20.0 440 1.0105 0.3679 1.0105 1.0052
No log 20.0909 442 0.9217 0.3351 0.9217 0.9601
No log 20.1818 444 0.8627 0.4406 0.8627 0.9288
No log 20.2727 446 0.8106 0.5852 0.8106 0.9003
No log 20.3636 448 0.7460 0.6485 0.7460 0.8637
No log 20.4545 450 0.7406 0.5388 0.7406 0.8606
No log 20.5455 452 0.8013 0.4911 0.8013 0.8952
No log 20.6364 454 0.7610 0.4880 0.7610 0.8723
No log 20.7273 456 0.7407 0.5483 0.7407 0.8607
No log 20.8182 458 0.7649 0.5329 0.7649 0.8746
No log 20.9091 460 0.7907 0.5212 0.7907 0.8892
No log 21.0 462 0.8074 0.4853 0.8074 0.8985
No log 21.0909 464 0.8390 0.5766 0.8390 0.9160
No log 21.1818 466 0.8338 0.5577 0.8338 0.9132
No log 21.2727 468 0.7840 0.5657 0.7840 0.8855
No log 21.3636 470 0.7630 0.5016 0.7630 0.8735
No log 21.4545 472 0.7621 0.5370 0.7621 0.8730
No log 21.5455 474 0.7603 0.5370 0.7603 0.8719
No log 21.6364 476 0.7626 0.5236 0.7626 0.8732
No log 21.7273 478 0.7615 0.5570 0.7615 0.8727
No log 21.8182 480 0.7558 0.5149 0.7558 0.8694
No log 21.9091 482 0.7901 0.3873 0.7901 0.8889
No log 22.0 484 0.8295 0.3533 0.8295 0.9108
No log 22.0909 486 0.8075 0.3891 0.8075 0.8986
No log 22.1818 488 0.7514 0.4691 0.7514 0.8668
No log 22.2727 490 0.7614 0.5917 0.7614 0.8726
No log 22.3636 492 0.7805 0.5666 0.7805 0.8835
No log 22.4545 494 0.7833 0.5345 0.7833 0.8850
No log 22.5455 496 0.8033 0.5546 0.8033 0.8963
No log 22.6364 498 0.8234 0.5577 0.8234 0.9074
0.3291 22.7273 500 0.8404 0.5560 0.8404 0.9168
0.3291 22.8182 502 0.7928 0.5923 0.7928 0.8904
0.3291 22.9091 504 0.7440 0.5700 0.7440 0.8625
0.3291 23.0 506 0.7374 0.5163 0.7374 0.8587
0.3291 23.0909 508 0.7306 0.5038 0.7306 0.8548
0.3291 23.1818 510 0.7222 0.5261 0.7222 0.8498
0.3291 23.2727 512 0.7219 0.5261 0.7219 0.8496
0.3291 23.3636 514 0.7210 0.5261 0.7210 0.8491

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02