ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card did not record a dataset name). It achieves the following results on the evaluation set:

  • Loss: 0.6343
  • Qwk: 0.3197
  • Mse: 0.6343
  • Rmse: 0.7964

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
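With a linear scheduler and no warmup reported, the learning rate decays from 2e-05 toward zero over the full run. A minimal sketch of that schedule follows; the step count is inferred from the results table below (epoch 2.0 falls at step 94, i.e. 47 optimizer steps per epoch), and zero warmup is an assumption:

```python
BASE_LR = 2e-5
STEPS_PER_EPOCH = 47            # inferred from the results table (epoch 2.0 -> step 94)
NUM_EPOCHS = 100
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS  # 4700

def linear_lr(step, total_steps=TOTAL_STEPS, base_lr=BASE_LR):
    """Linearly decay the learning rate to 0 over training, assuming no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))      # base rate at the start
print(linear_lr(2350))   # half the base rate at the midpoint
```

In a Transformers training script this behavior would normally come from `lr_scheduler_type="linear"` in `TrainingArguments` rather than hand-rolled code; the sketch only shows the shape of the decay.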

Training results

The table below reports validation metrics at each evaluation step. "No log" means the training loss had not yet been logged (it is first reported at step 500). Although num_epochs was set to 100, logging stops at epoch ≈ 11.8, suggesting training ended early; the final row matches the evaluation results reported above.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0426 2 2.4737 -0.0449 2.4737 1.5728
No log 0.0851 4 1.2826 0.1274 1.2826 1.1325
No log 0.1277 6 1.0342 -0.0927 1.0342 1.0169
No log 0.1702 8 0.8416 0.0608 0.8416 0.9174
No log 0.2128 10 0.7882 0.1541 0.7882 0.8878
No log 0.2553 12 0.9216 0.2651 0.9216 0.9600
No log 0.2979 14 1.1074 0.1464 1.1074 1.0523
No log 0.3404 16 1.0232 0.2601 1.0232 1.0116
No log 0.3830 18 0.8715 0.1672 0.8715 0.9336
No log 0.4255 20 0.7589 0.1508 0.7589 0.8712
No log 0.4681 22 0.7496 -0.0079 0.7496 0.8658
No log 0.5106 24 0.7812 0.0444 0.7812 0.8839
No log 0.5532 26 0.7702 0.0393 0.7702 0.8776
No log 0.5957 28 0.7673 0.0757 0.7673 0.8760
No log 0.6383 30 0.8165 0.0781 0.8165 0.9036
No log 0.6809 32 0.8371 0.1065 0.8371 0.9149
No log 0.7234 34 0.8192 0.1183 0.8192 0.9051
No log 0.7660 36 0.7573 0.1050 0.7573 0.8702
No log 0.8085 38 0.7398 -0.0054 0.7398 0.8601
No log 0.8511 40 0.7388 -0.0054 0.7388 0.8596
No log 0.8936 42 0.7831 0.2027 0.7831 0.8849
No log 0.9362 44 0.9834 0.1867 0.9834 0.9917
No log 0.9787 46 1.0286 0.1416 1.0286 1.0142
No log 1.0213 48 1.0084 0.1454 1.0084 1.0042
No log 1.0638 50 0.9604 0.1628 0.9604 0.9800
No log 1.1064 52 0.8181 0.1504 0.8181 0.9045
No log 1.1489 54 0.7333 0.0717 0.7333 0.8563
No log 1.1915 56 0.7345 0.1007 0.7345 0.8571
No log 1.2340 58 0.7823 0.1504 0.7823 0.8845
No log 1.2766 60 0.8256 0.1416 0.8256 0.9086
No log 1.3191 62 0.7767 0.1550 0.7767 0.8813
No log 1.3617 64 0.7269 0.1508 0.7269 0.8526
No log 1.4043 66 0.7288 0.0428 0.7288 0.8537
No log 1.4468 68 0.7831 0.0937 0.7831 0.8849
No log 1.4894 70 0.7752 0.0937 0.7752 0.8804
No log 1.5319 72 0.7357 0.0846 0.7357 0.8577
No log 1.5745 74 0.7601 0.1648 0.7601 0.8718
No log 1.6170 76 0.9161 0.2193 0.9161 0.9571
No log 1.6596 78 1.1631 0.1993 1.1631 1.0785
No log 1.7021 80 1.2658 0.2421 1.2658 1.1251
No log 1.7447 82 1.1657 0.2805 1.1657 1.0797
No log 1.7872 84 0.9751 0.3409 0.9751 0.9875
No log 1.8298 86 0.8365 0.3169 0.8365 0.9146
No log 1.8723 88 0.7943 0.2904 0.7943 0.8912
No log 1.9149 90 0.7357 0.2967 0.7357 0.8577
No log 1.9574 92 0.8380 0.3746 0.8380 0.9154
No log 2.0 94 0.8406 0.3746 0.8406 0.9168
No log 2.0426 96 0.7487 0.2904 0.7487 0.8652
No log 2.0851 98 0.7531 0.3699 0.7531 0.8678
No log 2.1277 100 0.9603 0.3019 0.9603 0.9799
No log 2.1702 102 1.2261 0.3033 1.2261 1.1073
No log 2.2128 104 1.1596 0.2930 1.1596 1.0768
No log 2.2553 106 0.9234 0.1672 0.9234 0.9609
No log 2.2979 108 0.7321 0.1308 0.7321 0.8556
No log 2.3404 110 0.7025 0.1303 0.7025 0.8381
No log 2.3830 112 0.7720 0.1459 0.7720 0.8786
No log 2.4255 114 0.8791 0.2670 0.8791 0.9376
No log 2.4681 116 1.0090 0.2411 1.0090 1.0045
No log 2.5106 118 1.0116 0.2651 1.0116 1.0058
No log 2.5532 120 0.9735 0.2939 0.9735 0.9867
No log 2.5957 122 1.0421 0.3170 1.0421 1.0209
No log 2.6383 124 0.9855 0.2703 0.9855 0.9927
No log 2.6809 126 0.8550 0.2632 0.8550 0.9247
No log 2.7234 128 0.8493 0.2171 0.8493 0.9216
No log 2.7660 130 0.8814 0.1766 0.8814 0.9388
No log 2.8085 132 0.8819 0.1766 0.8819 0.9391
No log 2.8511 134 0.9064 0.3564 0.9064 0.9521
No log 2.8936 136 1.0116 0.2982 1.0116 1.0058
No log 2.9362 138 1.1948 0.2691 1.1948 1.0931
No log 2.9787 140 1.2239 0.2691 1.2239 1.1063
No log 3.0213 142 1.3878 0.1579 1.3878 1.1781
No log 3.0638 144 1.1335 0.2910 1.1335 1.0647
No log 3.1064 146 0.9165 0.3497 0.9165 0.9574
No log 3.1489 148 0.9161 0.3497 0.9161 0.9571
No log 3.1915 150 1.2317 0.2675 1.2317 1.1098
No log 3.2340 152 1.2331 0.2713 1.2331 1.1105
No log 3.2766 154 0.9539 0.4064 0.9539 0.9767
No log 3.3191 156 0.6742 0.3314 0.6742 0.8211
No log 3.3617 158 0.6391 0.3336 0.6391 0.7994
No log 3.4043 160 0.6317 0.2419 0.6317 0.7948
No log 3.4468 162 0.6604 0.2913 0.6604 0.8127
No log 3.4894 164 0.8573 0.3494 0.8573 0.9259
No log 3.5319 166 1.0122 0.3346 1.0122 1.0061
No log 3.5745 168 0.9508 0.2988 0.9508 0.9751
No log 3.6170 170 0.7631 0.3615 0.7631 0.8735
No log 3.6596 172 0.6757 0.2561 0.6757 0.8220
No log 3.7021 174 0.6831 0.3380 0.6831 0.8265
No log 3.7447 176 0.6829 0.2621 0.6829 0.8264
No log 3.7872 178 0.7467 0.3688 0.7467 0.8641
No log 3.8298 180 0.9254 0.2988 0.9254 0.9620
No log 3.8723 182 1.1430 0.2709 1.1430 1.0691
No log 3.9149 184 1.2874 0.2591 1.2874 1.1346
No log 3.9574 186 1.2016 0.2820 1.2016 1.0962
No log 4.0 188 0.9529 0.3477 0.9529 0.9762
No log 4.0426 190 0.7233 0.2754 0.7233 0.8505
No log 4.0851 192 0.6800 0.3445 0.6800 0.8246
No log 4.1277 194 0.6906 0.3196 0.6906 0.8310
No log 4.1702 196 0.7822 0.3169 0.7822 0.8844
No log 4.2128 198 0.9335 0.3409 0.9335 0.9662
No log 4.2553 200 0.9824 0.3347 0.9824 0.9911
No log 4.2979 202 0.9162 0.3409 0.9162 0.9572
No log 4.3404 204 0.8691 0.3409 0.8691 0.9323
No log 4.3830 206 0.8724 0.3310 0.8724 0.9340
No log 4.4255 208 0.8503 0.3499 0.8503 0.9221
No log 4.4681 210 0.8009 0.3243 0.8009 0.8949
No log 4.5106 212 0.7523 0.3355 0.7523 0.8673
No log 4.5532 214 0.7659 0.3867 0.7659 0.8752
No log 4.5957 216 0.7586 0.4112 0.7586 0.8710
No log 4.6383 218 0.8208 0.4315 0.8208 0.9060
No log 4.6809 220 0.9209 0.3381 0.9209 0.9596
No log 4.7234 222 0.9788 0.3381 0.9788 0.9893
No log 4.7660 224 0.8731 0.3538 0.8731 0.9344
No log 4.8085 226 0.7448 0.3564 0.7448 0.8630
No log 4.8511 228 0.7045 0.2294 0.7045 0.8393
No log 4.8936 230 0.7129 0.2476 0.7129 0.8443
No log 4.9362 232 0.7259 0.3369 0.7259 0.8520
No log 4.9787 234 0.7525 0.2471 0.7525 0.8675
No log 5.0213 236 0.7751 0.2784 0.7751 0.8804
No log 5.0638 238 0.7949 0.3794 0.7949 0.8916
No log 5.1064 240 0.7777 0.3794 0.7777 0.8819
No log 5.1489 242 0.7451 0.3287 0.7451 0.8632
No log 5.1915 244 0.7459 0.3287 0.7459 0.8636
No log 5.2340 246 0.7181 0.3478 0.7181 0.8474
No log 5.2766 248 0.7131 0.2872 0.7131 0.8445
No log 5.3191 250 0.7694 0.3221 0.7694 0.8772
No log 5.3617 252 0.8034 0.3794 0.8034 0.8963
No log 5.4043 254 0.8261 0.4036 0.8261 0.9089
No log 5.4468 256 0.8001 0.4036 0.8001 0.8945
No log 5.4894 258 0.7303 0.2530 0.7303 0.8546
No log 5.5319 260 0.7282 0.2780 0.7282 0.8534
No log 5.5745 262 0.7311 0.2813 0.7311 0.8550
No log 5.6170 264 0.8349 0.4036 0.8349 0.9137
No log 5.6596 266 0.9917 0.3425 0.9917 0.9958
No log 5.7021 268 0.8962 0.3844 0.8962 0.9467
No log 5.7447 270 0.7520 0.2237 0.7520 0.8672
No log 5.7872 272 0.7272 0.1624 0.7272 0.8528
No log 5.8298 274 0.7267 0.1935 0.7267 0.8525
No log 5.8723 276 0.7731 0.3914 0.7731 0.8792
No log 5.9149 278 0.7559 0.3914 0.7559 0.8694
No log 5.9574 280 0.6779 0.3116 0.6779 0.8233
No log 6.0 282 0.6650 0.3628 0.6650 0.8155
No log 6.0426 284 0.6523 0.3380 0.6523 0.8077
No log 6.0851 286 0.6468 0.3788 0.6468 0.8042
No log 6.1277 288 0.6629 0.3840 0.6629 0.8142
No log 6.1702 290 0.6672 0.4437 0.6672 0.8168
No log 6.2128 292 0.6456 0.3572 0.6456 0.8035
No log 6.2553 294 0.6336 0.2540 0.6336 0.7960
No log 6.2979 296 0.6454 0.2540 0.6454 0.8034
No log 6.3404 298 0.6562 0.2884 0.6562 0.8101
No log 6.3830 300 0.6621 0.2783 0.6621 0.8137
No log 6.4255 302 0.6618 0.3111 0.6618 0.8135
No log 6.4681 304 0.6626 0.3070 0.6626 0.8140
No log 6.5106 306 0.6690 0.3106 0.6690 0.8179
No log 6.5532 308 0.6847 0.2313 0.6847 0.8275
No log 6.5957 310 0.6917 0.2445 0.6917 0.8317
No log 6.6383 312 0.6961 0.2652 0.6961 0.8343
No log 6.6809 314 0.6999 0.2652 0.6999 0.8366
No log 6.7234 316 0.6979 0.2973 0.6979 0.8354
No log 6.7660 318 0.7030 0.2893 0.7030 0.8385
No log 6.8085 320 0.7191 0.3458 0.7191 0.8480
No log 6.8511 322 0.7386 0.3700 0.7386 0.8594
No log 6.8936 324 0.7961 0.3544 0.7961 0.8922
No log 6.9362 326 0.8420 0.3520 0.8420 0.9176
No log 6.9787 328 0.7829 0.3609 0.7829 0.8848
No log 7.0213 330 0.6981 0.3023 0.6981 0.8356
No log 7.0638 332 0.6739 0.3296 0.6739 0.8209
No log 7.1064 334 0.6681 0.3352 0.6681 0.8174
No log 7.1489 336 0.6723 0.3129 0.6723 0.8199
No log 7.1915 338 0.6610 0.3643 0.6610 0.8130
No log 7.2340 340 0.6825 0.4093 0.6825 0.8261
No log 7.2766 342 0.7061 0.4093 0.7061 0.8403
No log 7.3191 344 0.7278 0.4315 0.7278 0.8531
No log 7.3617 346 0.7923 0.4315 0.7923 0.8901
No log 7.4043 348 0.7799 0.4223 0.7799 0.8831
No log 7.4468 350 0.7247 0.3157 0.7247 0.8513
No log 7.4894 352 0.6731 0.3369 0.6731 0.8204
No log 7.5319 354 0.6677 0.3369 0.6677 0.8171
No log 7.5745 356 0.6693 0.3369 0.6693 0.8181
No log 7.6170 358 0.6773 0.3498 0.6773 0.8230
No log 7.6596 360 0.7240 0.3798 0.7240 0.8509
No log 7.7021 362 0.7341 0.4457 0.7341 0.8568
No log 7.7447 364 0.6551 0.4430 0.6551 0.8094
No log 7.7872 366 0.5986 0.4182 0.5986 0.7737
No log 7.8298 368 0.5847 0.4091 0.5847 0.7646
No log 7.8723 370 0.5904 0.4147 0.5904 0.7684
No log 7.9149 372 0.6257 0.4352 0.6257 0.7910
No log 7.9574 374 0.6631 0.4352 0.6631 0.8143
No log 8.0 376 0.6771 0.4352 0.6771 0.8228
No log 8.0426 378 0.6670 0.3253 0.6670 0.8167
No log 8.0851 380 0.6806 0.3253 0.6806 0.8250
No log 8.1277 382 0.7414 0.3268 0.7414 0.8611
No log 8.1702 384 0.8172 0.3934 0.8172 0.9040
No log 8.2128 386 0.9285 0.4085 0.9285 0.9636
No log 8.2553 388 0.8989 0.4085 0.8989 0.9481
No log 8.2979 390 0.7877 0.3710 0.7877 0.8876
No log 8.3404 392 0.7054 0.3769 0.7054 0.8399
No log 8.3830 394 0.7018 0.3211 0.7018 0.8377
No log 8.4255 396 0.7138 0.3253 0.7138 0.8449
No log 8.4681 398 0.7543 0.3688 0.7543 0.8685
No log 8.5106 400 0.8643 0.3100 0.8643 0.9297
No log 8.5532 402 0.8843 0.3274 0.8843 0.9404
No log 8.5957 404 0.8630 0.3868 0.8630 0.9290
No log 8.6383 406 0.7735 0.3798 0.7735 0.8795
No log 8.6809 408 0.7378 0.3817 0.7378 0.8590
No log 8.7234 410 0.7258 0.3817 0.7258 0.8519
No log 8.7660 412 0.7063 0.4257 0.7063 0.8404
No log 8.8085 414 0.7130 0.4212 0.7130 0.8444
No log 8.8511 416 0.7946 0.4104 0.7946 0.8914
No log 8.8936 418 0.9625 0.3501 0.9625 0.9811
No log 8.9362 420 1.0078 0.3445 1.0078 1.0039
No log 8.9787 422 0.9368 0.3501 0.9368 0.9679
No log 9.0213 424 0.8099 0.4315 0.8099 0.8999
No log 9.0638 426 0.7421 0.4059 0.7421 0.8615
No log 9.1064 428 0.7225 0.3763 0.7225 0.8500
No log 9.1489 430 0.7391 0.3569 0.7391 0.8597
No log 9.1915 432 0.7636 0.3633 0.7636 0.8738
No log 9.2340 434 0.7504 0.3498 0.7504 0.8663
No log 9.2766 436 0.7840 0.3567 0.7840 0.8854
No log 9.3191 438 0.8239 0.2943 0.8239 0.9077
No log 9.3617 440 0.8377 0.2917 0.8377 0.9153
No log 9.4043 442 0.8123 0.2632 0.8123 0.9013
No log 9.4468 444 0.7809 0.2632 0.7809 0.8837
No log 9.4894 446 0.7939 0.2843 0.7939 0.8910
No log 9.5319 448 0.8040 0.3991 0.8040 0.8967
No log 9.5745 450 0.7525 0.3494 0.7525 0.8675
No log 9.6170 452 0.7519 0.3746 0.7519 0.8671
No log 9.6596 454 0.7055 0.3564 0.7055 0.8400
No log 9.7021 456 0.6519 0.3155 0.6519 0.8074
No log 9.7447 458 0.6421 0.3116 0.6421 0.8013
No log 9.7872 460 0.6725 0.4350 0.6725 0.8201
No log 9.8298 462 0.7630 0.4648 0.7630 0.8735
No log 9.8723 464 0.7539 0.4777 0.7539 0.8682
No log 9.9149 466 0.6543 0.4144 0.6543 0.8089
No log 9.9574 468 0.6276 0.3662 0.6276 0.7922
No log 10.0 470 0.6569 0.3894 0.6569 0.8105
No log 10.0426 472 0.6864 0.4387 0.6864 0.8285
No log 10.0851 474 0.7307 0.4482 0.7307 0.8548
No log 10.1277 476 0.7351 0.4726 0.7351 0.8574
No log 10.1702 478 0.7117 0.4502 0.7117 0.8436
No log 10.2128 480 0.6553 0.3183 0.6553 0.8095
No log 10.2553 482 0.6235 0.3452 0.6235 0.7896
No log 10.2979 484 0.6377 0.3296 0.6377 0.7986
No log 10.3404 486 0.6741 0.3088 0.6741 0.8211
No log 10.3830 488 0.6669 0.3688 0.6669 0.8166
No log 10.4255 490 0.6758 0.3688 0.6758 0.8221
No log 10.4681 492 0.6566 0.3688 0.6566 0.8103
No log 10.5106 494 0.6432 0.3498 0.6432 0.8020
No log 10.5532 496 0.6378 0.3919 0.6378 0.7986
No log 10.5957 498 0.6404 0.3919 0.6404 0.8002
0.3785 10.6383 500 0.6454 0.4001 0.6454 0.8034
0.3785 10.6809 502 0.6424 0.4086 0.6424 0.8015
0.3785 10.7234 504 0.6422 0.4086 0.6422 0.8014
0.3785 10.7660 506 0.6559 0.3688 0.6559 0.8099
0.3785 10.8085 508 0.7087 0.4243 0.7087 0.8419
0.3785 10.8511 510 0.7289 0.4385 0.7289 0.8537
0.3785 10.8936 512 0.6874 0.4759 0.6874 0.8291
0.3785 10.9362 514 0.6504 0.4464 0.6504 0.8065
0.3785 10.9787 516 0.6438 0.3966 0.6438 0.8024
0.3785 11.0213 518 0.6483 0.5086 0.6483 0.8051
0.3785 11.0638 520 0.6623 0.4684 0.6623 0.8138
0.3785 11.1064 522 0.6666 0.5086 0.6666 0.8165
0.3785 11.1489 524 0.6496 0.4855 0.6496 0.8060
0.3785 11.1915 526 0.6656 0.4597 0.6656 0.8158
0.3785 11.2340 528 0.7273 0.4089 0.7273 0.8528
0.3785 11.2766 530 0.7899 0.3688 0.7899 0.8887
0.3785 11.3191 532 0.8141 0.3623 0.8141 0.9023
0.3785 11.3617 534 0.7673 0.4014 0.7673 0.8759
0.3785 11.4043 536 0.6797 0.3267 0.6797 0.8244
0.3785 11.4468 538 0.6399 0.3809 0.6399 0.7999
0.3785 11.4894 540 0.6428 0.2923 0.6428 0.8017
0.3785 11.5319 542 0.6427 0.2923 0.6427 0.8017
0.3785 11.5745 544 0.6491 0.4086 0.6491 0.8056
0.3785 11.6170 546 0.6854 0.3525 0.6854 0.8279
0.3785 11.6596 548 0.6743 0.3594 0.6743 0.8212
0.3785 11.7021 550 0.6533 0.3649 0.6533 0.8083
0.3785 11.7447 552 0.6332 0.3197 0.6332 0.7958
0.3785 11.7872 554 0.6343 0.3197 0.6343 0.7964

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.