ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0748
  • Qwk: 0.3506
  • Mse: 1.0748
  • Rmse: 1.0367
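The reported metrics can be reproduced from predicted and gold scores. A minimal sketch using scikit-learn (assumed available; the toy labels below are illustrative and not taken from this model):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Toy gold/predicted essay-organization scores (illustrative only).
y_true = np.array([0, 1, 2, 2, 1, 0])
y_pred = np.array([0, 2, 2, 1, 1, 0])

# Qwk: Cohen's kappa with quadratic weights, the agreement metric above.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# Rmse is the square root of Mse, which is why the Mse and Rmse
# columns in the results below always agree.
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
```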

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
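A minimal sketch of how these settings map onto `transformers.TrainingArguments` keyword names. The original training script is not part of this card, so the mapping is an assumption:

```python
# Hyperparameters from this card, keyed by the TrainingArguments
# parameter names they would correspond to (a sketch, not the
# original training script).
hparams = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}

# Usage (requires transformers; output_dir is a placeholder):
#   from transformers import TrainingArguments
#   args = TrainingArguments(output_dir="outputs", **hparams)
```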

Training results

"No log" in the training-loss column means the training loss had not yet been logged at that evaluation step; the first logged value appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0308 2 2.5223 -0.0230 2.5223 1.5882
No log 0.0615 4 1.0967 0.1910 1.0967 1.0472
No log 0.0923 6 0.8054 0.0937 0.8054 0.8974
No log 0.1231 8 0.7318 0.1222 0.7318 0.8555
No log 0.1538 10 0.9592 0.1747 0.9592 0.9794
No log 0.1846 12 1.1068 0.1086 1.1068 1.0520
No log 0.2154 14 1.0208 0.1428 1.0208 1.0103
No log 0.2462 16 0.7600 0.3892 0.7600 0.8718
No log 0.2769 18 0.6953 0.3770 0.6953 0.8339
No log 0.3077 20 0.6382 0.4547 0.6382 0.7989
No log 0.3385 22 0.6504 0.3474 0.6504 0.8065
No log 0.3692 24 0.6453 0.2382 0.6453 0.8033
No log 0.4 26 0.7026 0.2382 0.7026 0.8382
No log 0.4308 28 0.7670 0.1636 0.7670 0.8758
No log 0.4615 30 0.8250 0.1766 0.8250 0.9083
No log 0.4923 32 0.8317 0.3372 0.8317 0.9120
No log 0.5231 34 0.7920 0.3799 0.7920 0.8899
No log 0.5538 36 0.7012 0.4197 0.7012 0.8374
No log 0.5846 38 0.7252 0.4424 0.7252 0.8516
No log 0.6154 40 0.8346 0.3804 0.8346 0.9135
No log 0.6462 42 0.7790 0.1648 0.7790 0.8826
No log 0.6769 44 0.7638 0.1648 0.7638 0.8739
No log 0.7077 46 0.7686 0.2589 0.7686 0.8767
No log 0.7385 48 0.8284 0.3746 0.8284 0.9102
No log 0.7692 50 0.9103 0.2635 0.9103 0.9541
No log 0.8 52 0.7874 0.4044 0.7874 0.8874
No log 0.8308 54 0.7051 0.5594 0.7051 0.8397
No log 0.8615 56 0.6829 0.5345 0.6829 0.8264
No log 0.8923 58 0.6831 0.2883 0.6831 0.8265
No log 0.9231 60 0.6386 0.3837 0.6386 0.7991
No log 0.9538 62 0.5464 0.4448 0.5464 0.7392
No log 0.9846 64 0.5448 0.4538 0.5448 0.7381
No log 1.0154 66 0.5449 0.5422 0.5449 0.7381
No log 1.0462 68 0.5335 0.5232 0.5335 0.7304
No log 1.0769 70 0.5935 0.5697 0.5935 0.7704
No log 1.1077 72 0.7085 0.4521 0.7085 0.8417
No log 1.1385 74 0.9086 0.2886 0.9086 0.9532
No log 1.1692 76 0.9823 0.2389 0.9823 0.9911
No log 1.2 78 0.9798 0.2287 0.9798 0.9899
No log 1.2308 80 0.9928 0.2507 0.9928 0.9964
No log 1.2615 82 0.9257 0.4061 0.9257 0.9622
No log 1.2923 84 0.9288 0.4491 0.9288 0.9637
No log 1.3231 86 1.0191 0.3584 1.0191 1.0095
No log 1.3538 88 1.0921 0.3444 1.0921 1.0450
No log 1.3846 90 1.0434 0.3311 1.0434 1.0215
No log 1.4154 92 0.8743 0.4332 0.8743 0.9351
No log 1.4462 94 0.9248 0.3905 0.9248 0.9616
No log 1.4769 96 1.0917 0.3957 1.0917 1.0449
No log 1.5077 98 1.0250 0.4428 1.0250 1.0124
No log 1.5385 100 0.8480 0.4409 0.8480 0.9209
No log 1.5692 102 0.7588 0.3409 0.7588 0.8711
No log 1.6 104 0.7602 0.2572 0.7602 0.8719
No log 1.6308 106 0.7362 0.3153 0.7362 0.8580
No log 1.6615 108 0.7584 0.3667 0.7584 0.8709
No log 1.6923 110 0.8326 0.4287 0.8326 0.9125
No log 1.7231 112 1.0142 0.3834 1.0142 1.0071
No log 1.7538 114 1.2784 0.3185 1.2784 1.1307
No log 1.7846 116 1.4149 0.2703 1.4149 1.1895
No log 1.8154 118 1.4271 0.2306 1.4271 1.1946
No log 1.8462 120 1.2350 0.0849 1.2350 1.1113
No log 1.8769 122 1.2074 0.1058 1.2074 1.0988
No log 1.9077 124 1.2929 0.2978 1.2929 1.1371
No log 1.9385 126 1.1077 0.3252 1.1077 1.0525
No log 1.9692 128 1.0065 0.3384 1.0065 1.0033
No log 2.0 130 0.9410 0.3337 0.9410 0.9701
No log 2.0308 132 0.9227 0.3650 0.9227 0.9606
No log 2.0615 134 0.9028 0.3445 0.9028 0.9502
No log 2.0923 136 0.9412 0.3650 0.9412 0.9701
No log 2.1231 138 0.9462 0.2898 0.9462 0.9727
No log 2.1538 140 0.9251 0.2948 0.9251 0.9618
No log 2.1846 142 0.9051 0.2948 0.9051 0.9514
No log 2.2154 144 0.8416 0.3538 0.8416 0.9174
No log 2.2462 146 0.8318 0.4541 0.8318 0.9120
No log 2.2769 148 1.0082 0.2358 1.0082 1.0041
No log 2.3077 150 1.2183 0.3144 1.2183 1.1038
No log 2.3385 152 1.2727 0.3241 1.2727 1.1282
No log 2.3692 154 1.1871 0.3082 1.1871 1.0895
No log 2.4 156 1.1749 0.3081 1.1749 1.0839
No log 2.4308 158 1.0743 0.3915 1.0743 1.0365
No log 2.4615 160 0.9066 0.3939 0.9066 0.9522
No log 2.4923 162 0.7444 0.4648 0.7444 0.8628
No log 2.5231 164 0.6954 0.3918 0.6954 0.8339
No log 2.5538 166 0.7095 0.4167 0.7095 0.8423
No log 2.5846 168 0.7531 0.4562 0.7531 0.8678
No log 2.6154 170 0.8646 0.3970 0.8646 0.9298
No log 2.6462 172 1.0407 0.3833 1.0407 1.0201
No log 2.6769 174 1.1064 0.4283 1.1064 1.0519
No log 2.7077 176 0.9665 0.3869 0.9665 0.9831
No log 2.7385 178 0.9039 0.3832 0.9039 0.9507
No log 2.7692 180 0.8675 0.4132 0.8675 0.9314
No log 2.8 182 0.7450 0.4364 0.7450 0.8631
No log 2.8308 184 0.6677 0.4917 0.6677 0.8171
No log 2.8615 186 0.7802 0.4427 0.7802 0.8833
No log 2.8923 188 1.0594 0.3553 1.0594 1.0293
No log 2.9231 190 1.0995 0.3278 1.0995 1.0486
No log 2.9538 192 1.2216 0.2754 1.2216 1.1053
No log 2.9846 194 1.1956 0.2773 1.1956 1.0934
No log 3.0154 196 1.2103 0.2592 1.2103 1.1001
No log 3.0462 198 1.2703 0.2929 1.2703 1.1271
No log 3.0769 200 1.1003 0.3161 1.1003 1.0490
No log 3.1077 202 0.8284 0.3269 0.8284 0.9102
No log 3.1385 204 0.6959 0.4470 0.6959 0.8342
No log 3.1692 206 0.6591 0.4330 0.6591 0.8119
No log 3.2 208 0.7183 0.4424 0.7183 0.8475
No log 3.2308 210 0.8463 0.5067 0.8463 0.9199
No log 3.2615 212 0.8915 0.4838 0.8915 0.9442
No log 3.2923 214 0.8749 0.5183 0.8749 0.9354
No log 3.3231 216 0.7689 0.5402 0.7689 0.8769
No log 3.3538 218 0.6407 0.5457 0.6407 0.8004
No log 3.3846 220 0.5882 0.5659 0.5882 0.7670
No log 3.4154 222 0.5990 0.5831 0.5990 0.7740
No log 3.4462 224 0.6179 0.5388 0.6179 0.7860
No log 3.4769 226 0.7010 0.4007 0.7010 0.8373
No log 3.5077 228 0.7154 0.3786 0.7154 0.8458
No log 3.5385 230 0.6811 0.4512 0.6811 0.8253
No log 3.5692 232 0.6892 0.4587 0.6892 0.8302
No log 3.6 234 0.7694 0.4853 0.7694 0.8771
No log 3.6308 236 0.9004 0.4393 0.9004 0.9489
No log 3.6615 238 0.9849 0.3919 0.9849 0.9924
No log 3.6923 240 0.9112 0.4526 0.9112 0.9546
No log 3.7231 242 0.8032 0.4528 0.8032 0.8962
No log 3.7538 244 0.7067 0.4512 0.7067 0.8406
No log 3.7846 246 0.6974 0.4424 0.6974 0.8351
No log 3.8154 248 0.7977 0.4512 0.7977 0.8931
No log 3.8462 250 1.0152 0.3741 1.0152 1.0076
No log 3.8769 252 1.2659 0.3123 1.2659 1.1251
No log 3.9077 254 1.3039 0.2537 1.3039 1.1419
No log 3.9385 256 1.3484 0.3171 1.3484 1.1612
No log 3.9692 258 1.1482 0.3322 1.1482 1.0716
No log 4.0 260 0.9210 0.3807 0.9210 0.9597
No log 4.0308 262 0.8343 0.3782 0.8343 0.9134
No log 4.0615 264 0.8791 0.3324 0.8791 0.9376
No log 4.0923 266 0.9751 0.3328 0.9751 0.9875
No log 4.1231 268 1.0220 0.3039 1.0220 1.0110
No log 4.1538 270 0.9356 0.3425 0.9356 0.9673
No log 4.1846 272 0.7313 0.3712 0.7313 0.8551
No log 4.2154 274 0.6517 0.3575 0.6517 0.8073
No log 4.2462 276 0.6293 0.3866 0.6293 0.7933
No log 4.2769 278 0.6567 0.4306 0.6567 0.8104
No log 4.3077 280 0.8771 0.4635 0.8771 0.9365
No log 4.3385 282 1.1442 0.3926 1.1442 1.0697
No log 4.3692 284 1.2480 0.3462 1.2480 1.1171
No log 4.4 286 1.1201 0.3926 1.1201 1.0584
No log 4.4308 288 0.8395 0.4844 0.8395 0.9163
No log 4.4615 290 0.7052 0.5085 0.7052 0.8397
No log 4.4923 292 0.6609 0.4315 0.6609 0.8129
No log 4.5231 294 0.6794 0.4819 0.6794 0.8243
No log 4.5538 296 0.7516 0.4819 0.7516 0.8670
No log 4.5846 298 0.8084 0.3988 0.8084 0.8991
No log 4.6154 300 0.8379 0.4161 0.8379 0.9154
No log 4.6462 302 0.8367 0.4096 0.8367 0.9147
No log 4.6769 304 0.8048 0.4438 0.8048 0.8971
No log 4.7077 306 0.8309 0.3892 0.8309 0.9115
No log 4.7385 308 0.8102 0.3960 0.8102 0.9001
No log 4.7692 310 0.8856 0.3807 0.8856 0.9411
No log 4.8 312 0.8719 0.3998 0.8719 0.9338
No log 4.8308 314 0.7793 0.4275 0.7793 0.8828
No log 4.8615 316 0.7626 0.4424 0.7626 0.8732
No log 4.8923 318 0.8499 0.4511 0.8499 0.9219
No log 4.9231 320 0.9981 0.4074 0.9981 0.9991
No log 4.9538 322 0.9841 0.3922 0.9841 0.9920
No log 4.9846 324 0.7797 0.4161 0.7797 0.8830
No log 5.0154 326 0.6327 0.5158 0.6327 0.7954
No log 5.0462 328 0.6221 0.5091 0.6221 0.7888
No log 5.0769 330 0.6206 0.5091 0.6206 0.7878
No log 5.1077 332 0.6799 0.4808 0.6799 0.8246
No log 5.1385 334 0.8696 0.4003 0.8696 0.9325
No log 5.1692 336 0.9631 0.3298 0.9631 0.9814
No log 5.2 338 0.9275 0.3643 0.9275 0.9631
No log 5.2308 340 0.8389 0.3650 0.8389 0.9159
No log 5.2615 342 0.8099 0.3970 0.8099 0.8999
No log 5.2923 344 0.7539 0.4347 0.7539 0.8683
No log 5.3231 346 0.7055 0.4102 0.7055 0.8399
No log 5.3538 348 0.7237 0.3991 0.7237 0.8507
No log 5.3846 350 0.7767 0.4347 0.7767 0.8813
No log 5.4154 352 0.7995 0.3869 0.7995 0.8941
No log 5.4462 354 0.9026 0.3481 0.9026 0.9500
No log 5.4769 356 0.9588 0.3161 0.9588 0.9792
No log 5.5077 358 0.9853 0.3010 0.9853 0.9926
No log 5.5385 360 0.9120 0.3601 0.9120 0.9550
No log 5.5692 362 0.8448 0.2574 0.8448 0.9192
No log 5.6 364 0.8429 0.3194 0.8429 0.9181
No log 5.6308 366 0.8750 0.3161 0.8750 0.9354
No log 5.6615 368 0.7952 0.4228 0.7952 0.8917
No log 5.6923 370 0.7794 0.4018 0.7794 0.8828
No log 5.7231 372 0.7586 0.4366 0.7586 0.8710
No log 5.7538 374 0.7596 0.4133 0.7596 0.8716
No log 5.7846 376 0.8667 0.3650 0.8667 0.9310
No log 5.8154 378 0.9851 0.3183 0.9851 0.9925
No log 5.8462 380 0.9492 0.3593 0.9492 0.9743
No log 5.8769 382 0.9155 0.3650 0.9155 0.9568
No log 5.9077 384 0.8803 0.3538 0.8803 0.9383
No log 5.9385 386 0.9704 0.2982 0.9704 0.9851
No log 5.9692 388 1.1093 0.2880 1.1093 1.0532
No log 6.0 390 1.1052 0.3086 1.1052 1.0513
No log 6.0308 392 0.9238 0.3593 0.9238 0.9612
No log 6.0615 394 0.7473 0.4014 0.7473 0.8644
No log 6.0923 396 0.6543 0.4352 0.6543 0.8089
No log 6.1231 398 0.6410 0.5177 0.6410 0.8006
No log 6.1538 400 0.6901 0.4684 0.6901 0.8307
No log 6.1846 402 0.7688 0.4545 0.7688 0.8768
No log 6.2154 404 0.8850 0.4572 0.8850 0.9407
No log 6.2462 406 0.8486 0.4639 0.8486 0.9212
No log 6.2769 408 0.7671 0.4496 0.7671 0.8758
No log 6.3077 410 0.6836 0.4556 0.6836 0.8268
No log 6.3385 412 0.6824 0.4165 0.6824 0.8261
No log 6.3692 414 0.7467 0.3929 0.7467 0.8641
No log 6.4 416 0.9627 0.3608 0.9627 0.9812
No log 6.4308 418 1.1599 0.3643 1.1599 1.0770
No log 6.4615 420 1.1395 0.3554 1.1395 1.0675
No log 6.4923 422 0.9369 0.4067 0.9369 0.9680
No log 6.5231 424 0.7535 0.4265 0.7535 0.8680
No log 6.5538 426 0.6989 0.4464 0.6989 0.8360
No log 6.5846 428 0.7271 0.4531 0.7271 0.8527
No log 6.6154 430 0.8300 0.4268 0.8300 0.9110
No log 6.6462 432 0.8727 0.4051 0.8727 0.9342
No log 6.6769 434 0.9051 0.3928 0.9051 0.9514
No log 6.7077 436 0.9947 0.4091 0.9947 0.9974
No log 6.7385 438 0.9975 0.4050 0.9975 0.9987
No log 6.7692 440 0.9154 0.3923 0.9154 0.9568
No log 6.8 442 0.9123 0.4081 0.9123 0.9551
No log 6.8308 444 0.9077 0.4081 0.9077 0.9527
No log 6.8615 446 0.8855 0.3887 0.8855 0.9410
No log 6.8923 448 0.7944 0.4597 0.7944 0.8913
No log 6.9231 450 0.7646 0.4528 0.7646 0.8744
No log 6.9538 452 0.7801 0.4462 0.7801 0.8833
No log 6.9846 454 0.8122 0.4784 0.8122 0.9012
No log 7.0154 456 0.7954 0.4584 0.7954 0.8918
No log 7.0462 458 0.7273 0.4366 0.7273 0.8528
No log 7.0769 460 0.7225 0.4123 0.7225 0.8500
No log 7.1077 462 0.8038 0.3928 0.8038 0.8966
No log 7.1385 464 0.8752 0.3753 0.8752 0.9355
No log 7.1692 466 0.8949 0.3557 0.8949 0.9460
No log 7.2 468 0.8326 0.4350 0.8326 0.9125
No log 7.2308 470 0.6845 0.5190 0.6845 0.8274
No log 7.2615 472 0.6199 0.5149 0.6199 0.7873
No log 7.2923 474 0.5926 0.5063 0.5926 0.7698
No log 7.3231 476 0.5847 0.5291 0.5847 0.7647
No log 7.3538 478 0.6159 0.5179 0.6159 0.7848
No log 7.3846 480 0.6730 0.4883 0.6730 0.8204
No log 7.4154 482 0.7201 0.5190 0.7201 0.8486
No log 7.4462 484 0.6979 0.5325 0.6979 0.8354
No log 7.4769 486 0.6900 0.4329 0.6900 0.8307
No log 7.5077 488 0.6890 0.3996 0.6890 0.8301
No log 7.5385 490 0.7420 0.4627 0.7420 0.8614
No log 7.5692 492 0.8610 0.4091 0.8610 0.9279
No log 7.6 494 1.0447 0.4110 1.0447 1.0221
No log 7.6308 496 1.0656 0.4110 1.0656 1.0323
No log 7.6615 498 0.9171 0.4021 0.9171 0.9577
0.3595 7.6923 500 0.7565 0.4462 0.7565 0.8698
0.3595 7.7231 502 0.6981 0.4741 0.6981 0.8355
0.3595 7.7538 504 0.7167 0.4385 0.7167 0.8466
0.3595 7.7846 506 0.7738 0.5103 0.7738 0.8796
0.3595 7.8154 508 0.8805 0.4771 0.8805 0.9384
0.3595 7.8462 510 0.9317 0.4428 0.9317 0.9652
0.3595 7.8769 512 0.9115 0.4460 0.9115 0.9547
0.3595 7.9077 514 0.9738 0.3902 0.9738 0.9868
0.3595 7.9385 516 0.9505 0.4283 0.9505 0.9749
0.3595 7.9692 518 0.8565 0.4475 0.8565 0.9255
0.3595 8.0 520 0.7445 0.5199 0.7445 0.8628
0.3595 8.0308 522 0.6887 0.5351 0.6887 0.8299
0.3595 8.0615 524 0.6874 0.5309 0.6874 0.8291
0.3595 8.0923 526 0.7317 0.5309 0.7317 0.8554
0.3595 8.1231 528 0.7925 0.4794 0.7925 0.8902
0.3595 8.1538 530 0.8357 0.4965 0.8357 0.9142
0.3595 8.1846 532 0.8386 0.4669 0.8386 0.9157
0.3595 8.2154 534 0.7578 0.4992 0.7578 0.8705
0.3595 8.2462 536 0.6590 0.5063 0.6590 0.8118
0.3595 8.2769 538 0.6945 0.5310 0.6945 0.8334
0.3595 8.3077 540 0.7598 0.4597 0.7598 0.8717
0.3595 8.3385 542 0.8734 0.4214 0.8734 0.9345
0.3595 8.3692 544 0.9280 0.4233 0.9280 0.9633
0.3595 8.4 546 0.9131 0.4391 0.9131 0.9555
0.3595 8.4308 548 0.8263 0.5016 0.8263 0.9090
0.3595 8.4615 550 0.6763 0.5181 0.6763 0.8223
0.3595 8.4923 552 0.5773 0.5583 0.5773 0.7598
0.3595 8.5231 554 0.5731 0.5015 0.5731 0.7570
0.3595 8.5538 556 0.6101 0.4845 0.6101 0.7811
0.3595 8.5846 558 0.6963 0.4550 0.6963 0.8344
0.3595 8.6154 560 0.8355 0.4568 0.8355 0.9141
0.3595 8.6462 562 1.0247 0.3459 1.0247 1.0123
0.3595 8.6769 564 1.0353 0.3833 1.0353 1.0175
0.3595 8.7077 566 0.9763 0.3662 0.9763 0.9881
0.3595 8.7385 568 0.9819 0.3662 0.9819 0.9909
0.3595 8.7692 570 1.0748 0.3506 1.0748 1.0367

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization

This model is one of the fine-tunes of aubmindlab/bert-base-arabertv02.