ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in the model metadata). It achieves the following results on the evaluation set:

  • Loss: 1.1178
  • Qwk: 0.3298
  • Mse: 1.1178
  • Rmse: 1.0573
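Qwk is the quadratic weighted kappa, a standard agreement metric for ordinal scoring tasks such as essay trait grading, and Rmse is simply the square root of Mse (sqrt(1.1178) ≈ 1.0573). A minimal sketch of how these metrics are computed, assuming scikit-learn is available; the label arrays are hypothetical and not taken from this model's evaluation set:

```python
import math
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted organization scores (for illustration only)
y_true = [2, 3, 1, 4, 2, 3]
y_pred = [2, 2, 1, 3, 2, 4]

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = math.sqrt(mse)  # the Rmse column is always sqrt of the Mse column
```

Note that Loss and Mse coincide in the table above because the model is trained as a regressor with an MSE objective.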

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
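The hyperparameters above map directly onto 🤗 Transformers `TrainingArguments`; a minimal sketch of the implied configuration (argument names from Transformers 4.44; `output_dir` is a hypothetical path, not taken from this run):

```python
from transformers import TrainingArguments

# Sketch of the training configuration listed above; "./results" is hypothetical.
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam settings matching betas=(0.9, 0.999), epsilon=1e-08 above
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```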

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0392 2 4.9855 -0.0161 4.9855 2.2328
No log 0.0784 4 3.3337 -0.0272 3.3337 1.8259
No log 0.1176 6 2.2953 -0.0553 2.2953 1.5150
No log 0.1569 8 2.3067 -0.0399 2.3067 1.5188
No log 0.1961 10 1.5687 0.0036 1.5687 1.2525
No log 0.2353 12 1.2822 0.2010 1.2822 1.1323
No log 0.2745 14 1.2574 0.1519 1.2574 1.1213
No log 0.3137 16 1.3334 0.0561 1.3334 1.1547
No log 0.3529 18 1.6304 -0.0066 1.6304 1.2769
No log 0.3922 20 1.7982 0.0504 1.7982 1.3410
No log 0.4314 22 1.5834 0.0310 1.5834 1.2583
No log 0.4706 24 1.3855 0.0416 1.3855 1.1771
No log 0.5098 26 1.3080 0.0955 1.3080 1.1437
No log 0.5490 28 1.2854 0.2203 1.2854 1.1338
No log 0.5882 30 1.3061 0.1622 1.3061 1.1429
No log 0.6275 32 1.4511 0.0707 1.4511 1.2046
No log 0.6667 34 1.4890 0.0608 1.4890 1.2202
No log 0.7059 36 1.5407 0.0435 1.5407 1.2412
No log 0.7451 38 1.6001 0.0603 1.6001 1.2650
No log 0.7843 40 1.5040 0.0638 1.5040 1.2264
No log 0.8235 42 1.3952 0.1305 1.3952 1.1812
No log 0.8627 44 1.3491 0.1481 1.3491 1.1615
No log 0.9020 46 1.5330 0.1735 1.5330 1.2382
No log 0.9412 48 1.7192 0.1199 1.7192 1.3112
No log 0.9804 50 1.6929 0.1647 1.6929 1.3011
No log 1.0196 52 1.5908 0.1923 1.5908 1.2613
No log 1.0588 54 1.4601 0.1604 1.4601 1.2084
No log 1.0980 56 1.2005 0.1904 1.2005 1.0957
No log 1.1373 58 1.0276 0.3086 1.0276 1.0137
No log 1.1765 60 0.9981 0.3086 0.9981 0.9990
No log 1.2157 62 0.9780 0.3237 0.9780 0.9889
No log 1.2549 64 0.9914 0.3216 0.9914 0.9957
No log 1.2941 66 1.0662 0.3402 1.0662 1.0326
No log 1.3333 68 1.1929 0.2797 1.1929 1.0922
No log 1.3725 70 1.1479 0.2374 1.1479 1.0714
No log 1.4118 72 1.2756 0.3317 1.2756 1.1294
No log 1.4510 74 1.4378 0.2681 1.4378 1.1991
No log 1.4902 76 1.5055 0.2681 1.5055 1.2270
No log 1.5294 78 1.4516 0.2899 1.4516 1.2048
No log 1.5686 80 1.1260 0.3633 1.1260 1.0611
No log 1.6078 82 0.9716 0.3427 0.9716 0.9857
No log 1.6471 84 1.0302 0.3529 1.0302 1.0150
No log 1.6863 86 1.1943 0.3655 1.1943 1.0928
No log 1.7255 88 1.2894 0.4037 1.2894 1.1355
No log 1.7647 90 1.1552 0.2985 1.1552 1.0748
No log 1.8039 92 1.2835 0.2963 1.2835 1.1329
No log 1.8431 94 1.6205 0.2815 1.6205 1.2730
No log 1.8824 96 1.6966 0.2853 1.6966 1.3025
No log 1.9216 98 1.4355 0.2930 1.4355 1.1981
No log 1.9608 100 1.0129 0.3462 1.0129 1.0064
No log 2.0 102 0.9006 0.4512 0.9006 0.9490
No log 2.0392 104 1.0236 0.4398 1.0236 1.0117
No log 2.0784 106 0.9961 0.3926 0.9961 0.9981
No log 2.1176 108 0.9158 0.4526 0.9158 0.9570
No log 2.1569 110 0.9318 0.3671 0.9318 0.9653
No log 2.1961 112 1.1371 0.2895 1.1371 1.0663
No log 2.2353 114 1.2683 0.3439 1.2683 1.1262
No log 2.2745 116 1.3203 0.3002 1.3203 1.1490
No log 2.3137 118 1.2743 0.3187 1.2743 1.1288
No log 2.3529 120 1.0986 0.2577 1.0986 1.0482
No log 2.3922 122 1.0024 0.3321 1.0024 1.0012
No log 2.4314 124 1.0567 0.3705 1.0567 1.0280
No log 2.4706 126 1.1379 0.3380 1.1379 1.0667
No log 2.5098 128 1.5351 0.2295 1.5351 1.2390
No log 2.5490 130 1.5730 0.2627 1.5730 1.2542
No log 2.5882 132 1.3077 0.3347 1.3077 1.1436
No log 2.6275 134 1.1073 0.3574 1.1073 1.0523
No log 2.6667 136 1.0412 0.3719 1.0412 1.0204
No log 2.7059 138 1.1278 0.3504 1.1278 1.0620
No log 2.7451 140 1.2907 0.3824 1.2907 1.1361
No log 2.7843 142 1.1464 0.2857 1.1464 1.0707
No log 2.8235 144 0.9531 0.3014 0.9531 0.9763
No log 2.8627 146 0.9780 0.4423 0.9780 0.9889
No log 2.9020 148 1.0467 0.3927 1.0467 1.0231
No log 2.9412 150 1.1310 0.2988 1.1310 1.0635
No log 2.9804 152 1.2981 0.1371 1.2981 1.1394
No log 3.0196 154 1.2413 0.2964 1.2413 1.1142
No log 3.0588 156 1.0571 0.4123 1.0571 1.0282
No log 3.0980 158 1.1342 0.3197 1.1342 1.0650
No log 3.1373 160 1.2604 0.2984 1.2604 1.1227
No log 3.1765 162 1.1425 0.3346 1.1425 1.0689
No log 3.2157 164 0.9576 0.4361 0.9576 0.9786
No log 3.2549 166 0.8990 0.4912 0.8990 0.9482
No log 3.2941 168 0.8669 0.4712 0.8669 0.9311
No log 3.3333 170 0.8621 0.4297 0.8621 0.9285
No log 3.3725 172 1.0569 0.3959 1.0569 1.0281
No log 3.4118 174 1.4662 0.3750 1.4662 1.2109
No log 3.4510 176 1.4111 0.3879 1.4111 1.1879
No log 3.4902 178 1.0832 0.4120 1.0832 1.0408
No log 3.5294 180 0.8353 0.3879 0.8353 0.9139
No log 3.5686 182 0.8355 0.5680 0.8355 0.9141
No log 3.6078 184 0.8347 0.5680 0.8347 0.9136
No log 3.6471 186 0.8482 0.3527 0.8482 0.9210
No log 3.6863 188 1.0575 0.3410 1.0575 1.0283
No log 3.7255 190 1.1357 0.3398 1.1357 1.0657
No log 3.7647 192 1.0136 0.4435 1.0136 1.0068
No log 3.8039 194 0.8557 0.4617 0.8557 0.9250
No log 3.8431 196 0.8699 0.5649 0.8699 0.9327
No log 3.8824 198 0.8843 0.4945 0.8843 0.9404
No log 3.9216 200 1.0218 0.3644 1.0218 1.0108
No log 3.9608 202 1.4158 0.3081 1.4158 1.1899
No log 4.0 204 1.4377 0.2928 1.4377 1.1990
No log 4.0392 206 1.1462 0.2985 1.1462 1.0706
No log 4.0784 208 0.9728 0.4571 0.9728 0.9863
No log 4.1176 210 1.0003 0.4409 1.0003 1.0002
No log 4.1569 212 0.9350 0.4845 0.9350 0.9670
No log 4.1961 214 0.8794 0.4927 0.8794 0.9377
No log 4.2353 216 0.8748 0.4591 0.8748 0.9353
No log 4.2745 218 0.8812 0.4617 0.8812 0.9387
No log 4.3137 220 0.8829 0.4617 0.8829 0.9396
No log 4.3529 222 0.9137 0.4243 0.9137 0.9559
No log 4.3922 224 0.9111 0.4243 0.9111 0.9545
No log 4.4314 226 0.8982 0.3921 0.8982 0.9478
No log 4.4706 228 0.9060 0.3814 0.9060 0.9518
No log 4.5098 230 0.9145 0.3862 0.9145 0.9563
No log 4.5490 232 0.9285 0.4100 0.9285 0.9636
No log 4.5882 234 1.0859 0.4236 1.0859 1.0421
No log 4.6275 236 1.3040 0.3560 1.3040 1.1419
No log 4.6667 238 1.1665 0.3378 1.1665 1.0801
No log 4.7059 240 0.8821 0.5567 0.8821 0.9392
No log 4.7451 242 0.7982 0.5905 0.7982 0.8934
No log 4.7843 244 0.7958 0.5476 0.7958 0.8921
No log 4.8235 246 0.8584 0.4696 0.8584 0.9265
No log 4.8627 248 0.9155 0.4191 0.9155 0.9568
No log 4.9020 250 0.9508 0.4260 0.9508 0.9751
No log 4.9412 252 0.8797 0.4175 0.8797 0.9379
No log 4.9804 254 0.8942 0.3937 0.8942 0.9456
No log 5.0196 256 0.9658 0.3701 0.9658 0.9828
No log 5.0588 258 0.9806 0.4009 0.9806 0.9902
No log 5.0980 260 0.9999 0.4009 0.9999 1.0000
No log 5.1373 262 0.9878 0.3744 0.9878 0.9939
No log 5.1765 264 0.9556 0.4305 0.9556 0.9776
No log 5.2157 266 0.8982 0.4834 0.8982 0.9477
No log 5.2549 268 0.8911 0.5143 0.8911 0.9440
No log 5.2941 270 0.9071 0.4388 0.9071 0.9524
No log 5.3333 272 1.0103 0.3886 1.0103 1.0051
No log 5.3725 274 1.0138 0.3519 1.0138 1.0069
No log 5.4118 276 0.9334 0.4019 0.9334 0.9661
No log 5.4510 278 0.9021 0.3979 0.9021 0.9498
No log 5.4902 280 0.9024 0.4257 0.9024 0.9499
No log 5.5294 282 0.8960 0.4257 0.8960 0.9466
No log 5.5686 284 0.9371 0.4019 0.9371 0.9681
No log 5.6078 286 0.9647 0.4158 0.9647 0.9822
No log 5.6471 288 0.9893 0.4059 0.9893 0.9947
No log 5.6863 290 0.9810 0.4158 0.9810 0.9905
No log 5.7255 292 0.9528 0.4019 0.9528 0.9761
No log 5.7647 294 0.9467 0.4019 0.9467 0.9730
No log 5.8039 296 0.9537 0.4019 0.9537 0.9766
No log 5.8431 298 0.9681 0.4019 0.9681 0.9839
No log 5.8824 300 0.9832 0.4019 0.9832 0.9916
No log 5.9216 302 1.0325 0.3869 1.0325 1.0161
No log 5.9608 304 1.0013 0.3920 1.0013 1.0006
No log 6.0 306 0.9493 0.3830 0.9493 0.9743
No log 6.0392 308 0.9319 0.3830 0.9319 0.9653
No log 6.0784 310 0.9162 0.3728 0.9162 0.9572
No log 6.1176 312 0.9149 0.4019 0.9149 0.9565
No log 6.1569 314 0.9112 0.4061 0.9112 0.9545
No log 6.1961 316 0.8977 0.4521 0.8977 0.9475
No log 6.2353 318 0.8971 0.4738 0.8971 0.9472
No log 6.2745 320 0.9069 0.4715 0.9069 0.9523
No log 6.3137 322 0.9274 0.4681 0.9274 0.9630
No log 6.3529 324 0.9387 0.3728 0.9387 0.9689
No log 6.3922 326 0.9424 0.3728 0.9424 0.9708
No log 6.4314 328 0.9464 0.4019 0.9464 0.9728
No log 6.4706 330 0.9956 0.3637 0.9956 0.9978
No log 6.5098 332 1.0797 0.4015 1.0797 1.0391
No log 6.5490 334 1.0582 0.4137 1.0582 1.0287
No log 6.5882 336 0.9779 0.3812 0.9779 0.9889
No log 6.6275 338 0.8662 0.4962 0.8662 0.9307
No log 6.6667 340 0.8614 0.5244 0.8614 0.9281
No log 6.7059 342 0.8787 0.4617 0.8787 0.9374
No log 6.7451 344 0.9842 0.3850 0.9842 0.9921
No log 6.7843 346 1.2491 0.3494 1.2491 1.1176
No log 6.8235 348 1.4138 0.3315 1.4138 1.1890
No log 6.8627 350 1.2813 0.3320 1.2813 1.1319
No log 6.9020 352 0.9960 0.4053 0.9960 0.9980
No log 6.9412 354 0.8628 0.4356 0.8628 0.9288
No log 6.9804 356 0.8554 0.5244 0.8554 0.9249
No log 7.0196 358 0.8506 0.4912 0.8506 0.9223
No log 7.0588 360 0.9076 0.4425 0.9076 0.9527
No log 7.0980 362 0.9936 0.3856 0.9936 0.9968
No log 7.1373 364 0.9477 0.3819 0.9477 0.9735
No log 7.1765 366 0.8976 0.3965 0.8976 0.9474
No log 7.2157 368 0.8743 0.4077 0.8743 0.9350
No log 7.2549 370 0.8880 0.4656 0.8880 0.9423
No log 7.2941 372 0.8921 0.4656 0.8921 0.9445
No log 7.3333 374 0.8959 0.3925 0.8959 0.9465
No log 7.3725 376 0.9616 0.3189 0.9616 0.9806
No log 7.4118 378 0.9785 0.3189 0.9785 0.9892
No log 7.4510 380 0.9484 0.3733 0.9484 0.9739
No log 7.4902 382 0.9612 0.3542 0.9612 0.9804
No log 7.5294 384 0.9729 0.3542 0.9729 0.9864
No log 7.5686 386 0.9842 0.3542 0.9842 0.9921
No log 7.6078 388 0.9571 0.3965 0.9571 0.9783
No log 7.6471 390 0.9690 0.3965 0.9690 0.9844
No log 7.6863 392 0.9669 0.3637 0.9669 0.9833
No log 7.7255 394 1.0149 0.3685 1.0149 1.0074
No log 7.7647 396 1.1593 0.3497 1.1593 1.0767
No log 7.8039 398 1.2410 0.3179 1.2410 1.1140
No log 7.8431 400 1.1734 0.3133 1.1734 1.0832
No log 7.8824 402 1.0892 0.3121 1.0892 1.0437
No log 7.9216 404 1.0669 0.3100 1.0669 1.0329
No log 7.9608 406 1.0174 0.2795 1.0174 1.0087
No log 8.0 408 1.0485 0.2452 1.0485 1.0239
No log 8.0392 410 1.1055 0.3798 1.1055 1.0514
No log 8.0784 412 1.1607 0.3500 1.1607 1.0773
No log 8.1176 414 1.0875 0.3710 1.0875 1.0428
No log 8.1569 416 1.0014 0.3728 1.0014 1.0007
No log 8.1961 418 0.9669 0.3821 0.9669 0.9833
No log 8.2353 420 1.0260 0.3728 1.0260 1.0129
No log 8.2745 422 1.1167 0.3963 1.1167 1.0567
No log 8.3137 424 1.1819 0.3952 1.1819 1.0871
No log 8.3529 426 1.1650 0.3667 1.1650 1.0793
No log 8.3922 428 1.0310 0.3859 1.0310 1.0154
No log 8.4314 430 0.9724 0.3448 0.9724 0.9861
No log 8.4706 432 0.9856 0.3448 0.9856 0.9927
No log 8.5098 434 1.0342 0.3381 1.0342 1.0169
No log 8.5490 436 1.1160 0.3697 1.1160 1.0564
No log 8.5882 438 1.0927 0.3881 1.0927 1.0453
No log 8.6275 440 0.9779 0.3338 0.9779 0.9889
No log 8.6667 442 0.9421 0.3652 0.9421 0.9706
No log 8.7059 444 0.9524 0.3652 0.9524 0.9759
No log 8.7451 446 0.9575 0.3113 0.9575 0.9785
No log 8.7843 448 1.0446 0.3033 1.0446 1.0221
No log 8.8235 450 1.1440 0.3908 1.1440 1.0696
No log 8.8627 452 1.1081 0.3739 1.1081 1.0527
No log 8.9020 454 1.0070 0.2864 1.0070 1.0035
No log 8.9412 456 0.9700 0.3346 0.9700 0.9849
No log 8.9804 458 0.9680 0.3541 0.9680 0.9839
No log 9.0196 460 0.9682 0.4180 0.9682 0.9840
No log 9.0588 462 0.9936 0.3230 0.9936 0.9968
No log 9.0980 464 1.0352 0.3128 1.0352 1.0174
No log 9.1373 466 1.0974 0.3373 1.0974 1.0476
No log 9.1765 468 1.1127 0.3415 1.1127 1.0549
No log 9.2157 470 1.0880 0.3601 1.0880 1.0431
No log 9.2549 472 1.0013 0.3529 1.0013 1.0006
No log 9.2941 474 0.9546 0.4356 0.9546 0.9770
No log 9.3333 476 0.9487 0.3983 0.9487 0.9740
No log 9.3725 478 0.9534 0.4521 0.9534 0.9764
No log 9.4118 480 0.9739 0.3862 0.9739 0.9868
No log 9.4510 482 1.0215 0.3338 1.0215 1.0107
No log 9.4902 484 0.9948 0.3529 0.9948 0.9974
No log 9.5294 486 0.9667 0.3771 0.9667 0.9832
No log 9.5686 488 0.9740 0.3719 0.9740 0.9869
No log 9.6078 490 0.9803 0.3411 0.9803 0.9901
No log 9.6471 492 0.9932 0.3062 0.9932 0.9966
No log 9.6863 494 1.0865 0.3653 1.0865 1.0424
No log 9.7255 496 1.2329 0.3340 1.2329 1.1104
No log 9.7647 498 1.2942 0.2755 1.2942 1.1376
0.3575 9.8039 500 1.1774 0.3574 1.1774 1.0851
0.3575 9.8431 502 1.0629 0.4012 1.0629 1.0310
0.3575 9.8824 504 0.9794 0.3859 0.9794 0.9896
0.3575 9.9216 506 0.9645 0.3908 0.9645 0.9821
0.3575 9.9608 508 0.9563 0.4197 0.9563 0.9779
0.3575 10.0 510 0.9488 0.3681 0.9488 0.9740
0.3575 10.0392 512 0.9671 0.3723 0.9671 0.9834
0.3575 10.0784 514 0.9686 0.4059 0.9686 0.9842
0.3575 10.1176 516 0.9928 0.3859 0.9928 0.9964
0.3575 10.1569 518 0.9865 0.3859 0.9865 0.9932
0.3575 10.1961 520 0.9709 0.4098 0.9709 0.9853
0.3575 10.2353 522 0.9616 0.3814 0.9616 0.9806
0.3575 10.2745 524 0.9721 0.3814 0.9721 0.9860
0.3575 10.3137 526 0.9757 0.4098 0.9757 0.9878
0.3575 10.3529 528 1.0100 0.3935 1.0100 1.0050
0.3575 10.3922 530 1.0566 0.3484 1.0566 1.0279
0.3575 10.4314 532 1.0171 0.3972 1.0171 1.0085
0.3575 10.4706 534 0.9532 0.3723 0.9532 0.9763
0.3575 10.5098 536 0.9390 0.4158 0.9390 0.9690
0.3575 10.5490 538 0.9784 0.4500 0.9784 0.9892
0.3575 10.5882 540 1.0740 0.3613 1.0740 1.0363
0.3575 10.6275 542 1.1140 0.3307 1.1140 1.0555
0.3575 10.6667 544 1.0851 0.3237 1.0851 1.0417
0.3575 10.7059 546 1.0597 0.3798 1.0597 1.0294
0.3575 10.7451 548 1.0196 0.3373 1.0196 1.0097
0.3575 10.7843 550 0.9688 0.3443 0.9688 0.9843
0.3575 10.8235 552 0.9577 0.4218 0.9577 0.9786
0.3575 10.8627 554 0.9622 0.5042 0.9622 0.9809
0.3575 10.9020 556 1.0330 0.4589 1.0330 1.0164
0.3575 10.9412 558 1.2004 0.3121 1.2004 1.0956
0.3575 10.9804 560 1.2489 0.3022 1.2489 1.1175
0.3575 11.0196 562 1.1417 0.2984 1.1417 1.0685
0.3575 11.0588 564 1.0349 0.3709 1.0349 1.0173
0.3575 11.0980 566 0.9973 0.2764 0.9973 0.9986
0.3575 11.1373 568 1.0012 0.2914 1.0012 1.0006
0.3575 11.1765 570 1.0389 0.3059 1.0389 1.0193
0.3575 11.2157 572 1.0633 0.2870 1.0633 1.0311
0.3575 11.2549 574 1.1178 0.3298 1.1178 1.0573

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (F32 tensors, Safetensors format)
