ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (listed as "None" in the auto-generated card). It achieves the following results on the evaluation set:

  • Loss: 1.6316
  • QWK (quadratic weighted kappa): 0.2923
  • MSE: 1.6316
  • RMSE: 1.2773
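The evaluation script is not published, but the reported metrics are standard. As a sketch, QWK can be computed with a minimal NumPy implementation on hypothetical ordinal labels (the score range and data below are illustrative, not from this model's evaluation set):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa for ordinal labels 0..n_classes-1."""
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    idx = np.arange(n_classes)
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected confusion matrix under chance agreement
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical gold scores and predictions on a 0-3 organization scale
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 2, 2, 3, 1, 1]

qwk = quadratic_weighted_kappa(y_true, y_pred, 4)
mse = np.mean((np.array(y_true) - np.array(y_pred)) ** 2)
rmse = np.sqrt(mse)  # RMSE is simply the square root of MSE, as in the table below
```

Note that MSE and RMSE here carry the same information (one is the square root of the other), which is why the Loss and Mse columns in the results table coincide when MSE is used as the training loss.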

Model description

More information needed

Intended uses & limitations

More information needed
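Although intended uses are undocumented, the checkpoint can in principle be loaded through the standard Transformers API. The sequence-classification head below is an assumption inferred from the regression-style metrics (MSE/RMSE); verify against the model config before relying on it:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Model id from this card; the head type is an assumption, not documented
model_id = ("MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
            "FineTuningAraBERT_run1_AugV5_k11_task1_organization")

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
```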

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
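The hyperparameters above map onto Transformers' `TrainingArguments` roughly as follows. This is a sketch, not the author's actual script: `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the library defaults:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./results",          # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and eps=1e-8 is the default optimizer
)
```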

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0377 2 6.8095 0.0057 6.8095 2.6095
No log 0.0755 4 4.5067 0.0402 4.5067 2.1229
No log 0.1132 6 2.9770 0.0732 2.9770 1.7254
No log 0.1509 8 2.3174 0.0432 2.3174 1.5223
No log 0.1887 10 2.4314 -0.0296 2.4314 1.5593
No log 0.2264 12 2.3898 0.0284 2.3898 1.5459
No log 0.2642 14 2.1585 0.1374 2.1585 1.4692
No log 0.3019 16 2.3575 -0.0429 2.3575 1.5354
No log 0.3396 18 2.5665 -0.0405 2.5665 1.6020
No log 0.3774 20 2.5439 0.0 2.5439 1.5950
No log 0.4151 22 2.2367 0.0741 2.2367 1.4956
No log 0.4528 24 2.0092 0.1930 2.0092 1.4174
No log 0.4906 26 2.0278 0.125 2.0278 1.4240
No log 0.5283 28 1.9261 0.0741 1.9261 1.3878
No log 0.5660 30 1.8443 0.1132 1.8443 1.3581
No log 0.6038 32 1.5541 0.1165 1.5541 1.2466
No log 0.6415 34 1.5605 0.0971 1.5605 1.2492
No log 0.6792 36 1.7001 0.1524 1.7001 1.3039
No log 0.7170 38 2.0308 0.1197 2.0308 1.4251
No log 0.7547 40 2.1665 0.1429 2.1665 1.4719
No log 0.7925 42 2.6036 0.0408 2.6036 1.6136
No log 0.8302 44 2.7032 0.0408 2.7032 1.6441
No log 0.8679 46 2.2744 0.0278 2.2744 1.5081
No log 0.9057 48 1.9199 0.2521 1.9199 1.3856
No log 0.9434 50 1.9245 0.1967 1.9245 1.3873
No log 0.9811 52 1.9233 0.2759 1.9233 1.3868
No log 1.0189 54 1.9287 0.2586 1.9287 1.3888
No log 1.0566 56 1.8368 0.2545 1.8368 1.3553
No log 1.0943 58 1.7244 0.2385 1.7244 1.3132
No log 1.1321 60 1.6254 0.1154 1.6254 1.2749
No log 1.1698 62 1.5925 0.0784 1.5925 1.2620
No log 1.2075 64 1.5674 0.0777 1.5674 1.2519
No log 1.2453 66 1.6335 0.2037 1.6335 1.2781
No log 1.2830 68 1.9269 0.3333 1.9269 1.3881
No log 1.3208 70 2.1307 0.1857 2.1307 1.4597
No log 1.3585 72 2.0361 0.2319 2.0361 1.4269
No log 1.3962 74 1.6535 0.25 1.6535 1.2859
No log 1.4340 76 1.5724 0.2056 1.5724 1.2540
No log 1.4717 78 1.6088 0.2182 1.6088 1.2684
No log 1.5094 80 1.6150 0.2655 1.6150 1.2708
No log 1.5472 82 1.6706 0.3333 1.6706 1.2925
No log 1.5849 84 1.5434 0.3220 1.5434 1.2423
No log 1.6226 86 1.3949 0.3036 1.3949 1.1811
No log 1.6604 88 1.4067 0.2364 1.4067 1.1861
No log 1.6981 90 1.5597 0.2407 1.5597 1.2489
No log 1.7358 92 1.4996 0.2883 1.4996 1.2246
No log 1.7736 94 1.5606 0.3243 1.5606 1.2493
No log 1.8113 96 1.5184 0.3509 1.5184 1.2322
No log 1.8491 98 1.4052 0.4127 1.4052 1.1854
No log 1.8868 100 1.2929 0.4 1.2929 1.1371
No log 1.9245 102 1.3078 0.3968 1.3078 1.1436
No log 1.9623 104 1.4414 0.3876 1.4414 1.2006
No log 2.0 106 1.4106 0.4154 1.4106 1.1877
No log 2.0377 108 1.3819 0.3279 1.3819 1.1755
No log 2.0755 110 1.5125 0.4062 1.5125 1.2298
No log 2.1132 112 1.7215 0.3817 1.7215 1.3120
No log 2.1509 114 1.8837 0.2628 1.8837 1.3725
No log 2.1887 116 1.8136 0.2774 1.8136 1.3467
No log 2.2264 118 1.8351 0.2290 1.8351 1.3547
No log 2.2642 120 1.8985 0.1111 1.8985 1.3779
No log 2.3019 122 1.7460 0.2623 1.7460 1.3214
No log 2.3396 124 1.5985 0.3008 1.5985 1.2643
No log 2.3774 126 1.5811 0.3511 1.5811 1.2574
No log 2.4151 128 1.5835 0.3934 1.5835 1.2584
No log 2.4528 130 1.7534 0.272 1.7534 1.3242
No log 2.4906 132 1.9236 0.1600 1.9236 1.3870
No log 2.5283 134 1.8790 0.1875 1.8790 1.3708
No log 2.5660 136 1.7117 0.2791 1.7117 1.3083
No log 2.6038 138 1.5725 0.3876 1.5725 1.2540
No log 2.6415 140 1.4080 0.4615 1.4080 1.1866
No log 2.6792 142 1.4148 0.4615 1.4148 1.1895
No log 2.7170 144 1.6392 0.3511 1.6392 1.2803
No log 2.7547 146 1.9793 0.1774 1.9793 1.4069
No log 2.7925 148 2.0455 0.1463 2.0455 1.4302
No log 2.8302 150 1.9427 0.1463 1.9427 1.3938
No log 2.8679 152 1.6837 0.2764 1.6837 1.2976
No log 2.9057 154 1.6468 0.3125 1.6468 1.2833
No log 2.9434 156 1.7570 0.2791 1.7570 1.3255
No log 2.9811 158 1.9090 0.1846 1.9090 1.3817
No log 3.0189 160 2.0011 0.0952 2.0011 1.4146
No log 3.0566 162 2.0347 0.0952 2.0347 1.4264
No log 3.0943 164 1.8923 0.1654 1.8923 1.3756
No log 3.1321 166 1.8160 0.2556 1.8160 1.3476
No log 3.1698 168 1.7109 0.3308 1.7109 1.3080
No log 3.2075 170 1.6030 0.3636 1.6030 1.2661
No log 3.2453 172 1.4172 0.4062 1.4172 1.1905
No log 3.2830 174 1.4376 0.4062 1.4376 1.1990
No log 3.3208 176 1.5359 0.3969 1.5359 1.2393
No log 3.3585 178 1.4600 0.4062 1.4600 1.2083
No log 3.3962 180 1.4447 0.3846 1.4447 1.2020
No log 3.4340 182 1.5342 0.3939 1.5342 1.2386
No log 3.4717 184 1.6377 0.2941 1.6377 1.2797
No log 3.5094 186 1.7213 0.2628 1.7213 1.3120
No log 3.5472 188 1.7131 0.2815 1.7131 1.3088
No log 3.5849 190 1.7327 0.2707 1.7327 1.3163
No log 3.6226 192 1.7165 0.3256 1.7165 1.3101
No log 3.6604 194 1.5895 0.3902 1.5895 1.2607
No log 3.6981 196 1.4951 0.3667 1.4951 1.2227
No log 3.7358 198 1.5471 0.3802 1.5471 1.2438
No log 3.7736 200 1.7067 0.2769 1.7067 1.3064
No log 3.8113 202 1.8596 0.2239 1.8596 1.3637
No log 3.8491 204 1.8933 0.2353 1.8933 1.3760
No log 3.8868 206 1.7368 0.2537 1.7368 1.3179
No log 3.9245 208 1.5995 0.4186 1.5995 1.2647
No log 3.9623 210 1.5679 0.4186 1.5679 1.2521
No log 4.0 212 1.5853 0.3876 1.5853 1.2591
No log 4.0377 214 1.5433 0.4186 1.5433 1.2423
No log 4.0755 216 1.5949 0.3817 1.5949 1.2629
No log 4.1132 218 1.6772 0.3566 1.6772 1.2951
No log 4.1509 220 1.6814 0.3538 1.6814 1.2967
No log 4.1887 222 1.6645 0.3511 1.6645 1.2902
No log 4.2264 224 1.6772 0.2901 1.6772 1.2951
No log 4.2642 226 1.5814 0.3594 1.5814 1.2576
No log 4.3019 228 1.5338 0.3817 1.5338 1.2385
No log 4.3396 230 1.5206 0.3636 1.5206 1.2331
No log 4.3774 232 1.5420 0.3359 1.5420 1.2418
No log 4.4151 234 1.5319 0.3817 1.5319 1.2377
No log 4.4528 236 1.5046 0.4031 1.5046 1.2266
No log 4.4906 238 1.4386 0.48 1.4386 1.1994
No log 4.5283 240 1.3696 0.4677 1.3696 1.1703
No log 4.5660 242 1.4263 0.48 1.4263 1.1943
No log 4.6038 244 1.4929 0.4 1.4929 1.2218
No log 4.6415 246 1.4888 0.4375 1.4888 1.2202
No log 4.6792 248 1.5269 0.3969 1.5269 1.2357
No log 4.7170 250 1.6447 0.3235 1.6447 1.2824
No log 4.7547 252 1.7771 0.2754 1.7771 1.3331
No log 4.7925 254 1.6973 0.3111 1.6973 1.3028
No log 4.8302 256 1.6047 0.3788 1.6047 1.2668
No log 4.8679 258 1.5200 0.4031 1.5200 1.2329
No log 4.9057 260 1.5526 0.3433 1.5526 1.2460
No log 4.9434 262 1.7192 0.2609 1.7192 1.3112
No log 4.9811 264 1.7448 0.2963 1.7448 1.3209
No log 5.0189 266 1.6230 0.3065 1.6230 1.2740
No log 5.0566 268 1.5852 0.2727 1.5852 1.2590
No log 5.0943 270 1.4331 0.3333 1.4331 1.1971
No log 5.1321 272 1.2899 0.5082 1.2899 1.1358
No log 5.1698 274 1.3418 0.4724 1.3418 1.1584
No log 5.2075 276 1.4407 0.4154 1.4407 1.2003
No log 5.2453 278 1.7054 0.2754 1.7054 1.3059
No log 5.2830 280 1.7492 0.2837 1.7492 1.3226
No log 5.3208 282 1.5512 0.3939 1.5512 1.2455
No log 5.3585 284 1.3923 0.4806 1.3923 1.1799
No log 5.3962 286 1.3607 0.4651 1.3607 1.1665
No log 5.4340 288 1.4934 0.3759 1.4934 1.2221
No log 5.4717 290 1.4645 0.4275 1.4645 1.2102
No log 5.5094 292 1.3608 0.4567 1.3608 1.1665
No log 5.5472 294 1.2700 0.4839 1.2700 1.1269
No log 5.5849 296 1.2028 0.4918 1.2028 1.0967
No log 5.6226 298 1.2556 0.5079 1.2556 1.1205
No log 5.6604 300 1.4599 0.3939 1.4599 1.2083
No log 5.6981 302 1.7018 0.3043 1.7018 1.3045
No log 5.7358 304 1.6425 0.3259 1.6425 1.2816
No log 5.7736 306 1.3928 0.4444 1.3928 1.1802
No log 5.8113 308 1.2751 0.5203 1.2751 1.1292
No log 5.8491 310 1.3127 0.4706 1.3127 1.1457
No log 5.8868 312 1.3999 0.4715 1.3999 1.1832
No log 5.9245 314 1.5559 0.4062 1.5559 1.2474
No log 5.9623 316 1.7334 0.3259 1.7334 1.3166
No log 6.0 318 1.8628 0.2464 1.8628 1.3648
No log 6.0377 320 1.8314 0.2464 1.8314 1.3533
No log 6.0755 322 1.6765 0.3721 1.6765 1.2948
No log 6.1132 324 1.4762 0.4918 1.4762 1.2150
No log 6.1509 326 1.3613 0.4754 1.3613 1.1668
No log 6.1887 328 1.3555 0.4640 1.3555 1.1643
No log 6.2264 330 1.5152 0.4496 1.5152 1.2309
No log 6.2642 332 1.7857 0.2482 1.7857 1.3363
No log 6.3019 334 1.7860 0.2482 1.7860 1.3364
No log 6.3396 336 1.6329 0.2985 1.6329 1.2778
No log 6.3774 338 1.4040 0.4567 1.4040 1.1849
No log 6.4151 340 1.2799 0.4516 1.2799 1.1313
No log 6.4528 342 1.2291 0.5041 1.2291 1.1087
No log 6.4906 344 1.2698 0.48 1.2698 1.1268
No log 6.5283 346 1.3919 0.4286 1.3919 1.1798
No log 6.5660 348 1.5988 0.3308 1.5988 1.2644
No log 6.6038 350 1.7181 0.2628 1.7181 1.3108
No log 6.6415 352 1.6313 0.2985 1.6313 1.2772
No log 6.6792 354 1.4253 0.4031 1.4253 1.1939
No log 6.7170 356 1.2214 0.4961 1.2214 1.1052
No log 6.7547 358 1.1805 0.5116 1.1805 1.0865
No log 6.7925 360 1.2532 0.4961 1.2532 1.1195
No log 6.8302 362 1.4629 0.3969 1.4629 1.2095
No log 6.8679 364 1.5646 0.3333 1.5646 1.2508
No log 6.9057 366 1.5846 0.4062 1.5846 1.2588
No log 6.9434 368 1.5690 0.3833 1.5690 1.2526
No log 6.9811 370 1.5320 0.4094 1.5320 1.2377
No log 7.0189 372 1.4506 0.3788 1.4506 1.2044
No log 7.0566 374 1.3959 0.4 1.3959 1.1815
No log 7.0943 376 1.5198 0.3759 1.5198 1.2328
No log 7.1321 378 1.6843 0.2941 1.6843 1.2978
No log 7.1698 380 1.7643 0.2920 1.7643 1.3283
No log 7.2075 382 1.7657 0.2920 1.7657 1.3288
No log 7.2453 384 1.7620 0.2920 1.7620 1.3274
No log 7.2830 386 1.6786 0.3433 1.6786 1.2956
No log 7.3208 388 1.4925 0.4186 1.4925 1.2217
No log 7.3585 390 1.3612 0.4286 1.3612 1.1667
No log 7.3962 392 1.3341 0.4194 1.3341 1.1550
No log 7.4340 394 1.3795 0.4286 1.3795 1.1745
No log 7.4717 396 1.4307 0.4186 1.4307 1.1961
No log 7.5094 398 1.5610 0.3206 1.5610 1.2494
No log 7.5472 400 1.6674 0.3158 1.6674 1.2913
No log 7.5849 402 1.6473 0.3158 1.6473 1.2835
No log 7.6226 404 1.5594 0.3158 1.5594 1.2487
No log 7.6604 406 1.4698 0.4186 1.4698 1.2123
No log 7.6981 408 1.3794 0.4094 1.3794 1.1745
No log 7.7358 410 1.3150 0.4480 1.3150 1.1468
No log 7.7736 412 1.3688 0.4286 1.3688 1.1700
No log 7.8113 414 1.4222 0.4062 1.4222 1.1926
No log 7.8491 416 1.4489 0.4186 1.4489 1.2037
No log 7.8868 418 1.4286 0.4186 1.4286 1.1953
No log 7.9245 420 1.3531 0.4480 1.3531 1.1632
No log 7.9623 422 1.3961 0.4590 1.3961 1.1816
No log 8.0 424 1.3988 0.4576 1.3988 1.1827
No log 8.0377 426 1.4272 0.4538 1.4272 1.1947
No log 8.0755 428 1.4231 0.4603 1.4231 1.1929
No log 8.1132 430 1.3868 0.4341 1.3868 1.1776
No log 8.1509 432 1.4111 0.4341 1.4111 1.1879
No log 8.1887 434 1.3653 0.4375 1.3653 1.1685
No log 8.2264 436 1.4025 0.4375 1.4025 1.1843
No log 8.2642 438 1.5540 0.4308 1.5540 1.2466
No log 8.3019 440 1.6843 0.3308 1.6843 1.2978
No log 8.3396 442 1.6687 0.3459 1.6687 1.2918
No log 8.3774 444 1.5223 0.4122 1.5223 1.2338
No log 8.4151 446 1.3357 0.5271 1.3357 1.1557
No log 8.4528 448 1.2831 0.5156 1.2831 1.1327
No log 8.4906 450 1.3044 0.4762 1.3044 1.1421
No log 8.5283 452 1.4021 0.4567 1.4021 1.1841
No log 8.5660 454 1.6046 0.3636 1.6046 1.2667
No log 8.6038 456 1.7886 0.2464 1.7886 1.3374
No log 8.6415 458 1.7450 0.2941 1.7450 1.3210
No log 8.6792 460 1.6881 0.3134 1.6881 1.2993
No log 8.7170 462 1.5707 0.3636 1.5707 1.2533
No log 8.7547 464 1.5029 0.4186 1.5029 1.2259
No log 8.7925 466 1.5226 0.4 1.5226 1.2339
No log 8.8302 468 1.5567 0.3664 1.5567 1.2477
No log 8.8679 470 1.4906 0.4186 1.4906 1.2209
No log 8.9057 472 1.3686 0.4651 1.3686 1.1699
No log 8.9434 474 1.3091 0.4961 1.3091 1.1441
No log 8.9811 476 1.3123 0.4961 1.3123 1.1455
No log 9.0189 478 1.3157 0.5077 1.3157 1.1470
No log 9.0566 480 1.3789 0.4923 1.3789 1.1743
No log 9.0943 482 1.4995 0.4651 1.4995 1.2246
No log 9.1321 484 1.6296 0.3664 1.6296 1.2766
No log 9.1698 486 1.6593 0.3664 1.6593 1.2881
No log 9.2075 488 1.6905 0.3259 1.6905 1.3002
No log 9.2453 490 1.6651 0.3433 1.6651 1.2904
No log 9.2830 492 1.5668 0.3433 1.5668 1.2517
No log 9.3208 494 1.5342 0.3433 1.5342 1.2386
No log 9.3585 496 1.5640 0.3609 1.5640 1.2506
No log 9.3962 498 1.6154 0.3609 1.6154 1.2710
0.4681 9.4340 500 1.6078 0.3101 1.6078 1.2680
0.4681 9.4717 502 1.4948 0.3387 1.4948 1.2226
0.4681 9.5094 504 1.4866 0.3607 1.4866 1.2193
0.4681 9.5472 506 1.5282 0.3415 1.5282 1.2362
0.4681 9.5849 508 1.4920 0.3438 1.4920 1.2215
0.4681 9.6226 510 1.6161 0.3582 1.6161 1.2712
0.4681 9.6604 512 1.7944 0.3308 1.7944 1.3395
0.4681 9.6981 514 1.8798 0.2462 1.8798 1.3711
0.4681 9.7358 516 1.8722 0.2344 1.8722 1.3683
0.4681 9.7736 518 1.7520 0.3016 1.7520 1.3236
0.4681 9.8113 520 1.5923 0.3651 1.5923 1.2619
0.4681 9.8491 522 1.5713 0.3780 1.5713 1.2535
0.4681 9.8868 524 1.5824 0.3817 1.5824 1.2579
0.4681 9.9245 526 1.6809 0.3134 1.6809 1.2965
0.4681 9.9623 528 1.7866 0.2121 1.7866 1.3366
0.4681 10.0 530 1.8830 0.1774 1.8830 1.3722
0.4681 10.0377 532 1.9083 0.2478 1.9083 1.3814
0.4681 10.0755 534 1.8219 0.3248 1.8219 1.3498
0.4681 10.1132 536 1.6842 0.3333 1.6842 1.2978
0.4681 10.1509 538 1.5251 0.4567 1.5251 1.2350
0.4681 10.1887 540 1.3461 0.4844 1.3461 1.1602
0.4681 10.2264 542 1.3041 0.4844 1.3041 1.1420
0.4681 10.2642 544 1.3956 0.4651 1.3956 1.1814
0.4681 10.3019 546 1.5720 0.3158 1.5720 1.2538
0.4681 10.3396 548 1.6309 0.2985 1.6309 1.2771
0.4681 10.3774 550 1.5297 0.3511 1.5297 1.2368
0.4681 10.4151 552 1.3877 0.4567 1.3877 1.1780
0.4681 10.4528 554 1.3228 0.4603 1.3228 1.1501
0.4681 10.4906 556 1.3950 0.4286 1.3950 1.1811
0.4681 10.5283 558 1.4976 0.3692 1.4976 1.2238
0.4681 10.5660 560 1.5764 0.3692 1.5764 1.2555
0.4681 10.6038 562 1.6483 0.3411 1.6483 1.2839
0.4681 10.6415 564 1.7638 0.2687 1.7638 1.3281
0.4681 10.6792 566 1.7079 0.2946 1.7079 1.3069
0.4681 10.7170 568 1.6821 0.2992 1.6821 1.2970
0.4681 10.7547 570 1.6594 0.3333 1.6594 1.2882
0.4681 10.7925 572 1.6584 0.375 1.6584 1.2878
0.4681 10.8302 574 1.5861 0.375 1.5861 1.2594
0.4681 10.8679 576 1.5852 0.375 1.5852 1.2591
0.4681 10.9057 578 1.6815 0.2946 1.6815 1.2967
0.4681 10.9434 580 1.7675 0.2769 1.7675 1.3295
0.4681 10.9811 582 1.7241 0.2769 1.7241 1.3130
0.4681 11.0189 584 1.7159 0.2769 1.7159 1.3099
0.4681 11.0566 586 1.6316 0.2923 1.6316 1.2773

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32 tensors, Safetensors format)
