ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0245
  • Qwk: 0.3317
  • Mse: 1.0245
  • Rmse: 1.0122
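Qwk here is presumably Cohen's quadratically weighted kappa (the standard agreement metric for ordinal essay scores), and Rmse is simply the square root of Mse (1.0122 ≈ √1.0245). A minimal sketch of how these metrics could be computed, assuming integer score labels and scikit-learn; the function name is illustrative, not from the training code:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Return (qwk, mse, rmse) for integer score predictions."""
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return qwk, mse, float(np.sqrt(mse))

# Perfect agreement gives qwk = 1.0, mse = rmse = 0.0
qwk, mse, rmse = eval_metrics([0, 1, 2, 3], [0, 1, 2, 3])
```

Quadratic weighting penalizes a prediction that is two score points off four times as heavily as one that is a single point off, which is why Qwk and Mse do not always move together in the log below.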

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
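With lr_scheduler_type: linear and no warmup listed, the learning rate presumably decays linearly from 2e-05 to 0 over training (the Transformers default linear schedule with zero warmup steps). A minimal sketch of that schedule; `total_steps` is a hypothetical placeholder, not a value from this run:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear warmup (if any) then linear decay from base_lr to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Halfway through training the rate has halved: linear_lr(500, 1000) -> 1e-05
```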

Training results

Evaluation ran every 2 steps; the training loss was only logged every 500 steps, so the Training Loss column reads "No log" until step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0392 2 4.7595 0.0010 4.7595 2.1816
No log 0.0784 4 2.8432 -0.0321 2.8432 1.6862
No log 0.1176 6 1.8787 0.0198 1.8787 1.3706
No log 0.1569 8 1.4779 -0.0074 1.4779 1.2157
No log 0.1961 10 1.4713 -0.0741 1.4713 1.2130
No log 0.2353 12 1.4016 0.0247 1.4016 1.1839
No log 0.2745 14 1.3291 0.1643 1.3291 1.1529
No log 0.3137 16 1.5252 0.0169 1.5252 1.2350
No log 0.3529 18 1.6458 0.0317 1.6458 1.2829
No log 0.3922 20 1.1507 0.2740 1.1507 1.0727
No log 0.4314 22 1.2260 0.1448 1.2260 1.1073
No log 0.4706 24 1.4160 0.0587 1.4160 1.1899
No log 0.5098 26 1.2604 0.1448 1.2604 1.1227
No log 0.5490 28 1.2184 0.1848 1.2184 1.1038
No log 0.5882 30 1.4695 0.2058 1.4695 1.2122
No log 0.6275 32 2.0556 -0.0335 2.0556 1.4337
No log 0.6667 34 2.8162 0.0332 2.8162 1.6781
No log 0.7059 36 3.3867 0.0240 3.3867 1.8403
No log 0.7451 38 2.7699 0.0610 2.7699 1.6643
No log 0.7843 40 2.2447 0.1389 2.2447 1.4982
No log 0.8235 42 2.0673 0.2390 2.0673 1.4378
No log 0.8627 44 1.6405 0.0925 1.6405 1.2808
No log 0.9020 46 1.2470 0.2678 1.2470 1.1167
No log 0.9412 48 1.1505 0.1999 1.1505 1.0726
No log 0.9804 50 1.0864 0.2684 1.0864 1.0423
No log 1.0196 52 1.0924 0.2684 1.0924 1.0452
No log 1.0588 54 1.1446 0.2095 1.1446 1.0699
No log 1.0980 56 1.2434 0.1904 1.2434 1.1151
No log 1.1373 58 1.2905 0.2473 1.2905 1.1360
No log 1.1765 60 1.3253 0.2411 1.3253 1.1512
No log 1.2157 62 1.5638 0.2554 1.5638 1.2505
No log 1.2549 64 1.6091 0.1730 1.6091 1.2685
No log 1.2941 66 1.6146 0.2154 1.6146 1.2707
No log 1.3333 68 1.4999 0.1615 1.4999 1.2247
No log 1.3725 70 1.2729 0.1748 1.2729 1.1282
No log 1.4118 72 1.2330 0.1943 1.2330 1.1104
No log 1.4510 74 1.3983 0.2634 1.3983 1.1825
No log 1.4902 76 1.7567 0.1952 1.7567 1.3254
No log 1.5294 78 1.6850 0.2057 1.6850 1.2981
No log 1.5686 80 1.3226 0.2438 1.3226 1.1500
No log 1.6078 82 1.1073 0.3536 1.1073 1.0523
No log 1.6471 84 1.0283 0.3830 1.0283 1.0140
No log 1.6863 86 0.9352 0.3059 0.9352 0.9671
No log 1.7255 88 0.9797 0.3365 0.9797 0.9898
No log 1.7647 90 1.0432 0.3744 1.0432 1.0214
No log 1.8039 92 1.1862 0.3277 1.1862 1.0891
No log 1.8431 94 1.5212 0.3109 1.5212 1.2334
No log 1.8824 96 1.7756 0.2401 1.7756 1.3325
No log 1.9216 98 1.6635 0.2565 1.6635 1.2898
No log 1.9608 100 1.4495 0.3581 1.4495 1.2040
No log 2.0 102 1.3491 0.3748 1.3491 1.1615
No log 2.0392 104 1.1650 0.3660 1.1650 1.0794
No log 2.0784 106 1.1428 0.3117 1.1428 1.0690
No log 2.1176 108 1.0567 0.3330 1.0567 1.0280
No log 2.1569 110 1.0925 0.3188 1.0925 1.0452
No log 2.1961 112 1.2070 0.1851 1.2070 1.0986
No log 2.2353 114 1.0862 0.2797 1.0862 1.0422
No log 2.2745 116 0.9909 0.3699 0.9909 0.9954
No log 2.3137 118 1.1116 0.2532 1.1116 1.0543
No log 2.3529 120 1.2674 0.2362 1.2674 1.1258
No log 2.3922 122 1.1998 0.2799 1.1998 1.0953
No log 2.4314 124 1.0262 0.3696 1.0262 1.0130
No log 2.4706 126 0.9120 0.4136 0.9120 0.9550
No log 2.5098 128 0.9730 0.4063 0.9730 0.9864
No log 2.5490 130 1.3177 0.3549 1.3177 1.1479
No log 2.5882 132 1.4446 0.2814 1.4446 1.2019
No log 2.6275 134 1.4002 0.2929 1.4002 1.1833
No log 2.6667 136 1.2614 0.2823 1.2614 1.1231
No log 2.7059 138 1.2108 0.2366 1.2108 1.1004
No log 2.7451 140 0.9545 0.4475 0.9545 0.9770
No log 2.7843 142 0.8952 0.4879 0.8952 0.9461
No log 2.8235 144 0.8983 0.4929 0.8983 0.9478
No log 2.8627 146 1.0413 0.3471 1.0413 1.0204
No log 2.9020 148 1.0837 0.3601 1.0837 1.0410
No log 2.9412 150 1.2474 0.3148 1.2474 1.1169
No log 2.9804 152 1.1952 0.2784 1.1952 1.0932
No log 3.0196 154 0.9224 0.4604 0.9224 0.9604
No log 3.0588 156 0.7861 0.4220 0.7861 0.8866
No log 3.0980 158 0.8693 0.3822 0.8693 0.9324
No log 3.1373 160 0.8926 0.4116 0.8926 0.9448
No log 3.1765 162 0.8522 0.4737 0.8522 0.9232
No log 3.2157 164 0.8627 0.4465 0.8627 0.9288
No log 3.2549 166 0.8275 0.4772 0.8275 0.9097
No log 3.2941 168 0.7885 0.4960 0.7885 0.8880
No log 3.3333 170 0.7906 0.5061 0.7906 0.8892
No log 3.3725 172 0.8219 0.5061 0.8219 0.9066
No log 3.4118 174 0.8661 0.4527 0.8661 0.9306
No log 3.4510 176 0.8817 0.4429 0.8817 0.9390
No log 3.4902 178 1.0053 0.3933 1.0053 1.0026
No log 3.5294 180 1.3627 0.3148 1.3627 1.1674
No log 3.5686 182 1.6634 0.3485 1.6634 1.2897
No log 3.6078 184 1.5442 0.3679 1.5442 1.2426
No log 3.6471 186 1.3908 0.3852 1.3908 1.1793
No log 3.6863 188 1.0187 0.4030 1.0187 1.0093
No log 3.7255 190 0.8718 0.4404 0.8718 0.9337
No log 3.7647 192 0.8832 0.4371 0.8832 0.9398
No log 3.8039 194 0.9647 0.3519 0.9647 0.9822
No log 3.8431 196 0.9743 0.4175 0.9743 0.9871
No log 3.8824 198 0.9850 0.4175 0.9850 0.9925
No log 3.9216 200 1.0933 0.3697 1.0933 1.0456
No log 3.9608 202 1.1444 0.3024 1.1444 1.0697
No log 4.0 204 1.1056 0.3872 1.1056 1.0515
No log 4.0392 206 0.9949 0.4212 0.9949 0.9975
No log 4.0784 208 0.9371 0.4098 0.9371 0.9680
No log 4.1176 210 0.9695 0.5050 0.9695 0.9846
No log 4.1569 212 1.0272 0.4906 1.0272 1.0135
No log 4.1961 214 0.9283 0.4735 0.9283 0.9635
No log 4.2353 216 0.8563 0.4271 0.8563 0.9254
No log 4.2745 218 0.7812 0.5044 0.7812 0.8838
No log 4.3137 220 0.7874 0.4813 0.7874 0.8873
No log 4.3529 222 0.8547 0.4039 0.8547 0.9245
No log 4.3922 224 0.9080 0.4347 0.9080 0.9529
No log 4.4314 226 0.8978 0.4570 0.8978 0.9475
No log 4.4706 228 0.9785 0.4347 0.9785 0.9892
No log 4.5098 230 0.8680 0.4590 0.8680 0.9317
No log 4.5490 232 0.8694 0.4590 0.8694 0.9324
No log 4.5882 234 0.9986 0.3696 0.9986 0.9993
No log 4.6275 236 1.0136 0.3696 1.0136 1.0068
No log 4.6667 238 0.8919 0.4273 0.8919 0.9444
No log 4.7059 240 0.8568 0.4273 0.8568 0.9257
No log 4.7451 242 0.8042 0.4780 0.8042 0.8968
No log 4.7843 244 0.8114 0.5421 0.8114 0.9008
No log 4.8235 246 0.8023 0.5351 0.8023 0.8957
No log 4.8627 248 0.9584 0.4537 0.9584 0.9790
No log 4.9020 250 1.1512 0.4310 1.1512 1.0729
No log 4.9412 252 1.0358 0.4387 1.0358 1.0177
No log 4.9804 254 0.8653 0.3947 0.8653 0.9302
No log 5.0196 256 0.7989 0.4120 0.7989 0.8938
No log 5.0588 258 0.8111 0.4258 0.8111 0.9006
No log 5.0980 260 0.9371 0.3930 0.9371 0.9681
No log 5.1373 262 1.0573 0.4272 1.0573 1.0282
No log 5.1765 264 1.0232 0.4387 1.0232 1.0116
No log 5.2157 266 0.8924 0.4774 0.8924 0.9447
No log 5.2549 268 0.8430 0.4686 0.8430 0.9181
No log 5.2941 270 0.8914 0.4976 0.8914 0.9442
No log 5.3333 272 0.9300 0.4932 0.9300 0.9644
No log 5.3725 274 0.9334 0.4406 0.9334 0.9661
No log 5.4118 276 0.8767 0.4235 0.8767 0.9363
No log 5.4510 278 0.8823 0.3855 0.8823 0.9393
No log 5.4902 280 0.8462 0.4019 0.8462 0.9199
No log 5.5294 282 0.8536 0.4280 0.8536 0.9239
No log 5.5686 284 0.8511 0.4321 0.8511 0.9225
No log 5.6078 286 0.8951 0.3855 0.8951 0.9461
No log 5.6471 288 0.9930 0.3774 0.9930 0.9965
No log 5.6863 290 0.9548 0.3699 0.9548 0.9772
No log 5.7255 292 0.8515 0.4118 0.8515 0.9228
No log 5.7647 294 0.8362 0.5028 0.8362 0.9144
No log 5.8039 296 0.8375 0.4927 0.8375 0.9152
No log 5.8431 298 0.9116 0.3418 0.9116 0.9548
No log 5.8824 300 1.0279 0.3590 1.0279 1.0139
No log 5.9216 302 0.9909 0.3340 0.9909 0.9955
No log 5.9608 304 0.8735 0.3457 0.8735 0.9346
No log 6.0 306 0.8446 0.3714 0.8446 0.9190
No log 6.0392 308 0.9083 0.4154 0.9083 0.9530
No log 6.0784 310 1.0631 0.3986 1.0631 1.0311
No log 6.1176 312 1.0938 0.3977 1.0938 1.0458
No log 6.1569 314 0.9595 0.5102 0.9595 0.9795
No log 6.1961 316 0.8179 0.4491 0.8179 0.9044
No log 6.2353 318 0.8166 0.4356 0.8166 0.9037
No log 6.2745 320 0.8475 0.3723 0.8475 0.9206
No log 6.3137 322 0.9533 0.3083 0.9533 0.9764
No log 6.3529 324 1.0236 0.3340 1.0236 1.0117
No log 6.3922 326 0.9687 0.2984 0.9687 0.9842
No log 6.4314 328 0.8917 0.3411 0.8917 0.9443
No log 6.4706 330 0.8963 0.3873 0.8963 0.9467
No log 6.5098 332 0.9395 0.3157 0.9395 0.9693
No log 6.5490 334 1.0195 0.2395 1.0195 1.0097
No log 6.5882 336 0.9853 0.2687 0.9853 0.9926
No log 6.6275 338 0.9386 0.3411 0.9386 0.9688
No log 6.6667 340 0.9008 0.4036 0.9008 0.9491
No log 6.7059 342 0.8921 0.3974 0.8921 0.9445
No log 6.7451 344 0.9168 0.3411 0.9168 0.9575
No log 6.7843 346 0.9406 0.3824 0.9406 0.9698
No log 6.8235 348 0.9233 0.3737 0.9233 0.9609
No log 6.8627 350 0.9306 0.4214 0.9306 0.9647
No log 6.9020 352 0.9956 0.4096 0.9956 0.9978
No log 6.9412 354 1.1019 0.3491 1.1019 1.0497
No log 6.9804 356 1.1467 0.3772 1.1467 1.0709
No log 7.0196 358 1.0408 0.3525 1.0408 1.0202
No log 7.0588 360 0.9269 0.3529 0.9269 0.9627
No log 7.0980 362 0.9534 0.3302 0.9534 0.9764
No log 7.1373 364 0.9720 0.3302 0.9720 0.9859
No log 7.1765 366 1.0097 0.3302 1.0097 1.0049
No log 7.2157 368 1.0820 0.3056 1.0820 1.0402
No log 7.2549 370 1.0757 0.3056 1.0757 1.0372
No log 7.2941 372 1.0226 0.3365 1.0226 1.0112
No log 7.3333 374 0.9901 0.3699 0.9901 0.9951
No log 7.3725 376 1.0341 0.3144 1.0341 1.0169
No log 7.4118 378 1.1479 0.2940 1.1479 1.0714
No log 7.4510 380 1.3273 0.3058 1.3273 1.1521
No log 7.4902 382 1.3626 0.3736 1.3626 1.1673
No log 7.5294 384 1.1758 0.3259 1.1758 1.0844
No log 7.5686 386 0.9460 0.4063 0.9460 0.9726
No log 7.6078 388 0.8630 0.4233 0.8630 0.9290
No log 7.6471 390 0.8661 0.4328 0.8661 0.9307
No log 7.6863 392 0.8944 0.3862 0.8944 0.9458
No log 7.7255 394 0.9783 0.3462 0.9783 0.9891
No log 7.7647 396 1.0941 0.2451 1.0941 1.0460
No log 7.8039 398 1.1505 0.2056 1.1505 1.0726
No log 7.8431 400 1.1231 0.2049 1.1231 1.0598
No log 7.8824 402 1.0353 0.3210 1.0353 1.0175
No log 7.9216 404 1.0163 0.2915 1.0163 1.0081
No log 7.9608 406 0.9984 0.2915 0.9984 0.9992
No log 8.0 408 0.9875 0.3583 0.9875 0.9937
No log 8.0392 410 1.0069 0.3373 1.0069 1.0035
No log 8.0784 412 1.1198 0.2100 1.1198 1.0582
No log 8.1176 414 1.0719 0.2844 1.0719 1.0353
No log 8.1569 416 0.9624 0.3462 0.9624 0.9810
No log 8.1961 418 0.9241 0.3602 0.9241 0.9613
No log 8.2353 420 0.8886 0.3558 0.8886 0.9427
No log 8.2745 422 0.8882 0.3457 0.8882 0.9425
No log 8.3137 424 0.8979 0.4080 0.8979 0.9476
No log 8.3529 426 0.9023 0.4023 0.9023 0.9499
No log 8.3922 428 0.9119 0.3930 0.9119 0.9549
No log 8.4314 430 0.9299 0.3930 0.9299 0.9643
No log 8.4706 432 0.9480 0.4136 0.9480 0.9736
No log 8.5098 434 1.0167 0.4915 1.0167 1.0083
No log 8.5490 436 1.0561 0.4168 1.0561 1.0277
No log 8.5882 438 0.9893 0.4659 0.9893 0.9946
No log 8.6275 440 0.8493 0.3402 0.8493 0.9216
No log 8.6667 442 0.7993 0.4324 0.7993 0.8940
No log 8.7059 444 0.8219 0.4036 0.8219 0.9066
No log 8.7451 446 0.8191 0.3779 0.8191 0.9051
No log 8.7843 448 0.8176 0.4157 0.8176 0.9042
No log 8.8235 450 0.8848 0.3686 0.8848 0.9407
No log 8.8627 452 0.8855 0.3964 0.8855 0.9410
No log 8.9020 454 0.8021 0.3996 0.8021 0.8956
No log 8.9412 456 0.7649 0.4748 0.7649 0.8746
No log 8.9804 458 0.7664 0.4036 0.7664 0.8754
No log 9.0196 460 0.7686 0.4036 0.7686 0.8767
No log 9.0588 462 0.7625 0.4180 0.7625 0.8732
No log 9.0980 464 0.8233 0.4406 0.8233 0.9073
No log 9.1373 466 1.0165 0.5224 1.0165 1.0082
No log 9.1765 468 1.0575 0.4608 1.0575 1.0284
No log 9.2157 470 0.9839 0.4466 0.9839 0.9919
No log 9.2549 472 0.9813 0.3917 0.9813 0.9906
No log 9.2941 474 1.0277 0.3995 1.0277 1.0138
No log 9.3333 476 1.0082 0.3561 1.0082 1.0041
No log 9.3725 478 0.9791 0.3250 0.9791 0.9895
No log 9.4118 480 0.9317 0.3559 0.9317 0.9652
No log 9.4510 482 0.8738 0.3771 0.8738 0.9348
No log 9.4902 484 0.8751 0.3723 0.8751 0.9355
No log 9.5294 486 0.9250 0.3699 0.9250 0.9618
No log 9.5686 488 1.1091 0.2516 1.1091 1.0531
No log 9.6078 490 1.2225 0.2902 1.2225 1.1057
No log 9.6471 492 1.1705 0.2784 1.1705 1.0819
No log 9.6863 494 1.0071 0.3323 1.0071 1.0035
No log 9.7255 496 0.9519 0.3614 0.9519 0.9756
No log 9.7647 498 0.9412 0.3665 0.9412 0.9702
0.3683 9.8039 500 0.8886 0.3897 0.8886 0.9427
0.3683 9.8431 502 0.9257 0.4463 0.9257 0.9621
0.3683 9.8824 504 1.0434 0.3820 1.0434 1.0215
0.3683 9.9216 506 1.0397 0.3936 1.0397 1.0196
0.3683 9.9608 508 0.9030 0.3744 0.9030 0.9503
0.3683 10.0 510 0.8220 0.4671 0.8220 0.9067
0.3683 10.0392 512 0.8064 0.4534 0.8064 0.8980
0.3683 10.0784 514 0.8082 0.4534 0.8082 0.8990
0.3683 10.1176 516 0.8297 0.4368 0.8297 0.9109
0.3683 10.1569 518 0.8953 0.4735 0.8953 0.9462
0.3683 10.1961 520 0.9559 0.3868 0.9559 0.9777
0.3683 10.2353 522 0.9404 0.4412 0.9404 0.9697
0.3683 10.2745 524 0.9087 0.4563 0.9087 0.9532
0.3683 10.3137 526 0.9043 0.4155 0.9043 0.9510
0.3683 10.3529 528 0.9215 0.4023 0.9215 0.9600
0.3683 10.3922 530 0.9146 0.3798 0.9146 0.9564
0.3683 10.4314 532 0.9392 0.3741 0.9392 0.9691
0.3683 10.4706 534 0.9781 0.3415 0.9781 0.9890
0.3683 10.5098 536 1.0637 0.3648 1.0637 1.0314
0.3683 10.5490 538 1.0245 0.3317 1.0245 1.0122
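Note that the final checkpoint (loss 1.0245, Qwk 0.3317, reported above) is not the strongest row in this log: Qwk peaks at 0.5421 (epoch 4.7843, step 244) and validation loss bottoms out at 0.7625 (epoch 9.0588, step 462), so no best-model selection appears to have been applied. A small sketch of picking the best row, using a few entries copied from the table:

```python
# (epoch, step, validation loss, Qwk) — rows copied from the training log above
rows = [
    (2.7843, 142, 0.8952, 0.4879),
    (3.0588, 156, 0.7861, 0.4220),
    (4.7843, 244, 0.8114, 0.5421),
    (9.0588, 462, 0.7625, 0.4180),
    (10.5490, 538, 1.0245, 0.3317),  # final checkpoint / reported result
]

best_by_qwk = max(rows, key=lambda r: r[3])    # step 244
best_by_loss = min(rows, key=lambda r: r[2])   # step 462
```

Which row counts as "best" depends on the selection metric; for ordinal scoring tasks Qwk is usually the more meaningful criterion than raw validation loss.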

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32, safetensors)

Model tree: MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task2_organization, finetuned from aubmindlab/bert-base-arabertv02.