ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k20_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8053
  • Qwk (quadratic weighted kappa): 0.3708
  • Mse (mean squared error): 0.8053
  • Rmse (root mean squared error): 0.8974
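For reference, Qwk is Cohen's kappa with quadratic weights, and Rmse is the square root of Mse (Loss and Mse coincide because the model is trained with an MSE objective). A minimal pure-Python sketch of both metrics (the function names are illustrative, not part of the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Observed confusion matrix of integer labels.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_t = [sum(row) for row in O]                                # true-label marginals
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2                 # quadratic penalty
            expected = hist_t[i] * hist_p[j] / n                    # chance agreement
            num += w * O[i][j]
            den += w * expected
    return 1 - num / den

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement yields a kappa of 1.0, chance-level agreement 0.0, and systematic maximal disagreement approaches -1.0.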

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
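With lr_scheduler_type: linear and no warmup steps reported, the learning rate decays linearly from 2e-05 toward 0 over the scheduled training steps. A small pure-Python sketch of that schedule (mirroring what a linear warmup-then-decay scheduler computes; the zero-warmup default is an assumption):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Learning rate at `step`: linear warmup (none assumed here),
    then linear decay from base_lr to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, halfway through training the rate is half the base rate, and it reaches exactly 0 at the final scheduled step.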

Training results

Training loss is logged every 500 steps; "No log" marks evaluation steps before the first logging event at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0328 2 4.7482 0.0018 4.7482 2.1790
No log 0.0656 4 2.9283 -0.0029 2.9283 1.7112
No log 0.0984 6 2.0087 0.0062 2.0087 1.4173
No log 0.1311 8 1.6917 0.0062 1.6917 1.3006
No log 0.1639 10 1.4433 -0.0143 1.4433 1.2014
No log 0.1967 12 1.2982 0.0509 1.2982 1.1394
No log 0.2295 14 1.3475 0.0362 1.3475 1.1608
No log 0.2623 16 1.4387 -0.0149 1.4387 1.1994
No log 0.2951 18 1.3303 0.0169 1.3303 1.1534
No log 0.3279 20 1.2713 0.0600 1.2713 1.1275
No log 0.3607 22 1.1749 0.2632 1.1749 1.0840
No log 0.3934 24 1.1438 0.3250 1.1438 1.0695
No log 0.4262 26 1.1708 0.2485 1.1708 1.0820
No log 0.4590 28 1.3060 0.0512 1.3060 1.1428
No log 0.4918 30 1.6309 0.0403 1.6309 1.2771
No log 0.5246 32 1.8393 0.0 1.8393 1.3562
No log 0.5574 34 1.7742 0.0 1.7742 1.3320
No log 0.5902 36 1.4105 0.0811 1.4105 1.1877
No log 0.6230 38 1.1393 0.3140 1.1393 1.0674
No log 0.6557 40 1.0924 0.2674 1.0924 1.0452
No log 0.6885 42 1.1492 0.3095 1.1492 1.0720
No log 0.7213 44 1.6081 0.1051 1.6081 1.2681
No log 0.7541 46 1.8438 0.0284 1.8438 1.3579
No log 0.7869 48 1.5327 0.0697 1.5327 1.2380
No log 0.8197 50 1.2531 0.2053 1.2531 1.1194
No log 0.8525 52 1.2827 0.1110 1.2827 1.1326
No log 0.8852 54 1.2907 0.0499 1.2907 1.1361
No log 0.9180 56 1.2109 0.2142 1.2109 1.1004
No log 0.9508 58 1.1372 0.2678 1.1372 1.0664
No log 0.9836 60 1.1162 0.2733 1.1162 1.0565
No log 1.0164 62 1.2270 0.1587 1.2270 1.1077
No log 1.0492 64 1.2791 0.1076 1.2791 1.1310
No log 1.0820 66 1.1667 0.1587 1.1667 1.0801
No log 1.1148 68 1.0519 0.4702 1.0519 1.0256
No log 1.1475 70 1.0999 0.2678 1.0999 1.0488
No log 1.1803 72 1.1728 0.2212 1.1728 1.0829
No log 1.2131 74 1.0526 0.3965 1.0526 1.0260
No log 1.2459 76 1.0432 0.3737 1.0432 1.0214
No log 1.2787 78 1.1253 0.3083 1.1253 1.0608
No log 1.3115 80 1.0927 0.3035 1.0927 1.0453
No log 1.3443 82 0.9657 0.4810 0.9657 0.9827
No log 1.3770 84 0.9527 0.2915 0.9527 0.9760
No log 1.4098 86 0.8974 0.3663 0.8974 0.9473
No log 1.4426 88 0.8519 0.4008 0.8519 0.9230
No log 1.4754 90 0.9000 0.4854 0.9000 0.9487
No log 1.5082 92 1.0238 0.4570 1.0238 1.0118
No log 1.5410 94 0.9190 0.4606 0.9190 0.9587
No log 1.5738 96 0.7963 0.4977 0.7963 0.8924
No log 1.6066 98 0.8162 0.4681 0.8162 0.9035
No log 1.6393 100 0.8281 0.4197 0.8281 0.9100
No log 1.6721 102 1.0972 0.4478 1.0972 1.0475
No log 1.7049 104 1.3827 0.2442 1.3827 1.1759
No log 1.7377 106 1.2000 0.3939 1.2000 1.0955
No log 1.7705 108 0.8257 0.4334 0.8257 0.9087
No log 1.8033 110 0.8221 0.5727 0.8221 0.9067
No log 1.8361 112 1.0023 0.4453 1.0023 1.0011
No log 1.8689 114 0.9302 0.5507 0.9302 0.9645
No log 1.9016 116 0.8048 0.4801 0.8048 0.8971
No log 1.9344 118 0.9331 0.4507 0.9331 0.9660
No log 1.9672 120 1.0038 0.5023 1.0038 1.0019
No log 2.0 122 0.9260 0.4570 0.9260 0.9623
No log 2.0328 124 0.8647 0.4734 0.8647 0.9299
No log 2.0656 126 1.0283 0.4430 1.0283 1.0141
No log 2.0984 128 1.0325 0.4221 1.0325 1.0161
No log 2.1311 130 0.9151 0.3275 0.9151 0.9566
No log 2.1639 132 0.9018 0.3902 0.9018 0.9496
No log 2.1967 134 0.9214 0.3902 0.9214 0.9599
No log 2.2295 136 0.9546 0.3427 0.9546 0.9770
No log 2.2623 138 0.9541 0.3779 0.9541 0.9768
No log 2.2951 140 1.0398 0.4048 1.0398 1.0197
No log 2.3279 142 1.1541 0.4186 1.1541 1.0743
No log 2.3607 144 1.0537 0.4219 1.0537 1.0265
No log 2.3934 146 0.9063 0.3896 0.9063 0.9520
No log 2.4262 148 0.8793 0.4079 0.8793 0.9377
No log 2.4590 150 0.9073 0.4164 0.9073 0.9525
No log 2.4918 152 0.9213 0.3891 0.9213 0.9599
No log 2.5246 154 0.8821 0.4302 0.8821 0.9392
No log 2.5574 156 0.8805 0.3660 0.8805 0.9383
No log 2.5902 158 0.8873 0.3291 0.8873 0.9419
No log 2.6230 160 0.8519 0.4142 0.8519 0.9230
No log 2.6557 162 0.8562 0.4002 0.8562 0.9253
No log 2.6885 164 0.8441 0.4220 0.8441 0.9187
No log 2.7213 166 0.9223 0.3646 0.9223 0.9604
No log 2.7541 168 1.1034 0.3822 1.1034 1.0504
No log 2.7869 170 0.9457 0.4175 0.9457 0.9724
No log 2.8197 172 0.8388 0.4628 0.8388 0.9159
No log 2.8525 174 0.9675 0.5507 0.9675 0.9836
No log 2.8852 176 1.1349 0.4673 1.1349 1.0653
No log 2.9180 178 1.2704 0.3839 1.2704 1.1271
No log 2.9508 180 1.1118 0.4516 1.1118 1.0544
No log 2.9836 182 0.8981 0.4644 0.8981 0.9477
No log 3.0164 184 0.9324 0.3806 0.9324 0.9656
No log 3.0492 186 0.9635 0.3709 0.9635 0.9816
No log 3.0820 188 0.8883 0.4054 0.8883 0.9425
No log 3.1148 190 0.8533 0.4780 0.8533 0.9237
No log 3.1475 192 0.8804 0.4439 0.8804 0.9383
No log 3.1803 194 0.9202 0.4302 0.9202 0.9593
No log 3.2131 196 0.8745 0.4300 0.8745 0.9352
No log 3.2459 198 0.8344 0.4845 0.8344 0.9134
No log 3.2787 200 0.8038 0.4220 0.8038 0.8966
No log 3.3115 202 0.8058 0.4158 0.8058 0.8976
No log 3.3443 204 0.8237 0.4598 0.8237 0.9076
No log 3.3770 206 0.8085 0.4527 0.8085 0.8992
No log 3.4098 208 0.8146 0.5080 0.8146 0.9026
No log 3.4426 210 0.9042 0.4402 0.9042 0.9509
No log 3.4754 212 0.9019 0.4402 0.9019 0.9497
No log 3.5082 214 0.8010 0.5244 0.8010 0.8950
No log 3.5410 216 0.8667 0.4379 0.8667 0.9310
No log 3.5738 218 0.9052 0.4537 0.9052 0.9514
No log 3.6066 220 0.9360 0.3955 0.9360 0.9675
No log 3.6393 222 0.8311 0.4363 0.8311 0.9116
No log 3.6721 224 0.8711 0.4420 0.8711 0.9334
No log 3.7049 226 0.8585 0.4241 0.8585 0.9265
No log 3.7377 228 0.8457 0.4324 0.8457 0.9196
No log 3.7705 230 0.8432 0.4221 0.8432 0.9182
No log 3.8033 232 0.8545 0.3822 0.8545 0.9244
No log 3.8361 234 0.8681 0.3271 0.8681 0.9317
No log 3.8689 236 0.9488 0.4386 0.9488 0.9741
No log 3.9016 238 1.0923 0.4098 1.0923 1.0451
No log 3.9344 240 1.1156 0.4098 1.1156 1.0562
No log 3.9672 242 1.1275 0.4594 1.1275 1.0619
No log 4.0 244 1.0221 0.4430 1.0221 1.0110
No log 4.0328 246 0.8637 0.3847 0.8637 0.9294
No log 4.0656 248 0.8541 0.4008 0.8541 0.9242
No log 4.0984 250 0.9543 0.3927 0.9543 0.9769
No log 4.1311 252 0.9147 0.3884 0.9147 0.9564
No log 4.1639 254 0.8591 0.3498 0.8591 0.9269
No log 4.1967 256 0.8801 0.3847 0.8801 0.9381
No log 4.2295 258 0.8913 0.4102 0.8913 0.9441
No log 4.2623 260 0.8634 0.4324 0.8634 0.9292
No log 4.2951 262 0.8492 0.4324 0.8492 0.9215
No log 4.3279 264 0.8368 0.4051 0.8368 0.9147
No log 4.3607 266 0.8173 0.4260 0.8173 0.9041
No log 4.3934 268 0.8110 0.3685 0.8110 0.9006
No log 4.4262 270 0.8974 0.4297 0.8974 0.9473
No log 4.4590 272 1.0929 0.3821 1.0929 1.0454
No log 4.4918 274 1.0481 0.3800 1.0481 1.0237
No log 4.5246 276 0.8788 0.3820 0.8788 0.9375
No log 4.5574 278 0.8195 0.4115 0.8195 0.9053
No log 4.5902 280 0.8324 0.3951 0.8324 0.9124
No log 4.6230 282 0.8002 0.4301 0.8002 0.8945
No log 4.6557 284 0.7955 0.4036 0.7955 0.8919
No log 4.6885 286 0.8357 0.4202 0.8357 0.9142
No log 4.7213 288 0.8400 0.4202 0.8400 0.9165
No log 4.7541 290 0.8037 0.3374 0.8037 0.8965
No log 4.7869 292 0.8320 0.4263 0.8320 0.9121
No log 4.8197 294 0.8239 0.4115 0.8239 0.9077
No log 4.8525 296 0.8368 0.2782 0.8368 0.9148
No log 4.8852 298 0.8816 0.3217 0.8816 0.9390
No log 4.9180 300 1.0218 0.3217 1.0218 1.0108
No log 4.9508 302 1.0246 0.3474 1.0246 1.0122
No log 4.9836 304 0.8891 0.3820 0.8891 0.9429
No log 5.0164 306 0.8101 0.3392 0.8101 0.9001
No log 5.0492 308 0.7815 0.3685 0.7815 0.8840
No log 5.0820 310 0.7780 0.3902 0.7780 0.8820
No log 5.1148 312 0.8011 0.5442 0.8011 0.8951
No log 5.1475 314 0.8838 0.4862 0.8838 0.9401
No log 5.1803 316 0.8794 0.4862 0.8794 0.9377
No log 5.2131 318 0.7515 0.4799 0.7515 0.8669
No log 5.2459 320 0.7293 0.4927 0.7293 0.8540
No log 5.2787 322 0.7222 0.5061 0.7222 0.8498
No log 5.3115 324 0.7156 0.5195 0.7156 0.8459
No log 5.3443 326 0.7119 0.4942 0.7119 0.8438
No log 5.3770 328 0.7340 0.4505 0.7340 0.8567
No log 5.4098 330 0.7458 0.4820 0.7458 0.8636
No log 5.4426 332 0.7728 0.4743 0.7728 0.8791
No log 5.4754 334 0.7408 0.4656 0.7408 0.8607
No log 5.5082 336 0.7993 0.5701 0.7993 0.8940
No log 5.5410 338 0.9197 0.4444 0.9197 0.9590
No log 5.5738 340 0.8039 0.5081 0.8039 0.8966
No log 5.6066 342 0.7389 0.5046 0.7389 0.8596
No log 5.6393 344 0.7540 0.4845 0.7540 0.8684
No log 5.6721 346 0.7565 0.4033 0.7565 0.8698
No log 5.7049 348 0.7661 0.4033 0.7661 0.8753
No log 5.7377 350 0.7720 0.3769 0.7720 0.8786
No log 5.7705 352 0.7679 0.4393 0.7679 0.8763
No log 5.8033 354 0.8111 0.4343 0.8111 0.9006
No log 5.8361 356 0.8669 0.4539 0.8669 0.9311
No log 5.8689 358 0.8558 0.3537 0.8558 0.9251
No log 5.9016 360 0.8322 0.3616 0.8322 0.9122
No log 5.9344 362 0.8397 0.3616 0.8397 0.9163
No log 5.9672 364 0.8658 0.3996 0.8658 0.9305
No log 6.0 366 0.9134 0.4497 0.9134 0.9557
No log 6.0328 368 0.8911 0.3996 0.8911 0.9440
No log 6.0656 370 0.8762 0.3045 0.8762 0.9361
No log 6.0984 372 0.9257 0.3655 0.9257 0.9622
No log 6.1311 374 0.9296 0.2834 0.9296 0.9642
No log 6.1639 376 0.9133 0.2624 0.9133 0.9557
No log 6.1967 378 0.9507 0.3708 0.9507 0.9751
No log 6.2295 380 1.0882 0.3621 1.0882 1.0431
No log 6.2623 382 1.0767 0.3621 1.0767 1.0376
No log 6.2951 384 0.8867 0.3737 0.8867 0.9417
No log 6.3279 386 0.8094 0.4700 0.8094 0.8997
No log 6.3607 388 0.9353 0.5299 0.9353 0.9671
No log 6.3934 390 1.0298 0.4418 1.0298 1.0148
No log 6.4262 392 0.9331 0.4700 0.9331 0.9660
No log 6.4590 394 0.8268 0.4476 0.8268 0.9093
No log 6.4918 396 0.8056 0.4526 0.8056 0.8975
No log 6.5246 398 0.8122 0.4363 0.8122 0.9012
No log 6.5574 400 0.8298 0.4261 0.8298 0.9110
No log 6.5902 402 0.8339 0.3933 0.8339 0.9132
No log 6.6230 404 0.8863 0.4302 0.8863 0.9415
No log 6.6557 406 0.9363 0.4037 0.9363 0.9676
No log 6.6885 408 0.9091 0.4614 0.9091 0.9535
No log 6.7213 410 0.8829 0.4614 0.8829 0.9396
No log 6.7541 412 0.8459 0.3309 0.8459 0.9197
No log 6.7869 414 0.8459 0.3652 0.8459 0.9197
No log 6.8197 416 0.8698 0.3861 0.8698 0.9326
No log 6.8525 418 0.9479 0.3326 0.9479 0.9736
No log 6.8852 420 0.9920 0.3046 0.9920 0.9960
No log 6.9180 422 1.0059 0.3046 1.0059 1.0030
No log 6.9508 424 0.9479 0.2522 0.9479 0.9736
No log 6.9836 426 0.9046 0.3779 0.9046 0.9511
No log 7.0164 428 0.8976 0.3866 0.8976 0.9474
No log 7.0492 430 0.9114 0.4444 0.9114 0.9547
No log 7.0820 432 0.8944 0.4008 0.8944 0.9457
No log 7.1148 434 0.8700 0.3570 0.8700 0.9327
No log 7.1475 436 0.9701 0.3902 0.9701 0.9849
No log 7.1803 438 1.0213 0.4513 1.0213 1.0106
No log 7.2131 440 0.9437 0.4622 0.9437 0.9714
No log 7.2459 442 0.8655 0.3970 0.8655 0.9303
No log 7.2787 444 0.9150 0.3693 0.9150 0.9566
No log 7.3115 446 0.9151 0.3753 0.9151 0.9566
No log 7.3443 448 0.8900 0.3570 0.8900 0.9434
No log 7.3770 450 0.9194 0.3326 0.9194 0.9588
No log 7.4098 452 0.9256 0.3326 0.9256 0.9621
No log 7.4426 454 0.9008 0.3779 0.9008 0.9491
No log 7.4754 456 0.8950 0.3970 0.8950 0.9461
No log 7.5082 458 0.8850 0.4075 0.8850 0.9407
No log 7.5410 460 0.9063 0.3345 0.9063 0.9520
No log 7.5738 462 1.0136 0.3405 1.0136 1.0068
No log 7.6066 464 0.9876 0.4128 0.9876 0.9938
No log 7.6393 466 0.8819 0.2539 0.8819 0.9391
No log 7.6721 468 0.8245 0.4180 0.8245 0.9080
No log 7.7049 470 0.8405 0.4054 0.8405 0.9168
No log 7.7377 472 0.8330 0.4299 0.8330 0.9127
No log 7.7705 474 0.8195 0.4180 0.8195 0.9053
No log 7.8033 476 0.8455 0.4476 0.8455 0.9195
No log 7.8361 478 0.9013 0.4712 0.9013 0.9494
No log 7.8689 480 0.9555 0.4694 0.9555 0.9775
No log 7.9016 482 0.9078 0.4712 0.9078 0.9528
No log 7.9344 484 0.8765 0.4202 0.8765 0.9362
No log 7.9672 486 0.8731 0.3738 0.8731 0.9344
No log 8.0 488 0.8299 0.4142 0.8299 0.9110
No log 8.0328 490 0.7912 0.4423 0.7912 0.8895
No log 8.0656 492 0.7818 0.4645 0.7818 0.8842
No log 8.0984 494 0.7912 0.4098 0.7912 0.8895
No log 8.1311 496 0.7918 0.4439 0.7918 0.8898
No log 8.1639 498 0.7879 0.4299 0.7879 0.8876
0.4216 8.1967 500 0.7858 0.4299 0.7858 0.8864
0.4216 8.2295 502 0.7959 0.4299 0.7959 0.8921
0.4216 8.2623 504 0.8599 0.3897 0.8599 0.9273
0.4216 8.2951 506 0.8588 0.3897 0.8588 0.9267
0.4216 8.3279 508 0.8056 0.4367 0.8056 0.8975
0.4216 8.3607 510 0.7996 0.3933 0.7996 0.8942
0.4216 8.3934 512 0.8200 0.4377 0.8200 0.9056
0.4216 8.4262 514 0.8582 0.4009 0.8582 0.9264
0.4216 8.4590 516 0.8430 0.5250 0.8430 0.9181
0.4216 8.4918 518 0.8173 0.4996 0.8173 0.9041
0.4216 8.5246 520 0.8060 0.5320 0.8060 0.8978
0.4216 8.5574 522 0.8062 0.4818 0.8062 0.8979
0.4216 8.5902 524 0.8171 0.5621 0.8171 0.9040
0.4216 8.6230 526 0.8136 0.5167 0.8136 0.9020
0.4216 8.6557 528 0.8128 0.4749 0.8128 0.9015
0.4216 8.6885 530 0.8232 0.4617 0.8232 0.9073
0.4216 8.7213 532 0.8501 0.4039 0.8501 0.9220
0.4216 8.7541 534 0.8722 0.3896 0.8722 0.9339
0.4216 8.7869 536 0.8587 0.3641 0.8587 0.9266
0.4216 8.8197 538 0.8418 0.4327 0.8418 0.9175
0.4216 8.8525 540 0.8499 0.4397 0.8499 0.9219
0.4216 8.8852 542 0.8921 0.4275 0.8921 0.9445
0.4216 8.9180 544 0.8687 0.4275 0.8687 0.9320
0.4216 8.9508 546 0.8390 0.4375 0.8390 0.9160
0.4216 8.9836 548 0.8236 0.4401 0.8236 0.9075
0.4216 9.0164 550 0.8317 0.4375 0.8317 0.9120
0.4216 9.0492 552 0.8561 0.4275 0.8561 0.9252
0.4216 9.0820 554 0.8353 0.4275 0.8353 0.9139
0.4216 9.1148 556 0.8140 0.4116 0.8140 0.9022
0.4216 9.1475 558 0.8159 0.4327 0.8159 0.9033
0.4216 9.1803 560 0.8293 0.4198 0.8293 0.9107
0.4216 9.2131 562 0.8593 0.4483 0.8593 0.9270
0.4216 9.2459 564 0.8423 0.4198 0.8423 0.9178
0.4216 9.2787 566 0.8095 0.3753 0.8095 0.8997
0.4216 9.3115 568 0.8127 0.3814 0.8127 0.9015
0.4216 9.3443 570 0.8085 0.3753 0.8085 0.8992
0.4216 9.3770 572 0.8053 0.3708 0.8053 0.8974
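Note that the final checkpoint (epoch 9.38, Qwk 0.3708) is not the best row in the table: the highest validation QWK, 0.5727, occurs at epoch 1.80. A hypothetical helper for selecting the best checkpoint from an eval history like the one above (assuming it is available as (epoch, val_loss, qwk) tuples):

```python
def best_by_qwk(history):
    """Return the (epoch, val_loss, qwk) entry with the highest QWK."""
    return max(history, key=lambda row: row[2])

# A few rows taken from the training-results table:
history = [
    (1.8033, 0.8221, 0.5727),
    (5.1148, 0.8011, 0.5442),
    (9.3770, 0.8053, 0.3708),  # final checkpoint
]
```

Selecting by QWK rather than keeping the last checkpoint would be the natural choice if this metric is what matters downstream.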

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)

Model tree

MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k20_task2_organization is fine-tuned from aubmindlab/bert-base-arabertv02.