ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k11_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7222
  • Qwk: 0.2275
  • Mse: 0.7222
  • Rmse: 0.8498
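
For reference, Qwk is Cohen's quadratically weighted kappa, Mse is the mean squared error, and Rmse is its square root; Loss and Mse coincide throughout this card, which suggests the model was trained with an MSE regression objective. Below is a minimal sketch of how such scores can be computed with scikit-learn, using made-up labels rather than the actual (unpublished) evaluation set:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative integer scores only -- the real evaluation labels are not published.
y_true = np.array([0, 1, 2, 3, 2, 1, 0, 3])
y_pred = np.array([0, 2, 2, 3, 1, 1, 1, 2])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratically weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # the reported Rmse is simply sqrt(Mse)
print(qwk, mse, rmse)
```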

Model description

More information needed

Intended uses & limitations

More information needed
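
Although usage notes are not provided, the checkpoint can presumably be loaded as a standard Transformers sequence-classification model. A minimal inference sketch follows; the predict_score helper and the meaning of the predicted class index are assumptions, not part of the published card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = (
    "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_"
    "FineTuningAraBERT_run3_AugV5_k11_task3_organization"
)

def predict_score(text: str, model_id: str = MODEL_ID) -> int:
    """Return the argmax class index for one Arabic text (hypothetical helper)."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1))
```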

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10

Training results

(The Training Loss column reads "No log" until step 500 because the Trainer, at its default settings, only reports the running training loss every 500 steps; the single logged value in this run is 0.3897.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0357 2 3.3398 -0.0041 3.3398 1.8275
No log 0.0714 4 1.5777 -0.0070 1.5777 1.2561
No log 0.1071 6 0.9095 0.0083 0.9095 0.9537
No log 0.1429 8 0.6328 -0.0303 0.6328 0.7955
No log 0.1786 10 0.6441 -0.0159 0.6441 0.8026
No log 0.2143 12 0.6612 0.0569 0.6612 0.8132
No log 0.25 14 0.8314 0.1919 0.8314 0.9118
No log 0.2857 16 0.7381 0.1913 0.7381 0.8591
No log 0.3214 18 0.5922 0.0476 0.5922 0.7695
No log 0.3571 20 0.5717 -0.0081 0.5717 0.7561
No log 0.3929 22 0.5982 -0.0732 0.5982 0.7734
No log 0.4286 24 0.6574 -0.0159 0.6574 0.8108
No log 0.4643 26 0.5859 -0.0732 0.5859 0.7655
No log 0.5 28 0.5431 0.3333 0.5431 0.7369
No log 0.5357 30 0.8329 0.1289 0.8329 0.9126
No log 0.5714 32 0.7761 0.1781 0.7761 0.8810
No log 0.6071 34 0.5528 0.0476 0.5528 0.7435
No log 0.6429 36 0.8101 0.2000 0.8101 0.9000
No log 0.6786 38 0.8935 0.1600 0.8935 0.9452
No log 0.7143 40 0.6836 0.1392 0.6836 0.8268
No log 0.75 42 0.6826 0.2917 0.6826 0.8262
No log 0.7857 44 0.9658 0.0694 0.9658 0.9828
No log 0.8214 46 0.7434 0.2157 0.7434 0.8622
No log 0.8571 48 0.5725 0.0685 0.5725 0.7566
No log 0.8929 50 0.5665 0.1329 0.5665 0.7526
No log 0.9286 52 0.5993 0.2281 0.5993 0.7742
No log 0.9643 54 0.8711 0.1050 0.8711 0.9333
No log 1.0 56 0.8073 0.1619 0.8073 0.8985
No log 1.0357 58 0.6108 0.0933 0.6108 0.7815
No log 1.0714 60 0.5783 0.0303 0.5783 0.7604
No log 1.1071 62 0.5986 0.0933 0.5986 0.7737
No log 1.1429 64 0.9427 0.0333 0.9427 0.9709
No log 1.1786 66 1.0701 0.0617 1.0701 1.0344
No log 1.2143 68 0.7299 0.2157 0.7299 0.8543
No log 1.25 70 0.5794 0.2405 0.5794 0.7612
No log 1.2857 72 0.5851 0.2195 0.5851 0.7649
No log 1.3214 74 0.6664 0.3143 0.6664 0.8163
No log 1.3571 76 0.6275 0.3089 0.6275 0.7921
No log 1.3929 78 0.7507 0.1020 0.7507 0.8664
No log 1.4286 80 0.8362 0.0796 0.8362 0.9145
No log 1.4643 82 0.7652 0.0647 0.7652 0.8748
No log 1.5 84 0.5990 0.3508 0.5990 0.7740
No log 1.5357 86 0.6143 0.3035 0.6143 0.7838
No log 1.5714 88 0.5955 0.2577 0.5955 0.7717
No log 1.6071 90 0.6668 0.1503 0.6668 0.8166
No log 1.6429 92 0.6725 0.1919 0.6725 0.8201
No log 1.6786 94 0.8417 0.1228 0.8417 0.9174
No log 1.7143 96 0.5829 0.3641 0.5829 0.7635
No log 1.75 98 0.5707 0.4764 0.5707 0.7555
No log 1.7857 100 0.7384 0.1855 0.7384 0.8593
No log 1.8214 102 0.7183 0.2315 0.7183 0.8475
No log 1.8571 104 0.5441 0.4043 0.5441 0.7377
No log 1.8929 106 0.5006 0.5000 0.5006 0.7075
No log 1.9286 108 0.4890 0.4475 0.4890 0.6993
No log 1.9643 110 0.6152 0.2637 0.6152 0.7844
No log 2.0 112 1.0065 0.2291 1.0065 1.0032
No log 2.0357 114 1.0584 0.2000 1.0584 1.0288
No log 2.0714 116 0.5685 0.3600 0.5685 0.7540
No log 2.1071 118 0.5227 0.4286 0.5227 0.7229
No log 2.1429 120 1.0713 0.2226 1.0713 1.0350
No log 2.1786 122 1.3342 0.1738 1.3342 1.1551
No log 2.2143 124 0.9905 0.2230 0.9905 0.9953
No log 2.25 126 0.8498 0.2239 0.8498 0.9218
No log 2.2857 128 0.7525 0.2797 0.7525 0.8675
No log 2.3214 130 0.7890 0.2548 0.7890 0.8882
No log 2.3571 132 0.5755 0.4231 0.5755 0.7586
No log 2.3929 134 0.7166 0.3391 0.7166 0.8465
No log 2.4286 136 1.0540 0.2222 1.0540 1.0266
No log 2.4643 138 0.7806 0.3188 0.7806 0.8835
No log 2.5 140 0.6581 0.3905 0.6581 0.8112
No log 2.5357 142 0.8163 0.2554 0.8163 0.9035
No log 2.5714 144 0.7151 0.3301 0.7151 0.8456
No log 2.6071 146 0.5504 0.3617 0.5504 0.7419
No log 2.6429 148 0.6448 0.3623 0.6448 0.8030
No log 2.6786 150 0.9538 0.1937 0.9538 0.9766
No log 2.7143 152 0.9922 0.2117 0.9922 0.9961
No log 2.75 154 0.5351 0.3427 0.5351 0.7315
No log 2.7857 156 0.5412 0.3684 0.5412 0.7357
No log 2.8214 158 0.6017 0.4400 0.6017 0.7757
No log 2.8571 160 1.4462 0.2277 1.4462 1.2026
No log 2.8929 162 1.5609 0.2043 1.5609 1.2494
No log 2.9286 164 0.7721 0.2520 0.7721 0.8787
No log 2.9643 166 0.4737 0.4220 0.4737 0.6883
No log 3.0 168 0.4826 0.4098 0.4826 0.6947
No log 3.0357 170 0.8731 0.3050 0.8731 0.9344
No log 3.0714 172 1.8519 0.1313 1.8519 1.3609
No log 3.1071 174 1.6595 0.1546 1.6595 1.2882
No log 3.1429 176 0.7963 0.3633 0.7963 0.8924
No log 3.1786 178 0.5422 0.3739 0.5422 0.7364
No log 3.2143 180 0.5872 0.3803 0.5872 0.7663
No log 3.25 182 0.9647 0.3588 0.9647 0.9822
No log 3.2857 184 1.3557 0.2208 1.3557 1.1643
No log 3.3214 186 1.2364 0.2218 1.2364 1.1119
No log 3.3571 188 0.7606 0.3882 0.7606 0.8721
No log 3.3929 190 0.6434 0.4000 0.6434 0.8021
No log 3.4286 192 0.6378 0.3702 0.6378 0.7986
No log 3.4643 194 0.8007 0.2618 0.8007 0.8948
No log 3.5 196 0.9081 0.2569 0.9081 0.9530
No log 3.5357 198 0.8018 0.2618 0.8018 0.8954
No log 3.5714 200 0.5715 0.3803 0.5715 0.7560
No log 3.6071 202 0.6384 0.4386 0.6384 0.7990
No log 3.6429 204 0.8189 0.3920 0.8189 0.9050
No log 3.6786 206 1.0184 0.2409 1.0184 1.0091
No log 3.7143 208 0.8137 0.3920 0.8137 0.9021
No log 3.75 210 0.9257 0.3538 0.9257 0.9621
No log 3.7857 212 1.2993 0.1628 1.2993 1.1399
No log 3.8214 214 1.3126 0.1358 1.3126 1.1457
No log 3.8571 216 1.0359 0.2569 1.0359 1.0178
No log 3.8929 218 0.7464 0.3226 0.7464 0.8639
No log 3.9286 220 0.7414 0.3251 0.7414 0.8610
No log 3.9643 222 1.1596 0.2456 1.1596 1.0769
No log 4.0 224 1.9202 0.0980 1.9202 1.3857
No log 4.0357 226 1.9123 0.0769 1.9123 1.3829
No log 4.0714 228 1.1713 0.2456 1.1713 1.0823
No log 4.1071 230 0.6047 0.3427 0.6047 0.7777
No log 4.1429 232 0.5882 0.3462 0.5882 0.7669
No log 4.1786 234 0.8018 0.3651 0.8018 0.8954
No log 4.2143 236 1.2450 0.2222 1.2450 1.1158
No log 4.25 238 1.3649 0.1438 1.3649 1.1683
No log 4.2857 240 0.9864 0.2569 0.9864 0.9932
No log 4.3214 242 0.6253 0.3391 0.6253 0.7907
No log 4.3571 244 0.6405 0.3333 0.6405 0.8003
No log 4.3929 246 0.6829 0.3080 0.6829 0.8264
No log 4.4286 248 0.8182 0.3195 0.8182 0.9045
No log 4.4643 250 1.1886 0.1419 1.1886 1.0902
No log 4.5 252 1.2435 0.1946 1.2435 1.1151
No log 4.5357 254 0.9470 0.2812 0.9470 0.9731
No log 4.5714 256 0.6377 0.3593 0.6377 0.7986
No log 4.6071 258 0.6151 0.3982 0.6151 0.7843
No log 4.6429 260 0.5905 0.4074 0.5905 0.7684
No log 4.6786 262 0.8152 0.3920 0.8152 0.9029
No log 4.7143 264 1.3871 0.1897 1.3871 1.1778
No log 4.75 266 1.5048 0.1420 1.5048 1.2267
No log 4.7857 268 1.2824 0.1667 1.2824 1.1324
No log 4.8214 270 0.8985 0.1934 0.8985 0.9479
No log 4.8571 272 0.6426 0.3898 0.6426 0.8016
No log 4.8929 274 0.5962 0.4123 0.5962 0.7721
No log 4.9286 276 0.6680 0.3214 0.6680 0.8173
No log 4.9643 278 0.7542 0.2281 0.7542 0.8684
No log 5.0 280 1.0752 0.1943 1.0752 1.0369
No log 5.0357 282 1.1999 0.2000 1.1999 1.0954
No log 5.0714 284 1.0176 0.1635 1.0176 1.0088
No log 5.1071 286 0.7228 0.3214 0.7228 0.8502
No log 5.1429 288 0.6276 0.3874 0.6276 0.7922
No log 5.1786 290 0.6695 0.3480 0.6695 0.8182
No log 5.2143 292 0.7226 0.3214 0.7226 0.8500
No log 5.25 294 0.8511 0.2787 0.8511 0.9225
No log 5.2857 296 0.9127 0.2500 0.9127 0.9554
No log 5.3214 298 0.8936 0.1613 0.8936 0.9453
No log 5.3571 300 0.7744 0.2542 0.7744 0.8800
No log 5.3929 302 0.6951 0.3537 0.6951 0.8337
No log 5.4286 304 0.6641 0.3214 0.6641 0.8149
No log 5.4643 306 0.6594 0.3480 0.6594 0.8120
No log 5.5 308 0.6880 0.3103 0.6880 0.8295
No log 5.5357 310 0.7032 0.3418 0.7032 0.8386
No log 5.5714 312 0.6592 0.3874 0.6592 0.8119
No log 5.6071 314 0.7160 0.3719 0.7160 0.8462
No log 5.6429 316 0.8267 0.2756 0.8267 0.9092
No log 5.6786 318 1.0331 0.1655 1.0331 1.0164
No log 5.7143 320 0.8963 0.1939 0.8963 0.9467
No log 5.75 322 0.6882 0.3537 0.6882 0.8296
No log 5.7857 324 0.6028 0.4930 0.6028 0.7764
No log 5.8214 326 0.5988 0.4815 0.5988 0.7738
No log 5.8571 328 0.6681 0.3504 0.6681 0.8174
No log 5.8929 330 0.8291 0.1934 0.8291 0.9105
No log 5.9286 332 0.9096 0.1628 0.9096 0.9538
No log 5.9643 334 0.8125 0.2275 0.8125 0.9014
No log 6.0 336 0.7794 0.2632 0.7794 0.8829
No log 6.0357 338 0.7292 0.2632 0.7292 0.8539
No log 6.0714 340 0.7193 0.2632 0.7193 0.8481
No log 6.1071 342 0.7117 0.3028 0.7117 0.8436
No log 6.1429 344 0.6359 0.4545 0.6359 0.7974
No log 6.1786 346 0.6613 0.3645 0.6613 0.8132
No log 6.2143 348 0.7986 0.2605 0.7986 0.8937
No log 6.25 350 0.8734 0.1628 0.8734 0.9346
No log 6.2857 352 0.9105 0.1635 0.9105 0.9542
No log 6.3214 354 0.8987 0.1635 0.8987 0.9480
No log 6.3571 356 0.8109 0.2510 0.8109 0.9005
No log 6.3929 358 0.7701 0.2863 0.7701 0.8776
No log 6.4286 360 0.8427 0.1937 0.8427 0.9180
No log 6.4643 362 1.0444 0.1648 1.0444 1.0220
No log 6.5 364 1.0041 0.1648 1.0041 1.0020
No log 6.5357 366 0.7917 0.2531 0.7917 0.8898
No log 6.5714 368 0.6490 0.4087 0.6490 0.8056
No log 6.6071 370 0.6814 0.4087 0.6814 0.8255
No log 6.6429 372 0.8038 0.2275 0.8038 0.8965
No log 6.6786 374 1.0125 0.1648 1.0125 1.0063
No log 6.7143 376 1.0841 0.1888 1.0841 1.0412
No log 6.75 378 1.0501 0.1648 1.0501 1.0247
No log 6.7857 380 0.8730 0.2180 0.8730 0.9343
No log 6.8214 382 0.6487 0.3833 0.6487 0.8054
No log 6.8571 384 0.6343 0.3874 0.6343 0.7964
No log 6.8929 386 0.7493 0.2479 0.7493 0.8656
No log 6.9286 388 0.7852 0.2542 0.7852 0.8861
No log 6.9643 390 0.8612 0.1939 0.8612 0.9280
No log 7.0 392 0.9375 0.1940 0.9375 0.9683
No log 7.0357 394 1.0031 0.1648 1.0031 1.0015
No log 7.0714 396 0.8776 0.2180 0.8776 0.9368
No log 7.1071 398 0.7128 0.3448 0.7128 0.8443
No log 7.1429 400 0.7183 0.3448 0.7183 0.8475
No log 7.1786 402 0.8801 0.2177 0.8801 0.9382
No log 7.2143 404 1.0435 0.1655 1.0435 1.0215
No log 7.25 406 1.1841 0.1672 1.1841 1.0882
No log 7.2857 408 1.2652 0.1895 1.2652 1.1248
No log 7.3214 410 1.1137 0.1672 1.1137 1.0553
No log 7.3571 412 0.8475 0.2180 0.8475 0.9206
No log 7.3929 414 0.6381 0.3548 0.6381 0.7988
No log 7.4286 416 0.5446 0.4051 0.5446 0.7380
No log 7.4643 418 0.5245 0.4526 0.5245 0.7242
No log 7.5 420 0.5427 0.4225 0.5427 0.7367
No log 7.5357 422 0.5984 0.4010 0.5984 0.7736
No log 7.5714 424 0.7685 0.2275 0.7685 0.8767
No log 7.6071 426 1.0120 0.1655 1.0120 1.0060
No log 7.6429 428 1.0825 0.1661 1.0825 1.0404
No log 7.6786 430 0.9637 0.1940 0.9637 0.9817
No log 7.7143 432 0.7589 0.3138 0.7589 0.8712
No log 7.75 434 0.6530 0.3917 0.6530 0.8081
No log 7.7857 436 0.6328 0.4233 0.6328 0.7955
No log 7.8214 438 0.6523 0.3962 0.6523 0.8077
No log 7.8571 440 0.7717 0.3138 0.7717 0.8785
No log 7.8929 442 0.9205 0.1939 0.9205 0.9594
No log 7.9286 444 0.9636 0.1648 0.9636 0.9816
No log 7.9643 446 0.9080 0.2180 0.9080 0.9529
No log 8.0 448 0.8643 0.2188 0.8643 0.9297
No log 8.0357 450 0.9170 0.2177 0.9170 0.9576
No log 8.0714 452 1.0165 0.1655 1.0165 1.0082
No log 8.1071 454 1.0453 0.1661 1.0453 1.0224
No log 8.1429 456 1.0479 0.1661 1.0479 1.0237
No log 8.1786 458 0.9198 0.2180 0.9198 0.9591
No log 8.2143 460 0.7379 0.2542 0.7379 0.8590
No log 8.25 462 0.6061 0.4605 0.6061 0.7785
No log 8.2857 464 0.5446 0.4000 0.5446 0.7380
No log 8.3214 466 0.5392 0.4051 0.5392 0.7343
No log 8.3571 468 0.5695 0.4286 0.5695 0.7546
No log 8.3929 470 0.6523 0.3333 0.6523 0.8076
No log 8.4286 472 0.7090 0.2542 0.7090 0.8420
No log 8.4643 474 0.7201 0.2275 0.7201 0.8486
No log 8.5 476 0.7476 0.2275 0.7476 0.8647
No log 8.5357 478 0.7640 0.2275 0.7640 0.8740
No log 8.5714 480 0.7362 0.2275 0.7362 0.8580
No log 8.6071 482 0.6727 0.3303 0.6727 0.8202
No log 8.6429 484 0.6583 0.3333 0.6583 0.8114
No log 8.6786 486 0.6873 0.3303 0.6873 0.8291
No log 8.7143 488 0.7505 0.2275 0.7505 0.8663
No log 8.75 490 0.8380 0.2263 0.8380 0.9154
No log 8.7857 492 0.9070 0.2253 0.9070 0.9524
No log 8.8214 494 0.8979 0.2253 0.8979 0.9476
No log 8.8571 496 0.8644 0.1935 0.8644 0.9298
No log 8.8929 498 0.8042 0.2263 0.8042 0.8968
0.3897 8.9286 500 0.7699 0.2269 0.7699 0.8774
0.3897 8.9643 502 0.7590 0.2531 0.7590 0.8712
0.3897 9.0 504 0.7375 0.2542 0.7375 0.8588
0.3897 9.0357 506 0.7433 0.2542 0.7433 0.8621
0.3897 9.0714 508 0.7636 0.2531 0.7636 0.8738
0.3897 9.1071 510 0.7910 0.2520 0.7910 0.8894
0.3897 9.1429 512 0.8113 0.2263 0.8113 0.9007
0.3897 9.1786 514 0.8120 0.2263 0.8120 0.9011
0.3897 9.2143 516 0.7908 0.2263 0.7908 0.8893
0.3897 9.25 518 0.7436 0.2803 0.7436 0.8623
0.3897 9.2857 520 0.6886 0.4185 0.6886 0.8298
0.3897 9.3214 522 0.6731 0.4185 0.6731 0.8204
0.3897 9.3571 524 0.6805 0.4185 0.6805 0.8249
0.3897 9.3929 526 0.6858 0.4185 0.6858 0.8281
0.3897 9.4286 528 0.7029 0.3571 0.7029 0.8384
0.3897 9.4643 530 0.7248 0.2803 0.7248 0.8514
0.3897 9.5 532 0.7440 0.2275 0.7440 0.8625
0.3897 9.5357 534 0.7643 0.2263 0.7643 0.8743
0.3897 9.5714 536 0.7662 0.2263 0.7662 0.8753
0.3897 9.6071 538 0.7674 0.2263 0.7674 0.8760
0.3897 9.6429 540 0.7633 0.2263 0.7633 0.8737
0.3897 9.6786 542 0.7502 0.2269 0.7502 0.8661
0.3897 9.7143 544 0.7424 0.2275 0.7424 0.8616
0.3897 9.75 546 0.7352 0.2275 0.7352 0.8575
0.3897 9.7857 548 0.7338 0.2275 0.7338 0.8566
0.3897 9.8214 550 0.7293 0.2275 0.7293 0.8540
0.3897 9.8571 552 0.7286 0.2275 0.7286 0.8536
0.3897 9.8929 554 0.7251 0.2275 0.7251 0.8515
0.3897 9.9286 556 0.7222 0.2275 0.7222 0.8498
0.3897 9.9643 558 0.7216 0.2281 0.7216 0.8495
0.3897 10.0 560 0.7222 0.2275 0.7222 0.8498
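
A quick sanity check on the table's bookkeeping: the run ends at optimizer step 560 after exactly 10 epochs, i.e. 56 steps per epoch, so with train_batch_size=8 the training split holds roughly 448 examples (an estimate, since the dataset size is not published):

```python
# Back-of-the-envelope arithmetic from the training table above.
total_steps = 560   # final Step in the table
num_epochs = 10     # final Epoch in the table
batch_size = 8      # train_batch_size from the hyperparameters

steps_per_epoch = total_steps // num_epochs     # 56
approx_examples = steps_per_epoch * batch_size  # ~448 (upper bound if the last batch is partial)
print(steps_per_epoch, approx_examples)
```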

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (F32 tensors, Safetensors format)

Model tree

This model is one of 4,023 fine-tunes of aubmindlab/bert-base-arabertv02, published as MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k11_task3_organization.