ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k12_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7374
  • Qwk: 0.7046
  • Mse: 0.7374
  • Rmse: 0.8587
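Loss and Mse are identical above, which suggests the model is trained as a regressor with an MSE objective, with Rmse reported as its square root and agreement measured by quadratic weighted kappa (Qwk). As a reference, the three metrics can be computed with a minimal pure-Python sketch (function names are illustrative, not part of the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in O]
    hist_pred = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = ((i - j) ** 2) / ((n_classes - 1) ** 2)  # quadratic weight
            num += w * O[i][j]
            den += w * hist_true[i] * hist_pred[j] / n   # expected under independence
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

With rounded regression outputs as predicted labels, a QWK around 0.70 (as reported) indicates substantial but imperfect agreement with the gold scores.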

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
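The run ends at step 610 (see the training results below), so the linear scheduler decays the learning rate from 2e-05 toward zero over 610 steps. A minimal sketch of that schedule (warmup is assumed to be zero, since no warmup steps are listed above):

```python
def linear_lr(step, total_steps=610, base_lr=2e-5, warmup_steps=0):
    """Linear LR schedule: optional warmup to base_lr, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, halfway through training (step 305) the learning rate has dropped to 1e-05, and it reaches zero at the final step.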

Training results

A training loss of "No log" means the loss had not yet been reported at that evaluation step (with the Trainer's default logging interval of 500 steps, the first value appears at step 500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0328 2 5.3255 -0.0071 5.3255 2.3077
No log 0.0656 4 3.3122 0.1008 3.3122 1.8199
No log 0.0984 6 2.5339 -0.0399 2.5339 1.5918
No log 0.1311 8 2.3865 -0.0921 2.3865 1.5448
No log 0.1639 10 2.2381 -0.0284 2.2381 1.4960
No log 0.1967 12 1.5955 0.0656 1.5955 1.2631
No log 0.2295 14 1.3177 0.1951 1.3177 1.1479
No log 0.2623 16 1.3384 0.0954 1.3384 1.1569
No log 0.2951 18 1.5092 0.0252 1.5092 1.2285
No log 0.3279 20 1.5819 0.0226 1.5819 1.2577
No log 0.3607 22 1.5664 0.0406 1.5664 1.2515
No log 0.3934 24 1.4474 0.1122 1.4474 1.2031
No log 0.4262 26 1.2520 0.2670 1.2520 1.1189
No log 0.4590 28 1.2019 0.2894 1.2019 1.0963
No log 0.4918 30 1.1532 0.3242 1.1532 1.0739
No log 0.5246 32 1.2234 0.2979 1.2234 1.1061
No log 0.5574 34 1.6102 0.2226 1.6102 1.2689
No log 0.5902 36 1.6048 0.2323 1.6048 1.2668
No log 0.6230 38 1.4147 0.2776 1.4147 1.1894
No log 0.6557 40 1.4143 0.3001 1.4143 1.1893
No log 0.6885 42 1.4060 0.2858 1.4060 1.1858
No log 0.7213 44 1.2234 0.3200 1.2234 1.1061
No log 0.7541 46 1.2865 0.3239 1.2865 1.1343
No log 0.7869 48 1.4799 0.2660 1.4799 1.2165
No log 0.8197 50 1.6573 0.2740 1.6573 1.2874
No log 0.8525 52 1.5848 0.2863 1.5848 1.2589
No log 0.8852 54 1.3765 0.3683 1.3765 1.1732
No log 0.9180 56 1.3965 0.3876 1.3965 1.1817
No log 0.9508 58 1.2419 0.3747 1.2419 1.1144
No log 0.9836 60 1.1045 0.4238 1.1045 1.0510
No log 1.0164 62 1.0418 0.4132 1.0418 1.0207
No log 1.0492 64 0.9491 0.4329 0.9491 0.9742
No log 1.0820 66 0.9319 0.3968 0.9319 0.9654
No log 1.1148 68 0.8783 0.4559 0.8783 0.9372
No log 1.1475 70 0.8551 0.4671 0.8551 0.9247
No log 1.1803 72 0.8499 0.5295 0.8499 0.9219
No log 1.2131 74 0.8416 0.5201 0.8416 0.9174
No log 1.2459 76 0.7989 0.5473 0.7989 0.8938
No log 1.2787 78 0.7818 0.5296 0.7818 0.8842
No log 1.3115 80 0.8433 0.5048 0.8433 0.9183
No log 1.3443 82 0.9115 0.5170 0.9115 0.9547
No log 1.3770 84 0.9834 0.5078 0.9834 0.9917
No log 1.4098 86 1.1431 0.4758 1.1431 1.0691
No log 1.4426 88 1.1886 0.5002 1.1886 1.0902
No log 1.4754 90 1.0267 0.5311 1.0267 1.0133
No log 1.5082 92 0.9419 0.5278 0.9419 0.9705
No log 1.5410 94 0.9412 0.4959 0.9412 0.9702
No log 1.5738 96 1.0943 0.5175 1.0943 1.0461
No log 1.6066 98 1.2391 0.4966 1.2391 1.1131
No log 1.6393 100 1.0206 0.5377 1.0206 1.0103
No log 1.6721 102 1.1023 0.5373 1.1023 1.0499
No log 1.7049 104 1.1762 0.5394 1.1762 1.0845
No log 1.7377 106 1.2896 0.5148 1.2896 1.1356
No log 1.7705 108 1.3433 0.5048 1.3433 1.1590
No log 1.8033 110 1.0163 0.5895 1.0163 1.0081
No log 1.8361 112 0.7769 0.6842 0.7769 0.8814
No log 1.8689 114 0.7505 0.6709 0.7505 0.8663
No log 1.9016 116 0.8404 0.6502 0.8404 0.9167
No log 1.9344 118 0.8305 0.6592 0.8305 0.9113
No log 1.9672 120 0.7350 0.6454 0.7350 0.8573
No log 2.0 122 0.6365 0.6874 0.6365 0.7978
No log 2.0328 124 0.6797 0.6684 0.6797 0.8244
No log 2.0656 126 0.7480 0.6400 0.7480 0.8649
No log 2.0984 128 0.7861 0.6441 0.7861 0.8866
No log 2.1311 130 0.7229 0.6470 0.7229 0.8503
No log 2.1639 132 0.6710 0.6827 0.6710 0.8191
No log 2.1967 134 0.6736 0.6620 0.6736 0.8207
No log 2.2295 136 0.7174 0.6525 0.7174 0.8470
No log 2.2623 138 0.8131 0.6469 0.8131 0.9017
No log 2.2951 140 0.8349 0.6534 0.8349 0.9137
No log 2.3279 142 0.9152 0.6488 0.9152 0.9567
No log 2.3607 144 0.9821 0.6157 0.9821 0.9910
No log 2.3934 146 0.9160 0.6271 0.9160 0.9571
No log 2.4262 148 0.8085 0.6655 0.8085 0.8992
No log 2.4590 150 0.7051 0.6990 0.7051 0.8397
No log 2.4918 152 0.6814 0.7056 0.6814 0.8254
No log 2.5246 154 0.7277 0.6909 0.7277 0.8531
No log 2.5574 156 0.8103 0.6617 0.8103 0.9001
No log 2.5902 158 0.7862 0.6747 0.7862 0.8867
No log 2.6230 160 0.7488 0.6835 0.7488 0.8653
No log 2.6557 162 0.6979 0.6930 0.6979 0.8354
No log 2.6885 164 0.7173 0.6589 0.7173 0.8469
No log 2.7213 166 0.7453 0.6752 0.7453 0.8633
No log 2.7541 168 0.7161 0.6820 0.7161 0.8463
No log 2.7869 170 0.6894 0.6889 0.6894 0.8303
No log 2.8197 172 0.6643 0.7015 0.6643 0.8151
No log 2.8525 174 0.6722 0.6769 0.6722 0.8199
No log 2.8852 176 0.6945 0.6727 0.6945 0.8334
No log 2.9180 178 0.7254 0.6912 0.7254 0.8517
No log 2.9508 180 0.7341 0.6857 0.7341 0.8568
No log 2.9836 182 0.7649 0.6797 0.7649 0.8746
No log 3.0164 184 0.7540 0.6911 0.7540 0.8683
No log 3.0492 186 0.7057 0.6937 0.7057 0.8401
No log 3.0820 188 0.6863 0.6834 0.6863 0.8284
No log 3.1148 190 0.7430 0.6752 0.7430 0.8620
No log 3.1475 192 0.7449 0.6697 0.7449 0.8630
No log 3.1803 194 0.6825 0.6742 0.6825 0.8261
No log 3.2131 196 0.6669 0.6629 0.6669 0.8167
No log 3.2459 198 0.6755 0.6721 0.6755 0.8219
No log 3.2787 200 0.6345 0.6946 0.6345 0.7966
No log 3.3115 202 0.6657 0.6868 0.6657 0.8159
No log 3.3443 204 0.8009 0.6574 0.8009 0.8949
No log 3.3770 206 0.8689 0.6086 0.8689 0.9322
No log 3.4098 208 0.8130 0.6424 0.8130 0.9016
No log 3.4426 210 0.6982 0.6718 0.6982 0.8356
No log 3.4754 212 0.6327 0.6828 0.6327 0.7954
No log 3.5082 214 0.6348 0.6864 0.6348 0.7967
No log 3.5410 216 0.6397 0.6764 0.6397 0.7998
No log 3.5738 218 0.6448 0.6757 0.6448 0.8030
No log 3.6066 220 0.7047 0.6791 0.7047 0.8394
No log 3.6393 222 0.8188 0.6529 0.8188 0.9049
No log 3.6721 224 0.7916 0.6572 0.7916 0.8897
No log 3.7049 226 0.7285 0.6577 0.7285 0.8535
No log 3.7377 228 0.7124 0.6919 0.7124 0.8440
No log 3.7705 230 0.7638 0.6779 0.7638 0.8740
No log 3.8033 232 0.7211 0.6768 0.7211 0.8492
No log 3.8361 234 0.6831 0.6954 0.6831 0.8265
No log 3.8689 236 0.7132 0.7018 0.7132 0.8445
No log 3.9016 238 0.7267 0.6831 0.7267 0.8525
No log 3.9344 240 0.7234 0.6616 0.7234 0.8505
No log 3.9672 242 0.7190 0.6613 0.7190 0.8479
No log 4.0 244 0.6973 0.6755 0.6973 0.8351
No log 4.0328 246 0.6786 0.6731 0.6786 0.8238
No log 4.0656 248 0.6589 0.6953 0.6589 0.8118
No log 4.0984 250 0.6518 0.7040 0.6518 0.8073
No log 4.1311 252 0.7102 0.7184 0.7102 0.8427
No log 4.1639 254 0.7582 0.6790 0.7582 0.8708
No log 4.1967 256 0.7766 0.6770 0.7766 0.8812
No log 4.2295 258 0.7570 0.7113 0.7570 0.8701
No log 4.2623 260 0.7269 0.7221 0.7269 0.8526
No log 4.2951 262 0.7387 0.7105 0.7387 0.8595
No log 4.3279 264 0.7423 0.7105 0.7423 0.8616
No log 4.3607 266 0.7573 0.7027 0.7573 0.8702
No log 4.3934 268 0.7621 0.6976 0.7621 0.8730
No log 4.4262 270 0.7893 0.6767 0.7893 0.8884
No log 4.4590 272 0.7777 0.6699 0.7777 0.8818
No log 4.4918 274 0.7855 0.6699 0.7855 0.8863
No log 4.5246 276 0.7470 0.6699 0.7470 0.8643
No log 4.5574 278 0.7329 0.6751 0.7329 0.8561
No log 4.5902 280 0.7640 0.6731 0.7640 0.8741
No log 4.6230 282 0.8097 0.6928 0.8097 0.8998
No log 4.6557 284 0.8916 0.6725 0.8916 0.9443
No log 4.6885 286 0.9305 0.6536 0.9305 0.9646
No log 4.7213 288 0.8663 0.6881 0.8663 0.9308
No log 4.7541 290 0.7900 0.6900 0.7900 0.8888
No log 4.7869 292 0.7128 0.6741 0.7128 0.8443
No log 4.8197 294 0.6870 0.6845 0.6870 0.8289
No log 4.8525 296 0.6900 0.6852 0.6900 0.8307
No log 4.8852 298 0.7027 0.6832 0.7027 0.8383
No log 4.9180 300 0.7232 0.6697 0.7232 0.8504
No log 4.9508 302 0.7099 0.6733 0.7099 0.8426
No log 4.9836 304 0.6969 0.6756 0.6969 0.8348
No log 5.0164 306 0.7004 0.6953 0.7004 0.8369
No log 5.0492 308 0.7031 0.6576 0.7031 0.8385
No log 5.0820 310 0.7154 0.6744 0.7154 0.8458
No log 5.1148 312 0.7329 0.6905 0.7329 0.8561
No log 5.1475 314 0.8038 0.6467 0.8038 0.8966
No log 5.1803 316 0.9198 0.6107 0.9198 0.9591
No log 5.2131 318 1.0342 0.6130 1.0342 1.0170
No log 5.2459 320 1.0232 0.5972 1.0232 1.0115
No log 5.2787 322 0.9601 0.6035 0.9601 0.9798
No log 5.3115 324 0.8821 0.6133 0.8821 0.9392
No log 5.3443 326 0.8010 0.6462 0.8010 0.8950
No log 5.3770 328 0.7200 0.6829 0.7200 0.8485
No log 5.4098 330 0.7204 0.6931 0.7204 0.8487
No log 5.4426 332 0.7573 0.6748 0.7573 0.8703
No log 5.4754 334 0.7819 0.6678 0.7819 0.8842
No log 5.5082 336 0.8232 0.6479 0.8232 0.9073
No log 5.5410 338 0.7938 0.6404 0.7938 0.8910
No log 5.5738 340 0.7390 0.7029 0.7390 0.8596
No log 5.6066 342 0.7221 0.6992 0.7221 0.8498
No log 5.6393 344 0.7581 0.6694 0.7581 0.8707
No log 5.6721 346 0.7963 0.6705 0.7963 0.8924
No log 5.7049 348 0.7919 0.6711 0.7919 0.8899
No log 5.7377 350 0.8019 0.6783 0.8019 0.8955
No log 5.7705 352 0.8200 0.6641 0.8200 0.9055
No log 5.8033 354 0.8892 0.6517 0.8892 0.9430
No log 5.8361 356 0.8942 0.6442 0.8942 0.9456
No log 5.8689 358 0.8697 0.6574 0.8697 0.9326
No log 5.9016 360 0.8580 0.6422 0.8580 0.9263
No log 5.9344 362 0.8270 0.6408 0.8270 0.9094
No log 5.9672 364 0.8462 0.6271 0.8462 0.9199
No log 6.0 366 0.8755 0.6271 0.8755 0.9357
No log 6.0328 368 0.8539 0.6462 0.8539 0.9240
No log 6.0656 370 0.7877 0.6794 0.7877 0.8875
No log 6.0984 372 0.7463 0.6756 0.7463 0.8639
No log 6.1311 374 0.7622 0.6899 0.7622 0.8731
No log 6.1639 376 0.7758 0.6825 0.7758 0.8808
No log 6.1967 378 0.7969 0.6853 0.7969 0.8927
No log 6.2295 380 0.8391 0.6633 0.8391 0.9160
No log 6.2623 382 0.8998 0.6433 0.8998 0.9486
No log 6.2951 384 0.9045 0.6359 0.9045 0.9511
No log 6.3279 386 0.8832 0.6545 0.8832 0.9398
No log 6.3607 388 0.8427 0.6617 0.8427 0.9180
No log 6.3934 390 0.7995 0.6753 0.7995 0.8941
No log 6.4262 392 0.7413 0.6850 0.7413 0.8610
No log 6.4590 394 0.6864 0.6969 0.6864 0.8285
No log 6.4918 396 0.6732 0.7012 0.6732 0.8205
No log 6.5246 398 0.6849 0.6944 0.6849 0.8276
No log 6.5574 400 0.7309 0.6954 0.7309 0.8549
No log 6.5902 402 0.7724 0.6951 0.7724 0.8788
No log 6.6230 404 0.8256 0.6600 0.8256 0.9086
No log 6.6557 406 0.8805 0.6449 0.8805 0.9384
No log 6.6885 408 0.8796 0.6542 0.8796 0.9379
No log 6.7213 410 0.8529 0.6472 0.8529 0.9235
No log 6.7541 412 0.8659 0.6365 0.8659 0.9305
No log 6.7869 414 0.8581 0.6301 0.8581 0.9263
No log 6.8197 416 0.8384 0.6293 0.8384 0.9156
No log 6.8525 418 0.8489 0.6323 0.8489 0.9214
No log 6.8852 420 0.8725 0.6337 0.8725 0.9341
No log 6.9180 422 0.8409 0.6499 0.8409 0.9170
No log 6.9508 424 0.8221 0.6737 0.8221 0.9067
No log 6.9836 426 0.8252 0.6727 0.8252 0.9084
No log 7.0164 428 0.8331 0.6692 0.8331 0.9127
No log 7.0492 430 0.8047 0.6680 0.8047 0.8971
No log 7.0820 432 0.7561 0.6566 0.7561 0.8695
No log 7.1148 434 0.7183 0.6720 0.7183 0.8475
No log 7.1475 436 0.7195 0.6558 0.7195 0.8483
No log 7.1803 438 0.7373 0.6515 0.7373 0.8586
No log 7.2131 440 0.7713 0.6296 0.7713 0.8782
No log 7.2459 442 0.7832 0.6333 0.7832 0.8850
No log 7.2787 444 0.7942 0.6401 0.7942 0.8912
No log 7.3115 446 0.8375 0.6561 0.8375 0.9152
No log 7.3443 448 0.8392 0.6553 0.8392 0.9161
No log 7.3770 450 0.8214 0.6517 0.8214 0.9063
No log 7.4098 452 0.7797 0.6544 0.7797 0.8830
No log 7.4426 454 0.7563 0.6459 0.7563 0.8697
No log 7.4754 456 0.7434 0.6676 0.7434 0.8622
No log 7.5082 458 0.7489 0.6563 0.7489 0.8654
No log 7.5410 460 0.7756 0.6379 0.7756 0.8807
No log 7.5738 462 0.8130 0.6533 0.8130 0.9017
No log 7.6066 464 0.8371 0.6595 0.8371 0.9149
No log 7.6393 466 0.8406 0.6641 0.8406 0.9168
No log 7.6721 468 0.8302 0.6641 0.8302 0.9112
No log 7.7049 470 0.7910 0.6706 0.7910 0.8894
No log 7.7377 472 0.7512 0.6928 0.7512 0.8667
No log 7.7705 474 0.7156 0.6834 0.7156 0.8459
No log 7.8033 476 0.6993 0.6834 0.6993 0.8363
No log 7.8361 478 0.6927 0.6951 0.6927 0.8323
No log 7.8689 480 0.6905 0.7044 0.6905 0.8310
No log 7.9016 482 0.6967 0.7044 0.6967 0.8347
No log 7.9344 484 0.7167 0.6816 0.7167 0.8466
No log 7.9672 486 0.7354 0.6556 0.7354 0.8575
No log 8.0 488 0.7464 0.6639 0.7464 0.8639
No log 8.0328 490 0.7490 0.6804 0.7490 0.8654
No log 8.0656 492 0.7531 0.6733 0.7531 0.8678
No log 8.0984 494 0.7588 0.6733 0.7588 0.8711
No log 8.1311 496 0.7568 0.6733 0.7568 0.8700
No log 8.1639 498 0.7558 0.6716 0.7558 0.8694
0.4234 8.1967 500 0.7513 0.6679 0.7513 0.8668
0.4234 8.2295 502 0.7586 0.6723 0.7586 0.8710
0.4234 8.2623 504 0.7645 0.6635 0.7645 0.8743
0.4234 8.2951 506 0.7808 0.6681 0.7808 0.8836
0.4234 8.3279 508 0.7953 0.6683 0.7953 0.8918
0.4234 8.3607 510 0.8067 0.6655 0.8067 0.8982
0.4234 8.3934 512 0.7924 0.6729 0.7924 0.8902
0.4234 8.4262 514 0.7764 0.6622 0.7764 0.8811
0.4234 8.4590 516 0.7681 0.6577 0.7681 0.8764
0.4234 8.4918 518 0.7508 0.6722 0.7508 0.8665
0.4234 8.5246 520 0.7278 0.6772 0.7278 0.8531
0.4234 8.5574 522 0.7064 0.6745 0.7064 0.8405
0.4234 8.5902 524 0.6978 0.6691 0.6978 0.8354
0.4234 8.6230 526 0.7002 0.6674 0.7002 0.8368
0.4234 8.6557 528 0.7134 0.6737 0.7134 0.8446
0.4234 8.6885 530 0.7300 0.6676 0.7300 0.8544
0.4234 8.7213 532 0.7503 0.6753 0.7503 0.8662
0.4234 8.7541 534 0.7605 0.6574 0.7605 0.8721
0.4234 8.7869 536 0.7630 0.6700 0.7630 0.8735
0.4234 8.8197 538 0.7629 0.6700 0.7629 0.8735
0.4234 8.8525 540 0.7579 0.6664 0.7579 0.8706
0.4234 8.8852 542 0.7606 0.6664 0.7606 0.8721
0.4234 8.9180 544 0.7547 0.6733 0.7547 0.8687
0.4234 8.9508 546 0.7397 0.6993 0.7397 0.8600
0.4234 8.9836 548 0.7248 0.6857 0.7248 0.8513
0.4234 9.0164 550 0.7131 0.6864 0.7131 0.8445
0.4234 9.0492 552 0.7023 0.6931 0.7023 0.8381
0.4234 9.0820 554 0.6992 0.6870 0.6992 0.8362
0.4234 9.1148 556 0.7006 0.6870 0.7006 0.8370
0.4234 9.1475 558 0.7053 0.6931 0.7053 0.8398
0.4234 9.1803 560 0.7174 0.6828 0.7174 0.8470
0.4234 9.2131 562 0.7371 0.6935 0.7371 0.8586
0.4234 9.2459 564 0.7537 0.6808 0.7537 0.8682
0.4234 9.2787 566 0.7691 0.6819 0.7691 0.8770
0.4234 9.3115 568 0.7842 0.6752 0.7842 0.8855
0.4234 9.3443 570 0.7945 0.6770 0.7945 0.8913
0.4234 9.3770 572 0.7977 0.6770 0.7977 0.8931
0.4234 9.4098 574 0.7979 0.6770 0.7979 0.8932
0.4234 9.4426 576 0.7927 0.6770 0.7927 0.8903
0.4234 9.4754 578 0.7863 0.6770 0.7863 0.8868
0.4234 9.5082 580 0.7761 0.6813 0.7761 0.8810
0.4234 9.5410 582 0.7689 0.6813 0.7689 0.8769
0.4234 9.5738 584 0.7548 0.6816 0.7548 0.8688
0.4234 9.6066 586 0.7463 0.6816 0.7463 0.8639
0.4234 9.6393 588 0.7388 0.6874 0.7388 0.8595
0.4234 9.6721 590 0.7353 0.7046 0.7353 0.8575
0.4234 9.7049 592 0.7326 0.7046 0.7326 0.8559
0.4234 9.7377 594 0.7311 0.7046 0.7311 0.8550
0.4234 9.7705 596 0.7296 0.7046 0.7296 0.8542
0.4234 9.8033 598 0.7301 0.7046 0.7301 0.8545
0.4234 9.8361 600 0.7309 0.7046 0.7309 0.8549
0.4234 9.8689 602 0.7329 0.7046 0.7329 0.8561
0.4234 9.9016 604 0.7350 0.7046 0.7350 0.8573
0.4234 9.9344 606 0.7363 0.7046 0.7363 0.8581
0.4234 9.9672 608 0.7370 0.7046 0.7370 0.8585
0.4234 10.0 610 0.7374 0.7046 0.7374 0.8587

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)
Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k12_task1_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.