ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k13_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6795
  • Qwk (quadratic weighted kappa): 0.2917
  • MSE: 0.6795
  • RMSE: 0.8243
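
Qwk denotes quadratic weighted kappa, the agreement metric commonly used for ordinal scoring tasks like this one; Loss equals MSE here, and RMSE is its square root. A minimal, self-contained sketch of the metric (an illustration, not the exact evaluation code used for this model):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: 1 - (weighted observed / weighted expected disagreement)."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected counts under independence of the two label histograms
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

# RMSE is just the square root of MSE: sqrt(0.6795) ≈ 0.8243, matching the card.
rmse = math.sqrt(0.6795)
```

Perfect agreement yields a kappa of 1.0, chance-level agreement 0.0, so the final Qwk of 0.2917 indicates modest agreement with the reference scores.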

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
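
With these settings the linear scheduler decays the learning rate from 2e-05 toward zero over training (10 epochs × 57 optimizer steps/epoch = 570 steps, matching the results table below). A hypothetical sketch of that schedule, assuming zero warmup (the Trainer's own implementation is `get_linear_schedule_with_warmup` in transformers):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear warmup (optional) followed by linear decay to zero --
    a sketch of the `linear` lr_scheduler_type used above."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, float(total_steps - step))
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0, 570))    # 2e-05 at the first step
print(linear_lr(285, 570))  # 1e-05 halfway through
print(linear_lr(570, 570))  # 0.0 at the end
```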

Training results

Training Loss Epoch Step Validation Loss Qwk MSE RMSE
No log 0.0351 2 3.5176 0.0026 3.5176 1.8755
No log 0.0702 4 1.8704 -0.0130 1.8704 1.3676
No log 0.1053 6 1.2256 0.0530 1.2256 1.1071
No log 0.1404 8 0.8083 0.2081 0.8083 0.8990
No log 0.1754 10 0.9592 0.1588 0.9592 0.9794
No log 0.2105 12 0.8752 0.1493 0.8752 0.9355
No log 0.2456 14 0.7960 0.1845 0.7960 0.8922
No log 0.2807 16 0.6874 0.0569 0.6874 0.8291
No log 0.3158 18 0.7041 0.0569 0.7041 0.8391
No log 0.3509 20 0.7219 0.0409 0.7219 0.8496
No log 0.3860 22 0.6704 0.0137 0.6704 0.8188
No log 0.4211 24 0.7890 0.1411 0.7890 0.8882
No log 0.4561 26 1.0988 -0.1077 1.0988 1.0482
No log 0.4912 28 1.1590 -0.0963 1.1590 1.0766
No log 0.5263 30 0.7116 0.0435 0.7116 0.8436
No log 0.5614 32 0.7502 0.0769 0.7502 0.8661
No log 0.5965 34 0.7285 0.1813 0.7285 0.8535
No log 0.6316 36 1.0540 0.0367 1.0540 1.0266
No log 0.6667 38 1.0011 0.1273 1.0011 1.0005
No log 0.7018 40 0.7702 0.2340 0.7702 0.8776
No log 0.7368 42 0.7045 0.0891 0.7045 0.8393
No log 0.7719 44 0.8432 0.2140 0.8432 0.9183
No log 0.8070 46 0.7144 0.2563 0.7144 0.8452
No log 0.8421 48 0.8678 0.1357 0.8678 0.9315
No log 0.8772 50 0.8429 0.1770 0.8429 0.9181
No log 0.9123 52 0.6182 0.3016 0.6182 0.7863
No log 0.9474 54 0.9506 0.1930 0.9506 0.9750
No log 0.9825 56 0.9884 0.2137 0.9884 0.9942
No log 1.0175 58 0.6176 0.0805 0.6176 0.7859
No log 1.0526 60 0.7773 0.1515 0.7773 0.8817
No log 1.0877 62 0.8049 0.1416 0.8049 0.8972
No log 1.1228 64 0.6388 0.3073 0.6388 0.7992
No log 1.1579 66 0.8656 0.1366 0.8656 0.9304
No log 1.1930 68 1.9819 0.1111 1.9819 1.4078
No log 1.2281 70 2.0745 0.0950 2.0745 1.4403
No log 1.2632 72 0.8657 0.2579 0.8657 0.9305
No log 1.2982 74 0.6834 0.1852 0.6834 0.8267
No log 1.3333 76 0.8243 0.1927 0.8243 0.9079
No log 1.3684 78 0.8730 0.1867 0.8730 0.9343
No log 1.4035 80 0.7639 0.2986 0.7639 0.8740
No log 1.4386 82 1.1133 0.0949 1.1133 1.0551
No log 1.4737 84 1.2999 0.1531 1.2999 1.1401
No log 1.5088 86 0.8950 0.1545 0.8950 0.9461
No log 1.5439 88 0.8437 0.0909 0.8437 0.9185
No log 1.5789 90 1.0514 0.1111 1.0514 1.0254
No log 1.6140 92 0.9929 0.1111 0.9929 0.9965
No log 1.6491 94 0.8094 0.0099 0.8094 0.8997
No log 1.6842 96 0.7545 0.1200 0.7545 0.8686
No log 1.7193 98 0.7918 0.1489 0.7918 0.8898
No log 1.7544 100 1.0849 0.0492 1.0849 1.0416
No log 1.7895 102 1.7221 0.1099 1.7221 1.3123
No log 1.8246 104 1.4792 0.0541 1.4792 1.2162
No log 1.8596 106 0.8686 0.0680 0.8686 0.9320
No log 1.8947 108 0.8243 0.1489 0.8243 0.9079
No log 1.9298 110 0.9578 0.1148 0.9578 0.9787
No log 1.9649 112 0.6903 0.2184 0.6903 0.8308
No log 2.0 114 0.6069 0.1902 0.6069 0.7790
No log 2.0351 116 0.6273 0.2298 0.6273 0.7920
No log 2.0702 118 0.9828 0.2500 0.9828 0.9914
No log 2.1053 120 0.7358 0.1921 0.7358 0.8578
No log 2.1404 122 0.5681 0.2611 0.5681 0.7537
No log 2.1754 124 0.5684 0.3000 0.5684 0.7539
No log 2.2105 126 0.8120 0.1925 0.8120 0.9011
No log 2.2456 128 0.7971 0.1925 0.7971 0.8928
No log 2.2807 130 0.5846 0.2370 0.5846 0.7646
No log 2.3158 132 0.5927 0.2370 0.5927 0.7699
No log 2.3509 134 0.7380 0.2323 0.7380 0.8591
No log 2.3860 136 0.7265 0.2323 0.7265 0.8523
No log 2.4211 138 0.5801 0.2704 0.5801 0.7617
No log 2.4561 140 0.8848 0.1429 0.8848 0.9407
No log 2.4912 142 0.6828 0.1135 0.6828 0.8263
No log 2.5263 144 0.7646 0.2332 0.7646 0.8744
No log 2.5614 146 0.9641 0.0988 0.9641 0.9819
No log 2.5965 148 0.6327 0.3292 0.6327 0.7954
No log 2.6316 150 0.6639 0.1111 0.6639 0.8148
No log 2.6667 152 0.6354 0.0838 0.6354 0.7971
No log 2.7018 154 0.7506 0.2332 0.7506 0.8664
No log 2.7368 156 0.9712 0.2180 0.9712 0.9855
No log 2.7719 158 0.7002 0.2421 0.7002 0.8368
No log 2.8070 160 0.6463 0.1910 0.6463 0.8039
No log 2.8421 162 0.6219 0.1000 0.6219 0.7886
No log 2.8772 164 0.7196 0.2258 0.7196 0.8483
No log 2.9123 166 1.3114 0.0952 1.3114 1.1452
No log 2.9474 168 1.1559 0.1383 1.1559 1.0751
No log 2.9825 170 0.6925 0.2258 0.6925 0.8322
No log 3.0175 172 0.5754 0.2000 0.5754 0.7585
No log 3.0526 174 0.5900 0.2099 0.5900 0.7681
No log 3.0877 176 0.8065 0.1923 0.8065 0.8981
No log 3.1228 178 0.7707 0.2075 0.7707 0.8779
No log 3.1579 180 0.6465 0.2179 0.6465 0.8041
No log 3.1930 182 0.8414 0.1770 0.8414 0.9173
No log 3.2281 184 0.8008 0.1527 0.8008 0.8949
No log 3.2632 186 0.7094 0.1011 0.7094 0.8423
No log 3.2982 188 0.8029 0.0647 0.8029 0.8960
No log 3.3333 190 0.6985 0.1135 0.6985 0.8358
No log 3.3684 192 0.7038 0.1135 0.7038 0.8390
No log 3.4035 194 0.7721 0.1269 0.7721 0.8787
No log 3.4386 196 0.7639 0.1600 0.7639 0.8740
No log 3.4737 198 0.7174 0.1923 0.7174 0.8470
No log 3.5088 200 0.7473 0.1600 0.7473 0.8644
No log 3.5439 202 0.9692 0.1807 0.9692 0.9845
No log 3.5789 204 0.7537 0.0995 0.7537 0.8682
No log 3.6140 206 0.6077 0.2832 0.6077 0.7795
No log 3.6491 208 0.6053 0.1813 0.6053 0.7780
No log 3.6842 210 0.8096 0.1570 0.8096 0.8998
No log 3.7193 212 0.8354 0.1570 0.8354 0.9140
No log 3.7544 214 0.5876 0.2593 0.5876 0.7665
No log 3.7895 216 0.6213 0.1675 0.6213 0.7882
No log 3.8246 218 0.5958 0.2251 0.5958 0.7719
No log 3.8596 220 0.5507 0.3631 0.5507 0.7421
No log 3.8947 222 0.9162 0.1652 0.9162 0.9572
No log 3.9298 224 1.0833 0.1127 1.0833 1.0408
No log 3.9649 226 0.8370 0.1273 0.8370 0.9149
No log 4.0 228 0.5489 0.1888 0.5489 0.7409
No log 4.0351 230 0.7264 0.1619 0.7264 0.8523
No log 4.0702 232 0.7321 0.1619 0.7321 0.8556
No log 4.1053 234 0.5658 0.2405 0.5658 0.7522
No log 4.1404 236 0.7625 0.2233 0.7625 0.8732
No log 4.1754 238 1.0395 0.1062 1.0395 1.0195
No log 4.2105 240 0.9612 0.0968 0.9612 0.9804
No log 4.2456 242 0.7028 0.2850 0.7028 0.8384
No log 4.2807 244 0.7108 0.1285 0.7108 0.8431
No log 4.3158 246 0.7438 0.1568 0.7438 0.8625
No log 4.3509 248 0.6980 0.2500 0.6980 0.8355
No log 4.3860 250 0.8069 0.1718 0.8069 0.8983
No log 4.4211 252 0.7957 0.1712 0.7957 0.8920
No log 4.4561 254 0.6568 0.2644 0.6568 0.8104
No log 4.4912 256 0.6352 0.2000 0.6352 0.7970
No log 4.5263 258 0.6245 0.2099 0.6245 0.7902
No log 4.5614 260 0.6214 0.2099 0.6214 0.7883
No log 4.5965 262 0.6385 0.2749 0.6385 0.7991
No log 4.6316 264 0.6580 0.2360 0.6580 0.8112
No log 4.6667 266 0.7998 0.1111 0.7998 0.8943
No log 4.7018 268 0.9820 0.0988 0.9820 0.9910
No log 4.7368 270 0.8334 0.1131 0.8334 0.9129
No log 4.7719 272 0.6668 0.2370 0.6668 0.8166
No log 4.8070 274 0.6471 0.2683 0.6471 0.8044
No log 4.8421 276 0.6578 0.3103 0.6578 0.8111
No log 4.8772 278 0.6691 0.2370 0.6691 0.8180
No log 4.9123 280 0.6893 0.3953 0.6893 0.8303
No log 4.9474 282 0.8020 0.1493 0.8020 0.8956
No log 4.9825 284 0.7331 0.3016 0.7331 0.8562
No log 5.0175 286 0.6030 0.1195 0.6030 0.7765
No log 5.0526 288 0.7604 0.1456 0.7604 0.8720
No log 5.0877 290 0.9231 0.1366 0.9231 0.9608
No log 5.1228 292 0.7597 0.1456 0.7597 0.8716
No log 5.1579 294 0.5870 0.2000 0.5870 0.7662
No log 5.1930 296 0.8243 0.1560 0.8243 0.9079
No log 5.2281 298 0.9947 0.1040 0.9947 0.9974
No log 5.2632 300 0.8588 0.1579 0.8588 0.9267
No log 5.2982 302 0.6098 0.2889 0.6098 0.7809
No log 5.3333 304 0.5814 0.2771 0.5814 0.7625
No log 5.3684 306 0.5837 0.1902 0.5837 0.7640
No log 5.4035 308 0.6431 0.2865 0.6431 0.8019
No log 5.4386 310 0.8478 0.1579 0.8478 0.9207
No log 5.4737 312 0.8571 0.1588 0.8571 0.9258
No log 5.5088 314 0.6895 0.2500 0.6895 0.8304
No log 5.5439 316 0.6188 0.1698 0.6188 0.7866
No log 5.5789 318 0.6334 0.2663 0.6334 0.7959
No log 5.6140 320 0.6342 0.1500 0.6342 0.7964
No log 5.6491 322 0.6975 0.2245 0.6975 0.8352
No log 5.6842 324 0.8524 0.0742 0.8524 0.9232
No log 5.7193 326 0.9148 0.1870 0.9148 0.9564
No log 5.7544 328 0.8729 0.1169 0.8729 0.9343
No log 5.7895 330 0.7487 0.2000 0.7487 0.8653
No log 5.8246 332 0.6925 0.2842 0.6925 0.8322
No log 5.8596 334 0.7425 0.2000 0.7425 0.8617
No log 5.8947 336 0.7178 0.2821 0.7178 0.8472
No log 5.9298 338 0.6330 0.2471 0.6330 0.7956
No log 5.9649 340 0.6209 0.2000 0.6209 0.7880
No log 6.0 342 0.6572 0.2865 0.6572 0.8107
No log 6.0351 344 0.7934 0.1770 0.7934 0.8907
No log 6.0702 346 0.7802 0.2075 0.7802 0.8833
No log 6.1053 348 0.7614 0.2077 0.7614 0.8726
No log 6.1404 350 0.6876 0.2727 0.6876 0.8292
No log 6.1754 352 0.6724 0.2766 0.6724 0.8200
No log 6.2105 354 0.6619 0.2766 0.6619 0.8135
No log 6.2456 356 0.6729 0.2842 0.6729 0.8203
No log 6.2807 358 0.6740 0.2842 0.6740 0.8210
No log 6.3158 360 0.6601 0.2865 0.6601 0.8125
No log 6.3509 362 0.6564 0.2865 0.6564 0.8102
No log 6.3860 364 0.5968 0.2318 0.5968 0.7726
No log 6.4211 366 0.6215 0.2096 0.6215 0.7883
No log 6.4561 368 0.6343 0.2096 0.6343 0.7964
No log 6.4912 370 0.6427 0.2683 0.6427 0.8017
No log 6.5263 372 0.8248 0.1333 0.8248 0.9082
No log 6.5614 374 0.9798 0.1331 0.9798 0.9898
No log 6.5965 376 0.9254 0.2129 0.9254 0.9620
No log 6.6316 378 0.7464 0.1636 0.7464 0.8640
No log 6.6667 380 0.6526 0.2626 0.6526 0.8078
No log 6.7018 382 0.6307 0.3455 0.6307 0.7942
No log 6.7368 384 0.6308 0.2457 0.6308 0.7942
No log 6.7719 386 0.7294 0.2153 0.7294 0.8540
No log 6.8070 388 0.8127 0.1193 0.8127 0.9015
No log 6.8421 390 0.7523 0.1481 0.7523 0.8673
No log 6.8772 392 0.6397 0.3446 0.6397 0.7998
No log 6.9123 394 0.5755 0.2418 0.5755 0.7586
No log 6.9474 396 0.5698 0.2632 0.5698 0.7549
No log 6.9825 398 0.5688 0.2318 0.5688 0.7542
No log 7.0175 400 0.5895 0.2911 0.5895 0.7678
No log 7.0526 402 0.6007 0.2393 0.6007 0.7750
No log 7.0877 404 0.6024 0.2370 0.6024 0.7762
No log 7.1228 406 0.5988 0.2771 0.5988 0.7738
No log 7.1579 408 0.6140 0.2457 0.6140 0.7836
No log 7.1930 410 0.6740 0.2941 0.6740 0.8210
No log 7.2281 412 0.6938 0.2871 0.6938 0.8329
No log 7.2632 414 0.6577 0.2967 0.6577 0.8110
No log 7.2982 416 0.6335 0.2889 0.6335 0.7959
No log 7.3333 418 0.6050 0.3253 0.6050 0.7778
No log 7.3684 420 0.6032 0.3253 0.6032 0.7766
No log 7.4035 422 0.6175 0.2381 0.6175 0.7858
No log 7.4386 424 0.6737 0.2941 0.6737 0.8208
No log 7.4737 426 0.6903 0.2871 0.6903 0.8308
No log 7.5088 428 0.6674 0.2432 0.6674 0.8170
No log 7.5439 430 0.6346 0.2749 0.6346 0.7966
No log 7.5789 432 0.6314 0.4083 0.6314 0.7946
No log 7.6140 434 0.6296 0.2970 0.6296 0.7935
No log 7.6491 436 0.6227 0.3086 0.6227 0.7891
No log 7.6842 438 0.6232 0.3684 0.6232 0.7894
No log 7.7193 440 0.6544 0.2444 0.6544 0.8089
No log 7.7544 442 0.6617 0.2941 0.6617 0.8134
No log 7.7895 444 0.6293 0.3488 0.6293 0.7933
No log 7.8246 446 0.5912 0.1892 0.5912 0.7689
No log 7.8596 448 0.5880 0.2318 0.5880 0.7668
No log 7.8947 450 0.5914 0.2208 0.5914 0.7691
No log 7.9298 452 0.5966 0.1895 0.5966 0.7724
No log 7.9649 454 0.6217 0.2914 0.6217 0.7885
No log 8.0 456 0.6491 0.3446 0.6491 0.8057
No log 8.0351 458 0.6512 0.3446 0.6512 0.8070
No log 8.0702 460 0.6260 0.3488 0.6260 0.7912
No log 8.1053 462 0.6113 0.2771 0.6113 0.7818
No log 8.1404 464 0.6176 0.2393 0.6176 0.7858
No log 8.1754 466 0.6316 0.2289 0.6316 0.7947
No log 8.2105 468 0.6264 0.2857 0.6264 0.7915
No log 8.2456 470 0.6403 0.3295 0.6403 0.8002
No log 8.2807 472 0.7214 0.2464 0.7214 0.8493
No log 8.3158 474 0.8151 0.1453 0.8151 0.9028
No log 8.3509 476 0.8322 0.1464 0.8322 0.9122
No log 8.3860 478 0.7785 0.2074 0.7785 0.8823
No log 8.4211 480 0.7100 0.2821 0.7100 0.8426
No log 8.4561 482 0.6573 0.3913 0.6573 0.8107
No log 8.4912 484 0.6482 0.2626 0.6482 0.8051
No log 8.5263 486 0.6482 0.2527 0.6482 0.8051
No log 8.5614 488 0.6401 0.3258 0.6401 0.8001
No log 8.5965 490 0.6471 0.3591 0.6471 0.8044
No log 8.6316 492 0.6755 0.2842 0.6755 0.8219
No log 8.6667 494 0.7330 0.2464 0.7330 0.8562
No log 8.7018 496 0.7659 0.2074 0.7659 0.8752
No log 8.7368 498 0.7716 0.1776 0.7716 0.8784
0.381 8.7719 500 0.7337 0.2464 0.7337 0.8565
0.381 8.8070 502 0.6776 0.2842 0.6776 0.8231
0.381 8.8421 504 0.6371 0.3258 0.6371 0.7982
0.381 8.8772 506 0.6239 0.3023 0.6239 0.7899
0.381 8.9123 508 0.6216 0.3446 0.6216 0.7884
0.381 8.9474 510 0.6229 0.3216 0.6229 0.7893
0.381 8.9825 512 0.6301 0.3295 0.6301 0.7938
0.381 9.0175 514 0.6394 0.3829 0.6394 0.7996
0.381 9.0526 516 0.6575 0.3778 0.6575 0.8109
0.381 9.0877 518 0.6872 0.2917 0.6872 0.8290
0.381 9.1228 520 0.7152 0.2893 0.7152 0.8457
0.381 9.1579 522 0.7274 0.2475 0.7274 0.8529
0.381 9.1930 524 0.7142 0.2917 0.7142 0.8451
0.381 9.2281 526 0.6932 0.2917 0.6932 0.8326
0.381 9.2632 528 0.6712 0.3846 0.6712 0.8192
0.381 9.2982 530 0.6625 0.3846 0.6625 0.8139
0.381 9.3333 532 0.6603 0.3898 0.6603 0.8126
0.381 9.3684 534 0.6525 0.3778 0.6525 0.8078
0.381 9.4035 536 0.6515 0.3778 0.6515 0.8072
0.381 9.4386 538 0.6614 0.3778 0.6614 0.8132
0.381 9.4737 540 0.6670 0.3730 0.6670 0.8167
0.381 9.5088 542 0.6721 0.2917 0.6721 0.8198
0.381 9.5439 544 0.6720 0.2917 0.6720 0.8197
0.381 9.5789 546 0.6752 0.2917 0.6752 0.8217
0.381 9.6140 548 0.6763 0.2917 0.6763 0.8223
0.381 9.6491 550 0.6769 0.2917 0.6769 0.8227
0.381 9.6842 552 0.6764 0.2917 0.6764 0.8224
0.381 9.7193 554 0.6743 0.2917 0.6743 0.8211
0.381 9.7544 556 0.6744 0.2917 0.6744 0.8212
0.381 9.7895 558 0.6769 0.2917 0.6769 0.8227
0.381 9.8246 560 0.6808 0.2917 0.6808 0.8251
0.381 9.8596 562 0.6819 0.2917 0.6819 0.8258
0.381 9.8947 564 0.6807 0.2917 0.6807 0.8251
0.381 9.9298 566 0.6796 0.2917 0.6796 0.8244
0.381 9.9649 568 0.6795 0.2917 0.6795 0.8243
0.381 10.0 570 0.6795 0.2917 0.6795 0.8243

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model tree for MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k13_task3_organization

Finetuned from aubmindlab/bert-base-arabertv02 (a base model with 4,023 fine-tunes, including this model).