ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k13_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6801
  • Qwk: 0.2917
  • Mse: 0.6801
  • Rmse: 0.8247
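
For reference, Qwk is quadratic weighted kappa (agreement on ordinal scores) and Rmse is simply the square root of Mse (0.8247 ≈ √0.6801). The card does not say which implementation produced these numbers, so the following is an illustrative pure-Python sketch of both metrics:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, commonly used for ordinal scoring."""
    n = len(y_true)
    # observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # expected counts under chance agreement come from the marginal histograms
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2      # quadratic disagreement weight
            expected = hist_true[i] * hist_pred[j] / n   # chance-level count for cell (i, j)
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def rmse(mse):
    return math.sqrt(mse)
```

Perfect agreement yields a kappa of 1.0, chance-level agreement yields 0.0, so the reported 0.2917 indicates only weak agreement beyond chance.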

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
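
With roughly 57 optimizer steps per epoch (step 2 corresponds to epoch 0.0351 in the results table), 10 epochs give 570 total steps, over which the linear scheduler decays the learning rate from 2e-05 to 0. A sketch of that schedule, assuming no warmup since the card lists no warmup steps:

```python
def linear_lr(step, total_steps=570, base_lr=2e-05, warmup_steps=0):
    # Linear schedule: optional linear warmup, then linear decay to 0.
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

Halfway through training (step 285) the learning rate has dropped to 1e-05, and it reaches 0 exactly at the final step.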

Training results

A "No log" entry means the training loss had not yet been logged at that evaluation step; the first recorded training loss (0.3811) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 3.5176 0.0026 3.5176 1.8755
No log 0.0702 4 1.8704 -0.0130 1.8704 1.3676
No log 0.1053 6 1.2256 0.0530 1.2256 1.1071
No log 0.1404 8 0.8083 0.2081 0.8083 0.8990
No log 0.1754 10 0.9592 0.1588 0.9592 0.9794
No log 0.2105 12 0.8752 0.1493 0.8752 0.9355
No log 0.2456 14 0.7960 0.1845 0.7960 0.8922
No log 0.2807 16 0.6874 0.0569 0.6874 0.8291
No log 0.3158 18 0.7041 0.0569 0.7041 0.8391
No log 0.3509 20 0.7219 0.0409 0.7219 0.8496
No log 0.3860 22 0.6704 0.0137 0.6704 0.8188
No log 0.4211 24 0.7890 0.1411 0.7890 0.8882
No log 0.4561 26 1.0988 -0.1077 1.0988 1.0482
No log 0.4912 28 1.1590 -0.0963 1.1590 1.0766
No log 0.5263 30 0.7116 0.0435 0.7116 0.8436
No log 0.5614 32 0.7502 0.0769 0.7502 0.8661
No log 0.5965 34 0.7285 0.1813 0.7285 0.8535
No log 0.6316 36 1.0540 0.0367 1.0540 1.0266
No log 0.6667 38 1.0011 0.1273 1.0011 1.0005
No log 0.7018 40 0.7702 0.2340 0.7702 0.8776
No log 0.7368 42 0.7045 0.0891 0.7045 0.8393
No log 0.7719 44 0.8432 0.2140 0.8432 0.9183
No log 0.8070 46 0.7144 0.2563 0.7144 0.8452
No log 0.8421 48 0.8678 0.1357 0.8678 0.9315
No log 0.8772 50 0.8429 0.1770 0.8429 0.9181
No log 0.9123 52 0.6182 0.3016 0.6182 0.7863
No log 0.9474 54 0.9506 0.1930 0.9506 0.9750
No log 0.9825 56 0.9884 0.2137 0.9884 0.9942
No log 1.0175 58 0.6176 0.0805 0.6176 0.7859
No log 1.0526 60 0.7773 0.1515 0.7773 0.8817
No log 1.0877 62 0.8049 0.1416 0.8049 0.8972
No log 1.1228 64 0.6388 0.3073 0.6388 0.7992
No log 1.1579 66 0.8656 0.1366 0.8656 0.9304
No log 1.1930 68 1.9819 0.1111 1.9819 1.4078
No log 1.2281 70 2.0746 0.0950 2.0746 1.4403
No log 1.2632 72 0.8658 0.2579 0.8658 0.9305
No log 1.2982 74 0.6834 0.1852 0.6834 0.8267
No log 1.3333 76 0.8243 0.1927 0.8243 0.9079
No log 1.3684 78 0.8730 0.1867 0.8730 0.9343
No log 1.4035 80 0.7639 0.2986 0.7639 0.8740
No log 1.4386 82 1.1133 0.0949 1.1133 1.0551
No log 1.4737 84 1.2999 0.1531 1.2999 1.1402
No log 1.5088 86 0.8950 0.1545 0.8950 0.9461
No log 1.5439 88 0.8437 0.0909 0.8437 0.9186
No log 1.5789 90 1.0514 0.1111 1.0514 1.0254
No log 1.6140 92 0.9929 0.1111 0.9929 0.9965
No log 1.6491 94 0.8094 0.0099 0.8094 0.8997
No log 1.6842 96 0.7545 0.1200 0.7545 0.8686
No log 1.7193 98 0.7918 0.1489 0.7918 0.8898
No log 1.7544 100 1.0849 0.0492 1.0849 1.0416
No log 1.7895 102 1.7221 0.1099 1.7221 1.3123
No log 1.8246 104 1.4792 0.0541 1.4792 1.2162
No log 1.8596 106 0.8686 0.0680 0.8686 0.9320
No log 1.8947 108 0.8244 0.1489 0.8244 0.9079
No log 1.9298 110 0.9578 0.1148 0.9578 0.9787
No log 1.9649 112 0.6903 0.2184 0.6903 0.8308
No log 2.0 114 0.6069 0.1902 0.6069 0.7790
No log 2.0351 116 0.6273 0.2298 0.6273 0.7920
No log 2.0702 118 0.9828 0.2500 0.9828 0.9914
No log 2.1053 120 0.7358 0.1921 0.7358 0.8578
No log 2.1404 122 0.5681 0.2611 0.5681 0.7537
No log 2.1754 124 0.5684 0.3000 0.5684 0.7539
No log 2.2105 126 0.8120 0.1925 0.8120 0.9011
No log 2.2456 128 0.7971 0.1925 0.7971 0.8928
No log 2.2807 130 0.5846 0.2370 0.5846 0.7646
No log 2.3158 132 0.5927 0.2370 0.5927 0.7699
No log 2.3509 134 0.7380 0.2323 0.7380 0.8591
No log 2.3860 136 0.7265 0.2323 0.7265 0.8523
No log 2.4211 138 0.5801 0.2704 0.5801 0.7617
No log 2.4561 140 0.8848 0.1429 0.8848 0.9407
No log 2.4912 142 0.6829 0.1135 0.6829 0.8264
No log 2.5263 144 0.7646 0.2332 0.7646 0.8744
No log 2.5614 146 0.9641 0.0988 0.9641 0.9819
No log 2.5965 148 0.6327 0.3292 0.6327 0.7954
No log 2.6316 150 0.6638 0.1111 0.6638 0.8148
No log 2.6667 152 0.6354 0.0838 0.6354 0.7971
No log 2.7018 154 0.7506 0.2332 0.7506 0.8664
No log 2.7368 156 0.9712 0.2180 0.9712 0.9855
No log 2.7719 158 0.7002 0.2421 0.7002 0.8368
No log 2.8070 160 0.6463 0.1910 0.6463 0.8039
No log 2.8421 162 0.6219 0.1000 0.6219 0.7886
No log 2.8772 164 0.7196 0.2258 0.7196 0.8483
No log 2.9123 166 1.3114 0.0952 1.3114 1.1452
No log 2.9474 168 1.1558 0.1383 1.1558 1.0751
No log 2.9825 170 0.6925 0.2258 0.6925 0.8322
No log 3.0175 172 0.5754 0.2000 0.5754 0.7585
No log 3.0526 174 0.5900 0.2099 0.5900 0.7681
No log 3.0877 176 0.8066 0.1923 0.8066 0.8981
No log 3.1228 178 0.7707 0.2075 0.7707 0.8779
No log 3.1579 180 0.6465 0.2179 0.6465 0.8041
No log 3.1930 182 0.8414 0.1770 0.8414 0.9173
No log 3.2281 184 0.8008 0.1527 0.8008 0.8949
No log 3.2632 186 0.7094 0.1011 0.7094 0.8423
No log 3.2982 188 0.8028 0.0647 0.8028 0.8960
No log 3.3333 190 0.6985 0.1135 0.6985 0.8358
No log 3.3684 192 0.7038 0.1135 0.7038 0.8390
No log 3.4035 194 0.7721 0.1269 0.7721 0.8787
No log 3.4386 196 0.7639 0.1600 0.7639 0.8740
No log 3.4737 198 0.7174 0.1923 0.7174 0.8470
No log 3.5088 200 0.7473 0.1600 0.7473 0.8644
No log 3.5439 202 0.9692 0.1807 0.9692 0.9845
No log 3.5789 204 0.7537 0.0995 0.7537 0.8682
No log 3.6140 206 0.6077 0.2832 0.6077 0.7795
No log 3.6491 208 0.6053 0.1813 0.6053 0.7780
No log 3.6842 210 0.8096 0.1570 0.8096 0.8998
No log 3.7193 212 0.8354 0.1570 0.8354 0.9140
No log 3.7544 214 0.5875 0.2593 0.5875 0.7665
No log 3.7895 216 0.6213 0.1675 0.6213 0.7882
No log 3.8246 218 0.5958 0.2251 0.5958 0.7719
No log 3.8596 220 0.5507 0.3631 0.5507 0.7421
No log 3.8947 222 0.9162 0.1652 0.9162 0.9572
No log 3.9298 224 1.0833 0.1127 1.0833 1.0408
No log 3.9649 226 0.8371 0.1273 0.8371 0.9149
No log 4.0 228 0.5489 0.1888 0.5489 0.7409
No log 4.0351 230 0.7264 0.1619 0.7264 0.8523
No log 4.0702 232 0.7322 0.1619 0.7322 0.8557
No log 4.1053 234 0.5658 0.2405 0.5658 0.7522
No log 4.1404 236 0.7624 0.2233 0.7624 0.8732
No log 4.1754 238 1.0396 0.1062 1.0396 1.0196
No log 4.2105 240 0.9615 0.0968 0.9615 0.9806
No log 4.2456 242 0.7029 0.2850 0.7029 0.8384
No log 4.2807 244 0.7109 0.1285 0.7109 0.8431
No log 4.3158 246 0.7441 0.1568 0.7441 0.8626
No log 4.3509 248 0.6979 0.2500 0.6979 0.8354
No log 4.3860 250 0.8070 0.1718 0.8070 0.8983
No log 4.4211 252 0.7961 0.1712 0.7961 0.8923
No log 4.4561 254 0.6570 0.2644 0.6570 0.8105
No log 4.4912 256 0.6353 0.2000 0.6353 0.7971
No log 4.5263 258 0.6246 0.2099 0.6246 0.7903
No log 4.5614 260 0.6216 0.2099 0.6216 0.7884
No log 4.5965 262 0.6385 0.2749 0.6385 0.7990
No log 4.6316 264 0.6580 0.2360 0.6580 0.8112
No log 4.6667 266 0.8004 0.1111 0.8004 0.8946
No log 4.7018 268 0.9827 0.0988 0.9827 0.9913
No log 4.7368 270 0.8341 0.1131 0.8341 0.9133
No log 4.7719 272 0.6672 0.2370 0.6672 0.8168
No log 4.8070 274 0.6473 0.2683 0.6473 0.8045
No log 4.8421 276 0.6579 0.3103 0.6579 0.8111
No log 4.8772 278 0.6691 0.2370 0.6691 0.8180
No log 4.9123 280 0.6893 0.3953 0.6893 0.8303
No log 4.9474 282 0.8023 0.1493 0.8023 0.8957
No log 4.9825 284 0.7337 0.3016 0.7337 0.8565
No log 5.0175 286 0.6031 0.1282 0.6031 0.7766
No log 5.0526 288 0.7605 0.1456 0.7605 0.8720
No log 5.0877 290 0.9241 0.1366 0.9241 0.9613
No log 5.1228 292 0.7610 0.1456 0.7610 0.8724
No log 5.1579 294 0.5870 0.2000 0.5870 0.7662
No log 5.1930 296 0.8281 0.1211 0.8281 0.9100
No log 5.2281 298 1.0110 0.1373 1.0110 1.0055
No log 5.2632 300 0.8796 0.1579 0.8796 0.9379
No log 5.2982 302 0.6179 0.2889 0.6179 0.7861
No log 5.3333 304 0.5813 0.2771 0.5813 0.7625
No log 5.3684 306 0.5845 0.2771 0.5845 0.7645
No log 5.4035 308 0.6229 0.2350 0.6229 0.7893
No log 5.4386 310 0.8145 0.1560 0.8145 0.9025
No log 5.4737 312 0.8374 0.1928 0.8374 0.9151
No log 5.5088 314 0.6850 0.2821 0.6850 0.8277
No log 5.5439 316 0.6189 0.1698 0.6189 0.7867
No log 5.5789 318 0.6325 0.2663 0.6325 0.7953
No log 5.6140 320 0.6341 0.1500 0.6341 0.7963
No log 5.6491 322 0.6960 0.2245 0.6960 0.8343
No log 5.6842 324 0.8431 0.0742 0.8431 0.9182
No log 5.7193 326 0.9004 0.1525 0.9004 0.9489
No log 5.7544 328 0.8719 0.1169 0.8719 0.9337
No log 5.7895 330 0.7548 0.2000 0.7548 0.8688
No log 5.8246 332 0.6973 0.2821 0.6973 0.8351
No log 5.8596 334 0.7480 0.1619 0.7480 0.8649
No log 5.8947 336 0.7106 0.2821 0.7106 0.8429
No log 5.9298 338 0.6309 0.2000 0.6309 0.7943
No log 5.9649 340 0.6241 0.2000 0.6241 0.7900
No log 6.0 342 0.6676 0.2842 0.6676 0.8171
No log 6.0351 344 0.8086 0.1481 0.8086 0.8992
No log 6.0702 346 0.7846 0.2075 0.7846 0.8858
No log 6.1053 348 0.7542 0.2077 0.7542 0.8685
No log 6.1404 350 0.6821 0.3161 0.6821 0.8259
No log 6.1754 352 0.6718 0.2766 0.6718 0.8196
No log 6.2105 354 0.6636 0.2766 0.6636 0.8146
No log 6.2456 356 0.6760 0.2842 0.6760 0.8222
No log 6.2807 358 0.6753 0.2842 0.6753 0.8217
No log 6.3158 360 0.6565 0.2842 0.6565 0.8103
No log 6.3509 362 0.6515 0.2865 0.6515 0.8072
No log 6.3860 364 0.5941 0.2318 0.5941 0.7708
No log 6.4211 366 0.6203 0.2096 0.6203 0.7876
No log 6.4561 368 0.6312 0.2096 0.6312 0.7945
No log 6.4912 370 0.6418 0.2749 0.6418 0.8011
No log 6.5263 372 0.8300 0.1055 0.8300 0.9110
No log 6.5614 374 0.9874 0.1331 0.9874 0.9937
No log 6.5965 376 0.9319 0.2129 0.9319 0.9653
No log 6.6316 378 0.7488 0.1636 0.7488 0.8653
No log 6.6667 380 0.6517 0.2727 0.6517 0.8073
No log 6.7018 382 0.6302 0.3455 0.6302 0.7938
No log 6.7368 384 0.6291 0.2471 0.6291 0.7931
No log 6.7719 386 0.7257 0.2153 0.7257 0.8519
No log 6.8070 388 0.8101 0.1193 0.8101 0.9000
No log 6.8421 390 0.7513 0.1481 0.7513 0.8668
No log 6.8772 392 0.6395 0.3023 0.6395 0.7997
No log 6.9123 394 0.5752 0.2000 0.5752 0.7584
No log 6.9474 396 0.5696 0.1678 0.5696 0.7547
No log 6.9825 398 0.5685 0.2318 0.5685 0.7540
No log 7.0175 400 0.5871 0.2911 0.5871 0.7662
No log 7.0526 402 0.5984 0.2911 0.5984 0.7736
No log 7.0877 404 0.6030 0.2832 0.6030 0.7765
No log 7.1228 406 0.6003 0.2857 0.6003 0.7748
No log 7.1579 408 0.6166 0.2457 0.6166 0.7852
No log 7.1930 410 0.6753 0.2941 0.6753 0.8218
No log 7.2281 412 0.6933 0.2871 0.6933 0.8326
No log 7.2632 414 0.6567 0.2967 0.6567 0.8104
No log 7.2982 416 0.6329 0.2889 0.6329 0.7956
No log 7.3333 418 0.6054 0.3253 0.6054 0.7781
No log 7.3684 420 0.6036 0.2795 0.6036 0.7769
No log 7.4035 422 0.6177 0.2381 0.6177 0.7859
No log 7.4386 424 0.6737 0.2941 0.6737 0.8208
No log 7.4737 426 0.6905 0.2871 0.6905 0.8310
No log 7.5088 428 0.6679 0.2432 0.6679 0.8173
No log 7.5439 430 0.6352 0.2749 0.6352 0.7970
No log 7.5789 432 0.6321 0.4083 0.6321 0.7951
No log 7.6140 434 0.6313 0.3121 0.6313 0.7946
No log 7.6491 436 0.6241 0.3086 0.6241 0.7900
No log 7.6842 438 0.6223 0.2795 0.6223 0.7888
No log 7.7193 440 0.6499 0.2444 0.6499 0.8062
No log 7.7544 442 0.6567 0.2994 0.6567 0.8104
No log 7.7895 444 0.6267 0.2914 0.6267 0.7916
No log 7.8246 446 0.5919 0.1892 0.5919 0.7694
No log 7.8596 448 0.5895 0.2318 0.5895 0.7678
No log 7.8947 450 0.5930 0.2318 0.5930 0.7701
No log 7.9298 452 0.5980 0.1892 0.5980 0.7733
No log 7.9649 454 0.6220 0.2471 0.6220 0.7886
No log 8.0 456 0.6479 0.3488 0.6479 0.8049
No log 8.0351 458 0.6494 0.3446 0.6494 0.8059
No log 8.0702 460 0.6246 0.3412 0.6246 0.7903
No log 8.1053 462 0.6114 0.2771 0.6114 0.7819
No log 8.1404 464 0.6195 0.2393 0.6195 0.7871
No log 8.1754 466 0.6340 0.2189 0.6340 0.7963
No log 8.2105 468 0.6282 0.2857 0.6282 0.7926
No log 8.2456 470 0.6408 0.2857 0.6408 0.8005
No log 8.2807 472 0.7194 0.2464 0.7194 0.8482
No log 8.3158 474 0.8099 0.1453 0.8099 0.8999
No log 8.3509 476 0.8254 0.1464 0.8254 0.9085
No log 8.3860 478 0.7725 0.2074 0.7725 0.8789
No log 8.4211 480 0.7074 0.2821 0.7074 0.8410
No log 8.4561 482 0.6580 0.3520 0.6580 0.8112
No log 8.4912 484 0.6513 0.2626 0.6513 0.8070
No log 8.5263 486 0.6516 0.2090 0.6516 0.8072
No log 8.5614 488 0.6424 0.3258 0.6424 0.8015
No log 8.5965 490 0.6477 0.3913 0.6477 0.8048
No log 8.6316 492 0.6744 0.2842 0.6744 0.8212
No log 8.6667 494 0.7298 0.2464 0.7298 0.8543
No log 8.7018 496 0.7621 0.1776 0.7621 0.8730
No log 8.7368 498 0.7683 0.1776 0.7683 0.8765
0.3811 8.7719 500 0.7315 0.2464 0.7315 0.8553
0.3811 8.8070 502 0.6769 0.2842 0.6769 0.8228
0.3811 8.8421 504 0.6381 0.3258 0.6381 0.7988
0.3811 8.8772 506 0.6256 0.3023 0.6256 0.7910
0.3811 8.9123 508 0.6233 0.3023 0.6233 0.7895
0.3811 8.9474 510 0.6245 0.3563 0.6245 0.7902
0.3811 8.9825 512 0.6313 0.3295 0.6313 0.7945
0.3811 9.0175 514 0.6400 0.3258 0.6400 0.8000
0.3811 9.0526 516 0.6569 0.3778 0.6569 0.8105
0.3811 9.0877 518 0.6852 0.2917 0.6852 0.8278
0.3811 9.1228 520 0.7119 0.2917 0.7119 0.8438
0.3811 9.1579 522 0.7246 0.2475 0.7246 0.8512
0.3811 9.1930 524 0.7122 0.2917 0.7122 0.8439
0.3811 9.2281 526 0.6920 0.2917 0.6920 0.8319
0.3811 9.2632 528 0.6710 0.3846 0.6710 0.8191
0.3811 9.2982 530 0.6628 0.3898 0.6628 0.8141
0.3811 9.3333 532 0.6607 0.3778 0.6607 0.8128
0.3811 9.3684 534 0.6530 0.3778 0.6530 0.8081
0.3811 9.4035 536 0.6518 0.3778 0.6518 0.8073
0.3811 9.4386 538 0.6610 0.3778 0.6610 0.8130
0.3811 9.4737 540 0.6662 0.3778 0.6662 0.8162
0.3811 9.5088 542 0.6714 0.2821 0.6714 0.8194
0.3811 9.5439 544 0.6716 0.2821 0.6716 0.8195
0.3811 9.5789 546 0.6751 0.2917 0.6751 0.8217
0.3811 9.6140 548 0.6767 0.2917 0.6767 0.8226
0.3811 9.6491 550 0.6776 0.2917 0.6776 0.8232
0.3811 9.6842 552 0.6774 0.2917 0.6774 0.8230
0.3811 9.7193 554 0.6754 0.2917 0.6754 0.8218
0.3811 9.7544 556 0.6754 0.2917 0.6754 0.8219
0.3811 9.7895 558 0.6778 0.2917 0.6778 0.8233
0.3811 9.8246 560 0.6815 0.2917 0.6815 0.8255
0.3811 9.8596 562 0.6825 0.2917 0.6825 0.8262
0.3811 9.8947 564 0.6813 0.2917 0.6813 0.8254
0.3811 9.9298 566 0.6801 0.2917 0.6801 0.8247
0.3811 9.9649 568 0.6800 0.2917 0.6800 0.8246
0.3811 10.0 570 0.6801 0.2917 0.6801 0.8247

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
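
To reproduce this environment, the versions above can be pinned directly; the CUDA 11.8 index URL is an assumption matching the "+cu118" build tag:

```shell
# Pin the framework versions listed above
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
# The +cu118 build of PyTorch comes from the CUDA 11.8 wheel index
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```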
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k13_task3_organization

Finetuned from aubmindlab/bert-base-arabertv02