ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k18_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9698
  • QWK: 0.2743
  • MSE: 0.9698
  • RMSE: 0.9848
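The QWK column is Cohen's kappa with quadratic weights, and the training loss is mean squared error, which is why Loss and MSE coincide while RMSE is simply its square root. A minimal NumPy sketch of how the three metrics relate (label values and class count below are illustrative, not from this model's evaluation set):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the QWK metric above)."""
    # Observed confusion matrix.
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic disagreement weights: w[i, j] = (i - j)^2 / (N - 1)^2.
    idx = np.arange(n_classes)
    w = np.square(idx[:, None] - idx[None, :]) / (n_classes - 1) ** 2
    # Expected matrix under chance agreement (outer product of marginals).
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (w * O).sum() / (w * E).sum()

# Illustrative ordinal labels (e.g. essay scores 0-3).
y_true = [0, 1, 2, 3, 2]
y_pred = [0, 1, 1, 3, 2]

qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
mse = np.mean((np.array(y_true) - np.array(y_pred)) ** 2)
rmse = np.sqrt(mse)  # RMSE is always the square root of MSE
```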

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
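Assuming the standard Hugging Face Trainer was used (the card does not confirm this), the hyperparameters above map onto a TrainingArguments configuration like the sketch below. output_dir is illustrative, and the Adam betas (0.9, 0.999) and epsilon 1e-08 listed above are the Trainer defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

# Sketch of the training configuration implied by the hyperparameter list;
# the dataset and output paths are placeholders, not documented in this card.
args = TrainingArguments(
    output_dir="./results",            # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```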

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0364 2 4.7124 0.0010 4.7124 2.1708
No log 0.0727 4 2.7950 0.0256 2.7950 1.6718
No log 0.1091 6 2.0037 0.0041 2.0037 1.4155
No log 0.1455 8 2.4059 -0.0233 2.4059 1.5511
No log 0.1818 10 1.7765 0.0372 1.7765 1.3328
No log 0.2182 12 1.5301 0.0372 1.5301 1.2370
No log 0.2545 14 1.4699 0.0372 1.4699 1.2124
No log 0.2909 16 1.6785 0.0925 1.6785 1.2956
No log 0.3273 18 2.0285 0.1060 2.0285 1.4242
No log 0.3636 20 2.2768 0.0599 2.2768 1.5089
No log 0.4 22 1.9544 0.1060 1.9544 1.3980
No log 0.4364 24 1.5428 0.0925 1.5428 1.2421
No log 0.4727 26 1.2213 0.1715 1.2213 1.1051
No log 0.5091 28 1.1485 0.1521 1.1485 1.0717
No log 0.5455 30 1.1689 0.1521 1.1689 1.0812
No log 0.5818 32 1.1900 0.1521 1.1900 1.0909
No log 0.6182 34 1.2162 0.1110 1.2162 1.1028
No log 0.6545 36 1.2494 0.0945 1.2494 1.1178
No log 0.6909 38 1.3201 0.0532 1.3201 1.1489
No log 0.7273 40 1.4029 -0.1531 1.4029 1.1845
No log 0.7636 42 1.4493 -0.1696 1.4493 1.2039
No log 0.8 44 1.3581 -0.1022 1.3581 1.1654
No log 0.8364 46 1.2856 -0.0181 1.2856 1.1338
No log 0.8727 48 1.1756 0.1654 1.1756 1.0843
No log 0.9091 50 1.1441 0.1546 1.1441 1.0696
No log 0.9455 52 1.1390 0.0974 1.1390 1.0673
No log 0.9818 54 1.1732 0.0974 1.1732 1.0832
No log 1.0182 56 1.1778 0.2023 1.1778 1.0853
No log 1.0545 58 1.1943 0.1585 1.1943 1.0928
No log 1.0909 60 1.1884 0.0974 1.1884 1.0901
No log 1.1273 62 1.1824 0.1609 1.1824 1.0874
No log 1.1636 64 1.2066 0.0914 1.2066 1.0984
No log 1.2 66 1.2098 0.1952 1.2098 1.0999
No log 1.2364 68 1.1842 0.0642 1.1842 1.0882
No log 1.2727 70 1.1547 0.1211 1.1547 1.0746
No log 1.3091 72 1.1678 0.1918 1.1678 1.0807
No log 1.3455 74 1.1995 0.2689 1.1995 1.0952
No log 1.3818 76 1.2280 0.1793 1.2280 1.1082
No log 1.4182 78 1.4556 0.1174 1.4556 1.2065
No log 1.4545 80 1.4183 0.1871 1.4183 1.1909
No log 1.4909 82 1.2327 0.1089 1.2327 1.1103
No log 1.5273 84 1.2212 0.2692 1.2212 1.1051
No log 1.5636 86 1.2365 0.1895 1.2365 1.1120
No log 1.6 88 1.2398 0.1895 1.2398 1.1135
No log 1.6364 90 1.2600 0.1119 1.2600 1.1225
No log 1.6727 92 1.2548 0.1119 1.2548 1.1202
No log 1.7091 94 1.2673 0.1651 1.2673 1.1257
No log 1.7455 96 1.2829 0.2108 1.2829 1.1327
No log 1.7818 98 1.3189 0.1024 1.3189 1.1484
No log 1.8182 100 1.6425 0.1448 1.6425 1.2816
No log 1.8545 102 1.5900 0.1756 1.5900 1.2610
No log 1.8909 104 1.3301 0.1234 1.3301 1.1533
No log 1.9273 106 1.1944 0.3344 1.1944 1.0929
No log 1.9636 108 1.1994 0.2198 1.1994 1.0952
No log 2.0 110 1.3656 0.1288 1.3656 1.1686
No log 2.0364 112 1.3650 0.1288 1.3650 1.1683
No log 2.0727 114 1.2806 0.0796 1.2806 1.1316
No log 2.1091 116 1.2124 0.2417 1.2124 1.1011
No log 2.1455 118 1.2366 0.2113 1.2366 1.1120
No log 2.1818 120 1.2721 0.1650 1.2721 1.1279
No log 2.2182 122 1.3046 0.1737 1.3046 1.1422
No log 2.2545 124 1.3353 0.1182 1.3353 1.1555
No log 2.2909 126 1.3745 0.0503 1.3745 1.1724
No log 2.3273 128 1.4157 0.0555 1.4157 1.1898
No log 2.3636 130 1.4562 0.0480 1.4562 1.2068
No log 2.4 132 1.4785 0.1744 1.4785 1.2159
No log 2.4364 134 1.4723 0.1621 1.4723 1.2134
No log 2.4727 136 1.4221 0.0433 1.4221 1.1925
No log 2.5091 138 1.4050 0.0941 1.4050 1.1853
No log 2.5455 140 1.3265 0.0786 1.3265 1.1517
No log 2.5818 142 1.2830 0.0585 1.2830 1.1327
No log 2.6182 144 1.4769 0.1080 1.4769 1.2153
No log 2.6545 146 1.4770 0.1138 1.4770 1.2153
No log 2.6909 148 1.2194 0.1447 1.2194 1.1042
No log 2.7273 150 1.1890 0.1447 1.1890 1.0904
No log 2.7636 152 1.4437 0.2019 1.4437 1.2015
No log 2.8 154 1.6615 0.1571 1.6615 1.2890
No log 2.8364 156 1.5904 0.1140 1.5904 1.2611
No log 2.8727 158 1.2978 0.1437 1.2978 1.1392
No log 2.9091 160 1.0786 0.1774 1.0786 1.0386
No log 2.9455 162 1.0685 0.2831 1.0685 1.0337
No log 2.9818 164 1.1466 0.1791 1.1466 1.0708
No log 3.0182 166 1.4550 0.2733 1.4550 1.2062
No log 3.0545 168 1.5064 0.2733 1.5064 1.2273
No log 3.0909 170 1.2346 0.2424 1.2346 1.1111
No log 3.1273 172 1.0787 0.3084 1.0787 1.0386
No log 3.1636 174 1.0816 0.3132 1.0816 1.0400
No log 3.2 176 1.0509 0.3619 1.0509 1.0252
No log 3.2364 178 1.0214 0.3619 1.0214 1.0106
No log 3.2727 180 1.0828 0.2095 1.0828 1.0406
No log 3.3091 182 1.2138 0.2242 1.2138 1.1017
No log 3.3455 184 1.1121 0.2750 1.1121 1.0545
No log 3.3818 186 0.9785 0.3463 0.9785 0.9892
No log 3.4182 188 0.9714 0.3275 0.9714 0.9856
No log 3.4545 190 0.9862 0.3463 0.9862 0.9931
No log 3.4909 192 1.0514 0.3381 1.0514 1.0254
No log 3.5273 194 1.2557 0.3093 1.2557 1.1206
No log 3.5636 196 1.3485 0.2559 1.3485 1.1613
No log 3.6 198 1.1989 0.2243 1.1989 1.0949
No log 3.6364 200 1.0504 0.2369 1.0504 1.0249
No log 3.6727 202 1.1188 0.2317 1.1188 1.0577
No log 3.7091 204 1.1305 0.2288 1.1305 1.0632
No log 3.7455 206 1.1146 0.2048 1.1146 1.0558
No log 3.7818 208 1.1654 0.2485 1.1654 1.0795
No log 3.8182 210 1.1459 0.2812 1.1459 1.0705
No log 3.8545 212 1.1521 0.2479 1.1521 1.0733
No log 3.8909 214 1.1496 0.2811 1.1496 1.0722
No log 3.9273 216 1.1411 0.1878 1.1411 1.0682
No log 3.9636 218 1.1468 0.1593 1.1468 1.0709
No log 4.0 220 1.1504 0.1593 1.1504 1.0726
No log 4.0364 222 1.1460 0.1732 1.1460 1.0705
No log 4.0727 224 1.1458 0.1632 1.1458 1.0704
No log 4.1091 226 1.1530 0.1556 1.1530 1.0738
No log 4.1455 228 1.1663 0.1766 1.1663 1.0800
No log 4.1818 230 1.1711 0.2568 1.1711 1.0822
No log 4.2182 232 1.2096 0.3390 1.2096 1.0998
No log 4.2545 234 1.2680 0.3294 1.2680 1.1261
No log 4.2909 236 1.2592 0.2937 1.2592 1.1221
No log 4.3273 238 1.1544 0.3505 1.1544 1.0744
No log 4.3636 240 1.0943 0.2822 1.0943 1.0461
No log 4.4 242 1.0842 0.2612 1.0842 1.0412
No log 4.4364 244 1.0757 0.1849 1.0757 1.0371
No log 4.4727 246 1.1036 0.2085 1.1036 1.0505
No log 4.5091 248 1.1538 0.2233 1.1538 1.0742
No log 4.5455 250 1.2745 0.2491 1.2745 1.1289
No log 4.5818 252 1.2330 0.2138 1.2330 1.1104
No log 4.6182 254 1.1706 0.2493 1.1706 1.0819
No log 4.6545 256 1.3100 0.3294 1.3100 1.1446
No log 4.6909 258 1.2748 0.3061 1.2748 1.1291
No log 4.7273 260 1.2727 0.1647 1.2727 1.1281
No log 4.7636 262 1.6638 0.1226 1.6638 1.2899
No log 4.8 264 1.9204 0.1053 1.9204 1.3858
No log 4.8364 266 1.7670 0.1154 1.7670 1.3293
No log 4.8727 268 1.3882 0.2705 1.3882 1.1782
No log 4.9091 270 1.1365 0.2068 1.1365 1.0661
No log 4.9455 272 1.1228 0.3069 1.1228 1.0596
No log 4.9818 274 1.0789 0.2360 1.0789 1.0387
No log 5.0182 276 1.0522 0.1556 1.0522 1.0258
No log 5.0545 278 1.1113 0.2342 1.1113 1.0542
No log 5.0909 280 1.1970 0.2342 1.1970 1.0941
No log 5.1273 282 1.1506 0.2342 1.1506 1.0727
No log 5.1636 284 1.0875 0.2180 1.0875 1.0429
No log 5.2 286 1.0408 0.2471 1.0408 1.0202
No log 5.2364 288 1.0325 0.2369 1.0325 1.0161
No log 5.2727 290 1.0266 0.2010 1.0266 1.0132
No log 5.3091 292 1.0617 0.2124 1.0617 1.0304
No log 5.3455 294 1.0991 0.2432 1.0991 1.0484
No log 5.3818 296 1.0740 0.2618 1.0740 1.0364
No log 5.4182 298 1.0223 0.2681 1.0223 1.0111
No log 5.4545 300 1.0020 0.3201 1.0020 1.0010
No log 5.4909 302 0.9946 0.3151 0.9946 0.9973
No log 5.5273 304 0.9765 0.2678 0.9765 0.9882
No log 5.5636 306 0.9619 0.3354 0.9619 0.9808
No log 5.6 308 0.9613 0.2843 0.9613 0.9805
No log 5.6364 310 0.9483 0.2843 0.9483 0.9738
No log 5.6727 312 0.9726 0.3880 0.9726 0.9862
No log 5.7091 314 1.0292 0.4259 1.0292 1.0145
No log 5.7455 316 1.0730 0.4128 1.0730 1.0359
No log 5.7818 318 1.1013 0.4053 1.1013 1.0495
No log 5.8182 320 1.0592 0.3412 1.0592 1.0292
No log 5.8545 322 1.0165 0.2545 1.0165 1.0082
No log 5.8909 324 1.0188 0.2447 1.0188 1.0094
No log 5.9273 326 1.0222 0.2402 1.0222 1.0110
No log 5.9636 328 1.0269 0.1951 1.0269 1.0134
No log 6.0 330 1.0326 0.2737 1.0326 1.0162
No log 6.0364 332 1.0272 0.2903 1.0272 1.0135
No log 6.0727 334 1.0162 0.2584 1.0162 1.0081
No log 6.1091 336 1.0901 0.3152 1.0901 1.0441
No log 6.1455 338 1.0934 0.3152 1.0934 1.0457
No log 6.1818 340 1.0234 0.3790 1.0234 1.0116
No log 6.2182 342 0.9651 0.3660 0.9651 0.9824
No log 6.2545 344 0.9491 0.2755 0.9491 0.9742
No log 6.2909 346 0.9486 0.3819 0.9486 0.9739
No log 6.3273 348 0.9282 0.3301 0.9282 0.9634
No log 6.3636 350 0.9579 0.3848 0.9579 0.9787
No log 6.4 352 1.0065 0.3009 1.0065 1.0032
No log 6.4364 354 0.9622 0.2961 0.9622 0.9809
No log 6.4727 356 0.8854 0.4102 0.8854 0.9410
No log 6.5091 358 0.8879 0.5175 0.8879 0.9423
No log 6.5455 360 0.8823 0.5175 0.8823 0.9393
No log 6.5818 362 0.8527 0.4736 0.8527 0.9234
No log 6.6182 364 0.8528 0.4440 0.8528 0.9235
No log 6.6545 366 0.8651 0.3230 0.8651 0.9301
No log 6.6909 368 0.8906 0.3467 0.8906 0.9437
No log 6.7273 370 0.9492 0.3848 0.9492 0.9743
No log 6.7636 372 0.9596 0.3704 0.9596 0.9796
No log 6.8 374 0.9344 0.3704 0.9344 0.9666
No log 6.8364 376 0.9255 0.3578 0.9255 0.9620
No log 6.8727 378 0.9411 0.3819 0.9411 0.9701
No log 6.9091 380 0.9609 0.3571 0.9609 0.9802
No log 6.9455 382 1.0901 0.3494 1.0901 1.0441
No log 6.9818 384 1.1020 0.3198 1.1020 1.0498
No log 7.0182 386 1.0092 0.2718 1.0092 1.0046
No log 7.0545 388 0.9401 0.4159 0.9401 0.9696
No log 7.0909 390 0.9672 0.4238 0.9672 0.9835
No log 7.1273 392 0.9711 0.4238 0.9711 0.9854
No log 7.1636 394 0.9561 0.4006 0.9561 0.9778
No log 7.2 396 0.9328 0.3290 0.9328 0.9658
No log 7.2364 398 0.9291 0.3558 0.9291 0.9639
No log 7.2727 400 0.9269 0.3571 0.9269 0.9627
No log 7.3091 402 0.9163 0.3382 0.9163 0.9572
No log 7.3455 404 0.9152 0.3337 0.9152 0.9567
No log 7.3818 406 0.9316 0.3424 0.9316 0.9652
No log 7.4182 408 0.9135 0.2942 0.9135 0.9558
No log 7.4545 410 0.9069 0.3806 0.9069 0.9523
No log 7.4909 412 0.9185 0.4135 0.9185 0.9584
No log 7.5273 414 0.9104 0.3558 0.9104 0.9541
No log 7.5636 416 0.9162 0.2424 0.9162 0.9572
No log 7.6 418 0.9566 0.4110 0.9566 0.9781
No log 7.6364 420 1.0087 0.4378 1.0087 1.0044
No log 7.6727 422 1.0617 0.4313 1.0617 1.0304
No log 7.7091 424 1.0889 0.4313 1.0889 1.0435
No log 7.7455 426 1.0100 0.3744 1.0100 1.0050
No log 7.7818 428 0.9709 0.2970 0.9709 0.9854
No log 7.8182 430 0.9793 0.3069 0.9793 0.9896
No log 7.8545 432 1.0091 0.3786 1.0091 1.0045
No log 7.8909 434 1.0138 0.3533 1.0138 1.0069
No log 7.9273 436 1.0321 0.2893 1.0321 1.0159
No log 7.9636 438 1.0484 0.2503 1.0484 1.0239
No log 8.0 440 1.0704 0.2503 1.0704 1.0346
No log 8.0364 442 1.0981 0.1927 1.0981 1.0479
No log 8.0727 444 1.1240 0.1962 1.1240 1.0602
No log 8.1091 446 1.1576 0.1962 1.1576 1.0759
No log 8.1455 448 1.1691 0.1521 1.1691 1.0813
No log 8.1818 450 1.1630 0.1582 1.1630 1.0784
No log 8.2182 452 1.1016 0.1752 1.1016 1.0496
No log 8.2545 454 1.0659 0.2043 1.0659 1.0324
No log 8.2909 456 1.0601 0.2057 1.0601 1.0296
No log 8.3273 458 1.0441 0.1903 1.0441 1.0218
No log 8.3636 460 1.0234 0.2083 1.0234 1.0116
No log 8.4 462 1.0133 0.2689 1.0133 1.0066
No log 8.4364 464 0.9896 0.2595 0.9896 0.9948
No log 8.4727 466 0.9743 0.3137 0.9743 0.9871
No log 8.5091 468 0.9669 0.3062 0.9669 0.9833
No log 8.5455 470 0.9671 0.3062 0.9671 0.9834
No log 8.5818 472 0.9660 0.2892 0.9660 0.9829
No log 8.6182 474 0.9755 0.2336 0.9755 0.9877
No log 8.6545 476 0.9889 0.2892 0.9889 0.9944
No log 8.6909 478 1.0437 0.3602 1.0437 1.0216
No log 8.7273 480 1.1537 0.2816 1.1537 1.0741
No log 8.7636 482 1.1449 0.2912 1.1449 1.0700
No log 8.8 484 1.0595 0.3310 1.0595 1.0293
No log 8.8364 486 1.0174 0.2610 1.0174 1.0086
No log 8.8727 488 1.0472 0.2621 1.0472 1.0233
No log 8.9091 490 1.0480 0.2218 1.0480 1.0237
No log 8.9455 492 1.0189 0.2610 1.0189 1.0094
No log 8.9818 494 1.0096 0.2310 1.0096 1.0048
No log 9.0182 496 1.0504 0.3106 1.0504 1.0249
No log 9.0545 498 1.0786 0.2912 1.0786 1.0385
0.4 9.0909 500 1.0545 0.3250 1.0545 1.0269
0.4 9.1273 502 0.9974 0.3747 0.9974 0.9987
0.4 9.1636 504 0.9716 0.2643 0.9716 0.9857
0.4 9.2 506 0.9798 0.2692 0.9798 0.9899
0.4 9.2364 508 0.9717 0.3018 0.9717 0.9858
0.4 9.2727 510 0.9637 0.2643 0.9637 0.9817
0.4 9.3091 512 0.9698 0.2743 0.9698 0.9848

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Full model ID: MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k18_task2_organization (fine-tuned from aubmindlab/bert-base-arabertv02).