ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k1_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1718
  • Qwk: 0.4138
  • Mse: 1.1718
  • Rmse: 1.0825
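The metrics above are related: Qwk is quadratic weighted kappa (chance-corrected agreement between predicted and true ordinal scores), and Rmse is simply the square root of Mse. As a minimal sketch (a plain-Python implementation of quadratic weighted kappa, not the exact evaluation code used for this run):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: 1 - (weighted observed disagreement /
    weighted expected disagreement), with quadratic distance weights."""
    n = len(y_true)
    # Observed agreement (confusion) matrix
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix from the marginal label distributions
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * observed[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * expected[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

# Perfect agreement gives kappa = 1.0
print(quadratic_weighted_kappa([0, 1, 2, 1], [0, 1, 2, 1], 3))  # 1.0

# RMSE is the square root of MSE, which is why the reported pair agrees:
print(round(math.sqrt(1.1718), 4))  # 1.0825
```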

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
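With lr_scheduler_type: linear and no warmup steps listed, the learning rate decays linearly from 2e-05 to 0 over training. A minimal sketch of that schedule, assuming 500 total optimization steps (100 epochs x 5 steps per epoch, consistent with the Step column in the results table below):

```python
def linear_lr(step, base_lr=2e-05, total_steps=500):
    """Linearly decay the learning rate from base_lr to 0 over
    total_steps, mirroring a warmup-free linear schedule."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Starts at the configured learning rate, halves at the midpoint,
# and reaches zero at the final step of this run.
print(linear_lr(0))    # 2e-05
print(linear_lr(250))  # 1e-05
print(linear_lr(500))  # 0.0
```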

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.4 2 7.2051 -0.0111 7.2051 2.6842
No log 0.8 4 4.7992 0.0784 4.7992 2.1907
No log 1.2 6 4.5612 -0.0463 4.5612 2.1357
No log 1.6 8 4.7394 -0.1096 4.7394 2.1770
No log 2.0 10 2.9844 -0.0280 2.9844 1.7275
No log 2.4 12 2.1899 0.0157 2.1899 1.4798
No log 2.8 14 2.0183 0.1565 2.0183 1.4207
No log 3.2 16 1.9058 0.1852 1.9058 1.3805
No log 3.6 18 1.8315 0.1869 1.8315 1.3533
No log 4.0 20 1.8483 0.3214 1.8483 1.3595
No log 4.4 22 1.9859 0.3101 1.9859 1.4092
No log 4.8 24 2.1039 0.1295 2.1039 1.4505
No log 5.2 26 1.8441 0.3866 1.8441 1.3580
No log 5.6 28 1.7342 0.2883 1.7342 1.3169
No log 6.0 30 1.7624 0.2569 1.7624 1.3275
No log 6.4 32 1.7241 0.2883 1.7241 1.3130
No log 6.8 34 1.7012 0.3667 1.7012 1.3043
No log 7.2 36 2.0282 0.2302 2.0282 1.4241
No log 7.6 38 2.3991 0.0800 2.3991 1.5489
No log 8.0 40 2.3736 0.0800 2.3736 1.5406
No log 8.4 42 1.9760 0.2174 1.9760 1.4057
No log 8.8 44 1.6236 0.4194 1.6236 1.2742
No log 9.2 46 1.4344 0.3186 1.4344 1.1977
No log 9.6 48 1.4032 0.3186 1.4032 1.1846
No log 10.0 50 1.4857 0.3333 1.4857 1.2189
No log 10.4 52 1.6587 0.4308 1.6587 1.2879
No log 10.8 54 1.7324 0.3939 1.7324 1.3162
No log 11.2 56 1.8184 0.3731 1.8184 1.3485
No log 11.6 58 1.6513 0.3846 1.6513 1.2850
No log 12.0 60 1.6903 0.3206 1.6903 1.3001
No log 12.4 62 1.7899 0.2879 1.7899 1.3379
No log 12.8 64 1.9900 0.2815 1.9900 1.4107
No log 13.2 66 2.1832 0.2345 2.1832 1.4776
No log 13.6 68 1.8432 0.2748 1.8432 1.3577
No log 14.0 70 1.5614 0.2759 1.5614 1.2496
No log 14.4 72 1.5123 0.3051 1.5123 1.2297
No log 14.8 74 1.5960 0.3360 1.5960 1.2633
No log 15.2 76 1.6011 0.3411 1.6011 1.2654
No log 15.6 78 1.4765 0.3810 1.4765 1.2151
No log 16.0 80 1.3389 0.4262 1.3389 1.1571
No log 16.4 82 1.3056 0.3866 1.3056 1.1426
No log 16.8 84 1.3286 0.4921 1.3286 1.1527
No log 17.2 86 1.3626 0.5 1.3626 1.1673
No log 17.6 88 1.3778 0.4032 1.3778 1.1738
No log 18.0 90 1.3644 0.4640 1.3644 1.1681
No log 18.4 92 1.3422 0.4882 1.3422 1.1585
No log 18.8 94 1.3002 0.5156 1.3002 1.1403
No log 19.2 96 1.2573 0.4390 1.2573 1.1213
No log 19.6 98 1.3257 0.4959 1.3257 1.1514
No log 20.0 100 1.3562 0.3967 1.3562 1.1646
No log 20.4 102 1.3594 0.4923 1.3594 1.1659
No log 20.8 104 1.4906 0.4154 1.4906 1.2209
No log 21.2 106 1.5655 0.4 1.5655 1.2512
No log 21.6 108 1.4426 0.4160 1.4426 1.2011
No log 22.0 110 1.3416 0.48 1.3416 1.1583
No log 22.4 112 1.3475 0.4793 1.3475 1.1608
No log 22.8 114 1.3053 0.4202 1.3053 1.1425
No log 23.2 116 1.2707 0.5197 1.2707 1.1273
No log 23.6 118 1.3148 0.4921 1.3148 1.1467
No log 24.0 120 1.3321 0.4762 1.3321 1.1542
No log 24.4 122 1.3221 0.4844 1.3221 1.1498
No log 24.8 124 1.3182 0.4921 1.3182 1.1481
No log 25.2 126 1.2859 0.48 1.2859 1.1340
No log 25.6 128 1.2677 0.4878 1.2677 1.1259
No log 26.0 130 1.3069 0.4407 1.3069 1.1432
No log 26.4 132 1.3501 0.4274 1.3501 1.1619
No log 26.8 134 1.3768 0.4174 1.3768 1.1734
No log 27.2 136 1.4359 0.2982 1.4359 1.1983
No log 27.6 138 1.6466 0.3306 1.6466 1.2832
No log 28.0 140 1.7016 0.2927 1.7016 1.3045
No log 28.4 142 1.6244 0.2203 1.6244 1.2745
No log 28.8 144 1.6701 0.2281 1.6701 1.2923
No log 29.2 146 1.7871 0.2393 1.7871 1.3368
No log 29.6 148 1.8201 0.2185 1.8201 1.3491
No log 30.0 150 1.6993 0.2609 1.6993 1.3036
No log 30.4 152 1.5697 0.3361 1.5697 1.2529
No log 30.8 154 1.4541 0.3140 1.4541 1.2059
No log 31.2 156 1.4004 0.4160 1.4004 1.1834
No log 31.6 158 1.3044 0.3826 1.3044 1.1421
No log 32.0 160 1.3005 0.3966 1.3005 1.1404
No log 32.4 162 1.3428 0.4390 1.3428 1.1588
No log 32.8 164 1.2815 0.4262 1.2815 1.1320
No log 33.2 166 1.2408 0.4132 1.2408 1.1139
No log 33.6 168 1.2559 0.4553 1.2559 1.1207
No log 34.0 170 1.2059 0.4839 1.2059 1.0981
No log 34.4 172 1.2108 0.4237 1.2108 1.1004
No log 34.8 174 1.2537 0.4333 1.2537 1.1197
No log 35.2 176 1.2978 0.3761 1.2978 1.1392
No log 35.6 178 1.3127 0.3932 1.3127 1.1457
No log 36.0 180 1.3528 0.3684 1.3528 1.1631
No log 36.4 182 1.3584 0.4 1.3584 1.1655
No log 36.8 184 1.3766 0.3652 1.3766 1.1733
No log 37.2 186 1.3722 0.3793 1.3722 1.1714
No log 37.6 188 1.3517 0.3684 1.3517 1.1626
No log 38.0 190 1.3525 0.3214 1.3525 1.1630
No log 38.4 192 1.3572 0.3478 1.3572 1.1650
No log 38.8 194 1.3305 0.3652 1.3305 1.1535
No log 39.2 196 1.3260 0.3866 1.3260 1.1515
No log 39.6 198 1.2989 0.3652 1.2989 1.1397
No log 40.0 200 1.2931 0.4706 1.2931 1.1372
No log 40.4 202 1.2979 0.4667 1.2979 1.1393
No log 40.8 204 1.2799 0.4538 1.2799 1.1313
No log 41.2 206 1.2385 0.4 1.2385 1.1129
No log 41.6 208 1.2320 0.4 1.2320 1.1100
No log 42.0 210 1.2411 0.4 1.2411 1.1141
No log 42.4 212 1.2588 0.3826 1.2588 1.1220
No log 42.8 214 1.2850 0.4138 1.2850 1.1336
No log 43.2 216 1.2826 0.4174 1.2826 1.1325
No log 43.6 218 1.2842 0.4 1.2842 1.1332
No log 44.0 220 1.2536 0.3932 1.2536 1.1197
No log 44.4 222 1.2597 0.3932 1.2597 1.1224
No log 44.8 224 1.2868 0.4370 1.2868 1.1344
No log 45.2 226 1.3127 0.4793 1.3127 1.1457
No log 45.6 228 1.3286 0.4793 1.3286 1.1526
No log 46.0 230 1.2708 0.4237 1.2708 1.1273
No log 46.4 232 1.2334 0.4237 1.2334 1.1106
No log 46.8 234 1.2394 0.4878 1.2394 1.1133
No log 47.2 236 1.2419 0.4754 1.2419 1.1144
No log 47.6 238 1.2305 0.4793 1.2305 1.1093
No log 48.0 240 1.2510 0.4138 1.2510 1.1185
No log 48.4 242 1.2837 0.3932 1.2837 1.1330
No log 48.8 244 1.2715 0.3932 1.2715 1.1276
No log 49.2 246 1.2704 0.3932 1.2704 1.1271
No log 49.6 248 1.2703 0.4237 1.2703 1.1271
No log 50.0 250 1.2672 0.4237 1.2672 1.1257
No log 50.4 252 1.2649 0.3932 1.2649 1.1247
No log 50.8 254 1.2274 0.3932 1.2274 1.1079
No log 51.2 256 1.2091 0.3826 1.2091 1.0996
No log 51.6 258 1.1912 0.4538 1.1912 1.0914
No log 52.0 260 1.1733 0.4538 1.1733 1.0832
No log 52.4 262 1.1829 0.4878 1.1829 1.0876
No log 52.8 264 1.1609 0.4667 1.1609 1.0774
No log 53.2 266 1.1315 0.4370 1.1315 1.0637
No log 53.6 268 1.1250 0.4202 1.1250 1.0606
No log 54.0 270 1.1285 0.4202 1.1285 1.0623
No log 54.4 272 1.1178 0.4202 1.1178 1.0572
No log 54.8 274 1.1104 0.4237 1.1104 1.0538
No log 55.2 276 1.1414 0.5041 1.1414 1.0684
No log 55.6 278 1.1590 0.5556 1.1590 1.0766
No log 56.0 280 1.1403 0.5161 1.1403 1.0678
No log 56.4 282 1.1063 0.4754 1.1063 1.0518
No log 56.8 284 1.0917 0.4628 1.0917 1.0449
No log 57.2 286 1.0991 0.4463 1.0991 1.0484
No log 57.6 288 1.1024 0.4333 1.1024 1.0499
No log 58.0 290 1.1084 0.4793 1.1084 1.0528
No log 58.4 292 1.1369 0.5238 1.1369 1.0662
No log 58.8 294 1.1828 0.528 1.1828 1.0876
No log 59.2 296 1.1787 0.496 1.1787 1.0857
No log 59.6 298 1.1538 0.5079 1.1538 1.0741
No log 60.0 300 1.1351 0.4793 1.1351 1.0654
No log 60.4 302 1.1441 0.4628 1.1441 1.0696
No log 60.8 304 1.1460 0.4463 1.1460 1.0705
No log 61.2 306 1.1394 0.4370 1.1394 1.0674
No log 61.6 308 1.1366 0.4370 1.1366 1.0661
No log 62.0 310 1.1317 0.4138 1.1317 1.0638
No log 62.4 312 1.1485 0.4667 1.1485 1.0717
No log 62.8 314 1.1813 0.5124 1.1813 1.0869
No log 63.2 316 1.2260 0.5 1.2260 1.1073
No log 63.6 318 1.2399 0.5203 1.2399 1.1135
No log 64.0 320 1.2394 0.5203 1.2394 1.1133
No log 64.4 322 1.2594 0.5203 1.2594 1.1222
No log 64.8 324 1.2517 0.4715 1.2517 1.1188
No log 65.2 326 1.2415 0.5041 1.2415 1.1142
No log 65.6 328 1.2080 0.4793 1.2080 1.0991
No log 66.0 330 1.1826 0.4500 1.1826 1.0875
No log 66.4 332 1.1633 0.4333 1.1633 1.0786
No log 66.8 334 1.1366 0.4706 1.1366 1.0661
No log 67.2 336 1.1310 0.4310 1.1310 1.0635
No log 67.6 338 1.1355 0.4333 1.1355 1.0656
No log 68.0 340 1.1376 0.4310 1.1376 1.0666
No log 68.4 342 1.1456 0.4310 1.1456 1.0703
No log 68.8 344 1.1682 0.3932 1.1682 1.0808
No log 69.2 346 1.1896 0.4463 1.1896 1.0907
No log 69.6 348 1.1959 0.4333 1.1959 1.0936
No log 70.0 350 1.1899 0.4463 1.1899 1.0908
No log 70.4 352 1.1710 0.4370 1.1710 1.0821
No log 70.8 354 1.1572 0.4237 1.1572 1.0757
No log 71.2 356 1.1511 0.4167 1.1511 1.0729
No log 71.6 358 1.1545 0.4463 1.1545 1.0745
No log 72.0 360 1.1554 0.4590 1.1554 1.0749
No log 72.4 362 1.1563 0.4390 1.1563 1.0753
No log 72.8 364 1.1530 0.4390 1.1530 1.0738
No log 73.2 366 1.1638 0.4677 1.1638 1.0788
No log 73.6 368 1.1659 0.4463 1.1659 1.0798
No log 74.0 370 1.1538 0.4677 1.1538 1.0741
No log 74.4 372 1.1391 0.5082 1.1391 1.0673
No log 74.8 374 1.1313 0.4667 1.1313 1.0636
No log 75.2 376 1.1334 0.4667 1.1334 1.0646
No log 75.6 378 1.1329 0.4615 1.1329 1.0644
No log 76.0 380 1.1305 0.4310 1.1305 1.0632
No log 76.4 382 1.1240 0.4274 1.1240 1.0602
No log 76.8 384 1.1303 0.4103 1.1303 1.0632
No log 77.2 386 1.1510 0.4538 1.1510 1.0729
No log 77.6 388 1.1821 0.4370 1.1821 1.0872
No log 78.0 390 1.2074 0.4918 1.2074 1.0988
No log 78.4 392 1.1995 0.4793 1.1995 1.0952
No log 78.8 394 1.1749 0.4333 1.1749 1.0839
No log 79.2 396 1.1522 0.4538 1.1522 1.0734
No log 79.6 398 1.1334 0.4237 1.1334 1.0646
No log 80.0 400 1.1239 0.4500 1.1239 1.0601
No log 80.4 402 1.1246 0.4667 1.1246 1.0605
No log 80.8 404 1.1330 0.4667 1.1330 1.0644
No log 81.2 406 1.1443 0.4237 1.1443 1.0697
No log 81.6 408 1.1584 0.4138 1.1584 1.0763
No log 82.0 410 1.1726 0.4138 1.1726 1.0829
No log 82.4 412 1.1814 0.4138 1.1814 1.0869
No log 82.8 414 1.1841 0.4 1.1841 1.0882
No log 83.2 416 1.1879 0.3540 1.1879 1.0899
No log 83.6 418 1.1972 0.4 1.1972 1.0942
No log 84.0 420 1.2021 0.4 1.2021 1.0964
No log 84.4 422 1.2003 0.4 1.2003 1.0956
No log 84.8 424 1.1966 0.4 1.1966 1.0939
No log 85.2 426 1.1926 0.3363 1.1926 1.0921
No log 85.6 428 1.1921 0.3684 1.1921 1.0918
No log 86.0 430 1.1931 0.4138 1.1931 1.0923
No log 86.4 432 1.1966 0.4138 1.1966 1.0939
No log 86.8 434 1.1949 0.4138 1.1949 1.0931
No log 87.2 436 1.1896 0.4138 1.1896 1.0907
No log 87.6 438 1.1854 0.4138 1.1854 1.0887
No log 88.0 440 1.1838 0.4138 1.1838 1.0880
No log 88.4 442 1.1832 0.4103 1.1832 1.0877
No log 88.8 444 1.1856 0.4202 1.1856 1.0888
No log 89.2 446 1.1912 0.4167 1.1912 1.0914
No log 89.6 448 1.1929 0.4167 1.1929 1.0922
No log 90.0 450 1.1892 0.4202 1.1892 1.0905
No log 90.4 452 1.1804 0.4202 1.1804 1.0865
No log 90.8 454 1.1748 0.4103 1.1748 1.0839
No log 91.2 456 1.1709 0.4103 1.1709 1.0821
No log 91.6 458 1.1674 0.3966 1.1674 1.0804
No log 92.0 460 1.1656 0.4 1.1656 1.0796
No log 92.4 462 1.1647 0.4 1.1647 1.0792
No log 92.8 464 1.1624 0.4 1.1624 1.0781
No log 93.2 466 1.1604 0.4 1.1604 1.0772
No log 93.6 468 1.1594 0.4 1.1594 1.0768
No log 94.0 470 1.1611 0.4138 1.1611 1.0775
No log 94.4 472 1.1643 0.4138 1.1643 1.0790
No log 94.8 474 1.1667 0.4138 1.1667 1.0801
No log 95.2 476 1.1694 0.4138 1.1694 1.0814
No log 95.6 478 1.1717 0.4138 1.1717 1.0824
No log 96.0 480 1.1728 0.4138 1.1728 1.0829
No log 96.4 482 1.1725 0.4138 1.1725 1.0828
No log 96.8 484 1.1721 0.4138 1.1721 1.0826
No log 97.2 486 1.1711 0.4138 1.1711 1.0822
No log 97.6 488 1.1709 0.4138 1.1709 1.0821
No log 98.0 490 1.1713 0.4138 1.1713 1.0822
No log 98.4 492 1.1715 0.4138 1.1715 1.0824
No log 98.8 494 1.1716 0.4138 1.1716 1.0824
No log 99.2 496 1.1716 0.4138 1.1716 1.0824
No log 99.6 498 1.1717 0.4138 1.1717 1.0825
0.2859 100.0 500 1.1718 0.4138 1.1718 1.0825

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1