ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k3_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2029
  • Qwk (Quadratic Weighted Kappa): 0.1458
  • Mse: 1.2029
  • Rmse: 1.0968
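The evaluation metrics above can be reproduced from model predictions with a short script. The following is a minimal pure-Python sketch, not the evaluation code used for this model: the Quadratic Weighted Kappa follows its standard definition, and Rmse is simply the square root of Mse (note that Loss and Mse coincide here, consistent with a mean-squared-error training objective).

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Agreement between integer ratings, penalized by the squared
    distance between classes (1.0 = perfect agreement)."""
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    # Marginal histograms of true and predicted ratings
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2
            # Expected count under chance agreement (independent marginals)
            expected = hist_true[i] * hist_pred[j] / n
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

def rmse(mse):
    # The Rmse column in the table is just the square root of Mse
    return math.sqrt(mse)
```

For example, `rmse(1.2029)` gives roughly 1.0968, matching the reported Mse/Rmse pair, and perfect agreement between ratings yields a kappa of 1.0.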

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
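With a linear scheduler, no stated warmup, and a base learning rate of 2e-05, the learning rate decays linearly toward zero over the run. The sketch below illustrates that schedule; the ~800 total optimizer steps (8 logged steps per epoch × 100 epochs) is an inference from the results table, not a reported figure.

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Optional linear warmup followed by linear decay to zero,
    matching the shape of a standard linear LR schedule."""
    if warmup_steps and step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, middle, and end of an ~800-step run
print(linear_lr(0, 800))    # 2e-05
print(linear_lr(400, 800))  # 1e-05
print(linear_lr(800, 800))  # 0.0
```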

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.25 2 2.5978 -0.1213 2.5978 1.6118
No log 0.5 4 1.2736 0.1565 1.2736 1.1285
No log 0.75 6 0.8329 0.1724 0.8329 0.9126
No log 1.0 8 0.8480 0.1953 0.8480 0.9209
No log 1.25 10 0.9110 0.1651 0.9110 0.9545
No log 1.5 12 0.7202 0.1800 0.7202 0.8486
No log 1.75 14 0.7641 0.3807 0.7641 0.8741
No log 2.0 16 0.7821 0.3167 0.7821 0.8843
No log 2.25 18 0.6689 0.3416 0.6689 0.8178
No log 2.5 20 0.6891 0.2819 0.6891 0.8301
No log 2.75 22 0.6996 0.2819 0.6996 0.8364
No log 3.0 24 0.7185 0.2652 0.7185 0.8476
No log 3.25 26 0.7967 0.2817 0.7967 0.8926
No log 3.5 28 0.8994 0.2193 0.8994 0.9484
No log 3.75 30 0.9499 0.1461 0.9499 0.9746
No log 4.0 32 0.8822 0.2012 0.8822 0.9392
No log 4.25 34 0.8537 0.0888 0.8537 0.9239
No log 4.5 36 1.0020 0.1362 1.0020 1.0010
No log 4.75 38 1.1050 0.1525 1.1050 1.0512
No log 5.0 40 0.9958 0.2513 0.9958 0.9979
No log 5.25 42 1.0472 0.2779 1.0472 1.0233
No log 5.5 44 1.0882 0.3003 1.0882 1.0432
No log 5.75 46 1.0204 0.3073 1.0204 1.0101
No log 6.0 48 0.8045 0.2691 0.8045 0.8969
No log 6.25 50 0.8338 0.2576 0.8338 0.9131
No log 6.5 52 1.2321 0.2439 1.2321 1.1100
No log 6.75 54 1.5089 0.1075 1.5089 1.2284
No log 7.0 56 1.4974 0.1075 1.4974 1.2237
No log 7.25 58 1.2323 0.2052 1.2323 1.1101
No log 7.5 60 0.8079 0.2335 0.8079 0.8988
No log 7.75 62 0.7666 0.2140 0.7666 0.8756
No log 8.0 64 0.7768 0.2661 0.7768 0.8813
No log 8.25 66 0.9247 0.2420 0.9247 0.9616
No log 8.5 68 1.5337 0.2412 1.5337 1.2384
No log 8.75 70 1.6378 0.1735 1.6378 1.2798
No log 9.0 72 1.4233 0.2374 1.4233 1.1930
No log 9.25 74 0.9601 0.3019 0.9601 0.9799
No log 9.5 76 0.7671 0.2780 0.7671 0.8758
No log 9.75 78 0.7569 0.2843 0.7569 0.8700
No log 10.0 80 0.9122 0.2464 0.9122 0.9551
No log 10.25 82 1.3179 0.1630 1.3179 1.1480
No log 10.5 84 1.5882 0.1311 1.5882 1.2603
No log 10.75 86 1.4081 0.1473 1.4081 1.1866
No log 11.0 88 1.0956 0.2507 1.0956 1.0467
No log 11.25 90 1.0601 0.2507 1.0601 1.0296
No log 11.5 92 1.1088 0.2032 1.1088 1.0530
No log 11.75 94 1.1778 0.1795 1.1778 1.0852
No log 12.0 96 1.1681 0.1795 1.1681 1.0808
No log 12.25 98 1.0336 0.2504 1.0336 1.0167
No log 12.5 100 0.8380 0.3076 0.8380 0.9154
No log 12.75 102 0.8705 0.2830 0.8705 0.9330
No log 13.0 104 0.7821 0.2781 0.7821 0.8844
No log 13.25 106 0.8206 0.3665 0.8206 0.9059
No log 13.5 108 0.9689 0.2806 0.9689 0.9843
No log 13.75 110 0.9047 0.3579 0.9047 0.9511
No log 14.0 112 0.9415 0.2806 0.9415 0.9703
No log 14.25 114 1.0471 0.2166 1.0471 1.0233
No log 14.5 116 1.3150 0.1907 1.3150 1.1467
No log 14.75 118 1.3603 0.1621 1.3603 1.1663
No log 15.0 120 1.1815 0.1432 1.1815 1.0870
No log 15.25 122 1.0486 0.1599 1.0486 1.0240
No log 15.5 124 1.0584 0.1599 1.0584 1.0288
No log 15.75 126 1.0276 0.2271 1.0276 1.0137
No log 16.0 128 1.0591 0.1591 1.0591 1.0291
No log 16.25 130 1.1440 0.2153 1.1440 1.0696
No log 16.5 132 1.0636 0.2354 1.0636 1.0313
No log 16.75 134 0.9295 0.2487 0.9295 0.9641
No log 17.0 136 1.0379 0.2017 1.0379 1.0188
No log 17.25 138 1.1723 0.2115 1.1723 1.0827
No log 17.5 140 1.0765 0.2227 1.0765 1.0375
No log 17.75 142 0.8493 0.3371 0.8493 0.9216
No log 18.0 144 0.7888 0.3731 0.7888 0.8881
No log 18.25 146 0.8781 0.2703 0.8781 0.9370
No log 18.5 148 1.0203 0.2457 1.0203 1.0101
No log 18.75 150 1.1510 0.1907 1.1510 1.0728
No log 19.0 152 1.0607 0.2457 1.0607 1.0299
No log 19.25 154 0.8405 0.3371 0.8405 0.9168
No log 19.5 156 0.7745 0.2223 0.7745 0.8800
No log 19.75 158 0.8475 0.2643 0.8475 0.9206
No log 20.0 160 1.0649 0.2166 1.0649 1.0319
No log 20.25 162 1.1494 0.2084 1.1494 1.0721
No log 20.5 164 1.0667 0.2227 1.0667 1.0328
No log 20.75 166 1.0078 0.2227 1.0078 1.0039
No log 21.0 168 0.9494 0.2552 0.9494 0.9744
No log 21.25 170 0.8753 0.2756 0.8753 0.9356
No log 21.5 172 0.8532 0.2287 0.8532 0.9237
No log 21.75 174 0.9212 0.3051 0.9212 0.9598
No log 22.0 176 1.0937 0.2070 1.0937 1.0458
No log 22.25 178 1.0335 0.2231 1.0335 1.0166
No log 22.5 180 0.9356 0.2554 0.9356 0.9672
No log 22.75 182 0.9640 0.2703 0.9640 0.9819
No log 23.0 184 1.0352 0.2125 1.0352 1.0175
No log 23.25 186 1.1214 0.1784 1.1214 1.0590
No log 23.5 188 1.1939 0.1458 1.1939 1.0927
No log 23.75 190 1.1101 0.2141 1.1101 1.0536
No log 24.0 192 0.9381 0.2756 0.9381 0.9685
No log 24.25 194 0.8600 0.2046 0.8600 0.9274
No log 24.5 196 0.8825 0.2259 0.8825 0.9394
No log 24.75 198 0.9943 0.2552 0.9943 0.9971
No log 25.0 200 1.2790 0.1175 1.2790 1.1309
No log 25.25 202 1.4353 0.1450 1.4353 1.1981
No log 25.5 204 1.3287 0.1574 1.3287 1.1527
No log 25.75 206 1.0130 0.1976 1.0130 1.0065
No log 26.0 208 0.7901 0.2129 0.7901 0.8889
No log 26.25 210 0.7511 0.2182 0.7511 0.8667
No log 26.5 212 0.7664 0.1850 0.7664 0.8755
No log 26.75 214 0.8662 0.1765 0.8662 0.9307
No log 27.0 216 0.9971 0.1492 0.9971 0.9986
No log 27.25 218 1.1350 0.1784 1.1350 1.0654
No log 27.5 220 1.1832 0.2100 1.1832 1.0878
No log 27.75 222 1.1009 0.2141 1.1009 1.0492
No log 28.0 224 0.9891 0.2271 0.9891 0.9945
No log 28.25 226 0.8956 0.2000 0.8956 0.9464
No log 28.5 228 0.8688 0.1734 0.8688 0.9321
No log 28.75 230 0.9063 0.2000 0.9063 0.9520
No log 29.0 232 1.0154 0.1389 1.0154 1.0077
No log 29.25 234 1.1221 0.2084 1.1221 1.0593
No log 29.5 236 1.0925 0.1896 1.0925 1.0452
No log 29.75 238 0.9404 0.1603 0.9404 0.9697
No log 30.0 240 0.8720 0.1734 0.8720 0.9338
No log 30.25 242 0.8508 0.1461 0.8508 0.9224
No log 30.5 244 0.9082 0.1734 0.9082 0.9530
No log 30.75 246 0.9514 0.1869 0.9514 0.9754
No log 31.0 248 1.0377 0.2166 1.0377 1.0187
No log 31.25 250 1.0238 0.2166 1.0238 1.0118
No log 31.5 252 0.9476 0.1528 0.9476 0.9735
No log 31.75 254 0.9819 0.2209 0.9819 0.9909
No log 32.0 256 1.0027 0.1389 1.0027 1.0013
No log 32.25 258 1.0450 0.1389 1.0450 1.0222
No log 32.5 260 1.1036 0.1389 1.1036 1.0505
No log 32.75 262 1.1840 0.2006 1.1840 1.0881
No log 33.0 264 1.2293 0.2006 1.2293 1.1087
No log 33.25 266 1.2412 0.1262 1.2412 1.1141
No log 33.5 268 1.1738 0.1356 1.1738 1.0834
No log 33.75 270 1.2037 0.1057 1.2037 1.0971
No log 34.0 272 1.1938 0.1057 1.1938 1.0926
No log 34.25 274 1.2731 0.1262 1.2731 1.1283
No log 34.5 276 1.3377 0.1427 1.3377 1.1566
No log 34.75 278 1.2981 0.1458 1.2981 1.1394
No log 35.0 280 1.1988 0.1057 1.1988 1.0949
No log 35.25 282 1.1175 0.1146 1.1175 1.0571
No log 35.5 284 1.0667 0.1176 1.0667 1.0328
No log 35.75 286 0.9827 0.1492 0.9827 0.9913
No log 36.0 288 0.9029 0.1682 0.9029 0.9502
No log 36.25 290 0.8857 0.2094 0.8857 0.9411
No log 36.5 292 0.9475 0.1869 0.9475 0.9734
No log 36.75 294 1.0407 0.1909 1.0407 1.0201
No log 37.0 296 1.0737 0.1909 1.0737 1.0362
No log 37.25 298 1.0770 0.1909 1.0770 1.0378
No log 37.5 300 1.0195 0.1990 1.0195 1.0097
No log 37.75 302 0.9758 0.1492 0.9758 0.9878
No log 38.0 304 0.9924 0.1492 0.9924 0.9962
No log 38.25 306 1.0603 0.1176 1.0603 1.0297
No log 38.5 308 1.0199 0.1384 1.0199 1.0099
No log 38.75 310 0.9258 0.1343 0.9258 0.9622
No log 39.0 312 0.8778 0.1723 0.8778 0.9369
No log 39.25 314 0.8572 0.2142 0.8572 0.9258
No log 39.5 316 0.8611 0.1777 0.8611 0.9279
No log 39.75 318 0.9891 0.1499 0.9891 0.9945
No log 40.0 320 1.1733 0.1784 1.1733 1.0832
No log 40.25 322 1.1797 0.1784 1.1797 1.0861
No log 40.5 324 1.0842 0.2457 1.0842 1.0412
No log 40.75 326 0.9325 0.1573 0.9325 0.9656
No log 41.0 328 0.8369 0.2142 0.8369 0.9148
No log 41.25 330 0.8597 0.2094 0.8597 0.9272
No log 41.5 332 0.9709 0.1176 0.9709 0.9853
No log 41.75 334 1.0753 0.1389 1.0753 1.0369
No log 42.0 336 1.0670 0.1896 1.0670 1.0330
No log 42.25 338 1.0141 0.1896 1.0141 1.0070
No log 42.5 340 0.9580 0.1747 0.9580 0.9788
No log 42.75 342 0.8753 0.1734 0.8753 0.9356
No log 43.0 344 0.8237 0.1501 0.8237 0.9076
No log 43.25 346 0.8188 0.1867 0.8188 0.9049
No log 43.5 348 0.8900 0.1692 0.8900 0.9434
No log 43.75 350 0.9965 0.1787 0.9965 0.9983
No log 44.0 352 1.1450 0.2084 1.1450 1.0700
No log 44.25 354 1.1701 0.2084 1.1701 1.0817
No log 44.5 356 1.1815 0.2084 1.1815 1.0870
No log 44.75 358 1.1362 0.2457 1.1362 1.0659
No log 45.0 360 1.0303 0.1709 1.0303 1.0150
No log 45.25 362 0.9286 0.1787 0.9286 0.9636
No log 45.5 364 0.8860 0.1535 0.8860 0.9413
No log 45.75 366 0.8167 0.2410 0.8167 0.9037
No log 46.0 368 0.8282 0.2410 0.8282 0.9101
No log 46.25 370 0.8732 0.1348 0.8732 0.9344
No log 46.5 372 0.9382 0.1869 0.9382 0.9686
No log 46.75 374 0.9371 0.1869 0.9371 0.9680
No log 47.0 376 0.9266 0.1869 0.9266 0.9626
No log 47.25 378 0.9456 0.1869 0.9456 0.9724
No log 47.5 380 0.9508 0.1827 0.9508 0.9751
No log 47.75 382 1.0085 0.2363 1.0085 1.0042
No log 48.0 384 1.0415 0.1990 1.0415 1.0206
No log 48.25 386 1.0613 0.1626 1.0613 1.0302
No log 48.5 388 1.0480 0.1115 1.0480 1.0237
No log 48.75 390 1.0245 0.1146 1.0245 1.0122
No log 49.0 392 0.9873 0.0925 0.9873 0.9936
No log 49.25 394 0.9538 0.1308 0.9538 0.9766
No log 49.5 396 0.8815 0.2173 0.8815 0.9389
No log 49.75 398 0.8783 0.2223 0.8783 0.9372
No log 50.0 400 0.9205 0.1765 0.9205 0.9594
No log 50.25 402 0.8914 0.2223 0.8914 0.9442
No log 50.5 404 0.9131 0.2124 0.9131 0.9556
No log 50.75 406 0.9733 0.0666 0.9733 0.9866
No log 51.0 408 1.0605 0.0925 1.0605 1.0298
No log 51.25 410 1.1253 0.1176 1.1253 1.0608
No log 51.5 412 1.2161 0.2084 1.2161 1.1028
No log 51.75 414 1.2916 0.1490 1.2916 1.1365
No log 52.0 416 1.3228 0.1671 1.3228 1.1501
No log 52.25 418 1.2910 0.1490 1.2910 1.1362
No log 52.5 420 1.2150 0.2084 1.2150 1.1023
No log 52.75 422 1.1162 0.1115 1.1162 1.0565
No log 53.0 424 1.0196 0.1013 1.0196 1.0097
No log 53.25 426 0.9608 0.1013 0.9608 0.9802
No log 53.5 428 0.9679 0.1013 0.9679 0.9838
No log 53.75 430 0.9782 0.0666 0.9782 0.9890
No log 54.0 432 1.0284 0.0666 1.0284 1.0141
No log 54.25 434 1.0720 0.0925 1.0720 1.0354
No log 54.5 436 1.1059 0.1422 1.1059 1.0516
No log 54.75 438 1.2270 0.2247 1.2270 1.1077
No log 55.0 440 1.3057 0.1956 1.3057 1.1427
No log 55.25 442 1.3328 0.1919 1.3328 1.1545
No log 55.5 444 1.2809 0.1956 1.2809 1.1318
No log 55.75 446 1.1444 0.1626 1.1444 1.0698
No log 56.0 448 1.0710 0.1422 1.0710 1.0349
No log 56.25 450 1.0426 0.1422 1.0426 1.0211
No log 56.5 452 1.0615 0.1422 1.0615 1.0303
No log 56.75 454 1.1195 0.1591 1.1195 1.0581
No log 57.0 456 1.1505 0.1591 1.1505 1.0726
No log 57.25 458 1.1912 0.1591 1.1912 1.0914
No log 57.5 460 1.2159 0.1557 1.2159 1.1027
No log 57.75 462 1.2085 0.1557 1.2085 1.0993
No log 58.0 464 1.1787 0.1591 1.1787 1.0857
No log 58.25 466 1.1296 0.1422 1.1296 1.0628
No log 58.5 468 1.1211 0.1422 1.1211 1.0588
No log 58.75 470 1.1100 0.1389 1.1100 1.0536
No log 59.0 472 1.1441 0.1262 1.1441 1.0696
No log 59.25 474 1.1988 0.1458 1.1988 1.0949
No log 59.5 476 1.2664 0.1895 1.2664 1.1254
No log 59.75 478 1.3265 0.1309 1.3265 1.1517
No log 60.0 480 1.3141 0.2020 1.3141 1.1463
No log 60.25 482 1.2427 0.1427 1.2427 1.1148
No log 60.5 484 1.1844 0.0975 1.1844 1.0883
No log 60.75 486 1.1317 0.1324 1.1317 1.0638
No log 61.0 488 1.0537 0.1662 1.0537 1.0265
No log 61.25 490 1.0107 0.1176 1.0107 1.0053
No log 61.5 492 0.9672 0.1312 0.9672 0.9835
No log 61.75 494 0.9583 0.1312 0.9583 0.9789
No log 62.0 496 0.9567 0.0953 0.9567 0.9781
No log 62.25 498 1.0161 0.1208 1.0161 1.0080
0.1799 62.5 500 1.1137 0.1662 1.1137 1.0553
0.1799 62.75 502 1.1613 0.1324 1.1613 1.0776
0.1799 63.0 504 1.1966 0.1458 1.1966 1.0939
0.1799 63.25 506 1.2065 0.1458 1.2065 1.0984
0.1799 63.5 508 1.2079 0.1458 1.2079 1.0991
0.1799 63.75 510 1.2029 0.1458 1.2029 1.0968

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32
Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k3_task7_organization

  • Base model: aubmindlab/bert-base-arabertv02