ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k3_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0274
  • Qwk: 0.1986
  • Mse: 1.0274
  • Rmse: 1.0136
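
For reference, the Qwk (quadratic weighted kappa), Mse, and Rmse values above can be reproduced with standard scikit-learn metrics. The snippet below is an illustrative sketch with made-up scores, not the evaluation code used for this card; rounding predictions before computing kappa is an assumption.

```python
# Illustrative sketch of the reported metrics (not the card's actual eval code).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4])          # hypothetical gold organization scores
y_pred = np.array([2.2, 2.8, 1.4, 2.6])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)              # Mse
rmse = float(np.sqrt(mse))                            # Rmse
qwk = cohen_kappa_score(y_true,                       # Qwk: quadratic weighted kappa
                        np.rint(y_pred).astype(int),  # rounding is an assumption
                        weights="quadratic")
print(f"MSE={mse:.4f} RMSE={rmse:.4f} QWK={qwk:.4f}")
```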

Model description

More information needed

Intended uses & limitations

More information needed
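
No usage details are provided for this checkpoint. As an unverified illustration, loading it with the standard transformers API would look roughly like the sketch below; the single-score regression head is an assumption inferred from the MSE/RMSE metrics, not a documented property of the model.

```python
# Unverified usage sketch; the regression-style single-score head is assumed.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k3_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("ضع نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # expected to contain the predicted organization score
```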

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
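
A minimal sketch of how these settings map onto the Hugging Face TrainingArguments API is shown below. The original training script is not included in this card, so the output directory, evaluation/logging cadence, and the model/dataset placeholders are assumptions (the cadence is inferred from the results table, which records an evaluation every 2 steps and the first logged training loss at step 500).

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="outputs",              # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",             # assumed from the per-2-step eval log
    eval_steps=2,
    logging_steps=500,                 # assumed from the first logged training loss
)
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```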

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.25 2 4.1514 0.0182 4.1514 2.0375
No log 0.5 4 2.2956 0.0542 2.2956 1.5151
No log 0.75 6 1.7754 0.0318 1.7754 1.3324
No log 1.0 8 1.7704 0.0639 1.7704 1.3306
No log 1.25 10 1.3770 0.1033 1.3770 1.1735
No log 1.5 12 1.0566 0.3082 1.0566 1.0279
No log 1.75 14 1.1772 0.0938 1.1772 1.0850
No log 2.0 16 1.2745 0.0201 1.2745 1.1290
No log 2.25 18 1.2741 -0.0064 1.2741 1.1288
No log 2.5 20 1.3257 -0.0212 1.3257 1.1514
No log 2.75 22 1.3142 0.0075 1.3142 1.1464
No log 3.0 24 1.2565 0.0489 1.2565 1.1209
No log 3.25 26 1.2373 0.0065 1.2373 1.1124
No log 3.5 28 1.2798 0.1091 1.2798 1.1313
No log 3.75 30 1.2768 0.0462 1.2768 1.1299
No log 4.0 32 1.1643 0.1576 1.1643 1.0790
No log 4.25 34 1.1125 0.1997 1.1125 1.0548
No log 4.5 36 1.1408 0.1498 1.1408 1.0681
No log 4.75 38 1.1909 0.0584 1.1909 1.0913
No log 5.0 40 1.1800 0.1805 1.1800 1.0863
No log 5.25 42 1.1552 0.1725 1.1552 1.0748
No log 5.5 44 1.1724 0.0823 1.1724 1.0828
No log 5.75 46 1.2622 -0.0112 1.2622 1.1235
No log 6.0 48 1.2964 -0.0833 1.2964 1.1386
No log 6.25 50 1.3280 -0.0255 1.3280 1.1524
No log 6.5 52 1.3575 -0.0112 1.3575 1.1651
No log 6.75 54 1.4735 -0.1798 1.4735 1.2139
No log 7.0 56 1.5852 -0.1798 1.5852 1.2590
No log 7.25 58 1.7251 -0.0541 1.7251 1.3134
No log 7.5 60 1.6423 0.0667 1.6423 1.2815
No log 7.75 62 1.4021 0.1663 1.4021 1.1841
No log 8.0 64 1.3872 0.0978 1.3872 1.1778
No log 8.25 66 1.5745 0.1442 1.5745 1.2548
No log 8.5 68 1.5528 0.1545 1.5528 1.2461
No log 8.75 70 1.3817 0.0920 1.3817 1.1755
No log 9.0 72 1.3118 0.0909 1.3118 1.1453
No log 9.25 74 1.2993 0.0817 1.2993 1.1399
No log 9.5 76 1.3029 0.1275 1.3029 1.1415
No log 9.75 78 1.2322 0.2125 1.2322 1.1100
No log 10.0 80 1.2489 0.1135 1.2489 1.1175
No log 10.25 82 1.2458 0.2340 1.2458 1.1162
No log 10.5 84 1.3390 0.1962 1.3390 1.1572
No log 10.75 86 1.5495 0.2391 1.5495 1.2448
No log 11.0 88 1.4669 0.2126 1.4669 1.2112
No log 11.25 90 1.2926 0.1053 1.2926 1.1369
No log 11.5 92 1.1472 0.0954 1.1472 1.0711
No log 11.75 94 1.0923 0.2492 1.0923 1.0451
No log 12.0 96 1.0790 0.3129 1.0790 1.0388
No log 12.25 98 1.1705 0.2250 1.1705 1.0819
No log 12.5 100 1.3731 0.2126 1.3731 1.1718
No log 12.75 102 1.3626 0.2424 1.3626 1.1673
No log 13.0 104 1.2653 0.2687 1.2653 1.1248
No log 13.25 106 1.1904 0.1598 1.1904 1.0911
No log 13.5 108 1.0744 0.2887 1.0744 1.0365
No log 13.75 110 1.0515 0.2887 1.0515 1.0254
No log 14.0 112 1.0941 0.2815 1.0941 1.0460
No log 14.25 114 1.0988 0.3099 1.0988 1.0482
No log 14.5 116 1.1546 0.1573 1.1546 1.0745
No log 14.75 118 1.2114 0.2206 1.2114 1.1007
No log 15.0 120 1.2237 0.1898 1.2237 1.1062
No log 15.25 122 1.1477 0.1500 1.1477 1.0713
No log 15.5 124 1.1833 0.1581 1.1833 1.0878
No log 15.75 126 1.2984 0.1652 1.2984 1.1395
No log 16.0 128 1.3074 0.1713 1.3074 1.1434
No log 16.25 130 1.3348 0.1713 1.3348 1.1553
No log 16.5 132 1.2755 0.2049 1.2755 1.1294
No log 16.75 134 1.2435 0.1928 1.2435 1.1151
No log 17.0 136 1.2141 0.1793 1.2141 1.1019
No log 17.25 138 1.1312 0.1863 1.1312 1.0636
No log 17.5 140 1.1368 0.1863 1.1368 1.0662
No log 17.75 142 1.1192 0.1863 1.1192 1.0579
No log 18.0 144 1.0994 0.1282 1.0994 1.0485
No log 18.25 146 1.0578 0.1823 1.0578 1.0285
No log 18.5 148 1.0274 0.2263 1.0274 1.0136
No log 18.75 150 1.1359 0.2195 1.1359 1.0658
No log 19.0 152 1.2416 0.2520 1.2416 1.1143
No log 19.25 154 1.2228 0.3266 1.2228 1.1058
No log 19.5 156 1.1214 0.2768 1.1214 1.0590
No log 19.75 158 1.0421 0.3577 1.0421 1.0209
No log 20.0 160 1.0161 0.2963 1.0161 1.0080
No log 20.25 162 0.9874 0.2647 0.9874 0.9937
No log 20.5 164 1.0386 0.0220 1.0386 1.0191
No log 20.75 166 1.1577 0.2149 1.1577 1.0760
No log 21.0 168 1.2361 0.2315 1.2361 1.1118
No log 21.25 170 1.1756 0.1770 1.1756 1.0843
No log 21.5 172 1.0801 0.0602 1.0801 1.0393
No log 21.75 174 1.0827 0.0602 1.0827 1.0405
No log 22.0 176 1.1772 0.1793 1.1772 1.0850
No log 22.25 178 1.1949 0.2177 1.1949 1.0931
No log 22.5 180 1.1605 0.2410 1.1605 1.0772
No log 22.75 182 1.1498 0.2026 1.1498 1.0723
No log 23.0 184 1.0364 0.2704 1.0364 1.0180
No log 23.25 186 0.9567 0.2359 0.9567 0.9781
No log 23.5 188 0.9646 0.2066 0.9646 0.9822
No log 23.75 190 1.0155 0.1434 1.0155 1.0077
No log 24.0 192 1.1835 0.2542 1.1835 1.0879
No log 24.25 194 1.2274 0.2313 1.2274 1.1079
No log 24.5 196 1.1395 0.2667 1.1395 1.0675
No log 24.75 198 1.0430 0.3590 1.0430 1.0213
No log 25.0 200 1.0649 0.3480 1.0649 1.0319
No log 25.25 202 1.1154 0.2837 1.1154 1.0561
No log 25.5 204 1.1475 0.2089 1.1475 1.0712
No log 25.75 206 1.0648 0.1202 1.0648 1.0319
No log 26.0 208 0.9959 0.0864 0.9959 0.9980
No log 26.25 210 0.9778 0.0864 0.9778 0.9888
No log 26.5 212 0.9760 0.1873 0.9760 0.9879
No log 26.75 214 0.9760 0.2553 0.9760 0.9879
No log 27.0 216 0.9893 0.3107 0.9893 0.9946
No log 27.25 218 1.0071 0.3103 1.0071 1.0036
No log 27.5 220 1.0496 0.3692 1.0496 1.0245
No log 27.75 222 1.1176 0.3396 1.1176 1.0572
No log 28.0 224 1.1143 0.3040 1.1143 1.0556
No log 28.25 226 1.0769 0.2359 1.0769 1.0377
No log 28.5 228 0.9820 0.2623 0.9820 0.9910
No log 28.75 230 0.9270 0.3631 0.9270 0.9628
No log 29.0 232 0.9276 0.3236 0.9276 0.9631
No log 29.25 234 0.9826 0.1582 0.9826 0.9913
No log 29.5 236 1.0916 0.1587 1.0916 1.0448
No log 29.75 238 1.1448 0.2772 1.1448 1.0700
No log 30.0 240 1.1553 0.2770 1.1553 1.0748
No log 30.25 242 1.1358 0.2127 1.1358 1.0657
No log 30.5 244 1.0729 0.2640 1.0729 1.0358
No log 30.75 246 1.0420 0.2432 1.0420 1.0208
No log 31.0 248 1.0516 0.2296 1.0516 1.0255
No log 31.25 250 1.0877 0.2489 1.0877 1.0429
No log 31.5 252 1.0762 0.2897 1.0762 1.0374
No log 31.75 254 1.0766 0.2812 1.0766 1.0376
No log 32.0 256 1.0305 0.2298 1.0305 1.0151
No log 32.25 258 0.9923 0.2117 0.9923 0.9962
No log 32.5 260 0.9824 0.2238 0.9824 0.9912
No log 32.75 262 1.0077 0.1474 1.0077 1.0038
No log 33.0 264 1.0429 0.1017 1.0429 1.0212
No log 33.25 266 1.0876 0.1986 1.0876 1.0429
No log 33.5 268 1.0721 0.2726 1.0721 1.0354
No log 33.75 270 1.0423 0.3333 1.0423 1.0209
No log 34.0 272 0.9580 0.2770 0.9580 0.9788
No log 34.25 274 0.9549 0.3161 0.9549 0.9772
No log 34.5 276 0.9840 0.2179 0.9840 0.9920
No log 34.75 278 1.0979 0.2750 1.0979 1.0478
No log 35.0 280 1.2424 0.2474 1.2424 1.1146
No log 35.25 282 1.2635 0.1744 1.2635 1.1240
No log 35.5 284 1.2023 0.1407 1.2023 1.0965
No log 35.75 286 1.0793 0.0931 1.0793 1.0389
No log 36.0 288 0.9851 0.1446 0.9851 0.9925
No log 36.25 290 0.9427 0.2325 0.9427 0.9709
No log 36.5 292 0.9562 0.2200 0.9562 0.9779
No log 36.75 294 1.0124 0.1823 1.0124 1.0062
No log 37.0 296 1.0392 0.2726 1.0392 1.0194
No log 37.25 298 1.1014 0.2614 1.1014 1.0495
No log 37.5 300 1.0874 0.2308 1.0874 1.0428
No log 37.75 302 1.0717 0.0961 1.0717 1.0352
No log 38.0 304 1.0224 0.0961 1.0224 1.0112
No log 38.25 306 1.0049 0.1017 1.0049 1.0025
No log 38.5 308 1.0336 0.0587 1.0336 1.0167
No log 38.75 310 1.0643 0.0433 1.0643 1.0317
No log 39.0 312 1.0962 0.0811 1.0962 1.0470
No log 39.25 314 1.1637 0.1170 1.1637 1.0788
No log 39.5 316 1.2133 0.2686 1.2133 1.1015
No log 39.75 318 1.1714 0.2623 1.1714 1.0823
No log 40.0 320 1.1558 0.2623 1.1558 1.0751
No log 40.25 322 1.1318 0.2623 1.1318 1.0639
No log 40.5 324 1.1548 0.2495 1.1548 1.0746
No log 40.75 326 1.1208 0.2038 1.1208 1.0587
No log 41.0 328 1.0517 0.2331 1.0517 1.0255
No log 41.25 330 1.0376 0.1823 1.0376 1.0186
No log 41.5 332 1.0373 0.1823 1.0373 1.0185
No log 41.75 334 1.0764 0.1797 1.0764 1.0375
No log 42.0 336 1.0719 0.1407 1.0719 1.0353
No log 42.25 338 1.0330 0.1823 1.0330 1.0164
No log 42.5 340 1.0299 0.1474 1.0299 1.0149
No log 42.75 342 1.0686 0.1047 1.0686 1.0337
No log 43.0 344 1.1251 0.0990 1.1251 1.0607
No log 43.25 346 1.1311 0.0990 1.1311 1.0635
No log 43.5 348 1.1016 0.0990 1.1016 1.0496
No log 43.75 350 1.0196 0.1351 1.0196 1.0097
No log 44.0 352 0.9684 0.2226 0.9684 0.9841
No log 44.25 354 0.9533 0.2226 0.9533 0.9764
No log 44.5 356 0.9754 0.2674 0.9754 0.9876
No log 44.75 358 1.0466 0.1407 1.0466 1.0230
No log 45.0 360 1.0890 0.1986 1.0890 1.0435
No log 45.25 362 1.1033 0.2614 1.1033 1.0504
No log 45.5 364 1.0475 0.3021 1.0475 1.0235
No log 45.75 366 0.9630 0.3124 0.9630 0.9813
No log 46.0 368 0.9033 0.3922 0.9033 0.9504
No log 46.25 370 0.8680 0.4337 0.8680 0.9317
No log 46.5 372 0.8610 0.4086 0.8610 0.9279
No log 46.75 374 0.8687 0.4337 0.8687 0.9321
No log 47.0 376 0.9172 0.3922 0.9172 0.9577
No log 47.25 378 1.0013 0.2298 1.0013 1.0007
No log 47.5 380 1.0336 0.2416 1.0336 1.0167
No log 47.75 382 1.0108 0.1823 1.0108 1.0054
No log 48.0 384 0.9580 0.2117 0.9580 0.9788
No log 48.25 386 0.9295 0.3107 0.9295 0.9641
No log 48.5 388 0.9150 0.4468 0.9150 0.9566
No log 48.75 390 0.9364 0.2698 0.9364 0.9677
No log 49.0 392 0.9753 0.2117 0.9753 0.9876
No log 49.25 394 1.0150 0.1823 1.0150 1.0075
No log 49.5 396 0.9980 0.1474 0.9980 0.9990
No log 49.75 398 0.9399 0.2375 0.9399 0.9695
No log 50.0 400 0.9176 0.3753 0.9176 0.9579
No log 50.25 402 0.9111 0.3498 0.9111 0.9545
No log 50.5 404 0.9179 0.3348 0.9179 0.9581
No log 50.75 406 0.9322 0.2921 0.9322 0.9655
No log 51.0 408 0.9415 0.2818 0.9415 0.9703
No log 51.25 410 0.9351 0.3214 0.9351 0.9670
No log 51.5 412 0.9539 0.3107 0.9539 0.9767
No log 51.75 414 0.9882 0.3024 0.9882 0.9941
No log 52.0 416 0.9926 0.3310 0.9926 0.9963
No log 52.25 418 0.9752 0.2674 0.9752 0.9875
No log 52.5 420 0.9496 0.2698 0.9496 0.9745
No log 52.75 422 0.9497 0.2375 0.9497 0.9745
No log 53.0 424 0.9695 0.2350 0.9695 0.9846
No log 53.25 426 0.9778 0.2350 0.9778 0.9889
No log 53.5 428 1.0024 0.2325 1.0024 1.0012
No log 53.75 430 1.0141 0.2529 1.0141 1.0070
No log 54.0 432 1.0541 0.1407 1.0541 1.0267
No log 54.25 434 1.0845 0.1343 1.0845 1.0414
No log 54.5 436 1.1502 0.2577 1.1502 1.0725
No log 54.75 438 1.1731 0.2623 1.1731 1.0831
No log 55.0 440 1.1694 0.2623 1.1694 1.0814
No log 55.25 442 1.1209 0.2577 1.1209 1.0587
No log 55.5 444 1.0576 0.1654 1.0576 1.0284
No log 55.75 446 1.0170 0.2535 1.0170 1.0085
No log 56.0 448 1.0063 0.2299 1.0063 1.0032
No log 56.25 450 1.0074 0.2117 1.0074 1.0037
No log 56.5 452 1.0481 0.2051 1.0481 1.0238
No log 56.75 454 1.1020 0.2528 1.1020 1.0498
No log 57.0 456 1.1343 0.2896 1.1343 1.0650
No log 57.25 458 1.1758 0.2588 1.1758 1.0843
No log 57.5 460 1.1911 0.2495 1.1911 1.0914
No log 57.75 462 1.1696 0.2395 1.1696 1.0815
No log 58.0 464 1.1316 0.1434 1.1316 1.0638
No log 58.25 466 1.0718 0.0587 1.0718 1.0353
No log 58.5 468 1.0192 0.1017 1.0192 1.0095
No log 58.75 470 0.9665 0.2983 0.9665 0.9831
No log 59.0 472 0.9525 0.4335 0.9525 0.9760
No log 59.25 474 0.9549 0.4004 0.9549 0.9772
No log 59.5 476 0.9815 0.3675 0.9815 0.9907
No log 59.75 478 0.9895 0.2972 0.9895 0.9947
No log 60.0 480 0.9810 0.3268 0.9810 0.9904
No log 60.25 482 0.9503 0.4020 0.9503 0.9748
No log 60.5 484 0.9276 0.3569 0.9276 0.9631
No log 60.75 486 0.9173 0.3569 0.9173 0.9578
No log 61.0 488 0.9293 0.3147 0.9293 0.9640
No log 61.25 490 0.9655 0.1873 0.9655 0.9826
No log 61.5 492 0.9949 0.1017 0.9949 0.9974
No log 61.75 494 1.0558 0.0961 1.0558 1.0275
No log 62.0 496 1.1034 0.2577 1.1034 1.0504
No log 62.25 498 1.1393 0.2772 1.1393 1.0674
0.2174 62.5 500 1.1375 0.2623 1.1375 1.0666
0.2174 62.75 502 1.0986 0.2528 1.0986 1.0481
0.2174 63.0 504 1.0768 0.2227 1.0768 1.0377
0.2174 63.25 506 1.0609 0.2227 1.0609 1.0300
0.2174 63.5 508 1.0356 0.1986 1.0356 1.0177
0.2174 63.75 510 1.0274 0.1986 1.0274 1.0136

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1