ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 1.0343
  • Qwk: 0.2614
  • Mse: 1.0343
  • Rmse: 1.0170
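
The Qwk, Mse, and Rmse values above are a quadratic weighted kappa and (root) mean squared error on the evaluation set. Below is a minimal sketch of how such metrics can be computed; it assumes predictions come from a single-output regression head and are rounded onto a discrete score scale before computing QWK, since the exact label handling for this run is not documented here.

```python
# Minimal metric sketch (assumption: regression-style predictions that are
# rounded to integer scores before computing quadratic weighted kappa).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    preds = preds.squeeze()
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # QWK needs discrete classes, so round the continuous predictions first.
    rounded = np.rint(preds).astype(int)
    qwk = cohen_kappa_score(labels.astype(int), rounded, weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Example:
# compute_metrics(np.array([2.1, 3.4, 1.0]), np.array([2, 3, 1]))
```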

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal Trainer configuration reproducing them is sketched after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
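
The sketch below shows a Trainer setup matching these hyperparameters. The dummy dataset, output path, and num_labels=1 regression head are assumptions (the actual training data and script are not released); the reported MSE/RMSE metrics motivate the single-output head.

```python
# Sketch of a Trainer configuration reproducing the hyperparameters above.
# Dummy data and the num_labels=1 head are assumptions, not the released setup.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)

# Tiny stand-in for the (unreleased) essay data; float labels imply regression.
data = Dataset.from_dict({"text": ["نص تجريبي"], "label": [1.0]})
data = data.map(lambda b: tokenizer(b["text"], truncation=True, max_length=128), batched=True)

args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",  # default AdamW uses betas=(0.9, 0.999), eps=1e-8
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,        # the results table below evaluates every 2 steps
    logging_steps=500,   # training loss first appears at step 500
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=data,
    eval_dataset=data,
    tokenizer=tokenizer,
    # compute_metrics could be the QWK/MSE/RMSE sketch shown earlier.
)
# trainer.train()
```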

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.25 2 4.1514 0.0182 4.1514 2.0375
No log 0.5 4 2.2956 0.0542 2.2956 1.5151
No log 0.75 6 1.7754 0.0318 1.7754 1.3324
No log 1.0 8 1.7704 0.0639 1.7704 1.3306
No log 1.25 10 1.3770 0.1033 1.3770 1.1735
No log 1.5 12 1.0566 0.3082 1.0566 1.0279
No log 1.75 14 1.1772 0.0938 1.1772 1.0850
No log 2.0 16 1.2745 0.0201 1.2745 1.1290
No log 2.25 18 1.2741 -0.0064 1.2741 1.1288
No log 2.5 20 1.3257 -0.0212 1.3257 1.1514
No log 2.75 22 1.3142 0.0075 1.3142 1.1464
No log 3.0 24 1.2565 0.0489 1.2565 1.1209
No log 3.25 26 1.2373 0.0065 1.2373 1.1124
No log 3.5 28 1.2797 0.1091 1.2797 1.1313
No log 3.75 30 1.2768 0.0462 1.2768 1.1299
No log 4.0 32 1.1643 0.1576 1.1643 1.0790
No log 4.25 34 1.1125 0.1997 1.1125 1.0548
No log 4.5 36 1.1408 0.1498 1.1408 1.0681
No log 4.75 38 1.1909 0.0584 1.1909 1.0913
No log 5.0 40 1.1800 0.1805 1.1800 1.0863
No log 5.25 42 1.1552 0.1725 1.1552 1.0748
No log 5.5 44 1.1724 0.0823 1.1724 1.0828
No log 5.75 46 1.2622 -0.0112 1.2622 1.1235
No log 6.0 48 1.2964 -0.0833 1.2964 1.1386
No log 6.25 50 1.3280 -0.0255 1.3280 1.1524
No log 6.5 52 1.3575 -0.0112 1.3575 1.1651
No log 6.75 54 1.4735 -0.1798 1.4735 1.2139
No log 7.0 56 1.5852 -0.1798 1.5852 1.2590
No log 7.25 58 1.7251 -0.0541 1.7251 1.3134
No log 7.5 60 1.6422 0.0667 1.6422 1.2815
No log 7.75 62 1.4021 0.1663 1.4021 1.1841
No log 8.0 64 1.3872 0.0978 1.3872 1.1778
No log 8.25 66 1.5745 0.1442 1.5745 1.2548
No log 8.5 68 1.5528 0.1545 1.5528 1.2461
No log 8.75 70 1.3817 0.0920 1.3817 1.1755
No log 9.0 72 1.3117 0.0909 1.3117 1.1453
No log 9.25 74 1.2993 0.0817 1.2993 1.1399
No log 9.5 76 1.3029 0.1275 1.3029 1.1415
No log 9.75 78 1.2322 0.2125 1.2322 1.1100
No log 10.0 80 1.2489 0.1135 1.2489 1.1175
No log 10.25 82 1.2458 0.2340 1.2458 1.1162
No log 10.5 84 1.3390 0.1962 1.3390 1.1572
No log 10.75 86 1.5496 0.2391 1.5496 1.2448
No log 11.0 88 1.4669 0.2126 1.4669 1.2112
No log 11.25 90 1.2926 0.1053 1.2926 1.1369
No log 11.5 92 1.1472 0.0954 1.1472 1.0711
No log 11.75 94 1.0923 0.2492 1.0923 1.0451
No log 12.0 96 1.0790 0.3129 1.0790 1.0387
No log 12.25 98 1.1705 0.2250 1.1705 1.0819
No log 12.5 100 1.3731 0.2126 1.3731 1.1718
No log 12.75 102 1.3626 0.2424 1.3626 1.1673
No log 13.0 104 1.2653 0.2687 1.2653 1.1248
No log 13.25 106 1.1904 0.1598 1.1904 1.0911
No log 13.5 108 1.0744 0.2887 1.0744 1.0365
No log 13.75 110 1.0515 0.2887 1.0515 1.0254
No log 14.0 112 1.0941 0.2815 1.0941 1.0460
No log 14.25 114 1.0988 0.3099 1.0988 1.0482
No log 14.5 116 1.1547 0.1573 1.1547 1.0746
No log 14.75 118 1.2115 0.2206 1.2115 1.1007
No log 15.0 120 1.2237 0.1898 1.2237 1.1062
No log 15.25 122 1.1477 0.1500 1.1477 1.0713
No log 15.5 124 1.1833 0.1581 1.1833 1.0878
No log 15.75 126 1.2983 0.1652 1.2983 1.1394
No log 16.0 128 1.3074 0.1713 1.3074 1.1434
No log 16.25 130 1.3348 0.1713 1.3348 1.1553
No log 16.5 132 1.2755 0.2049 1.2755 1.1294
No log 16.75 134 1.2436 0.1928 1.2436 1.1152
No log 17.0 136 1.2142 0.1793 1.2142 1.1019
No log 17.25 138 1.1313 0.1863 1.1313 1.0636
No log 17.5 140 1.1368 0.1863 1.1368 1.0662
No log 17.75 142 1.1192 0.1863 1.1192 1.0579
No log 18.0 144 1.0994 0.1282 1.0994 1.0485
No log 18.25 146 1.0578 0.1823 1.0578 1.0285
No log 18.5 148 1.0274 0.2263 1.0274 1.0136
No log 18.75 150 1.1359 0.2195 1.1359 1.0658
No log 19.0 152 1.2417 0.2520 1.2417 1.1143
No log 19.25 154 1.2229 0.3266 1.2229 1.1058
No log 19.5 156 1.1215 0.2768 1.1215 1.0590
No log 19.75 158 1.0421 0.3577 1.0421 1.0209
No log 20.0 160 1.0161 0.2963 1.0161 1.0080
No log 20.25 162 0.9874 0.2647 0.9874 0.9937
No log 20.5 164 1.0386 0.0220 1.0386 1.0191
No log 20.75 166 1.1577 0.2149 1.1577 1.0760
No log 21.0 168 1.2361 0.2315 1.2361 1.1118
No log 21.25 170 1.1757 0.1770 1.1757 1.0843
No log 21.5 172 1.0802 0.0602 1.0802 1.0393
No log 21.75 174 1.0827 0.0602 1.0827 1.0405
No log 22.0 176 1.1772 0.1793 1.1772 1.0850
No log 22.25 178 1.1949 0.2177 1.1949 1.0931
No log 22.5 180 1.1604 0.2410 1.1604 1.0772
No log 22.75 182 1.1497 0.2026 1.1497 1.0723
No log 23.0 184 1.0364 0.2704 1.0364 1.0180
No log 23.25 186 0.9566 0.2359 0.9566 0.9780
No log 23.5 188 0.9645 0.2066 0.9645 0.9821
No log 23.75 190 1.0154 0.1434 1.0154 1.0077
No log 24.0 192 1.1833 0.2542 1.1833 1.0878
No log 24.25 194 1.2274 0.2313 1.2274 1.1079
No log 24.5 196 1.1395 0.2667 1.1395 1.0675
No log 24.75 198 1.0430 0.3590 1.0430 1.0213
No log 25.0 200 1.0648 0.3480 1.0648 1.0319
No log 25.25 202 1.1153 0.2837 1.1153 1.0561
No log 25.5 204 1.1475 0.2089 1.1475 1.0712
No log 25.75 206 1.0649 0.1202 1.0649 1.0319
No log 26.0 208 0.9961 0.0864 0.9961 0.9980
No log 26.25 210 0.9779 0.0864 0.9779 0.9889
No log 26.5 212 0.9761 0.1873 0.9761 0.9880
No log 26.75 214 0.9761 0.2553 0.9761 0.9880
No log 27.0 216 0.9894 0.3107 0.9894 0.9947
No log 27.25 218 1.0072 0.3103 1.0072 1.0036
No log 27.5 220 1.0498 0.3692 1.0498 1.0246
No log 27.75 222 1.1178 0.3396 1.1178 1.0573
No log 28.0 224 1.1147 0.3040 1.1147 1.0558
No log 28.25 226 1.0772 0.2359 1.0772 1.0379
No log 28.5 228 0.9823 0.2623 0.9823 0.9911
No log 28.75 230 0.9271 0.3631 0.9271 0.9628
No log 29.0 232 0.9276 0.3236 0.9276 0.9631
No log 29.25 234 0.9824 0.1582 0.9824 0.9912
No log 29.5 236 1.0913 0.1587 1.0913 1.0447
No log 29.75 238 1.1448 0.2772 1.1448 1.0699
No log 30.0 240 1.1553 0.2770 1.1553 1.0748
No log 30.25 242 1.1359 0.2127 1.1359 1.0658
No log 30.5 244 1.0730 0.2640 1.0730 1.0358
No log 30.75 246 1.0421 0.2432 1.0421 1.0208
No log 31.0 248 1.0517 0.2296 1.0517 1.0255
No log 31.25 250 1.0878 0.2489 1.0878 1.0430
No log 31.5 252 1.0762 0.2897 1.0762 1.0374
No log 31.75 254 1.0764 0.2812 1.0764 1.0375
No log 32.0 256 1.0303 0.2298 1.0303 1.0150
No log 32.25 258 0.9921 0.2117 0.9921 0.9961
No log 32.5 260 0.9822 0.2238 0.9822 0.9910
No log 32.75 262 1.0075 0.1474 1.0075 1.0037
No log 33.0 264 1.0426 0.1017 1.0426 1.0211
No log 33.25 266 1.0867 0.1986 1.0867 1.0424
No log 33.5 268 1.0711 0.2726 1.0711 1.0349
No log 33.75 270 1.0412 0.3333 1.0412 1.0204
No log 34.0 272 0.9571 0.2770 0.9571 0.9783
No log 34.25 274 0.9541 0.3161 0.9541 0.9768
No log 34.5 276 0.9835 0.2179 0.9835 0.9917
No log 34.75 278 1.0983 0.2750 1.0983 1.0480
No log 35.0 280 1.2440 0.2474 1.2440 1.1153
No log 35.25 282 1.2652 0.1744 1.2652 1.1248
No log 35.5 284 1.2035 0.1407 1.2035 1.0970
No log 35.75 286 1.0800 0.0931 1.0800 1.0392
No log 36.0 288 0.9855 0.1446 0.9855 0.9927
No log 36.25 290 0.9430 0.2325 0.9430 0.9711
No log 36.5 292 0.9563 0.2200 0.9563 0.9779
No log 36.75 294 1.0122 0.1823 1.0122 1.0061
No log 37.0 296 1.0386 0.2726 1.0386 1.0191
No log 37.25 298 1.1024 0.2614 1.1024 1.0499
No log 37.5 300 1.0900 0.2308 1.0900 1.0441
No log 37.75 302 1.0750 0.0961 1.0750 1.0368
No log 38.0 304 1.0253 0.0961 1.0253 1.0126
No log 38.25 306 1.0067 0.1017 1.0067 1.0033
No log 38.5 308 1.0342 0.0587 1.0342 1.0170
No log 38.75 310 1.0640 0.0433 1.0640 1.0315
No log 39.0 312 1.0963 0.0811 1.0963 1.0470
No log 39.25 314 1.1653 0.1170 1.1653 1.0795
No log 39.5 316 1.2172 0.2686 1.2172 1.1033
No log 39.75 318 1.1770 0.2896 1.1770 1.0849
No log 40.0 320 1.1625 0.2623 1.1625 1.0782
No log 40.25 322 1.1371 0.2623 1.1371 1.0664
No log 40.5 324 1.1578 0.2495 1.1578 1.0760
No log 40.75 326 1.1206 0.2038 1.1206 1.0586
No log 41.0 328 1.0499 0.2331 1.0499 1.0246
No log 41.25 330 1.0393 0.1823 1.0393 1.0194
No log 41.5 332 1.0407 0.1823 1.0407 1.0202
No log 41.75 334 1.0805 0.1797 1.0805 1.0395
No log 42.0 336 1.0748 0.1407 1.0748 1.0367
No log 42.25 338 1.0345 0.1823 1.0345 1.0171
No log 42.5 340 1.0304 0.1474 1.0304 1.0151
No log 42.75 342 1.0686 0.1047 1.0686 1.0337
No log 43.0 344 1.1269 0.0990 1.1269 1.0616
No log 43.25 346 1.1338 0.0990 1.1338 1.0648
No log 43.5 348 1.1043 0.0990 1.1043 1.0509
No log 43.75 350 1.0216 0.1474 1.0216 1.0107
No log 44.0 352 0.9687 0.2226 0.9687 0.9842
No log 44.25 354 0.9528 0.2226 0.9528 0.9761
No log 44.5 356 0.9745 0.2674 0.9745 0.9872
No log 44.75 358 1.0451 0.1407 1.0451 1.0223
No log 45.0 360 1.0875 0.1986 1.0875 1.0428
No log 45.25 362 1.1020 0.2614 1.1020 1.0498
No log 45.5 364 1.0467 0.3021 1.0467 1.0231
No log 45.75 366 0.9634 0.3124 0.9634 0.9815
No log 46.0 368 0.9046 0.3922 0.9046 0.9511
No log 46.25 370 0.8686 0.4337 0.8686 0.9320
No log 46.5 372 0.8609 0.3795 0.8609 0.9279
No log 46.75 374 0.8679 0.4337 0.8679 0.9316
No log 47.0 376 0.9150 0.3922 0.9150 0.9566
No log 47.25 378 0.9988 0.1702 0.9988 0.9994
No log 47.5 380 1.0311 0.2416 1.0311 1.0154
No log 47.75 382 1.0079 0.1823 1.0079 1.0039
No log 48.0 384 0.9554 0.2529 0.9554 0.9774
No log 48.25 386 0.9275 0.3250 0.9275 0.9631
No log 48.5 388 0.9143 0.4468 0.9143 0.9562
No log 48.75 390 0.9363 0.2698 0.9363 0.9676
No log 49.0 392 0.9764 0.2117 0.9764 0.9881
No log 49.25 394 1.0156 0.1823 1.0156 1.0078
No log 49.5 396 0.9984 0.1823 0.9984 0.9992
No log 49.75 398 0.9403 0.2375 0.9403 0.9697
No log 50.0 400 0.9181 0.3753 0.9181 0.9582
No log 50.25 402 0.9117 0.3498 0.9117 0.9548
No log 50.5 404 0.9178 0.3498 0.9178 0.9580
No log 50.75 406 0.9317 0.2921 0.9317 0.9653
No log 51.0 408 0.9406 0.3214 0.9406 0.9699
No log 51.25 410 0.9339 0.3214 0.9339 0.9664
No log 51.5 412 0.9522 0.3107 0.9522 0.9758
No log 51.75 414 0.9876 0.3024 0.9876 0.9938
No log 52.0 416 0.9932 0.3584 0.9932 0.9966
No log 52.25 418 0.9758 0.2723 0.9758 0.9878
No log 52.5 420 0.9500 0.2698 0.9500 0.9747
No log 52.75 422 0.9511 0.2375 0.9511 0.9752
No log 53.0 424 0.9722 0.2350 0.9722 0.9860
No log 53.25 426 0.9807 0.2350 0.9807 0.9903
No log 53.5 428 1.0050 0.2200 1.0050 1.0025
No log 53.75 430 1.0151 0.2529 1.0151 1.0075
No log 54.0 432 1.0536 0.1823 1.0536 1.0265
No log 54.25 434 1.0824 0.1343 1.0824 1.0404
No log 54.5 436 1.1472 0.2577 1.1472 1.0711
No log 54.75 438 1.1692 0.2623 1.1692 1.0813
No log 55.0 440 1.1656 0.2623 1.1656 1.0796
No log 55.25 442 1.1179 0.2577 1.1179 1.0573
No log 55.5 444 1.0583 0.2051 1.0583 1.0288
No log 55.75 446 1.0235 0.2535 1.0235 1.0117
No log 56.0 448 1.0181 0.1873 1.0181 1.0090
No log 56.25 450 1.0198 0.2213 1.0198 1.0099
No log 56.5 452 1.0626 0.1654 1.0626 1.0308
No log 56.75 454 1.1091 0.2528 1.1091 1.0531
No log 57.0 456 1.1319 0.2816 1.1319 1.0639
No log 57.25 458 1.1652 0.2495 1.1652 1.0795
No log 57.5 460 1.1756 0.2686 1.1756 1.0843
No log 57.75 462 1.1520 0.2395 1.1520 1.0733
No log 58.0 464 1.1146 0.1434 1.1146 1.0557
No log 58.25 466 1.0570 0.1017 1.0570 1.0281
No log 58.5 468 1.0136 0.1017 1.0136 1.0068
No log 58.75 470 0.9675 0.3648 0.9675 0.9836
No log 59.0 472 0.9593 0.3714 0.9593 0.9794
No log 59.25 474 0.9674 0.3525 0.9674 0.9836
No log 59.5 476 1.0011 0.3298 1.0011 1.0005
No log 59.75 478 1.0123 0.3298 1.0123 1.0061
No log 60.0 480 1.0043 0.2916 1.0043 1.0021
No log 60.25 482 0.9705 0.3236 0.9705 0.9851
No log 60.5 484 0.9432 0.3147 0.9432 0.9712
No log 60.75 486 0.9308 0.3147 0.9308 0.9648
No log 61.0 488 0.9455 0.3147 0.9455 0.9724
No log 61.25 490 0.9764 0.1446 0.9764 0.9881
No log 61.5 492 0.9945 0.1446 0.9945 0.9973
No log 61.75 494 1.0437 0.1379 1.0437 1.0216
No log 62.0 496 1.0812 0.2227 1.0812 1.0398
No log 62.25 498 1.1184 0.2896 1.1184 1.0575
0.2175 62.5 500 1.1231 0.2623 1.1231 1.0598
0.2175 62.75 502 1.0924 0.2227 1.0924 1.0452
0.2175 63.0 504 1.0759 0.2227 1.0759 1.0372
0.2175 63.25 506 1.0651 0.2227 1.0651 1.0321
0.2175 63.5 508 1.0434 0.1986 1.0434 1.0215
0.2175 63.75 510 1.0343 0.2614 1.0343 1.0170

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
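
With the framework versions above, the fine-tuned weights can be loaded directly from the Hub. The sketch below assumes the checkpoint carries a single-output sequence-classification (regression-style) head, which matches the reported MSE/QWK evaluation but is not stated explicitly in this card.

```python
# Minimal inference sketch (assumption: single-output regression-style head).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)  # "essay text here"
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)  # continuous organization score; round it if discrete labels are needed
```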