ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1123
  • Qwk (quadratic weighted kappa): 0.2282
  • Mse: 1.1123
  • Rmse: 1.0547
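
For reference, a minimal sketch of how Qwk (quadratic weighted kappa) and Rmse are typically computed for ordinal scores like these. The evaluation code for this run is not published, so the function names below are illustrative, not the run's actual metrics code:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, the usual 'Qwk' metric
    for ordinal labels (agreement between two integer label lists)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Observed confusion matrix.
    obs = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        obs[t, p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    idx = np.arange(n_classes)
    w = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected counts under chance agreement, scaled to the same total.
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / obs.sum()
    return 1.0 - (w * obs).sum() / (w * exp).sum()

def rmse(y_true, y_pred):
    """Root mean squared error; note Rmse above is exactly sqrt(Mse)."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))
```

Note that in the results above Loss and Mse are identical, which suggests the model was trained as a regressor with an MSE objective.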

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
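
A sketch of what the `linear` scheduler above implies, assuming the Transformers default of zero warmup steps (this mirrors the behavior of `get_linear_schedule_with_warmup`; it is not the run's actual training code): the learning rate decays linearly from 2e-05 to 0 over the full run.

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule:
    ramp up over warmup_steps, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)

# With no warmup the rate starts at 2e-05, halves at the midpoint,
# and reaches 0 at the final step. 2900 = 100 epochs x 29 steps/epoch,
# the steps-per-epoch figure inferred from the training log below.
print(linear_lr(0, 2900), linear_lr(1450, 2900), linear_lr(2900, 2900))
```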

Training results

("No log" means the training loss had not yet been reported; it is first logged at step 500, the default logging interval.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0690 2 2.5453 -0.0702 2.5453 1.5954
No log 0.1379 4 1.4224 0.0698 1.4224 1.1926
No log 0.2069 6 1.3452 -0.2040 1.3452 1.1598
No log 0.2759 8 1.0964 -0.1887 1.0964 1.0471
No log 0.3448 10 0.9759 -0.0475 0.9759 0.9879
No log 0.4138 12 0.8375 0.0410 0.8375 0.9152
No log 0.4828 14 0.7690 0.0410 0.7690 0.8769
No log 0.5517 16 0.7869 0.0410 0.7869 0.8871
No log 0.6207 18 0.8093 0.0410 0.8093 0.8996
No log 0.6897 20 0.7982 0.0444 0.7982 0.8934
No log 0.7586 22 0.7896 0.0410 0.7896 0.8886
No log 0.8276 24 0.9492 0.0520 0.9492 0.9743
No log 0.8966 26 1.4011 -0.1847 1.4011 1.1837
No log 0.9655 28 1.4887 -0.3187 1.4887 1.2201
No log 1.0345 30 1.3264 -0.2966 1.3264 1.1517
No log 1.1034 32 1.1235 -0.1224 1.1235 1.0599
No log 1.1724 34 0.9535 -0.0444 0.9535 0.9765
No log 1.2414 36 0.8409 0.0 0.8409 0.9170
No log 1.3103 38 0.7680 0.0 0.7680 0.8764
No log 1.3793 40 0.7320 0.0840 0.7320 0.8556
No log 1.4483 42 0.7030 0.2046 0.7030 0.8384
No log 1.5172 44 0.7116 0.2206 0.7116 0.8436
No log 1.5862 46 0.8470 0.2574 0.8470 0.9203
No log 1.6552 48 0.9467 0.2651 0.9467 0.9730
No log 1.7241 50 0.9298 0.2995 0.9298 0.9643
No log 1.7931 52 0.7766 0.4307 0.7766 0.8812
No log 1.8621 54 0.6628 0.2374 0.6628 0.8142
No log 1.9310 56 0.7288 0.2073 0.7288 0.8537
No log 2.0 58 0.7525 0.2703 0.7525 0.8675
No log 2.0690 60 0.7495 0.2736 0.7495 0.8657
No log 2.1379 62 0.8000 0.3270 0.8000 0.8944
No log 2.2069 64 0.7066 0.1673 0.7066 0.8406
No log 2.2759 66 0.6684 0.1440 0.6684 0.8176
No log 2.3448 68 0.6724 0.2847 0.6724 0.8200
No log 2.4138 70 0.7599 0.3425 0.7599 0.8717
No log 2.4828 72 0.6984 0.3099 0.6984 0.8357
No log 2.5517 74 0.6763 0.3020 0.6763 0.8223
No log 2.6207 76 0.8037 0.3042 0.8037 0.8965
No log 2.6897 78 0.9913 0.2857 0.9913 0.9957
No log 2.7586 80 1.0309 0.2756 1.0309 1.0153
No log 2.8276 82 1.0378 0.2756 1.0378 1.0187
No log 2.8966 84 1.3013 0.0770 1.3013 1.1408
No log 2.9655 86 1.2751 0.0072 1.2751 1.1292
No log 3.0345 88 1.0300 0.1528 1.0300 1.0149
No log 3.1034 90 0.8254 0.2492 0.8254 0.9085
No log 3.1724 92 0.8447 0.2364 0.8447 0.9191
No log 3.2414 94 1.0338 0.2192 1.0338 1.0168
No log 3.3103 96 1.1057 0.2363 1.1057 1.0515
No log 3.3793 98 1.2128 0.2815 1.2128 1.1013
No log 3.4483 100 1.2848 0.2638 1.2848 1.1335
No log 3.5172 102 1.2701 0.2552 1.2701 1.1270
No log 3.5862 104 1.1458 0.1618 1.1458 1.0704
No log 3.6552 106 1.1997 0.1320 1.1997 1.0953
No log 3.7241 108 1.2963 0.1638 1.2963 1.1386
No log 3.7931 110 1.4326 0.1313 1.4326 1.1969
No log 3.8621 112 1.3054 0.1222 1.3054 1.1426
No log 3.9310 114 1.2440 0.1740 1.2440 1.1153
No log 4.0 116 1.1805 0.2707 1.1805 1.0865
No log 4.0690 118 1.1308 0.2707 1.1308 1.0634
No log 4.1379 120 1.0854 0.2964 1.0854 1.0418
No log 4.2069 122 1.1987 0.2580 1.1987 1.0948
No log 4.2759 124 1.1554 0.2358 1.1554 1.0749
No log 4.3448 126 1.0050 0.2487 1.0050 1.0025
No log 4.4138 128 0.9770 0.2336 0.9770 0.9885
No log 4.4828 130 1.0938 0.2659 1.0938 1.0459
No log 4.5517 132 1.0999 0.2707 1.0999 1.0487
No log 4.6207 134 1.1431 0.2567 1.1431 1.0691
No log 4.6897 136 1.1143 0.2876 1.1143 1.0556
No log 4.7586 138 1.1411 0.2876 1.1411 1.0682
No log 4.8276 140 1.0561 0.2910 1.0561 1.0277
No log 4.8966 142 1.1131 0.2687 1.1131 1.0550
No log 4.9655 144 1.2438 0.2142 1.2438 1.1153
No log 5.0345 146 1.0958 0.2590 1.0958 1.0468
No log 5.1034 148 0.9215 0.2492 0.9215 0.9599
No log 5.1724 150 0.9948 0.2977 0.9948 0.9974
No log 5.2414 152 1.1905 0.2168 1.1905 1.0911
No log 5.3103 154 1.3375 0.1670 1.3375 1.1565
No log 5.3793 156 1.2240 0.2358 1.2240 1.1063
No log 5.4483 158 1.0228 0.2977 1.0228 1.0113
No log 5.5172 160 0.8734 0.2662 0.8734 0.9346
No log 5.5862 162 0.8567 0.3032 0.8567 0.9256
No log 5.6552 164 0.9426 0.2781 0.9426 0.9709
No log 5.7241 166 1.0423 0.3137 1.0423 1.0209
No log 5.7931 168 0.9723 0.2781 0.9723 0.9861
No log 5.8621 170 0.9571 0.2781 0.9571 0.9783
No log 5.9310 172 0.9413 0.2836 0.9413 0.9702
No log 6.0 174 1.0835 0.2977 1.0835 1.0409
No log 6.0690 176 1.3014 0.1896 1.3014 1.1408
No log 6.1379 178 1.3541 0.1473 1.3541 1.1636
No log 6.2069 180 1.2290 0.1873 1.2290 1.1086
No log 6.2759 182 1.0542 0.1814 1.0542 1.0268
No log 6.3448 184 0.9741 0.2751 0.9741 0.9870
No log 6.4138 186 0.9736 0.2467 0.9736 0.9867
No log 6.4828 188 1.1044 0.0692 1.1044 1.0509
No log 6.5517 190 1.1811 0.1492 1.1811 1.0868
No log 6.6207 192 1.2348 0.0424 1.2348 1.1112
No log 6.6897 194 1.1672 0.1457 1.1672 1.0803
No log 6.7586 196 1.0594 0.1747 1.0594 1.0293
No log 6.8276 198 1.0449 0.1775 1.0449 1.0222
No log 6.8966 200 1.0514 0.1897 1.0514 1.0254
No log 6.9655 202 1.1900 0.1908 1.1900 1.0909
No log 7.0345 204 1.3164 0.1605 1.3164 1.1473
No log 7.1034 206 1.3235 0.1479 1.3235 1.1504
No log 7.1724 208 1.2338 0.2518 1.2338 1.1108
No log 7.2414 210 1.0942 0.3264 1.0942 1.0460
No log 7.3103 212 1.0570 0.3359 1.0570 1.0281
No log 7.3793 214 1.0527 0.2316 1.0527 1.0260
No log 7.4483 216 1.0572 0.2833 1.0572 1.0282
No log 7.5172 218 1.0296 0.3892 1.0296 1.0147
No log 7.5862 220 0.9556 0.3347 0.9556 0.9776
No log 7.6552 222 1.0129 0.3517 1.0129 1.0065
No log 7.7241 224 1.2213 0.2579 1.2213 1.1051
No log 7.7931 226 1.2493 0.2243 1.2493 1.1177
No log 7.8621 228 1.1671 0.3264 1.1671 1.0803
No log 7.9310 230 1.1199 0.3193 1.1199 1.0582
No log 8.0 232 1.0080 0.2964 1.0080 1.0040
No log 8.0690 234 1.0667 0.2577 1.0667 1.0328
No log 8.1379 236 1.3783 0.1701 1.3783 1.1740
No log 8.2069 238 1.9061 0.1123 1.9061 1.3806
No log 8.2759 240 1.8737 0.0986 1.8737 1.3688
No log 8.3448 242 1.5625 0.0881 1.5625 1.2500
No log 8.4138 244 1.2192 0.2579 1.2192 1.1042
No log 8.4828 246 0.9108 0.2726 0.9108 0.9544
No log 8.5517 248 0.8519 0.1416 0.8519 0.9230
No log 8.6207 250 0.8828 0.1373 0.8828 0.9396
No log 8.6897 252 1.0296 0.3051 1.0296 1.0147
No log 8.7586 254 1.3965 0.1874 1.3965 1.1818
No log 8.8276 256 1.7094 0.0583 1.7094 1.3074
No log 8.8966 258 1.7008 0.0805 1.7008 1.3041
No log 8.9655 260 1.4709 0.1310 1.4709 1.2128
No log 9.0345 262 1.2912 0.2591 1.2912 1.1363
No log 9.1034 264 1.1542 0.3110 1.1542 1.0743
No log 9.1724 266 1.0716 0.2297 1.0716 1.0352
No log 9.2414 268 1.1445 0.3247 1.1445 1.0698
No log 9.3103 270 1.4008 0.1310 1.4008 1.1835
No log 9.3793 272 1.5354 0.1010 1.5354 1.2391
No log 9.4483 274 1.5357 0.0789 1.5357 1.2392
No log 9.5172 276 1.3495 0.1951 1.3495 1.1617
No log 9.5862 278 1.1327 0.3088 1.1327 1.0643
No log 9.6552 280 1.0282 0.2756 1.0282 1.0140
No log 9.7241 282 1.0785 0.2297 1.0785 1.0385
No log 9.7931 284 1.2803 0.1956 1.2803 1.1315
No log 9.8621 286 1.4595 0.0876 1.4595 1.2081
No log 9.9310 288 1.5436 0.0260 1.5436 1.2424
No log 10.0 290 1.5623 0.0606 1.5623 1.2499
No log 10.0690 292 1.5525 0.0847 1.5525 1.2460
No log 10.1379 294 1.4074 0.1823 1.4074 1.1863
No log 10.2069 296 1.1822 0.2522 1.1822 1.0873
No log 10.2759 298 1.1437 0.2522 1.1437 1.0695
No log 10.3448 300 1.0739 0.2926 1.0739 1.0363
No log 10.4138 302 0.9725 0.3455 0.9725 0.9862
No log 10.4828 304 0.9310 0.3516 0.9310 0.9649
No log 10.5517 306 0.9793 0.3347 0.9793 0.9896
No log 10.6207 308 1.0167 0.3455 1.0167 1.0083
No log 10.6897 310 1.0232 0.3576 1.0232 1.0115
No log 10.7586 312 0.9329 0.3110 0.9329 0.9659
No log 10.8276 314 0.8895 0.2463 0.8895 0.9431
No log 10.8966 316 0.9434 0.3294 0.9434 0.9713
No log 10.9655 318 1.0656 0.3481 1.0656 1.0323
No log 11.0345 320 1.2166 0.2319 1.2166 1.1030
No log 11.1034 322 1.2682 0.2622 1.2682 1.1261
No log 11.1724 324 1.1598 0.2961 1.1598 1.0769
No log 11.2414 326 1.0200 0.2810 1.0200 1.0100
No log 11.3103 328 0.9203 0.0715 0.9203 0.9593
No log 11.3793 330 0.9019 0.1142 0.9019 0.9497
No log 11.4483 332 0.9514 0.1501 0.9514 0.9754
No log 11.5172 334 1.0678 0.3395 1.0678 1.0334
No log 11.5862 336 1.2057 0.2297 1.2057 1.0981
No log 11.6552 338 1.2472 0.1985 1.2472 1.1168
No log 11.7241 340 1.1796 0.2184 1.1796 1.0861
No log 11.7931 342 1.0324 0.2939 1.0324 1.0161
No log 11.8621 344 0.9520 0.2308 0.9520 0.9757
No log 11.9310 346 0.9529 0.2308 0.9529 0.9762
No log 12.0 348 0.9908 0.3676 0.9908 0.9954
No log 12.0690 350 1.0699 0.3225 1.0699 1.0344
No log 12.1379 352 1.1258 0.2145 1.1258 1.0611
No log 12.2069 354 1.1724 0.1860 1.1724 1.0828
No log 12.2759 356 1.2032 0.1985 1.2032 1.0969
No log 12.3448 358 1.1858 0.2056 1.1858 1.0889
No log 12.4138 360 1.1557 0.2622 1.1557 1.0750
No log 12.4828 362 1.0981 0.3140 1.0981 1.0479
No log 12.5517 364 0.9709 0.3359 0.9709 0.9853
No log 12.6207 366 0.8607 0.3473 0.8607 0.9277
No log 12.6897 368 0.7994 0.2063 0.7994 0.8941
No log 12.7586 370 0.8257 0.2518 0.8257 0.9087
No log 12.8276 372 0.9385 0.3051 0.9385 0.9688
No log 12.8966 374 1.1399 0.2579 1.1399 1.0676
No log 12.9655 376 1.3647 0.2099 1.3647 1.1682
No log 13.0345 378 1.5144 0.1495 1.5144 1.2306
No log 13.1034 380 1.5080 0.1495 1.5080 1.2280
No log 13.1724 382 1.4742 0.1763 1.4742 1.2142
No log 13.2414 384 1.3589 0.1864 1.3589 1.1657
No log 13.3103 386 1.1941 0.2358 1.1941 1.0927
No log 13.3793 388 1.0621 0.2529 1.0621 1.0306
No log 13.4483 390 1.0251 0.2389 1.0251 1.0125
No log 13.5172 392 1.0566 0.2482 1.0566 1.0279
No log 13.5862 394 1.1462 0.2914 1.1462 1.0706
No log 13.6552 396 1.2111 0.2713 1.2111 1.1005
No log 13.7241 398 1.2204 0.1968 1.2204 1.1047
No log 13.7931 400 1.2010 0.2006 1.2010 1.0959
No log 13.8621 402 1.2485 0.2130 1.2485 1.1173
No log 13.9310 404 1.3226 0.1508 1.3226 1.1500
No log 14.0 406 1.2864 0.2417 1.2864 1.1342
No log 14.0690 408 1.1407 0.2457 1.1407 1.0680
No log 14.1379 410 0.9768 0.3287 0.9768 0.9884
No log 14.2069 412 0.8737 0.2518 0.8737 0.9347
No log 14.2759 414 0.8600 0.2518 0.8600 0.9274
No log 14.3448 416 0.8886 0.2982 0.8886 0.9426
No log 14.4138 418 0.9871 0.3938 0.9871 0.9935
No log 14.4828 420 1.1414 0.2799 1.1414 1.0684
No log 14.5517 422 1.1561 0.2843 1.1561 1.0752
No log 14.6207 424 1.1299 0.2710 1.1299 1.0630
No log 14.6897 426 1.0088 0.3516 1.0088 1.0044
No log 14.7586 428 0.9327 0.3110 0.9327 0.9657
No log 14.8276 430 0.9166 0.2518 0.9166 0.9574
No log 14.8966 432 0.9083 0.2518 0.9083 0.9531
No log 14.9655 434 0.9159 0.2518 0.9159 0.9570
No log 15.0345 436 0.9121 0.2142 0.9121 0.9551
No log 15.1034 438 0.9574 0.3110 0.9574 0.9784
No log 15.1724 440 0.9640 0.3110 0.9640 0.9818
No log 15.2414 442 0.8985 0.2615 0.8985 0.9479
No log 15.3103 444 0.8710 0.2358 0.8710 0.9333
No log 15.3793 446 0.8916 0.2866 0.8916 0.9443
No log 15.4483 448 0.9205 0.2728 0.9205 0.9594
No log 15.5172 450 0.9694 0.2881 0.9694 0.9846
No log 15.5862 452 1.0474 0.3445 1.0474 1.0234
No log 15.6552 454 1.0238 0.3324 1.0238 1.0118
No log 15.7241 456 0.9071 0.2866 0.9071 0.9524
No log 15.7931 458 0.8244 0.2784 0.8244 0.9080
No log 15.8621 460 0.7681 0.2527 0.7681 0.8764
No log 15.9310 462 0.7799 0.2817 0.7799 0.8831
No log 16.0 464 0.8277 0.2843 0.8277 0.9098
No log 16.0690 466 0.9057 0.3538 0.9057 0.9517
No log 16.1379 468 0.9828 0.3269 0.9828 0.9913
No log 16.2069 470 1.0301 0.3688 1.0301 1.0149
No log 16.2759 472 0.9876 0.3251 0.9876 0.9938
No log 16.3448 474 0.9553 0.3601 0.9553 0.9774
No log 16.4138 476 0.9145 0.3918 0.9145 0.9563
No log 16.4828 478 0.8031 0.3302 0.8031 0.8961
No log 16.5517 480 0.7555 0.2754 0.7555 0.8692
No log 16.6207 482 0.7974 0.3302 0.7974 0.8930
No log 16.6897 484 0.9436 0.3409 0.9436 0.9714
No log 16.7586 486 1.0342 0.2872 1.0342 1.0170
No log 16.8276 488 0.9773 0.3409 0.9773 0.9886
No log 16.8966 490 0.8885 0.3231 0.8885 0.9426
No log 16.9655 492 0.9129 0.3231 0.9129 0.9555
No log 17.0345 494 1.0265 0.3080 1.0265 1.0132
No log 17.1034 496 1.1948 0.2008 1.1948 1.0931
No log 17.1724 498 1.2405 0.1740 1.2405 1.1138
0.3144 17.2414 500 1.1572 0.2336 1.1572 1.0757
0.3144 17.3103 502 1.0214 0.3370 1.0214 1.0106
0.3144 17.3793 504 0.8892 0.3287 0.8892 0.9430
0.3144 17.4483 506 0.8354 0.3746 0.8354 0.9140
0.3144 17.5172 508 0.8439 0.3746 0.8439 0.9186
0.3144 17.5862 510 0.8780 0.3409 0.8780 0.9370
0.3144 17.6552 512 0.9665 0.3359 0.9665 0.9831
0.3144 17.7241 514 1.1534 0.2733 1.1534 1.0740
0.3144 17.7931 516 1.3200 0.1973 1.3200 1.1489
0.3144 17.8621 518 1.3647 0.1832 1.3647 1.1682
0.3144 17.9310 520 1.2963 0.2121 1.2963 1.1385
0.3144 18.0 522 1.1172 0.2755 1.1172 1.0570
0.3144 18.0690 524 1.0394 0.3193 1.0394 1.0195
0.3144 18.1379 526 1.0260 0.3849 1.0260 1.0129
0.3144 18.2069 528 1.0581 0.3849 1.0581 1.0286
0.3144 18.2759 530 1.0406 0.3849 1.0406 1.0201
0.3144 18.3448 532 1.0567 0.3849 1.0567 1.0280
0.3144 18.4138 534 1.1068 0.2802 1.1068 1.0521
0.3144 18.4828 536 1.2143 0.2043 1.2143 1.1019
0.3144 18.5517 538 1.2711 0.2043 1.2711 1.1274
0.3144 18.6207 540 1.2812 0.2008 1.2812 1.1319
0.3144 18.6897 542 1.2321 0.2280 1.2321 1.1100
0.3144 18.7586 544 1.1904 0.2421 1.1904 1.0911
0.3144 18.8276 546 1.1168 0.2478 1.1168 1.0568
0.3144 18.8966 548 1.0788 0.2780 1.0788 1.0386
0.3144 18.9655 550 1.0672 0.2780 1.0672 1.0330
0.3144 19.0345 552 1.0126 0.3460 1.0126 1.0063
0.3144 19.1034 554 0.9984 0.3460 0.9984 0.9992
0.3144 19.1724 556 0.9563 0.3787 0.9563 0.9779
0.3144 19.2414 558 0.9957 0.3460 0.9957 0.9978
0.3144 19.3103 560 1.0726 0.3140 1.0726 1.0357
0.3144 19.3793 562 1.0980 0.2827 1.0980 1.0479
0.3144 19.4483 564 1.1300 0.2168 1.1300 1.0630
0.3144 19.5172 566 1.1664 0.2522 1.1664 1.0800
0.3144 19.5862 568 1.2493 0.2184 1.2493 1.1177
0.3144 19.6552 570 1.3074 0.2020 1.3074 1.1434
0.3144 19.7241 572 1.2435 0.2020 1.2435 1.1151
0.3144 19.7931 574 1.1456 0.2107 1.1456 1.0703
0.3144 19.8621 576 1.1123 0.2282 1.1123 1.0547
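
One detail can be read off the log above: epoch 2.0 falls at step 58, i.e. 29 optimizer steps per epoch. A quick back-of-the-envelope check of what that implies for the (unreported) training-set size, assuming the last batch of each epoch may be partial:

```python
steps_per_epoch = 58 // 2   # epoch 2.0 is reached at step 58 in the log
batch_size = 8              # train_batch_size from the hyperparameters

# 29 batches of size 8, with the final batch possibly partial:
min_examples = (steps_per_epoch - 1) * batch_size + 1
max_examples = steps_per_epoch * batch_size
print(min_examples, max_examples)  # 225 232
```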

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.