ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k15_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the automatically generated card does not record it). It achieves the following results on the evaluation set:

  • Loss: 1.2621
  • Qwk (quadratic weighted kappa): 0.2059
  • Mse (mean squared error): 1.2621
  • Rmse (root mean squared error): 1.1234
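For reference, metrics of the kind reported above (QWK, MSE, RMSE) can be computed as in the following minimal pure-Python sketch. The label vectors are illustrative only, not this model's evaluation data:

```python
# Hedged sketch: computing quadratic weighted kappa (QWK), MSE, and RMSE
# for ordinal labels. Illustrative data, not the model's eval set.
import math

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    true_hist = [y_true.count(c) for c in range(num_classes)]
    pred_hist = [y_pred.count(c) for c in range(num_classes)]
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2   # quadratic penalty
            num += w * observed[i][j]                    # observed disagreement
            den += w * true_hist[i] * pred_hist[j] / n   # chance disagreement
    return 1.0 - num / den

y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 2, 2, 1, 1, 0]
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)
print(quadratic_weighted_kappa(y_true, y_pred, 3), mse, rmse)
```

Note that the reported Loss equals the reported Mse, which is consistent with (but does not prove) a regression-style training objective.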

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
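With lr_scheduler_type: linear, the learning rate decays linearly from its 2e-05 peak toward zero over the planned run. A minimal sketch of that schedule, assuming zero warmup steps (none are listed) and using the 36 optimizer steps per epoch visible in the results table below (epoch 1.0 at step 36):

```python
# Hedged sketch of the linear decay schedule implied by the hyperparameters.
# Zero warmup is assumed, since no warmup steps are listed above.
def linear_lr(step, total_steps, peak_lr=2e-05):
    # LR falls linearly from peak_lr at step 0 to 0 at total_steps.
    remaining = max(0.0, (total_steps - step) / total_steps)
    return peak_lr * remaining

total_steps = 100 * 36  # num_epochs x steps-per-epoch (36, per the results table)
print(linear_lr(0, total_steps))     # peak: 2e-05
print(linear_lr(1800, total_steps))  # halfway: 1e-05
print(linear_lr(3600, total_steps))  # end: 0.0
```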

Training results

The training loss column reads "No log" until the first logging step; the first logged value (0.2784) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0556 2 2.6546 -0.0593 2.6546 1.6293
No log 0.1111 4 1.2753 0.0997 1.2753 1.1293
No log 0.1667 6 0.9133 0.0535 0.9133 0.9556
No log 0.2222 8 0.8322 0.0771 0.8322 0.9123
No log 0.2778 10 0.7834 0.0410 0.7834 0.8851
No log 0.3333 12 0.8037 0.1321 0.8037 0.8965
No log 0.3889 14 0.7549 0.1184 0.7549 0.8688
No log 0.4444 16 0.7330 0.1184 0.7330 0.8562
No log 0.5 18 0.7192 0.0804 0.7192 0.8480
No log 0.5556 20 0.7348 0.1786 0.7348 0.8572
No log 0.6111 22 0.7275 0.1786 0.7275 0.8529
No log 0.6667 24 0.7219 0.0937 0.7219 0.8497
No log 0.7222 26 0.7166 0.0428 0.7166 0.8465
No log 0.7778 28 0.7286 0.0295 0.7286 0.8536
No log 0.8333 30 0.7940 0.1142 0.7940 0.8911
No log 0.8889 32 0.8344 -0.0269 0.8344 0.9135
No log 0.9444 34 0.8587 -0.0269 0.8587 0.9266
No log 1.0 36 0.8241 0.0129 0.8241 0.9078
No log 1.0556 38 0.7394 0.0393 0.7394 0.8599
No log 1.1111 40 0.7009 0.1327 0.7009 0.8372
No log 1.1667 42 0.7203 0.0846 0.7203 0.8487
No log 1.2222 44 0.7328 0.0376 0.7328 0.8560
No log 1.2778 46 0.7769 0.1400 0.7769 0.8814
No log 1.3333 48 0.7690 0.1400 0.7690 0.8769
No log 1.3889 50 0.7785 0.2751 0.7785 0.8823
No log 1.4444 52 0.7155 0.2413 0.7155 0.8459
No log 1.5 54 0.9690 0.2643 0.9690 0.9844
No log 1.5556 56 0.9173 0.2697 0.9173 0.9577
No log 1.6111 58 0.6743 0.3408 0.6743 0.8212
No log 1.6667 60 0.7050 0.3665 0.7050 0.8396
No log 1.7222 62 0.8148 0.3305 0.8148 0.9027
No log 1.7778 64 1.0670 0.2999 1.0670 1.0330
No log 1.8333 66 1.1266 0.1935 1.1266 1.0614
No log 1.8889 68 1.2371 0.1002 1.2371 1.1123
No log 1.9444 70 1.4000 0.1401 1.4000 1.1832
No log 2.0 72 1.0699 0.2554 1.0699 1.0344
No log 2.0556 74 0.8151 0.2953 0.8151 0.9028
No log 2.1111 76 0.7686 0.3031 0.7686 0.8767
No log 2.1667 78 0.7795 0.3408 0.7795 0.8829
No log 2.2222 80 1.0362 0.2643 1.0362 1.0180
No log 2.2778 82 1.2021 0.1109 1.2021 1.0964
No log 2.3333 84 1.3732 -0.0991 1.3732 1.1719
No log 2.3889 86 1.3007 -0.0257 1.3007 1.1405
No log 2.4444 88 0.9696 0.1766 0.9696 0.9847
No log 2.5 90 0.7767 0.0930 0.7767 0.8813
No log 2.5556 92 0.7666 0.2116 0.7666 0.8755
No log 2.6111 94 0.6899 0.2327 0.6899 0.8306
No log 2.6667 96 0.7055 0.3155 0.7055 0.8400
No log 2.7222 98 0.8247 0.3371 0.8247 0.9082
No log 2.7778 100 0.8312 0.3371 0.8312 0.9117
No log 2.8333 102 0.7184 0.4165 0.7184 0.8476
No log 2.8889 104 0.6760 0.3581 0.6760 0.8222
No log 2.9444 106 0.6782 0.3556 0.6782 0.8235
No log 3.0 108 0.8701 0.3623 0.8701 0.9328
No log 3.0556 110 1.1766 0.2115 1.1766 1.0847
No log 3.1111 112 1.0801 0.2682 1.0801 1.0393
No log 3.1667 114 0.7772 0.3630 0.7772 0.8816
No log 3.2222 116 0.7059 0.2994 0.7059 0.8402
No log 3.2778 118 0.7861 0.3505 0.7861 0.8866
No log 3.3333 120 0.7452 0.3316 0.7452 0.8632
No log 3.3889 122 0.6979 0.2684 0.6979 0.8354
No log 3.4444 124 0.7901 0.3564 0.7901 0.8889
No log 3.5 126 0.8911 0.3938 0.8911 0.9440
No log 3.5556 128 0.9006 0.3938 0.9006 0.9490
No log 3.6111 130 0.7964 0.3630 0.7964 0.8924
No log 3.6667 132 0.7308 0.3060 0.7308 0.8549
No log 3.7222 134 0.7263 0.3116 0.7263 0.8522
No log 3.7778 136 0.7919 0.2812 0.7919 0.8899
No log 3.8333 138 0.9654 0.3439 0.9654 0.9826
No log 3.8889 140 1.0089 0.2948 1.0089 1.0044
No log 3.9444 142 0.8781 0.3008 0.8781 0.9371
No log 4.0 144 0.7780 0.2784 0.7780 0.8821
No log 4.0556 146 0.7882 0.2784 0.7882 0.8878
No log 4.1111 148 0.9037 0.2726 0.9037 0.9506
No log 4.1667 150 1.0446 0.3029 1.0446 1.0221
No log 4.2222 152 1.0161 0.2806 1.0161 1.0080
No log 4.2778 154 0.8930 0.2297 0.8930 0.9450
No log 4.3333 156 0.8393 0.2109 0.8393 0.9161
No log 4.3889 158 0.8415 0.3504 0.8415 0.9173
No log 4.4444 160 0.8754 0.2808 0.8754 0.9356
No log 4.5 162 0.9574 0.2626 0.9574 0.9785
No log 4.5556 164 1.0390 0.3082 1.0390 1.0193
No log 4.6111 166 0.9985 0.3082 0.9985 0.9992
No log 4.6667 168 0.9033 0.2923 0.9033 0.9504
No log 4.7222 170 0.7897 0.2815 0.7897 0.8886
No log 4.7778 172 0.7981 0.3243 0.7981 0.8933
No log 4.8333 174 0.7847 0.2916 0.7847 0.8859
No log 4.8889 176 0.7935 0.2319 0.7935 0.8908
No log 4.9444 178 0.8450 0.2063 0.8450 0.9192
No log 5.0 180 0.8984 0.1867 0.8984 0.9478
No log 5.0556 182 0.9091 0.1867 0.9091 0.9535
No log 5.1111 184 0.8617 0.1718 0.8617 0.9283
No log 5.1667 186 0.8314 0.1866 0.8314 0.9118
No log 5.2222 188 0.9006 0.2142 0.9006 0.9490
No log 5.2778 190 0.9413 0.2358 0.9413 0.9702
No log 5.3333 192 1.0603 0.2119 1.0603 1.0297
No log 5.3889 194 1.0560 0.2119 1.0560 1.0276
No log 5.4444 196 0.9439 0.2358 0.9439 0.9715
No log 5.5 198 0.8839 0.2244 0.8839 0.9402
No log 5.5556 200 0.9139 0.2142 0.9139 0.9560
No log 5.6111 202 1.0762 0.2810 1.0762 1.0374
No log 5.6667 204 1.2509 0.2084 1.2509 1.1184
No log 5.7222 206 1.3340 0.1458 1.3340 1.1550
No log 5.7778 208 1.2185 0.2392 1.2185 1.1039
No log 5.8333 210 1.1307 0.2075 1.1308 1.0634
No log 5.8889 212 1.0217 0.2460 1.0217 1.0108
No log 5.9444 214 0.9792 0.2810 0.9792 0.9895
No log 6.0 216 0.9606 0.2810 0.9606 0.9801
No log 6.0556 218 1.0180 0.2460 1.0180 1.0090
No log 6.1111 220 1.1083 0.2833 1.1083 1.0527
No log 6.1667 222 1.0692 0.2833 1.0692 1.0340
No log 6.2222 224 0.9658 0.3347 0.9658 0.9828
No log 6.2778 226 0.9428 0.2810 0.9428 0.9710
No log 6.3333 228 0.9491 0.2670 0.9491 0.9742
No log 6.3889 230 0.9384 0.1962 0.9384 0.9687
No log 6.4444 232 1.0177 0.2670 1.0177 1.0088
No log 6.5 234 1.2130 0.1210 1.2130 1.1014
No log 6.5556 236 1.4492 0.1017 1.4492 1.2038
No log 6.6111 238 1.4265 0.1305 1.4265 1.1944
No log 6.6667 240 1.1891 0.1176 1.1891 1.0905
No log 6.7222 242 0.9636 0.2244 0.9636 0.9816
No log 6.7778 244 0.9437 0.2193 0.9437 0.9714
No log 6.8333 246 1.0623 0.1869 1.0623 1.0307
No log 6.8889 248 1.1809 0.2412 1.1809 1.0867
No log 6.9444 250 1.1977 0.2412 1.1977 1.0944
No log 7.0 252 1.2391 0.1427 1.2391 1.1132
No log 7.0556 254 1.1192 0.2482 1.1192 1.0579
No log 7.1111 256 0.9304 0.1765 0.9304 0.9646
No log 7.1667 258 0.7651 0.2085 0.7651 0.8747
No log 7.2222 260 0.7341 0.2270 0.7341 0.8568
No log 7.2778 262 0.7660 0.2027 0.7660 0.8752
No log 7.3333 264 0.9490 0.1962 0.9490 0.9741
No log 7.3889 266 1.1571 0.2437 1.1571 1.0757
No log 7.4444 268 1.1808 0.2501 1.1808 1.0866
No log 7.5 270 1.0804 0.3169 1.0804 1.0394
No log 7.5556 272 0.9822 0.2193 0.9822 0.9911
No log 7.6111 274 1.0214 0.2358 1.0214 1.0106
No log 7.6667 276 1.1094 0.3137 1.1094 1.0533
No log 7.7222 278 1.0554 0.3193 1.0554 1.0273
No log 7.7778 280 0.9940 0.3473 0.9940 0.9970
No log 7.8333 282 0.9359 0.3538 0.9359 0.9674
No log 7.8889 284 0.8614 0.2632 0.8614 0.9281
No log 7.9444 286 0.8337 0.2692 0.8337 0.9131
No log 8.0 288 0.8500 0.2754 0.8500 0.9220
No log 8.0556 290 0.9329 0.2574 0.9329 0.9659
No log 8.1111 292 1.0060 0.2410 1.0060 1.0030
No log 8.1667 294 1.0595 0.2211 1.0595 1.0293
No log 8.2222 296 1.0532 0.2211 1.0532 1.0262
No log 8.2778 298 1.0128 0.2615 1.0128 1.0064
No log 8.3333 300 0.9885 0.2726 0.9885 0.9943
No log 8.3889 302 1.0392 0.1869 1.0392 1.0194
No log 8.4444 304 0.9998 0.1955 0.9998 0.9999
No log 8.5 306 0.9767 0.1955 0.9767 0.9883
No log 8.5556 308 0.9064 0.2967 0.9064 0.9520
No log 8.6111 310 0.8099 0.3167 0.8099 0.9000
No log 8.6667 312 0.8423 0.3032 0.8423 0.9178
No log 8.7222 314 0.9862 0.1955 0.9862 0.9931
No log 8.7778 316 1.0790 0.2227 1.0790 1.0388
No log 8.8333 318 1.0292 0.2211 1.0292 1.0145
No log 8.8889 320 0.8712 0.3032 0.8712 0.9334
No log 8.9444 322 0.8278 0.3099 0.8278 0.9099
No log 9.0 324 0.9039 0.2574 0.9039 0.9507
No log 9.0556 326 1.1001 0.1747 1.1001 1.0489
No log 9.1111 328 1.2890 0.1782 1.2890 1.1353
No log 9.1667 330 1.2317 0.1713 1.2317 1.1098
No log 9.2222 332 1.0792 0.1535 1.0792 1.0388
No log 9.2778 334 1.0132 0.2410 1.0132 1.0066
No log 9.3333 336 1.0070 0.2142 1.0070 1.0035
No log 9.3889 338 0.9518 0.2574 0.9518 0.9756
No log 9.4444 340 0.9171 0.2574 0.9171 0.9576
No log 9.5 342 0.9966 0.2142 0.9966 0.9983
No log 9.5556 344 1.1289 0.0922 1.1289 1.0625
No log 9.6111 346 1.1410 0.1428 1.1410 1.0682
No log 9.6667 348 1.0579 0.1651 1.0579 1.0285
No log 9.7222 350 0.9280 0.2574 0.9280 0.9633
No log 9.7778 352 0.8153 0.2817 0.8153 0.9029
No log 9.8333 354 0.8298 0.3099 0.8298 0.9109
No log 9.8889 356 0.9111 0.2632 0.9111 0.9545
No log 9.9444 358 1.0498 0.1274 1.0498 1.0246
No log 10.0 360 1.1423 0.0925 1.1423 1.0688
No log 10.0556 362 1.1792 0.1428 1.1792 1.0859
No log 10.1111 364 1.1351 0.1671 1.1351 1.0654
No log 10.1667 366 0.9722 0.1651 0.9722 0.9860
No log 10.2222 368 0.9110 0.2632 0.9110 0.9545
No log 10.2778 370 0.8750 0.3099 0.8750 0.9354
No log 10.3333 372 0.8906 0.2632 0.8906 0.9437
No log 10.3889 374 0.9819 0.2000 0.9819 0.9909
No log 10.4444 376 1.1085 0.2504 1.1085 1.0529
No log 10.5 378 1.1522 0.2183 1.1522 1.0734
No log 10.5556 380 1.1208 0.0896 1.1208 1.0587
No log 10.6111 382 1.0033 0.2358 1.0033 1.0016
No log 10.6667 384 0.9860 0.2046 0.9860 0.9930
No log 10.7222 386 1.0337 0.1651 1.0337 1.0167
No log 10.7778 388 1.1224 0.1210 1.1224 1.0594
No log 10.8333 390 1.1100 0.0894 1.1100 1.0536
No log 10.8889 392 1.1468 0.1147 1.1468 1.0709
No log 10.9444 394 1.2221 0.1028 1.2221 1.1055
No log 11.0 396 1.1631 0.0812 1.1631 1.0785
No log 11.0556 398 1.0077 0.1612 1.0077 1.0038
No log 11.1111 400 0.9431 0.2410 0.9431 0.9711
No log 11.1667 402 0.9066 0.2012 0.9066 0.9522
No log 11.2222 404 0.8678 0.2171 0.8678 0.9315
No log 11.2778 406 0.9077 0.2754 0.9077 0.9528
No log 11.3333 408 1.0498 0.0953 1.0498 1.0246
No log 11.3889 410 1.2073 0.1230 1.2073 1.0988
No log 11.4444 412 1.1888 0.1003 1.1888 1.0903
No log 11.5 414 1.0647 0.1528 1.0647 1.0318
No log 11.5556 416 0.9555 0.2518 0.9555 0.9775
No log 11.6111 418 0.8907 0.2692 0.8907 0.9438
No log 11.6667 420 0.8318 0.2692 0.8318 0.9120
No log 11.7222 422 0.8662 0.2692 0.8662 0.9307
No log 11.7778 424 0.9815 0.2410 0.9815 0.9907
No log 11.8333 426 1.1288 0.1870 1.1288 1.0625
No log 11.8889 428 1.1478 0.1870 1.1478 1.0713
No log 11.9444 430 1.0221 0.1984 1.0221 1.0110
No log 12.0 432 0.8598 0.2409 0.8598 0.9272
No log 12.0556 434 0.8056 0.2467 0.8056 0.8975
No log 12.1111 436 0.8468 0.2409 0.8468 0.9202
No log 12.1667 438 1.0492 0.1274 1.0492 1.0243
No log 12.2222 440 1.4516 0.1335 1.4516 1.2048
No log 12.2778 442 1.9150 0.0734 1.9150 1.3838
No log 12.3333 444 1.9615 0.1082 1.9615 1.4005
No log 12.3889 446 1.6933 0.0389 1.6933 1.3013
No log 12.4444 448 1.2755 0.1748 1.2755 1.1294
No log 12.5 450 0.9716 0.2463 0.9716 0.9857
No log 12.5556 452 0.8897 0.2754 0.8897 0.9432
No log 12.6111 454 0.9333 0.2518 0.9333 0.9661
No log 12.6667 456 1.0123 0.1955 1.0123 1.0061
No log 12.7222 458 1.0906 0.2412 1.0906 1.0443
No log 12.7778 460 1.0808 0.3059 1.0808 1.0396
No log 12.8333 462 1.0398 0.2651 1.0398 1.0197
No log 12.8889 464 0.9541 0.2410 0.9541 0.9768
No log 12.9444 466 0.9342 0.2843 0.9342 0.9665
No log 13.0 468 0.8926 0.3099 0.8926 0.9448
No log 13.0556 470 0.8552 0.2883 0.8552 0.9247
No log 13.1111 472 0.8644 0.2883 0.8644 0.9298
No log 13.1667 474 0.8860 0.2817 0.8860 0.9413
No log 13.2222 476 0.8897 0.3099 0.8897 0.9432
No log 13.2778 478 0.9137 0.3099 0.9137 0.9559
No log 13.3333 480 0.9541 0.2244 0.9541 0.9768
No log 13.3889 482 0.9717 0.2410 0.9717 0.9858
No log 13.4444 484 1.0191 0.1651 1.0191 1.0095
No log 13.5 486 1.0779 0.2756 1.0779 1.0382
No log 13.5556 488 1.1276 0.2271 1.1276 1.0619
No log 13.6111 490 1.0860 0.2211 1.0860 1.0421
No log 13.6667 492 1.0489 0.2211 1.0489 1.0241
No log 13.7222 494 1.0026 0.2259 1.0026 1.0013
No log 13.7778 496 0.9486 0.2142 0.9486 0.9740
No log 13.8333 498 0.9370 0.2518 0.9370 0.9680
0.2784 13.8889 500 0.9368 0.2518 0.9368 0.9679
0.2784 13.9444 502 0.9411 0.2410 0.9411 0.9701
0.2784 14.0 504 0.9108 0.2843 0.9108 0.9543
0.2784 14.0556 506 0.8878 0.2632 0.8878 0.9422
0.2784 14.1111 508 0.9087 0.3425 0.9087 0.9533
0.2784 14.1667 510 0.9216 0.3425 0.9216 0.9600
0.2784 14.2222 512 0.8934 0.2843 0.8934 0.9452
0.2784 14.2778 514 0.8573 0.2754 0.8573 0.9259
0.2784 14.3333 516 0.8406 0.2817 0.8406 0.9168
0.2784 14.3889 518 0.8511 0.2754 0.8511 0.9225
0.2784 14.4444 520 0.8435 0.2754 0.8435 0.9184
0.2784 14.5 522 0.8895 0.2632 0.8895 0.9432
0.2784 14.5556 524 0.9061 0.2632 0.9061 0.9519
0.2784 14.6111 526 0.8464 0.2817 0.8464 0.9200
0.2784 14.6667 528 0.8165 0.2950 0.8165 0.9036
0.2784 14.7222 530 0.8079 0.2950 0.8079 0.8988
0.2784 14.7778 532 0.8283 0.2883 0.8283 0.9101
0.2784 14.8333 534 0.8478 0.2883 0.8478 0.9208
0.2784 14.8889 536 0.8530 0.2883 0.8530 0.9236
0.2784 14.9444 538 0.8483 0.2883 0.8483 0.9210
0.2784 15.0 540 0.8680 0.2883 0.8680 0.9317
0.2784 15.0556 542 0.8580 0.2883 0.8580 0.9263
0.2784 15.1111 544 0.8494 0.2883 0.8494 0.9216
0.2784 15.1667 546 0.9052 0.2632 0.9052 0.9514
0.2784 15.2222 548 0.9444 0.2193 0.9444 0.9718
0.2784 15.2778 550 0.8930 0.2692 0.8930 0.9450
0.2784 15.3333 552 0.8648 0.2409 0.8648 0.9299
0.2784 15.3889 554 0.8311 0.2883 0.8311 0.9116
0.2784 15.4444 556 0.8137 0.2883 0.8137 0.9021
0.2784 15.5 558 0.8499 0.2817 0.8499 0.9219
0.2784 15.5556 560 0.9351 0.2193 0.9351 0.9670
0.2784 15.6111 562 1.0795 0.2833 1.0795 1.0390
0.2784 15.6667 564 1.2188 0.1784 1.2188 1.1040
0.2784 15.7222 566 1.2106 0.1784 1.2106 1.1003
0.2784 15.7778 568 1.0599 0.1775 1.0599 1.0295
0.2784 15.8333 570 0.8991 0.2352 0.8991 0.9482
0.2784 15.8889 572 0.8064 0.2527 0.8064 0.8980
0.2784 15.9444 574 0.8060 0.2527 0.8060 0.8978
0.2784 16.0 576 0.8891 0.2297 0.8891 0.9429
0.2784 16.0556 578 0.9698 0.2211 0.9698 0.9848
0.2784 16.1111 580 1.0916 0.2504 1.0916 1.0448
0.2784 16.1667 582 1.2111 0.1832 1.2111 1.1005
0.2784 16.2222 584 1.2621 0.2059 1.2621 1.1234

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
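The environment above can be approximately recreated by pinning these versions with pip; this is a sketch, and the CUDA 11.8 wheel index is an assumption to match the "+cu118" build tag (adjust for your own CUDA setup):

```shell
# Hedged sketch: pinning the framework versions listed above.
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
# "+cu118" suggests the CUDA 11.8 PyTorch wheel index; adjust as needed.
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```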
Model size

0.1B parameters, stored as Safetensors in F32.

Model tree

This model (MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k15_task7_organization) is one of 4019 fine-tunes of aubmindlab/bert-base-arabertv02.