ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card's auto-generated "None" placeholder). It achieves the following results on the evaluation set:

  • Loss: 0.7706
  • Qwk: 0.2353
  • Mse: 0.7706
  • Rmse: 0.8778
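These are standard ordinal-scoring diagnostics: Qwk is quadratic weighted kappa, and RMSE is the square root of the MSE (note that Loss and Mse coincide above, suggesting an MSE training objective). A minimal sketch of how they can be computed, assuming integer organization scores (the exact label range is not stated in the card):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes=None):
    """Quadratic weighted kappa (Qwk) between integer ratings."""
    y_true = np.asarray(y_true, dtype=int)
    y_pred = np.asarray(y_pred, dtype=int)
    n = n_classes or int(max(y_true.max(), y_pred.max())) + 1
    # Observed agreement (confusion) matrix
    O = np.zeros((n, n))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic disagreement weights, normalized by the max squared distance
    w = np.array([[(i - j) ** 2 for j in range(n)] for i in range(n)], dtype=float)
    w /= (n - 1) ** 2
    # Expected matrix under independence of the two raters' marginals
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (w * O).sum() / (w * E).sum()

def mse(y_true, y_pred):
    d = np.asarray(y_true, float) - np.asarray(y_pred, float)
    return float(np.mean(d ** 2))

def rmse(y_true, y_pred):
    return float(np.sqrt(mse(y_true, y_pred)))
```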

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
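With a linear scheduler and no warmup steps listed, the learning rate decays from 2e-05 to zero over the total step count. From the log below, one epoch is 45 optimizer steps (step 450 falls at epoch 10.0), so 100 epochs would be 4500 steps; a sketch of the schedule under those inferred assumptions:

```python
def linear_lr(step, total_steps, base_lr=2e-5):
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup assumed)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# Inferred from the training log: 45 steps/epoch * 100 epochs
TOTAL_STEPS = 45 * 100
```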

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0444 2 2.6359 -0.0262 2.6359 1.6235
No log 0.0889 4 1.3851 0.0750 1.3851 1.1769
No log 0.1333 6 1.2984 -0.1983 1.2984 1.1395
No log 0.1778 8 1.2589 -0.2787 1.2589 1.1220
No log 0.2222 10 1.1997 -0.1356 1.1997 1.0953
No log 0.2667 12 1.2070 -0.1705 1.2070 1.0986
No log 0.3111 14 1.3576 -0.1384 1.3576 1.1652
No log 0.3556 16 1.6718 0.0289 1.6718 1.2930
No log 0.4 18 1.5585 0.0278 1.5585 1.2484
No log 0.4444 20 1.1516 -0.0961 1.1516 1.0731
No log 0.4889 22 0.9965 -0.0288 0.9965 0.9982
No log 0.5333 24 0.8746 0.0725 0.8746 0.9352
No log 0.5778 26 0.8535 0.0410 0.8535 0.9238
No log 0.6222 28 0.8574 0.0937 0.8574 0.9260
No log 0.6667 30 0.8894 0.0509 0.8894 0.9431
No log 0.7111 32 0.8933 0.0947 0.8933 0.9452
No log 0.7556 34 0.9631 0.0609 0.9631 0.9814
No log 0.8 36 1.0418 0.0329 1.0418 1.0207
No log 0.8444 38 1.1758 0.0095 1.1758 1.0843
No log 0.8889 40 1.1874 0.0126 1.1874 1.0897
No log 0.9333 42 0.9553 -0.0484 0.9553 0.9774
No log 0.9778 44 0.8113 0.1737 0.8113 0.9007
No log 1.0222 46 0.9473 0.2651 0.9473 0.9733
No log 1.0667 48 1.0228 0.2757 1.0228 1.0113
No log 1.1111 50 1.0274 0.2543 1.0274 1.0136
No log 1.1556 52 0.8946 0.2063 0.8946 0.9458
No log 1.2 54 0.7827 0.0757 0.7827 0.8847
No log 1.2444 56 0.8118 0.0804 0.8118 0.9010
No log 1.2889 58 0.9449 0.0506 0.9449 0.9721
No log 1.3333 60 1.1381 0.0661 1.1381 1.0668
No log 1.3778 62 1.1565 0.0643 1.1565 1.0754
No log 1.4222 64 1.0107 0.2092 1.0107 1.0054
No log 1.4667 66 0.9388 0.2034 0.9388 0.9689
No log 1.5111 68 0.9827 0.2118 0.9827 0.9913
No log 1.5556 70 1.0083 0.1460 1.0083 1.0041
No log 1.6 72 0.9885 0.0823 0.9885 0.9942
No log 1.6444 74 1.0197 0.1153 1.0197 1.0098
No log 1.6889 76 0.9375 0.0813 0.9375 0.9682
No log 1.7333 78 0.8498 0.0 0.8498 0.9218
No log 1.7778 80 0.8160 0.1440 0.8160 0.9033
No log 1.8222 82 0.8221 0.1440 0.8221 0.9067
No log 1.8667 84 0.8814 0.0727 0.8814 0.9388
No log 1.9111 86 1.0439 0.1427 1.0439 1.0217
No log 1.9556 88 1.0433 0.1122 1.0433 1.0214
No log 2.0 90 1.0135 0.0722 1.0135 1.0067
No log 2.0444 92 0.9971 0.1311 0.9971 0.9985
No log 2.0889 94 0.9859 0.1627 0.9859 0.9929
No log 2.1333 96 0.9564 0.1627 0.9564 0.9780
No log 2.1778 98 0.9328 0.1289 0.9328 0.9658
No log 2.2222 100 0.9468 0.1661 0.9468 0.9730
No log 2.2667 102 0.9298 0.1289 0.9298 0.9643
No log 2.3111 104 1.0249 0.1121 1.0249 1.0124
No log 2.3556 106 0.9742 0.1419 0.9742 0.9870
No log 2.4 108 0.8918 0.1410 0.8918 0.9443
No log 2.4444 110 0.8916 0.1455 0.8916 0.9443
No log 2.4889 112 0.8683 0.1684 0.8683 0.9318
No log 2.5333 114 0.8725 0.1986 0.8725 0.9341
No log 2.5778 116 0.8877 0.2279 0.8877 0.9422
No log 2.6222 118 0.9302 0.2192 0.9302 0.9645
No log 2.6667 120 0.9946 0.1801 0.9946 0.9973
No log 2.7111 122 0.9407 0.2053 0.9407 0.9699
No log 2.7556 124 0.9338 0.2256 0.9338 0.9663
No log 2.8 126 0.9154 0.2172 0.9154 0.9568
No log 2.8444 128 0.9244 0.1901 0.9244 0.9615
No log 2.8889 130 0.8507 0.2099 0.8507 0.9223
No log 2.9333 132 0.8177 0.2099 0.8177 0.9043
No log 2.9778 134 0.8226 0.1373 0.8226 0.9070
No log 3.0222 136 0.7946 0.2748 0.7946 0.8914
No log 3.0667 138 0.7981 0.2413 0.7981 0.8934
No log 3.1111 140 0.8030 0.2413 0.8030 0.8961
No log 3.1556 142 0.8052 0.2224 0.8052 0.8973
No log 3.2 144 0.8033 0.2193 0.8033 0.8963
No log 3.2444 146 0.7992 0.2224 0.7992 0.8940
No log 3.2889 148 0.7863 0.2224 0.7863 0.8868
No log 3.3333 150 0.7897 0.2398 0.7897 0.8886
No log 3.3778 152 0.7800 0.1733 0.7800 0.8832
No log 3.4222 154 0.8836 0.3942 0.8836 0.9400
No log 3.4667 156 1.0237 0.2702 1.0237 1.0118
No log 3.5111 158 1.0113 0.3068 1.0113 1.0056
No log 3.5556 160 0.8837 0.3007 0.8837 0.9401
No log 3.6 162 0.9049 0.2287 0.9049 0.9513
No log 3.6444 164 0.9073 0.2929 0.9073 0.9525
No log 3.6889 166 1.0193 0.3115 1.0193 1.0096
No log 3.7333 168 1.3068 0.0996 1.3068 1.1431
No log 3.7778 170 1.2650 0.1270 1.2650 1.1247
No log 3.8222 172 1.0853 0.2066 1.0853 1.0418
No log 3.8667 174 0.9475 0.1627 0.9475 0.9734
No log 3.9111 176 0.9393 0.1361 0.9393 0.9692
No log 3.9556 178 0.9032 0.1935 0.9032 0.9504
No log 4.0 180 0.9443 0.3023 0.9443 0.9718
No log 4.0444 182 1.0410 0.2437 1.0410 1.0203
No log 4.0889 184 1.0042 0.2467 1.0042 1.0021
No log 4.1333 186 0.9166 0.3015 0.9166 0.9574
No log 4.1778 188 0.9054 0.3015 0.9054 0.9515
No log 4.2222 190 0.9717 0.2576 0.9717 0.9858
No log 4.2667 192 0.9867 0.3746 0.9867 0.9933
No log 4.3111 194 0.8844 0.2359 0.8844 0.9404
No log 4.3556 196 0.8209 0.2302 0.8209 0.9061
No log 4.4 198 0.8173 0.1970 0.8173 0.9041
No log 4.4444 200 0.9371 0.2294 0.9371 0.9680
No log 4.4889 202 1.2527 0.1814 1.2527 1.1192
No log 4.5333 204 1.2824 0.2062 1.2824 1.1324
No log 4.5778 206 1.0096 0.2728 1.0096 1.0048
No log 4.6222 208 0.8085 0.2413 0.8085 0.8992
No log 4.6667 210 0.9250 0.1158 0.9250 0.9618
No log 4.7111 212 0.9316 0.1158 0.9316 0.9652
No log 4.7556 214 0.8184 0.1746 0.8184 0.9047
No log 4.8 216 0.8481 0.2383 0.8481 0.9209
No log 4.8444 218 0.9099 0.2697 0.9099 0.9539
No log 4.8889 220 0.8584 0.2692 0.8584 0.9265
No log 4.9333 222 0.8381 0.2652 0.8381 0.9155
No log 4.9778 224 0.8421 0.2717 0.8421 0.9177
No log 5.0222 226 0.8487 0.0694 0.8487 0.9212
No log 5.0667 228 0.8518 0.0734 0.8518 0.9229
No log 5.1111 230 0.8451 0.2023 0.8451 0.9193
No log 5.1556 232 0.8862 0.2780 0.8862 0.9414
No log 5.2 234 0.9131 0.2920 0.9131 0.9556
No log 5.2444 236 0.9344 0.2914 0.9344 0.9667
No log 5.2889 238 0.9482 0.2400 0.9482 0.9738
No log 5.3333 240 0.9596 0.2920 0.9596 0.9796
No log 5.3778 242 0.9901 0.2236 0.9901 0.9950
No log 5.4222 244 0.9813 0.2471 0.9813 0.9906
No log 5.4667 246 0.9246 0.2544 0.9246 0.9616
No log 5.5111 248 0.8723 0.2535 0.8723 0.9340
No log 5.5556 250 0.8629 0.1741 0.8629 0.9289
No log 5.6 252 0.8369 0.1052 0.8369 0.9148
No log 5.6444 254 0.8093 0.2386 0.8093 0.8996
No log 5.6889 256 0.8478 0.3471 0.8478 0.9208
No log 5.7333 258 0.8445 0.3224 0.8445 0.9190
No log 5.7778 260 0.8165 0.2413 0.8165 0.9036
No log 5.8222 262 0.8475 0.1127 0.8475 0.9206
No log 5.8667 264 0.8828 0.1201 0.8828 0.9396
No log 5.9111 266 0.8408 0.1440 0.8408 0.9169
No log 5.9556 268 0.8235 0.2413 0.8235 0.9075
No log 6.0 270 0.8147 0.2413 0.8147 0.9026
No log 6.0444 272 0.8096 0.1393 0.8096 0.8998
No log 6.0889 274 0.8162 0.0349 0.8162 0.9034
No log 6.1333 276 0.8120 0.1012 0.8120 0.9011
No log 6.1778 278 0.8109 0.2023 0.8109 0.9005
No log 6.2222 280 0.8220 0.2023 0.8220 0.9067
No log 6.2667 282 0.8290 0.2023 0.8290 0.9105
No log 6.3111 284 0.8265 0.2023 0.8265 0.9091
No log 6.3556 286 0.8296 0.2023 0.8296 0.9108
No log 6.4 288 0.8760 0.2161 0.8760 0.9359
No log 6.4444 290 0.9703 0.2518 0.9703 0.9850
No log 6.4889 292 0.9586 0.2328 0.9586 0.9791
No log 6.5333 294 0.8941 0.2349 0.8941 0.9456
No log 6.5778 296 0.9063 0.2985 0.9063 0.9520
No log 6.6222 298 0.9112 0.1089 0.9112 0.9546
No log 6.6667 300 0.8918 0.0790 0.8918 0.9443
No log 6.7111 302 0.8200 0.1052 0.8200 0.9055
No log 6.7556 304 0.7945 0.2907 0.7945 0.8914
No log 6.8 306 0.8321 0.3116 0.8321 0.9122
No log 6.8444 308 0.8312 0.3116 0.8312 0.9117
No log 6.8889 310 0.7955 0.2717 0.7955 0.8919
No log 6.9333 312 0.8168 0.2843 0.8168 0.9038
No log 6.9778 314 0.8287 0.2530 0.8287 0.9103
No log 7.0222 316 0.8406 0.2913 0.8406 0.9168
No log 7.0667 318 0.8525 0.2847 0.8525 0.9233
No log 7.1111 320 0.8395 0.2843 0.8395 0.9162
No log 7.1556 322 0.8630 0.2953 0.8630 0.9290
No log 7.2 324 0.8986 0.2751 0.8986 0.9480
No log 7.2444 326 0.9044 0.3001 0.9044 0.9510
No log 7.2889 328 0.8522 0.3088 0.8522 0.9231
No log 7.3333 330 0.8280 0.2907 0.8280 0.9100
No log 7.3778 332 0.7823 0.1723 0.7823 0.8845
No log 7.4222 334 0.7785 0.1661 0.7785 0.8823
No log 7.4667 336 0.7891 0.2043 0.7891 0.8883
No log 7.5111 338 0.7788 0.2043 0.7788 0.8825
No log 7.5556 340 0.7739 0.2043 0.7739 0.8797
No log 7.6 342 0.7910 0.3011 0.7910 0.8894
No log 7.6444 344 0.7786 0.2353 0.7786 0.8824
No log 7.6889 346 0.7582 0.2043 0.7582 0.8707
No log 7.7333 348 0.7902 0.3011 0.7902 0.8889
No log 7.7778 350 0.8527 0.3382 0.8527 0.9234
No log 7.8222 352 0.8486 0.2943 0.8486 0.9212
No log 7.8667 354 0.8267 0.1986 0.8267 0.9093
No log 7.9111 356 0.8467 0.2053 0.8467 0.9202
No log 7.9556 358 0.8351 0.1684 0.8351 0.9138
No log 8.0 360 0.8617 0.2043 0.8617 0.9283
No log 8.0444 362 0.9528 0.2722 0.9528 0.9761
No log 8.0889 364 0.9532 0.3099 0.9532 0.9763
No log 8.1333 366 0.9081 0.2751 0.9081 0.9529
No log 8.1778 368 0.9283 0.2751 0.9283 0.9635
No log 8.2222 370 0.9211 0.3155 0.9211 0.9598
No log 8.2667 372 0.8999 0.2877 0.8999 0.9486
No log 8.3111 374 0.9076 0.3248 0.9076 0.9527
No log 8.3556 376 0.8780 0.2893 0.8780 0.9370
No log 8.4 378 0.8567 0.2907 0.8567 0.9256
No log 8.4444 380 0.8766 0.2866 0.8766 0.9363
No log 8.4889 382 0.9326 0.2467 0.9326 0.9657
No log 8.5333 384 0.9620 0.2975 0.9620 0.9808
No log 8.5778 386 0.8712 0.3238 0.8712 0.9334
No log 8.6222 388 0.8082 0.2413 0.8082 0.8990
No log 8.6667 390 0.8098 0.2413 0.8098 0.8999
No log 8.7111 392 0.8454 0.2590 0.8454 0.9194
No log 8.7556 394 1.0286 0.1434 1.0286 1.0142
No log 8.8 396 1.1765 0.2363 1.1765 1.0847
No log 8.8444 398 1.1085 0.1736 1.1085 1.0528
No log 8.8889 400 0.9298 0.2283 0.9298 0.9643
No log 8.9333 402 0.8502 0.2715 0.8502 0.9221
No log 8.9778 404 0.8415 0.1684 0.8415 0.9174
No log 9.0222 406 0.8471 0.2748 0.8471 0.9204
No log 9.0667 408 0.8932 0.2751 0.8932 0.9451
No log 9.1111 410 0.9457 0.2547 0.9457 0.9725
No log 9.1556 412 0.9549 0.2843 0.9549 0.9772
No log 9.2 414 0.8703 0.3372 0.8703 0.9329
No log 9.2444 416 0.8513 0.2817 0.8513 0.9226
No log 9.2889 418 0.9158 0.3372 0.9158 0.9570
No log 9.3333 420 0.9539 0.3234 0.9539 0.9767
No log 9.3778 422 0.9717 0.2949 0.9717 0.9858
No log 9.4222 424 0.9354 0.3372 0.9354 0.9672
No log 9.4667 426 0.8960 0.2995 0.8960 0.9466
No log 9.5111 428 0.8889 0.2839 0.8889 0.9428
No log 9.5556 430 0.8407 0.3425 0.8407 0.9169
No log 9.6 432 0.8575 0.3121 0.8575 0.9260
No log 9.6444 434 0.8828 0.2839 0.8828 0.9395
No log 9.6889 436 0.8961 0.2839 0.8961 0.9467
No log 9.7333 438 0.8904 0.2839 0.8904 0.9436
No log 9.7778 440 0.8323 0.3127 0.8323 0.9123
No log 9.8222 442 0.7814 0.3324 0.7814 0.8840
No log 9.8667 444 0.7656 0.2685 0.7656 0.8750
No log 9.9111 446 0.7718 0.2981 0.7718 0.8785
No log 9.9556 448 0.7809 0.2981 0.7809 0.8837
No log 10.0 450 0.7428 0.2007 0.7428 0.8618
No log 10.0444 452 0.7462 0.1353 0.7462 0.8638
No log 10.0889 454 0.7739 0.2319 0.7739 0.8797
No log 10.1333 456 0.7455 0.1723 0.7455 0.8634
No log 10.1778 458 0.7119 0.1813 0.7119 0.8437
No log 10.2222 460 0.7088 0.1813 0.7088 0.8419
No log 10.2667 462 0.7322 0.2043 0.7322 0.8557
No log 10.3111 464 0.7916 0.2981 0.7916 0.8897
No log 10.3556 466 0.8579 0.2912 0.8579 0.9262
No log 10.4 468 0.8193 0.3088 0.8193 0.9052
No log 10.4444 470 0.7763 0.2652 0.7763 0.8811
No log 10.4889 472 0.7550 0.2353 0.7550 0.8689
No log 10.5333 474 0.7726 0.2319 0.7726 0.8790
No log 10.5778 476 0.7849 0.2981 0.7849 0.8860
No log 10.6222 478 0.7883 0.2981 0.7883 0.8879
No log 10.6667 480 0.7736 0.2981 0.7736 0.8795
No log 10.7111 482 0.7171 0.1686 0.7171 0.8468
No log 10.7556 484 0.7142 0.2078 0.7142 0.8451
No log 10.8 486 0.7713 0.1686 0.7713 0.8782
No log 10.8444 488 0.8517 0.2447 0.8517 0.9229
No log 10.8889 490 0.8973 0.3059 0.8973 0.9473
No log 10.9333 492 0.8908 0.2968 0.8908 0.9438
No log 10.9778 494 0.8219 0.2621 0.8219 0.9066
No log 11.0222 496 0.7777 0.1686 0.7777 0.8818
No log 11.0667 498 0.7982 0.2007 0.7982 0.8934
0.3434 11.1111 500 0.8827 0.2592 0.8827 0.9395
0.3434 11.1556 502 0.9414 0.3620 0.9414 0.9703
0.3434 11.2 504 0.8950 0.3343 0.8950 0.9460
0.3434 11.2444 506 0.8087 0.2027 0.8087 0.8993
0.3434 11.2889 508 0.7435 0.2684 0.7435 0.8623
0.3434 11.3333 510 0.7203 0.2684 0.7203 0.8487
0.3434 11.3778 512 0.7357 0.2590 0.7357 0.8578
0.3434 11.4222 514 0.7981 0.4479 0.7981 0.8934
0.3434 11.4667 516 0.8327 0.4067 0.8327 0.9125
0.3434 11.5111 518 0.8258 0.4642 0.8258 0.9088
0.3434 11.5556 520 0.8401 0.4470 0.8401 0.9165
0.3434 11.6 522 0.9063 0.3638 0.9063 0.9520
0.3434 11.6444 524 0.9027 0.4007 0.9027 0.9501
0.3434 11.6889 526 0.8036 0.4491 0.8036 0.8964
0.3434 11.7333 528 0.7125 0.2043 0.7125 0.8441
0.3434 11.7778 530 0.7117 0.2193 0.7117 0.8436
0.3434 11.8222 532 0.7289 0.2193 0.7289 0.8538
0.3434 11.8667 534 0.7373 0.2135 0.7373 0.8587
0.3434 11.9111 536 0.7706 0.2353 0.7706 0.8778
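Note that the headline metrics (Loss 0.7706, Qwk 0.2353) are from the final logged step, not the best checkpoint: validation Qwk peaks at 0.4642 around epoch 11.51. A minimal sketch of scanning the log for the best row, using a few (epoch, validation loss, Qwk) rows transcribed from the table above:

```python
# A few sample rows from the log above; the full log has one row every 2 steps
rows = [
    (9.9556, 0.7809, 0.2981),
    (10.1778, 0.7119, 0.1813),
    (11.5111, 0.8258, 0.4642),
    (11.9111, 0.7706, 0.2353),  # final row: the card's reported eval results
]

best_by_qwk = max(rows, key=lambda r: r[2])   # highest agreement with raters
best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
```

Among these sample rows the two criteria disagree, which is why model selection by Qwk rather than loss can matter for ordinal scoring tasks.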

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32 tensors)