ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 1.0165
  • Qwk (quadratic weighted kappa): 0.1077
  • Mse: 1.0165
  • Rmse: 1.0082
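These metrics can be reproduced from gold scores and model predictions with scikit-learn; a minimal sketch (the label arrays below are hypothetical — the card does not include the evaluation data):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical integer essay scores: gold labels vs. rounded model predictions.
y_true = np.array([0, 1, 2, 1, 3, 2, 0, 1])
y_pred = np.array([0, 2, 2, 1, 2, 2, 1, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # the "Qwk" column
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # Rmse is simply sqrt(Mse): 1.0082 ≈ sqrt(1.0165)

print(qwk, mse, rmse)
```

Note that with integer labels treated as regression targets, Mse and the reported Loss coincide here (1.0165), which suggests an MSE training objective.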

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
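With lr_scheduler_type linear and no warmup reported, the learning rate decays linearly from 2e-05 toward 0 over training. A stdlib-only sketch of that schedule (total_steps is an assumption — the card does not state the total number of optimizer steps):

```python
def linear_lr(step: int, base_lr: float = 2e-5, total_steps: int = 3100) -> float:
    """Linearly decay base_lr to 0 over total_steps, with no warmup phase."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # base learning rate at the first step
print(linear_lr(3100))  # fully decayed to 0.0 at the end of training
```

The default 3100 here assumes 31 optimizer steps per epoch for 100 epochs, which matches the step/epoch ratio in the results table below, but is not stated explicitly in the card.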

Training results

Evaluation ran every 2 optimizer steps. "No log" in the training-loss column means the running training loss had not yet been logged; the first logged value (0.2957) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0645 2 2.5689 -0.0758 2.5689 1.6028
No log 0.1290 4 1.3470 0.0985 1.3470 1.1606
No log 0.1935 6 1.2992 -0.1871 1.2992 1.1398
No log 0.2581 8 0.9648 -0.1569 0.9648 0.9823
No log 0.3226 10 0.8289 0.0313 0.8289 0.9105
No log 0.3871 12 0.7585 0.0846 0.7585 0.8709
No log 0.4516 14 0.7376 0.0444 0.7376 0.8589
No log 0.5161 16 0.7601 0.0393 0.7601 0.8718
No log 0.5806 18 0.7828 0.1456 0.7828 0.8847
No log 0.6452 20 0.7168 0.2181 0.7168 0.8467
No log 0.7097 22 0.6717 0.1660 0.6717 0.8196
No log 0.7742 24 0.7674 0.2846 0.7674 0.8760
No log 0.8387 26 0.7903 0.2087 0.7903 0.8890
No log 0.9032 28 0.7985 0.1739 0.7985 0.8936
No log 0.9677 30 0.7911 0.0522 0.7911 0.8894
No log 1.0323 32 0.7372 0.0937 0.7372 0.8586
No log 1.0968 34 0.6835 0.0889 0.6835 0.8267
No log 1.1613 36 0.7124 0.1508 0.7124 0.8440
No log 1.2258 38 0.7244 0.0 0.7244 0.8511
No log 1.2903 40 0.8085 0.0053 0.8085 0.8992
No log 1.3548 42 0.8816 0.0944 0.8816 0.9389
No log 1.4194 44 0.8094 0.0053 0.8094 0.8997
No log 1.4839 46 0.7740 0.1863 0.7740 0.8798
No log 1.5484 48 0.8476 0.1065 0.8476 0.9206
No log 1.6129 50 0.8885 0.2297 0.8885 0.9426
No log 1.6774 52 0.8079 0.1972 0.8079 0.8988
No log 1.7419 54 0.7521 0.2443 0.7521 0.8673
No log 1.8065 56 0.7469 0.2786 0.7469 0.8643
No log 1.8710 58 0.8253 0.2498 0.8253 0.9085
No log 1.9355 60 1.1261 0.0838 1.1261 1.0612
No log 2.0 62 1.1771 0.1328 1.1771 1.0849
No log 2.0645 64 0.9466 0.1853 0.9466 0.9729
No log 2.1290 66 0.6952 0.2786 0.6952 0.8338
No log 2.1935 68 0.7153 0.2443 0.7153 0.8458
No log 2.2581 70 0.7031 0.2443 0.7031 0.8385
No log 2.3226 72 0.6826 0.2786 0.6826 0.8262
No log 2.3871 74 0.6901 0.3117 0.6901 0.8307
No log 2.4516 76 0.7695 0.2227 0.7695 0.8772
No log 2.5161 78 0.8922 0.2075 0.8922 0.9445
No log 2.5806 80 0.9769 0.1870 0.9769 0.9884
No log 2.6452 82 1.1617 0.2130 1.1617 1.0778
No log 2.7097 84 1.2462 0.1473 1.2462 1.1163
No log 2.7742 86 1.0601 0.1443 1.0601 1.0296
No log 2.8387 88 0.8643 0.3794 0.8643 0.9297
No log 2.9032 90 0.8359 0.2231 0.8359 0.9143
No log 2.9677 92 0.8770 0.2546 0.8770 0.9365
No log 3.0323 94 0.8240 0.2923 0.8240 0.9077
No log 3.0968 96 0.9468 0.3219 0.9468 0.9730
No log 3.1613 98 1.1961 0.1233 1.1961 1.0937
No log 3.2258 100 1.1507 0.1296 1.1507 1.0727
No log 3.2903 102 0.9507 0.1312 0.9507 0.9750
No log 3.3548 104 0.8534 0.3712 0.8534 0.9238
No log 3.4194 106 0.8104 0.2883 0.8104 0.9002
No log 3.4839 108 0.8280 0.3637 0.8280 0.9100
No log 3.5484 110 0.8964 0.2982 0.8964 0.9468
No log 3.6129 112 0.9465 0.2343 0.9465 0.9729
No log 3.6774 114 0.9305 0.2949 0.9305 0.9646
No log 3.7419 116 0.9497 0.2754 0.9497 0.9745
No log 3.8065 118 1.1194 0.1774 1.1194 1.0580
No log 3.8710 120 1.1225 0.1671 1.1225 1.0595
No log 3.9355 122 1.0313 0.1787 1.0313 1.0155
No log 4.0 124 0.8884 0.2784 0.8884 0.9425
No log 4.0645 126 0.8684 0.3843 0.8684 0.9319
No log 4.1290 128 0.9218 0.2358 0.9218 0.9601
No log 4.1935 130 1.0044 0.1827 1.0044 1.0022
No log 4.2581 132 1.0906 0.2271 1.0906 1.0443
No log 4.3226 134 1.0869 0.2271 1.0869 1.0425
No log 4.3871 136 0.9740 0.1734 0.9740 0.9869
No log 4.4516 138 0.9775 0.2784 0.9775 0.9887
No log 4.5161 140 1.0998 0.1208 1.0998 1.0487
No log 4.5806 142 1.1692 0.1146 1.1692 1.0813
No log 4.6452 144 1.0911 0.2000 1.0911 1.0446
No log 4.7097 146 1.0711 0.1422 1.0711 1.0350
No log 4.7742 148 1.1069 0.0713 1.1069 1.0521
No log 4.8387 150 1.2694 0.0838 1.2694 1.1267
No log 4.9032 152 1.5201 0.0283 1.5201 1.2329
No log 4.9677 154 1.4779 0.0578 1.4779 1.2157
No log 5.0323 156 1.2431 0.0561 1.2431 1.1149
No log 5.0968 158 1.0726 0.1461 1.0726 1.0356
No log 5.1613 160 1.0738 0.1461 1.0738 1.0362
No log 5.2258 162 1.1953 0.0660 1.1953 1.0933
No log 5.2903 164 1.4173 0.0620 1.4173 1.1905
No log 5.3548 166 1.4171 0.0620 1.4171 1.1904
No log 5.4194 168 1.3349 0.0665 1.3349 1.1554
No log 5.4839 170 1.2095 0.1116 1.2095 1.0998
No log 5.5484 172 1.1537 0.1911 1.1537 1.0741
No log 5.6129 174 1.0413 0.1955 1.0413 1.0204
No log 5.6774 176 1.0124 0.2510 1.0124 1.0062
No log 5.7419 178 1.1001 0.2141 1.1001 1.0488
No log 5.8065 180 1.1979 0.1067 1.1979 1.0945
No log 5.8710 182 1.1438 0.2199 1.1438 1.0695
No log 5.9355 184 0.9711 0.3777 0.9711 0.9854
No log 6.0 186 0.8755 0.2932 0.8755 0.9357
No log 6.0645 188 0.8112 0.2685 0.8112 0.9007
No log 6.1290 190 0.8142 0.2981 0.8142 0.9023
No log 6.1935 192 0.9034 0.1628 0.9034 0.9505
No log 6.2581 194 1.0482 0.1642 1.0482 1.0238
No log 6.3226 196 1.2252 0.1254 1.2252 1.1069
No log 6.3871 198 1.1682 0.0327 1.1682 1.0808
No log 6.4516 200 0.9849 0.1777 0.9849 0.9924
No log 6.5161 202 0.8380 0.2652 0.8380 0.9154
No log 6.5806 204 0.8302 0.2145 0.8302 0.9112
No log 6.6452 206 0.8402 0.0971 0.8402 0.9166
No log 6.7097 208 0.8428 0.2718 0.8428 0.9180
No log 6.7742 210 0.9424 0.2297 0.9424 0.9708
No log 6.8387 212 1.1885 0.0368 1.1885 1.0902
No log 6.9032 214 1.3135 0.0188 1.3135 1.1461
No log 6.9677 216 1.2863 0.0471 1.2863 1.1341
No log 7.0323 218 1.2913 0.0225 1.2913 1.1364
No log 7.0968 220 1.1523 0.0074 1.1523 1.0734
No log 7.1613 222 1.1267 0.1014 1.1267 1.0614
No log 7.2258 224 1.1900 0.0056 1.1900 1.0909
No log 7.2903 226 1.3326 0.0975 1.3326 1.1544
No log 7.3548 228 1.3288 0.1203 1.3288 1.1527
No log 7.4194 230 1.1838 0.0609 1.1838 1.0880
No log 7.4839 232 0.9528 0.2244 0.9528 0.9761
No log 7.5484 234 0.8407 0.2527 0.8407 0.9169
No log 7.6129 236 0.8402 0.2467 0.8402 0.9167
No log 7.6774 238 0.9099 0.1914 0.9099 0.9539
No log 7.7419 240 0.9559 0.1914 0.9559 0.9777
No log 7.8065 242 0.9971 0.2142 0.9971 0.9986
No log 7.8710 244 1.0387 0.3294 1.0387 1.0192
No log 7.9355 246 0.9657 0.1723 0.9657 0.9827
No log 8.0 248 0.9289 0.1454 0.9289 0.9638
No log 8.0645 250 0.9083 0.1542 0.9083 0.9531
No log 8.1290 252 0.9414 0.2142 0.9414 0.9703
No log 8.1935 254 0.9553 0.2029 0.9553 0.9774
No log 8.2581 256 1.0510 0.1642 1.0510 1.0252
No log 8.3226 258 1.0919 0.2259 1.0919 1.0449
No log 8.3871 260 1.0495 0.0829 1.0495 1.0244
No log 8.4516 262 1.0206 0.0891 1.0206 1.0102
No log 8.5161 264 0.9753 0.1254 0.9753 0.9876
No log 8.5806 266 1.0356 0.0891 1.0356 1.0177
No log 8.6452 268 1.0436 0.1461 1.0436 1.0216
No log 8.7097 270 0.8984 0.1867 0.8984 0.9478
No log 8.7742 272 0.8941 0.1914 0.8941 0.9456
No log 8.8387 274 0.9298 0.1822 0.9298 0.9642
No log 8.9032 276 1.0862 0.1651 1.0862 1.0422
No log 8.9677 278 1.2275 0.1290 1.2275 1.1079
No log 9.0323 280 1.1849 0.2017 1.1849 1.0885
No log 9.0968 282 0.9839 0.1777 0.9839 0.9919
No log 9.1613 284 0.8012 0.3755 0.8012 0.8951
No log 9.2258 286 0.7707 0.3243 0.7707 0.8779
No log 9.2903 288 0.7883 0.3465 0.7883 0.8879
No log 9.3548 290 0.8574 0.3950 0.8574 0.9260
No log 9.4194 292 1.0923 0.1077 1.0923 1.0451
No log 9.4839 294 1.2975 0.1296 1.2975 1.1391
No log 9.5484 296 1.3097 0.0973 1.3097 1.1444
No log 9.6129 298 1.1619 0.2032 1.1619 1.0779
No log 9.6774 300 0.9515 0.2463 0.9515 0.9754
No log 9.7419 302 0.7907 0.3341 0.7907 0.8892
No log 9.8065 304 0.7568 0.3081 0.7568 0.8700
No log 9.8710 306 0.7596 0.2819 0.7596 0.8716
No log 9.9355 308 0.8017 0.3092 0.8017 0.8954
No log 10.0 310 0.9069 0.2754 0.9069 0.9523
No log 10.0645 312 0.9972 0.2670 0.9972 0.9986
No log 10.1290 314 0.9597 0.1672 0.9597 0.9796
No log 10.1935 316 0.8425 0.3312 0.8425 0.9179
No log 10.2581 318 0.7422 0.3123 0.7422 0.8615
No log 10.3226 320 0.7290 0.2890 0.7290 0.8538
No log 10.3871 322 0.7562 0.3387 0.7562 0.8696
No log 10.4516 324 0.9130 0.2029 0.9130 0.9555
No log 10.5161 326 1.0938 0.2577 1.0938 1.0459
No log 10.5806 328 1.1104 0.2601 1.1104 1.0538
No log 10.6452 330 1.0475 0.2939 1.0475 1.0235
No log 10.7097 332 0.9935 0.2866 0.9935 0.9967
No log 10.7742 334 0.9376 0.1777 0.9376 0.9683
No log 10.8387 336 0.9208 0.1777 0.9208 0.9596
No log 10.9032 338 0.9982 0.2923 0.9982 0.9991
No log 10.9677 340 1.0313 0.2562 1.0313 1.0155
No log 11.0323 342 1.0564 0.2810 1.0564 1.0278
No log 11.0968 344 0.9705 0.1422 0.9705 0.9852
No log 11.1613 346 0.8286 0.3372 0.8286 0.9103
No log 11.2258 348 0.7768 0.3092 0.7768 0.8813
No log 11.2903 350 0.8023 0.3444 0.8023 0.8957
No log 11.3548 352 0.9102 0.2193 0.9102 0.9541
No log 11.4194 354 1.0466 0.1897 1.0466 1.0230
No log 11.4839 356 1.0660 0.1897 1.0660 1.0325
No log 11.5484 358 0.9456 0.2387 0.9456 0.9724
No log 11.6129 360 0.8342 0.3032 0.8342 0.9133
No log 11.6774 362 0.8231 0.3032 0.8231 0.9072
No log 11.7419 364 0.8694 0.2967 0.8694 0.9324
No log 11.8065 366 0.9076 0.2967 0.9076 0.9527
No log 11.8710 368 0.9603 0.2142 0.9603 0.9800
No log 11.9355 370 0.8956 0.2967 0.8956 0.9463
No log 12.0 372 0.8265 0.3372 0.8265 0.9091
No log 12.0645 374 0.7802 0.3444 0.7802 0.8833
No log 12.1290 376 0.7984 0.3444 0.7984 0.8935
No log 12.1935 378 0.8176 0.3444 0.8176 0.9042
No log 12.2581 380 0.9028 0.3494 0.9028 0.9501
No log 12.3226 382 0.9447 0.2726 0.9447 0.9720
No log 12.3871 384 0.9722 0.2358 0.9722 0.9860
No log 12.4516 386 0.9810 0.2000 0.9810 0.9905
No log 12.5161 388 0.9314 0.2784 0.9314 0.9651
No log 12.5806 390 0.9029 0.2574 0.9029 0.9502
No log 12.6452 392 0.9380 0.2574 0.9380 0.9685
No log 12.7097 394 1.0299 0.2000 1.0299 1.0148
No log 12.7742 396 1.0512 0.1803 1.0512 1.0253
No log 12.8387 398 1.0086 0.2643 1.0086 1.0043
No log 12.9032 400 0.9602 0.1808 0.9602 0.9799
No log 12.9677 402 0.9949 0.1454 0.9949 0.9975
No log 13.0323 404 0.9516 0.1808 0.9516 0.9755
No log 13.0968 406 0.8843 0.2754 0.8843 0.9404
No log 13.1613 408 0.8421 0.2883 0.8421 0.9177
No log 13.2258 410 0.8713 0.2328 0.8713 0.9335
No log 13.2903 412 0.9690 0.1853 0.9690 0.9844
No log 13.3548 414 1.0974 0.1976 1.0974 1.0476
No log 13.4194 416 1.1107 0.1662 1.1107 1.0539
No log 13.4839 418 1.0601 0.1662 1.0601 1.0296
No log 13.5484 420 1.0571 0.2075 1.0571 1.0281
No log 13.6129 422 1.0203 0.2510 1.0203 1.0101
No log 13.6774 424 0.9454 0.2574 0.9454 0.9723
No log 13.7419 426 0.8852 0.2467 0.8852 0.9408
No log 13.8065 428 0.8487 0.3387 0.8487 0.9212
No log 13.8710 430 0.8236 0.3387 0.8236 0.9075
No log 13.9355 432 0.8196 0.2950 0.8196 0.9053
No log 14.0 434 0.8490 0.3127 0.8490 0.9214
No log 14.0645 436 0.9510 0.2183 0.9510 0.9752
No log 14.1290 438 1.0495 0.1961 1.0495 1.0245
No log 14.1935 440 1.1073 0.1898 1.1073 1.0523
No log 14.2581 442 1.0513 0.1754 1.0513 1.0253
No log 14.3226 444 0.9964 0.1586 0.9964 0.9982
No log 14.3871 446 0.9415 0.1712 0.9415 0.9703
No log 14.4516 448 0.9259 0.2193 0.9259 0.9622
No log 14.5161 450 0.9196 0.2193 0.9196 0.9590
No log 14.5806 452 0.9425 0.2410 0.9425 0.9708
No log 14.6452 454 0.9510 0.2358 0.9510 0.9752
No log 14.7097 456 0.9331 0.1822 0.9331 0.9660
No log 14.7742 458 0.9559 0.1777 0.9559 0.9777
No log 14.8387 460 0.9637 0.1461 0.9637 0.9817
No log 14.9032 462 0.9322 0.1867 0.9322 0.9655
No log 14.9677 464 0.9086 0.2244 0.9086 0.9532
No log 15.0323 466 0.9338 0.1867 0.9338 0.9663
No log 15.0968 468 0.9819 0.2046 0.9819 0.9909
No log 15.1613 470 0.9724 0.2046 0.9724 0.9861
No log 15.2258 472 0.9275 0.2410 0.9275 0.9631
No log 15.2903 474 0.8342 0.3032 0.8342 0.9134
No log 15.3548 476 0.7451 0.3238 0.7451 0.8632
No log 15.4194 478 0.7417 0.2950 0.7417 0.8612
No log 15.4839 480 0.8391 0.2754 0.8391 0.9160
No log 15.5484 482 0.9250 0.2492 0.9250 0.9618
No log 15.6129 484 1.0314 0.2192 1.0314 1.0156
No log 15.6774 486 1.1901 0.1764 1.1901 1.0909
No log 15.7419 488 1.2475 0.1704 1.2475 1.1169
No log 15.8065 490 1.2021 0.1557 1.2021 1.0964
No log 15.8710 492 1.1204 0.1210 1.1204 1.0585
No log 15.9355 494 1.0433 0.1911 1.0433 1.0214
No log 16.0 496 0.9596 0.1911 0.9596 0.9796
No log 16.0645 498 0.9225 0.1777 0.9225 0.9605
0.2957 16.1290 500 0.9066 0.1822 0.9066 0.9522
0.2957 16.1935 502 0.9043 0.1822 0.9043 0.9509
0.2957 16.2581 504 0.9406 0.1777 0.9406 0.9698
0.2957 16.3226 506 1.0368 0.2164 1.0368 1.0183
0.2957 16.3871 508 1.1925 0.1671 1.1925 1.0920
0.2957 16.4516 510 1.2787 0.1293 1.2787 1.1308
0.2957 16.5161 512 1.3125 0.1174 1.3125 1.1457
0.2957 16.5806 514 1.2668 0.1663 1.2668 1.1255
0.2957 16.6452 516 1.1072 0.1856 1.1072 1.0522
0.2957 16.7097 518 1.0365 0.1692 1.0365 1.0181
0.2957 16.7742 520 0.9771 0.1145 0.9771 0.9885
0.2957 16.8387 522 0.9615 0.1542 0.9615 0.9805
0.2957 16.9032 524 0.9670 0.1501 0.9670 0.9834
0.2957 16.9677 526 1.0165 0.1077 1.0165 1.0082
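In the table above, the epoch column equals step / 31, implying 31 optimizer steps per epoch — roughly 31 × 8 = 248 training examples at batch size 8, assuming no gradient accumulation (the dataset size is not stated in the card). A quick spot-check of that arithmetic:

```python
steps_per_epoch = 31  # inferred: epoch 0.0645 at step 2 -> 2 / 0.0645 ≈ 31

# Spot-check a few (step, epoch) pairs taken from the table above.
for step, epoch in [(2, 0.0645), (62, 2.0), (310, 10.0), (500, 16.1290)]:
    assert round(step / steps_per_epoch, 4) == round(epoch, 4)

print("epoch = step / 31 holds for the sampled rows")
```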

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (tensor type F32, safetensors)

Model tree

  • Base model: aubmindlab/bert-base-arabertv02
  • This model: MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k13_task7_organization