ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1811
  • Qwk (quadratic weighted kappa): 0.0058
  • Mse: 1.1811
  • Rmse: 1.0868
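
Qwk is quadratic weighted kappa, the agreement metric commonly used for ordinal essay-scoring labels (1 = perfect agreement, 0 ≈ chance level), and Rmse is simply the square root of Mse (√1.1811 ≈ 1.0868). A minimal sketch of the metric, assuming integer labels in 0..n_classes-1; the function name and signature are illustrative, not taken from the training code:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: 1.0 = perfect agreement, ~0 = chance level."""
    n = len(y_true)
    # Observed rating matrix and marginal histograms.
    observed = [[0] * n_classes for _ in range(n_classes)]
    hist_true = [0] * n_classes
    hist_pred = [0] * n_classes
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
        hist_true[t] += 1
        hist_pred[p] += 1
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n    # chance agreement
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # perfect: 1.0
print(round(math.sqrt(1.1811), 4))  # RMSE recovered from MSE: 1.0868
```

A Qwk near 0, as reported above, indicates roughly chance-level agreement between predicted and gold organization scores on this evaluation set.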

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
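
The linear lr_scheduler decays the learning rate from learning_rate down to 0 over the total number of optimizer steps. A minimal sketch of that schedule, assuming zero warmup steps (warmup is not listed in the card); the function name is illustrative. The total-step figure in the example follows from the results table below, where step 500 falls at epoch ≈16.13, i.e. 31 steps per epoch and 3100 steps over 100 epochs:

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr at the end of warmup down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# 100 epochs x 31 steps/epoch = 3100 total optimizer steps:
print(linear_schedule_lr(0, 3100))     # 2e-05 at the start
print(linear_schedule_lr(1550, 3100))  # 1e-05 halfway
print(linear_schedule_lr(3100, 3100))  # 0.0 at the end
```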

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0645 2 2.5689 -0.0758 2.5689 1.6028
No log 0.1290 4 1.3470 0.0985 1.3470 1.1606
No log 0.1935 6 1.2992 -0.1871 1.2992 1.1398
No log 0.2581 8 0.9648 -0.1569 0.9648 0.9823
No log 0.3226 10 0.8289 0.0313 0.8289 0.9105
No log 0.3871 12 0.7585 0.0846 0.7585 0.8709
No log 0.4516 14 0.7376 0.0444 0.7376 0.8589
No log 0.5161 16 0.7601 0.0393 0.7601 0.8718
No log 0.5806 18 0.7828 0.1456 0.7828 0.8847
No log 0.6452 20 0.7168 0.2181 0.7168 0.8467
No log 0.7097 22 0.6717 0.1660 0.6717 0.8196
No log 0.7742 24 0.7674 0.2846 0.7674 0.8760
No log 0.8387 26 0.7903 0.2087 0.7903 0.8890
No log 0.9032 28 0.7985 0.1739 0.7985 0.8936
No log 0.9677 30 0.7911 0.0522 0.7911 0.8894
No log 1.0323 32 0.7372 0.0937 0.7372 0.8586
No log 1.0968 34 0.6835 0.0889 0.6835 0.8267
No log 1.1613 36 0.7124 0.1508 0.7124 0.8440
No log 1.2258 38 0.7244 0.0 0.7244 0.8511
No log 1.2903 40 0.8085 0.0053 0.8085 0.8992
No log 1.3548 42 0.8816 0.0944 0.8816 0.9389
No log 1.4194 44 0.8094 0.0053 0.8094 0.8997
No log 1.4839 46 0.7740 0.1863 0.7740 0.8798
No log 1.5484 48 0.8476 0.1065 0.8476 0.9206
No log 1.6129 50 0.8885 0.2297 0.8885 0.9426
No log 1.6774 52 0.8079 0.1972 0.8079 0.8988
No log 1.7419 54 0.7521 0.2443 0.7521 0.8673
No log 1.8065 56 0.7469 0.2786 0.7469 0.8643
No log 1.8710 58 0.8253 0.2498 0.8253 0.9085
No log 1.9355 60 1.1260 0.0838 1.1260 1.0612
No log 2.0 62 1.1771 0.1328 1.1771 1.0849
No log 2.0645 64 0.9466 0.1853 0.9466 0.9729
No log 2.1290 66 0.6952 0.2786 0.6952 0.8338
No log 2.1935 68 0.7153 0.2443 0.7153 0.8458
No log 2.2581 70 0.7031 0.2443 0.7031 0.8385
No log 2.3226 72 0.6826 0.2786 0.6826 0.8262
No log 2.3871 74 0.6901 0.3117 0.6901 0.8307
No log 2.4516 76 0.7694 0.2227 0.7694 0.8772
No log 2.5161 78 0.8921 0.2075 0.8921 0.9445
No log 2.5806 80 0.9769 0.1870 0.9769 0.9884
No log 2.6452 82 1.1617 0.2130 1.1617 1.0778
No log 2.7097 84 1.2462 0.1473 1.2462 1.1163
No log 2.7742 86 1.0601 0.1443 1.0601 1.0296
No log 2.8387 88 0.8642 0.3794 0.8642 0.9296
No log 2.9032 90 0.8358 0.2231 0.8358 0.9142
No log 2.9677 92 0.8770 0.2546 0.8770 0.9365
No log 3.0323 94 0.8240 0.2923 0.8240 0.9077
No log 3.0968 96 0.9469 0.3219 0.9469 0.9731
No log 3.1613 98 1.1963 0.1233 1.1963 1.0938
No log 3.2258 100 1.1509 0.1296 1.1509 1.0728
No log 3.2903 102 0.9508 0.1312 0.9508 0.9751
No log 3.3548 104 0.8533 0.3712 0.8533 0.9238
No log 3.4194 106 0.8102 0.2883 0.8102 0.9001
No log 3.4839 108 0.8279 0.3637 0.8279 0.9099
No log 3.5484 110 0.8963 0.2982 0.8963 0.9467
No log 3.6129 112 0.9467 0.2343 0.9467 0.9730
No log 3.6774 114 0.9308 0.2949 0.9308 0.9648
No log 3.7419 116 0.9498 0.2754 0.9498 0.9746
No log 3.8065 118 1.1196 0.1774 1.1196 1.0581
No log 3.8710 120 1.1228 0.1671 1.1228 1.0596
No log 3.9355 122 1.0317 0.1747 1.0317 1.0157
No log 4.0 124 0.8886 0.2784 0.8886 0.9427
No log 4.0645 126 0.8687 0.3843 0.8687 0.9320
No log 4.1290 128 0.9221 0.2358 0.9221 0.9602
No log 4.1935 130 1.0045 0.1827 1.0045 1.0023
No log 4.2581 132 1.0905 0.2271 1.0905 1.0443
No log 4.3226 134 1.0866 0.2271 1.0866 1.0424
No log 4.3871 136 0.9740 0.1734 0.9740 0.9869
No log 4.4516 138 0.9785 0.2784 0.9785 0.9892
No log 4.5161 140 1.1019 0.1208 1.1019 1.0497
No log 4.5806 142 1.1701 0.1146 1.1701 1.0817
No log 4.6452 144 1.0898 0.2000 1.0898 1.0439
No log 4.7097 146 1.0688 0.1422 1.0688 1.0338
No log 4.7742 148 1.1073 0.0713 1.1073 1.0523
No log 4.8387 150 1.2726 0.0838 1.2726 1.1281
No log 4.9032 152 1.5227 0.0283 1.5227 1.2340
No log 4.9677 154 1.4802 0.0578 1.4802 1.2167
No log 5.0323 156 1.2432 0.0561 1.2432 1.1150
No log 5.0968 158 1.0696 0.1461 1.0696 1.0342
No log 5.1613 160 1.0681 0.1822 1.0681 1.0335
No log 5.2258 162 1.1867 0.0660 1.1867 1.0894
No log 5.2903 164 1.4071 0.0620 1.4071 1.1862
No log 5.3548 166 1.4084 0.0620 1.4084 1.1868
No log 5.4194 168 1.3284 0.0665 1.3284 1.1526
No log 5.4839 170 1.2068 0.1428 1.2068 1.0986
No log 5.5484 172 1.1537 0.1911 1.1537 1.0741
No log 5.6129 174 1.0416 0.1955 1.0416 1.0206
No log 5.6774 176 1.0133 0.2510 1.0133 1.0066
No log 5.7419 178 1.1012 0.2141 1.1012 1.0494
No log 5.8065 180 1.1986 0.1067 1.1986 1.0948
No log 5.8710 182 1.1438 0.2199 1.1438 1.0695
No log 5.9355 184 0.9715 0.3777 0.9715 0.9856
No log 6.0 186 0.8776 0.2932 0.8776 0.9368
No log 6.0645 188 0.8135 0.2621 0.8135 0.9020
No log 6.1290 190 0.8165 0.2981 0.8165 0.9036
No log 6.1935 192 0.9059 0.1628 0.9059 0.9518
No log 6.2581 194 1.0485 0.1642 1.0485 1.0240
No log 6.3226 196 1.2208 0.1254 1.2208 1.1049
No log 6.3871 198 1.1575 0.0327 1.1575 1.0759
No log 6.4516 200 0.9705 0.1822 0.9705 0.9851
No log 6.5161 202 0.8356 0.2652 0.8356 0.9141
No log 6.5806 204 0.8323 0.2145 0.8323 0.9123
No log 6.6452 206 0.8418 0.1353 0.8418 0.9175
No log 6.7097 208 0.8477 0.2718 0.8477 0.9207
No log 6.7742 210 0.9605 0.2574 0.9605 0.9801
No log 6.8387 212 1.2208 -0.0027 1.2208 1.1049
No log 6.9032 214 1.3416 0.0642 1.3416 1.1583
No log 6.9677 216 1.3035 0.0471 1.3035 1.1417
No log 7.0323 218 1.2588 0.0561 1.2588 1.1220
No log 7.0968 220 1.0891 0.1734 1.0891 1.0436
No log 7.1613 222 1.0534 0.2094 1.0534 1.0263
No log 7.2258 224 1.0964 0.1014 1.0964 1.0471
No log 7.2903 226 1.2520 0.0256 1.2520 1.1189
No log 7.3548 228 1.2889 0.0741 1.2889 1.1353
No log 7.4194 230 1.1742 0.1492 1.1742 1.0836
No log 7.4839 232 0.9626 0.2193 0.9626 0.9811
No log 7.5484 234 0.8453 0.2527 0.8453 0.9194
No log 7.6129 236 0.8439 0.2467 0.8439 0.9186
No log 7.6774 238 0.9090 0.1914 0.9090 0.9534
No log 7.7419 240 0.9469 0.1914 0.9469 0.9731
No log 7.8065 242 0.9851 0.2142 0.9851 0.9925
No log 7.8710 244 1.0111 0.1682 1.0111 1.0055
No log 7.9355 246 0.9354 0.1723 0.9354 0.9672
No log 8.0 248 0.9027 0.1628 0.9027 0.9501
No log 8.0645 250 0.8714 0.2467 0.8714 0.9335
No log 8.1290 252 0.9122 0.2193 0.9122 0.9551
No log 8.1935 254 0.9543 0.1682 0.9543 0.9769
No log 8.2581 256 1.0863 0.1521 1.0863 1.0423
No log 8.3226 258 1.1388 0.2211 1.1388 1.0671
No log 8.3871 260 1.0761 0.1422 1.0761 1.0374
No log 8.4516 262 1.0605 0.0241 1.0605 1.0298
No log 8.5161 264 1.0419 0.0539 1.0419 1.0207
No log 8.5806 266 1.1213 0.0196 1.1213 1.0589
No log 8.6452 268 1.1418 0.0769 1.1418 1.0686
No log 8.7097 270 0.9894 0.1461 0.9894 0.9947
No log 8.7742 272 0.9805 0.1461 0.9805 0.9902
No log 8.8387 274 0.9417 0.1822 0.9417 0.9704
No log 8.9032 276 0.9904 0.1461 0.9904 0.9952
No log 8.9677 278 1.0483 0.1308 1.0483 1.0238
No log 9.0323 280 1.1173 0.1274 1.1173 1.0570
No log 9.0968 282 1.0221 0.1077 1.0221 1.0110
No log 9.1613 284 0.8545 0.3099 0.8545 0.9244
No log 9.2258 286 0.8256 0.3518 0.8256 0.9086
No log 9.2903 288 0.9075 0.2193 0.9075 0.9526
No log 9.3548 290 1.0771 0.1013 1.0771 1.0378
No log 9.4194 292 1.3042 0.0244 1.3042 1.1420
No log 9.4839 294 1.3758 0.0206 1.3758 1.1729
No log 9.5484 296 1.3383 0.0493 1.3383 1.1569
No log 9.6129 298 1.1919 0.0058 1.1919 1.0917
No log 9.6774 300 1.0034 0.2142 1.0034 1.0017
No log 9.7419 302 0.9430 0.2518 0.9430 0.9711
No log 9.8065 304 0.9781 0.2094 0.9781 0.9890
No log 9.8710 306 1.0525 0.1573 1.0525 1.0259
No log 9.9355 308 1.0693 0.1573 1.0693 1.0340
No log 10.0 310 1.0748 0.1013 1.0748 1.0367
No log 10.0645 312 0.9998 0.2336 0.9998 0.9999
No log 10.1290 314 0.8960 0.2817 0.8960 0.9466
No log 10.1935 316 0.8704 0.2847 0.8704 0.9330
No log 10.2581 318 0.9045 0.3099 0.9045 0.9510
No log 10.3226 320 1.0036 0.2387 1.0036 1.0018
No log 10.3871 322 1.2273 0.0336 1.2273 1.1078
No log 10.4516 324 1.3381 0.0790 1.3381 1.1568
No log 10.5161 326 1.2586 0.0790 1.2586 1.1219
No log 10.5806 328 1.0872 0.1692 1.0872 1.0427
No log 10.6452 330 1.0176 0.1734 1.0176 1.0087
No log 10.7097 332 0.9894 0.1777 0.9894 0.9947
No log 10.7742 334 0.9688 0.2142 0.9688 0.9843
No log 10.8387 336 0.9604 0.1822 0.9604 0.9800
No log 10.9032 338 1.0507 0.1422 1.0507 1.0250
No log 10.9677 340 1.2710 0.0778 1.2710 1.1274
No log 11.0323 342 1.4835 0.0930 1.4835 1.2180
No log 11.0968 344 1.4377 0.1121 1.4377 1.1991
No log 11.1613 346 1.1982 0.0641 1.1982 1.0946
No log 11.2258 348 0.9425 0.2193 0.9425 0.9708
No log 11.2903 350 0.8316 0.2527 0.8316 0.9119
No log 11.3548 352 0.8384 0.2527 0.8384 0.9157
No log 11.4194 354 0.9489 0.1822 0.9489 0.9741
No log 11.4839 356 1.1222 0.0713 1.1222 1.0593
No log 11.5484 358 1.1565 0.0368 1.1565 1.0754
No log 11.6129 360 1.1137 0.0713 1.1137 1.0553
No log 11.6774 362 1.0372 0.1110 1.0372 1.0184
No log 11.7419 364 1.0212 0.1110 1.0212 1.0106
No log 11.8065 366 1.0342 0.1110 1.0342 1.0169
No log 11.8710 368 1.1095 0.0390 1.1095 1.0533
No log 11.9355 370 1.1692 0.0076 1.1692 1.0813
No log 12.0 372 1.3077 -0.0259 1.3077 1.1435
No log 12.0645 374 1.3873 -0.0285 1.3873 1.1778
No log 12.1290 376 1.2738 -0.0259 1.2738 1.1286
No log 12.1935 378 1.0838 0.0413 1.0838 1.0410
No log 12.2581 380 0.9668 0.1628 0.9668 0.9833
No log 12.3226 382 0.9673 0.1914 0.9673 0.9835
No log 12.3871 384 0.9974 0.1461 0.9974 0.9987
No log 12.4516 386 1.0625 0.0413 1.0625 1.0308
No log 12.5161 388 1.1410 0.0058 1.1410 1.0682
No log 12.5806 390 1.1343 0.0058 1.1343 1.0650
No log 12.6452 392 1.1630 0.0058 1.1630 1.0784
No log 12.7097 394 1.1802 -0.0245 1.1802 1.0864
No log 12.7742 396 1.1513 0.0368 1.1513 1.0730
No log 12.8387 398 1.2181 -0.0245 1.2181 1.1037
No log 12.9032 400 1.3882 0.0153 1.3882 1.1782
No log 12.9677 402 1.4886 0.0564 1.4886 1.2201
No log 13.0323 404 1.3812 0.0599 1.3812 1.1752
No log 13.0968 406 1.1717 0.0058 1.1717 1.0824
No log 13.1613 408 1.0109 0.1217 1.0109 1.0054
No log 13.2258 410 0.9761 0.1584 0.9761 0.9880
No log 13.2903 412 1.0344 0.1144 1.0344 1.0170
No log 13.3548 414 1.1831 0.0379 1.1831 1.0877
No log 13.4194 416 1.2202 0.0039 1.2202 1.1046
No log 13.4839 418 1.2043 0.0022 1.2043 1.0974
No log 13.5484 420 1.2372 0.0523 1.2372 1.1123
No log 13.6129 422 1.2193 0.0523 1.2193 1.1042
No log 13.6774 424 1.1438 0.0315 1.1438 1.0695
No log 13.7419 426 1.0789 0.1077 1.0789 1.0387
No log 13.8065 428 1.0346 0.2244 1.0346 1.0171
No log 13.8710 430 1.0309 0.1765 1.0309 1.0153
No log 13.9355 432 1.0414 0.1765 1.0414 1.0205
No log 14.0 434 1.0462 0.1765 1.0462 1.0228
No log 14.0645 436 1.1204 0.1594 1.1204 1.0585
No log 14.1290 438 1.1596 0.1716 1.1596 1.0768
No log 14.1935 440 1.1669 0.1274 1.1669 1.0802
No log 14.2581 442 1.1627 0.1146 1.1627 1.0783
No log 14.3226 444 1.1474 0.1662 1.1474 1.0712
No log 14.3871 446 1.0875 0.2343 1.0875 1.0429
No log 14.4516 448 1.0772 0.1815 1.0772 1.0379
No log 14.5161 450 1.0669 0.1013 1.0669 1.0329
No log 14.5806 452 1.0679 0.1573 1.0679 1.0334
No log 14.6452 454 1.0679 0.1348 1.0679 1.0334
No log 14.7097 456 1.1340 0.0283 1.1340 1.0649
No log 14.7742 458 1.2436 0.0741 1.2436 1.1151
No log 14.8387 460 1.2538 0.0741 1.2538 1.1197
No log 14.9032 462 1.1485 0.0666 1.1485 1.0717
No log 14.9677 464 0.9804 0.1723 0.9804 0.9902
No log 15.0323 466 0.8887 0.2692 0.8887 0.9427
No log 15.0968 468 0.8780 0.2692 0.8780 0.9370
No log 15.1613 470 0.9733 0.1765 0.9733 0.9866
No log 15.2258 472 1.1197 0.0983 1.1197 1.0581
No log 15.2903 474 1.2494 0.0765 1.2494 1.1178
No log 15.3548 476 1.3087 0.0741 1.3087 1.1440
No log 15.4194 478 1.2599 0.0741 1.2599 1.1224
No log 15.4839 480 1.2672 0.0717 1.2672 1.1257
No log 15.5484 482 1.2407 0.1057 1.2407 1.1139
No log 15.6129 484 1.2252 0.0561 1.2252 1.1069
No log 15.6774 486 1.2588 -0.0278 1.2588 1.1219
No log 15.7419 488 1.2664 -0.0278 1.2664 1.1254
No log 15.8065 490 1.2791 -0.0026 1.2791 1.1310
No log 15.8710 492 1.2718 -0.0026 1.2718 1.1277
No log 15.9355 494 1.2487 0.0237 1.2487 1.1175
No log 16.0 496 1.1990 0.0022 1.1990 1.0950
No log 16.0645 498 1.1433 0.0561 1.1433 1.0693
0.2951 16.1290 500 1.1072 0.0346 1.1072 1.0522
0.2951 16.1935 502 1.1027 0.0346 1.1027 1.0501
0.2951 16.2581 504 1.1709 0.0786 1.1709 1.0821
0.2951 16.3226 506 1.2494 0.0761 1.2494 1.1178
0.2951 16.3871 508 1.2542 0.0786 1.2542 1.1199
0.2951 16.4516 510 1.1977 0.0283 1.1977 1.0944
0.2951 16.5161 512 1.1203 0.0368 1.1203 1.0584
0.2951 16.5806 514 1.0680 0.1422 1.0680 1.0334
0.2951 16.6452 516 1.0286 0.1822 1.0286 1.0142
0.2951 16.7097 518 1.1012 0.1308 1.1012 1.0494
0.2951 16.7742 520 1.1388 0.0336 1.1388 1.0671
0.2951 16.8387 522 1.1530 0.0368 1.1530 1.0738
0.2951 16.9032 524 1.1550 0.0058 1.1550 1.0747
0.2951 16.9677 526 1.2156 0.0786 1.2156 1.1025
0.2951 17.0323 528 1.1863 0.1086 1.1863 1.0892
0.2951 17.0968 530 1.1443 0.1116 1.1443 1.0697
0.2951 17.1613 532 1.1555 0.1086 1.1555 1.0750
0.2951 17.2258 534 1.1534 0.1116 1.1534 1.0740
0.2951 17.2903 536 1.1457 0.1116 1.1457 1.0704
0.2951 17.3548 538 1.1813 0.1116 1.1813 1.0869
0.2951 17.4194 540 1.2174 0.0561 1.2174 1.1033
0.2951 17.4839 542 1.1759 0.0346 1.1759 1.0844
0.2951 17.5484 544 1.1662 0.0346 1.1662 1.0799
0.2951 17.6129 546 1.2134 0.0346 1.2134 1.1015
0.2951 17.6774 548 1.2168 0.0040 1.2168 1.1031
0.2951 17.7419 550 1.1811 0.0058 1.1811 1.0868

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.