ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8157
  • Qwk: 0.0412
  • Mse: 0.8157
  • Rmse: 0.9031
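
The metrics above can be reproduced roughly as in the sketch below. This is not the original evaluation script: the variable names and example values are illustrative, and it assumes integer-valued organization scores with a regression-style model output.

```python
# Sketch: computing QWK, MSE, and RMSE as reported above.
# Assumes integer gold scores and continuous predictions on the same scale;
# example values are placeholders, not taken from the actual evaluation set.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 2])           # gold organization scores (example)
y_pred = np.array([2.4, 2.8, 1.2, 1.9])   # raw model outputs (example)

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK is defined over discrete labels, so predictions are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```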

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
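
For reference, the hyperparameters above correspond to the following Hugging Face TrainingArguments. This is a minimal sketch; the output directory is a placeholder and the actual training script is not part of this card.

```python
# Sketch: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./arabert-task3-organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```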

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0194 2 3.8611 0.0118 3.8611 1.9650
No log 0.0388 4 1.8869 -0.0496 1.8869 1.3736
No log 0.0583 6 1.5495 0.0219 1.5495 1.2448
No log 0.0777 8 1.4736 0.0768 1.4736 1.2139
No log 0.0971 10 0.7986 0.1148 0.7986 0.8936
No log 0.1165 12 0.7153 -0.0069 0.7153 0.8457
No log 0.1359 14 0.7074 0.0460 0.7074 0.8411
No log 0.1553 16 0.7520 0.0857 0.7520 0.8672
No log 0.1748 18 0.7303 0.1444 0.7303 0.8546
No log 0.1942 20 0.7224 -0.0069 0.7224 0.8500
No log 0.2136 22 0.7413 0.0807 0.7413 0.8610
No log 0.2330 24 0.7660 0.1023 0.7660 0.8752
No log 0.2524 26 0.8132 0.0585 0.8132 0.9018
No log 0.2718 28 0.8033 -0.0465 0.8033 0.8963
No log 0.2913 30 1.2045 0.1699 1.2045 1.0975
No log 0.3107 32 1.5277 0.1067 1.5277 1.2360
No log 0.3301 34 1.0202 0.0175 1.0202 1.0101
No log 0.3495 36 0.7953 -0.0449 0.7953 0.8918
No log 0.3689 38 0.8275 0.0030 0.8275 0.9097
No log 0.3883 40 0.7775 0.1081 0.7775 0.8817
No log 0.4078 42 0.7606 0.1148 0.7606 0.8721
No log 0.4272 44 0.7334 0.1758 0.7334 0.8564
No log 0.4466 46 0.7226 0.0857 0.7226 0.8501
No log 0.4660 48 0.7226 0.1691 0.7226 0.8500
No log 0.4854 50 0.7139 0.2034 0.7139 0.8449
No log 0.5049 52 0.7053 0.0869 0.7053 0.8398
No log 0.5243 54 0.7428 0.0905 0.7428 0.8618
No log 0.5437 56 0.7509 0.1304 0.7509 0.8665
No log 0.5631 58 0.7504 0.2121 0.7504 0.8662
No log 0.5825 60 0.7572 0.1050 0.7572 0.8702
No log 0.6019 62 0.8074 0.0580 0.8074 0.8986
No log 0.6214 64 0.8541 0.0618 0.8541 0.9242
No log 0.6408 66 0.8245 0.2294 0.8245 0.9080
No log 0.6602 68 0.8159 0.1866 0.8159 0.9032
No log 0.6796 70 0.8188 0.1973 0.8188 0.9049
No log 0.6990 72 1.0474 -0.0023 1.0474 1.0234
No log 0.7184 74 0.9802 -0.0386 0.9802 0.9900
No log 0.7379 76 0.7684 0.2087 0.7684 0.8766
No log 0.7573 78 0.7853 0.0068 0.7853 0.8861
No log 0.7767 80 0.7976 0.0517 0.7976 0.8931
No log 0.7961 82 0.8425 0.1846 0.8425 0.9179
No log 0.8155 84 0.9813 0.1520 0.9813 0.9906
No log 0.8350 86 1.0419 0.1870 1.0419 1.0208
No log 0.8544 88 1.0074 0.1908 1.0074 1.0037
No log 0.8738 90 0.9423 0.1894 0.9423 0.9707
No log 0.8932 92 0.9973 0.1014 0.9973 0.9986
No log 0.9126 94 0.9605 0.0741 0.9605 0.9801
No log 0.9320 96 0.7117 0.1443 0.7117 0.8436
No log 0.9515 98 0.8883 0.0946 0.8883 0.9425
No log 0.9709 100 0.9733 0.0612 0.9733 0.9866
No log 0.9903 102 0.7510 0.1249 0.7510 0.8666
No log 1.0097 104 0.8265 0.1793 0.8265 0.9091
No log 1.0291 106 0.8565 0.1661 0.8565 0.9255
No log 1.0485 108 0.9421 0.1024 0.9421 0.9706
No log 1.0680 110 1.1580 0.1297 1.1580 1.0761
No log 1.0874 112 1.0261 0.0997 1.0261 1.0130
No log 1.1068 114 1.0023 0.2090 1.0023 1.0011
No log 1.1262 116 1.0244 0.1086 1.0244 1.0121
No log 1.1456 118 0.9669 0.1284 0.9669 0.9833
No log 1.1650 120 0.7880 0.3072 0.7880 0.8877
No log 1.1845 122 0.9456 0.0519 0.9456 0.9724
No log 1.2039 124 0.7993 0.1001 0.7993 0.8940
No log 1.2233 126 0.7947 0.1793 0.7947 0.8915
No log 1.2427 128 0.8245 0.0957 0.8245 0.9080
No log 1.2621 130 0.9390 0.0666 0.9390 0.9690
No log 1.2816 132 0.9532 0.0974 0.9532 0.9763
No log 1.3010 134 0.9308 0.1789 0.9308 0.9648
No log 1.3204 136 0.9993 0.1120 0.9993 0.9997
No log 1.3398 138 1.0727 0.0808 1.0727 1.0357
No log 1.3592 140 1.4655 0.1049 1.4655 1.2106
No log 1.3786 142 1.5250 0.1331 1.5250 1.2349
No log 1.3981 144 1.2234 0.0521 1.2234 1.1061
No log 1.4175 146 0.8638 0.1009 0.8638 0.9294
No log 1.4369 148 0.8354 0.1277 0.8354 0.9140
No log 1.4563 150 0.7988 0.0432 0.7988 0.8937
No log 1.4757 152 0.8226 0.0179 0.8226 0.9070
No log 1.4951 154 0.8911 -0.0133 0.8911 0.9440
No log 1.5146 156 0.9581 0.1919 0.9581 0.9789
No log 1.5340 158 1.0204 0.1615 1.0204 1.0102
No log 1.5534 160 1.2992 0.0755 1.2992 1.1398
No log 1.5728 162 1.1586 0.0667 1.1586 1.0764
No log 1.5922 164 0.9250 0.2826 0.9250 0.9618
No log 1.6117 166 0.8817 0.0856 0.8817 0.9390
No log 1.6311 168 0.9138 0.0618 0.9138 0.9559
No log 1.6505 170 0.9826 0.0723 0.9826 0.9913
No log 1.6699 172 0.8430 0.2051 0.8430 0.9182
No log 1.6893 174 0.9279 0.0566 0.9279 0.9633
No log 1.7087 176 1.1471 0.0804 1.1471 1.0710
No log 1.7282 178 0.8959 0.0727 0.8959 0.9465
No log 1.7476 180 0.8431 0.0991 0.8431 0.9182
No log 1.7670 182 1.0481 0.0247 1.0481 1.0238
No log 1.7864 184 0.9703 0.0753 0.9703 0.9851
No log 1.8058 186 0.7886 0.0323 0.7886 0.8881
No log 1.8252 188 0.9135 0.0762 0.9135 0.9558
No log 1.8447 190 0.9216 0.0727 0.9216 0.9600
No log 1.8641 192 0.7553 0.0680 0.7553 0.8691
No log 1.8835 194 0.8320 0.1078 0.8320 0.9122
No log 1.9029 196 0.8841 0.1078 0.8841 0.9403
No log 1.9223 198 0.8244 0.1251 0.8244 0.9080
No log 1.9417 200 0.8510 0.1723 0.8510 0.9225
No log 1.9612 202 0.9199 0.1401 0.9199 0.9591
No log 1.9806 204 1.3779 0.1076 1.3779 1.1738
No log 2.0 206 1.5642 0.0828 1.5642 1.2507
No log 2.0194 208 1.2631 0.0810 1.2631 1.1239
No log 2.0388 210 0.9073 0.1539 0.9073 0.9525
No log 2.0583 212 1.0614 0.0888 1.0614 1.0302
No log 2.0777 214 0.9761 0.1475 0.9761 0.9880
No log 2.0971 216 0.8750 0.1127 0.8750 0.9354
No log 2.1165 218 1.0491 0.1586 1.0491 1.0242
No log 2.1359 220 0.9565 0.1284 0.9565 0.9780
No log 2.1553 222 0.8213 0.2382 0.8213 0.9063
No log 2.1748 224 0.8762 0.1541 0.8762 0.9360
No log 2.1942 226 1.0424 0.1013 1.0424 1.0210
No log 2.2136 228 0.9176 0.1027 0.9176 0.9579
No log 2.2330 230 0.8592 0.1754 0.8592 0.9269
No log 2.2524 232 1.0432 0.1274 1.0432 1.0214
No log 2.2718 234 1.0232 0.1148 1.0232 1.0115
No log 2.2913 236 0.9957 0.0977 0.9957 0.9978
No log 2.3107 238 0.8640 0.1739 0.8640 0.9295
No log 2.3301 240 0.8028 -0.0156 0.8028 0.8960
No log 2.3495 242 0.8088 0.0574 0.8088 0.8993
No log 2.3689 244 0.7639 -0.0228 0.7639 0.8740
No log 2.3883 246 0.7614 0.0488 0.7614 0.8726
No log 2.4078 248 0.8643 0.1078 0.8643 0.9297
No log 2.4272 250 0.8708 0.1078 0.8708 0.9332
No log 2.4466 252 0.8232 0.0870 0.8232 0.9073
No log 2.4660 254 0.8268 0.0227 0.8268 0.9093
No log 2.4854 256 0.8471 0.1347 0.8471 0.9204
No log 2.5049 258 0.8499 0.0727 0.8499 0.9219
No log 2.5243 260 0.8675 0.0594 0.8675 0.9314
No log 2.5437 262 0.9149 0.0978 0.9149 0.9565
No log 2.5631 264 1.0073 0.1210 1.0072 1.0036
No log 2.5825 266 1.0885 0.0505 1.0885 1.0433
No log 2.6019 268 0.9647 0.0909 0.9647 0.9822
No log 2.6214 270 0.9150 0.1264 0.9150 0.9566
No log 2.6408 272 0.9259 0.0363 0.9259 0.9623
No log 2.6602 274 0.8403 0.1597 0.8403 0.9167
No log 2.6796 276 1.0061 0.0179 1.0061 1.0030
No log 2.6990 278 0.9364 0.1368 0.9364 0.9677
No log 2.7184 280 0.7876 0.0481 0.7876 0.8875
No log 2.7379 282 0.7859 0.0917 0.7859 0.8865
No log 2.7573 284 0.7770 0.0246 0.7770 0.8815
No log 2.7767 286 0.8133 0.1310 0.8133 0.9018
No log 2.7961 288 0.8018 0.1448 0.8018 0.8955
No log 2.8155 290 0.8300 0.1272 0.8300 0.9110
No log 2.8350 292 0.8073 0.1775 0.8073 0.8985
No log 2.8544 294 0.8945 0.0966 0.8945 0.9458
No log 2.8738 296 0.8853 0.1237 0.8853 0.9409
No log 2.8932 298 0.8348 0.2353 0.8348 0.9137
No log 2.9126 300 0.8480 0.2379 0.8480 0.9209
No log 2.9320 302 0.9039 0.0860 0.9039 0.9507
No log 2.9515 304 1.0929 0.1500 1.0929 1.0454
No log 2.9709 306 1.2211 0.0535 1.2211 1.1051
No log 2.9903 308 1.0912 0.1535 1.0912 1.0446
No log 3.0097 310 0.8736 0.1460 0.8736 0.9347
No log 3.0291 312 0.8274 0.2103 0.8274 0.9096
No log 3.0485 314 0.8317 0.1601 0.8317 0.9120
No log 3.0680 316 0.8417 0.1290 0.8417 0.9174
No log 3.0874 318 0.8506 0.1649 0.8506 0.9223
No log 3.1068 320 0.8605 0.1637 0.8605 0.9276
No log 3.1262 322 0.7955 0.1580 0.7955 0.8919
No log 3.1456 324 0.8177 0.1961 0.8177 0.9043
No log 3.1650 326 0.8809 0.1581 0.8809 0.9386
No log 3.1845 328 0.9339 0.0635 0.9339 0.9664
No log 3.2039 330 0.9918 0.0694 0.9918 0.9959
No log 3.2233 332 1.1458 0.0798 1.1458 1.0704
No log 3.2427 334 0.9439 0.0673 0.9439 0.9715
No log 3.2621 336 0.8288 0.0146 0.8288 0.9104
No log 3.2816 338 0.8808 -0.0007 0.8808 0.9385
No log 3.3010 340 0.8666 0.0060 0.8666 0.9309
No log 3.3204 342 0.8572 0.0341 0.8572 0.9258
No log 3.3398 344 0.8863 0.0097 0.8863 0.9414
No log 3.3592 346 0.9255 0.0962 0.9255 0.9620
No log 3.3786 348 0.9434 0.0765 0.9434 0.9713
No log 3.3981 350 0.9309 0.0042 0.9309 0.9648
No log 3.4175 352 0.9057 0.0632 0.9057 0.9517
No log 3.4369 354 0.9444 0.0462 0.9444 0.9718
No log 3.4563 356 0.9075 0.1414 0.9075 0.9526
No log 3.4757 358 0.8871 0.0813 0.8871 0.9418
No log 3.4951 360 0.8992 0.0946 0.8992 0.9483
No log 3.5146 362 0.9098 0.1351 0.9098 0.9538
No log 3.5340 364 0.9831 0.0326 0.9831 0.9915
No log 3.5534 366 0.9968 0.0764 0.9968 0.9984
No log 3.5728 368 0.8374 0.1282 0.8374 0.9151
No log 3.5922 370 0.8129 0.0959 0.8129 0.9016
No log 3.6117 372 0.7783 0.1431 0.7783 0.8822
No log 3.6311 374 0.8099 0.0061 0.8099 0.8999
No log 3.6505 376 1.0428 0.0247 1.0428 1.0212
No log 3.6699 378 1.0905 0.0285 1.0905 1.0443
No log 3.6893 380 0.9286 -0.0245 0.9286 0.9636
No log 3.7087 382 0.8244 0.0700 0.8244 0.9079
No log 3.7282 384 0.9459 0.0692 0.9459 0.9726
No log 3.7476 386 0.9575 0.0692 0.9575 0.9785
No log 3.7670 388 0.8420 0.1423 0.8420 0.9176
No log 3.7864 390 0.8494 0.0187 0.8494 0.9216
No log 3.8058 392 0.8821 -0.0111 0.8821 0.9392
No log 3.8252 394 0.8243 0.0089 0.8243 0.9079
No log 3.8447 396 0.8243 0.1048 0.8243 0.9079
No log 3.8641 398 0.8638 0.0574 0.8638 0.9294
No log 3.8835 400 0.9136 0.1410 0.9136 0.9558
No log 3.9029 402 0.9641 0.1055 0.9641 0.9819
No log 3.9223 404 0.9457 0.0699 0.9457 0.9725
No log 3.9417 406 0.9215 0.0402 0.9215 0.9599
No log 3.9612 408 0.8957 0.0614 0.8957 0.9464
No log 3.9806 410 0.9134 0.0970 0.9134 0.9557
No log 4.0 412 0.9024 0.0970 0.9024 0.9500
No log 4.0194 414 0.8614 0.0106 0.8614 0.9281
No log 4.0388 416 0.9163 0.0498 0.9163 0.9572
No log 4.0583 418 1.0296 0.1594 1.0296 1.0147
No log 4.0777 420 1.0273 0.1339 1.0273 1.0135
No log 4.0971 422 1.0590 0.1615 1.0590 1.0291
No log 4.1165 424 0.9343 0.1325 0.9343 0.9666
No log 4.1359 426 0.8417 -0.0047 0.8417 0.9174
No log 4.1553 428 0.8604 -0.0589 0.8604 0.9276
No log 4.1748 430 0.9575 0.1045 0.9575 0.9785
No log 4.1942 432 0.9638 0.0746 0.9638 0.9817
No log 4.2136 434 0.8981 -0.0524 0.8981 0.9477
No log 4.2330 436 0.8507 -0.0821 0.8507 0.9223
No log 4.2524 438 0.7649 -0.0065 0.7649 0.8746
No log 4.2718 440 0.7486 0.0395 0.7486 0.8652
No log 4.2913 442 0.7557 -0.0065 0.7557 0.8693
No log 4.3107 444 0.7793 0.0454 0.7793 0.8828
No log 4.3301 446 0.8062 -0.0240 0.8062 0.8979
No log 4.3495 448 0.8224 -0.0240 0.8224 0.9069
No log 4.3689 450 0.8862 0.1005 0.8862 0.9414
No log 4.3883 452 1.0466 0.1297 1.0466 1.0230
No log 4.4078 454 1.0326 0.1301 1.0326 1.0162
No log 4.4272 456 0.8721 0.0378 0.8721 0.9339
No log 4.4466 458 0.7944 -0.0195 0.7944 0.8913
No log 4.4660 460 0.7507 0.0863 0.7507 0.8665
No log 4.4854 462 0.7294 0.1379 0.7294 0.8540
No log 4.5049 464 0.7427 0.0863 0.7427 0.8618
No log 4.5243 466 0.7937 0.0869 0.7937 0.8909
No log 4.5437 468 0.8857 0.0643 0.8857 0.9411
No log 4.5631 470 0.9449 0.0975 0.9449 0.9721
No log 4.5825 472 0.9099 0.0868 0.9099 0.9539
No log 4.6019 474 0.8704 0.0597 0.8704 0.9330
No log 4.6214 476 0.8686 0.0961 0.8686 0.9320
No log 4.6408 478 0.8390 0.0757 0.8390 0.9160
No log 4.6602 480 0.8944 0.1605 0.8944 0.9457
No log 4.6796 482 0.8998 0.1196 0.8998 0.9486
No log 4.6990 484 0.8548 0.0784 0.8548 0.9246
No log 4.7184 486 0.8446 0.0827 0.8446 0.9190
No log 4.7379 488 0.8897 0.0635 0.8897 0.9433
No log 4.7573 490 0.8309 0.0905 0.8309 0.9115
No log 4.7767 492 0.7940 0.0660 0.7940 0.8910
No log 4.7961 494 0.7968 0.0196 0.7968 0.8926
No log 4.8155 496 0.8060 0.0749 0.8060 0.8978
No log 4.8350 498 0.8250 0.0257 0.8250 0.9083
0.3161 4.8544 500 0.8245 0.0804 0.8245 0.9080
0.3161 4.8738 502 0.8426 0.0482 0.8426 0.9179
0.3161 4.8932 504 0.8253 0.0804 0.8253 0.9085
0.3161 4.9126 506 0.8383 0.0804 0.8383 0.9156
0.3161 4.9320 508 0.8760 0.1492 0.8760 0.9360
0.3161 4.9515 510 0.9377 0.0823 0.9377 0.9683
0.3161 4.9709 512 1.0294 0.1469 1.0294 1.0146
0.3161 4.9903 514 1.0645 0.0661 1.0645 1.0317
0.3161 5.0097 516 0.9715 0.0881 0.9715 0.9856
0.3161 5.0291 518 0.8808 0.0618 0.8808 0.9385
0.3161 5.0485 520 0.8439 0.0246 0.8439 0.9186
0.3161 5.0680 522 0.8276 0.0359 0.8276 0.9097
0.3161 5.0874 524 0.8139 0.0412 0.8139 0.9022
0.3161 5.1068 526 0.8157 0.0412 0.8157 0.9031

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
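
As a usage illustration (not part of the original card), the checkpoint can be loaded with the versions listed above. The single-output regression head is an assumption inferred from the MSE/RMSE metrics reported here; adjust the post-processing if the actual head differs.

```python
# Sketch: loading this checkpoint for inference.
# Assumes a single-logit (regression-style) classification head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # example Arabic essay text
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```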