ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. The training dataset is not specified in this card. The model achieves the following results on the evaluation set:

  • Loss: 0.8359
  • Qwk (quadratic weighted kappa): 0.1026
  • Mse (mean squared error): 0.8359
  • Rmse (root mean squared error): 0.9143
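The Qwk, Mse, and Rmse columns follow their standard definitions for ordinal scoring tasks. As a sketch (metric definitions only, not this model's actual evaluation code), all three can be computed in plain Python:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk column)."""
    n = len(y_true)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n  # expected count under chance
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root (the Mse / Rmse columns)."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Note that Rmse is simply the square root of Mse, which is why 0.9143 ≈ sqrt(0.8359) in the results above.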

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
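The linear scheduler decays the learning rate from 2e-05 down to zero over the full run. A minimal pure-Python sketch of that schedule (assuming the zero-warmup default; the actual run would have used transformers' built-in scheduler):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear warmup followed by linear decay to zero, mirroring the
    behaviour of the 'linear' lr_scheduler_type."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

With 100 epochs, the rate is halved at the midpoint of training and reaches zero at the final step.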

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.125 2 3.6220 -0.0068 3.6220 1.9031
No log 0.25 4 1.9644 0.0560 1.9644 1.4016
No log 0.375 6 2.0314 0.0119 2.0314 1.4253
No log 0.5 8 1.7099 0.0406 1.7099 1.3076
No log 0.625 10 0.8132 0.0191 0.8132 0.9018
No log 0.75 12 0.7240 0.0807 0.7240 0.8509
No log 0.875 14 0.8338 -0.0408 0.8338 0.9131
No log 1.0 16 0.9645 -0.0236 0.9645 0.9821
No log 1.125 18 0.7588 0.0549 0.7588 0.8711
No log 1.25 20 0.7054 0.1259 0.7054 0.8399
No log 1.375 22 0.7412 0.0512 0.7412 0.8609
No log 1.5 24 1.0371 0.0353 1.0371 1.0184
No log 1.625 26 0.8094 0.0377 0.8094 0.8997
No log 1.75 28 0.7919 -0.1067 0.7919 0.8899
No log 1.875 30 0.8334 0.0128 0.8334 0.9129
No log 2.0 32 0.7632 -0.1551 0.7632 0.8736
No log 2.125 34 0.7307 0.0670 0.7307 0.8548
No log 2.25 36 1.2096 0.1026 1.2096 1.0998
No log 2.375 38 1.1443 0.0823 1.1443 1.0697
No log 2.5 40 0.7357 0.0628 0.7357 0.8577
No log 2.625 42 0.7032 -0.0520 0.7032 0.8386
No log 2.75 44 0.7208 0.1080 0.7208 0.8490
No log 2.875 46 0.7716 0.0551 0.7716 0.8784
No log 3.0 48 0.8619 0.0378 0.8619 0.9284
No log 3.125 50 0.8303 0.1168 0.8303 0.9112
No log 3.25 52 0.9212 0.0301 0.9212 0.9598
No log 3.375 54 0.8773 0.2126 0.8773 0.9366
No log 3.5 56 0.8969 0.2023 0.8969 0.9470
No log 3.625 58 0.9439 0.0689 0.9439 0.9715
No log 3.75 60 0.9489 0.0052 0.9489 0.9741
No log 3.875 62 0.8337 0.0211 0.8337 0.9130
No log 4.0 64 0.8447 0.0285 0.8447 0.9191
No log 4.125 66 0.9439 0.0498 0.9439 0.9716
No log 4.25 68 1.0249 0.0391 1.0249 1.0124
No log 4.375 70 0.8133 0.0545 0.8133 0.9018
No log 4.5 72 1.3515 -0.0057 1.3515 1.1626
No log 4.625 74 1.2075 0.0596 1.2075 1.0989
No log 4.75 76 0.7706 -0.1094 0.7706 0.8778
No log 4.875 78 1.3671 0.0119 1.3671 1.1692
No log 5.0 80 1.5231 -0.0734 1.5231 1.2341
No log 5.125 82 0.9264 0.0877 0.9264 0.9625
No log 5.25 84 0.7717 -0.0449 0.7717 0.8784
No log 5.375 86 0.8001 -0.0345 0.8001 0.8945
No log 5.5 88 0.8401 0.0470 0.8401 0.9166
No log 5.625 90 1.0217 0.0659 1.0217 1.0108
No log 5.75 92 0.8517 0.0304 0.8517 0.9229
No log 5.875 94 0.9133 0.0770 0.9133 0.9557
No log 6.0 96 1.1852 0.0254 1.1852 1.0887
No log 6.125 98 0.9411 0.0815 0.9411 0.9701
No log 6.25 100 0.7608 0.0660 0.7608 0.8722
No log 6.375 102 0.7822 0.0871 0.7822 0.8844
No log 6.5 104 0.7595 0.2275 0.7595 0.8715
No log 6.625 106 0.9729 0.0142 0.9729 0.9864
No log 6.75 108 0.8871 0.0721 0.8871 0.9418
No log 6.875 110 0.8032 0.1093 0.8032 0.8962
No log 7.0 112 0.8474 0.0799 0.8474 0.9205
No log 7.125 114 0.7888 0.1767 0.7888 0.8882
No log 7.25 116 0.7615 0.2235 0.7615 0.8726
No log 7.375 118 0.7544 0.1537 0.7544 0.8686
No log 7.5 120 0.7161 0.1835 0.7161 0.8463
No log 7.625 122 0.8054 0.0719 0.8054 0.8974
No log 7.75 124 0.7660 0.1336 0.7660 0.8752
No log 7.875 126 0.7705 0.2235 0.7705 0.8778
No log 8.0 128 0.8236 0.1714 0.8236 0.9075
No log 8.125 130 0.9061 0.0998 0.9061 0.9519
No log 8.25 132 0.8977 0.1754 0.8977 0.9475
No log 8.375 134 0.8585 0.0074 0.8585 0.9265
No log 8.5 136 0.8449 0.2285 0.8449 0.9192
No log 8.625 138 0.7851 0.0509 0.7851 0.8861
No log 8.75 140 0.7815 0.0741 0.7815 0.8840
No log 8.875 142 0.7630 0.0412 0.7630 0.8735
No log 9.0 144 0.8195 0.1770 0.8195 0.9053
No log 9.125 146 1.0190 0.0747 1.0190 1.0095
No log 9.25 148 0.9018 0.1612 0.9018 0.9496
No log 9.375 150 0.8563 0.1259 0.8563 0.9254
No log 9.5 152 0.7996 0.1729 0.7996 0.8942
No log 9.625 154 0.7680 0.1031 0.7680 0.8764
No log 9.75 156 0.7328 0.0247 0.7328 0.8560
No log 9.875 158 0.7409 0.0670 0.7409 0.8607
No log 10.0 160 0.7147 0.0414 0.7147 0.8454
No log 10.125 162 0.7442 0.1408 0.7442 0.8627
No log 10.25 164 0.8249 0.0999 0.8249 0.9082
No log 10.375 166 0.7739 0.0947 0.7739 0.8797
No log 10.5 168 0.7596 -0.0079 0.7596 0.8715
No log 10.625 170 0.7873 0.1408 0.7873 0.8873
No log 10.75 172 0.7888 0.0851 0.7888 0.8882
No log 10.875 174 0.7969 0.1310 0.7969 0.8927
No log 11.0 176 0.8033 0.1408 0.8033 0.8963
No log 11.125 178 0.7958 0.1686 0.7958 0.8921
No log 11.25 180 0.7781 0.1550 0.7781 0.8821
No log 11.375 182 0.7735 0.2156 0.7735 0.8795
No log 11.5 184 0.7719 0.1868 0.7719 0.8786
No log 11.625 186 0.7045 0.1413 0.7045 0.8393
No log 11.75 188 0.7161 0.2078 0.7161 0.8462
No log 11.875 190 0.6959 0.0926 0.6959 0.8342
No log 12.0 192 0.7107 0.0884 0.7107 0.8430
No log 12.125 194 0.7242 0.1778 0.7242 0.8510
No log 12.25 196 0.7474 0.1887 0.7474 0.8645
No log 12.375 198 0.6932 0.1840 0.6932 0.8326
No log 12.5 200 0.6767 0.1740 0.6767 0.8226
No log 12.625 202 0.7021 0.0095 0.7021 0.8379
No log 12.75 204 0.7345 0.0913 0.7345 0.8570
No log 12.875 206 0.6809 0.1807 0.6809 0.8252
No log 13.0 208 0.6779 0.1371 0.6779 0.8233
No log 13.125 210 0.7019 0.0639 0.7019 0.8378
No log 13.25 212 0.7667 0.1149 0.7667 0.8756
No log 13.375 214 0.7598 0.1239 0.7598 0.8717
No log 13.5 216 0.7269 0.1189 0.7269 0.8526
No log 13.625 218 0.7235 0.1340 0.7235 0.8506
No log 13.75 220 0.7318 0.1146 0.7318 0.8555
No log 13.875 222 0.7351 0.1096 0.7351 0.8574
No log 14.0 224 0.7154 0.0503 0.7154 0.8458
No log 14.125 226 0.7045 0.0914 0.7045 0.8394
No log 14.25 228 0.7176 0.1612 0.7176 0.8471
No log 14.375 230 0.7048 0.1807 0.7048 0.8395
No log 14.5 232 0.7000 0.1254 0.7000 0.8367
No log 14.625 234 0.6847 -0.0032 0.6847 0.8275
No log 14.75 236 0.6736 0.0 0.6736 0.8207
No log 14.875 238 0.6638 0.1444 0.6638 0.8147
No log 15.0 240 0.6852 0.1627 0.6852 0.8278
No log 15.125 242 0.6748 0.0918 0.6748 0.8214
No log 15.25 244 0.7409 0.2219 0.7409 0.8608
No log 15.375 246 0.7521 0.2454 0.7521 0.8672
No log 15.5 248 0.8133 0.1487 0.8133 0.9018
No log 15.625 250 0.8273 0.1531 0.8273 0.9096
No log 15.75 252 0.7915 0.2264 0.7915 0.8897
No log 15.875 254 0.7700 0.2325 0.7700 0.8775
No log 16.0 256 0.7253 0.1232 0.7253 0.8516
No log 16.125 258 0.7301 0.1387 0.7301 0.8545
No log 16.25 260 0.7253 0.1440 0.7253 0.8517
No log 16.375 262 0.7068 0.1095 0.7068 0.8407
No log 16.5 264 0.7367 0.1859 0.7367 0.8583
No log 16.625 266 0.8004 0.1193 0.8004 0.8946
No log 16.75 268 0.8001 0.0755 0.8001 0.8945
No log 16.875 270 0.7030 0.3611 0.7030 0.8384
No log 17.0 272 0.7033 0.2168 0.7033 0.8386
No log 17.125 274 0.7057 0.2437 0.7057 0.8401
No log 17.25 276 0.7982 0.0755 0.7982 0.8934
No log 17.375 278 0.7470 0.1387 0.7470 0.8643
No log 17.5 280 0.6762 0.2195 0.6762 0.8223
No log 17.625 282 0.6783 0.2195 0.6783 0.8236
No log 17.75 284 0.7291 0.1048 0.7291 0.8539
No log 17.875 286 0.8037 0.1239 0.8037 0.8965
No log 18.0 288 0.7549 0.1049 0.7549 0.8688
No log 18.125 290 0.7494 0.2522 0.7494 0.8657
No log 18.25 292 0.7350 0.2194 0.7350 0.8573
No log 18.375 294 0.7423 0.0562 0.7423 0.8616
No log 18.5 296 0.7461 0.0956 0.7461 0.8638
No log 18.625 298 0.7526 0.0956 0.7526 0.8676
No log 18.75 300 0.6775 0.1675 0.6775 0.8231
No log 18.875 302 0.6759 0.1423 0.6759 0.8221
No log 19.0 304 0.7093 0.0557 0.7093 0.8422
No log 19.125 306 0.8181 0.1078 0.8181 0.9045
No log 19.25 308 0.7728 0.2258 0.7728 0.8791
No log 19.375 310 0.6973 0.1354 0.6973 0.8351
No log 19.5 312 0.7806 0.1239 0.7806 0.8835
No log 19.625 314 0.8490 0.1150 0.8490 0.9214
No log 19.75 316 0.7526 0.0999 0.7526 0.8675
No log 19.875 318 0.6625 0.0914 0.6625 0.8139
No log 20.0 320 0.6594 0.0436 0.6594 0.8120
No log 20.125 322 0.6675 0.0914 0.6675 0.8170
No log 20.25 324 0.7026 0.1146 0.7026 0.8382
No log 20.375 326 0.7580 0.0490 0.7580 0.8706
No log 20.5 328 0.7169 0.1047 0.7169 0.8467
No log 20.625 330 0.7003 0.1362 0.7003 0.8368
No log 20.75 332 0.6984 0.1311 0.6984 0.8357
No log 20.875 334 0.7343 0.1047 0.7343 0.8569
No log 21.0 336 0.7689 0.0956 0.7689 0.8769
No log 21.125 338 0.7183 0.1612 0.7183 0.8475
No log 21.25 340 0.7295 0.1379 0.7295 0.8541
No log 21.375 342 0.7693 0.2087 0.7693 0.8771
No log 21.5 344 0.7380 0.1660 0.7380 0.8591
No log 21.625 346 0.7673 0.1047 0.7673 0.8760
No log 21.75 348 0.7476 0.1047 0.7476 0.8646
No log 21.875 350 0.6948 0.1787 0.6948 0.8336
No log 22.0 352 0.6720 0.1371 0.6720 0.8198
No log 22.125 354 0.6715 0.1371 0.6715 0.8194
No log 22.25 356 0.6843 0.1740 0.6843 0.8272
No log 22.375 358 0.7098 0.1675 0.7098 0.8425
No log 22.5 360 0.7831 0.1687 0.7831 0.8849
No log 22.625 362 0.9302 0.1077 0.9302 0.9645
No log 22.75 364 0.9438 0.1077 0.9438 0.9715
No log 22.875 366 0.8177 0.1430 0.8177 0.9043
No log 23.0 368 0.7336 0.1859 0.7336 0.8565
No log 23.125 370 0.6950 0.1740 0.6950 0.8337
No log 23.25 372 0.6975 0.1612 0.6975 0.8351
No log 23.375 374 0.7615 0.2141 0.7615 0.8727
No log 23.5 376 0.7905 0.1150 0.7905 0.8891
No log 23.625 378 0.7434 0.1342 0.7434 0.8622
No log 23.75 380 0.7177 0.0562 0.7177 0.8472
No log 23.875 382 0.7032 0.1553 0.7032 0.8386
No log 24.0 384 0.7078 0.1287 0.7078 0.8413
No log 24.125 386 0.7404 0.2926 0.7404 0.8604
No log 24.25 388 0.7371 0.1660 0.7371 0.8586
No log 24.375 390 0.8322 0.0755 0.8322 0.9123
No log 24.5 392 0.9173 0.1189 0.9173 0.9578
No log 24.625 394 0.8734 0.1453 0.8734 0.9346
No log 24.75 396 0.7585 0.1716 0.7585 0.8709
No log 24.875 398 0.6766 0.1506 0.6766 0.8226
No log 25.0 400 0.6467 0.1371 0.6467 0.8042
No log 25.125 402 0.6447 0.1371 0.6447 0.8029
No log 25.25 404 0.6687 0.1506 0.6687 0.8178
No log 25.375 406 0.7699 0.1196 0.7699 0.8774
No log 25.5 408 0.7991 0.1502 0.7991 0.8939
No log 25.625 410 0.7184 0.1387 0.7184 0.8476
No log 25.75 412 0.6851 0.1249 0.6851 0.8277
No log 25.875 414 0.6752 0.0821 0.6752 0.8217
No log 26.0 416 0.6672 0.1675 0.6672 0.8168
No log 26.125 418 0.6639 0.2180 0.6639 0.8148
No log 26.25 420 0.6555 0.1675 0.6555 0.8096
No log 26.375 422 0.6582 0.1311 0.6582 0.8113
No log 26.5 424 0.6563 0.1675 0.6563 0.8101
No log 26.625 426 0.6536 0.1740 0.6536 0.8085
No log 26.75 428 0.6541 0.2105 0.6541 0.8088
No log 26.875 430 0.6388 0.0914 0.6388 0.7992
No log 27.0 432 0.6383 0.0914 0.6383 0.7989
No log 27.125 434 0.6428 0.0914 0.6428 0.8018
No log 27.25 436 0.6747 0.0436 0.6747 0.8214
No log 27.375 438 0.6820 0.1413 0.6820 0.8259
No log 27.5 440 0.6808 0.1254 0.6808 0.8251
No log 27.625 442 0.6888 0.2122 0.6888 0.8299
No log 27.75 444 0.6821 0.1311 0.6821 0.8259
No log 27.875 446 0.6705 0.0863 0.6705 0.8189
No log 28.0 448 0.6799 0.2105 0.6799 0.8246
No log 28.125 450 0.6963 0.1047 0.6963 0.8345
No log 28.25 452 0.6971 0.0600 0.6971 0.8349
No log 28.375 454 0.6961 0.0821 0.6961 0.8343
No log 28.5 456 0.7201 0.1287 0.7201 0.8486
No log 28.625 458 0.7178 0.1585 0.7178 0.8472
No log 28.75 460 0.7619 0.1286 0.7619 0.8729
No log 28.875 462 0.8091 0.1193 0.8091 0.8995
No log 29.0 464 0.7619 0.0913 0.7619 0.8729
No log 29.125 466 0.7626 0.0913 0.7626 0.8733
No log 29.25 468 0.7478 0.0611 0.7478 0.8648
No log 29.375 470 0.7613 0.0504 0.7613 0.8725
No log 29.5 472 0.7685 0.0913 0.7685 0.8766
No log 29.625 474 0.7238 0.1096 0.7238 0.8507
No log 29.75 476 0.6860 0.1675 0.6860 0.8283
No log 29.875 478 0.6808 0.1675 0.6808 0.8251
No log 30.0 480 0.6799 0.1675 0.6799 0.8245
No log 30.125 482 0.6923 0.1354 0.6923 0.8320
No log 30.25 484 0.7158 0.0488 0.7158 0.8460
No log 30.375 486 0.7136 0.1413 0.7136 0.8447
No log 30.5 488 0.7020 0.1659 0.7020 0.8378
No log 30.625 490 0.7635 0.0871 0.7635 0.8738
No log 30.75 492 0.7697 0.0831 0.7697 0.8773
No log 30.875 494 0.7273 0.1599 0.7273 0.8528
No log 31.0 496 0.7189 0.2096 0.7189 0.8479
No log 31.125 498 0.7256 0.1612 0.7256 0.8518
0.2186 31.25 500 0.7297 0.1096 0.7297 0.8542
0.2186 31.375 502 0.7659 0.1387 0.7659 0.8752
0.2186 31.5 504 0.7724 0.1387 0.7724 0.8789
0.2186 31.625 506 0.7757 0.1716 0.7757 0.8808
0.2186 31.75 508 0.8418 0.0676 0.8418 0.9175
0.2186 31.875 510 0.8566 0.0304 0.8566 0.9255
0.2186 32.0 512 0.8532 0.1437 0.8532 0.9237
0.2186 32.125 514 0.7861 0.0650 0.7861 0.8866
0.2186 32.25 516 0.7585 0.1630 0.7585 0.8709
0.2186 32.375 518 0.7454 0.1192 0.7454 0.8634
0.2186 32.5 520 0.7374 0.1192 0.7374 0.8587
0.2186 32.625 522 0.7264 0.1192 0.7264 0.8523
0.2186 32.75 524 0.7165 0.1196 0.7165 0.8465
0.2186 32.875 526 0.7211 0.1675 0.7211 0.8492
0.2186 33.0 528 0.7314 0.2105 0.7314 0.8552
0.2186 33.125 530 0.7197 0.1249 0.7197 0.8484
0.2186 33.25 532 0.7284 0.1298 0.7284 0.8535
0.2186 33.375 534 0.7554 0.2181 0.7554 0.8691
0.2186 33.5 536 0.7651 0.1689 0.7651 0.8747
0.2186 33.625 538 0.7889 0.1143 0.7889 0.8882
0.2186 33.75 540 0.7660 0.1689 0.7660 0.8752
0.2186 33.875 542 0.7432 0.1244 0.7432 0.8621
0.2186 34.0 544 0.7429 0.1599 0.7429 0.8619
0.2186 34.125 546 0.7878 0.1742 0.7878 0.8876
0.2186 34.25 548 0.7809 0.1440 0.7809 0.8837
0.2186 34.375 550 0.7293 0.1659 0.7293 0.8540
0.2186 34.5 552 0.7374 0.2168 0.7374 0.8587
0.2186 34.625 554 0.7494 0.2168 0.7494 0.8657
0.2186 34.75 556 0.7403 0.2519 0.7403 0.8604
0.2186 34.875 558 0.7384 0.1964 0.7384 0.8593
0.2186 35.0 560 0.7436 0.1859 0.7436 0.8623
0.2186 35.125 562 0.7455 0.1859 0.7455 0.8634
0.2186 35.25 564 0.7433 0.1986 0.7433 0.8621
0.2186 35.375 566 0.7261 0.1573 0.7261 0.8521
0.2186 35.5 568 0.7254 0.1674 0.7254 0.8517
0.2186 35.625 570 0.7214 0.1617 0.7214 0.8493
0.2186 35.75 572 0.7171 0.1518 0.7171 0.8468
0.2186 35.875 574 0.7318 0.1859 0.7318 0.8555
0.2186 36.0 576 0.7730 0.1701 0.7730 0.8792
0.2186 36.125 578 0.7558 0.2318 0.7558 0.8694
0.2186 36.25 580 0.7147 0.2034 0.7147 0.8454
0.2186 36.375 582 0.7107 0.1553 0.7107 0.8430
0.2186 36.5 584 0.7128 0.1612 0.7128 0.8443
0.2186 36.625 586 0.7136 0.1347 0.7136 0.8448
0.2186 36.75 588 0.7318 0.1340 0.7318 0.8555
0.2186 36.875 590 0.7325 0.1340 0.7325 0.8558
0.2186 37.0 592 0.7381 0.1192 0.7381 0.8591
0.2186 37.125 594 0.7640 0.1095 0.7640 0.8741
0.2186 37.25 596 0.7861 0.0961 0.7861 0.8866
0.2186 37.375 598 0.7934 0.0961 0.7934 0.8907
0.2186 37.5 600 0.8059 0.0734 0.8059 0.8977
0.2186 37.625 602 0.8359 0.1026 0.8359 0.9143
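The reported metrics correspond to the final checkpoint (epoch 37.625, validation loss 0.8359), while the lowest validation loss in the table is 0.6383 at epoch 27.0 and the highest Qwk is 0.3611 at epoch 16.875. A small sketch of selecting the best checkpoint from such a log, using a hand-copied excerpt of the rows above:

```python
# Excerpt of (epoch, validation_loss, qwk) rows from the table above.
log = [
    (25.125, 0.6447, 0.1371),
    (27.0,   0.6383, 0.0914),
    (16.875, 0.7030, 0.3611),
    (37.625, 0.8359, 0.1026),  # final (reported) checkpoint
]

best_by_loss = min(log, key=lambda row: row[1])  # lowest validation loss
best_by_qwk  = max(log, key=lambda row: row[2])  # highest rater agreement
print(best_by_loss)  # (27.0, 0.6383, 0.0914)
print(best_by_qwk)   # (16.875, 0.703, 0.3611)
```

Which criterion to prefer depends on the downstream use: Qwk rewards ordinal agreement with human scores, while loss/MSE rewards small numeric error.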

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task3_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.