ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card lists it as "None"). It achieves the following results on the evaluation set; a minimal usage sketch follows the list:

  • Loss: 0.7284
  • Qwk: -0.1026 (quadratic weighted kappa)
  • Mse: 0.7284
  • Rmse: 0.8535
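
This card ships no usage snippet, so the sketch below shows one plausible way to load the checkpoint and score an essay. It assumes the checkpoint exposes a single-logit regression head for the organization score (consistent with the MSE/RMSE metrics above); if the head is instead a classifier, take an argmax over the logits.

    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task3_organization"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForSequenceClassification.from_pretrained(repo)
    model.eval()

    text = "..."  # an Arabic essay to score
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        # assumed: a single regression logit holding the predicted score
        score = model(**inputs).logits.squeeze().item()
    print(score)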

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a TrainingArguments sketch reproducing them appears after the list:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
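
A minimal sketch of the corresponding TrainingArguments, assuming the run used the standard Hugging Face Trainer. The Adam betas and epsilon listed above are the Transformers defaults, so they need no explicit setting; output_dir is a hypothetical path.

    from transformers import TrainingArguments

    training_args = TrainingArguments(
        output_dir="arabert_task3_organization",  # hypothetical output path
        learning_rate=2e-5,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        seed=42,
        lr_scheduler_type="linear",
        num_train_epochs=100,
    )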

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0351 2 3.9030 0.0017 3.9030 1.9756
No log 0.0702 4 2.2261 0.0247 2.2261 1.4920
No log 0.1053 6 1.3277 -0.0479 1.3277 1.1523
No log 0.1404 8 1.2360 0.0329 1.2360 1.1117
No log 0.1754 10 0.9333 -0.0079 0.9333 0.9661
No log 0.2105 12 0.7446 0.0506 0.7446 0.8629
No log 0.2456 14 0.7481 0.0506 0.7481 0.8649
No log 0.2807 16 0.8518 -0.1251 0.8518 0.9229
No log 0.3158 18 1.2478 0.0751 1.2478 1.1170
No log 0.3509 20 2.0735 -0.0138 2.0735 1.4399
No log 0.3860 22 1.9916 0.0470 1.9916 1.4112
No log 0.4211 24 1.1409 0.0391 1.1409 1.0682
No log 0.4561 26 0.8321 -0.1747 0.8321 0.9122
No log 0.4912 28 0.8160 -0.0695 0.8160 0.9033
No log 0.5263 30 0.9693 0.0277 0.9693 0.9845
No log 0.5614 32 1.2326 0.0006 1.2326 1.1102
No log 0.5965 34 1.5783 0.1670 1.5783 1.2563
No log 0.6316 36 1.2062 0.0673 1.2062 1.0983
No log 0.6667 38 1.0062 -0.0424 1.0062 1.0031
No log 0.7018 40 0.8349 0.0374 0.8349 0.9137
No log 0.7368 42 0.7786 0.1148 0.7786 0.8824
No log 0.7719 44 0.8393 0.0549 0.8393 0.9161
No log 0.8070 46 0.9008 0.0609 0.9008 0.9491
No log 0.8421 48 0.9366 0.0157 0.9366 0.9678
No log 0.8772 50 0.8759 0.0260 0.8759 0.9359
No log 0.9123 52 0.8123 0.1836 0.8123 0.9013
No log 0.9474 54 0.8427 0.0867 0.8427 0.9180
No log 0.9825 56 0.8023 0.0191 0.8023 0.8957
No log 1.0175 58 0.7932 0.0414 0.7932 0.8906
No log 1.0526 60 0.8572 -0.0132 0.8572 0.9259
No log 1.0877 62 1.0346 -0.0050 1.0346 1.0172
No log 1.1228 64 1.0290 -0.0409 1.0290 1.0144
No log 1.1579 66 0.8530 0.0393 0.8530 0.9236
No log 1.1930 68 0.8054 0.0123 0.8054 0.8974
No log 1.2281 70 0.8032 0.0123 0.8032 0.8962
No log 1.2632 72 0.7646 0.0214 0.7646 0.8744
No log 1.2982 74 1.0699 -0.0513 1.0699 1.0344
No log 1.3333 76 1.1515 -0.0563 1.1515 1.0731
No log 1.3684 78 0.8619 -0.0743 0.8619 0.9284
No log 1.4035 80 0.8427 -0.1395 0.8427 0.9180
No log 1.4386 82 0.8534 -0.0705 0.8534 0.9238
No log 1.4737 84 0.7540 0.0289 0.7540 0.8683
No log 1.5088 86 0.9274 0.1589 0.9274 0.9630
No log 1.5439 88 1.0592 0.1738 1.0592 1.0292
No log 1.5789 90 0.8123 0.1514 0.8123 0.9013
No log 1.6140 92 0.9062 -0.0473 0.9062 0.9520
No log 1.6491 94 1.0077 -0.0521 1.0077 1.0039
No log 1.6842 96 0.8823 -0.0107 0.8823 0.9393
No log 1.7193 98 0.9084 -0.1093 0.9084 0.9531
No log 1.7544 100 0.9301 -0.0669 0.9301 0.9644
No log 1.7895 102 0.9028 0.0879 0.9028 0.9502
No log 1.8246 104 0.9328 0.0928 0.9328 0.9658
No log 1.8596 106 0.9536 0.0378 0.9536 0.9765
No log 1.8947 108 0.8913 0.0879 0.8913 0.9441
No log 1.9298 110 0.9234 0.0465 0.9234 0.9609
No log 1.9649 112 0.9033 0.0452 0.9033 0.9504
No log 2.0 114 0.8878 -0.0238 0.8878 0.9422
No log 2.0351 116 0.9724 -0.0295 0.9724 0.9861
No log 2.0702 118 0.9016 -0.0047 0.9016 0.9495
No log 2.1053 120 0.9141 0.1012 0.9141 0.9561
No log 2.1404 122 0.9286 0.1538 0.9286 0.9637
No log 2.1754 124 0.9465 0.0798 0.9465 0.9729
No log 2.2105 126 0.9965 0.1065 0.9965 0.9982
No log 2.2456 128 0.9382 0.2522 0.9382 0.9686
No log 2.2807 130 0.9640 0.0253 0.9640 0.9818
No log 2.3158 132 0.9637 -0.0071 0.9637 0.9817
No log 2.3509 134 0.9399 0.1468 0.9399 0.9695
No log 2.3860 136 0.9647 0.1468 0.9647 0.9822
No log 2.4211 138 0.9520 0.1010 0.9520 0.9757
No log 2.4561 140 0.9769 0.1219 0.9769 0.9884
No log 2.4912 142 0.9384 0.1010 0.9384 0.9687
No log 2.5263 144 0.9312 0.0423 0.9312 0.9650
No log 2.5614 146 1.1403 0.1339 1.1403 1.0678
No log 2.5965 148 1.1007 0.0774 1.1007 1.0491
No log 2.6316 150 0.9363 -0.1333 0.9363 0.9676
No log 2.6667 152 1.0148 -0.0007 1.0148 1.0074
No log 2.7018 154 0.9186 0.0551 0.9186 0.9584
No log 2.7368 156 0.8644 -0.0675 0.8644 0.9297
No log 2.7719 158 1.1653 0.0845 1.1653 1.0795
No log 2.8070 160 1.1847 0.0566 1.1847 1.0884
No log 2.8421 162 0.9297 0.0007 0.9297 0.9642
No log 2.8772 164 0.8250 0.0196 0.8250 0.9083
No log 2.9123 166 0.9226 0.0362 0.9226 0.9605
No log 2.9474 168 0.8357 0.0167 0.8357 0.9142
No log 2.9825 170 0.8016 -0.0143 0.8016 0.8953
No log 3.0175 172 0.8249 -0.0477 0.8249 0.9083
No log 3.0526 174 0.9006 -0.1851 0.9006 0.9490
No log 3.0877 176 0.8895 -0.0317 0.8895 0.9431
No log 3.1228 178 0.7740 0.0441 0.7740 0.8798
No log 3.1579 180 0.8820 0.0676 0.8820 0.9391
No log 3.1930 182 1.1113 0.0458 1.1113 1.0542
No log 3.2281 184 1.0417 0.0260 1.0417 1.0206
No log 3.2632 186 0.8755 -0.0023 0.8755 0.9357
No log 3.2982 188 1.0176 0.0364 1.0176 1.0087
No log 3.3333 190 0.9856 0.0281 0.9856 0.9928
No log 3.3684 192 0.9621 0.0101 0.9621 0.9809
No log 3.4035 194 1.2918 0.0584 1.2918 1.1366
No log 3.4386 196 1.3089 0.0627 1.3089 1.1441
No log 3.4737 198 0.9085 0.0041 0.9085 0.9532
No log 3.5088 200 0.8289 0.0581 0.8289 0.9104
No log 3.5439 202 0.8981 0.0721 0.8981 0.9477
No log 3.5789 204 0.8721 -0.0543 0.8721 0.9339
No log 3.6140 206 0.8161 -0.0086 0.8161 0.9034
No log 3.6491 208 1.0888 -0.0182 1.0888 1.0435
No log 3.6842 210 1.1510 0.0065 1.1510 1.0729
No log 3.7193 212 0.8726 0.1239 0.8726 0.9341
No log 3.7544 214 0.8057 0.0633 0.8057 0.8976
No log 3.7895 216 0.9530 0.0129 0.9530 0.9762
No log 3.8246 218 0.9302 -0.0236 0.9302 0.9645
No log 3.8596 220 0.7976 0.1558 0.7976 0.8931
No log 3.8947 222 0.7617 0.0966 0.7617 0.8727
No log 3.9298 224 0.8331 0.0362 0.8331 0.9127
No log 3.9649 226 0.7824 0.0837 0.7824 0.8845
No log 4.0 228 0.7651 0.0959 0.7651 0.8747
No log 4.0351 230 0.7606 0.1817 0.7606 0.8721
No log 4.0702 232 0.7401 0.1196 0.7401 0.8603
No log 4.1053 234 0.7402 0.1674 0.7402 0.8603
No log 4.1404 236 0.7527 0.1475 0.7527 0.8676
No log 4.1754 238 0.7804 0.1221 0.7804 0.8834
No log 4.2105 240 0.8154 0.0856 0.8154 0.9030
No log 4.2456 242 0.8592 -0.0870 0.8592 0.9269
No log 4.2807 244 0.8876 -0.1691 0.8876 0.9421
No log 4.3158 246 0.8583 -0.1659 0.8583 0.9264
No log 4.3509 248 0.8391 -0.0870 0.8391 0.9160
No log 4.3860 250 0.8256 -0.0076 0.8256 0.9086
No log 4.4211 252 0.8210 -0.0095 0.8210 0.9061
No log 4.4561 254 0.8754 0.1685 0.8754 0.9356
No log 4.4912 256 0.9130 0.0696 0.9130 0.9555
No log 4.5263 258 0.8519 0.0919 0.8519 0.9230
No log 4.5614 260 0.8784 0.0529 0.8784 0.9372
No log 4.5965 262 0.9646 0.0762 0.9646 0.9822
No log 4.6316 264 0.8581 0.0573 0.8581 0.9264
No log 4.6667 266 0.9188 0.0027 0.9188 0.9585
No log 4.7018 268 1.0350 -0.0464 1.0350 1.0174
No log 4.7368 270 0.9258 0.0048 0.9258 0.9622
No log 4.7719 272 0.8038 -0.0599 0.8038 0.8966
No log 4.8070 274 0.8712 0.1449 0.8712 0.9334
No log 4.8421 276 0.8562 0.1395 0.8562 0.9253
No log 4.8772 278 0.8297 0.1456 0.8297 0.9109
No log 4.9123 280 0.8246 0.2608 0.8246 0.9081
No log 4.9474 282 0.8135 0.1979 0.8135 0.9019
No log 4.9825 284 0.7981 0.0097 0.7981 0.8934
No log 5.0175 286 0.7844 -0.0288 0.7844 0.8856
No log 5.0526 288 0.7562 0.2239 0.7562 0.8696
No log 5.0877 290 0.8027 0.1449 0.8027 0.8960
No log 5.1228 292 0.7974 0.1627 0.7974 0.8930
No log 5.1579 294 0.7935 -0.0992 0.7935 0.8908
No log 5.1930 296 0.9043 -0.1709 0.9043 0.9510
No log 5.2281 298 0.9859 -0.0930 0.9859 0.9929
No log 5.2632 300 0.8933 -0.1354 0.8933 0.9451
No log 5.2982 302 0.8252 0.0236 0.8252 0.9084
No log 5.3333 304 0.8217 0.1047 0.8217 0.9065
No log 5.3684 306 0.7838 -0.0532 0.7838 0.8853
No log 5.4035 308 0.8230 -0.1606 0.8230 0.9072
No log 5.4386 310 0.8422 -0.1538 0.8422 0.9177
No log 5.4737 312 0.7941 -0.1026 0.7941 0.8911
No log 5.5088 314 0.8249 0.1148 0.8249 0.9083
No log 5.5439 316 1.0247 -0.0471 1.0247 1.0123
No log 5.5789 318 1.0900 0.0820 1.0900 1.0441
No log 5.6140 320 0.9172 0.0068 0.9172 0.9577
No log 5.6491 322 0.8645 -0.0955 0.8645 0.9298
No log 5.6842 324 0.8413 -0.1762 0.8413 0.9172
No log 5.7193 326 0.8129 -0.0195 0.8129 0.9016
No log 5.7544 328 0.8526 0.0953 0.8526 0.9234
No log 5.7895 330 0.8972 0.0953 0.8972 0.9472
No log 5.8246 332 0.8780 0.0208 0.8780 0.9370
No log 5.8596 334 0.9129 -0.0995 0.9129 0.9555
No log 5.8947 336 0.9456 0.0064 0.9456 0.9724
No log 5.9298 338 0.9569 0.0646 0.9569 0.9782
No log 5.9649 340 0.9581 0.0973 0.9581 0.9788
No log 6.0 342 0.9149 -0.0112 0.9149 0.9565
No log 6.0351 344 0.8723 0.0705 0.8723 0.9340
No log 6.0702 346 0.8098 0.0408 0.8098 0.8999
No log 6.1053 348 0.7612 0.2009 0.7612 0.8725
No log 6.1404 350 0.7351 0.2150 0.7351 0.8574
No log 6.1754 352 0.7463 0.2053 0.7463 0.8639
No log 6.2105 354 0.7890 0.1136 0.7890 0.8882
No log 6.2456 356 0.8395 0.0597 0.8395 0.9162
No log 6.2807 358 0.9556 0.0786 0.9556 0.9775
No log 6.3158 360 0.8916 0.1239 0.8916 0.9442
No log 6.3509 362 0.7945 0.0804 0.7945 0.8914
No log 6.3860 364 0.9134 -0.1677 0.9134 0.9557
No log 6.4211 366 0.9455 -0.1677 0.9455 0.9724
No log 6.4561 368 0.8179 -0.1665 0.8179 0.9044
No log 6.4912 370 0.8060 0.2349 0.8060 0.8978
No log 6.5263 372 0.9340 0.0676 0.9340 0.9664
No log 6.5614 374 0.8795 0.0409 0.8795 0.9378
No log 6.5965 376 0.7794 0.1599 0.7794 0.8828
No log 6.6316 378 0.8197 0.0153 0.8197 0.9054
No log 6.6667 380 0.8486 0.0321 0.8486 0.9212
No log 6.7018 382 0.9079 0.0109 0.9079 0.9528
No log 6.7368 384 0.8992 -0.0260 0.8992 0.9483
No log 6.7719 386 0.8460 0.0289 0.8460 0.9198
No log 6.8070 388 0.8176 0.1498 0.8176 0.9042
No log 6.8421 390 0.8327 0.1001 0.8327 0.9125
No log 6.8772 392 0.9054 0.0409 0.9054 0.9515
No log 6.9123 394 0.9049 0.0748 0.9049 0.9512
No log 6.9474 396 0.8298 0.1243 0.8298 0.9109
No log 6.9825 398 0.7562 0.2034 0.7562 0.8696
No log 7.0175 400 0.7427 0.2105 0.7427 0.8618
No log 7.0526 402 0.7517 0.2105 0.7517 0.8670
No log 7.0877 404 0.7521 0.2105 0.7521 0.8672
No log 7.1228 406 0.7749 0.1506 0.7749 0.8803
No log 7.1579 408 0.7746 0.1565 0.7746 0.8801
No log 7.1930 410 0.7686 0.2105 0.7686 0.8767
No log 7.2281 412 0.7633 0.1659 0.7633 0.8737
No log 7.2632 414 0.7809 0.0323 0.7809 0.8837
No log 7.2982 416 0.7998 0.0690 0.7998 0.8943
No log 7.3333 418 0.8168 0.1048 0.8168 0.9038
No log 7.3684 420 0.7968 0.0323 0.7968 0.8926
No log 7.4035 422 0.8160 -0.0079 0.8160 0.9034
No log 7.4386 424 0.8431 -0.0159 0.8431 0.9182
No log 7.4737 426 0.8051 0.0101 0.8051 0.8973
No log 7.5088 428 0.7816 0.0639 0.7816 0.8841
No log 7.5439 430 0.8857 0.0676 0.8857 0.9411
No log 7.5789 432 0.8786 0.0642 0.8786 0.9373
No log 7.6140 434 0.7509 0.0768 0.7509 0.8666
No log 7.6491 436 0.7635 0.0557 0.7635 0.8738
No log 7.6842 438 0.7830 -0.0606 0.7830 0.8849
No log 7.7193 440 0.7599 -0.0366 0.7599 0.8717
No log 7.7544 442 0.7528 -0.0027 0.7528 0.8676
No log 7.7895 444 0.7744 0.0460 0.7744 0.8800
No log 7.8246 446 0.7734 0.1236 0.7734 0.8794
No log 7.8596 448 0.7508 0.0869 0.7508 0.8665
No log 7.8947 450 0.7428 0.1740 0.7428 0.8619
No log 7.9298 452 0.7279 0.1433 0.7279 0.8532
No log 7.9649 454 0.7497 -0.1332 0.7497 0.8659
No log 8.0 456 0.7934 -0.0208 0.7934 0.8908
No log 8.0351 458 0.8229 -0.0660 0.8229 0.9072
No log 8.0702 460 0.8220 0.1095 0.8220 0.9067
No log 8.1053 462 0.8265 0.1553 0.8265 0.9091
No log 8.1404 464 0.8013 0.0359 0.8013 0.8952
No log 8.1754 466 0.8091 -0.0831 0.8091 0.8995
No log 8.2105 468 0.8118 -0.0407 0.8118 0.9010
No log 8.2456 470 0.8056 -0.0992 0.8056 0.8976
No log 8.2807 472 0.7987 0.0323 0.7987 0.8937
No log 8.3158 474 0.8176 0.0680 0.8176 0.9042
No log 8.3509 476 0.8306 0.0205 0.8306 0.9113
No log 8.3860 478 0.8058 -0.0500 0.8058 0.8977
No log 8.4211 480 0.8484 -0.0904 0.8484 0.9211
No log 8.4561 482 0.8298 -0.1354 0.8298 0.9109
No log 8.4912 484 0.7753 -0.0108 0.7753 0.8805
No log 8.5263 486 0.8242 0.1440 0.8242 0.9078
No log 8.5614 488 0.8752 0.1239 0.8752 0.9355
No log 8.5965 490 0.7921 0.1096 0.7921 0.8900
No log 8.6316 492 0.7709 -0.0483 0.7709 0.8780
No log 8.6667 494 0.7815 -0.0408 0.7815 0.8840
No log 8.7018 496 0.7721 0.0226 0.7721 0.8787
No log 8.7368 498 0.8133 0.1431 0.8133 0.9018
0.3139 8.7719 500 0.8381 0.1817 0.8381 0.9155
0.3139 8.8070 502 0.7804 -0.0132 0.7804 0.8834
0.3139 8.8421 504 0.7915 -0.1211 0.7915 0.8896
0.3139 8.8772 506 0.7991 -0.1201 0.7991 0.8939
0.3139 8.9123 508 0.7637 -0.1473 0.7637 0.8739
0.3139 8.9474 510 0.7284 -0.1026 0.7284 0.8535
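
For reference, a hedged sketch of how the table's metrics can be computed, assuming integer-valued ordinal labels: Qwk is read as Cohen's kappa with quadratic weights (predictions rounded to the label grid), and Rmse is the square root of Mse. The toy arrays are hypothetical.

    import numpy as np
    from sklearn.metrics import cohen_kappa_score, mean_squared_error

    y_true = np.array([2, 1, 3, 2])          # hypothetical gold organization scores
    y_pred = np.array([2.2, 0.8, 2.6, 2.1])  # hypothetical model outputs

    mse = mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)
    qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
    print(f"Mse: {mse:.4f}  Rmse: {rmse:.4f}  Qwk: {qwk:.4f}")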

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)