ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the fine-tuning dataset is not documented here). It achieves the following results on the evaluation set:

  • Loss: 0.9376
  • Qwk (quadratic weighted kappa): 0.1808
  • Mse (mean squared error): 0.9376
  • Rmse (root mean squared error): 0.9683
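
The reported MSE and RMSE are consistent with each other, since RMSE is by definition the square root of MSE. A quick check with the values from this card:

```python
import math

# Evaluation metrics reported on this card.
mse = 0.9376
rmse = 0.9683

# RMSE is the square root of MSE; the reported pair agrees to rounding.
computed_rmse = math.sqrt(mse)
assert abs(computed_rmse - rmse) < 1e-3
print(round(computed_rmse, 4))  # 0.9683
```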

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
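
With lr_scheduler_type: linear and no warmup reported, the learning rate decays linearly from 2e-05 to zero over the run. A minimal sketch of that schedule (treating the run's final logged step, 514, as the horizon is an assumption for illustration; the true horizon depends on the dataset size):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linearly decay base_lr to zero over total_steps (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0, 514))    # 2e-05 at the first step
print(linear_lr(257, 514))  # 1e-05 halfway through
print(linear_lr(514, 514))  # 0.0 at the last step
```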

Training results

"No log" in the Training Loss column below means the training loss had not yet been logged at that point; it is first reported at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1176 2 2.7039 -0.0084 2.7039 1.6444
No log 0.2353 4 1.3829 0.0750 1.3829 1.1760
No log 0.3529 6 1.2347 -0.0424 1.2347 1.1112
No log 0.4706 8 1.2462 -0.0745 1.2462 1.1163
No log 0.5882 10 0.9989 0.1101 0.9989 0.9994
No log 0.7059 12 0.9430 0.2552 0.9430 0.9711
No log 0.8235 14 0.9603 0.2601 0.9603 0.9799
No log 0.9412 16 0.9669 0.3051 0.9669 0.9833
No log 1.0588 18 0.8417 0.1504 0.8417 0.9174
No log 1.1765 20 0.8093 0.0344 0.8093 0.8996
No log 1.2941 22 0.7963 0.0444 0.7963 0.8923
No log 1.4118 24 0.8065 0.0893 0.8065 0.8981
No log 1.5294 26 0.7688 0.1282 0.7688 0.8768
No log 1.6471 28 0.7438 0.1699 0.7438 0.8624
No log 1.7647 30 0.7991 0.1718 0.7991 0.8939
No log 1.8824 32 0.8951 0.2615 0.8951 0.9461
No log 2.0 34 0.9059 0.2463 0.9059 0.9518
No log 2.1176 36 0.8511 0.1103 0.8511 0.9226
No log 2.2353 38 0.8024 0.0327 0.8024 0.8958
No log 2.3529 40 0.7912 0.0327 0.7912 0.8895
No log 2.4706 42 0.8161 0.1416 0.8161 0.9034
No log 2.5882 44 0.8949 0.2871 0.8949 0.9460
No log 2.7059 46 0.9467 0.2492 0.9467 0.9730
No log 2.8235 48 0.9555 0.2812 0.9555 0.9775
No log 2.9412 50 0.8588 0.2692 0.8588 0.9267
No log 3.0588 52 0.8014 0.1673 0.8014 0.8952
No log 3.1765 54 0.8012 0.0771 0.8012 0.8951
No log 3.2941 56 0.8111 0.1093 0.8111 0.9006
No log 3.4118 58 0.8326 0.1673 0.8326 0.9125
No log 3.5294 60 0.9259 0.2142 0.9259 0.9622
No log 3.6471 62 1.1553 0.1203 1.1553 1.0748
No log 3.7647 64 1.1942 0.1093 1.1942 1.0928
No log 3.8824 66 1.0865 0.0324 1.0865 1.0424
No log 4.0 68 0.9142 0.1409 0.9142 0.9562
No log 4.1176 70 0.8592 0.2353 0.8592 0.9269
No log 4.2353 72 0.8291 0.2043 0.8291 0.9106
No log 4.3529 74 0.8387 0.2319 0.8387 0.9158
No log 4.4706 76 0.9273 0.1254 0.9273 0.9630
No log 4.5882 78 0.9766 0.1461 0.9766 0.9882
No log 4.7059 80 0.9712 0.1493 0.9712 0.9855
No log 4.8235 82 1.0125 0.1694 1.0125 1.0062
No log 4.9412 84 1.0159 0.1693 1.0159 1.0079
No log 5.0588 86 1.0408 0.1032 1.0408 1.0202
No log 5.1765 88 1.0518 0.2314 1.0518 1.0256
No log 5.2941 90 1.0162 0.3011 1.0162 1.0081
No log 5.4118 92 0.9902 0.1518 0.9902 0.9951
No log 5.5294 94 1.0004 0.1742 1.0004 1.0002
No log 5.6471 96 1.0128 0.1448 1.0128 1.0064
No log 5.7647 98 1.0365 0.2139 1.0365 1.0181
No log 5.8824 100 1.0055 0.2274 1.0055 1.0028
No log 6.0 102 0.9864 0.2365 0.9864 0.9932
No log 6.1176 104 1.0092 0.2475 1.0092 1.0046
No log 6.2353 106 1.0202 0.2023 1.0202 1.0100
No log 6.3529 108 0.9736 0.2077 0.9736 0.9867
No log 6.4706 110 0.9395 0.1961 0.9395 0.9693
No log 6.5882 112 0.8480 0.3023 0.8480 0.9209
No log 6.7059 114 0.8115 0.3498 0.8115 0.9008
No log 6.8235 116 0.8052 0.1697 0.8052 0.8973
No log 6.9412 118 0.8468 0.1710 0.8468 0.9202
No log 7.0588 120 0.8381 0.3409 0.8381 0.9155
No log 7.1765 122 0.9156 0.2917 0.9156 0.9569
No log 7.2941 124 1.0355 0.2658 1.0355 1.0176
No log 7.4118 126 1.0605 0.1770 1.0605 1.0298
No log 7.5294 128 1.0521 0.1733 1.0521 1.0257
No log 7.6471 130 1.0352 0.2167 1.0352 1.0174
No log 7.7647 132 0.9548 0.2539 0.9548 0.9771
No log 7.8824 134 0.8853 0.2232 0.8853 0.9409
No log 8.0 136 0.8759 0.3594 0.8759 0.9359
No log 8.1176 138 0.9285 0.3377 0.9285 0.9636
No log 8.2353 140 0.9977 0.3154 0.9977 0.9988
No log 8.3529 142 0.9980 0.2698 0.9980 0.9990
No log 8.4706 144 0.9728 0.2849 0.9728 0.9863
No log 8.5882 146 0.9557 0.2774 0.9557 0.9776
No log 8.7059 148 1.0051 0.2533 1.0051 1.0025
No log 8.8235 150 1.0248 0.2911 1.0248 1.0123
No log 8.9412 152 1.0019 0.2795 1.0019 1.0009
No log 9.0588 154 0.9643 0.1945 0.9643 0.9820
No log 9.1765 156 0.9677 0.1791 0.9677 0.9837
No log 9.2941 158 0.9645 0.2593 0.9645 0.9821
No log 9.4118 160 0.9187 0.2389 0.9187 0.9585
No log 9.5294 162 0.9054 0.2843 0.9054 0.9515
No log 9.6471 164 0.9283 0.2009 0.9283 0.9635
No log 9.7647 166 1.0013 0.1969 1.0013 1.0007
No log 9.8824 168 1.1325 0.1920 1.1325 1.0642
No log 10.0 170 1.1271 0.1779 1.1271 1.0617
No log 10.1176 172 1.0577 0.1707 1.0577 1.0284
No log 10.2353 174 0.9340 0.1448 0.9340 0.9665
No log 10.3529 176 0.8945 0.2305 0.8945 0.9458
No log 10.4706 178 0.9307 0.2013 0.9307 0.9647
No log 10.5882 180 0.9764 0.1304 0.9764 0.9881
No log 10.7059 182 0.9657 0.1930 0.9657 0.9827
No log 10.8235 184 0.9507 0.2335 0.9507 0.9750
No log 10.9412 186 0.9348 0.2058 0.9348 0.9669
No log 11.0588 188 0.9231 0.2414 0.9231 0.9608
No log 11.1765 190 0.9414 0.2392 0.9414 0.9702
No log 11.2941 192 1.0479 0.1521 1.0479 1.0237
No log 11.4118 194 1.0728 0.0346 1.0728 1.0357
No log 11.5294 196 1.0651 0.0368 1.0651 1.0320
No log 11.6471 198 1.0797 0.1680 1.0797 1.0391
No log 11.7647 200 0.9458 0.2467 0.9458 0.9725
No log 11.8824 202 0.8637 0.2802 0.8637 0.9293
No log 12.0 204 0.8765 0.1725 0.8765 0.9362
No log 12.1176 206 0.8505 0.2404 0.8505 0.9222
No log 12.2353 208 0.8914 0.1900 0.8914 0.9441
No log 12.3529 210 0.9677 0.2467 0.9677 0.9837
No log 12.4706 212 1.0227 0.2468 1.0227 1.0113
No log 12.5882 214 0.9790 0.2772 0.9790 0.9895
No log 12.7059 216 0.9000 0.3349 0.9000 0.9487
No log 12.8235 218 0.8882 0.2966 0.8882 0.9425
No log 12.9412 220 0.9377 0.2993 0.9377 0.9683
No log 13.0588 222 1.1014 0.1636 1.1014 1.0495
No log 13.1765 224 1.2188 0.1536 1.2188 1.1040
No log 13.2941 226 1.1825 0.1313 1.1825 1.0874
No log 13.4118 228 1.0846 0.1886 1.0846 1.0414
No log 13.5294 230 0.9749 0.2399 0.9749 0.9874
No log 13.6471 232 0.9074 0.2687 0.9074 0.9526
No log 13.7647 234 0.8845 0.2458 0.8845 0.9405
No log 13.8824 236 0.8996 0.3146 0.8996 0.9485
No log 14.0 238 0.9123 0.2866 0.9123 0.9551
No log 14.1176 240 0.9207 0.3248 0.9207 0.9595
No log 14.2353 242 0.9659 0.2059 0.9659 0.9828
No log 14.3529 244 1.0538 0.0955 1.0538 1.0265
No log 14.4706 246 1.1063 0.0561 1.1063 1.0518
No log 14.5882 248 1.0968 0.0357 1.0968 1.0473
No log 14.7059 250 1.0098 0.0837 1.0098 1.0049
No log 14.8235 252 0.9570 0.1566 0.9570 0.9783
No log 14.9412 254 0.9365 0.2153 0.9365 0.9677
No log 15.0588 256 0.9487 0.2397 0.9487 0.9740
No log 15.1765 258 0.9556 0.2127 0.9556 0.9776
No log 15.2941 260 0.9937 0.2912 0.9937 0.9968
No log 15.4118 262 0.9694 0.3149 0.9694 0.9846
No log 15.5294 264 0.9729 0.2861 0.9729 0.9864
No log 15.6471 266 1.0901 0.1962 1.0901 1.0441
No log 15.7647 268 1.3213 0.1323 1.3213 1.1495
No log 15.8824 270 1.3995 0.1418 1.3994 1.1830
No log 16.0 272 1.2939 0.1146 1.2939 1.1375
No log 16.1176 274 1.1581 0.0896 1.1581 1.0762
No log 16.2353 276 1.0290 0.1339 1.0290 1.0144
No log 16.3529 278 0.9560 0.2643 0.9560 0.9778
No log 16.4706 280 0.9158 0.2102 0.9158 0.9570
No log 16.5882 282 0.9050 0.1816 0.9050 0.9513
No log 16.7059 284 0.8966 0.1307 0.8966 0.9469
No log 16.8235 286 0.8926 0.2053 0.8926 0.9448
No log 16.9412 288 0.9269 0.2415 0.9269 0.9628
No log 17.0588 290 0.9571 0.2059 0.9571 0.9783
No log 17.1765 292 0.9322 0.2415 0.9322 0.9655
No log 17.2941 294 0.8788 0.2467 0.8788 0.9374
No log 17.4118 296 0.8569 0.2960 0.8569 0.9257
No log 17.5294 298 0.8650 0.3482 0.8650 0.9301
No log 17.6471 300 0.8728 0.3937 0.8728 0.9342
No log 17.7647 302 0.8764 0.4091 0.8764 0.9362
No log 17.8824 304 0.8731 0.3409 0.8731 0.9344
No log 18.0 306 0.8604 0.3395 0.8604 0.9276
No log 18.1176 308 0.8351 0.4363 0.8351 0.9139
No log 18.2353 310 0.8095 0.4240 0.8095 0.8997
No log 18.3529 312 0.8172 0.4374 0.8172 0.9040
No log 18.4706 314 0.8816 0.2723 0.8816 0.9389
No log 18.5882 316 0.9293 0.2662 0.9293 0.9640
No log 18.7059 318 1.0430 0.0953 1.0430 1.0213
No log 18.8235 320 1.1546 0.1057 1.1546 1.0745
No log 18.9412 322 1.2527 0.2177 1.2527 1.1192
No log 19.0588 324 1.1692 0.1290 1.1692 1.0813
No log 19.1765 326 1.0240 0.1274 1.0240 1.0119
No log 19.2941 328 0.8984 0.2604 0.8984 0.9478
No log 19.4118 330 0.8600 0.2960 0.8600 0.9274
No log 19.5294 332 0.8368 0.3299 0.8368 0.9148
No log 19.6471 334 0.8340 0.4190 0.8340 0.9133
No log 19.7647 336 0.8384 0.3601 0.8384 0.9156
No log 19.8824 338 0.8419 0.3259 0.8419 0.9176
No log 20.0 340 0.8374 0.3648 0.8374 0.9151
No log 20.1176 342 0.8519 0.3575 0.8519 0.9230
No log 20.2353 344 0.8574 0.3980 0.8574 0.9260
No log 20.3529 346 0.8582 0.3441 0.8582 0.9264
No log 20.4706 348 0.8423 0.4147 0.8423 0.9178
No log 20.5882 350 0.8418 0.3274 0.8418 0.9175
No log 20.7059 352 0.8865 0.3305 0.8865 0.9415
No log 20.8235 354 0.9561 0.2156 0.9561 0.9778
No log 20.9412 356 0.9807 0.2156 0.9807 0.9903
No log 21.0588 358 0.9322 0.2696 0.9322 0.9655
No log 21.1765 360 0.9006 0.2724 0.9006 0.9490
No log 21.2941 362 0.9135 0.2975 0.9135 0.9558
No log 21.4118 364 0.9168 0.2975 0.9168 0.9575
No log 21.5294 366 0.9911 0.3314 0.9911 0.9955
No log 21.6471 368 1.0553 0.2881 1.0553 1.0273
No log 21.7647 370 1.1105 0.2794 1.1105 1.0538
No log 21.8824 372 1.1843 0.2184 1.1843 1.0882
No log 22.0 374 1.1229 0.1287 1.1229 1.0597
No log 22.1176 376 0.9595 0.2013 0.9595 0.9795
No log 22.2353 378 0.8798 0.2576 0.8798 0.9380
No log 22.3529 380 0.8429 0.2283 0.8429 0.9181
No log 22.4706 382 0.8605 0.3121 0.8605 0.9276
No log 22.5882 384 0.8596 0.3060 0.8596 0.9271
No log 22.7059 386 0.8872 0.2342 0.8872 0.9419
No log 22.8235 388 0.8885 0.1557 0.8885 0.9426
No log 22.9412 390 0.8368 0.2090 0.8368 0.9148
No log 23.0588 392 0.8159 0.2203 0.8159 0.9033
No log 23.1765 394 0.8248 0.1839 0.8248 0.9082
No log 23.2941 396 0.8600 0.2495 0.8600 0.9274
No log 23.4118 398 0.8626 0.3095 0.8626 0.9288
No log 23.5294 400 0.8351 0.1683 0.8351 0.9138
No log 23.6471 402 0.8232 0.1773 0.8232 0.9073
No log 23.7647 404 0.8009 0.1970 0.8009 0.8949
No log 23.8824 406 0.8128 0.1935 0.8128 0.9016
No log 24.0 408 0.8460 0.2253 0.8460 0.9198
No log 24.1176 410 0.9215 0.2105 0.9215 0.9599
No log 24.2353 412 0.9454 0.2059 0.9454 0.9723
No log 24.3529 414 0.9147 0.2059 0.9147 0.9564
No log 24.4706 416 0.8426 0.3525 0.8426 0.9180
No log 24.5882 418 0.7955 0.2895 0.7955 0.8919
No log 24.7059 420 0.7885 0.3654 0.7885 0.8880
No log 24.8235 422 0.7906 0.2929 0.7906 0.8892
No log 24.9412 424 0.8348 0.3088 0.8348 0.9137
No log 25.0588 426 0.8989 0.2670 0.8989 0.9481
No log 25.1765 428 0.8813 0.2643 0.8813 0.9388
No log 25.2941 430 0.8850 0.2643 0.8850 0.9407
No log 25.4118 432 0.8452 0.3355 0.8452 0.9194
No log 25.5294 434 0.8296 0.3498 0.8296 0.9108
No log 25.6471 436 0.8518 0.3221 0.8518 0.9229
No log 25.7647 438 0.8752 0.2943 0.8752 0.9355
No log 25.8824 440 0.9366 0.2590 0.9366 0.9678
No log 26.0 442 0.9438 0.2247 0.9438 0.9715
No log 26.1176 444 0.9130 0.2943 0.9130 0.9555
No log 26.2353 446 0.8628 0.3776 0.8628 0.9289
No log 26.3529 448 0.7889 0.3936 0.7889 0.8882
No log 26.4706 450 0.7689 0.4029 0.7689 0.8769
No log 26.5882 452 0.8277 0.3440 0.8277 0.9098
No log 26.7059 454 0.8778 0.2952 0.8778 0.9369
No log 26.8235 456 0.8818 0.3417 0.8818 0.9390
No log 26.9412 458 0.8247 0.3248 0.8247 0.9081
No log 27.0588 460 0.8104 0.2843 0.8104 0.9002
No log 27.1765 462 0.8109 0.3498 0.8109 0.9005
No log 27.2941 464 0.8376 0.3544 0.8376 0.9152
No log 27.4118 466 0.8394 0.3095 0.8394 0.9162
No log 27.5294 468 0.8249 0.3544 0.8249 0.9083
No log 27.6471 470 0.8169 0.3942 0.8169 0.9038
No log 27.7647 472 0.8075 0.3942 0.8075 0.8986
No log 27.8824 474 0.7991 0.3088 0.7991 0.8939
No log 28.0 476 0.8002 0.3088 0.8002 0.8945
No log 28.1176 478 0.8271 0.2995 0.8271 0.9095
No log 28.2353 480 0.8700 0.1461 0.8700 0.9327
No log 28.3529 482 0.8822 0.1501 0.8822 0.9392
No log 28.4706 484 0.8900 0.1501 0.8900 0.9434
No log 28.5882 486 0.9005 0.1765 0.9005 0.9489
No log 28.7059 488 0.8475 0.3425 0.8475 0.9206
No log 28.8235 490 0.7966 0.3081 0.7966 0.8925
No log 28.9412 492 0.7960 0.2373 0.7960 0.8922
No log 29.0588 494 0.7975 0.2652 0.7975 0.8930
No log 29.1765 496 0.7997 0.3239 0.7997 0.8943
No log 29.2941 498 0.8589 0.3798 0.8589 0.9268
0.3446 29.4118 500 0.9252 0.3389 0.9252 0.9619
0.3446 29.5294 502 0.9228 0.3446 0.9228 0.9606
0.3446 29.6471 504 0.8577 0.3500 0.8577 0.9261
0.3446 29.7647 506 0.8198 0.3274 0.8198 0.9054
0.3446 29.8824 508 0.8308 0.3051 0.8308 0.9115
0.3446 30.0 510 0.8613 0.3248 0.8613 0.9281
0.3446 30.1176 512 0.9123 0.1839 0.9123 0.9552
0.3446 30.2353 514 0.9376 0.1808 0.9376 0.9683
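
Qwk in the tables above is the quadratic weighted kappa, which scores agreement between predicted and reference ordinal ratings while penalizing disagreements by the squared distance between levels. A self-contained sketch of the metric (the toy 0–4 rating scale below is illustrative; the task's actual label set is not documented here):

```python
def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Cohen's kappa with quadratic weights over an ordinal label set."""
    n = len(y_true)
    # Observed confusion matrix of reference vs. predicted ratings.
    observed = [[0.0] * num_labels for _ in range(num_labels)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    true_counts = [list(y_true).count(k) for k in range(num_labels)]
    pred_counts = [list(y_pred).count(k) for k in range(num_labels)]
    numer = denom = 0.0
    for i in range(num_labels):
        for j in range(num_labels):
            # Quadratic penalty grows with the distance between levels.
            w = (i - j) ** 2 / (num_labels - 1) ** 2
            numer += w * observed[i][j]
            denom += w * true_counts[i] * pred_counts[j] / n
    return 1.0 - numer / denom

# Perfect agreement scores 1.0; values near 0 indicate chance-level agreement.
print(quadratic_weighted_kappa([0, 1, 2, 3, 4], [0, 1, 2, 3, 4], 5))  # 1.0
```

By this reading, the final Qwk of 0.1808 indicates only weak agreement between the model's scores and the reference ratings.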

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Repository: MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task7_organization
  • Fine-tuned from: aubmindlab/bert-base-arabertv02
  • Model size: 0.1B params
  • Tensor type: F32 (Safetensors)