ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k17_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0638
  • Qwk: 0.1210
  • Mse: 1.0638
  • Rmse: 1.0314
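The metrics above (QWK, MSE, RMSE) can be computed with scikit-learn; this is a minimal sketch using made-up label arrays, not the actual evaluation data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold organization scores and model predictions
# (illustrative only; the real evaluation labels are not published here).
y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0, 2, 2, 1, 3, 1])

# Quadratic Weighted Kappa: chance-corrected agreement with quadratic
# penalties for larger score disagreements.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# Mean squared error and its square root.
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```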

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
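These settings correspond to the standard Hugging Face Trainer configuration; below is a hedged sketch of that mapping written as a plain dict, with keys following the likely transformers.TrainingArguments field names (the output_dir is a placeholder, not from this run).

```python
# Sketch: the hyperparameters above, keyed by their likely
# transformers.TrainingArguments field names.
training_args = {
    "output_dir": "./results",          # placeholder path
    "learning_rate": 2e-05,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,                  # Adam with betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```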

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.05 2 2.7815 -0.0481 2.7815 1.6678
No log 0.1 4 1.7927 0.0061 1.7927 1.3389
No log 0.15 6 2.0211 -0.1653 2.0211 1.4217
No log 0.2 8 1.3569 -0.1328 1.3569 1.1648
No log 0.25 10 1.0144 0.0054 1.0144 1.0072
No log 0.3 12 0.9010 0.1461 0.9010 0.9492
No log 0.35 14 0.9014 0.1534 0.9014 0.9494
No log 0.4 16 0.8922 0.1636 0.8922 0.9445
No log 0.45 18 0.8487 0.0679 0.8487 0.9212
No log 0.5 20 0.9147 0.1511 0.9147 0.9564
No log 0.55 22 1.0285 0.1259 1.0285 1.0141
No log 0.6 24 1.0813 0.0986 1.0813 1.0398
No log 0.65 26 0.8837 0.2132 0.8837 0.9401
No log 0.7 28 0.7961 0.0937 0.7961 0.8922
No log 0.75 30 0.7689 0.0481 0.7689 0.8769
No log 0.8 32 0.7462 0.0481 0.7462 0.8638
No log 0.85 34 0.7381 0.0884 0.7381 0.8591
No log 0.9 36 0.7520 0.0 0.7520 0.8672
No log 0.95 38 0.7783 0.0481 0.7783 0.8822
No log 1.0 40 0.7709 0.0 0.7709 0.8780
No log 1.05 42 0.7433 0.0 0.7433 0.8622
No log 1.1 44 0.7383 0.0 0.7383 0.8592
No log 1.15 46 0.7397 0.0 0.7397 0.8601
No log 1.2 48 0.7337 0.0884 0.7337 0.8566
No log 1.25 50 0.7301 0.1236 0.7301 0.8544
No log 1.3 52 0.7242 0.1456 0.7242 0.8510
No log 1.35 54 0.7434 0.1807 0.7434 0.8622
No log 1.4 56 0.7364 0.1508 0.7364 0.8581
No log 1.45 58 0.7273 0.1187 0.7273 0.8528
No log 1.5 60 0.7258 0.0840 0.7258 0.8520
No log 1.55 62 0.7335 0.0444 0.7335 0.8565
No log 1.6 64 0.7513 0.0937 0.7513 0.8668
No log 1.65 66 0.7398 0.0481 0.7398 0.8601
No log 1.7 68 0.7442 0.0 0.7442 0.8627
No log 1.75 70 0.7480 0.0 0.7480 0.8649
No log 1.8 72 0.7431 -0.0027 0.7431 0.8620
No log 1.85 74 0.7483 0.0893 0.7483 0.8651
No log 1.9 76 0.7506 0.0026 0.7506 0.8664
No log 1.95 78 0.7455 0.0026 0.7455 0.8634
No log 2.0 80 0.7296 0.0764 0.7296 0.8542
No log 2.05 82 0.7244 0.0410 0.7244 0.8511
No log 2.1 84 0.7185 0.0481 0.7185 0.8476
No log 2.15 86 0.7201 0.0481 0.7201 0.8486
No log 2.2 88 0.7684 0.0688 0.7684 0.8766
No log 2.25 90 0.8472 -0.0047 0.8472 0.9204
No log 2.3 92 0.9219 0.0336 0.9219 0.9602
No log 2.35 94 0.8593 0.0661 0.8593 0.9270
No log 2.4 96 0.7927 0.1448 0.7927 0.8903
No log 2.45 98 0.7396 0.2158 0.7396 0.8600
No log 2.5 100 0.7441 0.2158 0.7441 0.8626
No log 2.55 102 0.7275 0.1867 0.7275 0.8529
No log 2.6 104 0.7325 0.2509 0.7325 0.8559
No log 2.65 106 0.7702 0.2218 0.7702 0.8776
No log 2.7 108 0.7711 0.2158 0.7711 0.8781
No log 2.75 110 0.7584 0.2158 0.7584 0.8709
No log 2.8 112 0.7625 0.2158 0.7625 0.8732
No log 2.85 114 0.7761 0.2413 0.7761 0.8810
No log 2.9 116 0.7775 0.1901 0.7775 0.8817
No log 2.95 118 0.7895 0.2847 0.7895 0.8885
No log 3.0 120 0.7612 0.1624 0.7612 0.8725
No log 3.05 122 0.7446 0.2158 0.7446 0.8629
No log 3.1 124 0.7593 0.1010 0.7593 0.8714
No log 3.15 126 0.8075 0.0971 0.8075 0.8986
No log 3.2 128 0.7974 0.0971 0.7974 0.8930
No log 3.25 130 0.7765 0.0697 0.7765 0.8812
No log 3.3 132 0.7984 0.1051 0.7984 0.8935
No log 3.35 134 0.9101 0.2149 0.9101 0.9540
No log 3.4 136 1.0358 0.2521 1.0358 1.0178
No log 3.45 138 1.0472 0.2364 1.0472 1.0233
No log 3.5 140 0.9628 0.1995 0.9628 0.9812
No log 3.55 142 0.9067 0.0584 0.9067 0.9522
No log 3.6 144 0.8637 0.0697 0.8637 0.9293
No log 3.65 146 0.9336 0.0975 0.9336 0.9662
No log 3.7 148 0.9420 0.0856 0.9420 0.9706
No log 3.75 150 0.9851 0.2193 0.9851 0.9925
No log 3.8 152 0.9770 0.2892 0.9770 0.9884
No log 3.85 154 0.9242 0.2439 0.9242 0.9614
No log 3.9 156 0.8655 0.2943 0.8655 0.9303
No log 3.95 158 0.8632 0.3369 0.8632 0.9291
No log 4.0 160 0.9113 0.2912 0.9113 0.9546
No log 4.05 162 0.8748 0.3115 0.8748 0.9353
No log 4.1 164 0.8484 0.3157 0.8484 0.9211
No log 4.15 166 0.8875 0.2059 0.8875 0.9421
No log 4.2 168 0.8397 0.2662 0.8397 0.9163
No log 4.25 170 0.7146 0.3020 0.7146 0.8454
No log 4.3 172 0.6767 0.1829 0.6767 0.8226
No log 4.35 174 0.6851 0.2181 0.6851 0.8277
No log 4.4 176 0.7400 0.4052 0.7400 0.8603
No log 4.45 178 0.8557 0.4251 0.8557 0.9250
No log 4.5 180 0.8616 0.3754 0.8616 0.9282
No log 4.55 182 0.8288 0.4251 0.8288 0.9104
No log 4.6 184 0.7762 0.3167 0.7762 0.8810
No log 4.65 186 0.7737 0.3167 0.7737 0.8796
No log 4.7 188 0.7623 0.3622 0.7623 0.8731
No log 4.75 190 0.7501 0.3341 0.7501 0.8661
No log 4.8 192 0.7490 0.2950 0.7490 0.8654
No log 4.85 194 0.7781 0.4884 0.7781 0.8821
No log 4.9 196 0.7830 0.5120 0.7830 0.8849
No log 4.95 198 0.8065 0.3789 0.8065 0.8980
No log 5.0 200 0.8066 0.2950 0.8066 0.8981
No log 5.05 202 0.8577 0.2967 0.8577 0.9261
No log 5.1 204 0.8968 0.4462 0.8968 0.9470
No log 5.15 206 0.8145 0.3372 0.8145 0.9025
No log 5.2 208 0.7696 0.2847 0.7696 0.8773
No log 5.25 210 0.8335 0.4224 0.8335 0.9130
No log 5.3 212 0.9054 0.3312 0.9054 0.9515
No log 5.35 214 0.9655 0.3727 0.9655 0.9826
No log 5.4 216 0.9311 0.3012 0.9311 0.9650
No log 5.45 218 0.9400 0.2779 0.9400 0.9696
No log 5.5 220 0.8975 0.1029 0.8975 0.9474
No log 5.55 222 0.8744 0.1577 0.8744 0.9351
No log 5.6 224 0.8183 0.1624 0.8183 0.9046
No log 5.65 226 0.7530 0.3127 0.7530 0.8677
No log 5.7 228 0.7639 0.3399 0.7639 0.8740
No log 5.75 230 0.8319 0.3425 0.8319 0.9121
No log 5.8 232 0.9150 0.3579 0.9150 0.9566
No log 5.85 234 1.0053 0.3059 1.0053 1.0027
No log 5.9 236 1.0216 0.3059 1.0216 1.0107
No log 5.95 238 0.9655 0.2886 0.9655 0.9826
No log 6.0 240 0.9886 0.2886 0.9886 0.9943
No log 6.05 242 1.0009 0.3247 1.0009 1.0005
No log 6.1 244 0.8882 0.2923 0.8882 0.9425
No log 6.15 246 0.8587 0.2574 0.8587 0.9266
No log 6.2 248 0.8357 0.3545 0.8357 0.9142
No log 6.25 250 0.8439 0.2722 0.8439 0.9187
No log 6.3 252 0.8608 0.2632 0.8608 0.9278
No log 6.35 254 0.9157 0.4113 0.9157 0.9569
No log 6.4 256 0.9078 0.4113 0.9078 0.9528
No log 6.45 258 0.8582 0.4255 0.8582 0.9264
No log 6.5 260 0.8166 0.3169 0.8166 0.9037
No log 6.55 262 0.8785 0.3359 0.8785 0.9373
No log 6.6 264 0.9918 0.3636 0.9918 0.9959
No log 6.65 266 1.0106 0.2977 1.0106 1.0053
No log 6.7 268 0.9642 0.3417 0.9642 0.9820
No log 6.75 270 0.9529 0.3417 0.9529 0.9761
No log 6.8 272 0.9829 0.2977 0.9829 0.9914
No log 6.85 274 1.0113 0.2876 1.0113 1.0056
No log 6.9 276 1.0778 0.2264 1.0778 1.0382
No log 6.95 278 1.1308 0.1995 1.1308 1.0634
No log 7.0 280 1.0436 0.2153 1.0436 1.0216
No log 7.05 282 1.0113 0.3203 1.0113 1.0056
No log 7.1 284 1.0281 0.3557 1.0281 1.0140
No log 7.15 286 0.9954 0.3557 0.9954 0.9977
No log 7.2 288 0.8996 0.5224 0.8996 0.9485
No log 7.25 290 0.8518 0.4144 0.8518 0.9229
No log 7.3 292 0.8686 0.3991 0.8686 0.9320
No log 7.35 294 0.9693 0.3804 0.9693 0.9845
No log 7.4 296 1.0446 0.2683 1.0446 1.0220
No log 7.45 298 1.0021 0.3059 1.0021 1.0010
No log 7.5 300 0.8985 0.5077 0.8985 0.9479
No log 7.55 302 0.8730 0.3819 0.8730 0.9344
No log 7.6 304 0.9193 0.4183 0.9193 0.9588
No log 7.65 306 0.9622 0.3827 0.9622 0.9809
No log 7.7 308 0.9313 0.3909 0.9313 0.9650
No log 7.75 310 0.8382 0.2691 0.8382 0.9155
No log 7.8 312 0.7896 0.2270 0.7896 0.8886
No log 7.85 314 0.7760 0.1970 0.7760 0.8809
No log 7.9 316 0.7893 0.1953 0.7893 0.8884
No log 7.95 318 0.9185 0.3319 0.9185 0.9584
No log 8.0 320 1.1194 0.1805 1.1194 1.0580
No log 8.05 322 1.2948 0.1928 1.2948 1.1379
No log 8.1 324 1.2932 0.1772 1.2932 1.1372
No log 8.15 326 1.1206 0.2006 1.1206 1.0586
No log 8.2 328 0.9192 0.3643 0.9192 0.9588
No log 8.25 330 0.8041 0.3302 0.8041 0.8967
No log 8.3 332 0.7928 0.2847 0.7928 0.8904
No log 8.35 334 0.8269 0.3060 0.8269 0.9094
No log 8.4 336 0.8679 0.3032 0.8679 0.9316
No log 8.45 338 0.8737 0.3425 0.8737 0.9347
No log 8.5 340 0.9036 0.3042 0.9036 0.9506
No log 8.55 342 0.9226 0.3231 0.9226 0.9605
No log 8.6 344 0.8918 0.3359 0.8918 0.9444
No log 8.65 346 0.8919 0.3494 0.8919 0.9444
No log 8.7 348 0.8747 0.3494 0.8747 0.9353
No log 8.75 350 0.8822 0.3494 0.8822 0.9392
No log 8.8 352 0.9251 0.3425 0.9251 0.9618
No log 8.85 354 0.9902 0.1948 0.9902 0.9951
No log 8.9 356 1.0493 0.2254 1.0493 1.0244
No log 8.95 358 1.1347 0.2101 1.1347 1.0652
No log 9.0 360 1.1084 0.2065 1.1084 1.0528
No log 9.05 362 0.9892 0.3740 0.9892 0.9946
No log 9.1 364 0.8581 0.2722 0.8581 0.9263
No log 9.15 366 0.8147 0.1850 0.8147 0.9026
No log 9.2 368 0.8229 0.1222 0.8229 0.9071
No log 9.25 370 0.8574 0.2558 0.8574 0.9260
No log 9.3 372 0.9153 0.2812 0.9153 0.9567
No log 9.35 374 1.0103 0.2482 1.0103 1.0052
No log 9.4 376 1.0035 0.2659 1.0035 1.0018
No log 9.45 378 0.8795 0.3294 0.8795 0.9378
No log 9.5 380 0.7655 0.3594 0.7655 0.8749
No log 9.55 382 0.7229 0.2787 0.7229 0.8502
No log 9.6 384 0.7385 0.3092 0.7385 0.8593
No log 9.65 386 0.8228 0.3444 0.8228 0.9071
No log 9.7 388 0.9977 0.2507 0.9977 0.9988
No log 9.75 390 1.1586 0.2110 1.1586 1.0764
No log 9.8 392 1.0844 0.2681 1.0844 1.0414
No log 9.85 394 0.9994 0.1732 0.9994 0.9997
No log 9.9 396 1.2277 0.1943 1.2277 1.1080
No log 9.95 398 1.2390 0.1630 1.2390 1.1131
No log 10.0 400 0.9586 0.2273 0.9586 0.9791
No log 10.05 402 0.8407 0.1886 0.8407 0.9169
No log 10.1 404 0.8639 0.3302 0.8639 0.9294
No log 10.15 406 0.8913 0.4387 0.8913 0.9441
No log 10.2 408 0.8495 0.3564 0.8495 0.9217
No log 10.25 410 0.8209 0.2467 0.8209 0.9061
No log 10.3 412 0.8366 0.2784 0.8366 0.9147
No log 10.35 414 0.9088 0.2651 0.9088 0.9533
No log 10.4 416 1.0128 0.1935 1.0128 1.0064
No log 10.45 418 1.1477 0.2264 1.1477 1.0713
No log 10.5 420 1.1282 0.2223 1.1282 1.0622
No log 10.55 422 1.0860 0.1262 1.0860 1.0421
No log 10.6 424 1.0206 0.1626 1.0206 1.0103
No log 10.65 426 0.9140 0.3231 0.9140 0.9560
No log 10.7 428 0.8787 0.2463 0.8787 0.9374
No log 10.75 430 0.8418 0.2967 0.8418 0.9175
No log 10.8 432 0.8436 0.2967 0.8436 0.9185
No log 10.85 434 0.8496 0.2967 0.8496 0.9217
No log 10.9 436 0.9454 0.2411 0.9454 0.9723
No log 10.95 438 1.0114 0.1870 1.0114 1.0057
No log 11.0 440 1.0724 0.1427 1.0724 1.0355
No log 11.05 442 1.0042 0.2703 1.0042 1.0021
No log 11.1 444 0.8957 0.2463 0.8957 0.9464
No log 11.15 446 0.8300 0.3032 0.8300 0.9110
No log 11.2 448 0.8627 0.3234 0.8627 0.9288
No log 11.25 450 0.9133 0.2463 0.9133 0.9557
No log 11.3 452 0.9451 0.2094 0.9451 0.9721
No log 11.35 454 0.9505 0.2046 0.9505 0.9749
No log 11.4 456 0.9980 0.1277 0.9980 0.9990
No log 11.45 458 1.0005 0.1827 1.0005 1.0002
No log 11.5 460 0.9608 0.2046 0.9608 0.9802
No log 11.55 462 0.8894 0.2116 0.8894 0.9431
No log 11.6 464 0.8586 0.2171 0.8586 0.9266
No log 11.65 466 0.8538 0.2171 0.8538 0.9240
No log 11.7 468 0.9072 0.2116 0.9072 0.9524
No log 11.75 470 0.9803 0.1584 0.9803 0.9901
No log 11.8 472 0.9825 0.1962 0.9825 0.9912
No log 11.85 474 1.0236 0.1217 1.0236 1.0117
No log 11.9 476 1.0833 0.1077 1.0833 1.0408
No log 11.95 478 1.1497 0.1115 1.1497 1.0722
No log 12.0 480 1.1607 0.0518 1.1607 1.0774
No log 12.05 482 1.1690 0.0497 1.1690 1.0812
No log 12.1 484 1.0866 0.1990 1.0866 1.0424
No log 12.15 486 0.9593 0.2615 0.9593 0.9794
No log 12.2 488 0.8367 0.2692 0.8367 0.9147
No log 12.25 490 0.8228 0.2692 0.8228 0.9071
No log 12.3 492 0.8954 0.2615 0.8954 0.9463
No log 12.35 494 0.9409 0.2866 0.9409 0.9700
No log 12.4 496 0.9996 0.2651 0.9996 0.9998
No log 12.45 498 1.0650 0.2567 1.0650 1.0320
0.3671 12.5 500 1.0279 0.2552 1.0279 1.0138
0.3671 12.55 502 0.8917 0.2923 0.8917 0.9443
0.3671 12.6 504 0.7573 0.2950 0.7573 0.8702
0.3671 12.65 506 0.7198 0.2787 0.7198 0.8484
0.3671 12.7 508 0.7041 0.2471 0.7041 0.8391
0.3671 12.75 510 0.7262 0.2787 0.7262 0.8522
0.3671 12.8 512 0.8109 0.3712 0.8109 0.9005
0.3671 12.85 514 0.8954 0.3473 0.8954 0.9462
0.3671 12.9 516 0.9126 0.3051 0.9126 0.9553
0.3671 12.95 518 0.9474 0.3228 0.9474 0.9733
0.3671 13.0 520 0.9033 0.3294 0.9033 0.9504
0.3671 13.05 522 0.8217 0.3099 0.8217 0.9065
0.3671 13.1 524 0.8160 0.2883 0.8160 0.9033
0.3671 13.15 526 0.8538 0.3234 0.8538 0.9240
0.3671 13.2 528 0.9502 0.2358 0.9502 0.9748
0.3671 13.25 530 1.0451 0.2271 1.0451 1.0223
0.3671 13.3 532 1.0864 0.1709 1.0864 1.0423
0.3671 13.35 534 1.0974 0.2227 1.0974 1.0476
0.3671 13.4 536 1.0638 0.1210 1.0638 1.0314

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model format

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k17_task7_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02