ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6472
  • Qwk: 0.4067
  • Mse: 0.6472
  • Rmse: 0.8045
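Qwk above is the quadratic weighted kappa, the standard agreement metric for ordinal essay scores; Mse and Rmse are the (root) mean squared error between predicted and reference scores. As a reference for how these numbers can be reproduced, here is a minimal pure-Python sketch (function names are illustrative and not taken from the training script):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, min_rating=None, max_rating=None):
    """Cohen's kappa with quadratic weights over integer ratings."""
    if min_rating is None:
        min_rating = min(min(y_true), min(y_pred))
    if max_rating is None:
        max_rating = max(max(y_true), max(y_pred))
    n = max_rating - min_rating + 1
    # Observed confusion matrix O[i][j]: true rating i, predicted rating j
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1
    num_items = len(y_true)
    hist_true = Counter(t - min_rating for t in y_true)
    hist_pred = Counter(p - min_rating for p in y_pred)
    numerator = 0.0
    denominator = 0.0
    for i in range(n):
        for j in range(n):
            # Quadratic disagreement weight, 0 on the diagonal
            w = (i - j) ** 2 / (n - 1) ** 2 if n > 1 else 0.0
            # Expected count under independent marginals
            expected = hist_true[i] * hist_pred[j] / num_items
            numerator += w * observed[i][j]
            denominator += w * expected
    return 1.0 - numerator / denominator if denominator else 0.0

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length score lists."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement yields a QWK of 1.0, chance-level agreement 0.0, and systematic disagreement goes negative, which matches the early-epoch values in the table below.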

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
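The hyperparameters above can be expressed as a transformers TrainingArguments object; the sketch below mirrors the reported values only, while the output path and any Trainer wiring (model, datasets, metric function) are assumptions not recorded in this card:

```python
from transformers import TrainingArguments

# Hedged sketch: only the values listed above are from the card;
# output_dir is a hypothetical placeholder.
training_args = TrainingArguments(
    output_dir="arabert-task7-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```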

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0244 2 2.7814 -0.0568 2.7814 1.6678
No log 0.0488 4 1.5740 0.0560 1.5740 1.2546
No log 0.0732 6 1.1744 -0.0886 1.1744 1.0837
No log 0.0976 8 1.1453 -0.0722 1.1453 1.0702
No log 0.1220 10 1.1804 -0.0113 1.1804 1.0865
No log 0.1463 12 1.2289 0.0515 1.2289 1.1086
No log 0.1707 14 1.0352 0.2059 1.0352 1.0174
No log 0.1951 16 0.7293 0.1884 0.7293 0.8540
No log 0.2195 18 0.6725 0.2345 0.6725 0.8201
No log 0.2439 20 0.6521 0.3385 0.6521 0.8075
No log 0.2683 22 0.6500 0.2319 0.6500 0.8062
No log 0.2927 24 0.6714 0.2464 0.6714 0.8194
No log 0.3171 26 1.1659 0.1719 1.1659 1.0798
No log 0.3415 28 1.2514 0.1913 1.2514 1.1187
No log 0.3659 30 0.9995 0.2298 0.9995 0.9998
No log 0.3902 32 0.6909 0.2804 0.6909 0.8312
No log 0.4146 34 0.6364 0.2142 0.6364 0.7977
No log 0.4390 36 0.6149 0.2405 0.6149 0.7842
No log 0.4634 38 0.6227 0.2930 0.6227 0.7891
No log 0.4878 40 0.6675 0.2632 0.6675 0.8170
No log 0.5122 42 0.6688 0.2063 0.6688 0.8178
No log 0.5366 44 0.6436 0.2537 0.6436 0.8022
No log 0.5610 46 0.6574 0.3387 0.6574 0.8108
No log 0.5854 48 0.8229 0.3231 0.8229 0.9072
No log 0.6098 50 0.9356 0.2703 0.9356 0.9673
No log 0.6341 52 0.8478 0.2492 0.8478 0.9208
No log 0.6585 54 0.8179 0.2007 0.8179 0.9044
No log 0.6829 56 1.0360 -0.0124 1.0360 1.0178
No log 0.7073 58 1.3802 -0.1439 1.3802 1.1748
No log 0.7317 60 1.5074 -0.1437 1.5074 1.2278
No log 0.7561 62 1.3564 -0.0704 1.3564 1.1646
No log 0.7805 64 0.9478 0.1454 0.9478 0.9735
No log 0.8049 66 0.7958 0.3444 0.7958 0.8921
No log 0.8293 68 0.8586 0.3494 0.8586 0.9266
No log 0.8537 70 1.2075 0.1397 1.2075 1.0989
No log 0.8780 72 1.3611 0.1648 1.3611 1.1667
No log 0.9024 74 1.2598 0.1397 1.2598 1.1224
No log 0.9268 76 0.9469 0.1651 0.9469 0.9731
No log 0.9512 78 0.8310 0.2754 0.8310 0.9116
No log 0.9756 80 0.9553 0.2463 0.9553 0.9774
No log 1.0 82 1.2205 0.1203 1.2205 1.1048
No log 1.0244 84 1.2363 0.1606 1.2363 1.1119
No log 1.0488 86 1.2469 0.0717 1.2469 1.1166
No log 1.0732 88 1.1025 0.1277 1.1025 1.0500
No log 1.0976 90 0.9618 0.2410 0.9618 0.9807
No log 1.1220 92 0.9215 0.2142 0.9215 0.9599
No log 1.1463 94 0.9337 0.1984 0.9337 0.9663
No log 1.1707 96 0.8279 0.3287 0.8279 0.9099
No log 1.1951 98 0.8422 0.3060 0.8422 0.9177
No log 1.2195 100 1.0697 0.1911 1.0697 1.0343
No log 1.2439 102 1.5416 0.1454 1.5416 1.2416
No log 1.2683 104 1.5902 0.1067 1.5902 1.2610
No log 1.2927 106 1.2708 0.2215 1.2708 1.1273
No log 1.3171 108 1.0385 0.2044 1.0385 1.0190
No log 1.3415 110 1.0139 0.2044 1.0139 1.0069
No log 1.3659 112 1.1904 0.1003 1.1904 1.0911
No log 1.3902 114 1.4434 0.0127 1.4434 1.2014
No log 1.4146 116 1.6074 0.1146 1.6074 1.2678
No log 1.4390 118 1.5736 0.1122 1.5736 1.2544
No log 1.4634 120 1.3615 0.0099 1.3615 1.1668
No log 1.4878 122 1.2492 0.0761 1.2492 1.1177
No log 1.5122 124 1.2537 0.0975 1.2537 1.1197
No log 1.5366 126 1.1276 0.1671 1.1276 1.0619
No log 1.5610 128 1.0803 0.1573 1.0803 1.0394
No log 1.5854 130 1.0603 0.1243 1.0603 1.0297
No log 1.6098 132 1.0960 0.0894 1.0960 1.0469
No log 1.6341 134 1.1032 0.1709 1.1032 1.0503
No log 1.6585 136 1.1586 0.1713 1.1586 1.0764
No log 1.6829 138 0.9886 0.2094 0.9886 0.9943
No log 1.7073 140 0.8035 0.2817 0.8035 0.8964
No log 1.7317 142 0.7944 0.1179 0.7944 0.8913
No log 1.7561 144 0.8225 0.0833 0.8225 0.9069
No log 1.7805 146 0.7843 0.1179 0.7843 0.8856
No log 1.8049 148 0.8369 0.3099 0.8369 0.9148
No log 1.8293 150 1.1213 0.2755 1.1213 1.0589
No log 1.8537 152 1.2058 0.2280 1.2058 1.0981
No log 1.8780 154 1.0325 0.2756 1.0325 1.0161
No log 1.9024 156 0.8258 0.2754 0.8258 0.9087
No log 1.9268 158 0.7614 0.3011 0.7614 0.8726
No log 1.9512 160 0.7619 0.3127 0.7619 0.8729
No log 1.9756 162 0.8650 0.3105 0.8650 0.9301
No log 2.0 164 1.0946 0.3431 1.0946 1.0462
No log 2.0244 166 1.1645 0.2799 1.1645 1.0791
No log 2.0488 168 1.0647 0.3247 1.0647 1.0318
No log 2.0732 170 0.9080 0.3606 0.9080 0.9529
No log 2.0976 172 0.8265 0.2967 0.8265 0.9091
No log 2.1220 174 0.8368 0.2967 0.8368 0.9148
No log 2.1463 176 0.8902 0.3359 0.8902 0.9435
No log 2.1707 178 0.9950 0.3538 0.9950 0.9975
No log 2.1951 180 1.0297 0.2756 1.0297 1.0147
No log 2.2195 182 1.1153 0.2478 1.1153 1.0561
No log 2.2439 184 1.2631 0.1574 1.2631 1.1239
No log 2.2683 186 1.3943 0.1364 1.3943 1.1808
No log 2.2927 188 1.3114 0.1670 1.3114 1.1452
No log 2.3171 190 1.0375 0.3481 1.0375 1.0186
No log 2.3415 192 0.7946 0.3843 0.7946 0.8914
No log 2.3659 194 0.6902 0.2973 0.6902 0.8308
No log 2.3902 196 0.6832 0.2973 0.6832 0.8265
No log 2.4146 198 0.7421 0.3712 0.7421 0.8615
No log 2.4390 200 0.8258 0.4014 0.8258 0.9087
No log 2.4634 202 0.8693 0.3688 0.8693 0.9324
No log 2.4878 204 0.9070 0.3560 0.9070 0.9524
No log 2.5122 206 0.8683 0.3623 0.8683 0.9318
No log 2.5366 208 0.8688 0.3847 0.8688 0.9321
No log 2.5610 210 0.7800 0.3456 0.7800 0.8832
No log 2.5854 212 0.6900 0.3444 0.6900 0.8307
No log 2.6098 214 0.6733 0.3196 0.6733 0.8206
No log 2.6341 216 0.6630 0.4060 0.6630 0.8142
No log 2.6585 218 0.7221 0.3564 0.7221 0.8497
No log 2.6829 220 0.8878 0.3477 0.8878 0.9423
No log 2.7073 222 1.0233 0.3431 1.0233 1.0116
No log 2.7317 224 0.9978 0.3484 0.9978 0.9989
No log 2.7561 226 0.8699 0.3381 0.8699 0.9327
No log 2.7805 228 0.7824 0.3384 0.7824 0.8845
No log 2.8049 230 0.8081 0.3425 0.8081 0.8990
No log 2.8293 232 0.8488 0.3433 0.8488 0.9213
No log 2.8537 234 0.8158 0.3359 0.8158 0.9032
No log 2.8780 236 0.7148 0.3712 0.7148 0.8455
No log 2.9024 238 0.6743 0.1050 0.6743 0.8212
No log 2.9268 240 0.6817 0.1829 0.6817 0.8256
No log 2.9512 242 0.6976 0.3782 0.6976 0.8352
No log 2.9756 244 0.8489 0.3217 0.8489 0.9213
No log 3.0 246 1.0785 0.3233 1.0785 1.0385
No log 3.0244 248 1.1557 0.3086 1.1557 1.0750
No log 3.0488 250 0.9605 0.3114 0.9605 0.9800
No log 3.0732 252 0.8021 0.3231 0.8021 0.8956
No log 3.0976 254 0.7371 0.3234 0.7371 0.8585
No log 3.1220 256 0.7374 0.3302 0.7374 0.8587
No log 3.1463 258 0.7983 0.2923 0.7983 0.8935
No log 3.1707 260 0.9690 0.3131 0.9690 0.9844
No log 3.1951 262 1.1715 0.3013 1.1715 1.0824
No log 3.2195 264 1.1558 0.3013 1.1558 1.0751
No log 3.2439 266 1.0440 0.3031 1.0440 1.0218
No log 3.2683 268 0.8512 0.3709 0.8512 0.9226
No log 3.2927 270 0.8248 0.4154 0.8248 0.9082
No log 3.3171 272 0.8601 0.3643 0.8601 0.9274
No log 3.3415 274 0.8067 0.4707 0.8067 0.8982
No log 3.3659 276 0.7008 0.4392 0.7008 0.8371
No log 3.3902 278 0.6710 0.4569 0.6710 0.8192
No log 3.4146 280 0.6278 0.3387 0.6278 0.7923
No log 3.4390 282 0.6118 0.3081 0.6118 0.7822
No log 3.4634 284 0.6095 0.3092 0.6095 0.7807
No log 3.4878 286 0.6335 0.4569 0.6335 0.7960
No log 3.5122 288 0.7128 0.3981 0.7128 0.8443
No log 3.5366 290 0.7205 0.3981 0.7205 0.8488
No log 3.5610 292 0.6655 0.3673 0.6655 0.8158
No log 3.5854 294 0.6658 0.2121 0.6658 0.8159
No log 3.6098 296 0.6760 0.2407 0.6760 0.8222
No log 3.6341 298 0.7022 0.3615 0.7022 0.8380
No log 3.6585 300 0.7665 0.3754 0.7665 0.8755
No log 3.6829 302 0.7635 0.3822 0.7635 0.8738
No log 3.7073 304 0.7181 0.3092 0.7181 0.8474
No log 3.7317 306 0.7238 0.3092 0.7238 0.8508
No log 3.7561 308 0.7866 0.4067 0.7866 0.8869
No log 3.7805 310 0.9018 0.3381 0.9018 0.9496
No log 3.8049 312 0.9471 0.3324 0.9471 0.9732
No log 3.8293 314 0.9296 0.3302 0.9296 0.9642
No log 3.8537 316 0.7906 0.4230 0.7906 0.8891
No log 3.8780 318 0.6671 0.3637 0.6671 0.8167
No log 3.9024 320 0.6324 0.3518 0.6324 0.7953
No log 3.9268 322 0.6244 0.3894 0.6244 0.7902
No log 3.9512 324 0.6230 0.4020 0.6230 0.7893
No log 3.9756 326 0.6260 0.3445 0.6260 0.7912
No log 4.0 328 0.6318 0.3445 0.6318 0.7948
No log 4.0244 330 0.6353 0.3599 0.6353 0.7970
No log 4.0488 332 0.6385 0.3577 0.6385 0.7991
No log 4.0732 334 0.6843 0.1923 0.6843 0.8272
No log 4.0976 336 0.7500 0.3778 0.7500 0.8660
No log 4.1220 338 0.7293 0.3730 0.7293 0.8540
No log 4.1463 340 0.6350 0.4307 0.6350 0.7969
No log 4.1707 342 0.5807 0.3762 0.5807 0.7621
No log 4.1951 344 0.6774 0.4457 0.6774 0.8230
No log 4.2195 346 0.7562 0.4180 0.7562 0.8696
No log 4.2439 348 0.6993 0.4424 0.6993 0.8362
No log 4.2683 350 0.6352 0.4219 0.6352 0.7970
No log 4.2927 352 0.5944 0.4243 0.5944 0.7710
No log 4.3171 354 0.5748 0.3625 0.5748 0.7582
No log 4.3415 356 0.5859 0.4576 0.5859 0.7654
No log 4.3659 358 0.6700 0.4295 0.6700 0.8186
No log 4.3902 360 0.8014 0.4444 0.8014 0.8952
No log 4.4146 362 0.7802 0.4444 0.7802 0.8833
No log 4.4390 364 0.6492 0.4193 0.6492 0.8057
No log 4.4634 366 0.5704 0.3702 0.5704 0.7552
No log 4.4878 368 0.5993 0.2847 0.5993 0.7741
No log 4.5122 370 0.6109 0.1850 0.6109 0.7816
No log 4.5366 372 0.6081 0.2543 0.6081 0.7798
No log 4.5610 374 0.6003 0.3474 0.6003 0.7748
No log 4.5854 376 0.6216 0.3599 0.6216 0.7884
No log 4.6098 378 0.6487 0.4134 0.6487 0.8054
No log 4.6341 380 0.6369 0.4044 0.6369 0.7981
No log 4.6585 382 0.6064 0.4147 0.6064 0.7787
No log 4.6829 384 0.6032 0.3625 0.6032 0.7766
No log 4.7073 386 0.6273 0.3622 0.6273 0.7921
No log 4.7317 388 0.7183 0.3409 0.7183 0.8475
No log 4.7561 390 0.9151 0.4250 0.9151 0.9566
No log 4.7805 392 1.0243 0.3987 1.0243 1.0121
No log 4.8049 394 0.9973 0.3557 0.9973 0.9986
No log 4.8293 396 0.8627 0.3499 0.8627 0.9288
No log 4.8537 398 0.7622 0.4067 0.7622 0.8731
No log 4.8780 400 0.6948 0.4479 0.6948 0.8335
No log 4.9024 402 0.6922 0.4479 0.6922 0.8320
No log 4.9268 404 0.7379 0.3894 0.7379 0.8590
No log 4.9512 406 0.7817 0.4144 0.7817 0.8841
No log 4.9756 408 0.8279 0.3371 0.8279 0.9099
No log 5.0 410 0.7976 0.3371 0.7976 0.8931
No log 5.0244 412 0.7148 0.3637 0.7148 0.8455
No log 5.0488 414 0.6754 0.3950 0.6754 0.8218
No log 5.0732 416 0.6678 0.3950 0.6678 0.8172
No log 5.0976 418 0.6760 0.3673 0.6760 0.8222
No log 5.1220 420 0.7348 0.3963 0.7348 0.8572
No log 5.1463 422 0.8946 0.3290 0.8946 0.9458
No log 5.1707 424 1.0050 0.3102 1.0050 1.0025
No log 5.1951 426 0.9617 0.2894 0.9617 0.9807
No log 5.2195 428 0.8277 0.3822 0.8277 0.9098
No log 5.2439 430 0.7225 0.3673 0.7225 0.8500
No log 5.2683 432 0.6712 0.1752 0.6712 0.8193
No log 5.2927 434 0.6598 0.2787 0.6598 0.8123
No log 5.3171 436 0.6771 0.3950 0.6771 0.8229
No log 5.3415 438 0.7189 0.4409 0.7189 0.8479
No log 5.3659 440 0.7506 0.3461 0.7506 0.8664
No log 5.3902 442 0.7470 0.3521 0.7470 0.8643
No log 5.4146 444 0.6692 0.3699 0.6692 0.8180
No log 5.4390 446 0.5833 0.4724 0.5833 0.7637
No log 5.4634 448 0.5595 0.5177 0.5595 0.7480
No log 5.4878 450 0.5678 0.5107 0.5678 0.7536
No log 5.5122 452 0.5753 0.5177 0.5753 0.7585
No log 5.5366 454 0.6105 0.4642 0.6105 0.7813
No log 5.5610 456 0.6457 0.4703 0.6457 0.8035
No log 5.5854 458 0.6687 0.4272 0.6687 0.8178
No log 5.6098 460 0.6582 0.4451 0.6582 0.8113
No log 5.6341 462 0.6398 0.2973 0.6398 0.7999
No log 5.6585 464 0.6535 0.1133 0.6535 0.8084
No log 5.6829 466 0.6631 0.0778 0.6631 0.8143
No log 5.7073 468 0.6649 0.1092 0.6649 0.8154
No log 5.7317 470 0.6563 0.4182 0.6563 0.8101
No log 5.7561 472 0.6622 0.3789 0.6622 0.8138
No log 5.7805 474 0.6807 0.4307 0.6807 0.8251
No log 5.8049 476 0.6977 0.4175 0.6977 0.8353
No log 5.8293 478 0.6801 0.4582 0.6801 0.8247
No log 5.8537 480 0.6493 0.5271 0.6493 0.8058
No log 5.8780 482 0.6474 0.4740 0.6474 0.8046
No log 5.9024 484 0.6407 0.4735 0.6407 0.8004
No log 5.9268 486 0.6371 0.5336 0.6371 0.7982
No log 5.9512 488 0.6548 0.5025 0.6548 0.8092
No log 5.9756 490 0.6868 0.4492 0.6868 0.8287
No log 6.0 492 0.6853 0.4492 0.6853 0.8278
No log 6.0244 494 0.6702 0.4711 0.6702 0.8186
No log 6.0488 496 0.6240 0.4555 0.6240 0.7899
No log 6.0732 498 0.6001 0.4484 0.6001 0.7747
0.3671 6.0976 500 0.5992 0.4217 0.5992 0.7741
0.3671 6.1220 502 0.6137 0.3654 0.6137 0.7834
0.3671 6.1463 504 0.6139 0.3607 0.6139 0.7835
0.3671 6.1707 506 0.6027 0.4617 0.6027 0.7763
0.3671 6.1951 508 0.6277 0.4576 0.6277 0.7922
0.3671 6.2195 510 0.7004 0.4606 0.7004 0.8369
0.3671 6.2439 512 0.7803 0.4385 0.7803 0.8834
0.3671 6.2683 514 0.7626 0.4587 0.7626 0.8733
0.3671 6.2927 516 0.6714 0.4664 0.6714 0.8194
0.3671 6.3171 518 0.6220 0.4229 0.6220 0.7887
0.3671 6.3415 520 0.6139 0.3352 0.6139 0.7835
0.3671 6.3659 522 0.6182 0.3369 0.6182 0.7863
0.3671 6.3902 524 0.6428 0.4330 0.6428 0.8018
0.3671 6.4146 526 0.6988 0.3918 0.6988 0.8360
0.3671 6.4390 528 0.7127 0.4112 0.7127 0.8442
0.3671 6.4634 530 0.7415 0.4457 0.7415 0.8611
0.3671 6.4878 532 0.7190 0.4531 0.7190 0.8479
0.3671 6.5122 534 0.6388 0.4493 0.6388 0.7993
0.3671 6.5366 536 0.5834 0.3762 0.5834 0.7638
0.3671 6.5610 538 0.5814 0.3651 0.5814 0.7625
0.3671 6.5854 540 0.5879 0.3651 0.5879 0.7667
0.3671 6.6098 542 0.5952 0.4171 0.5952 0.7715
0.3671 6.6341 544 0.6116 0.4397 0.6116 0.7821
0.3671 6.6585 546 0.6365 0.4397 0.6365 0.7978
0.3671 6.6829 548 0.6359 0.4013 0.6359 0.7974
0.3671 6.7073 550 0.6255 0.4013 0.6255 0.7909
0.3671 6.7317 552 0.6205 0.4013 0.6205 0.7877
0.3671 6.7561 554 0.6093 0.3762 0.6093 0.7806
0.3671 6.7805 556 0.6472 0.4067 0.6472 0.8045

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (F32, safetensors)