ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k20_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7677
  • Qwk: 0.4931
  • Mse: 0.7677
  • Rmse: 0.8762
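
The card does not include a usage snippet, so the following is a minimal sketch of loading this checkpoint with the transformers library and scoring an essay. It assumes the checkpoint carries a sequence-classification head; whether that head is a single regression output or a set of discrete organization levels is not stated in the card, so the sketch handles both cases.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repository id of this model on the Hugging Face Hub.
model_id = (
    "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
    "FineTuningAraBERT_run3_AugV5_k20_task2_organization"
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay whose organization is to be scored (placeholder)
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: a single-output head is read as a continuous organization score;
# otherwise the most likely discrete level is taken via argmax.
if logits.shape[-1] == 1:
    score = logits.squeeze(-1).item()
else:
    score = logits.argmax(dim=-1).item()
print(score)
```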

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
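
The original training script is not part of the card; the sketch below only shows how the hyperparameters listed above would map onto the transformers Trainer API. The output directory name is hypothetical.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical output path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam settings matching betas=(0.9, 0.999) and epsilon=1e-08
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```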

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0177 2 4.6186 0.0096 4.6186 2.1491
No log 0.0354 4 3.5674 0.0165 3.5674 1.8887
No log 0.0531 6 1.6803 0.0629 1.6803 1.2963
No log 0.0708 8 1.3026 0.0867 1.3026 1.1413
No log 0.0885 10 1.1789 0.1752 1.1789 1.0858
No log 0.1062 12 1.1675 0.2464 1.1675 1.0805
No log 0.1239 14 1.2171 0.1413 1.2171 1.1032
No log 0.1416 16 1.4013 0.1501 1.4013 1.1838
No log 0.1593 18 1.3283 0.1585 1.3283 1.1525
No log 0.1770 20 1.1847 0.1315 1.1847 1.0884
No log 0.1947 22 1.1561 0.2191 1.1561 1.0752
No log 0.2124 24 1.1483 0.1671 1.1483 1.0716
No log 0.2301 26 1.1424 0.3525 1.1424 1.0688
No log 0.2478 28 1.1537 0.2313 1.1537 1.0741
No log 0.2655 30 1.1684 0.2537 1.1684 1.0809
No log 0.2832 32 1.2279 0.2683 1.2279 1.1081
No log 0.3009 34 1.3464 0.3227 1.3464 1.1603
No log 0.3186 36 1.6212 0.2801 1.6212 1.2733
No log 0.3363 38 1.5385 0.28 1.5385 1.2404
No log 0.3540 40 1.1322 0.2619 1.1322 1.0641
No log 0.3717 42 0.9447 0.4579 0.9447 0.9719
No log 0.3894 44 1.2343 0.4497 1.2343 1.1110
No log 0.4071 46 1.2647 0.3414 1.2647 1.1246
No log 0.4248 48 0.9766 0.5403 0.9766 0.9882
No log 0.4425 50 0.8759 0.5131 0.8759 0.9359
No log 0.4602 52 1.0389 0.3262 1.0389 1.0193
No log 0.4779 54 1.0944 0.2201 1.0944 1.0461
No log 0.4956 56 1.0151 0.4387 1.0151 1.0075
No log 0.5133 58 1.0452 0.4238 1.0452 1.0223
No log 0.5310 60 0.8668 0.5416 0.8668 0.9310
No log 0.5487 62 0.8077 0.5975 0.8077 0.8987
No log 0.5664 64 0.9154 0.5879 0.9154 0.9568
No log 0.5841 66 0.9462 0.5392 0.9462 0.9727
No log 0.6018 68 0.8047 0.5098 0.8047 0.8971
No log 0.6195 70 0.7866 0.5932 0.7866 0.8869
No log 0.6372 72 0.9242 0.4606 0.9242 0.9614
No log 0.6549 74 1.0022 0.4280 1.0022 1.0011
No log 0.6726 76 0.9224 0.4686 0.9224 0.9604
No log 0.6903 78 0.8729 0.4840 0.8729 0.9343
No log 0.7080 80 0.9378 0.4454 0.9378 0.9684
No log 0.7257 82 0.9881 0.4444 0.9881 0.9940
No log 0.7434 84 0.9740 0.4910 0.9740 0.9869
No log 0.7611 86 0.9822 0.5518 0.9822 0.9910
No log 0.7788 88 0.9929 0.5209 0.9929 0.9964
No log 0.7965 90 0.9120 0.5965 0.9120 0.9550
No log 0.8142 92 0.9009 0.5884 0.9009 0.9492
No log 0.8319 94 0.8891 0.5884 0.8891 0.9429
No log 0.8496 96 0.9163 0.5458 0.9163 0.9572
No log 0.8673 98 0.8487 0.5632 0.8487 0.9212
No log 0.8850 100 0.8036 0.5983 0.8036 0.8964
No log 0.9027 102 0.7985 0.5898 0.7985 0.8936
No log 0.9204 104 0.8253 0.5253 0.8253 0.9085
No log 0.9381 106 0.9171 0.5511 0.9171 0.9577
No log 0.9558 108 0.8644 0.5601 0.8644 0.9298
No log 0.9735 110 0.7821 0.6228 0.7821 0.8844
No log 0.9912 112 0.8098 0.6004 0.8098 0.8999
No log 1.0088 114 0.7482 0.6454 0.7482 0.8650
No log 1.0265 116 0.7051 0.6172 0.7051 0.8397
No log 1.0442 118 0.7037 0.5632 0.7037 0.8388
No log 1.0619 120 0.7051 0.5697 0.7051 0.8397
No log 1.0796 122 0.7148 0.5759 0.7148 0.8454
No log 1.0973 124 0.7178 0.5759 0.7178 0.8472
No log 1.1150 126 0.7237 0.5918 0.7237 0.8507
No log 1.1327 128 0.7461 0.6010 0.7461 0.8638
No log 1.1504 130 0.7859 0.5977 0.7859 0.8865
No log 1.1681 132 0.9869 0.5566 0.9869 0.9934
No log 1.1858 134 1.0241 0.5308 1.0241 1.0120
No log 1.2035 136 0.9341 0.5814 0.9341 0.9665
No log 1.2212 138 0.7874 0.5082 0.7874 0.8874
No log 1.2389 140 0.8637 0.4725 0.8637 0.9294
No log 1.2566 142 0.9339 0.5078 0.9339 0.9664
No log 1.2743 144 0.8342 0.5182 0.8342 0.9134
No log 1.2920 146 0.8042 0.5386 0.8042 0.8968
No log 1.3097 148 0.8516 0.5283 0.8516 0.9228
No log 1.3274 150 0.8646 0.5080 0.8646 0.9298
No log 1.3451 152 0.7888 0.6274 0.7888 0.8882
No log 1.3628 154 0.7892 0.5824 0.7892 0.8884
No log 1.3805 156 0.7818 0.6079 0.7818 0.8842
No log 1.3982 158 0.8129 0.6420 0.8129 0.9016
No log 1.4159 160 0.9121 0.5718 0.9121 0.9550
No log 1.4336 162 1.1444 0.5215 1.1444 1.0698
No log 1.4513 164 1.1346 0.5512 1.1346 1.0652
No log 1.4690 166 0.9605 0.4984 0.9605 0.9801
No log 1.4867 168 0.8541 0.4469 0.8541 0.9242
No log 1.5044 170 0.8362 0.4596 0.8362 0.9144
No log 1.5221 172 0.7755 0.6285 0.7755 0.8806
No log 1.5398 174 0.7465 0.5769 0.7465 0.8640
No log 1.5575 176 0.7866 0.6392 0.7866 0.8869
No log 1.5752 178 1.0064 0.5389 1.0064 1.0032
No log 1.5929 180 1.2603 0.5264 1.2603 1.1226
No log 1.6106 182 1.3118 0.4282 1.3118 1.1454
No log 1.6283 184 1.1297 0.5015 1.1297 1.0629
No log 1.6460 186 0.8919 0.4734 0.8919 0.9444
No log 1.6637 188 0.8444 0.4681 0.8444 0.9189
No log 1.6814 190 0.8673 0.4646 0.8673 0.9313
No log 1.6991 192 0.8431 0.4781 0.8431 0.9182
No log 1.7168 194 0.8357 0.5167 0.8357 0.9142
No log 1.7345 196 0.8743 0.3738 0.8743 0.9350
No log 1.7522 198 0.8990 0.3747 0.8990 0.9482
No log 1.7699 200 0.9638 0.4074 0.9638 0.9817
No log 1.7876 202 1.0058 0.4287 1.0058 1.0029
No log 1.8053 204 0.9647 0.4113 0.9647 0.9822
No log 1.8230 206 0.8861 0.4181 0.8861 0.9413
No log 1.8407 208 0.8698 0.4294 0.8698 0.9326
No log 1.8584 210 0.8629 0.3998 0.8629 0.9289
No log 1.8761 212 0.8563 0.3396 0.8563 0.9254
No log 1.8938 214 0.8815 0.4540 0.8815 0.9389
No log 1.9115 216 0.9633 0.5105 0.9633 0.9815
No log 1.9292 218 0.8965 0.4770 0.8965 0.9469
No log 1.9469 220 0.8413 0.4435 0.8413 0.9172
No log 1.9646 222 0.7839 0.5107 0.7839 0.8854
No log 1.9823 224 0.7844 0.5012 0.7844 0.8857
No log 2.0 226 0.8232 0.4009 0.8232 0.9073
No log 2.0177 228 1.0148 0.4976 1.0148 1.0074
No log 2.0354 230 1.1722 0.5144 1.1722 1.0827
No log 2.0531 232 1.0509 0.5433 1.0509 1.0251
No log 2.0708 234 0.8250 0.5539 0.8250 0.9083
No log 2.0885 236 0.7544 0.4872 0.7544 0.8686
No log 2.1062 238 0.8093 0.4961 0.8093 0.8996
No log 2.1239 240 0.7535 0.5127 0.7535 0.8680
No log 2.1416 242 0.7558 0.5308 0.7558 0.8694
No log 2.1593 244 0.8157 0.5513 0.8157 0.9032
No log 2.1770 246 0.7815 0.5686 0.7815 0.8840
No log 2.1947 248 0.7088 0.5678 0.7088 0.8419
No log 2.2124 250 0.6889 0.5992 0.6889 0.8300
No log 2.2301 252 0.7122 0.6336 0.7122 0.8439
No log 2.2478 254 0.8479 0.5270 0.8479 0.9208
No log 2.2655 256 0.9741 0.5429 0.9741 0.9870
No log 2.2832 258 0.9285 0.5270 0.9285 0.9636
No log 2.3009 260 0.7735 0.5359 0.7735 0.8795
No log 2.3186 262 0.7382 0.5486 0.7382 0.8592
No log 2.3363 264 0.7449 0.5684 0.7449 0.8631
No log 2.3540 266 0.7487 0.5566 0.7487 0.8653
No log 2.3717 268 0.7560 0.5749 0.7560 0.8695
No log 2.3894 270 0.7724 0.5098 0.7724 0.8789
No log 2.4071 272 0.8855 0.5637 0.8855 0.9410
No log 2.4248 274 0.9354 0.5346 0.9354 0.9672
No log 2.4425 276 0.9228 0.4615 0.9228 0.9606
No log 2.4602 278 0.8904 0.4107 0.8904 0.9436
No log 2.4779 280 0.8402 0.4366 0.8402 0.9166
No log 2.4956 282 0.8159 0.4981 0.8159 0.9033
No log 2.5133 284 0.8382 0.4727 0.8382 0.9155
No log 2.5310 286 0.9706 0.5310 0.9706 0.9852
No log 2.5487 288 1.1613 0.5617 1.1613 1.0776
No log 2.5664 290 1.1565 0.5617 1.1565 1.0754
No log 2.5841 292 0.9946 0.5375 0.9946 0.9973
No log 2.6018 294 0.8576 0.5357 0.8576 0.9261
No log 2.6195 296 0.8298 0.5055 0.8298 0.9110
No log 2.6372 298 0.8280 0.5543 0.8280 0.9100
No log 2.6549 300 0.8486 0.5210 0.8486 0.9212
No log 2.6726 302 0.8354 0.5569 0.8354 0.9140
No log 2.6903 304 0.8059 0.5067 0.8059 0.8977
No log 2.7080 306 0.9039 0.5339 0.9039 0.9508
No log 2.7257 308 0.9523 0.5365 0.9523 0.9758
No log 2.7434 310 0.8670 0.5250 0.8670 0.9311
No log 2.7611 312 0.7883 0.4704 0.7883 0.8879
No log 2.7788 314 0.7927 0.5334 0.7927 0.8903
No log 2.7965 316 0.7823 0.5216 0.7823 0.8845
No log 2.8142 318 0.7926 0.4280 0.7926 0.8903
No log 2.8319 320 0.8173 0.4657 0.8173 0.9040
No log 2.8496 322 0.8285 0.4876 0.8285 0.9102
No log 2.8673 324 0.8563 0.5487 0.8563 0.9253
No log 2.8850 326 0.8307 0.5759 0.8307 0.9114
No log 2.9027 328 0.8519 0.5759 0.8519 0.9230
No log 2.9204 330 0.8625 0.5164 0.8625 0.9287
No log 2.9381 332 0.8932 0.4712 0.8932 0.9451
No log 2.9558 334 0.9381 0.4348 0.9381 0.9686
No log 2.9735 336 0.9868 0.4309 0.9868 0.9934
No log 2.9912 338 1.0575 0.4302 1.0575 1.0283
No log 3.0088 340 1.0124 0.4714 1.0124 1.0062
No log 3.0265 342 0.8768 0.5498 0.8768 0.9364
No log 3.0442 344 0.8360 0.4982 0.8360 0.9143
No log 3.0619 346 0.8215 0.5011 0.8215 0.9064
No log 3.0796 348 0.8107 0.4568 0.8107 0.9004
No log 3.0973 350 0.8065 0.4982 0.8065 0.8980
No log 3.1150 352 0.8632 0.4649 0.8632 0.9291
No log 3.1327 354 0.8906 0.4625 0.8906 0.9437
No log 3.1504 356 0.8383 0.5395 0.8383 0.9156
No log 3.1681 358 0.8116 0.5575 0.8116 0.9009
No log 3.1858 360 0.8247 0.5423 0.8247 0.9081
No log 3.2035 362 0.7893 0.5417 0.7893 0.8884
No log 3.2212 364 0.7391 0.5708 0.7391 0.8597
No log 3.2389 366 0.7694 0.4853 0.7694 0.8772
No log 3.2566 368 0.8292 0.4367 0.8292 0.9106
No log 3.2743 370 0.8416 0.3326 0.8416 0.9174
No log 3.2920 372 0.8806 0.3489 0.8806 0.9384
No log 3.3097 374 0.9671 0.4641 0.9671 0.9834
No log 3.3274 376 0.9975 0.4975 0.9975 0.9988
No log 3.3451 378 0.9562 0.4975 0.9562 0.9779
No log 3.3628 380 0.8294 0.4972 0.8294 0.9107
No log 3.3805 382 0.7462 0.6098 0.7462 0.8638
No log 3.3982 384 0.7284 0.6340 0.7284 0.8534
No log 3.4159 386 0.7323 0.6107 0.7323 0.8557
No log 3.4336 388 0.7374 0.6425 0.7374 0.8587
No log 3.4513 390 0.8072 0.5844 0.8072 0.8985
No log 3.4690 392 0.8990 0.5384 0.8990 0.9481
No log 3.4867 394 0.8869 0.5233 0.8869 0.9417
No log 3.5044 396 0.8209 0.5537 0.8209 0.9061
No log 3.5221 398 0.7724 0.5635 0.7724 0.8789
No log 3.5398 400 0.7674 0.5951 0.7674 0.8760
No log 3.5575 402 0.7960 0.5561 0.7960 0.8922
No log 3.5752 404 0.8661 0.5570 0.8661 0.9306
No log 3.5929 406 0.9099 0.5570 0.9099 0.9539
No log 3.6106 408 0.9032 0.5570 0.9032 0.9504
No log 3.6283 410 0.8270 0.5490 0.8270 0.9094
No log 3.6460 412 0.7872 0.5542 0.7872 0.8872
No log 3.6637 414 0.7937 0.5142 0.7937 0.8909
No log 3.6814 416 0.8036 0.3994 0.8036 0.8964
No log 3.6991 418 0.8198 0.3847 0.8198 0.9054
No log 3.7168 420 0.8298 0.3812 0.8298 0.9109
No log 3.7345 422 0.8399 0.4938 0.8399 0.9165
No log 3.7522 424 0.8083 0.5470 0.8083 0.8991
No log 3.7699 426 0.7629 0.6411 0.7629 0.8734
No log 3.7876 428 0.7635 0.6235 0.7635 0.8738
No log 3.8053 430 0.7713 0.6151 0.7713 0.8783
No log 3.8230 432 0.7542 0.6058 0.7542 0.8685
No log 3.8407 434 0.7389 0.6172 0.7389 0.8596
No log 3.8584 436 0.7395 0.5854 0.7395 0.8599
No log 3.8761 438 0.7434 0.5149 0.7434 0.8622
No log 3.8938 440 0.7702 0.4297 0.7702 0.8776
No log 3.9115 442 0.7931 0.4297 0.7931 0.8906
No log 3.9292 444 0.8003 0.4743 0.8003 0.8946
No log 3.9469 446 0.7893 0.4789 0.7893 0.8884
No log 3.9646 448 0.8132 0.4540 0.8132 0.9018
No log 3.9823 450 0.8451 0.4595 0.8451 0.9193
No log 4.0 452 0.8561 0.4069 0.8561 0.9253
No log 4.0177 454 0.8419 0.4948 0.8419 0.9175
No log 4.0354 456 0.8343 0.4902 0.8343 0.9134
No log 4.0531 458 0.8685 0.4606 0.8685 0.9319
No log 4.0708 460 0.8589 0.4606 0.8589 0.9268
No log 4.0885 462 0.8311 0.4331 0.8311 0.9117
No log 4.1062 464 0.8325 0.4587 0.8325 0.9124
No log 4.1239 466 0.8257 0.4369 0.8257 0.9087
No log 4.1416 468 0.8518 0.4202 0.8518 0.9229
No log 4.1593 470 0.8709 0.3372 0.8709 0.9332
No log 4.1770 472 0.8605 0.3720 0.8605 0.9276
No log 4.1947 474 0.8271 0.4002 0.8271 0.9095
No log 4.2124 476 0.8201 0.4002 0.8201 0.9056
No log 4.2301 478 0.7971 0.4555 0.7971 0.8928
No log 4.2478 480 0.7768 0.4780 0.7768 0.8814
No log 4.2655 482 0.7792 0.4488 0.7792 0.8827
No log 4.2832 484 0.8118 0.5236 0.8118 0.9010
No log 4.3009 486 0.7919 0.4845 0.7919 0.8899
No log 4.3186 488 0.7751 0.4555 0.7751 0.8804
No log 4.3363 490 0.7991 0.4819 0.7991 0.8939
No log 4.3540 492 0.8001 0.4819 0.8001 0.8945
No log 4.3717 494 0.7677 0.5197 0.7677 0.8762
No log 4.3894 496 0.7325 0.5327 0.7325 0.8559
No log 4.4071 498 0.7450 0.5498 0.7450 0.8632
0.3361 4.4248 500 0.8104 0.5044 0.8104 0.9002
0.3361 4.4425 502 0.8015 0.5374 0.8015 0.8952
0.3361 4.4602 504 0.7909 0.5442 0.7909 0.8893
0.3361 4.4779 506 0.7658 0.5368 0.7658 0.8751
0.3361 4.4956 508 0.7596 0.5498 0.7596 0.8715
0.3361 4.5133 510 0.7591 0.5962 0.7591 0.8712
0.3361 4.5310 512 0.7839 0.5636 0.7839 0.8854
0.3361 4.5487 514 0.8705 0.5306 0.8705 0.9330
0.3361 4.5664 516 0.9054 0.5287 0.9054 0.9515
0.3361 4.5841 518 0.9314 0.5465 0.9314 0.9651
0.3361 4.6018 520 0.8768 0.5490 0.8768 0.9364
0.3361 4.6195 522 0.8212 0.5513 0.8212 0.9062
0.3361 4.6372 524 0.7891 0.6351 0.7891 0.8883
0.3361 4.6549 526 0.8008 0.6242 0.8008 0.8949
0.3361 4.6726 528 0.8664 0.5487 0.8664 0.9308
0.3361 4.6903 530 0.9437 0.5030 0.9437 0.9714
0.3361 4.7080 532 0.9597 0.4519 0.9597 0.9796
0.3361 4.7257 534 0.9166 0.4649 0.9166 0.9574
0.3361 4.7434 536 0.8190 0.4826 0.8190 0.9050
0.3361 4.7611 538 0.7677 0.4931 0.7677 0.8762
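
As a point of reference, the Qwk, Mse, and Rmse columns above can be reproduced from gold and predicted organization scores in the usual way; the sketch below (an assumption, not taken from the card) uses scikit-learn's quadratic weighted kappa and mean squared error on hypothetical rounded scores.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical rounded organization scores for an evaluation set.
y_true = np.array([3, 2, 4, 1, 3])
y_pred = np.array([3, 3, 4, 2, 2])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```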

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1