ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k14_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in the card. It achieves the following results on the evaluation set:

  • Loss: 0.4761
  • Qwk: 0.5770
  • Mse: 0.4761
  • Rmse: 0.6900
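These metrics are internally consistent: RMSE is the square root of MSE (0.6900 ≈ √0.4761), and the reported loss equals the MSE, which suggests a regression head trained with MSE loss. Qwk is quadratic weighted kappa, the standard agreement metric for ordinal essay scores. A minimal sketch of how this metric family can be computed (the 0–4 score range in the toy example is an assumption, not taken from the card):

```python
import math

from sklearn.metrics import cohen_kappa_score, mean_squared_error


def score_metrics(y_true, y_pred):
    """Compute QWK, MSE and RMSE for integer ordinal scores."""
    mse = mean_squared_error(y_true, y_pred)
    return {
        # Quadratic weighting penalizes large ordinal disagreements more.
        "qwk": cohen_kappa_score(y_true, y_pred, weights="quadratic"),
        "mse": mse,
        "rmse": math.sqrt(mse),
    }


# Toy example with hypothetical 0-4 organization scores:
m = score_metrics([0, 1, 2, 3, 4], [0, 1, 2, 3, 3])
```

Note that `rmse` is always `sqrt(mse)`, which is why the Mse and Rmse columns in the table below track each other exactly.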

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

(Training loss is shown as "No log" until the Trainer's first logging step, step 500, where it is 0.3499.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0286 2 2.5709 -0.0109 2.5709 1.6034
No log 0.0571 4 1.6036 0.0789 1.6036 1.2663
No log 0.0857 6 0.7839 0.2109 0.7839 0.8854
No log 0.1143 8 0.7816 0.0236 0.7816 0.8841
No log 0.1429 10 1.0592 0.0812 1.0592 1.0292
No log 0.1714 12 1.2204 0.2121 1.2204 1.1047
No log 0.2 14 1.0234 0.3118 1.0234 1.0116
No log 0.2286 16 0.7229 0.3594 0.7229 0.8502
No log 0.2571 18 0.6540 0.0481 0.6540 0.8087
No log 0.2857 20 0.7883 0.3695 0.7883 0.8879
No log 0.3143 22 0.8236 0.4142 0.8236 0.9075
No log 0.3429 24 0.7766 0.4142 0.7766 0.8812
No log 0.3714 26 0.5946 0.3173 0.5946 0.7711
No log 0.4 28 0.5505 0.2041 0.5505 0.7420
No log 0.4286 30 0.6052 0.3323 0.6052 0.7779
No log 0.4571 32 0.6276 0.3359 0.6276 0.7922
No log 0.4857 34 0.5336 0.5177 0.5336 0.7305
No log 0.5143 36 0.5297 0.5283 0.5297 0.7278
No log 0.5429 38 0.5180 0.5591 0.5180 0.7197
No log 0.5714 40 0.5311 0.5510 0.5311 0.7288
No log 0.6 42 0.5009 0.5731 0.5009 0.7078
No log 0.6286 44 0.4873 0.5937 0.4873 0.6981
No log 0.6571 46 0.5212 0.5152 0.5212 0.7220
No log 0.6857 48 0.5846 0.3693 0.5846 0.7646
No log 0.7143 50 0.6395 0.3514 0.6395 0.7997
No log 0.7429 52 0.5816 0.3800 0.5816 0.7626
No log 0.7714 54 0.5164 0.4809 0.5164 0.7186
No log 0.8 56 0.4914 0.5517 0.4914 0.7010
No log 0.8286 58 0.4732 0.6132 0.4732 0.6879
No log 0.8571 60 0.4482 0.6645 0.4482 0.6694
No log 0.8857 62 0.4363 0.6469 0.4363 0.6605
No log 0.9143 64 0.5036 0.6496 0.5036 0.7096
No log 0.9429 66 0.4920 0.6681 0.4920 0.7015
No log 0.9714 68 0.4392 0.7154 0.4392 0.6627
No log 1.0 70 0.4223 0.6130 0.4223 0.6499
No log 1.0286 72 0.4452 0.5569 0.4452 0.6672
No log 1.0571 74 0.4357 0.5044 0.4357 0.6601
No log 1.0857 76 0.4286 0.6105 0.4286 0.6547
No log 1.1143 78 0.4385 0.6014 0.4385 0.6622
No log 1.1429 80 0.4585 0.5835 0.4585 0.6771
No log 1.1714 82 0.4629 0.6314 0.4629 0.6803
No log 1.2 84 0.5503 0.6060 0.5503 0.7418
No log 1.2286 86 0.4890 0.6796 0.4890 0.6993
No log 1.2571 88 0.5000 0.5498 0.5000 0.7071
No log 1.2857 90 0.4851 0.5368 0.4851 0.6965
No log 1.3143 92 0.4581 0.5419 0.4581 0.6768
No log 1.3429 94 0.7506 0.5242 0.7506 0.8664
No log 1.3714 96 0.8406 0.4702 0.8406 0.9168
No log 1.4 98 0.6510 0.5085 0.6510 0.8069
No log 1.4286 100 0.4654 0.5357 0.4654 0.6822
No log 1.4571 102 0.4784 0.5171 0.4784 0.6916
No log 1.4857 104 0.4767 0.5692 0.4767 0.6904
No log 1.5143 106 0.4942 0.6010 0.4942 0.7030
No log 1.5429 108 0.7130 0.5581 0.7130 0.8444
No log 1.5714 110 1.2238 0.3631 1.2238 1.1062
No log 1.6 112 1.2865 0.3631 1.2865 1.1342
No log 1.6286 114 0.9265 0.5179 0.9265 0.9626
No log 1.6571 116 0.5316 0.5864 0.5316 0.7291
No log 1.6857 118 0.4767 0.5522 0.4767 0.6904
No log 1.7143 120 0.4952 0.5266 0.4952 0.7037
No log 1.7429 122 0.4910 0.5970 0.4910 0.7007
No log 1.7714 124 0.5172 0.6015 0.5172 0.7192
No log 1.8 126 0.7258 0.5672 0.7258 0.8519
No log 1.8286 128 0.9157 0.5033 0.9157 0.9569
No log 1.8571 130 0.8889 0.4593 0.8889 0.9428
No log 1.8857 132 0.6945 0.5009 0.6945 0.8334
No log 1.9143 134 0.5634 0.5692 0.5634 0.7506
No log 1.9429 136 0.5062 0.5514 0.5062 0.7115
No log 1.9714 138 0.4902 0.5923 0.4902 0.7001
No log 2.0 140 0.4963 0.6152 0.4963 0.7045
No log 2.0286 142 0.5299 0.5799 0.5299 0.7280
No log 2.0571 144 0.6393 0.6139 0.6393 0.7996
No log 2.0857 146 0.7279 0.6073 0.7279 0.8532
No log 2.1143 148 0.6911 0.6264 0.6911 0.8313
No log 2.1429 150 0.5949 0.5688 0.5949 0.7713
No log 2.1714 152 0.5352 0.5639 0.5352 0.7316
No log 2.2 154 0.4969 0.5633 0.4969 0.7049
No log 2.2286 156 0.4958 0.5566 0.4958 0.7041
No log 2.2571 158 0.4955 0.5538 0.4955 0.7039
No log 2.2857 160 0.5413 0.5962 0.5413 0.7357
No log 2.3143 162 0.5620 0.5981 0.5620 0.7496
No log 2.3429 164 0.5032 0.6551 0.5032 0.7093
No log 2.3714 166 0.5145 0.5932 0.5145 0.7173
No log 2.4 168 0.5196 0.6017 0.5196 0.7208
No log 2.4286 170 0.4759 0.5956 0.4759 0.6898
No log 2.4571 172 0.4736 0.5290 0.4736 0.6882
No log 2.4857 174 0.6169 0.5382 0.6169 0.7854
No log 2.5143 176 0.7468 0.4519 0.7468 0.8642
No log 2.5429 178 0.7650 0.4519 0.7650 0.8746
No log 2.5714 180 0.8031 0.4519 0.8031 0.8961
No log 2.6 182 0.7646 0.4096 0.7646 0.8744
No log 2.6286 184 0.8027 0.4208 0.8027 0.8960
No log 2.6571 186 0.8053 0.4208 0.8053 0.8974
No log 2.6857 188 0.7475 0.4650 0.7475 0.8646
No log 2.7143 190 0.7527 0.5529 0.7527 0.8676
No log 2.7429 192 0.8460 0.5733 0.8460 0.9198
No log 2.7714 194 0.7783 0.5835 0.7783 0.8822
No log 2.8 196 0.7490 0.5755 0.7490 0.8655
No log 2.8286 198 0.6292 0.5574 0.6292 0.7932
No log 2.8571 200 0.5366 0.5473 0.5366 0.7325
No log 2.8857 202 0.5157 0.5552 0.5157 0.7181
No log 2.9143 204 0.5598 0.5190 0.5598 0.7482
No log 2.9429 206 0.7844 0.5801 0.7844 0.8857
No log 2.9714 208 0.9898 0.4939 0.9898 0.9949
No log 3.0 210 0.9298 0.5721 0.9298 0.9643
No log 3.0286 212 0.7222 0.5876 0.7222 0.8498
No log 3.0571 214 0.5437 0.5244 0.5437 0.7373
No log 3.0857 216 0.5122 0.5412 0.5122 0.7157
No log 3.1143 218 0.5292 0.5044 0.5292 0.7274
No log 3.1429 220 0.7237 0.5186 0.7237 0.8507
No log 3.1714 222 1.0033 0.5563 1.0033 1.0016
No log 3.2 224 1.1885 0.4168 1.1885 1.0902
No log 3.2286 226 1.0364 0.5074 1.0364 1.0180
No log 3.2571 228 0.7734 0.5680 0.7734 0.8795
No log 3.2857 230 0.5976 0.5133 0.5976 0.7730
No log 3.3143 232 0.5494 0.4782 0.5494 0.7412
No log 3.3429 234 0.5650 0.5289 0.5650 0.7516
No log 3.3714 236 0.6267 0.5148 0.6267 0.7916
No log 3.4 238 0.6722 0.4837 0.6722 0.8199
No log 3.4286 240 0.6604 0.5231 0.6604 0.8127
No log 3.4571 242 0.5913 0.6045 0.5913 0.7690
No log 3.4857 244 0.5211 0.5473 0.5211 0.7218
No log 3.5143 246 0.4985 0.5447 0.4985 0.7061
No log 3.5429 248 0.5507 0.6028 0.5507 0.7421
No log 3.5714 250 0.6773 0.6421 0.6773 0.8230
No log 3.6 252 0.7756 0.6143 0.7756 0.8807
No log 3.6286 254 0.6986 0.5765 0.6986 0.8358
No log 3.6571 256 0.5716 0.5998 0.5716 0.7560
No log 3.6857 258 0.5277 0.5485 0.5277 0.7264
No log 3.7143 260 0.5439 0.5617 0.5439 0.7375
No log 3.7429 262 0.5473 0.5524 0.5473 0.7398
No log 3.7714 264 0.5843 0.6151 0.5843 0.7644
No log 3.8 266 0.6837 0.6082 0.6837 0.8269
No log 3.8286 268 0.6566 0.6229 0.6566 0.8103
No log 3.8571 270 0.6011 0.6200 0.6011 0.7753
No log 3.8857 272 0.5327 0.5950 0.5327 0.7299
No log 3.9143 274 0.5466 0.5735 0.5466 0.7393
No log 3.9429 276 0.6147 0.5675 0.6147 0.7840
No log 3.9714 278 0.6316 0.5738 0.6316 0.7947
No log 4.0 280 0.6429 0.5738 0.6429 0.8018
No log 4.0286 282 0.5648 0.5787 0.5648 0.7516
No log 4.0571 284 0.5625 0.5338 0.5625 0.7500
No log 4.0857 286 0.6327 0.5632 0.6327 0.7955
No log 4.1143 288 0.7013 0.5517 0.7013 0.8374
No log 4.1429 290 0.7562 0.4939 0.7562 0.8696
No log 4.1714 292 0.7038 0.5581 0.7038 0.8390
No log 4.2 294 0.6680 0.6044 0.6680 0.8173
No log 4.2286 296 0.6241 0.5873 0.6241 0.7900
No log 4.2571 298 0.5434 0.6352 0.5434 0.7371
No log 4.2857 300 0.5109 0.6279 0.5109 0.7148
No log 4.3143 302 0.4987 0.5538 0.4987 0.7062
No log 4.3429 304 0.5765 0.6054 0.5765 0.7593
No log 4.3714 306 0.6292 0.6054 0.6292 0.7932
No log 4.4 308 0.5409 0.5338 0.5409 0.7355
No log 4.4286 310 0.4761 0.5042 0.4761 0.6900
No log 4.4571 312 0.4696 0.5304 0.4696 0.6853
No log 4.4857 314 0.4763 0.5304 0.4763 0.6901
No log 4.5143 316 0.4730 0.5133 0.4730 0.6878
No log 4.5429 318 0.5279 0.5655 0.5279 0.7266
No log 4.5714 320 0.7156 0.5156 0.7156 0.8459
No log 4.6 322 0.9574 0.4758 0.9574 0.9785
No log 4.6286 324 1.0732 0.4910 1.0732 1.0359
No log 4.6571 326 1.0072 0.4862 1.0072 1.0036
No log 4.6857 328 0.7964 0.5351 0.7964 0.8924
No log 4.7143 330 0.5844 0.5612 0.5844 0.7644
No log 4.7429 332 0.5187 0.5306 0.5187 0.7202
No log 4.7714 334 0.5551 0.5538 0.5551 0.7450
No log 4.8 336 0.6630 0.5481 0.6630 0.8143
No log 4.8286 338 0.7720 0.5395 0.7720 0.8786
No log 4.8571 340 0.8520 0.5164 0.8520 0.9230
No log 4.8857 342 0.9052 0.5425 0.9052 0.9514
No log 4.9143 344 0.8592 0.5017 0.8592 0.9269
No log 4.9429 346 0.7515 0.5034 0.7515 0.8669
No log 4.9714 348 0.6961 0.4992 0.6961 0.8343
No log 5.0 350 0.6360 0.5085 0.6360 0.7975
No log 5.0286 352 0.5524 0.4801 0.5524 0.7432
No log 5.0571 354 0.5407 0.4801 0.5407 0.7354
No log 5.0857 356 0.5725 0.4925 0.5725 0.7566
No log 5.1143 358 0.7160 0.5638 0.7160 0.8462
No log 5.1429 360 0.8704 0.5616 0.8704 0.9330
No log 5.1714 362 0.8313 0.5626 0.8313 0.9117
No log 5.2 364 0.6818 0.4921 0.6818 0.8257
No log 5.2286 366 0.5511 0.4602 0.5511 0.7424
No log 5.2571 368 0.5396 0.4622 0.5396 0.7346
No log 5.2857 370 0.5904 0.4684 0.5904 0.7684
No log 5.3143 372 0.6827 0.4400 0.6827 0.8263
No log 5.3429 374 0.6979 0.5050 0.6979 0.8354
No log 5.3714 376 0.7380 0.4961 0.7380 0.8591
No log 5.4 378 0.7560 0.5169 0.7560 0.8695
No log 5.4286 380 0.8373 0.5017 0.8373 0.9151
No log 5.4571 382 1.0018 0.4475 1.0018 1.0009
No log 5.4857 384 1.0732 0.3601 1.0732 1.0359
No log 5.5143 386 0.9241 0.4406 0.9241 0.9613
No log 5.5429 388 0.8024 0.4828 0.8024 0.8958
No log 5.5714 390 0.6672 0.5716 0.6672 0.8168
No log 5.6 392 0.6287 0.5659 0.6287 0.7929
No log 5.6286 394 0.6137 0.5396 0.6137 0.7834
No log 5.6571 396 0.5960 0.5700 0.5960 0.7720
No log 5.6857 398 0.5818 0.5958 0.5818 0.7628
No log 5.7143 400 0.5473 0.5875 0.5473 0.7398
No log 5.7429 402 0.5302 0.5396 0.5302 0.7282
No log 5.7714 404 0.5454 0.5561 0.5454 0.7385
No log 5.8 406 0.5663 0.6125 0.5663 0.7525
No log 5.8286 408 0.5530 0.5966 0.5530 0.7436
No log 5.8571 410 0.5907 0.5827 0.5907 0.7686
No log 5.8857 412 0.5721 0.5763 0.5721 0.7564
No log 5.9143 414 0.5750 0.5763 0.5750 0.7583
No log 5.9429 416 0.6720 0.5955 0.6720 0.8198
No log 5.9714 418 0.7292 0.5980 0.7292 0.8539
No log 6.0 420 0.6622 0.5692 0.6622 0.8138
No log 6.0286 422 0.5407 0.6214 0.5407 0.7354
No log 6.0571 424 0.4885 0.5768 0.4885 0.6990
No log 6.0857 426 0.5081 0.5151 0.5081 0.7128
No log 6.1143 428 0.5460 0.5597 0.5460 0.7389
No log 6.1429 430 0.6175 0.4943 0.6175 0.7858
No log 6.1714 432 0.6863 0.4703 0.6863 0.8284
No log 6.2 434 0.6481 0.5354 0.6481 0.8050
No log 6.2286 436 0.5581 0.5639 0.5581 0.7471
No log 6.2571 438 0.4982 0.5631 0.4982 0.7059
No log 6.2857 440 0.4793 0.6200 0.4793 0.6923
No log 6.3143 442 0.4924 0.5845 0.4924 0.7017
No log 6.3429 444 0.5860 0.6315 0.5860 0.7655
No log 6.3714 446 0.8627 0.5617 0.8627 0.9288
No log 6.4 448 1.0744 0.5037 1.0744 1.0365
No log 6.4286 450 1.0877 0.4812 1.0877 1.0429
No log 6.4571 452 0.8900 0.5365 0.8900 0.9434
No log 6.4857 454 0.6853 0.6119 0.6853 0.8278
No log 6.5143 456 0.5707 0.5974 0.5707 0.7554
No log 6.5429 458 0.5044 0.6011 0.5044 0.7102
No log 6.5714 460 0.5261 0.6273 0.5261 0.7253
No log 6.6 462 0.5929 0.5865 0.5929 0.7700
No log 6.6286 464 0.6387 0.5173 0.6387 0.7992
No log 6.6571 466 0.6207 0.5992 0.6207 0.7879
No log 6.6857 468 0.6097 0.5993 0.6097 0.7808
No log 6.7143 470 0.5652 0.6178 0.5652 0.7518
No log 6.7429 472 0.5655 0.6189 0.5655 0.7520
No log 6.7714 474 0.5837 0.6178 0.5837 0.7640
No log 6.8 476 0.5438 0.6248 0.5438 0.7375
No log 6.8286 478 0.5347 0.6032 0.5347 0.7312
No log 6.8571 480 0.5480 0.6096 0.5480 0.7403
No log 6.8857 482 0.5495 0.5906 0.5495 0.7413
No log 6.9143 484 0.6030 0.5682 0.6030 0.7765
No log 6.9429 486 0.6567 0.4992 0.6567 0.8104
No log 6.9714 488 0.6954 0.5032 0.6954 0.8339
No log 7.0 490 0.6510 0.4992 0.6510 0.8068
No log 7.0286 492 0.5611 0.4330 0.5611 0.7490
No log 7.0571 494 0.5197 0.4845 0.5197 0.7209
No log 7.0857 496 0.4912 0.5577 0.4912 0.7009
No log 7.1143 498 0.5403 0.6137 0.5403 0.7351
0.3499 7.1429 500 0.6328 0.6201 0.6328 0.7955
0.3499 7.1714 502 0.6153 0.6344 0.6153 0.7844
0.3499 7.2 504 0.5365 0.6200 0.5365 0.7325
0.3499 7.2286 506 0.4782 0.6344 0.4782 0.6915
0.3499 7.2571 508 0.4404 0.6341 0.4404 0.6637
0.3499 7.2857 510 0.4495 0.6321 0.4495 0.6704
0.3499 7.3143 512 0.4779 0.6150 0.4779 0.6913
0.3499 7.3429 514 0.4916 0.6226 0.4916 0.7011
0.3499 7.3714 516 0.4576 0.6313 0.4576 0.6765
0.3499 7.4 518 0.4460 0.6712 0.4460 0.6678
0.3499 7.4286 520 0.4692 0.6313 0.4692 0.6850
0.3499 7.4571 522 0.4720 0.6313 0.4720 0.6870
0.3499 7.4857 524 0.4575 0.5816 0.4575 0.6764
0.3499 7.5143 526 0.4717 0.6321 0.4717 0.6868
0.3499 7.5429 528 0.5267 0.5918 0.5267 0.7257
0.3499 7.5714 530 0.5549 0.6342 0.5549 0.7449
0.3499 7.6 532 0.5573 0.6342 0.5573 0.7466
0.3499 7.6286 534 0.5601 0.6342 0.5601 0.7484
0.3499 7.6571 536 0.5813 0.6283 0.5813 0.7624
0.3499 7.6857 538 0.5635 0.6275 0.5635 0.7507
0.3499 7.7143 540 0.5942 0.5958 0.5942 0.7709
0.3499 7.7429 542 0.6458 0.6246 0.6458 0.8036
0.3499 7.7714 544 0.6263 0.6128 0.6263 0.7914
0.3499 7.8 546 0.5747 0.5966 0.5747 0.7581
0.3499 7.8286 548 0.4836 0.5845 0.4836 0.6954
0.3499 7.8571 550 0.4662 0.5738 0.4662 0.6828
0.3499 7.8857 552 0.4761 0.5770 0.4761 0.6900
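The checkpoint can be loaded for inference in the usual way. This is a hedged usage sketch: the repo id is taken from the card title, but the single-logit regression head (and hence `.squeeze().item()` producing one score) is an assumption inferred from the MSE/QWK metrics, and running it requires downloading the checkpoint:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k14_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

# Replace the placeholder with an Arabic essay to be scored for organization.
inputs = tokenizer("...", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumed regression output
```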

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors · Model size: 0.1B params · Tensor type: F32
