ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8730
  • Qwk: 0.5063
  • Mse: 0.8730
  • Rmse: 0.9344
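These metrics can be recomputed from gold scores and predictions. The sketch below uses hypothetical integer organization scores (the card does not state the score range; 1–4 is an assumption) and implements quadratic weighted kappa directly in NumPy so the three values are computed the same way side by side:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Quadratic weighted kappa between two integer score vectors."""
    n = max_rating - min_rating + 1
    # Observed agreement matrix (rows = gold, cols = predicted)
    O = np.zeros((n, n))
    for t, p in zip(y_true, y_pred):
        O[t - min_rating, p - min_rating] += 1
    # Quadratic disagreement weights
    idx = np.arange(n)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n - 1) ** 2
    # Expected matrix from the marginal histograms, scaled to the same total
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

# Hypothetical gold scores and model predictions
y_true = np.array([3, 2, 4, 1, 3, 2])
y_pred = np.array([3, 2, 3, 2, 4, 2])

mse = float(np.mean((y_true - y_pred) ** 2))
rmse = float(np.sqrt(mse))
qwk = quadratic_weighted_kappa(y_true, y_pred, 1, 4)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
# → Qwk: 0.6667  Mse: 0.5000  Rmse: 0.7071
```

Note that Mse equals the squared Rmse, which is why the Loss and Mse columns in the table below coincide (the model is evaluated with an MSE-style loss).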

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
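The list above corresponds roughly to the following 🤗 Transformers `TrainingArguments` (a sketch, not the exact training script; `output_dir` is an assumption, and the Adam betas/epsilon shown are the library defaults restated explicitly):

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above
training_args = TrainingArguments(
    output_dir="./results",          # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```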

Training results

The training loss is logged every 500 steps, so rows before step 500 show "No log" in the first column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0177 2 4.6530 0.0129 4.6530 2.1571
No log 0.0354 4 2.8340 -0.0463 2.8340 1.6835
No log 0.0531 6 1.7816 0.0504 1.7816 1.3348
No log 0.0708 8 1.2664 0.0811 1.2664 1.1254
No log 0.0885 10 1.1792 0.1689 1.1792 1.0859
No log 0.1062 12 1.2133 0.1689 1.2133 1.1015
No log 0.1239 14 1.2233 0.1470 1.2233 1.1060
No log 0.1416 16 1.3231 0.0819 1.3231 1.1502
No log 0.1593 18 2.0167 0.0527 2.0167 1.4201
No log 0.1770 20 2.2265 0.0322 2.2265 1.4921
No log 0.1947 22 1.9444 0.1438 1.9444 1.3944
No log 0.2124 24 1.4698 0.2457 1.4698 1.2123
No log 0.2301 26 1.1036 0.3142 1.1036 1.0505
No log 0.2478 28 1.0979 0.2290 1.0979 1.0478
No log 0.2655 30 1.1098 0.2401 1.1098 1.0534
No log 0.2832 32 1.1054 0.2556 1.1054 1.0514
No log 0.3009 34 1.1251 0.2409 1.1251 1.0607
No log 0.3186 36 1.1654 0.2395 1.1654 1.0795
No log 0.3363 38 1.1133 0.3683 1.1133 1.0551
No log 0.3540 40 1.0675 0.3326 1.0675 1.0332
No log 0.3717 42 1.0326 0.3664 1.0326 1.0162
No log 0.3894 44 1.0175 0.4469 1.0175 1.0087
No log 0.4071 46 1.0998 0.3321 1.0998 1.0487
No log 0.4248 48 1.2027 0.3663 1.2027 1.0967
No log 0.4425 50 1.0710 0.3656 1.0710 1.0349
No log 0.4602 52 0.9467 0.4444 0.9467 0.9730
No log 0.4779 54 0.9381 0.4201 0.9381 0.9686
No log 0.4956 56 0.9438 0.4631 0.9438 0.9715
No log 0.5133 58 0.9929 0.4296 0.9929 0.9964
No log 0.5310 60 1.0101 0.5251 1.0101 1.0050
No log 0.5487 62 1.0485 0.5283 1.0485 1.0239
No log 0.5664 64 1.0519 0.4832 1.0519 1.0256
No log 0.5841 66 1.0359 0.5014 1.0359 1.0178
No log 0.6018 68 1.0278 0.4726 1.0278 1.0138
No log 0.6195 70 1.0198 0.4511 1.0198 1.0099
No log 0.6372 72 0.9678 0.4501 0.9678 0.9838
No log 0.6549 74 0.8521 0.5893 0.8521 0.9231
No log 0.6726 76 0.8572 0.5393 0.8572 0.9259
No log 0.6903 78 1.0162 0.4749 1.0162 1.0081
No log 0.7080 80 0.9724 0.4440 0.9724 0.9861
No log 0.7257 82 0.8098 0.5345 0.8098 0.8999
No log 0.7434 84 0.7838 0.5766 0.7838 0.8853
No log 0.7611 86 0.9511 0.4696 0.9511 0.9753
No log 0.7788 88 0.9903 0.4565 0.9903 0.9951
No log 0.7965 90 0.8681 0.5160 0.8681 0.9317
No log 0.8142 92 0.8789 0.4846 0.8789 0.9375
No log 0.8319 94 1.0412 0.4792 1.0412 1.0204
No log 0.8496 96 1.1390 0.4921 1.1390 1.0672
No log 0.8673 98 1.1789 0.5050 1.1789 1.0858
No log 0.8850 100 1.1976 0.4733 1.1976 1.0944
No log 0.9027 102 1.0632 0.3929 1.0632 1.0311
No log 0.9204 104 0.9002 0.5201 0.9002 0.9488
No log 0.9381 106 0.8326 0.5885 0.8326 0.9125
No log 0.9558 108 0.8451 0.5094 0.8451 0.9193
No log 0.9735 110 0.8828 0.5025 0.8828 0.9396
No log 0.9912 112 0.9403 0.5249 0.9403 0.9697
No log 1.0088 114 1.1123 0.4952 1.1123 1.0546
No log 1.0265 116 1.1467 0.3723 1.1467 1.0709
No log 1.0442 118 0.9556 0.5258 0.9556 0.9775
No log 1.0619 120 0.8222 0.5898 0.8222 0.9067
No log 1.0796 122 0.8112 0.5443 0.8112 0.9006
No log 1.0973 124 0.8284 0.5532 0.8284 0.9101
No log 1.1150 126 0.8342 0.6077 0.8342 0.9133
No log 1.1327 128 0.9273 0.5393 0.9273 0.9630
No log 1.1504 130 1.0005 0.5051 1.0005 1.0003
No log 1.1681 132 1.0376 0.4702 1.0376 1.0186
No log 1.1858 134 0.9781 0.5260 0.9781 0.9890
No log 1.2035 136 0.8442 0.5698 0.8442 0.9188
No log 1.2212 138 0.8239 0.5545 0.8239 0.9077
No log 1.2389 140 0.8121 0.5545 0.8121 0.9011
No log 1.2566 142 0.8209 0.6384 0.8209 0.9060
No log 1.2743 144 0.7865 0.5922 0.7865 0.8868
No log 1.2920 146 0.8019 0.5922 0.8019 0.8955
No log 1.3097 148 0.9633 0.6004 0.9633 0.9815
No log 1.3274 150 1.2361 0.4607 1.2361 1.1118
No log 1.3451 152 1.2404 0.4426 1.2404 1.1137
No log 1.3628 154 1.0077 0.5050 1.0077 1.0038
No log 1.3805 156 0.8200 0.5648 0.8200 0.9055
No log 1.3982 158 0.7510 0.5370 0.7510 0.8666
No log 1.4159 160 0.7901 0.5127 0.7901 0.8889
No log 1.4336 162 0.7969 0.5094 0.7969 0.8927
No log 1.4513 164 0.8158 0.5702 0.8158 0.9032
No log 1.4690 166 1.0096 0.4916 1.0096 1.0048
No log 1.4867 168 1.1921 0.4681 1.1921 1.0919
No log 1.5044 170 1.1849 0.4612 1.1849 1.0885
No log 1.5221 172 1.1609 0.4618 1.1609 1.0775
No log 1.5398 174 1.0585 0.5237 1.0585 1.0288
No log 1.5575 176 0.9185 0.5958 0.9185 0.9584
No log 1.5752 178 0.8924 0.5958 0.8924 0.9447
No log 1.5929 180 0.9097 0.6034 0.9097 0.9538
No log 1.6106 182 0.9486 0.5375 0.9486 0.9739
No log 1.6283 184 0.9697 0.5548 0.9697 0.9847
No log 1.6460 186 1.0241 0.5353 1.0241 1.0120
No log 1.6637 188 0.9855 0.5014 0.9855 0.9927
No log 1.6814 190 1.0174 0.4902 1.0174 1.0087
No log 1.6991 192 1.0474 0.4708 1.0474 1.0234
No log 1.7168 194 1.0166 0.4545 1.0166 1.0083
No log 1.7345 196 1.0408 0.3916 1.0408 1.0202
No log 1.7522 198 1.1155 0.3632 1.1155 1.0562
No log 1.7699 200 1.1555 0.3767 1.1555 1.0749
No log 1.7876 202 1.0097 0.4913 1.0097 1.0048
No log 1.8053 204 0.9183 0.6023 0.9183 0.9583
No log 1.8230 206 0.8291 0.6171 0.8291 0.9105
No log 1.8407 208 0.8063 0.5988 0.8063 0.8980
No log 1.8584 210 0.8186 0.6066 0.8186 0.9047
No log 1.8761 212 0.8413 0.5649 0.8413 0.9172
No log 1.8938 214 0.9178 0.5899 0.9178 0.9580
No log 1.9115 216 0.9205 0.5412 0.9205 0.9594
No log 1.9292 218 0.8359 0.5844 0.8359 0.9143
No log 1.9469 220 0.8081 0.5902 0.8081 0.8990
No log 1.9646 222 0.9095 0.5333 0.9095 0.9537
No log 1.9823 224 1.0667 0.4918 1.0667 1.0328
No log 2.0000 226 1.2466 0.5003 1.2466 1.1165
No log 2.0177 228 1.2264 0.5003 1.2264 1.1074
No log 2.0354 230 1.1023 0.5177 1.1023 1.0499
No log 2.0531 232 0.9725 0.5273 0.9725 0.9861
No log 2.0708 234 0.9183 0.5261 0.9183 0.9583
No log 2.0885 236 0.9726 0.5398 0.9726 0.9862
No log 2.1062 238 0.9447 0.5479 0.9447 0.9720
No log 2.1239 240 0.8646 0.5810 0.8646 0.9298
No log 2.1416 242 0.8650 0.5786 0.8650 0.9301
No log 2.1593 244 0.8852 0.5766 0.8852 0.9409
No log 2.1770 246 0.9417 0.5726 0.9417 0.9704
No log 2.1947 248 0.9024 0.5176 0.9024 0.9500
No log 2.2124 250 0.8799 0.5390 0.8799 0.9380
No log 2.2301 252 0.9506 0.5393 0.9506 0.9750
No log 2.2478 254 1.0722 0.5095 1.0722 1.0355
No log 2.2655 256 0.9753 0.5544 0.9753 0.9875
No log 2.2832 258 0.9333 0.5292 0.9333 0.9661
No log 2.3009 260 0.8759 0.5377 0.8759 0.9359
No log 2.3186 262 0.8621 0.5738 0.8621 0.9285
No log 2.3363 264 0.9702 0.5153 0.9702 0.9850
No log 2.3540 266 0.9028 0.5427 0.9028 0.9502
No log 2.3717 268 0.8014 0.5264 0.8014 0.8952
No log 2.3894 270 0.7765 0.5291 0.7765 0.8812
No log 2.4071 272 0.8162 0.5763 0.8162 0.9034
No log 2.4248 274 0.9270 0.5474 0.9270 0.9628
No log 2.4425 276 0.9888 0.5171 0.9888 0.9944
No log 2.4602 278 0.9477 0.5267 0.9477 0.9735
No log 2.4779 280 0.8011 0.5708 0.8011 0.8951
No log 2.4956 282 0.7576 0.5519 0.7576 0.8704
No log 2.5133 284 0.7745 0.5303 0.7745 0.8801
No log 2.5310 286 0.8182 0.5303 0.8182 0.9045
No log 2.5487 288 0.8492 0.5447 0.8492 0.9215
No log 2.5664 290 0.7843 0.5424 0.7843 0.8856
No log 2.5841 292 0.7679 0.5519 0.7679 0.8763
No log 2.6018 294 0.7693 0.5610 0.7693 0.8771
No log 2.6195 296 0.7913 0.6069 0.7913 0.8896
No log 2.6372 298 0.9052 0.5665 0.9052 0.9514
No log 2.6549 300 1.0545 0.5271 1.0545 1.0269
No log 2.6726 302 1.0269 0.5231 1.0269 1.0133
No log 2.6903 304 0.9165 0.6084 0.9165 0.9574
No log 2.7080 306 0.9000 0.5976 0.9000 0.9487
No log 2.7257 308 0.9645 0.5594 0.9645 0.9821
No log 2.7434 310 1.0746 0.5265 1.0746 1.0366
No log 2.7611 312 0.9998 0.5572 0.9998 0.9999
No log 2.7788 314 0.8680 0.5645 0.8680 0.9317
No log 2.7965 316 0.8381 0.5868 0.8381 0.9155
No log 2.8142 318 0.8391 0.5937 0.8391 0.9160
No log 2.8319 320 0.8772 0.5418 0.8772 0.9366
No log 2.8496 322 0.9099 0.4862 0.9099 0.9539
No log 2.8673 324 0.9779 0.5059 0.9779 0.9889
No log 2.8850 326 0.9987 0.5059 0.9987 0.9993
No log 2.9027 328 0.9589 0.4987 0.9589 0.9793
No log 2.9204 330 0.8589 0.5951 0.8589 0.9268
No log 2.9381 332 0.8405 0.4996 0.8405 0.9168
No log 2.9558 334 0.8216 0.4575 0.8216 0.9064
No log 2.9735 336 0.8392 0.5752 0.8392 0.9161
No log 2.9912 338 0.9168 0.5426 0.9168 0.9575
No log 3.0088 340 0.9503 0.4886 0.9503 0.9749
No log 3.0265 342 0.9077 0.5333 0.9077 0.9527
No log 3.0442 344 0.8740 0.5673 0.8740 0.9349
No log 3.0619 346 0.8631 0.5731 0.8631 0.9290
No log 3.0796 348 0.8840 0.5731 0.8840 0.9402
No log 3.0973 350 0.9026 0.5706 0.9026 0.9500
No log 3.1150 352 0.9294 0.5706 0.9294 0.9641
No log 3.1327 354 0.8934 0.5731 0.8934 0.9452
No log 3.1504 356 0.8402 0.5731 0.8402 0.9166
No log 3.1681 358 0.8440 0.6171 0.8440 0.9187
No log 3.1858 360 0.9251 0.5249 0.9251 0.9618
No log 3.2035 362 0.8939 0.5370 0.8939 0.9455
No log 3.2212 364 0.8927 0.5218 0.8927 0.9448
No log 3.2389 366 0.9285 0.4902 0.9285 0.9636
No log 3.2566 368 0.9808 0.4960 0.9808 0.9904
No log 3.2743 370 0.9801 0.4677 0.9801 0.9900
No log 3.2920 372 0.9092 0.5245 0.9092 0.9535
No log 3.3097 374 0.8406 0.5239 0.8406 0.9168
No log 3.3274 376 0.8216 0.5248 0.8216 0.9064
No log 3.3451 378 0.8183 0.5523 0.8183 0.9046
No log 3.3628 380 0.9024 0.5865 0.9024 0.9499
No log 3.3805 382 1.1282 0.4943 1.1282 1.0621
No log 3.3982 384 1.2797 0.4851 1.2797 1.1312
No log 3.4159 386 1.2172 0.4705 1.2172 1.1033
No log 3.4336 388 1.0509 0.5329 1.0509 1.0251
No log 3.4513 390 0.9883 0.5787 0.9883 0.9941
No log 3.4690 392 0.9326 0.6207 0.9326 0.9657
No log 3.4867 394 0.8619 0.5968 0.8619 0.9284
No log 3.5044 396 0.8178 0.5696 0.8178 0.9043
No log 3.5221 398 0.8087 0.5806 0.8087 0.8993
No log 3.5398 400 0.8403 0.5865 0.8403 0.9167
No log 3.5575 402 0.8725 0.5769 0.8725 0.9341
No log 3.5752 404 0.8911 0.5670 0.8911 0.9440
No log 3.5929 406 0.8810 0.6034 0.8810 0.9386
No log 3.6106 408 0.8325 0.5656 0.8325 0.9124
No log 3.6283 410 0.8373 0.5783 0.8373 0.9150
No log 3.6460 412 0.9149 0.5913 0.9149 0.9565
No log 3.6637 414 0.8802 0.6034 0.8802 0.9382
No log 3.6814 416 0.8030 0.6082 0.8030 0.8961
No log 3.6991 418 0.8178 0.5992 0.8178 0.9043
No log 3.7168 420 0.8152 0.5738 0.8152 0.9029
No log 3.7345 422 0.8389 0.5636 0.8389 0.9159
No log 3.7522 424 0.8209 0.5659 0.8209 0.9060
No log 3.7699 426 0.8157 0.5601 0.8157 0.9032
No log 3.7876 428 0.8007 0.5451 0.8007 0.8948
No log 3.8053 430 0.8362 0.5601 0.8362 0.9145
No log 3.8230 432 0.8967 0.5451 0.8967 0.9469
No log 3.8407 434 0.9055 0.5513 0.9055 0.9516
No log 3.8584 436 0.9026 0.5224 0.9026 0.9500
No log 3.8761 438 0.8774 0.5166 0.8774 0.9367
No log 3.8938 440 0.8533 0.5400 0.8533 0.9237
No log 3.9115 442 0.8043 0.5124 0.8043 0.8968
No log 3.9292 444 0.7571 0.5770 0.7571 0.8701
No log 3.9469 446 0.7299 0.6227 0.7299 0.8544
No log 3.9646 448 0.7208 0.6444 0.7208 0.8490
No log 3.9823 450 0.7328 0.6646 0.7328 0.8560
No log 4.0000 452 0.7666 0.6249 0.7666 0.8756
No log 4.0177 454 0.8418 0.5716 0.8418 0.9175
No log 4.0354 456 0.8939 0.5040 0.8939 0.9454
No log 4.0531 458 0.9003 0.5040 0.9003 0.9489
No log 4.0708 460 0.8819 0.5637 0.8819 0.9391
No log 4.0885 462 0.9156 0.5513 0.9156 0.9569
No log 4.1062 464 0.9694 0.5101 0.9694 0.9846
No log 4.1239 466 0.9067 0.5614 0.9067 0.9522
No log 4.1416 468 0.8279 0.5753 0.8279 0.9099
No log 4.1593 470 0.8097 0.5455 0.8097 0.8998
No log 4.1770 472 0.7879 0.5381 0.7879 0.8876
No log 4.1947 474 0.7953 0.5501 0.7953 0.8918
No log 4.2124 476 0.8825 0.5556 0.8825 0.9394
No log 4.2301 478 1.0188 0.4810 1.0188 1.0093
No log 4.2478 480 1.0348 0.4810 1.0348 1.0173
No log 4.2655 482 0.9933 0.4820 0.9933 0.9967
No log 4.2832 484 0.9676 0.4936 0.9676 0.9837
No log 4.3009 486 0.9100 0.5532 0.9100 0.9539
No log 4.3186 488 0.8233 0.5756 0.8233 0.9074
No log 4.3363 490 0.7701 0.5862 0.7701 0.8776
No log 4.3540 492 0.7894 0.6056 0.7894 0.8885
No log 4.3717 494 0.8591 0.5945 0.8591 0.9269
No log 4.3894 496 0.9127 0.5530 0.9127 0.9553
No log 4.4071 498 0.8337 0.5841 0.8337 0.9131
0.3344 4.4248 500 0.8238 0.5562 0.8238 0.9076
0.3344 4.4425 502 0.8092 0.5272 0.8092 0.8995
0.3344 4.4602 504 0.8031 0.5733 0.8031 0.8962
0.3344 4.4779 506 0.8419 0.5365 0.8419 0.9176
0.3344 4.4956 508 0.9024 0.5268 0.9024 0.9500
0.3344 4.5133 510 0.9449 0.4133 0.9449 0.9721
0.3344 4.5310 512 0.8928 0.5063 0.8928 0.9449
0.3344 4.5487 514 0.8730 0.5063 0.8730 0.9344
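The checkpoint can be loaded with the standard Transformers Auto classes. This is a minimal sketch: the repo id is taken from the title above, and whether the head emits a single regression score or per-class logits is not stated in this card, so inspect `logits.shape` before interpreting the output:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id assumed from the model title above
model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Score a (placeholder) Arabic essay for organization
inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # "essay text here"
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape, logits)
```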

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task2_organization

Finetuned from aubmindlab/bert-base-arabertv02