ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not recorded in this card. It achieves the following results on the evaluation set:

  • Loss: 0.6081
  • Qwk: 0.5448
  • Mse: 0.6081
  • Rmse: 0.7798
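
The three evaluation metrics above (QWK, MSE, RMSE) are standard for ordinal essay-scoring tasks. A minimal sketch of how they can be computed from integer score predictions, assuming scikit-learn (the card does not state which implementation was used):

```python
# Sketch: computing QWK / MSE / RMSE for integer essay scores.
# The score range and example values below are hypothetical.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def score_metrics(y_true, y_pred):
    """Return (qwk, mse, rmse) for integer score predictions."""
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    return qwk, mse, rmse

# Toy example on a hypothetical 0-4 rubric:
y_true = [0, 1, 2, 3, 4, 2]
y_pred = [0, 1, 2, 2, 4, 3]
qwk, mse, rmse = score_metrics(y_true, y_pred)
```

Quadratic weighting penalizes predictions in proportion to the squared distance from the true score, which is why QWK is preferred over plain accuracy for ordinal rubrics.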

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
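
The hyperparameters above map directly onto the Hugging Face `TrainingArguments` API. A configuration sketch (the output directory name is a placeholder, and the dataset, model head, and metric function are not shown since the card does not document them):

```python
# Sketch of the training configuration implied by the listed hyperparameters.
# Only values stated in the card are set; everything else is left at defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task7-organization",  # hypothetical name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

This would then be passed to a `Trainer` together with the model, datasets, and a metric function; note that although 100 epochs are configured, the log below stops shortly after epoch 6 (step 522), suggesting early termination.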

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0235 2 2.4881 -0.0109 2.4881 1.5774
No log 0.0471 4 1.2662 0.0754 1.2662 1.1253
No log 0.0706 6 0.7618 0.1358 0.7618 0.8728
No log 0.0941 8 0.9326 0.2105 0.9326 0.9657
No log 0.1176 10 0.7842 0.1850 0.7842 0.8856
No log 0.1412 12 0.6851 0.3029 0.6851 0.8277
No log 0.1647 14 1.0104 0.2131 1.0104 1.0052
No log 0.1882 16 1.3911 0.1606 1.3911 1.1794
No log 0.2118 18 1.0458 0.1857 1.0458 1.0227
No log 0.2353 20 0.6336 0.4638 0.6336 0.7960
No log 0.2588 22 1.1110 0.1236 1.1110 1.0540
No log 0.2824 24 1.0622 0.1466 1.0622 1.0306
No log 0.3059 26 0.6264 0.5239 0.6264 0.7914
No log 0.3294 28 0.5290 0.5304 0.5290 0.7273
No log 0.3529 30 0.5375 0.5960 0.5375 0.7331
No log 0.3765 32 0.5105 0.4681 0.5105 0.7145
No log 0.4 34 0.5043 0.5361 0.5043 0.7101
No log 0.4235 36 0.4952 0.5361 0.4952 0.7037
No log 0.4471 38 0.5707 0.5195 0.5707 0.7554
No log 0.4706 40 0.7322 0.4979 0.7322 0.8557
No log 0.4941 42 0.5455 0.6137 0.5455 0.7386
No log 0.5176 44 0.4979 0.6172 0.4979 0.7056
No log 0.5412 46 0.4788 0.6423 0.4788 0.6920
No log 0.5647 48 0.5018 0.6248 0.5018 0.7083
No log 0.5882 50 0.4777 0.6040 0.4777 0.6912
No log 0.6118 52 0.4988 0.5947 0.4988 0.7062
No log 0.6353 54 0.6250 0.5146 0.6250 0.7906
No log 0.6588 56 0.6285 0.5543 0.6285 0.7928
No log 0.6824 58 0.4838 0.5642 0.4838 0.6956
No log 0.7059 60 0.4831 0.6398 0.4831 0.6950
No log 0.7294 62 0.4848 0.6853 0.4848 0.6963
No log 0.7529 64 0.4960 0.6416 0.4960 0.7043
No log 0.7765 66 0.6035 0.6009 0.6035 0.7768
No log 0.8 68 0.5653 0.5854 0.5653 0.7518
No log 0.8235 70 0.5711 0.6635 0.5711 0.7557
No log 0.8471 72 0.7685 0.5811 0.7685 0.8767
No log 0.8706 74 0.7267 0.5846 0.7267 0.8525
No log 0.8941 76 0.4890 0.6100 0.4890 0.6993
No log 0.9176 78 0.5428 0.5643 0.5428 0.7368
No log 0.9412 80 0.6780 0.5401 0.6780 0.8234
No log 0.9647 82 0.6218 0.5401 0.6218 0.7886
No log 0.9882 84 0.4815 0.6496 0.4815 0.6939
No log 1.0118 86 0.7162 0.5506 0.7162 0.8463
No log 1.0353 88 0.7442 0.5360 0.7442 0.8627
No log 1.0588 90 0.5156 0.5538 0.5156 0.7181
No log 1.0824 92 0.6100 0.5664 0.6100 0.7810
No log 1.1059 94 0.7901 0.4784 0.7901 0.8889
No log 1.1294 96 0.7180 0.5464 0.7180 0.8474
No log 1.1529 98 0.5545 0.5950 0.5545 0.7446
No log 1.1765 100 0.5308 0.5627 0.5308 0.7286
No log 1.2 102 0.5617 0.5056 0.5617 0.7495
No log 1.2235 104 0.5920 0.4404 0.5920 0.7694
No log 1.2471 106 0.5331 0.5308 0.5331 0.7301
No log 1.2706 108 0.4874 0.5159 0.4874 0.6981
No log 1.2941 110 0.4854 0.6052 0.4854 0.6967
No log 1.3176 112 0.5222 0.6388 0.5222 0.7226
No log 1.3412 114 0.5549 0.7170 0.5549 0.7449
No log 1.3647 116 0.5773 0.6920 0.5773 0.7598
No log 1.3882 118 0.5475 0.6742 0.5475 0.7399
No log 1.4118 120 0.5062 0.6420 0.5062 0.7115
No log 1.4353 122 0.4830 0.5902 0.4830 0.6950
No log 1.4588 124 0.5208 0.5425 0.5208 0.7216
No log 1.4824 126 0.5497 0.5909 0.5497 0.7414
No log 1.5059 128 0.5261 0.6262 0.5261 0.7253
No log 1.5294 130 0.5165 0.6332 0.5165 0.7187
No log 1.5529 132 0.5190 0.6486 0.5190 0.7204
No log 1.5765 134 0.6331 0.5471 0.6331 0.7957
No log 1.6 136 0.7167 0.4979 0.7167 0.8466
No log 1.6235 138 0.5530 0.4997 0.5530 0.7436
No log 1.6471 140 0.4938 0.5390 0.4938 0.7027
No log 1.6706 142 0.4983 0.5596 0.4983 0.7059
No log 1.6941 144 0.5494 0.5149 0.5494 0.7412
No log 1.7176 146 0.6956 0.4684 0.6956 0.8340
No log 1.7412 148 0.6632 0.4701 0.6632 0.8144
No log 1.7647 150 0.5257 0.5612 0.5257 0.7250
No log 1.7882 152 0.5149 0.5434 0.5149 0.7175
No log 1.8118 154 0.5198 0.5405 0.5198 0.7210
No log 1.8353 156 0.5483 0.5612 0.5483 0.7405
No log 1.8588 158 0.6077 0.4789 0.6077 0.7796
No log 1.8824 160 0.6574 0.4457 0.6574 0.8108
No log 1.9059 162 0.5848 0.4625 0.5848 0.7647
No log 1.9294 164 0.5061 0.5118 0.5061 0.7114
No log 1.9529 166 0.5024 0.5135 0.5024 0.7088
No log 1.9765 168 0.5199 0.5462 0.5199 0.7211
No log 2.0 170 0.7086 0.4738 0.7086 0.8418
No log 2.0235 172 0.7411 0.3889 0.7411 0.8609
No log 2.0471 174 0.5676 0.4845 0.5676 0.7534
No log 2.0706 176 0.4863 0.5160 0.4863 0.6973
No log 2.0941 178 0.5389 0.5970 0.5389 0.7341
No log 2.1176 180 0.5493 0.6445 0.5493 0.7412
No log 2.1412 182 0.5614 0.5847 0.5614 0.7493
No log 2.1647 184 0.6995 0.5639 0.6995 0.8364
No log 2.1882 186 0.6642 0.5487 0.6642 0.8150
No log 2.2118 188 0.5266 0.5902 0.5266 0.7257
No log 2.2353 190 0.5164 0.5840 0.5164 0.7186
No log 2.2588 192 0.5258 0.5786 0.5258 0.7251
No log 2.2824 194 0.6955 0.5190 0.6955 0.8340
No log 2.3059 196 0.9035 0.5226 0.9035 0.9505
No log 2.3294 198 0.8390 0.5002 0.8390 0.9160
No log 2.3529 200 0.6023 0.4531 0.6023 0.7761
No log 2.3765 202 0.4923 0.5488 0.4923 0.7016
No log 2.4 204 0.4811 0.5697 0.4811 0.6936
No log 2.4235 206 0.5074 0.5668 0.5074 0.7123
No log 2.4471 208 0.5804 0.5233 0.5804 0.7618
No log 2.4706 210 0.5478 0.5587 0.5478 0.7401
No log 2.4941 212 0.5058 0.5682 0.5058 0.7112
No log 2.5176 214 0.4859 0.5167 0.4859 0.6971
No log 2.5412 216 0.4869 0.5662 0.4869 0.6978
No log 2.5647 218 0.5000 0.5473 0.5000 0.7071
No log 2.5882 220 0.5098 0.5379 0.5098 0.7140
No log 2.6118 222 0.4860 0.5367 0.4860 0.6972
No log 2.6353 224 0.4953 0.5485 0.4953 0.7037
No log 2.6588 226 0.5432 0.5039 0.5432 0.7371
No log 2.6824 228 0.5392 0.5639 0.5392 0.7343
No log 2.7059 230 0.5364 0.4753 0.5364 0.7324
No log 2.7294 232 0.5121 0.4984 0.5121 0.7156
No log 2.7529 234 0.5417 0.4493 0.5417 0.7360
No log 2.7765 236 0.7240 0.4835 0.7240 0.8509
No log 2.8 238 0.8268 0.3739 0.8268 0.9093
No log 2.8235 240 0.8385 0.3228 0.8385 0.9157
No log 2.8471 242 0.7135 0.4549 0.7135 0.8447
No log 2.8706 244 0.5653 0.4945 0.5653 0.7519
No log 2.8941 246 0.5279 0.4729 0.5279 0.7266
No log 2.9176 248 0.5425 0.5367 0.5425 0.7366
No log 2.9412 250 0.5498 0.4997 0.5498 0.7415
No log 2.9647 252 0.6449 0.4862 0.6449 0.8031
No log 2.9882 254 0.7647 0.5139 0.7647 0.8745
No log 3.0118 256 0.7998 0.4942 0.7998 0.8943
No log 3.0353 258 0.6679 0.4815 0.6679 0.8172
No log 3.0588 260 0.5307 0.5845 0.5307 0.7285
No log 3.0824 262 0.5210 0.5845 0.5210 0.7218
No log 3.1059 264 0.5922 0.5243 0.5922 0.7695
No log 3.1294 266 0.6412 0.4723 0.6412 0.8007
No log 3.1529 268 0.5896 0.5149 0.5896 0.7678
No log 3.1765 270 0.4862 0.6111 0.4862 0.6973
No log 3.2 272 0.4716 0.5796 0.4716 0.6868
No log 3.2235 274 0.5187 0.5584 0.5187 0.7202
No log 3.2471 276 0.4997 0.5794 0.4997 0.7069
No log 3.2706 278 0.4559 0.6255 0.4559 0.6752
No log 3.2941 280 0.5245 0.6010 0.5245 0.7242
No log 3.3176 282 0.6276 0.5688 0.6276 0.7922
No log 3.3412 284 0.5869 0.6237 0.5869 0.7661
No log 3.3647 286 0.5201 0.5583 0.5201 0.7212
No log 3.3882 288 0.4859 0.4955 0.4859 0.6971
No log 3.4118 290 0.4863 0.5951 0.4863 0.6974
No log 3.4353 292 0.5168 0.6337 0.5168 0.7189
No log 3.4588 294 0.5600 0.6509 0.5600 0.7484
No log 3.4824 296 0.7124 0.5684 0.7124 0.8441
No log 3.5059 298 0.7673 0.5357 0.7673 0.8760
No log 3.5294 300 0.6188 0.5727 0.6188 0.7867
No log 3.5529 302 0.5294 0.5499 0.5294 0.7276
No log 3.5765 304 0.5038 0.5947 0.5038 0.7098
No log 3.6 306 0.4828 0.6228 0.4828 0.6949
No log 3.6235 308 0.4730 0.5698 0.4730 0.6877
No log 3.6471 310 0.5038 0.5479 0.5038 0.7098
No log 3.6706 312 0.4957 0.5479 0.4957 0.7041
No log 3.6941 314 0.4792 0.5195 0.4792 0.6923
No log 3.7176 316 0.5617 0.5934 0.5617 0.7495
No log 3.7412 318 0.6720 0.5219 0.6720 0.8198
No log 3.7647 320 0.6570 0.5409 0.6570 0.8106
No log 3.7882 322 0.5441 0.5934 0.5441 0.7376
No log 3.8118 324 0.4858 0.6032 0.4858 0.6970
No log 3.8353 326 0.4776 0.5826 0.4776 0.6911
No log 3.8588 328 0.4999 0.5660 0.4999 0.7070
No log 3.8824 330 0.5116 0.5515 0.5116 0.7153
No log 3.9059 332 0.5170 0.5970 0.5170 0.7190
No log 3.9294 334 0.5348 0.5229 0.5348 0.7313
No log 3.9529 336 0.5607 0.5276 0.5607 0.7488
No log 3.9765 338 0.5554 0.5291 0.5554 0.7453
No log 4.0 340 0.5226 0.5093 0.5226 0.7229
No log 4.0235 342 0.4851 0.5488 0.4851 0.6965
No log 4.0471 344 0.4829 0.5340 0.4829 0.6949
No log 4.0706 346 0.4885 0.5269 0.4885 0.6989
No log 4.0941 348 0.5187 0.5697 0.5187 0.7202
No log 4.1176 350 0.6019 0.5712 0.6019 0.7759
No log 4.1412 352 0.6773 0.5824 0.6773 0.8230
No log 4.1647 354 0.7490 0.5710 0.7490 0.8655
No log 4.1882 356 0.7016 0.5846 0.7016 0.8376
No log 4.2118 358 0.5608 0.5973 0.5608 0.7488
No log 4.2353 360 0.4931 0.6662 0.4931 0.7022
No log 4.2588 362 0.4727 0.7022 0.4727 0.6875
No log 4.2824 364 0.4659 0.7022 0.4659 0.6826
No log 4.3059 366 0.4620 0.6472 0.4620 0.6797
No log 4.3294 368 0.5079 0.5840 0.5079 0.7127
No log 4.3529 370 0.5455 0.6285 0.5455 0.7386
No log 4.3765 372 0.5297 0.6277 0.5297 0.7278
No log 4.4 374 0.4690 0.6359 0.4690 0.6848
No log 4.4235 376 0.4636 0.6411 0.4636 0.6808
No log 4.4471 378 0.4539 0.6411 0.4539 0.6737
No log 4.4706 380 0.4521 0.7114 0.4521 0.6724
No log 4.4941 382 0.4689 0.6158 0.4689 0.6848
No log 4.5176 384 0.4957 0.5933 0.4957 0.7041
No log 4.5412 386 0.4712 0.5910 0.4712 0.6864
No log 4.5647 388 0.4440 0.6542 0.4440 0.6663
No log 4.5882 390 0.4482 0.6739 0.4482 0.6695
No log 4.6118 392 0.4530 0.6929 0.4530 0.6731
No log 4.6353 394 0.4510 0.6727 0.4510 0.6716
No log 4.6588 396 0.4613 0.6252 0.4613 0.6792
No log 4.6824 398 0.5211 0.5884 0.5211 0.7219
No log 4.7059 400 0.5646 0.5801 0.5646 0.7514
No log 4.7294 402 0.5151 0.5884 0.5151 0.7177
No log 4.7529 404 0.4604 0.6252 0.4604 0.6786
No log 4.7765 406 0.4553 0.6210 0.4553 0.6748
No log 4.8 408 0.4541 0.6210 0.4541 0.6739
No log 4.8235 410 0.4443 0.6171 0.4443 0.6665
No log 4.8471 412 0.4735 0.6058 0.4735 0.6881
No log 4.8706 414 0.5540 0.5538 0.5540 0.7443
No log 4.8941 416 0.5695 0.5107 0.5695 0.7547
No log 4.9176 418 0.4899 0.5909 0.4899 0.6999
No log 4.9412 420 0.4373 0.6344 0.4373 0.6613
No log 4.9647 422 0.4394 0.6344 0.4394 0.6629
No log 4.9882 424 0.4447 0.5625 0.4447 0.6669
No log 5.0118 426 0.4661 0.6145 0.4661 0.6827
No log 5.0353 428 0.4833 0.6160 0.4833 0.6952
No log 5.0588 430 0.5115 0.6187 0.5115 0.7152
No log 5.0824 432 0.5046 0.6127 0.5046 0.7103
No log 5.1059 434 0.5023 0.5640 0.5023 0.7087
No log 5.1294 436 0.4934 0.5604 0.4934 0.7024
No log 5.1529 438 0.4849 0.5565 0.4849 0.6964
No log 5.1765 440 0.4936 0.5089 0.4936 0.7025
No log 5.2 442 0.5546 0.5677 0.5546 0.7447
No log 5.2235 444 0.6501 0.4418 0.6501 0.8063
No log 5.2471 446 0.6727 0.4614 0.6727 0.8202
No log 5.2706 448 0.6483 0.4812 0.6483 0.8052
No log 5.2941 450 0.6701 0.4812 0.6701 0.8186
No log 5.3176 452 0.6549 0.4812 0.6549 0.8093
No log 5.3412 454 0.6373 0.5801 0.6373 0.7983
No log 5.3647 456 0.6347 0.5966 0.6347 0.7967
No log 5.3882 458 0.5572 0.5639 0.5572 0.7465
No log 5.4118 460 0.4966 0.5947 0.4966 0.7047
No log 5.4353 462 0.4493 0.6339 0.4493 0.6703
No log 5.4588 464 0.4445 0.6339 0.4445 0.6667
No log 5.4824 466 0.4798 0.5543 0.4798 0.6927
No log 5.5059 468 0.5337 0.5845 0.5337 0.7305
No log 5.5294 470 0.5086 0.5741 0.5086 0.7132
No log 5.5529 472 0.4426 0.5860 0.4426 0.6653
No log 5.5765 474 0.4521 0.6943 0.4521 0.6724
No log 5.6 476 0.4815 0.6606 0.4815 0.6939
No log 5.6235 478 0.4847 0.6847 0.4847 0.6962
No log 5.6471 480 0.5048 0.6078 0.5048 0.7105
No log 5.6706 482 0.6309 0.5824 0.6309 0.7943
No log 5.6941 484 0.6781 0.5396 0.6781 0.8235
No log 5.7176 486 0.6102 0.5308 0.6102 0.7812
No log 5.7412 488 0.5064 0.5559 0.5064 0.7116
No log 5.7647 490 0.4756 0.5647 0.4756 0.6896
No log 5.7882 492 0.4816 0.5723 0.4816 0.6940
No log 5.8118 494 0.4950 0.5501 0.4950 0.7036
No log 5.8353 496 0.5363 0.5015 0.5363 0.7323
No log 5.8588 498 0.5413 0.5223 0.5413 0.7357
0.3029 5.8824 500 0.5207 0.5457 0.5207 0.7216
0.3029 5.9059 502 0.5421 0.5276 0.5421 0.7363
0.3029 5.9294 504 0.5428 0.5226 0.5428 0.7367
0.3029 5.9529 506 0.5165 0.5164 0.5165 0.7187
0.3029 5.9765 508 0.4762 0.5288 0.4762 0.6900
0.3029 6.0 510 0.4636 0.6344 0.4636 0.6809
0.3029 6.0235 512 0.4693 0.6228 0.4693 0.6851
0.3029 6.0471 514 0.5190 0.4864 0.5190 0.7204
0.3029 6.0706 516 0.5736 0.4764 0.5736 0.7573
0.3029 6.0941 518 0.6084 0.4964 0.6084 0.7800
0.3029 6.1176 520 0.6298 0.5076 0.6298 0.7936
0.3029 6.1412 522 0.6081 0.5448 0.6081 0.7798

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02
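
Loading the model from the Hub follows the standard transformers pattern. A sketch, assuming a sequence-classification head for the organization score (the exact head type and label mapping are not documented in this card):

```python
# Sketch: loading this model for inference (requires network access to the Hub).
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Hypothetical usage on an Arabic essay text:
inputs = tokenizer("نص المقال هنا", return_tensors="pt")
outputs = model(**inputs)
```

The pinned framework versions above (Transformers 4.44.2, PyTorch 2.4.0+cu118) are a reasonable starting point if exact reproducibility is needed.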