ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in the card metadata. It achieves the following results on the evaluation set:

  • Loss: 0.5717
  • QWK (quadratic weighted kappa): 0.4979
  • MSE: 0.5717
  • RMSE: 0.7561
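For reference, QWK, MSE, and RMSE can be computed from labels and predictions in a few lines of plain Python. This is a generic sketch of the metrics, not the exact evaluation code used for this model:

```python
import math

def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred):
    """QWK over integer-valued ratings (generic implementation)."""
    ratings = sorted(set(y_true) | set(y_pred))
    n = len(ratings)
    idx = {r: i for i, r in enumerate(ratings)}
    # Observed confusion matrix.
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[idx[t]][idx[p]] += 1
    # Marginal histograms give the chance-agreement (expected) matrix.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(col) for col in zip(*observed)]
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            weight = (i - j) ** 2 / (n - 1) ** 2 if n > 1 else 0.0
            expected = hist_true[i] * hist_pred[j] / total
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den if den else 1.0

labels = [0, 1, 2, 2, 1]
preds = [0, 1, 1, 2, 1]
print(quadratic_weighted_kappa(labels, preds),
      mse(labels, preds),
      math.sqrt(mse(labels, preds)))
```

Note that RMSE is just the square root of MSE, consistent with the reported pair above (0.7561 ≈ √0.5717).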

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
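A sketch of how these hyperparameters map onto `transformers.TrainingArguments` (the `output_dir`, the single-label regression head, and the commented-out dataset wiring are assumptions for illustration; the listed Adam betas and epsilon are the defaults of the optimizer `Trainer` uses):

```python
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Hyperparameters transcribed from the list above.
args = TrainingArguments(
    output_dir="arabert-task7-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# num_labels=1 assumes a regression-style scoring head, which is
# suggested by the MSE/RMSE/QWK metrics but not stated in the card.
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1
)
tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")

# The datasets are not specified in the card, so the Trainer wiring
# is left as a placeholder:
# trainer = Trainer(model=model, args=args,
#                   train_dataset=..., eval_dataset=...,
#                   compute_metrics=...)
# trainer.train()
```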

Training results

Evaluation ran every 2 steps. The training loss is shown as "No log" until step 500, the first step at which it was logged.

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0235 2 2.5530 0.0070 2.5530 1.5978
No log 0.0471 4 1.2030 0.0487 1.2030 1.0968
No log 0.0706 6 0.7308 0.1786 0.7308 0.8549
No log 0.0941 8 0.8797 0.1911 0.8797 0.9379
No log 0.1176 10 1.1941 -0.1098 1.1941 1.0928
No log 0.1412 12 1.2298 0.0120 1.2298 1.1089
No log 0.1647 14 1.1563 0.1557 1.1563 1.0753
No log 0.1882 16 0.8465 0.1628 0.8465 0.9201
No log 0.2118 18 0.7764 0.0 0.7764 0.8811
No log 0.2353 20 0.7237 0.0428 0.7237 0.8507
No log 0.2588 22 0.6865 0.0428 0.6865 0.8285
No log 0.2824 24 0.6567 0.0840 0.6567 0.8104
No log 0.3059 26 0.6463 0.3166 0.6463 0.8039
No log 0.3294 28 0.8062 0.2211 0.8062 0.8979
No log 0.3529 30 1.0529 0.2455 1.0529 1.0261
No log 0.3765 32 0.9873 0.3118 0.9873 0.9936
No log 0.4 34 0.6362 0.3737 0.6362 0.7976
No log 0.4235 36 0.5521 0.4591 0.5521 0.7430
No log 0.4471 38 0.5893 0.4802 0.5893 0.7677
No log 0.4706 40 0.6882 0.3937 0.6882 0.8296
No log 0.4941 42 0.6522 0.4575 0.6522 0.8076
No log 0.5176 44 0.6045 0.5547 0.6045 0.7775
No log 0.5412 46 0.5539 0.5742 0.5539 0.7443
No log 0.5647 48 0.6128 0.5705 0.6128 0.7828
No log 0.5882 50 0.6284 0.5931 0.6284 0.7927
No log 0.6118 52 0.5447 0.5653 0.5447 0.7380
No log 0.6353 54 0.5737 0.3833 0.5737 0.7574
No log 0.6588 56 0.5983 0.4335 0.5983 0.7735
No log 0.6824 58 0.5753 0.4315 0.5753 0.7585
No log 0.7059 60 0.5315 0.5711 0.5315 0.7290
No log 0.7294 62 0.5138 0.6078 0.5138 0.7168
No log 0.7529 64 0.5807 0.4634 0.5807 0.7620
No log 0.7765 66 0.9321 0.3309 0.9321 0.9655
No log 0.8 68 0.9915 0.2746 0.9915 0.9958
No log 0.8235 70 0.7422 0.3869 0.7422 0.8615
No log 0.8471 72 0.5757 0.4934 0.5757 0.7587
No log 0.8706 74 0.7561 0.4400 0.7561 0.8695
No log 0.8941 76 0.7825 0.4222 0.7825 0.8846
No log 0.9176 78 0.6148 0.5003 0.6148 0.7841
No log 0.9412 80 0.6823 0.4474 0.6823 0.8260
No log 0.9647 82 0.7587 0.3873 0.7587 0.8711
No log 0.9882 84 0.6162 0.4452 0.6162 0.7850
No log 1.0118 86 0.5271 0.5728 0.5271 0.7260
No log 1.0353 88 0.5166 0.5765 0.5166 0.7188
No log 1.0588 90 0.5818 0.5363 0.5818 0.7627
No log 1.0824 92 0.8393 0.3462 0.8393 0.9161
No log 1.1059 94 0.8838 0.3418 0.8838 0.9401
No log 1.1294 96 0.7011 0.4124 0.7011 0.8373
No log 1.1529 98 0.5104 0.6771 0.5104 0.7144
No log 1.1765 100 0.4987 0.6115 0.4987 0.7062
No log 1.2 102 0.5115 0.5473 0.5115 0.7152
No log 1.2235 104 0.5211 0.4795 0.5211 0.7219
No log 1.2471 106 0.5220 0.4224 0.5220 0.7225
No log 1.2706 108 0.5239 0.5170 0.5239 0.7238
No log 1.2941 110 0.5495 0.5639 0.5495 0.7413
No log 1.3176 112 0.5520 0.5845 0.5520 0.7430
No log 1.3412 114 0.5555 0.5974 0.5555 0.7453
No log 1.3647 116 0.5810 0.6298 0.5810 0.7622
No log 1.3882 118 0.6579 0.4744 0.6579 0.8111
No log 1.4118 120 0.6446 0.4501 0.6446 0.8029
No log 1.4353 122 0.5855 0.6307 0.5855 0.7652
No log 1.4588 124 0.8142 0.4946 0.8142 0.9023
No log 1.4824 126 0.9806 0.5263 0.9806 0.9902
No log 1.5059 128 0.7851 0.4548 0.7851 0.8860
No log 1.5294 130 0.5460 0.5758 0.5460 0.7389
No log 1.5529 132 0.4763 0.5739 0.4763 0.6901
No log 1.5765 134 0.5408 0.5636 0.5408 0.7354
No log 1.6 136 0.6041 0.4851 0.6041 0.7772
No log 1.6235 138 0.5448 0.5014 0.5448 0.7381
No log 1.6471 140 0.4954 0.5472 0.4954 0.7038
No log 1.6706 142 0.5279 0.5141 0.5279 0.7265
No log 1.6941 144 0.5146 0.4526 0.5146 0.7174
No log 1.7176 146 0.4866 0.5405 0.4866 0.6976
No log 1.7412 148 0.4649 0.6530 0.4649 0.6818
No log 1.7647 150 0.4406 0.5801 0.4406 0.6638
No log 1.7882 152 0.4872 0.5736 0.4872 0.6980
No log 1.8118 154 0.5907 0.5709 0.5907 0.7686
No log 1.8353 156 0.5705 0.5632 0.5705 0.7553
No log 1.8588 158 0.4774 0.6633 0.4774 0.6909
No log 1.8824 160 0.4871 0.6516 0.4871 0.6979
No log 1.9059 162 0.5612 0.5752 0.5612 0.7492
No log 1.9294 164 0.6264 0.5382 0.6264 0.7915
No log 1.9529 166 0.4990 0.5642 0.4990 0.7064
No log 1.9765 168 0.4460 0.5587 0.4460 0.6678
No log 2.0 170 0.4642 0.5867 0.4642 0.6813
No log 2.0235 172 0.4684 0.6087 0.4684 0.6844
No log 2.0471 174 0.4678 0.6007 0.4678 0.6839
No log 2.0706 176 0.5042 0.6036 0.5042 0.7101
No log 2.0941 178 0.4885 0.6356 0.4885 0.6989
No log 2.1176 180 0.5230 0.5649 0.5230 0.7232
No log 2.1412 182 0.4876 0.6356 0.4876 0.6983
No log 2.1647 184 0.4684 0.6609 0.4684 0.6844
No log 2.1882 186 0.6243 0.5024 0.6243 0.7901
No log 2.2118 188 0.7646 0.4496 0.7646 0.8744
No log 2.2353 190 0.6308 0.4949 0.6308 0.7942
No log 2.2588 192 0.5838 0.5431 0.5838 0.7640
No log 2.2824 194 0.4896 0.5979 0.4896 0.6997
No log 2.3059 196 0.5074 0.6247 0.5074 0.7124
No log 2.3294 198 0.4879 0.5719 0.4879 0.6985
No log 2.3529 200 0.4880 0.5485 0.4880 0.6986
No log 2.3765 202 0.5224 0.5252 0.5224 0.7228
No log 2.4 204 0.5093 0.5209 0.5093 0.7137
No log 2.4235 206 0.5136 0.5111 0.5136 0.7166
No log 2.4471 208 0.4905 0.5209 0.4905 0.7003
No log 2.4706 210 0.4898 0.5836 0.4898 0.6998
No log 2.4941 212 0.4708 0.5868 0.4708 0.6861
No log 2.5176 214 0.4693 0.5868 0.4693 0.6851
No log 2.5412 216 0.4723 0.5567 0.4723 0.6872
No log 2.5647 218 0.4802 0.5647 0.4802 0.6930
No log 2.5882 220 0.5354 0.5697 0.5354 0.7317
No log 2.6118 222 0.6115 0.5562 0.6115 0.7820
No log 2.6353 224 0.6825 0.4597 0.6825 0.8261
No log 2.6588 226 0.6837 0.4180 0.6837 0.8268
No log 2.6824 228 0.6192 0.5107 0.6192 0.7869
No log 2.7059 230 0.5259 0.5086 0.5259 0.7252
No log 2.7294 232 0.5122 0.5177 0.5122 0.7157
No log 2.7529 234 0.5710 0.5523 0.5710 0.7557
No log 2.7765 236 0.6487 0.5065 0.6487 0.8054
No log 2.8 238 0.5694 0.4964 0.5694 0.7546
No log 2.8235 240 0.5337 0.4684 0.5337 0.7305
No log 2.8471 242 0.5259 0.4902 0.5259 0.7252
No log 2.8706 244 0.5388 0.4968 0.5388 0.7341
No log 2.8941 246 0.5181 0.5182 0.5181 0.7198
No log 2.9176 248 0.5752 0.4898 0.5752 0.7584
No log 2.9412 250 0.6390 0.5085 0.6390 0.7994
No log 2.9647 252 0.6183 0.4582 0.6183 0.7863
No log 2.9882 254 0.5561 0.4243 0.5561 0.7457
No log 3.0118 256 0.5386 0.4879 0.5386 0.7339
No log 3.0353 258 0.5237 0.4238 0.5237 0.7237
No log 3.0588 260 0.5206 0.4001 0.5206 0.7215
No log 3.0824 262 0.5238 0.5111 0.5238 0.7237
No log 3.1059 264 0.5325 0.5063 0.5325 0.7297
No log 3.1294 266 0.5102 0.5209 0.5102 0.7143
No log 3.1529 268 0.4728 0.5836 0.4728 0.6876
No log 3.1765 270 0.4695 0.5798 0.4695 0.6852
No log 3.2 272 0.4814 0.5918 0.4814 0.6938
No log 3.2235 274 0.4967 0.5895 0.4967 0.7048
No log 3.2471 276 0.5159 0.6182 0.5159 0.7183
No log 3.2706 278 0.5582 0.5315 0.5582 0.7471
No log 3.2941 280 0.5851 0.5331 0.5851 0.7649
No log 3.3176 282 0.5161 0.5265 0.5161 0.7184
No log 3.3412 284 0.5046 0.4768 0.5046 0.7103
No log 3.3647 286 0.5098 0.4515 0.5098 0.7140
No log 3.3882 288 0.5315 0.4211 0.5315 0.7290
No log 3.4118 290 0.5837 0.5208 0.5837 0.7640
No log 3.4353 292 0.5459 0.4429 0.5459 0.7389
No log 3.4588 294 0.5246 0.5326 0.5246 0.7243
No log 3.4824 296 0.5401 0.5214 0.5401 0.7349
No log 3.5059 298 0.5285 0.5214 0.5285 0.7269
No log 3.5294 300 0.5192 0.6060 0.5192 0.7206
No log 3.5529 302 0.5332 0.5634 0.5332 0.7302
No log 3.5765 304 0.5467 0.6265 0.5467 0.7394
No log 3.6 306 0.6337 0.5336 0.6337 0.7961
No log 3.6235 308 0.6898 0.4775 0.6898 0.8306
No log 3.6471 310 0.6928 0.4222 0.6928 0.8324
No log 3.6706 312 0.7088 0.3645 0.7088 0.8419
No log 3.6941 314 0.5874 0.5177 0.5874 0.7664
No log 3.7176 316 0.5387 0.5749 0.5387 0.7340
No log 3.7412 318 0.5351 0.5749 0.5351 0.7315
No log 3.7647 320 0.5728 0.5254 0.5728 0.7568
No log 3.7882 322 0.5853 0.5528 0.5853 0.7650
No log 3.8118 324 0.5821 0.5745 0.5821 0.7629
No log 3.8353 326 0.5679 0.5956 0.5679 0.7536
No log 3.8588 328 0.5611 0.5815 0.5611 0.7491
No log 3.8824 330 0.5566 0.4964 0.5566 0.7460
No log 3.9059 332 0.6090 0.5275 0.6090 0.7804
No log 3.9294 334 0.7144 0.4304 0.7144 0.8452
No log 3.9529 336 0.6646 0.4199 0.6646 0.8152
No log 3.9765 338 0.5672 0.5720 0.5672 0.7531
No log 4.0 340 0.5063 0.5467 0.5063 0.7115
No log 4.0235 342 0.5031 0.5467 0.5031 0.7093
No log 4.0471 344 0.5317 0.5448 0.5317 0.7292
No log 4.0706 346 0.5564 0.5622 0.5564 0.7459
No log 4.0941 348 0.5259 0.5752 0.5259 0.7252
No log 4.1176 350 0.5242 0.5549 0.5242 0.7240
No log 4.1412 352 0.5352 0.5912 0.5352 0.7316
No log 4.1647 354 0.5766 0.5712 0.5766 0.7593
No log 4.1882 356 0.5460 0.6067 0.5460 0.7389
No log 4.2118 358 0.5251 0.5361 0.5251 0.7247
No log 4.2353 360 0.5587 0.5117 0.5587 0.7475
No log 4.2588 362 0.5206 0.5208 0.5206 0.7215
No log 4.2824 364 0.5184 0.5485 0.5184 0.7200
No log 4.3059 366 0.6378 0.5042 0.6378 0.7986
No log 4.3294 368 0.7327 0.4183 0.7327 0.8560
No log 4.3529 370 0.6830 0.4805 0.6830 0.8265
No log 4.3765 372 0.5860 0.5639 0.5860 0.7655
No log 4.4 374 0.5102 0.5819 0.5102 0.7143
No log 4.4235 376 0.5119 0.4990 0.5119 0.7155
No log 4.4471 378 0.5166 0.5171 0.5166 0.7188
No log 4.4706 380 0.4972 0.5782 0.4972 0.7051
No log 4.4941 382 0.5234 0.5752 0.5234 0.7234
No log 4.5176 384 0.6379 0.4893 0.6379 0.7987
No log 4.5412 386 0.7825 0.4183 0.7825 0.8846
No log 4.5647 388 0.7633 0.4039 0.7633 0.8737
No log 4.5882 390 0.6380 0.4887 0.6380 0.7987
No log 4.6118 392 0.4982 0.5715 0.4982 0.7059
No log 4.6353 394 0.4741 0.6017 0.4741 0.6885
No log 4.6588 396 0.5123 0.6223 0.5123 0.7158
No log 4.6824 398 0.5390 0.5855 0.5390 0.7342
No log 4.7059 400 0.5452 0.5657 0.5452 0.7384
No log 4.7294 402 0.5591 0.5584 0.5591 0.7477
No log 4.7529 404 0.5083 0.5714 0.5083 0.7130
No log 4.7765 406 0.5091 0.5715 0.5091 0.7135
No log 4.8 408 0.5761 0.4684 0.5761 0.7590
No log 4.8235 410 0.6061 0.3843 0.6061 0.7785
No log 4.8471 412 0.5805 0.4502 0.5805 0.7619
No log 4.8706 414 0.5400 0.4867 0.5400 0.7348
No log 4.8941 416 0.5155 0.5131 0.5155 0.7180
No log 4.9176 418 0.4995 0.5386 0.4995 0.7068
No log 4.9412 420 0.5196 0.5655 0.5196 0.7208
No log 4.9647 422 0.5141 0.5655 0.5141 0.7170
No log 4.9882 424 0.4754 0.6024 0.4754 0.6895
No log 5.0118 426 0.4892 0.6884 0.4892 0.6995
No log 5.0353 428 0.4681 0.6579 0.4681 0.6842
No log 5.0588 430 0.4605 0.6293 0.4605 0.6786
No log 5.0824 432 0.4802 0.6293 0.4802 0.6930
No log 5.1059 434 0.5500 0.5237 0.5500 0.7416
No log 5.1294 436 0.5966 0.4759 0.5966 0.7724
No log 5.1529 438 0.6323 0.4385 0.6323 0.7952
No log 5.1765 440 0.6979 0.4468 0.6979 0.8354
No log 5.2 442 0.7914 0.3747 0.7914 0.8896
No log 5.2235 444 0.7628 0.4070 0.7628 0.8734
No log 5.2471 446 0.6590 0.4468 0.6590 0.8118
No log 5.2706 448 0.6571 0.4812 0.6571 0.8106
No log 5.2941 450 0.6548 0.4684 0.6548 0.8092
No log 5.3176 452 0.6207 0.4755 0.6207 0.7878
No log 5.3412 454 0.5366 0.6067 0.5366 0.7325
No log 5.3647 456 0.4848 0.5955 0.4848 0.6963
No log 5.3882 458 0.4792 0.6186 0.4792 0.6923
No log 5.4118 460 0.4903 0.6289 0.4903 0.7002
No log 5.4353 462 0.4965 0.5852 0.4965 0.7046
No log 5.4588 464 0.5235 0.4855 0.5235 0.7235
No log 5.4824 466 0.5316 0.5271 0.5316 0.7291
No log 5.5059 468 0.5177 0.5081 0.5177 0.7195
No log 5.5294 470 0.5114 0.5267 0.5114 0.7151
No log 5.5529 472 0.4996 0.5189 0.4996 0.7068
No log 5.5765 474 0.5012 0.6553 0.5012 0.7080
No log 5.6 476 0.5176 0.6914 0.5176 0.7195
No log 5.6235 478 0.5375 0.6613 0.5375 0.7332
No log 5.6471 480 0.5767 0.4987 0.5767 0.7594
No log 5.6706 482 0.5751 0.4309 0.5751 0.7584
No log 5.6941 484 0.5045 0.6047 0.5045 0.7103
No log 5.7176 486 0.4705 0.6530 0.4705 0.6859
No log 5.7412 488 0.6057 0.4315 0.6057 0.7783
No log 5.7647 490 0.7133 0.4114 0.7133 0.8446
No log 5.7882 492 0.6644 0.4315 0.6644 0.8151
No log 5.8118 494 0.5481 0.5497 0.5481 0.7403
No log 5.8353 496 0.4822 0.5868 0.4822 0.6944
No log 5.8588 498 0.4867 0.5475 0.4867 0.6977
0.3238 5.8824 500 0.5029 0.5356 0.5029 0.7092
0.3238 5.9059 502 0.4944 0.6154 0.4944 0.7031
0.3238 5.9294 504 0.4950 0.6052 0.4950 0.7036
0.3238 5.9529 506 0.5009 0.5845 0.5009 0.7078
0.3238 5.9765 508 0.5150 0.6025 0.5150 0.7176
0.3238 6.0 510 0.5634 0.5568 0.5634 0.7506
0.3238 6.0235 512 0.5590 0.5166 0.5590 0.7477
0.3238 6.0471 514 0.5385 0.6067 0.5385 0.7338
0.3238 6.0706 516 0.5477 0.5560 0.5477 0.7401
0.3238 6.0941 518 0.5614 0.5528 0.5614 0.7493
0.3238 6.1176 520 0.5427 0.5957 0.5427 0.7367
0.3238 6.1412 522 0.4929 0.6348 0.4929 0.7020
0.3238 6.1647 524 0.5218 0.5933 0.5218 0.7224
0.3238 6.1882 526 0.5469 0.5161 0.5469 0.7396
0.3238 6.2118 528 0.5271 0.5933 0.5271 0.7260
0.3238 6.2353 530 0.4855 0.5649 0.4855 0.6968
0.3238 6.2588 532 0.4671 0.6335 0.4671 0.6835
0.3238 6.2824 534 0.4649 0.6335 0.4649 0.6818
0.3238 6.3059 536 0.4602 0.6200 0.4602 0.6784
0.3238 6.3294 538 0.4660 0.5995 0.4660 0.6826
0.3238 6.3529 540 0.4999 0.6004 0.4999 0.7070
0.3238 6.3765 542 0.5262 0.6214 0.5262 0.7254
0.3238 6.4 544 0.4988 0.6025 0.4988 0.7062
0.3238 6.4235 546 0.4823 0.6214 0.4823 0.6945
0.3238 6.4471 548 0.4839 0.6251 0.4839 0.6956
0.3238 6.4706 550 0.4804 0.6317 0.4804 0.6931
0.3238 6.4941 552 0.4842 0.6757 0.4842 0.6958
0.3238 6.5176 554 0.5038 0.6295 0.5038 0.7098
0.3238 6.5412 556 0.4723 0.6757 0.4723 0.6873
0.3238 6.5647 558 0.5080 0.5980 0.5080 0.7127
0.3238 6.5882 560 0.6012 0.5308 0.6012 0.7754
0.3238 6.6118 562 0.7049 0.4684 0.7049 0.8396
0.3238 6.6353 564 0.6928 0.4614 0.6928 0.8324
0.3238 6.6588 566 0.6464 0.4819 0.6464 0.8040
0.3238 6.6824 568 0.5717 0.4979 0.5717 0.7561
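The final logged evaluation (loss 0.5717, Qwk 0.4979) matches the summary at the top of the card, but it is not the best seen during training: the table's peak Qwk is 0.6914 at epoch 5.6. Picking the strongest evaluation from a log history is a one-liner; here is a sketch over a few (epoch, step, qwk) rows transcribed from the table above:

```python
# A few (epoch, step, qwk) rows transcribed from the training table.
history = [
    (1.1529, 98, 0.6771),
    (5.0118, 426, 0.6884),
    (5.6, 476, 0.6914),
    (6.6824, 568, 0.4979),  # final evaluation
]

# Select the evaluation with the highest QWK.
best = max(history, key=lambda row: row[2])
print(best)
```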

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.