ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9024
  • QWK (quadratic weighted kappa): 0.4213
  • MSE: 0.9024
  • RMSE: 0.9499
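Two of the reported metrics are redundant: the loss here is the mean squared error, so MSE repeats it and RMSE is its square root, while QWK is Cohen's kappa with quadratic weights. A minimal pure-Python sketch of both relationships (the score values are hypothetical, for illustration only):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, equivalent to
    sklearn's cohen_kappa_score(..., weights="quadratic")."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Marginal histograms give the expected matrix under independence
    hist_t = Counter(y_true)
    hist_p = Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_t[i] * hist_p[j] / n
            num += w * obs[i][j]
            den += w * expected
    return 1.0 - num / den

# RMSE is just the square root of the reported MSE:
assert abs(math.sqrt(0.9024) - 0.9499) < 1e-4

# QWK on hypothetical integer organization scores:
qwk = quadratic_weighted_kappa([0, 1, 2, 3, 2, 1], [0, 1, 1, 3, 2, 2], 4)
```

Disagreements between distant scores are penalized quadratically, which is why QWK is the standard metric for ordinal essay-scoring tasks like this one.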

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
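The linear scheduler decays the learning rate from its configured value to zero over the total number of training steps, after an optional warmup (zero steps by default in the HF Trainer). A small sketch of that schedule; the step counts below are hypothetical:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule as used by the HF Trainer: ramp up during
    warmup, then decay linearly from base_lr to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Assuming 1000 total optimizer steps (hypothetical):
lr_start = linear_lr(0, 1000)      # the configured 2e-05
lr_mid = linear_lr(500, 1000)      # half the configured rate
lr_end = linear_lr(1000, 1000)     # decays to zero at the end
```

With batch size 8 the number of steps per epoch depends on the (unspecified) dataset size, so the concrete decay rate cannot be derived from this card alone.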

Training results

The log below ends at epoch 5.81 (step 552), whose metrics match the evaluation results reported above, so training evidently stopped well short of the configured 100 epochs. "No log" in the first column means no training loss had been logged yet; the first logged value (0.3127) appears at step 500.
Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0211 2 2.5039 -0.0593 2.5039 1.5824
No log 0.0421 4 1.0942 0.1856 1.0942 1.0461
No log 0.0632 6 0.7171 0.0444 0.7171 0.8468
No log 0.0842 8 0.8750 0.1867 0.8750 0.9354
No log 0.1053 10 0.8115 0.2703 0.8115 0.9008
No log 0.1263 12 0.6432 0.2336 0.6432 0.8020
No log 0.1474 14 0.7057 0.2494 0.7057 0.8400
No log 0.1684 16 0.6328 0.1786 0.6328 0.7955
No log 0.1895 18 0.6263 0.2526 0.6263 0.7914
No log 0.2105 20 0.6324 0.3492 0.6324 0.7952
No log 0.2316 22 0.5549 0.3187 0.5549 0.7449
No log 0.2526 24 0.4852 0.5267 0.4852 0.6966
No log 0.2737 26 0.4829 0.5577 0.4829 0.6949
No log 0.2947 28 0.4681 0.5501 0.4681 0.6842
No log 0.3158 30 0.4473 0.6330 0.4473 0.6688
No log 0.3368 32 0.4496 0.6648 0.4496 0.6705
No log 0.3579 34 0.4609 0.6448 0.4609 0.6789
No log 0.3789 36 0.4809 0.6349 0.4809 0.6935
No log 0.4 38 0.4771 0.6349 0.4771 0.6907
No log 0.4211 40 0.4738 0.6196 0.4738 0.6883
No log 0.4421 42 0.5360 0.5650 0.5360 0.7321
No log 0.4632 44 0.4647 0.5846 0.4647 0.6817
No log 0.4842 46 0.4549 0.5488 0.4549 0.6745
No log 0.5053 48 0.4804 0.5184 0.4804 0.6931
No log 0.5263 50 0.5030 0.5673 0.5030 0.7092
No log 0.5474 52 0.6013 0.5584 0.6013 0.7754
No log 0.5684 54 0.7522 0.5160 0.7522 0.8673
No log 0.5895 56 0.7234 0.5354 0.7234 0.8505
No log 0.6105 58 0.7729 0.5576 0.7729 0.8791
No log 0.6316 60 0.6893 0.5608 0.6893 0.8303
No log 0.6526 62 0.5996 0.5521 0.5996 0.7743
No log 0.6737 64 0.5796 0.5521 0.5796 0.7613
No log 0.6947 66 0.6225 0.5692 0.6225 0.7890
No log 0.7158 68 0.6841 0.5589 0.6841 0.8271
No log 0.7368 70 0.6777 0.5159 0.6777 0.8232
No log 0.7579 72 0.6000 0.6003 0.6000 0.7746
No log 0.7789 74 0.5561 0.5889 0.5561 0.7457
No log 0.8 76 0.5633 0.5784 0.5633 0.7505
No log 0.8211 78 0.5463 0.5840 0.5463 0.7391
No log 0.8421 80 0.5458 0.5744 0.5458 0.7388
No log 0.8632 82 0.6079 0.5278 0.6079 0.7797
No log 0.8842 84 0.7862 0.5757 0.7862 0.8867
No log 0.9053 86 0.7841 0.5526 0.7841 0.8855
No log 0.9263 88 0.6504 0.5632 0.6504 0.8065
No log 0.9474 90 0.5641 0.5117 0.5641 0.7511
No log 0.9684 92 0.5536 0.5149 0.5536 0.7440
No log 0.9895 94 0.6057 0.5042 0.6057 0.7783
No log 1.0105 96 0.7273 0.5322 0.7273 0.8528
No log 1.0316 98 0.9281 0.4456 0.9281 0.9634
No log 1.0526 100 1.0470 0.4297 1.0470 1.0232
No log 1.0737 102 1.0144 0.4966 1.0144 1.0072
No log 1.0947 104 0.8072 0.4288 0.8072 0.8984
No log 1.1158 106 0.5471 0.5591 0.5471 0.7397
No log 1.1368 108 0.5090 0.5379 0.5090 0.7135
No log 1.1579 110 0.5480 0.5042 0.5480 0.7403
No log 1.1789 112 0.6091 0.4369 0.6091 0.7805
No log 1.2 114 0.5837 0.5562 0.5837 0.7640
No log 1.2211 116 0.5166 0.6797 0.5166 0.7188
No log 1.2421 118 0.6081 0.5255 0.6081 0.7798
No log 1.2632 120 0.6127 0.5692 0.6127 0.7827
No log 1.2842 122 0.5917 0.5195 0.5917 0.7692
No log 1.3053 124 0.5468 0.5426 0.5468 0.7394
No log 1.3263 126 0.5668 0.5920 0.5668 0.7529
No log 1.3474 128 0.7194 0.4268 0.7194 0.8482
No log 1.3684 130 0.7812 0.4703 0.7812 0.8838
No log 1.3895 132 0.6732 0.4438 0.6732 0.8205
No log 1.4105 134 0.5920 0.4961 0.5920 0.7694
No log 1.4316 136 0.5428 0.6135 0.5428 0.7367
No log 1.4526 138 0.5440 0.5799 0.5440 0.7376
No log 1.4737 140 0.5887 0.5812 0.5887 0.7673
No log 1.4947 142 0.6512 0.5935 0.6512 0.8069
No log 1.5158 144 0.6504 0.6104 0.6504 0.8065
No log 1.5368 146 0.6439 0.5545 0.6439 0.8024
No log 1.5579 148 0.6111 0.6149 0.6111 0.7817
No log 1.5789 150 0.5937 0.5692 0.5937 0.7705
No log 1.6 152 0.5521 0.5262 0.5521 0.7430
No log 1.6211 154 0.5278 0.5103 0.5278 0.7265
No log 1.6421 156 0.5604 0.5845 0.5604 0.7486
No log 1.6632 158 0.6952 0.4704 0.6952 0.8338
No log 1.6842 160 0.8742 0.4267 0.8742 0.9350
No log 1.7053 162 0.8607 0.4713 0.8607 0.9278
No log 1.7263 164 0.7574 0.5340 0.7574 0.8703
No log 1.7474 166 0.7011 0.4705 0.7011 0.8373
No log 1.7684 168 0.5775 0.4728 0.5775 0.7599
No log 1.7895 170 0.5100 0.6004 0.5100 0.7141
No log 1.8105 172 0.5005 0.5996 0.5005 0.7075
No log 1.8316 174 0.5291 0.6052 0.5291 0.7274
No log 1.8526 176 0.5790 0.6197 0.5790 0.7609
No log 1.8737 178 0.6124 0.5863 0.6124 0.7826
No log 1.8947 180 0.6354 0.5560 0.6354 0.7971
No log 1.9158 182 0.5914 0.5546 0.5914 0.7690
No log 1.9368 184 0.5475 0.5156 0.5475 0.7399
No log 1.9579 186 0.5245 0.5195 0.5245 0.7242
No log 1.9789 188 0.5189 0.6052 0.5189 0.7203
No log 2.0 190 0.5453 0.6028 0.5453 0.7384
No log 2.0211 192 0.6117 0.5334 0.6117 0.7821
No log 2.0421 194 0.7300 0.5146 0.7300 0.8544
No log 2.0632 196 0.6740 0.5555 0.6740 0.8210
No log 2.0842 198 0.5656 0.6653 0.5656 0.7521
No log 2.1053 200 0.5405 0.6006 0.5405 0.7352
No log 2.1263 202 0.5182 0.5853 0.5182 0.7198
No log 2.1474 204 0.6585 0.5295 0.6585 0.8115
No log 2.1684 206 0.8847 0.3945 0.8847 0.9406
No log 2.1895 208 0.8924 0.2977 0.8924 0.9447
No log 2.2105 210 0.8114 0.3137 0.8114 0.9008
No log 2.2316 212 0.6558 0.4349 0.6558 0.8098
No log 2.2526 214 0.6528 0.5443 0.6528 0.8079
No log 2.2737 216 0.8157 0.5126 0.8157 0.9032
No log 2.2947 218 0.8733 0.5126 0.8733 0.9345
No log 2.3158 220 0.7559 0.5446 0.7559 0.8694
No log 2.3368 222 0.6838 0.5416 0.6838 0.8269
No log 2.3579 224 0.6330 0.5042 0.6330 0.7956
No log 2.3789 226 0.5971 0.5086 0.5971 0.7727
No log 2.4 228 0.6095 0.4835 0.6095 0.7807
No log 2.4211 230 0.6604 0.4424 0.6604 0.8126
No log 2.4421 232 0.6840 0.4587 0.6840 0.8270
No log 2.4632 234 0.7361 0.4450 0.7361 0.8579
No log 2.4842 236 0.6888 0.5163 0.6888 0.8299
No log 2.5053 238 0.6811 0.4877 0.6811 0.8253
No log 2.5263 240 0.7307 0.4992 0.7307 0.8548
No log 2.5474 242 0.6436 0.4665 0.6436 0.8022
No log 2.5684 244 0.5723 0.5140 0.5723 0.7565
No log 2.5895 246 0.6145 0.5326 0.6145 0.7839
No log 2.6105 248 0.6905 0.4738 0.6905 0.8310
No log 2.6316 250 0.7957 0.4409 0.7957 0.8920
No log 2.6526 252 0.7441 0.4296 0.7441 0.8626
No log 2.6737 254 0.7819 0.4133 0.7819 0.8843
No log 2.6947 256 0.8500 0.4186 0.8500 0.9220
No log 2.7158 258 0.7822 0.4096 0.7822 0.8844
No log 2.7368 260 0.7412 0.4580 0.7412 0.8609
No log 2.7579 262 0.8842 0.4376 0.8842 0.9403
No log 2.7789 264 0.8263 0.4604 0.8263 0.9090
No log 2.8 266 0.7685 0.4536 0.7685 0.8767
No log 2.8211 268 0.7210 0.4400 0.7210 0.8491
No log 2.8421 270 0.6655 0.4916 0.6655 0.8158
No log 2.8632 272 0.7550 0.4462 0.7550 0.8689
No log 2.8842 274 0.9170 0.4118 0.9170 0.9576
No log 2.9053 276 0.8120 0.4378 0.8120 0.9011
No log 2.9263 278 0.6768 0.5243 0.6768 0.8227
No log 2.9474 280 0.6640 0.5455 0.6640 0.8149
No log 2.9684 282 0.6908 0.4438 0.6908 0.8311
No log 2.9895 284 0.7249 0.3582 0.7249 0.8514
No log 3.0105 286 0.8802 0.3499 0.8802 0.9382
No log 3.0316 288 0.9180 0.3290 0.9180 0.9581
No log 3.0526 290 0.9864 0.4077 0.9864 0.9932
No log 3.0737 292 0.8252 0.3988 0.8252 0.9084
No log 3.0947 294 0.6790 0.3934 0.6790 0.8240
No log 3.1158 296 0.7364 0.4438 0.7364 0.8581
No log 3.1368 298 0.8636 0.3559 0.8636 0.9293
No log 3.1579 300 0.7871 0.3618 0.7871 0.8872
No log 3.1789 302 0.6323 0.4329 0.6323 0.7952
No log 3.2 304 0.6051 0.4430 0.6051 0.7779
No log 3.2211 306 0.6901 0.4592 0.6901 0.8308
No log 3.2421 308 0.8866 0.3928 0.8866 0.9416
No log 3.2632 310 0.9454 0.4021 0.9454 0.9723
No log 3.2842 312 0.7849 0.4250 0.7849 0.8860
No log 3.3053 314 0.7698 0.4396 0.7698 0.8774
No log 3.3263 316 0.7612 0.4396 0.7612 0.8725
No log 3.3474 318 0.8248 0.3868 0.8248 0.9082
No log 3.3684 320 0.8950 0.3697 0.8950 0.9460
No log 3.3895 322 0.7824 0.3868 0.7824 0.8845
No log 3.4105 324 0.6532 0.4014 0.6532 0.8082
No log 3.4316 326 0.6122 0.4479 0.6122 0.7824
No log 3.4526 328 0.6043 0.4491 0.6043 0.7774
No log 3.4737 330 0.7139 0.4228 0.7139 0.8450
No log 3.4947 332 0.8486 0.4149 0.8486 0.9212
No log 3.5158 334 0.8826 0.4091 0.8826 0.9395
No log 3.5368 336 0.7521 0.4364 0.7521 0.8672
No log 3.5579 338 0.6172 0.4123 0.6172 0.7856
No log 3.5789 340 0.5840 0.3894 0.5840 0.7642
No log 3.6 342 0.5994 0.3471 0.5994 0.7742
No log 3.6211 344 0.7126 0.3913 0.7126 0.8442
No log 3.6421 346 1.0080 0.3587 1.0080 1.0040
No log 3.6632 348 1.2029 0.3375 1.2029 1.0968
No log 3.6842 350 1.0850 0.3802 1.0850 1.0416
No log 3.7053 352 0.7979 0.4444 0.7979 0.8933
No log 3.7263 354 0.6039 0.3157 0.6039 0.7771
No log 3.7474 356 0.5297 0.4929 0.5297 0.7278
No log 3.7684 358 0.5307 0.5460 0.5307 0.7285
No log 3.7895 360 0.5688 0.5373 0.5688 0.7542
No log 3.8105 362 0.6340 0.4436 0.6340 0.7963
No log 3.8316 364 0.8068 0.4288 0.8068 0.8982
No log 3.8526 366 0.9895 0.3869 0.9895 0.9947
No log 3.8737 368 0.9366 0.3481 0.9366 0.9678
No log 3.8947 370 0.8875 0.3481 0.8875 0.9421
No log 3.9158 372 0.7824 0.3606 0.7824 0.8845
No log 3.9368 374 0.6491 0.3869 0.6491 0.8057
No log 3.9579 376 0.5338 0.4964 0.5338 0.7306
No log 3.9789 378 0.4938 0.5095 0.4938 0.7027
No log 4.0 380 0.4852 0.5604 0.4852 0.6966
No log 4.0211 382 0.5076 0.6040 0.5076 0.7125
No log 4.0421 384 0.5356 0.5877 0.5356 0.7318
No log 4.0632 386 0.5486 0.5709 0.5486 0.7407
No log 4.0842 388 0.5335 0.6200 0.5335 0.7304
No log 4.1053 390 0.5097 0.6419 0.5097 0.7139
No log 4.1263 392 0.4801 0.6492 0.4801 0.6929
No log 4.1474 394 0.4680 0.6060 0.4680 0.6841
No log 4.1684 396 0.4778 0.5819 0.4778 0.6912
No log 4.1895 398 0.5403 0.4745 0.5403 0.7351
No log 4.2105 400 0.5762 0.4521 0.5762 0.7590
No log 4.2316 402 0.6229 0.5103 0.6229 0.7892
No log 4.2526 404 0.5363 0.6293 0.5363 0.7323
No log 4.2737 406 0.4846 0.5934 0.4846 0.6961
No log 4.2947 408 0.4614 0.5671 0.4614 0.6792
No log 4.3158 410 0.4684 0.5289 0.4684 0.6844
No log 4.3368 412 0.4534 0.5631 0.4534 0.6734
No log 4.3579 414 0.4597 0.5631 0.4597 0.6780
No log 4.3789 416 0.4793 0.6087 0.4793 0.6923
No log 4.4 418 0.4660 0.6087 0.4660 0.6827
No log 4.4211 420 0.4350 0.6010 0.4350 0.6596
No log 4.4421 422 0.4415 0.5860 0.4415 0.6644
No log 4.4632 424 0.4417 0.6228 0.4417 0.6646
No log 4.4842 426 0.4644 0.6087 0.4644 0.6814
No log 4.5053 428 0.5250 0.6109 0.5250 0.7246
No log 4.5263 430 0.5344 0.5586 0.5344 0.7310
No log 4.5474 432 0.5401 0.4979 0.5401 0.7349
No log 4.5684 434 0.5062 0.5597 0.5062 0.7115
No log 4.5895 436 0.5115 0.5036 0.5115 0.7152
No log 4.6105 438 0.5372 0.5639 0.5372 0.7330
No log 4.6316 440 0.6197 0.4795 0.6197 0.7872
No log 4.6526 442 0.6326 0.52 0.6326 0.7954
No log 4.6737 444 0.5431 0.6096 0.5431 0.7370
No log 4.6947 446 0.4845 0.5345 0.4845 0.6960
No log 4.7158 448 0.4512 0.5648 0.4512 0.6718
No log 4.7368 450 0.4333 0.5648 0.4333 0.6583
No log 4.7579 452 0.4276 0.6307 0.4276 0.6539
No log 4.7789 454 0.4539 0.6187 0.4539 0.6737
No log 4.8 456 0.4717 0.6087 0.4717 0.6868
No log 4.8211 458 0.4746 0.6187 0.4746 0.6889
No log 4.8421 460 0.4990 0.5692 0.4990 0.7064
No log 4.8632 462 0.4903 0.5692 0.4903 0.7002
No log 4.8842 464 0.4601 0.5980 0.4601 0.6783
No log 4.9053 466 0.4494 0.5897 0.4494 0.6704
No log 4.9263 468 0.4804 0.5617 0.4804 0.6931
No log 4.9474 470 0.5055 0.5639 0.5055 0.7110
No log 4.9684 472 0.5630 0.5293 0.5630 0.7503
No log 4.9895 474 0.5401 0.5510 0.5401 0.7349
No log 5.0105 476 0.5519 0.5471 0.5519 0.7429
No log 5.0316 478 0.5332 0.5735 0.5332 0.7302
No log 5.0526 480 0.5166 0.5261 0.5166 0.7188
No log 5.0737 482 0.4571 0.6214 0.4571 0.6761
No log 5.0947 484 0.4499 0.6215 0.4499 0.6707
No log 5.1158 486 0.4558 0.6215 0.4558 0.6751
No log 5.1368 488 0.4551 0.6833 0.4551 0.6746
No log 5.1579 490 0.4513 0.5980 0.4513 0.6718
No log 5.1789 492 0.4749 0.5411 0.4749 0.6891
No log 5.2 494 0.5059 0.5368 0.5059 0.7112
No log 5.2211 496 0.4926 0.5957 0.4926 0.7018
No log 5.2421 498 0.4413 0.6339 0.4413 0.6643
0.3127 5.2632 500 0.4456 0.6087 0.4456 0.6676
0.3127 5.2842 502 0.4528 0.6292 0.4528 0.6729
0.3127 5.3053 504 0.4519 0.6542 0.4519 0.6722
0.3127 5.3263 506 0.4849 0.6430 0.4849 0.6963
0.3127 5.3474 508 0.4938 0.6010 0.4938 0.7027
0.3127 5.3684 510 0.4815 0.6355 0.4815 0.6939
0.3127 5.3895 512 0.4570 0.6458 0.4570 0.6760
0.3127 5.4105 514 0.4560 0.6289 0.4560 0.6753
0.3127 5.4316 516 0.4629 0.6346 0.4629 0.6804
0.3127 5.4526 518 0.5063 0.5524 0.5063 0.7116
0.3127 5.4737 520 0.5293 0.5524 0.5293 0.7276
0.3127 5.4947 522 0.4961 0.5871 0.4961 0.7043
0.3127 5.5158 524 0.4721 0.6423 0.4721 0.6871
0.3127 5.5368 526 0.4660 0.6423 0.4660 0.6826
0.3127 5.5579 528 0.4769 0.6239 0.4769 0.6906
0.3127 5.5789 530 0.5229 0.6248 0.5229 0.7231
0.3127 5.6 532 0.5575 0.5716 0.5575 0.7466
0.3127 5.6211 534 0.5318 0.6325 0.5318 0.7293
0.3127 5.6421 536 0.4878 0.6251 0.4878 0.6984
0.3127 5.6632 538 0.4693 0.5697 0.4693 0.6850
0.3127 5.6842 540 0.4518 0.6298 0.4518 0.6722
0.3127 5.7053 542 0.4469 0.6265 0.4469 0.6685
0.3127 5.7263 544 0.4555 0.5923 0.4555 0.6749
0.3127 5.7474 546 0.4992 0.5817 0.4992 0.7065
0.3127 5.7684 548 0.6304 0.4536 0.6304 0.7940
0.3127 5.7895 550 0.8616 0.4213 0.8616 0.9282
0.3127 5.8105 552 0.9024 0.4213 0.9024 0.9499

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32, safetensors)
