ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5684
  • Qwk: 0.4315
  • Mse: 0.5684
  • Rmse: 0.7539
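Note that Loss and Mse are identical (the model is evaluated with an MSE objective) and Rmse is simply the square root of Mse (√0.5684 ≈ 0.7539). Qwk is Cohen's quadratically weighted kappa, the standard agreement metric for ordinal essay scores. A minimal, dependency-free sketch of both metrics (assuming integer labels 0..num_classes-1):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..num_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Marginal histograms for the expected (chance) matrix
    hist_t = Counter(y_true)
    hist_p = Counter(y_pred)
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2  # quadratic penalty
            expected = hist_t[i] * hist_p[j] / n
            num += w * obs[i][j]
            den += w * expected
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length label sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

`sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights="quadratic")` computes the same quantity if scikit-learn is available.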

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
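With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-05 to zero over the full training run (the Trainer's default linear schedule; no warmup steps are listed in the hyperparameters, so none are assumed). A minimal sketch of that schedule:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to zero.

    warmup_steps=0 matches this card's hyperparameters, which list no warmup.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, halfway through training the rate has dropped to half the base value (1e-05), and it reaches exactly zero on the final step.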

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0312 2 2.6424 -0.0568 2.6424 1.6255
No log 0.0625 4 1.3542 0.1256 1.3542 1.1637
No log 0.0938 6 1.1591 -0.1866 1.1591 1.0766
No log 0.125 8 0.9944 0.0391 0.9944 0.9972
No log 0.1562 10 0.7688 0.1800 0.7688 0.8768
No log 0.1875 12 0.7130 0.1264 0.7130 0.8444
No log 0.2188 14 0.7374 0.4087 0.7374 0.8587
No log 0.25 16 0.6626 0.3141 0.6626 0.8140
No log 0.2812 18 0.6627 0.2498 0.6627 0.8141
No log 0.3125 20 0.6169 0.3416 0.6169 0.7854
No log 0.3438 22 0.6857 0.3156 0.6857 0.8281
No log 0.375 24 0.8992 0.2728 0.8992 0.9483
No log 0.4062 26 0.8480 0.2585 0.8480 0.9209
No log 0.4375 28 0.7412 0.3388 0.7412 0.8609
No log 0.4688 30 0.7389 0.2926 0.7389 0.8596
No log 0.5 32 0.7438 0.2735 0.7438 0.8624
No log 0.5312 34 0.6956 0.3982 0.6956 0.8340
No log 0.5625 36 0.9589 0.2861 0.9589 0.9792
No log 0.5938 38 1.5211 0.0792 1.5211 1.2333
No log 0.625 40 1.5988 0.0796 1.5988 1.2644
No log 0.6562 42 1.3572 0.1390 1.3572 1.1650
No log 0.6875 44 1.0031 0.3281 1.0031 1.0015
No log 0.7188 46 0.6587 0.4028 0.6587 0.8116
No log 0.75 48 0.6319 0.3563 0.6319 0.7949
No log 0.7812 50 0.6453 0.4198 0.6453 0.8033
No log 0.8125 52 0.7041 0.4139 0.7041 0.8391
No log 0.8438 54 0.7486 0.3829 0.7486 0.8652
No log 0.875 56 0.7280 0.3802 0.7280 0.8533
No log 0.9062 58 0.6455 0.2748 0.6455 0.8034
No log 0.9375 60 0.6447 0.3416 0.6447 0.8029
No log 0.9688 62 0.6630 0.3718 0.6630 0.8142
No log 1.0 64 0.7733 0.4028 0.7733 0.8794
No log 1.0312 66 0.9197 0.2384 0.9197 0.9590
No log 1.0625 68 0.7941 0.3706 0.7941 0.8911
No log 1.0938 70 0.6817 0.3886 0.6817 0.8256
No log 1.125 72 0.6558 0.4555 0.6558 0.8098
No log 1.1562 74 0.6623 0.4262 0.6623 0.8138
No log 1.1875 76 0.8906 0.2995 0.8906 0.9437
No log 1.2188 78 0.9016 0.3251 0.9016 0.9495
No log 1.25 80 0.9195 0.3251 0.9195 0.9589
No log 1.2812 82 0.7279 0.2984 0.7279 0.8532
No log 1.3125 84 0.6574 0.3050 0.6574 0.8108
No log 1.3438 86 0.7176 0.4014 0.7176 0.8471
No log 1.375 88 0.6914 0.3127 0.6914 0.8315
No log 1.4062 90 0.6867 0.3196 0.6867 0.8287
No log 1.4375 92 0.7269 0.4569 0.7269 0.8526
No log 1.4688 94 0.8776 0.3346 0.8776 0.9368
No log 1.5 96 0.9711 0.2882 0.9711 0.9854
No log 1.5312 98 0.8441 0.3849 0.8441 0.9188
No log 1.5625 100 0.8048 0.3849 0.8048 0.8971
No log 1.5938 102 0.8609 0.3807 0.8609 0.9279
No log 1.625 104 0.9730 0.3059 0.9730 0.9864
No log 1.6562 106 1.0882 0.1891 1.0882 1.0432
No log 1.6875 108 1.0293 0.2616 1.0293 1.0145
No log 1.7188 110 0.8061 0.4735 0.8061 0.8978
No log 1.75 112 0.6312 0.4575 0.6312 0.7945
No log 1.7812 114 0.5874 0.4991 0.5874 0.7664
No log 1.8125 116 0.5915 0.5141 0.5915 0.7691
No log 1.8438 118 0.5886 0.5195 0.5886 0.7672
No log 1.875 120 0.6337 0.4770 0.6337 0.7961
No log 1.9062 122 0.6252 0.4901 0.6252 0.7907
No log 1.9375 124 0.6419 0.5498 0.6419 0.8012
No log 1.9688 126 0.6564 0.5445 0.6564 0.8102
No log 2.0 128 0.5969 0.5352 0.5969 0.7726
No log 2.0312 130 0.6468 0.5030 0.6468 0.8043
No log 2.0625 132 0.6396 0.5030 0.6396 0.7997
No log 2.0938 134 0.5809 0.5718 0.5809 0.7622
No log 2.125 136 0.6181 0.5031 0.6181 0.7862
No log 2.1562 138 0.6108 0.5095 0.6108 0.7816
No log 2.1875 140 0.6660 0.4482 0.6660 0.8161
No log 2.2188 142 0.5870 0.5811 0.5870 0.7661
No log 2.25 144 0.5537 0.5617 0.5537 0.7441
No log 2.2812 146 0.5887 0.4979 0.5887 0.7673
No log 2.3125 148 0.6642 0.4634 0.6642 0.8150
No log 2.3438 150 0.8061 0.3152 0.8061 0.8978
No log 2.375 152 0.7979 0.3390 0.7979 0.8933
No log 2.4062 154 0.7941 0.3668 0.7941 0.8911
No log 2.4375 156 0.6967 0.4636 0.6967 0.8347
No log 2.4688 158 0.7005 0.4444 0.7005 0.8369
No log 2.5 160 0.7793 0.3377 0.7793 0.8828
No log 2.5312 162 0.8003 0.3906 0.8003 0.8946
No log 2.5625 164 0.6420 0.5283 0.6420 0.8012
No log 2.5938 166 0.5813 0.4314 0.5813 0.7625
No log 2.625 168 0.6438 0.4023 0.6438 0.8024
No log 2.6562 170 0.5876 0.4086 0.5876 0.7665
No log 2.6875 172 0.6527 0.4351 0.6527 0.8079
No log 2.7188 174 0.7064 0.5101 0.7064 0.8405
No log 2.75 176 0.7442 0.4471 0.7442 0.8627
No log 2.7812 178 0.7272 0.3957 0.7272 0.8528
No log 2.8125 180 0.6023 0.5947 0.6023 0.7761
No log 2.8438 182 0.5681 0.6197 0.5681 0.7538
No log 2.875 184 0.5994 0.5095 0.5994 0.7742
No log 2.9062 186 0.6656 0.4652 0.6656 0.8159
No log 2.9375 188 0.6057 0.5161 0.6057 0.7783
No log 2.9688 190 0.5157 0.5681 0.5157 0.7181
No log 3.0 192 0.5153 0.5687 0.5153 0.7179
No log 3.0312 194 0.5044 0.5784 0.5044 0.7102
No log 3.0625 196 0.4996 0.5687 0.4996 0.7069
No log 3.0938 198 0.5007 0.6170 0.5007 0.7076
No log 3.125 200 0.5059 0.6210 0.5059 0.7113
No log 3.1562 202 0.4983 0.6479 0.4983 0.7059
No log 3.1875 204 0.4976 0.6479 0.4976 0.7054
No log 3.2188 206 0.5130 0.5926 0.5130 0.7163
No log 3.25 208 0.5338 0.6423 0.5338 0.7306
No log 3.2812 210 0.5413 0.6656 0.5413 0.7357
No log 3.3125 212 0.5549 0.6222 0.5549 0.7449
No log 3.3438 214 0.6149 0.4859 0.6149 0.7841
No log 3.375 216 0.6582 0.4362 0.6582 0.8113
No log 3.4062 218 0.7827 0.3527 0.7827 0.8847
No log 3.4375 220 0.7070 0.4379 0.7070 0.8408
No log 3.4688 222 0.5829 0.5352 0.5829 0.7635
No log 3.5 224 0.6054 0.6052 0.6054 0.7781
No log 3.5312 226 0.5812 0.5604 0.5812 0.7624
No log 3.5625 228 0.5595 0.5421 0.5595 0.7480
No log 3.5938 230 0.5578 0.5436 0.5578 0.7469
No log 3.625 232 0.5592 0.5611 0.5592 0.7478
No log 3.6562 234 0.5555 0.5826 0.5555 0.7453
No log 3.6875 236 0.5433 0.5649 0.5433 0.7371
No log 3.7188 238 0.5438 0.6046 0.5438 0.7375
No log 3.75 240 0.5859 0.4885 0.5859 0.7654
No log 3.7812 242 0.6071 0.5360 0.6071 0.7791
No log 3.8125 244 0.5697 0.6184 0.5697 0.7548
No log 3.8438 246 0.6595 0.4558 0.6595 0.8121
No log 3.875 248 0.7293 0.4442 0.7293 0.8540
No log 3.9062 250 0.6873 0.4287 0.6873 0.8291
No log 3.9375 252 0.5470 0.5592 0.5470 0.7396
No log 3.9688 254 0.5381 0.6255 0.5381 0.7336
No log 4.0 256 0.5686 0.5642 0.5686 0.7541
No log 4.0312 258 0.7356 0.4142 0.7356 0.8577
No log 4.0625 260 0.8185 0.4177 0.8185 0.9047
No log 4.0938 262 0.7291 0.3913 0.7291 0.8539
No log 4.125 264 0.5974 0.5078 0.5974 0.7729
No log 4.1562 266 0.5452 0.5042 0.5452 0.7384
No log 4.1875 268 0.5448 0.4788 0.5448 0.7381
No log 4.2188 270 0.5534 0.5339 0.5534 0.7439
No log 4.25 272 0.6820 0.4327 0.6820 0.8259
No log 4.2812 274 0.7938 0.4413 0.7938 0.8909
No log 4.3125 276 0.8993 0.4051 0.8993 0.9483
No log 4.3438 278 0.8062 0.4601 0.8062 0.8979
No log 4.375 280 0.6588 0.4150 0.6588 0.8117
No log 4.4062 282 0.6467 0.4359 0.6467 0.8042
No log 4.4375 284 0.7345 0.4232 0.7345 0.8571
No log 4.4688 286 0.8015 0.4601 0.8015 0.8953
No log 4.5 288 0.7266 0.4232 0.7266 0.8524
No log 4.5312 290 0.6086 0.5483 0.6086 0.7801
No log 4.5625 292 0.5883 0.5184 0.5883 0.7670
No log 4.5938 294 0.6161 0.4997 0.6161 0.7849
No log 4.625 296 0.6556 0.3829 0.6556 0.8097
No log 4.6562 298 0.6476 0.4290 0.6476 0.8048
No log 4.6875 300 0.6064 0.4981 0.6064 0.7787
No log 4.7188 302 0.6150 0.4923 0.6150 0.7842
No log 4.75 304 0.6375 0.4705 0.6375 0.7984
No log 4.7812 306 0.6165 0.4705 0.6165 0.7852
No log 4.8125 308 0.6268 0.4705 0.6268 0.7917
No log 4.8438 310 0.7206 0.4472 0.7206 0.8489
No log 4.875 312 0.7639 0.4620 0.7639 0.8740
No log 4.9062 314 0.6381 0.4393 0.6381 0.7988
No log 4.9375 316 0.5837 0.6198 0.5837 0.7640
No log 4.9688 318 0.6309 0.5528 0.6309 0.7943
No log 5.0 320 0.6021 0.5488 0.6021 0.7760
No log 5.0312 322 0.5669 0.6222 0.5669 0.7529
No log 5.0625 324 0.5664 0.6222 0.5664 0.7526
No log 5.0938 326 0.5531 0.6222 0.5531 0.7437
No log 5.125 328 0.5743 0.5414 0.5743 0.7579
No log 5.1562 330 0.6053 0.5184 0.6053 0.7780
No log 5.1875 332 0.6656 0.4186 0.6656 0.8158
No log 5.2188 334 0.6185 0.5184 0.6185 0.7864
No log 5.25 336 0.6288 0.4914 0.6288 0.7930
No log 5.2812 338 0.6688 0.4584 0.6688 0.8178
No log 5.3125 340 0.7304 0.4204 0.7304 0.8547
No log 5.3438 342 0.6979 0.4768 0.6979 0.8354
No log 5.375 344 0.6159 0.5420 0.6159 0.7848
No log 5.4062 346 0.6207 0.6405 0.6207 0.7878
No log 5.4375 348 0.6479 0.5660 0.6479 0.8049
No log 5.4688 350 0.6835 0.5660 0.6835 0.8268
No log 5.5 352 0.6719 0.5225 0.6719 0.8197
No log 5.5312 354 0.6553 0.4789 0.6553 0.8095
No log 5.5625 356 0.6427 0.5135 0.6427 0.8017
No log 5.5938 358 0.6296 0.5135 0.6296 0.7934
No log 5.625 360 0.6258 0.5302 0.6258 0.7911
No log 5.6562 362 0.6346 0.4945 0.6346 0.7966
No log 5.6875 364 0.6456 0.5301 0.6456 0.8035
No log 5.7188 366 0.6597 0.4890 0.6597 0.8122
No log 5.75 368 0.6837 0.5190 0.6837 0.8268
No log 5.7812 370 0.6881 0.4939 0.6881 0.8295
No log 5.8125 372 0.6763 0.4849 0.6763 0.8224
No log 5.8438 374 0.6562 0.4895 0.6562 0.8101
No log 5.875 376 0.6085 0.5450 0.6085 0.7801
No log 5.9062 378 0.5700 0.5549 0.5700 0.7550
No log 5.9375 380 0.5608 0.5549 0.5608 0.7489
No log 5.9688 382 0.5847 0.5383 0.5847 0.7646
No log 6.0 384 0.6156 0.5810 0.6156 0.7846
No log 6.0312 386 0.6229 0.6222 0.6229 0.7892
No log 6.0625 388 0.6049 0.5853 0.6049 0.7778
No log 6.0938 390 0.5891 0.5853 0.5891 0.7675
No log 6.125 392 0.5804 0.6265 0.5804 0.7618
No log 6.1562 394 0.5604 0.6339 0.5604 0.7486
No log 6.1875 396 0.5485 0.5009 0.5485 0.7406
No log 6.2188 398 0.5606 0.4963 0.5606 0.7487
No log 6.25 400 0.5694 0.5267 0.5694 0.7546
No log 6.2812 402 0.5731 0.4997 0.5731 0.7570
No log 6.3125 404 0.5735 0.4963 0.5735 0.7573
No log 6.3438 406 0.5789 0.4935 0.5789 0.7609
No log 6.375 408 0.5881 0.4997 0.5881 0.7669
No log 6.4062 410 0.6821 0.4013 0.6821 0.8259
No log 6.4375 412 0.7195 0.4013 0.7195 0.8482
No log 6.4688 414 0.6384 0.4672 0.6384 0.7990
No log 6.5 416 0.5845 0.5057 0.5845 0.7646
No log 6.5312 418 0.6383 0.5015 0.6383 0.7989
No log 6.5625 420 0.6615 0.5382 0.6615 0.8133
No log 6.5938 422 0.6364 0.6115 0.6364 0.7977
No log 6.625 424 0.6294 0.5303 0.6294 0.7933
No log 6.6562 426 0.6584 0.4951 0.6584 0.8114
No log 6.6875 428 0.6953 0.4339 0.6953 0.8338
No log 6.7188 430 0.6582 0.5024 0.6582 0.8113
No log 6.75 432 0.6231 0.5989 0.6231 0.7894
No log 6.7812 434 0.5882 0.5926 0.5882 0.7670
No log 6.8125 436 0.5599 0.5797 0.5599 0.7483
No log 6.8438 438 0.5528 0.4697 0.5528 0.7435
No log 6.875 440 0.5499 0.4878 0.5499 0.7415
No log 6.9062 442 0.5633 0.5386 0.5633 0.7505
No log 6.9375 444 0.5891 0.5655 0.5891 0.7675
No log 6.9688 446 0.5737 0.6265 0.5737 0.7574
No log 7.0 448 0.6145 0.5074 0.6145 0.7839
No log 7.0312 450 0.8243 0.4098 0.8243 0.9079
No log 7.0625 452 0.9564 0.2928 0.9564 0.9779
No log 7.0938 454 0.9459 0.3457 0.9459 0.9726
No log 7.125 456 0.8225 0.4426 0.8225 0.9069
No log 7.1562 458 0.7207 0.4618 0.7207 0.8489
No log 7.1875 460 0.5877 0.5461 0.5877 0.7666
No log 7.2188 462 0.5698 0.5822 0.5698 0.7549
No log 7.25 464 0.5397 0.6222 0.5397 0.7346
No log 7.2812 466 0.5156 0.6455 0.5156 0.7180
No log 7.3125 468 0.5103 0.5782 0.5103 0.7143
No log 7.3438 470 0.5239 0.6114 0.5239 0.7238
No log 7.375 472 0.5762 0.5599 0.5762 0.7591
No log 7.4062 474 0.5850 0.5445 0.5850 0.7648
No log 7.4375 476 0.5598 0.6182 0.5598 0.7482
No log 7.4688 478 0.5465 0.6400 0.5465 0.7393
No log 7.5 480 0.5706 0.5621 0.5706 0.7554
No log 7.5312 482 0.5979 0.5621 0.5979 0.7732
No log 7.5625 484 0.5969 0.5891 0.5969 0.7726
No log 7.5938 486 0.5996 0.5288 0.5996 0.7743
No log 7.625 488 0.6082 0.5476 0.6082 0.7799
No log 7.6562 490 0.6014 0.4958 0.6014 0.7755
No log 7.6875 492 0.5823 0.5406 0.5823 0.7631
No log 7.7188 494 0.5888 0.5420 0.5888 0.7673
No log 7.75 496 0.5823 0.5741 0.5823 0.7631
No log 7.7812 498 0.5807 0.6021 0.5807 0.7620
0.3224 7.8125 500 0.5910 0.5660 0.5910 0.7688
0.3224 7.8438 502 0.5680 0.6121 0.5680 0.7536
0.3224 7.875 504 0.5604 0.5614 0.5604 0.7486
0.3224 7.9062 506 0.6001 0.4474 0.6001 0.7747
0.3224 7.9375 508 0.6073 0.4875 0.6073 0.7793
0.3224 7.9688 510 0.5717 0.5184 0.5717 0.7561
0.3224 8.0 512 0.5684 0.4315 0.5684 0.7539
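The Mse/Rmse columns imply the evaluation compares model outputs against integer organization scores. If the head produces a continuous value (a single-output regression setup is a common choice for QWK-scored essay traits, though this card does not confirm the head type), a hypothetical post-processing step for discrete predictions could look like:

```python
def continuous_to_score(value, lo=0, hi=4):
    """Map a continuous model output to a discrete score in [lo, hi].

    The 0-4 range is a placeholder assumption; substitute the actual
    score range used for the task7 "organization" trait.
    """
    return round(min(max(value, lo), hi))
```

Clamping before rounding keeps out-of-range regression outputs from producing invalid scores.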

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

0.1B params (F32, Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k13_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02