ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5006
  • Qwk: 0.6047
  • Mse: 0.5006
  • Rmse: 0.7075
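
The card ships without a usage example; the sketch below shows one plausible way to load the checkpoint with the transformers library. The repository id is taken from the model name above, and the single-output regression head (num_labels=1) is an assumption inferred from the MSE/RMSE metrics, not something the card confirms.

```python
# Hedged usage sketch: assumes this checkpoint exposes a single-output
# regression head (inferred from the reported MSE/RMSE); verify against
# the repository's config before relying on it.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k2_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```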

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
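
For reference, the hyperparameters listed above map onto a Trainer configuration roughly as sketched below. This is not the original training script; output_dir is a placeholder, and dataset handling is omitted.

```python
# Sketch of a TrainingArguments setup matching the listed hyperparameters
# (Transformers 4.44.2 API). The actual training script is not part of
# this card; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam betas/epsilon as listed in the card (also the library defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```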

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.1818 2 2.6360 -0.0407 2.6360 1.6236
No log 0.3636 4 1.2185 0.0452 1.2185 1.1039
No log 0.5455 6 1.0373 -0.1794 1.0373 1.0185
No log 0.7273 8 0.9791 0.2308 0.9791 0.9895
No log 0.9091 10 0.9686 0.2504 0.9686 0.9842
No log 1.0909 12 0.8944 0.3287 0.8944 0.9457
No log 1.2727 14 0.7143 0.1863 0.7143 0.8452
No log 1.4545 16 0.7191 0.2336 0.7191 0.8480
No log 1.6364 18 0.6933 0.1236 0.6933 0.8327
No log 1.8182 20 0.6791 0.1714 0.6791 0.8241
No log 2.0 22 0.6459 0.1604 0.6459 0.8037
No log 2.1818 24 0.6473 0.2606 0.6473 0.8046
No log 2.3636 26 0.6317 0.1790 0.6317 0.7948
No log 2.5455 28 0.6864 0.3444 0.6864 0.8285
No log 2.7273 30 0.6237 0.1846 0.6237 0.7898
No log 2.9091 32 0.6281 0.0851 0.6281 0.7926
No log 3.0909 34 0.6602 0.1315 0.6602 0.8125
No log 3.2727 36 0.7127 0.2132 0.7127 0.8442
No log 3.4545 38 0.6755 0.1228 0.6755 0.8219
No log 3.6364 40 0.7181 0.3712 0.7181 0.8474
No log 3.8182 42 0.7303 0.4089 0.7303 0.8546
No log 4.0 44 0.6749 0.3577 0.6749 0.8215
No log 4.1818 46 0.7437 0.2821 0.7437 0.8624
No log 4.3636 48 0.6926 0.4427 0.6926 0.8322
No log 4.5455 50 0.6712 0.5214 0.6712 0.8193
No log 4.7273 52 0.6405 0.5307 0.6405 0.8003
No log 4.9091 54 0.6685 0.4868 0.6685 0.8176
No log 5.0909 56 0.6710 0.4652 0.6710 0.8192
No log 5.2727 58 0.5818 0.4586 0.5818 0.7628
No log 5.4545 60 0.5943 0.5353 0.5943 0.7709
No log 5.6364 62 0.5756 0.5257 0.5756 0.7587
No log 5.8182 64 0.5802 0.6047 0.5802 0.7617
No log 6.0 66 0.7106 0.5003 0.7106 0.8430
No log 6.1818 68 0.6030 0.6047 0.6030 0.7765
No log 6.3636 70 0.5621 0.5886 0.5621 0.7497
No log 6.5455 72 0.6281 0.4929 0.6281 0.7925
No log 6.7273 74 0.5759 0.5237 0.5759 0.7589
No log 6.9091 76 0.5383 0.5344 0.5383 0.7337
No log 7.0909 78 0.5468 0.4591 0.5468 0.7395
No log 7.2727 80 0.6714 0.4387 0.6714 0.8194
No log 7.4545 82 0.8497 0.3782 0.8497 0.9218
No log 7.6364 84 0.7389 0.3770 0.7389 0.8596
No log 7.8182 86 0.6319 0.3200 0.6319 0.7949
No log 8.0 88 0.7636 0.3502 0.7636 0.8738
No log 8.1818 90 0.9027 0.2129 0.9027 0.9501
No log 8.3636 92 0.7837 0.3300 0.7837 0.8853
No log 8.5455 94 0.6158 0.5046 0.6158 0.7847
No log 8.7273 96 0.5774 0.5050 0.5774 0.7599
No log 8.9091 98 0.5758 0.4467 0.5758 0.7588
No log 9.0909 100 0.5704 0.5195 0.5704 0.7552
No log 9.2727 102 0.6124 0.4367 0.6124 0.7825
No log 9.4545 104 0.5921 0.5305 0.5921 0.7695
No log 9.6364 106 0.6895 0.4409 0.6895 0.8304
No log 9.8182 108 0.8788 0.3650 0.8788 0.9374
No log 10.0 110 0.8465 0.4096 0.8465 0.9200
No log 10.1818 112 0.7042 0.4880 0.7042 0.8392
No log 10.3636 114 0.5581 0.4287 0.5581 0.7471
No log 10.5455 116 0.7879 0.4413 0.7880 0.8877
No log 10.7273 118 1.0301 0.3735 1.0301 1.0149
No log 10.9091 120 0.8990 0.4321 0.8990 0.9482
No log 11.0909 122 0.6377 0.5266 0.6377 0.7986
No log 11.2727 124 0.5458 0.5678 0.5458 0.7388
No log 11.4545 126 0.5238 0.5357 0.5238 0.7237
No log 11.6364 128 0.5179 0.4820 0.5179 0.7197
No log 11.8182 130 0.5157 0.5252 0.5157 0.7181
No log 12.0 132 0.5475 0.4889 0.5475 0.7399
No log 12.1818 134 0.6415 0.4667 0.6415 0.8009
No log 12.3636 136 0.6550 0.4468 0.6550 0.8093
No log 12.5455 138 0.5593 0.5543 0.5593 0.7479
No log 12.7273 140 0.5480 0.5831 0.5480 0.7403
No log 12.9091 142 0.5848 0.4655 0.5848 0.7647
No log 13.0909 144 0.5451 0.5177 0.5451 0.7383
No log 13.2727 146 0.5634 0.5086 0.5634 0.7506
No log 13.4545 148 0.6057 0.4665 0.6057 0.7783
No log 13.6364 150 0.5549 0.5149 0.5549 0.7449
No log 13.8182 152 0.4969 0.5141 0.4969 0.7049
No log 14.0 154 0.4930 0.5379 0.4930 0.7021
No log 14.1818 156 0.5260 0.5560 0.5260 0.7253
No log 14.3636 158 0.5615 0.5379 0.5615 0.7493
No log 14.5455 160 0.5514 0.5206 0.5514 0.7426
No log 14.7273 162 0.4928 0.6431 0.4928 0.7020
No log 14.9091 164 0.4960 0.6105 0.4960 0.7043
No log 15.0909 166 0.5193 0.6402 0.5193 0.7206
No log 15.2727 168 0.5004 0.5915 0.5004 0.7074
No log 15.4545 170 0.4859 0.6092 0.4859 0.6971
No log 15.6364 172 0.4882 0.5888 0.4882 0.6987
No log 15.8182 174 0.4894 0.5753 0.4894 0.6996
No log 16.0 176 0.4835 0.5753 0.4835 0.6954
No log 16.1818 178 0.4823 0.5846 0.4823 0.6945
No log 16.3636 180 0.4799 0.5609 0.4799 0.6928
No log 16.5455 182 0.5200 0.5560 0.5200 0.7211
No log 16.7273 184 0.5423 0.5770 0.5423 0.7364
No log 16.9091 186 0.4965 0.5951 0.4965 0.7046
No log 17.0909 188 0.5737 0.5078 0.5737 0.7575
No log 17.2727 190 0.5880 0.4868 0.5880 0.7668
No log 17.4545 192 0.5273 0.5340 0.5273 0.7262
No log 17.6364 194 0.5357 0.5816 0.5357 0.7319
No log 17.8182 196 0.5220 0.5816 0.5220 0.7225
No log 18.0 198 0.5261 0.5677 0.5261 0.7253
No log 18.1818 200 0.5244 0.5951 0.5244 0.7242
No log 18.3636 202 0.5330 0.5201 0.5330 0.7301
No log 18.5455 204 0.5743 0.4883 0.5743 0.7578
No log 18.7273 206 0.5525 0.5201 0.5525 0.7433
No log 18.9091 208 0.5443 0.5722 0.5443 0.7378
No log 19.0909 210 0.7823 0.4511 0.7823 0.8845
No log 19.2727 212 1.0550 0.2574 1.0550 1.0271
No log 19.4545 214 0.9781 0.2553 0.9781 0.9890
No log 19.6364 216 0.6901 0.5900 0.6901 0.8307
No log 19.8182 218 0.5646 0.5570 0.5646 0.7514
No log 20.0 220 0.5445 0.6136 0.5445 0.7379
No log 20.1818 222 0.5503 0.6130 0.5503 0.7418
No log 20.3636 224 0.6097 0.5368 0.6097 0.7808
No log 20.5455 226 0.6170 0.5368 0.6170 0.7855
No log 20.7273 228 0.5449 0.5890 0.5449 0.7381
No log 20.9091 230 0.5325 0.5201 0.5325 0.7297
No log 21.0909 232 0.5598 0.4864 0.5598 0.7482
No log 21.2727 234 0.5469 0.4644 0.5469 0.7396
No log 21.4545 236 0.5037 0.5846 0.5037 0.7097
No log 21.6364 238 0.5139 0.6589 0.5139 0.7169
No log 21.8182 240 0.5082 0.5918 0.5082 0.7129
No log 22.0 242 0.4891 0.6269 0.4891 0.6994
No log 22.1818 244 0.4876 0.5846 0.4876 0.6983
No log 22.3636 246 0.4948 0.5985 0.4948 0.7034
No log 22.5455 248 0.4811 0.6114 0.4811 0.6936
No log 22.7273 250 0.4686 0.6114 0.4686 0.6846
No log 22.9091 252 0.4536 0.6339 0.4536 0.6735
No log 23.0909 254 0.4942 0.5485 0.4942 0.7030
No log 23.2727 256 0.5568 0.5712 0.5568 0.7462
No log 23.4545 258 0.5140 0.5601 0.5140 0.7169
No log 23.6364 260 0.4916 0.5528 0.4916 0.7011
No log 23.8182 262 0.4750 0.5411 0.4750 0.6892
No log 24.0 264 0.4568 0.5738 0.4568 0.6759
No log 24.1818 266 0.4569 0.6759 0.4569 0.6760
No log 24.3636 268 0.4649 0.6957 0.4649 0.6819
No log 24.5455 270 0.4649 0.6254 0.4649 0.6818
No log 24.7273 272 0.5394 0.5211 0.5394 0.7345
No log 24.9091 274 0.6205 0.5101 0.6205 0.7877
No log 25.0909 276 0.5808 0.5045 0.5808 0.7621
No log 25.2727 278 0.5127 0.5379 0.5127 0.7160
No log 25.4545 280 0.5111 0.6170 0.5111 0.7149
No log 25.6364 282 0.5252 0.6389 0.5252 0.7247
No log 25.8182 284 0.5152 0.5993 0.5152 0.7178
No log 26.0 286 0.5191 0.5988 0.5191 0.7205
No log 26.1818 288 0.5367 0.5388 0.5367 0.7326
No log 26.3636 290 0.5342 0.5473 0.5342 0.7309
No log 26.5455 292 0.5018 0.6650 0.5018 0.7084
No log 26.7273 294 0.5119 0.5915 0.5119 0.7155
No log 26.9091 296 0.5320 0.6210 0.5320 0.7294
No log 27.0909 298 0.5076 0.6295 0.5076 0.7125
No log 27.2727 300 0.4898 0.5846 0.4898 0.6999
No log 27.4545 302 0.5022 0.4985 0.5022 0.7087
No log 27.6364 304 0.5641 0.4854 0.5641 0.7511
No log 27.8182 306 0.5813 0.5081 0.5813 0.7624
No log 28.0 308 0.5264 0.5034 0.5264 0.7255
No log 28.1818 310 0.4979 0.5053 0.4979 0.7056
No log 28.3636 312 0.4911 0.5071 0.4911 0.7008
No log 28.5455 314 0.4952 0.5517 0.4952 0.7037
No log 28.7273 316 0.4961 0.6254 0.4961 0.7044
No log 28.9091 318 0.5042 0.6761 0.5042 0.7101
No log 29.0909 320 0.5105 0.6389 0.5105 0.7145
No log 29.2727 322 0.4972 0.5993 0.4972 0.7051
No log 29.4545 324 0.5008 0.5289 0.5008 0.7077
No log 29.6364 326 0.5384 0.5015 0.5384 0.7338
No log 29.8182 328 0.5413 0.5254 0.5413 0.7357
No log 30.0 330 0.5204 0.5184 0.5204 0.7214
No log 30.1818 332 0.5001 0.5053 0.5001 0.7072
No log 30.3636 334 0.4962 0.5053 0.4962 0.7044
No log 30.5455 336 0.4946 0.5053 0.4946 0.7033
No log 30.7273 338 0.4978 0.5053 0.4978 0.7055
No log 30.9091 340 0.4833 0.5912 0.4833 0.6952
No log 31.0909 342 0.4858 0.6242 0.4858 0.6970
No log 31.2727 344 0.4883 0.6024 0.4883 0.6988
No log 31.4545 346 0.4826 0.6317 0.4826 0.6947
No log 31.6364 348 0.4861 0.5897 0.4861 0.6972
No log 31.8182 350 0.4912 0.5671 0.4912 0.7009
No log 32.0 352 0.4971 0.5671 0.4971 0.7051
No log 32.1818 354 0.4869 0.5897 0.4869 0.6978
No log 32.3636 356 0.4843 0.5897 0.4843 0.6959
No log 32.5455 358 0.4818 0.5703 0.4818 0.6941
No log 32.7273 360 0.4823 0.6046 0.4823 0.6945
No log 32.9091 362 0.4851 0.6052 0.4851 0.6965
No log 33.0909 364 0.4668 0.6254 0.4668 0.6832
No log 33.2727 366 0.4629 0.6364 0.4629 0.6804
No log 33.4545 368 0.4644 0.6060 0.4644 0.6814
No log 33.6364 370 0.4686 0.5926 0.4686 0.6845
No log 33.8182 372 0.4610 0.6426 0.4610 0.6790
No log 34.0 374 0.4528 0.7227 0.4528 0.6729
No log 34.1818 376 0.4584 0.6503 0.4584 0.6771
No log 34.3636 378 0.4860 0.6620 0.4860 0.6971
No log 34.5455 380 0.5457 0.5650 0.5457 0.7387
No log 34.7273 382 0.5202 0.5650 0.5202 0.7212
No log 34.9091 384 0.4508 0.7052 0.4508 0.6714
No log 35.0909 386 0.4464 0.6730 0.4464 0.6681
No log 35.2727 388 0.4704 0.5872 0.4704 0.6859
No log 35.4545 390 0.4665 0.5872 0.4665 0.6830
No log 35.6364 392 0.4352 0.6750 0.4352 0.6597
No log 35.8182 394 0.4531 0.6960 0.4531 0.6731
No log 36.0 396 0.4783 0.6127 0.4783 0.6916
No log 36.1818 398 0.4679 0.6960 0.4679 0.6840
No log 36.3636 400 0.4460 0.6762 0.4460 0.6678
No log 36.5455 402 0.4476 0.6750 0.4476 0.6690
No log 36.7273 404 0.4735 0.5741 0.4735 0.6881
No log 36.9091 406 0.5095 0.5442 0.5095 0.7138
No log 37.0909 408 0.4855 0.5485 0.4855 0.6968
No log 37.2727 410 0.4613 0.6317 0.4613 0.6792
No log 37.4545 412 0.4620 0.6228 0.4620 0.6797
No log 37.6364 414 0.4703 0.6096 0.4703 0.6858
No log 37.8182 416 0.4796 0.5593 0.4796 0.6925
No log 38.0 418 0.4905 0.5593 0.4905 0.7004
No log 38.1818 420 0.5137 0.4883 0.5137 0.7168
No log 38.3636 422 0.5088 0.5362 0.5088 0.7133
No log 38.5455 424 0.4944 0.5362 0.4944 0.7031
No log 38.7273 426 0.4942 0.5362 0.4942 0.7030
No log 38.9091 428 0.4983 0.5362 0.4983 0.7059
No log 39.0909 430 0.4913 0.5362 0.4913 0.7009
No log 39.2727 432 0.4809 0.5289 0.4809 0.6935
No log 39.4545 434 0.4780 0.6242 0.4780 0.6914
No log 39.6364 436 0.4783 0.6024 0.4783 0.6916
No log 39.8182 438 0.4726 0.5648 0.4726 0.6875
No log 40.0 440 0.4751 0.5455 0.4751 0.6893
No log 40.1818 442 0.4877 0.5362 0.4877 0.6983
No log 40.3636 444 0.4895 0.5272 0.4895 0.6996
No log 40.5455 446 0.4698 0.5289 0.4698 0.6854
No log 40.7273 448 0.4597 0.5800 0.4597 0.6780
No log 40.9091 450 0.4656 0.5800 0.4656 0.6823
No log 41.0909 452 0.4791 0.5784 0.4791 0.6922
No log 41.2727 454 0.4835 0.5687 0.4835 0.6953
No log 41.4545 456 0.4833 0.5687 0.4833 0.6952
No log 41.6364 458 0.4885 0.4819 0.4885 0.6989
No log 41.8182 460 0.4752 0.5050 0.4752 0.6894
No log 42.0 462 0.4484 0.6351 0.4484 0.6697
No log 42.1818 464 0.4588 0.7062 0.4588 0.6774
No log 42.3636 466 0.5113 0.6431 0.5113 0.7151
No log 42.5455 468 0.5402 0.5957 0.5402 0.7350
No log 42.7273 470 0.5168 0.6169 0.5168 0.7189
No log 42.9091 472 0.4757 0.6431 0.4757 0.6897
No log 43.0909 474 0.4523 0.6566 0.4523 0.6725
No log 43.2727 476 0.4600 0.6242 0.4600 0.6782
No log 43.4545 478 0.4682 0.5567 0.4682 0.6842
No log 43.6364 480 0.4773 0.5567 0.4773 0.6909
No log 43.8182 482 0.4896 0.5428 0.4896 0.6997
No log 44.0 484 0.4973 0.5411 0.4973 0.7052
No log 44.1818 486 0.4849 0.5341 0.4849 0.6963
No log 44.3636 488 0.4767 0.5559 0.4767 0.6904
No log 44.5455 490 0.4671 0.6173 0.4671 0.6835
No log 44.7273 492 0.4652 0.6750 0.4652 0.6821
No log 44.9091 494 0.4726 0.6197 0.4726 0.6875
No log 45.0909 496 0.4872 0.6210 0.4872 0.6980
No log 45.2727 498 0.4927 0.6295 0.4927 0.7019
0.2524 45.4545 500 0.4746 0.6295 0.4746 0.6889
0.2524 45.6364 502 0.4617 0.5672 0.4617 0.6795
0.2524 45.8182 504 0.4655 0.5831 0.4655 0.6823
0.2524 46.0 506 0.4791 0.5517 0.4791 0.6922
0.2524 46.1818 508 0.4721 0.5831 0.4721 0.6871
0.2524 46.3636 510 0.4624 0.6439 0.4624 0.6800
0.2524 46.5455 512 0.4668 0.5956 0.4668 0.6832
0.2524 46.7273 514 0.4786 0.5687 0.4786 0.6918
0.2524 46.9091 516 0.4816 0.5687 0.4816 0.6940
0.2524 47.0909 518 0.4682 0.6170 0.4682 0.6843
0.2524 47.2727 520 0.4639 0.6643 0.4639 0.6811
0.2524 47.4545 522 0.4802 0.5404 0.4802 0.6930
0.2524 47.6364 524 0.4981 0.5745 0.4981 0.7058
0.2524 47.8182 526 0.4944 0.5447 0.4944 0.7032
0.2524 48.0 528 0.4742 0.6009 0.4742 0.6886
0.2524 48.1818 530 0.4697 0.6849 0.4697 0.6853
0.2524 48.3636 532 0.5063 0.5956 0.5063 0.7115
0.2524 48.5455 534 0.5577 0.6104 0.5577 0.7468
0.2524 48.7273 536 0.5682 0.5922 0.5682 0.7538
0.2524 48.9091 538 0.5429 0.5706 0.5429 0.7368
0.2524 49.0909 540 0.5006 0.6047 0.5006 0.7075
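
The Qwk, Mse, and Rmse columns above can be reproduced from model predictions roughly as sketched below. Rounding the continuous outputs to integer score levels before computing quadratic weighted kappa is an assumption about the evaluation setup, not something the card states; the arrays are toy data.

```python
# Hedged sketch of the evaluation metrics reported in the table above.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 3, 1])            # gold organization scores (toy data)
y_pred = np.array([2.7, 2.1, 3.6, 3.2, 1.4])  # raw model outputs (toy data)

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# Assumption: regression outputs are rounded to integer levels for QWK.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"mse={mse:.4f} rmse={rmse:.4f} qwk={qwk:.4f}")
```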

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1