ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a hedged sketch of how these metrics are typically computed follows the list):

  • Loss: 0.4551
  • Qwk: 0.5324
  • Mse: 0.4551
  • Rmse: 0.6746
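
The card does not include the evaluation code behind these numbers. Below is a minimal sketch of how Qwk, Mse, and Rmse are commonly computed for this kind of essay-scoring setup, assuming scikit-learn and NumPy and assuming that continuous predictions are rounded to integer scores before the quadratic-weighted kappa is calculated:

    import numpy as np
    from sklearn.metrics import cohen_kappa_score, mean_squared_error

    def eval_metrics(y_true, y_pred):
        # Mse / Rmse on the raw (continuous) predictions.
        mse = mean_squared_error(y_true, y_pred)
        rmse = float(np.sqrt(mse))  # e.g. sqrt(0.4551) ≈ 0.6746
        # Qwk needs discrete labels; rounding to integers is an assumption,
        # since the exact discretisation used here is not stated in the card.
        qwk = cohen_kappa_score(
            np.rint(y_true).astype(int),
            np.rint(y_pred).astype(int),
            weights="quadratic",
        )
        return {"qwk": qwk, "mse": mse, "rmse": rmse}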

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
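
Expressed as Transformers TrainingArguments, these settings would look roughly like the sketch below. The output directory, dataset loading, and metric computation are not documented in the card and are assumed or omitted here:

    from transformers import TrainingArguments

    training_args = TrainingArguments(
        output_dir="outputs",            # assumption: not stated in the card
        learning_rate=2e-5,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        seed=42,
        adam_beta1=0.9,
        adam_beta2=0.999,
        adam_epsilon=1e-8,
        lr_scheduler_type="linear",
        num_train_epochs=100,
    )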

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0222 2 2.4467 -0.0136 2.4467 1.5642
No log 0.0444 4 1.4423 0.1230 1.4423 1.2010
No log 0.0667 6 0.6559 0.1729 0.6559 0.8099
No log 0.0889 8 0.6892 0.3399 0.6892 0.8302
No log 0.1111 10 0.8610 0.3455 0.8610 0.9279
No log 0.1333 12 0.6850 0.4884 0.6850 0.8276
No log 0.1556 14 0.4917 0.5386 0.4917 0.7012
No log 0.1778 16 0.4163 0.6377 0.4163 0.6452
No log 0.2 18 0.4438 0.6181 0.4438 0.6662
No log 0.2222 20 0.7654 0.5309 0.7654 0.8749
No log 0.2444 22 0.7325 0.5309 0.7325 0.8559
No log 0.2667 24 0.4449 0.7317 0.4449 0.6670
No log 0.2889 26 0.4366 0.7606 0.4366 0.6608
No log 0.3111 28 0.6371 0.4815 0.6371 0.7982
No log 0.3333 30 0.8555 0.4969 0.8555 0.9249
No log 0.3556 32 0.6384 0.4937 0.6384 0.7990
No log 0.3778 34 0.3879 0.7032 0.3879 0.6228
No log 0.4 36 0.4004 0.6924 0.4004 0.6328
No log 0.4222 38 0.4167 0.6462 0.4167 0.6455
No log 0.4444 40 0.5291 0.5016 0.5291 0.7274
No log 0.4667 42 0.5077 0.4860 0.5077 0.7125
No log 0.4889 44 0.5378 0.5129 0.5378 0.7334
No log 0.5111 46 0.6631 0.4977 0.6631 0.8143
No log 0.5333 48 0.5395 0.5529 0.5395 0.7345
No log 0.5556 50 0.4447 0.7263 0.4447 0.6668
No log 0.5778 52 0.4697 0.6930 0.4697 0.6854
No log 0.6 54 0.4554 0.5943 0.4554 0.6749
No log 0.6222 56 0.6548 0.5295 0.6548 0.8092
No log 0.6444 58 0.7181 0.4977 0.7181 0.8474
No log 0.6667 60 0.5832 0.5265 0.5832 0.7637
No log 0.6889 62 0.4721 0.6325 0.4721 0.6871
No log 0.7111 64 0.5884 0.6221 0.5884 0.7671
No log 0.7333 66 0.5395 0.6392 0.5395 0.7345
No log 0.7556 68 0.5307 0.6557 0.5307 0.7285
No log 0.7778 70 0.5310 0.6360 0.5310 0.7287
No log 0.8 72 0.5161 0.6482 0.5161 0.7184
No log 0.8222 74 0.4745 0.6557 0.4745 0.6888
No log 0.8444 76 0.4562 0.6247 0.4562 0.6754
No log 0.8667 78 0.4972 0.5636 0.4972 0.7051
No log 0.8889 80 0.4424 0.6503 0.4424 0.6651
No log 0.9111 82 0.4427 0.6038 0.4427 0.6653
No log 0.9333 84 0.5762 0.5195 0.5762 0.7591
No log 0.9556 86 0.5385 0.5403 0.5385 0.7338
No log 0.9778 88 0.4221 0.6452 0.4221 0.6497
No log 1.0 90 0.6263 0.4650 0.6263 0.7914
No log 1.0222 92 0.7027 0.3867 0.7027 0.8383
No log 1.0444 94 0.5759 0.5018 0.5759 0.7589
No log 1.0667 96 0.5071 0.6092 0.5071 0.7121
No log 1.0889 98 0.4915 0.6277 0.4915 0.7010
No log 1.1111 100 0.5221 0.5556 0.5221 0.7226
No log 1.1333 102 0.5536 0.4957 0.5536 0.7440
No log 1.1556 104 0.5379 0.5575 0.5379 0.7334
No log 1.1778 106 0.5279 0.5538 0.5279 0.7265
No log 1.2 108 0.5695 0.5735 0.5695 0.7546
No log 1.2222 110 0.5345 0.6108 0.5345 0.7311
No log 1.2444 112 0.5283 0.6252 0.5283 0.7268
No log 1.2667 114 0.5366 0.6596 0.5366 0.7325
No log 1.2889 116 0.5256 0.6345 0.5256 0.7250
No log 1.3111 118 0.4968 0.6560 0.4968 0.7048
No log 1.3333 120 0.4700 0.6656 0.4700 0.6855
No log 1.3556 122 0.4520 0.7333 0.4520 0.6723
No log 1.3778 124 0.4408 0.6553 0.4408 0.6640
No log 1.4 126 0.4423 0.6806 0.4423 0.6650
No log 1.4222 128 0.4628 0.6694 0.4628 0.6803
No log 1.4444 130 0.4690 0.6609 0.4690 0.6848
No log 1.4667 132 0.4617 0.6284 0.4617 0.6795
No log 1.4889 134 0.5050 0.5748 0.5050 0.7106
No log 1.5111 136 0.4437 0.6639 0.4437 0.6661
No log 1.5333 138 0.4530 0.6020 0.4530 0.6731
No log 1.5556 140 0.5142 0.6442 0.5142 0.7171
No log 1.5778 142 0.5069 0.6709 0.5069 0.7119
No log 1.6 144 0.4707 0.6088 0.4707 0.6861
No log 1.6222 146 0.4852 0.6467 0.4852 0.6966
No log 1.6444 148 0.4644 0.6136 0.4644 0.6815
No log 1.6667 150 0.4729 0.6337 0.4729 0.6877
No log 1.6889 152 0.4727 0.6248 0.4727 0.6875
No log 1.7111 154 0.4213 0.6364 0.4213 0.6491
No log 1.7333 156 0.4129 0.6542 0.4129 0.6426
No log 1.7556 158 0.4516 0.6114 0.4516 0.6720
No log 1.7778 160 0.4835 0.6260 0.4835 0.6953
No log 1.8 162 0.4499 0.6514 0.4499 0.6707
No log 1.8222 164 0.4076 0.6087 0.4076 0.6385
No log 1.8444 166 0.4082 0.6632 0.4082 0.6389
No log 1.8667 168 0.4132 0.6467 0.4132 0.6428
No log 1.8889 170 0.5727 0.5489 0.5727 0.7568
No log 1.9111 172 0.6725 0.5667 0.6725 0.8200
No log 1.9333 174 0.5687 0.6379 0.5687 0.7541
No log 1.9556 176 0.5546 0.6060 0.5546 0.7447
No log 1.9778 178 0.5302 0.6118 0.5302 0.7281
No log 2.0 180 0.4649 0.6561 0.4649 0.6818
No log 2.0222 182 0.4887 0.6182 0.4887 0.6991
No log 2.0444 184 0.4916 0.6017 0.4916 0.7011
No log 2.0667 186 0.4845 0.6896 0.4845 0.6960
No log 2.0889 188 0.4405 0.6492 0.4405 0.6637
No log 2.1111 190 0.4329 0.6371 0.4329 0.6580
No log 2.1333 192 0.4495 0.6492 0.4495 0.6705
No log 2.1556 194 0.4493 0.6641 0.4493 0.6703
No log 2.1778 196 0.4436 0.6723 0.4436 0.6660
No log 2.2 198 0.4338 0.6014 0.4338 0.6586
No log 2.2222 200 0.4541 0.6419 0.4541 0.6739
No log 2.2444 202 0.4724 0.5970 0.4724 0.6873
No log 2.2667 204 0.4250 0.6455 0.4250 0.6519
No log 2.2889 206 0.4612 0.6604 0.4612 0.6791
No log 2.3111 208 0.5083 0.5775 0.5083 0.7129
No log 2.3333 210 0.4657 0.6337 0.4657 0.6824
No log 2.3556 212 0.4204 0.6736 0.4204 0.6484
No log 2.3778 214 0.4421 0.5406 0.4421 0.6649
No log 2.4 216 0.4648 0.5841 0.4648 0.6817
No log 2.4222 218 0.4303 0.6101 0.4303 0.6560
No log 2.4444 220 0.4260 0.6818 0.4260 0.6527
No log 2.4667 222 0.4279 0.6818 0.4279 0.6541
No log 2.4889 224 0.4282 0.5611 0.4282 0.6544
No log 2.5111 226 0.4884 0.5627 0.4884 0.6988
No log 2.5333 228 0.4587 0.5627 0.4587 0.6773
No log 2.5556 230 0.4210 0.6736 0.4210 0.6488
No log 2.5778 232 0.4930 0.6087 0.4930 0.7021
No log 2.6 234 0.4926 0.6394 0.4926 0.7019
No log 2.6222 236 0.4400 0.6908 0.4400 0.6634
No log 2.6444 238 0.4512 0.5826 0.4512 0.6717
No log 2.6667 240 0.6260 0.5934 0.6260 0.7912
No log 2.6889 242 0.7229 0.4859 0.7229 0.8503
No log 2.7111 244 0.6491 0.5850 0.6491 0.8057
No log 2.7333 246 0.5625 0.6046 0.5625 0.7500
No log 2.7556 248 0.5723 0.6305 0.5723 0.7565
No log 2.7778 250 0.5077 0.6529 0.5077 0.7125
No log 2.8 252 0.4439 0.6242 0.4439 0.6663
No log 2.8222 254 0.4474 0.5617 0.4474 0.6688
No log 2.8444 256 0.4623 0.4960 0.4623 0.6799
No log 2.8667 258 0.4439 0.5538 0.4439 0.6663
No log 2.8889 260 0.4278 0.6632 0.4278 0.6541
No log 2.9111 262 0.4398 0.6346 0.4398 0.6631
No log 2.9333 264 0.4487 0.5725 0.4487 0.6698
No log 2.9556 266 0.4510 0.5725 0.4510 0.6716
No log 2.9778 268 0.4506 0.6346 0.4506 0.6713
No log 3.0 270 0.4538 0.6632 0.4538 0.6737
No log 3.0222 272 0.4456 0.5446 0.4456 0.6676
No log 3.0444 274 0.4975 0.5627 0.4975 0.7053
No log 3.0667 276 0.4811 0.5627 0.4811 0.6936
No log 3.0889 278 0.4514 0.6632 0.4514 0.6719
No log 3.1111 280 0.5498 0.5812 0.5498 0.7415
No log 3.1333 282 0.6051 0.5739 0.6051 0.7779
No log 3.1556 284 0.5094 0.6188 0.5094 0.7137
No log 3.1778 286 0.4544 0.5672 0.4544 0.6741
No log 3.2 288 0.5188 0.5230 0.5188 0.7203
No log 3.2222 290 0.5195 0.4997 0.5195 0.7207
No log 3.2444 292 0.4714 0.5321 0.4714 0.6866
No log 3.2667 294 0.4596 0.6010 0.4596 0.6779
No log 3.2889 296 0.4608 0.6115 0.4608 0.6789
No log 3.3111 298 0.4462 0.6060 0.4462 0.6680
No log 3.3333 300 0.4861 0.6047 0.4861 0.6972
No log 3.3556 302 0.5039 0.6074 0.5039 0.7099
No log 3.3778 304 0.4654 0.5826 0.4654 0.6822
No log 3.4 306 0.4430 0.5956 0.4430 0.6656
No log 3.4222 308 0.4782 0.5115 0.4782 0.6915
No log 3.4444 310 0.5110 0.4948 0.5110 0.7148
No log 3.4667 312 0.4867 0.5184 0.4867 0.6977
No log 3.4889 314 0.4864 0.5617 0.4864 0.6974
No log 3.5111 316 0.4673 0.6038 0.4673 0.6836
No log 3.5333 318 0.4260 0.6242 0.4260 0.6527
No log 3.5556 320 0.4538 0.6330 0.4538 0.6736
No log 3.5778 322 0.5088 0.6273 0.5088 0.7133
No log 3.6 324 0.4930 0.6092 0.4930 0.7021
No log 3.6222 326 0.4695 0.6515 0.4695 0.6852
No log 3.6444 328 0.4591 0.6553 0.4591 0.6776
No log 3.6667 330 0.4950 0.6174 0.4950 0.7035
No log 3.6889 332 0.5160 0.5884 0.5160 0.7184
No log 3.7111 334 0.4924 0.6313 0.4924 0.7017
No log 3.7333 336 0.4730 0.5625 0.4730 0.6878
No log 3.7556 338 0.4746 0.5625 0.4746 0.6889
No log 3.7778 340 0.4710 0.6129 0.4710 0.6863
No log 3.8 342 0.4621 0.6620 0.4621 0.6798
No log 3.8222 344 0.4526 0.6142 0.4526 0.6727
No log 3.8444 346 0.4557 0.6530 0.4557 0.6751
No log 3.8667 348 0.4706 0.6145 0.4706 0.6860
No log 3.8889 350 0.4574 0.6073 0.4574 0.6763
No log 3.9111 352 0.4791 0.6431 0.4791 0.6922
No log 3.9333 354 0.4931 0.6061 0.4931 0.7022
No log 3.9556 356 0.4645 0.5248 0.4645 0.6815
No log 3.9778 358 0.4677 0.6820 0.4677 0.6839
No log 4.0 360 0.4917 0.5310 0.4917 0.7012
No log 4.0222 362 0.5019 0.5067 0.5019 0.7085
No log 4.0444 364 0.4871 0.6505 0.4871 0.6979
No log 4.0667 366 0.4626 0.6142 0.4626 0.6802
No log 4.0889 368 0.4611 0.5672 0.4611 0.6790
No log 4.1111 370 0.4715 0.5904 0.4715 0.6867
No log 4.1333 372 0.4709 0.5580 0.4709 0.6862
No log 4.1556 374 0.4746 0.5888 0.4746 0.6889
No log 4.1778 376 0.4696 0.5405 0.4696 0.6853
No log 4.2 378 0.4650 0.5373 0.4650 0.6819
No log 4.2222 380 0.4689 0.6129 0.4689 0.6848
No log 4.2444 382 0.4769 0.6313 0.4769 0.6906
No log 4.2667 384 0.4696 0.6624 0.4696 0.6853
No log 4.2889 386 0.4571 0.6215 0.4571 0.6761
No log 4.3111 388 0.4631 0.5321 0.4631 0.6805
No log 4.3333 390 0.4916 0.5212 0.4916 0.7011
No log 4.3556 392 0.4830 0.5642 0.4830 0.6949
No log 4.3778 394 0.4457 0.5446 0.4457 0.6676
No log 4.4 396 0.4506 0.6908 0.4506 0.6712
No log 4.4222 398 0.4407 0.6908 0.4407 0.6639
No log 4.4444 400 0.4463 0.5611 0.4463 0.6680
No log 4.4667 402 0.4787 0.5794 0.4787 0.6918
No log 4.4889 404 0.4644 0.5569 0.4644 0.6815
No log 4.5111 406 0.4282 0.5304 0.4282 0.6544
No log 4.5333 408 0.4232 0.7208 0.4232 0.6505
No log 4.5556 410 0.4323 0.6702 0.4323 0.6575
No log 4.5778 412 0.4245 0.6830 0.4245 0.6515
No log 4.6 414 0.4553 0.6079 0.4553 0.6748
No log 4.6222 416 0.4970 0.6181 0.4970 0.7050
No log 4.6444 418 0.4802 0.6181 0.4802 0.6930
No log 4.6667 420 0.4684 0.5983 0.4684 0.6844
No log 4.6889 422 0.4305 0.5373 0.4305 0.6561
No log 4.7111 424 0.4231 0.5681 0.4231 0.6505
No log 4.7333 426 0.4286 0.5681 0.4286 0.6547
No log 4.7556 428 0.4340 0.5304 0.4340 0.6588
No log 4.7778 430 0.4340 0.5390 0.4340 0.6588
No log 4.8 432 0.4356 0.5797 0.4356 0.6600
No log 4.8222 434 0.4441 0.6007 0.4441 0.6664
No log 4.8444 436 0.4478 0.6289 0.4478 0.6692
No log 4.8667 438 0.4461 0.6741 0.4461 0.6679
No log 4.8889 440 0.4669 0.6418 0.4669 0.6833
No log 4.9111 442 0.4990 0.6025 0.4990 0.7064
No log 4.9333 444 0.4918 0.6406 0.4918 0.7013
No log 4.9556 446 0.4558 0.6506 0.4558 0.6751
No log 4.9778 448 0.4320 0.6820 0.4320 0.6572
No log 5.0 450 0.4250 0.6724 0.4250 0.6519
No log 5.0222 452 0.4193 0.6724 0.4193 0.6475
No log 5.0444 454 0.4256 0.6724 0.4256 0.6524
No log 5.0667 456 0.4224 0.6228 0.4224 0.6499
No log 5.0889 458 0.4193 0.6142 0.4193 0.6476
No log 5.1111 460 0.4440 0.7170 0.4440 0.6663
No log 5.1333 462 0.4920 0.6687 0.4920 0.7014
No log 5.1556 464 0.4921 0.6775 0.4921 0.7015
No log 5.1778 466 0.4606 0.7170 0.4606 0.6787
No log 5.2 468 0.4371 0.6443 0.4371 0.6612
No log 5.2222 470 0.4316 0.6267 0.4316 0.6570
No log 5.2444 472 0.4239 0.6739 0.4239 0.6510
No log 5.2667 474 0.4349 0.6530 0.4349 0.6595
No log 5.2889 476 0.4681 0.6776 0.4681 0.6842
No log 5.3111 478 0.4574 0.6601 0.4574 0.6763
No log 5.3333 480 0.4246 0.6898 0.4246 0.6516
No log 5.3556 482 0.4028 0.6542 0.4028 0.6347
No log 5.3778 484 0.4060 0.6364 0.4060 0.6372
No log 5.4 486 0.4030 0.6242 0.4030 0.6348
No log 5.4222 488 0.4014 0.6242 0.4014 0.6335
No log 5.4444 490 0.4089 0.7098 0.4089 0.6395
No log 5.4667 492 0.4180 0.7011 0.4180 0.6465
No log 5.4889 494 0.4192 0.5986 0.4192 0.6475
No log 5.5111 496 0.4431 0.5246 0.4431 0.6657
No log 5.5333 498 0.4524 0.5414 0.4524 0.6726
0.2943 5.5556 500 0.4452 0.5765 0.4452 0.6672
0.2943 5.5778 502 0.4639 0.5923 0.4639 0.6811
0.2943 5.6 504 0.4729 0.5945 0.4729 0.6877
0.2943 5.6222 506 0.4622 0.6053 0.4622 0.6798
0.2943 5.6444 508 0.4929 0.5455 0.4929 0.7021
0.2943 5.6667 510 0.5986 0.5363 0.5986 0.7737
0.2943 5.6889 512 0.5934 0.5146 0.5934 0.7703
0.2943 5.7111 514 0.5074 0.5603 0.5074 0.7123
0.2943 5.7333 516 0.4551 0.5324 0.4551 0.6746

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
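
The card includes no usage example. Below is a minimal loading sketch, assuming the checkpoint exposes a single-output sequence-classification head used as a regressor; the MSE/RMSE metrics above suggest this, but the card does not confirm the head configuration:

    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k18_task7_organization"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForSequenceClassification.from_pretrained(repo)

    # Placeholder Arabic essay text; replace with a real essay to score.
    inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
    with torch.no_grad():
        score = model(**inputs).logits.squeeze().item()  # predicted organization score
    print(score)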