ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are computed follows the list):

  • Loss: 0.7316
  • Qwk (quadratic weighted kappa): 0.3892
  • Mse (mean squared error): 0.7316
  • Rmse (root mean squared error): 0.8553
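
The evaluation script is not included in this card; the sketch below shows how these metrics are conventionally computed with scikit-learn and NumPy. The arrays `y_true` and `y_pred` are illustrative placeholders, not data from this model.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative placeholders: gold organization scores and raw model outputs.
y_true = np.array([2, 3, 1, 4, 3])
y_pred = np.array([2.2, 2.8, 1.4, 3.6, 3.1])

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK expects discrete labels, so continuous predictions are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```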

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
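
The training script itself is not part of this card; below is a hedged sketch of how the listed hyperparameters would map onto Hugging Face `TrainingArguments`. The output directory is illustrative, and the evaluation/logging step settings are inferred from the training log below rather than stated explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # illustrative path, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # assumption, consistent with the per-2-step eval log below
    eval_steps=2,
    logging_steps=500,      # assumption: the training loss first appears at step 500
)
```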

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0571 2 2.6417 -0.0109 2.6417 1.6253
No log 0.1143 4 1.4861 0.1004 1.4861 1.2190
No log 0.1714 6 0.7882 0.1754 0.7882 0.8878
No log 0.2286 8 0.8500 0.1142 0.8500 0.9219
No log 0.2857 10 1.0687 0.0594 1.0687 1.0338
No log 0.3429 12 1.0178 0.1065 1.0178 1.0089
No log 0.4 14 0.8615 0.2518 0.8615 0.9282
No log 0.4571 16 0.8456 0.2967 0.8456 0.9196
No log 0.5143 18 1.0135 0.1709 1.0135 1.0067
No log 0.5714 20 1.0207 0.1949 1.0207 1.0103
No log 0.6286 22 0.8777 0.2463 0.8777 0.9369
No log 0.6857 24 0.7745 0.0359 0.7745 0.8800
No log 0.7429 26 0.7672 0.0 0.7672 0.8759
No log 0.8 28 0.7423 0.0840 0.7423 0.8616
No log 0.8571 30 0.7200 0.1922 0.7200 0.8485
No log 0.9143 32 0.7686 0.2463 0.7686 0.8767
No log 0.9714 34 0.7756 0.3439 0.7756 0.8807
No log 1.0286 36 0.5985 0.4393 0.5985 0.7737
No log 1.0857 38 0.5273 0.5159 0.5273 0.7262
No log 1.1429 40 0.5134 0.5115 0.5134 0.7165
No log 1.2 42 0.5554 0.5455 0.5554 0.7453
No log 1.2571 44 0.5742 0.5645 0.5742 0.7578
No log 1.3143 46 0.5140 0.5624 0.5140 0.7169
No log 1.3714 48 0.4896 0.5802 0.4896 0.6997
No log 1.4286 50 0.4664 0.4914 0.4664 0.6830
No log 1.4857 52 0.4813 0.4345 0.4813 0.6938
No log 1.5429 54 0.6542 0.4946 0.6542 0.8088
No log 1.6 56 0.6962 0.5119 0.6962 0.8344
No log 1.6571 58 0.5667 0.4674 0.5667 0.7528
No log 1.7143 60 0.5285 0.5050 0.5285 0.7270
No log 1.7714 62 0.5434 0.5044 0.5434 0.7371
No log 1.8286 64 0.5309 0.5663 0.5309 0.7286
No log 1.8857 66 0.5894 0.5809 0.5894 0.7677
No log 1.9429 68 0.5344 0.5991 0.5344 0.7310
No log 2.0 70 0.5502 0.5669 0.5502 0.7417
No log 2.0571 72 0.8173 0.4051 0.8173 0.9041
No log 2.1143 74 0.9749 0.3454 0.9749 0.9874
No log 2.1714 76 0.7519 0.5050 0.7519 0.8671
No log 2.2286 78 0.4843 0.5587 0.4843 0.6959
No log 2.2857 80 0.5078 0.5470 0.5078 0.7126
No log 2.3429 82 0.5012 0.5426 0.5012 0.7080
No log 2.4 84 0.4707 0.4938 0.4707 0.6861
No log 2.4571 86 0.5602 0.5083 0.5602 0.7484
No log 2.5143 88 0.5227 0.4908 0.5227 0.7230
No log 2.5714 90 0.4863 0.4515 0.4863 0.6974
No log 2.6286 92 0.4934 0.4538 0.4934 0.7025
No log 2.6857 94 0.5561 0.4776 0.5561 0.7457
No log 2.7429 96 0.7645 0.3847 0.7645 0.8743
No log 2.8 98 0.7481 0.4070 0.7481 0.8649
No log 2.8571 100 0.5870 0.5263 0.5870 0.7661
No log 2.9143 102 0.5033 0.6387 0.5033 0.7094
No log 2.9714 104 0.7498 0.5016 0.7498 0.8659
No log 3.0286 106 0.6683 0.5322 0.6683 0.8175
No log 3.0857 108 0.5168 0.5634 0.5168 0.7189
No log 3.1429 110 0.6294 0.5614 0.6294 0.7933
No log 3.2 112 0.6087 0.5400 0.6087 0.7802
No log 3.2571 114 0.4856 0.5649 0.4856 0.6968
No log 3.3143 116 0.4764 0.5201 0.4764 0.6902
No log 3.3714 118 0.4806 0.5882 0.4806 0.6933
No log 3.4286 120 0.4600 0.6289 0.4600 0.6783
No log 3.4857 122 0.4796 0.4329 0.4796 0.6925
No log 3.5429 124 0.5201 0.4315 0.5201 0.7212
No log 3.6 126 0.4627 0.5512 0.4627 0.6803
No log 3.6571 128 0.4272 0.6339 0.4272 0.6536
No log 3.7143 130 0.4393 0.6040 0.4393 0.6628
No log 3.7714 132 0.5208 0.6385 0.5208 0.7216
No log 3.8286 134 0.7048 0.5912 0.7048 0.8395
No log 3.8857 136 0.6376 0.5792 0.6376 0.7985
No log 3.9429 138 0.4523 0.6773 0.4523 0.6725
No log 4.0 140 0.4484 0.6541 0.4484 0.6696
No log 4.0571 142 0.4434 0.6431 0.4434 0.6658
No log 4.1143 144 0.4429 0.6228 0.4429 0.6655
No log 4.1714 146 0.4457 0.5796 0.4457 0.6676
No log 4.2286 148 0.4500 0.5747 0.4500 0.6708
No log 4.2857 150 0.4429 0.5304 0.4429 0.6655
No log 4.3429 152 0.4526 0.5933 0.4526 0.6728
No log 4.4 154 0.4390 0.5488 0.4390 0.6625
No log 4.4571 156 0.4679 0.5426 0.4679 0.6840
No log 4.5143 158 0.4425 0.5344 0.4425 0.6652
No log 4.5714 160 0.4514 0.5611 0.4514 0.6719
No log 4.6286 162 0.5641 0.5822 0.5641 0.7511
No log 4.6857 164 0.5560 0.6010 0.5560 0.7456
No log 4.7429 166 0.4549 0.6419 0.4549 0.6745
No log 4.8 168 0.4788 0.5291 0.4788 0.6919
No log 4.8571 170 0.4773 0.5081 0.4773 0.6909
No log 4.9143 172 0.4304 0.6828 0.4304 0.6561
No log 4.9714 174 0.4483 0.6141 0.4483 0.6696
No log 5.0286 176 0.4262 0.6491 0.4262 0.6529
No log 5.0857 178 0.4520 0.5786 0.4520 0.6723
No log 5.1429 180 0.4782 0.5379 0.4782 0.6915
No log 5.2 182 0.4199 0.6313 0.4199 0.6480
No log 5.2571 184 0.4650 0.6154 0.4650 0.6819
No log 5.3143 186 0.5772 0.6242 0.5772 0.7597
No log 5.3714 188 0.5295 0.6570 0.5295 0.7277
No log 5.4286 190 0.4625 0.6603 0.4625 0.6801
No log 5.4857 192 0.5004 0.6333 0.5004 0.7074
No log 5.5429 194 0.6013 0.5650 0.6013 0.7754
No log 5.6 196 0.5510 0.5520 0.5510 0.7423
No log 5.6571 198 0.4471 0.4837 0.4471 0.6687
No log 5.7143 200 0.4514 0.4869 0.4514 0.6718
No log 5.7714 202 0.4516 0.4752 0.4516 0.6720
No log 5.8286 204 0.5026 0.5481 0.5026 0.7090
No log 5.8857 206 0.5578 0.5439 0.5578 0.7469
No log 5.9429 208 0.5938 0.5520 0.5938 0.7706
No log 6.0 210 0.4660 0.5978 0.4660 0.6827
No log 6.0571 212 0.4314 0.6305 0.4314 0.6568
No log 6.1143 214 0.4672 0.6442 0.4672 0.6835
No log 6.1714 216 0.5225 0.6129 0.5225 0.7229
No log 6.2286 218 0.4905 0.5672 0.4905 0.7003
No log 6.2857 220 0.4235 0.5665 0.4235 0.6507
No log 6.3429 222 0.5028 0.5622 0.5028 0.7091
No log 6.4 224 0.6108 0.4880 0.6108 0.7815
No log 6.4571 226 0.6331 0.4880 0.6331 0.7957
No log 6.5143 228 0.5108 0.5907 0.5108 0.7147
No log 6.5714 230 0.4449 0.5422 0.4449 0.6670
No log 6.6286 232 0.4592 0.5105 0.4592 0.6776
No log 6.6857 234 0.4546 0.5085 0.4546 0.6743
No log 6.7429 236 0.4551 0.5587 0.4551 0.6746
No log 6.8 238 0.4663 0.5939 0.4663 0.6829
No log 6.8571 240 0.4907 0.6011 0.4907 0.7005
No log 6.9143 242 0.4712 0.6227 0.4712 0.6865
No log 6.9714 244 0.4524 0.6267 0.4524 0.6726
No log 7.0286 246 0.4800 0.5841 0.4800 0.6928
No log 7.0857 248 0.5433 0.5581 0.5433 0.7371
No log 7.1429 250 0.5744 0.5595 0.5744 0.7579
No log 7.2 252 0.5235 0.6248 0.5235 0.7236
No log 7.2571 254 0.4794 0.6121 0.4794 0.6924
No log 7.3143 256 0.6097 0.5065 0.6097 0.7808
No log 7.3714 258 0.7348 0.5310 0.7348 0.8572
No log 7.4286 260 0.6847 0.5310 0.6847 0.8275
No log 7.4857 262 0.5428 0.5117 0.5428 0.7367
No log 7.5429 264 0.4858 0.4949 0.4858 0.6970
No log 7.6 266 0.4849 0.4949 0.4849 0.6963
No log 7.6571 268 0.4857 0.4949 0.4857 0.6969
No log 7.7143 270 0.5300 0.5081 0.5300 0.7280
No log 7.7714 272 0.5617 0.5326 0.5617 0.7495
No log 7.8286 274 0.5722 0.5326 0.5722 0.7565
No log 7.8857 276 0.5599 0.5275 0.5599 0.7483
No log 7.9429 278 0.5529 0.5219 0.5529 0.7435
No log 8.0 280 0.5563 0.5233 0.5563 0.7459
No log 8.0571 282 0.5192 0.5151 0.5192 0.7206
No log 8.1143 284 0.5083 0.4569 0.5083 0.7130
No log 8.1714 286 0.5178 0.5597 0.5178 0.7196
No log 8.2286 288 0.5403 0.5452 0.5403 0.7350
No log 8.2857 290 0.5925 0.5275 0.5925 0.7697
No log 8.3429 292 0.6342 0.5163 0.6342 0.7964
No log 8.4 294 0.5912 0.5354 0.5912 0.7689
No log 8.4571 296 0.5212 0.5291 0.5212 0.7220
No log 8.5143 298 0.4955 0.5115 0.4955 0.7039
No log 8.5714 300 0.5006 0.5254 0.5006 0.7075
No log 8.6286 302 0.5168 0.4622 0.5168 0.7189
No log 8.6857 304 0.5079 0.5420 0.5079 0.7127
No log 8.7429 306 0.5127 0.5597 0.5127 0.7161
No log 8.8 308 0.4979 0.5214 0.4979 0.7056
No log 8.8571 310 0.5119 0.5214 0.5119 0.7155
No log 8.9143 312 0.5263 0.4788 0.5263 0.7255
No log 8.9714 314 0.4878 0.5022 0.4878 0.6984
No log 9.0286 316 0.4788 0.5750 0.4788 0.6920
No log 9.0857 318 0.4856 0.4847 0.4856 0.6968
No log 9.1429 320 0.5166 0.4883 0.5166 0.7187
No log 9.2 322 0.5042 0.4966 0.5042 0.7101
No log 9.2571 324 0.4818 0.4934 0.4818 0.6941
No log 9.3143 326 0.4935 0.5141 0.4935 0.7025
No log 9.3714 328 0.5101 0.5468 0.5101 0.7142
No log 9.4286 330 0.4846 0.4763 0.4846 0.6961
No log 9.4857 332 0.4828 0.5687 0.4828 0.6948
No log 9.5429 334 0.4833 0.5357 0.4833 0.6952
No log 9.6 336 0.4934 0.4681 0.4934 0.7024
No log 9.6571 338 0.6366 0.5085 0.6366 0.7979
No log 9.7143 340 0.7531 0.4519 0.7531 0.8678
No log 9.7714 342 0.6783 0.4438 0.6783 0.8236
No log 9.8286 344 0.5489 0.4749 0.5489 0.7409
No log 9.8857 346 0.5057 0.4547 0.5057 0.7111
No log 9.9429 348 0.4925 0.4983 0.4925 0.7018
No log 10.0 350 0.4890 0.4681 0.4890 0.6993
No log 10.0571 352 0.5184 0.5157 0.5184 0.7200
No log 10.1143 354 0.5403 0.5059 0.5403 0.7350
No log 10.1714 356 0.5113 0.4582 0.5113 0.7150
No log 10.2286 358 0.4901 0.5305 0.4901 0.7001
No log 10.2857 360 0.4922 0.4774 0.4922 0.7016
No log 10.3429 362 0.5312 0.4484 0.5312 0.7288
No log 10.4 364 0.6028 0.5139 0.6028 0.7764
No log 10.4571 366 0.5889 0.4812 0.5889 0.7674
No log 10.5143 368 0.5134 0.5081 0.5134 0.7165
No log 10.5714 370 0.4785 0.4561 0.4785 0.6917
No log 10.6286 372 0.4998 0.4888 0.4998 0.7069
No log 10.6857 374 0.4890 0.4888 0.4890 0.6993
No log 10.7429 376 0.4747 0.5095 0.4747 0.6890
No log 10.8 378 0.5735 0.4666 0.5735 0.7573
No log 10.8571 380 0.6508 0.4789 0.6508 0.8067
No log 10.9143 382 0.6254 0.4521 0.6254 0.7908
No log 10.9714 384 0.5513 0.4905 0.5513 0.7425
No log 11.0286 386 0.4826 0.4752 0.4826 0.6947
No log 11.0857 388 0.4860 0.4767 0.4860 0.6971
No log 11.1429 390 0.5065 0.5153 0.5065 0.7117
No log 11.2 392 0.4765 0.4838 0.4765 0.6903
No log 11.2571 394 0.4774 0.4681 0.4774 0.6909
No log 11.3143 396 0.5642 0.5190 0.5642 0.7511
No log 11.3714 398 0.6347 0.5124 0.6347 0.7967
No log 11.4286 400 0.5910 0.5139 0.5910 0.7688
No log 11.4857 402 0.5097 0.4207 0.5097 0.7139
No log 11.5429 404 0.4824 0.4171 0.4824 0.6946
No log 11.6 406 0.4784 0.4402 0.4784 0.6916
No log 11.6571 408 0.4908 0.4468 0.4908 0.7006
No log 11.7143 410 0.4962 0.4468 0.4962 0.7044
No log 11.7714 412 0.4982 0.4468 0.4982 0.7058
No log 11.8286 414 0.5142 0.4705 0.5142 0.7171
No log 11.8857 416 0.5468 0.5101 0.5468 0.7395
No log 11.9429 418 0.5285 0.4522 0.5285 0.7270
No log 12.0 420 0.5115 0.4609 0.5115 0.7152
No log 12.0571 422 0.4730 0.5235 0.4730 0.6878
No log 12.1143 424 0.4662 0.4817 0.4662 0.6828
No log 12.1714 426 0.4690 0.4538 0.4690 0.6849
No log 12.2286 428 0.4904 0.4929 0.4904 0.7003
No log 12.2857 430 0.5555 0.4100 0.5555 0.7453
No log 12.3429 432 0.5705 0.4270 0.5705 0.7553
No log 12.4 434 0.5290 0.4100 0.5290 0.7274
No log 12.4571 436 0.4882 0.4249 0.4882 0.6987
No log 12.5143 438 0.4903 0.4538 0.4903 0.7002
No log 12.5714 440 0.4952 0.4249 0.4952 0.7037
No log 12.6286 442 0.5035 0.3947 0.5035 0.7096
No log 12.6857 444 0.5315 0.4538 0.5315 0.7291
No log 12.7429 446 0.5964 0.3677 0.5964 0.7723
No log 12.8 448 0.6276 0.4218 0.6276 0.7922
No log 12.8571 450 0.6189 0.4112 0.6189 0.7867
No log 12.9143 452 0.6051 0.4112 0.6051 0.7779
No log 12.9714 454 0.5736 0.4371 0.5736 0.7574
No log 13.0286 456 0.5098 0.4729 0.5098 0.7140
No log 13.0857 458 0.4772 0.4795 0.4772 0.6908
No log 13.1429 460 0.4792 0.4795 0.4792 0.6922
No log 13.2 462 0.4761 0.4538 0.4761 0.6900
No log 13.2571 464 0.4911 0.4659 0.4911 0.7008
No log 13.3143 466 0.5450 0.4292 0.5450 0.7382
No log 13.3714 468 0.7054 0.4917 0.7054 0.8399
No log 13.4286 470 0.8414 0.4562 0.8414 0.9173
No log 13.4857 472 0.7986 0.4632 0.7986 0.8937
No log 13.5429 474 0.6926 0.4930 0.6926 0.8322
No log 13.6 476 0.5495 0.4534 0.5495 0.7413
No log 13.6571 478 0.4809 0.4425 0.4809 0.6934
No log 13.7143 480 0.4861 0.5030 0.4861 0.6972
No log 13.7714 482 0.4931 0.5030 0.4931 0.7022
No log 13.8286 484 0.4969 0.4314 0.4969 0.7049
No log 13.8857 486 0.5215 0.4224 0.5215 0.7222
No log 13.9429 488 0.5292 0.4171 0.5292 0.7275
No log 14.0 490 0.5251 0.4314 0.5251 0.7246
No log 14.0571 492 0.5171 0.4314 0.5171 0.7191
No log 14.1143 494 0.5189 0.4314 0.5189 0.7203
No log 14.1714 496 0.5334 0.4007 0.5334 0.7303
No log 14.2286 498 0.5668 0.3813 0.5668 0.7529
0.2801 14.2857 500 0.6085 0.5362 0.6085 0.7801
0.2801 14.3429 502 0.6059 0.5497 0.6059 0.7784
0.2801 14.4 504 0.5581 0.4724 0.5581 0.7471
0.2801 14.4571 506 0.5244 0.3625 0.5244 0.7242
0.2801 14.5143 508 0.5037 0.4698 0.5037 0.7097
0.2801 14.5714 510 0.5444 0.5619 0.5444 0.7378
0.2801 14.6286 512 0.5630 0.5841 0.5630 0.7504
0.2801 14.6857 514 0.5278 0.5390 0.5278 0.7265
0.2801 14.7429 516 0.4862 0.4538 0.4862 0.6973
0.2801 14.8 518 0.5331 0.4027 0.5331 0.7301
0.2801 14.8571 520 0.6338 0.4513 0.6338 0.7961
0.2801 14.9143 522 0.7243 0.3892 0.7243 0.8511
0.2801 14.9714 524 0.7316 0.3892 0.7316 0.8553

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
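
For completeness, here is a hedged loading and inference sketch against the versions listed above. Because the card reports regression-style metrics (MSE/RMSE) alongside QWK, a single-output sequence-classification (regression) head is assumed; the repository id is taken from the model's Hub page. If the checkpoint instead stores discrete class labels, the logits would be argmaxed rather than read as a single score.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)  # assumes a regression head
model.eval()

# Example essay text (Arabic); replace with the essay to be scored.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```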