ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k3_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how such metrics can be computed follows the list):

  • Loss: 0.5142
  • Qwk: 0.4468
  • Mse: 0.5142
  • Rmse: 0.7171
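
A minimal sketch, assuming scikit-learn, of how Qwk/Mse/Rmse values of this kind are typically computed. The evaluation code is not included in this card; in particular, rounding continuous predictions to integer scores before the kappa computation is an assumption, not something the card states.

```python
# Hedged sketch (not the card author's evaluation code) of computing
# quadratic weighted kappa (Qwk), MSE, and RMSE for score predictions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = mean_squared_error(y_true, y_pred)
    # Kappa needs discrete labels; rounding regression output is an assumption.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

print(eval_metrics([1, 2, 3, 2], [1.2, 1.8, 2.6, 2.1]))
```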

Model description

More information needed

Intended uses & limitations

More information needed
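
Pending the missing details above, here is a minimal, hypothetical loading sketch. It assumes the checkpoint carries a sequence-classification/regression head, which the reported Mse/Rmse metrics suggest but the card does not confirm.

```python
# Hypothetical usage sketch; the single-score head is an assumption
# inferred from the reported metrics, not stated in the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k3_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # "essay text here"
with torch.no_grad():
    score = model(**inputs).logits.squeeze()
print(score)
```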

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal TrainingArguments sketch mirroring them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
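
A minimal transformers TrainingArguments sketch mirroring the listed values. Dataset preparation, the model head, and metric computation are omitted, and this is not the author's actual training script.

```python
# Sketch reproducing the listed hyperparameters with transformers.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",            # assumed; not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # "Adam with betas=(0.9,0.999) and epsilon=1e-08" matches the
    # transformers default optimizer settings (AdamW):
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```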

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1176 2 2.5744 -0.0230 2.5744 1.6045
No log 0.2353 4 1.1833 0.0736 1.1833 1.0878
No log 0.3529 6 0.8444 0.0937 0.8444 0.9189
No log 0.4706 8 0.7939 0.1962 0.7939 0.8910
No log 0.5882 10 0.7502 0.2099 0.7502 0.8661
No log 0.7059 12 0.7964 0.2440 0.7964 0.8924
No log 0.8235 14 0.8944 0.2626 0.8944 0.9457
No log 0.9412 16 0.6696 0.4289 0.6696 0.8183
No log 1.0588 18 0.6607 0.4235 0.6607 0.8128
No log 1.1765 20 0.6697 0.4881 0.6697 0.8184
No log 1.2941 22 0.6226 0.4126 0.6226 0.7890
No log 1.4118 24 0.7257 0.3547 0.7257 0.8519
No log 1.5294 26 0.6316 0.4322 0.6316 0.7947
No log 1.6471 28 0.5895 0.3213 0.5895 0.7678
No log 1.7647 30 0.6519 0.4089 0.6519 0.8074
No log 1.8824 32 0.7156 0.4051 0.7156 0.8460
No log 2.0 34 0.6237 0.3996 0.6237 0.7897
No log 2.1176 36 0.6124 0.4292 0.6124 0.7825
No log 2.2353 38 0.6451 0.4464 0.6451 0.8032
No log 2.3529 40 0.8534 0.4462 0.8534 0.9238
No log 2.4706 42 0.9420 0.3923 0.9420 0.9706
No log 2.5882 44 0.8798 0.4250 0.8798 0.9380
No log 2.7059 46 0.6160 0.4419 0.6160 0.7848
No log 2.8235 48 0.5852 0.4857 0.5852 0.7650
No log 2.9412 50 0.5791 0.5095 0.5791 0.7610
No log 3.0588 52 0.6046 0.5098 0.6046 0.7776
No log 3.1765 54 0.5871 0.4872 0.5871 0.7663
No log 3.2941 56 0.5749 0.5291 0.5749 0.7582
No log 3.4118 58 0.5266 0.5517 0.5266 0.7257
No log 3.5294 60 0.5528 0.5298 0.5528 0.7435
No log 3.6471 62 0.5714 0.5486 0.5714 0.7559
No log 3.7647 64 0.5417 0.5486 0.5417 0.7360
No log 3.8824 66 0.5201 0.5471 0.5201 0.7211
No log 4.0 68 0.4940 0.6245 0.4940 0.7029
No log 4.1176 70 0.6473 0.4702 0.6473 0.8045
No log 4.2353 72 0.5973 0.4602 0.5973 0.7728
No log 4.3529 74 0.4972 0.5597 0.4972 0.7051
No log 4.4706 76 0.5848 0.4997 0.5848 0.7647
No log 4.5882 78 0.4733 0.6228 0.4733 0.6879
No log 4.7059 80 0.5885 0.4527 0.5885 0.7671
No log 4.8235 82 0.8016 0.3583 0.8016 0.8953
No log 4.9412 84 0.6488 0.3228 0.6488 0.8055
No log 5.0588 86 0.4991 0.5095 0.4991 0.7065
No log 5.1765 88 0.5795 0.4769 0.5795 0.7613
No log 5.2941 90 0.5231 0.5472 0.5231 0.7233
No log 5.4118 92 0.5737 0.5078 0.5737 0.7574
No log 5.5294 94 0.6327 0.4482 0.6327 0.7954
No log 5.6471 96 0.5486 0.4484 0.5486 0.7407
No log 5.7647 98 0.7566 0.4199 0.7566 0.8698
No log 5.8824 100 1.0217 0.4189 1.0217 1.0108
No log 6.0 102 0.9632 0.4481 0.9632 0.9814
No log 6.1176 104 0.6720 0.4023 0.6720 0.8198
No log 6.2353 106 0.5568 0.5024 0.5568 0.7462
No log 6.3529 108 0.5711 0.4096 0.5711 0.7557
No log 6.4706 110 0.5557 0.5141 0.5557 0.7455
No log 6.5882 112 0.7104 0.4550 0.7104 0.8429
No log 6.7059 114 0.8268 0.4877 0.8268 0.9093
No log 6.8235 116 0.7601 0.4741 0.7601 0.8719
No log 6.9412 118 0.6512 0.4335 0.6512 0.8069
No log 7.0588 120 0.4944 0.5609 0.4944 0.7031
No log 7.1765 122 0.5107 0.6156 0.5107 0.7146
No log 7.2941 124 0.4932 0.6020 0.4932 0.7023
No log 7.4118 126 0.4767 0.6046 0.4767 0.6904
No log 7.5294 128 0.5626 0.4864 0.5626 0.7501
No log 7.6471 130 0.5253 0.5324 0.5253 0.7248
No log 7.7647 132 0.5008 0.6197 0.5008 0.7077
No log 7.8824 134 0.6685 0.4978 0.6685 0.8176
No log 8.0 136 0.7384 0.4683 0.7384 0.8593
No log 8.1176 138 0.6013 0.5387 0.6013 0.7754
No log 8.2353 140 0.5111 0.5781 0.5111 0.7149
No log 8.3529 142 0.5312 0.4724 0.5312 0.7288
No log 8.4706 144 0.5725 0.4542 0.5725 0.7567
No log 8.5882 146 0.5227 0.5034 0.5227 0.7230
No log 8.7059 148 0.5242 0.5195 0.5242 0.7240
No log 8.8235 150 0.5331 0.5589 0.5331 0.7302
No log 8.9412 152 0.5619 0.5357 0.5619 0.7496
No log 9.0588 154 0.6476 0.5111 0.6476 0.8047
No log 9.1765 156 0.6726 0.5208 0.6726 0.8201
No log 9.2941 158 0.5804 0.5442 0.5804 0.7618
No log 9.4118 160 0.5656 0.5150 0.5656 0.7520
No log 9.5294 162 0.5693 0.5617 0.5693 0.7545
No log 9.6471 164 0.6528 0.5259 0.6528 0.8079
No log 9.7647 166 0.7140 0.4684 0.7140 0.8450
No log 9.8824 168 0.7496 0.4502 0.7496 0.8658
No log 10.0 170 0.6333 0.4827 0.6333 0.7958
No log 10.1176 172 0.5464 0.5214 0.5464 0.7392
No log 10.2353 174 0.5454 0.5070 0.5454 0.7385
No log 10.3529 176 0.5942 0.4727 0.5942 0.7709
No log 10.4706 178 0.5693 0.4901 0.5693 0.7545
No log 10.5882 180 0.5236 0.5095 0.5236 0.7236
No log 10.7059 182 0.6056 0.4925 0.6056 0.7782
No log 10.8235 184 0.6070 0.4925 0.6070 0.7791
No log 10.9412 186 0.5723 0.4997 0.5723 0.7565
No log 11.0588 188 0.5139 0.5601 0.5139 0.7169
No log 11.1765 190 0.4763 0.5734 0.4763 0.6901
No log 11.2941 192 0.4690 0.5584 0.4690 0.6848
No log 11.4118 194 0.4761 0.6214 0.4761 0.6900
No log 11.5294 196 0.5308 0.5342 0.5308 0.7286
No log 11.6471 198 0.5356 0.5342 0.5356 0.7319
No log 11.7647 200 0.5366 0.5591 0.5366 0.7325
No log 11.8824 202 0.4806 0.6227 0.4806 0.6932
No log 12.0 204 0.4617 0.6197 0.4617 0.6795
No log 12.1176 206 0.4566 0.6530 0.4566 0.6757
No log 12.2353 208 0.4898 0.6025 0.4898 0.6999
No log 12.3529 210 0.4746 0.5501 0.4746 0.6889
No log 12.4706 212 0.4510 0.6242 0.4510 0.6716
No log 12.5882 214 0.5044 0.5869 0.5044 0.7102
No log 12.7059 216 0.5079 0.6342 0.5079 0.7127
No log 12.8235 218 0.4666 0.6395 0.4666 0.6831
No log 12.9412 220 0.4661 0.6158 0.4661 0.6827
No log 13.0588 222 0.4759 0.6561 0.4759 0.6899
No log 13.1765 224 0.4847 0.6405 0.4847 0.6962
No log 13.2941 226 0.5086 0.6362 0.5086 0.7132
No log 13.4118 228 0.5349 0.6459 0.5349 0.7314
No log 13.5294 230 0.5106 0.5868 0.5106 0.7145
No log 13.6471 232 0.4684 0.6330 0.4684 0.6844
No log 13.7647 234 0.4982 0.5655 0.4982 0.7059
No log 13.8824 236 0.5295 0.5252 0.5295 0.7277
No log 14.0 238 0.5333 0.5252 0.5333 0.7303
No log 14.1176 240 0.4820 0.5485 0.4820 0.6943
No log 14.2353 242 0.4919 0.5750 0.4919 0.7014
No log 14.3529 244 0.5068 0.6130 0.5068 0.7119
No log 14.4706 246 0.4916 0.6398 0.4916 0.7011
No log 14.5882 248 0.4929 0.6862 0.4929 0.7021
No log 14.7059 250 0.4929 0.6942 0.4929 0.7021
No log 14.8235 252 0.5347 0.6870 0.5347 0.7312
No log 14.9412 254 0.5344 0.6468 0.5344 0.7310
No log 15.0588 256 0.4738 0.6793 0.4738 0.6884
No log 15.1765 258 0.4429 0.6929 0.4429 0.6655
No log 15.2941 260 0.4663 0.6101 0.4663 0.6829
No log 15.4118 262 0.5105 0.4967 0.5105 0.7145
No log 15.5294 264 0.5357 0.5022 0.5357 0.7319
No log 15.6471 266 0.4807 0.6188 0.4807 0.6933
No log 15.7647 268 0.4685 0.6441 0.4685 0.6845
No log 15.8824 270 0.4687 0.6088 0.4687 0.6846
No log 16.0 272 0.4715 0.6239 0.4715 0.6867
No log 16.1176 274 0.4754 0.5909 0.4754 0.6895
No log 16.2353 276 0.5021 0.5736 0.5021 0.7086
No log 16.3529 278 0.4999 0.5965 0.4999 0.7070
No log 16.4706 280 0.5000 0.5867 0.5000 0.7071
No log 16.5882 282 0.4903 0.5455 0.4903 0.7002
No log 16.7059 284 0.5040 0.6300 0.5040 0.7099
No log 16.8235 286 0.5212 0.6105 0.5212 0.7220
No log 16.9412 288 0.5250 0.5341 0.5250 0.7245
No log 17.0588 290 0.5235 0.5609 0.5235 0.7235
No log 17.1765 292 0.5244 0.5413 0.5244 0.7242
No log 17.2941 294 0.5286 0.5141 0.5286 0.7271
No log 17.4118 296 0.5717 0.5327 0.5717 0.7561
No log 17.5294 298 0.5967 0.5117 0.5967 0.7724
No log 17.6471 300 0.5682 0.5059 0.5682 0.7538
No log 17.7647 302 0.5390 0.5677 0.5390 0.7342
No log 17.8824 304 0.5157 0.5516 0.5157 0.7181
No log 18.0 306 0.5167 0.5836 0.5167 0.7188
No log 18.1176 308 0.5375 0.5597 0.5375 0.7331
No log 18.2353 310 0.5199 0.5836 0.5199 0.7211
No log 18.3529 312 0.4968 0.6201 0.4968 0.7049
No log 18.4706 314 0.4899 0.6414 0.4899 0.6999
No log 18.5882 316 0.4901 0.6241 0.4901 0.7000
No log 18.7059 318 0.5057 0.6388 0.5057 0.7111
No log 18.8235 320 0.5422 0.5538 0.5422 0.7363
No log 18.9412 322 0.5135 0.5812 0.5135 0.7166
No log 19.0588 324 0.5098 0.5596 0.5098 0.7140
No log 19.1765 326 0.4956 0.5446 0.4956 0.7040
No log 19.2941 328 0.4945 0.5980 0.4945 0.7032
No log 19.4118 330 0.4938 0.6169 0.4938 0.7027
No log 19.5294 332 0.4919 0.5915 0.4919 0.7014
No log 19.6471 334 0.4981 0.5656 0.4981 0.7057
No log 19.7647 336 0.5503 0.6170 0.5503 0.7418
No log 19.8824 338 0.5706 0.7057 0.5706 0.7554
No log 20.0 340 0.5519 0.6863 0.5519 0.7429
No log 20.1176 342 0.5394 0.5858 0.5394 0.7344
No log 20.2353 344 0.4954 0.5827 0.4954 0.7039
No log 20.3529 346 0.4628 0.6228 0.4628 0.6803
No log 20.4706 348 0.4434 0.5550 0.4434 0.6659
No log 20.5882 350 0.4572 0.6111 0.4572 0.6762
No log 20.7059 352 0.4580 0.5883 0.4580 0.6768
No log 20.8235 354 0.4520 0.5386 0.4520 0.6723
No log 20.9412 356 0.4895 0.5736 0.4895 0.6997
No log 21.0588 358 0.5193 0.5510 0.5193 0.7206
No log 21.1765 360 0.4944 0.5831 0.4944 0.7031
No log 21.2941 362 0.4622 0.6627 0.4622 0.6799
No log 21.4118 364 0.4838 0.5663 0.4838 0.6955
No log 21.5294 366 0.5064 0.5407 0.5064 0.7116
No log 21.6471 368 0.4786 0.5943 0.4786 0.6918
No log 21.7647 370 0.4618 0.6341 0.4618 0.6795
No log 21.8824 372 0.4559 0.6505 0.4559 0.6752
No log 22.0 374 0.4433 0.6636 0.4433 0.6658
No log 22.1176 376 0.4719 0.6066 0.4719 0.6869
No log 22.2353 378 0.5131 0.6036 0.5131 0.7163
No log 22.3529 380 0.5029 0.6036 0.5029 0.7092
No log 22.4706 382 0.4666 0.6677 0.4666 0.6831
No log 22.5882 384 0.4552 0.7158 0.4552 0.6747
No log 22.7059 386 0.4515 0.6797 0.4515 0.6720
No log 22.8235 388 0.4426 0.7208 0.4426 0.6653
No log 22.9412 390 0.4680 0.6698 0.4680 0.6841
No log 23.0588 392 0.4841 0.6431 0.4841 0.6958
No log 23.1765 394 0.4616 0.6491 0.4616 0.6794
No log 23.2941 396 0.4580 0.7104 0.4580 0.6768
No log 23.4118 398 0.4650 0.7284 0.4650 0.6819
No log 23.5294 400 0.4703 0.6899 0.4703 0.6858
No log 23.6471 402 0.4732 0.7101 0.4732 0.6879
No log 23.7647 404 0.4731 0.7273 0.4731 0.6878
No log 23.8824 406 0.4679 0.6919 0.4679 0.6841
No log 24.0 408 0.4767 0.6115 0.4767 0.6904
No log 24.1176 410 0.4910 0.5923 0.4910 0.7007
No log 24.2353 412 0.4840 0.5593 0.4840 0.6957
No log 24.3529 414 0.4842 0.5133 0.4842 0.6958
No log 24.4706 416 0.4953 0.5339 0.4953 0.7038
No log 24.5882 418 0.4795 0.6068 0.4795 0.6925
No log 24.7059 420 0.4610 0.6443 0.4610 0.6790
No log 24.8235 422 0.4581 0.6530 0.4581 0.6769
No log 24.9412 424 0.4638 0.6088 0.4638 0.6810
No log 25.0588 426 0.4469 0.6241 0.4469 0.6685
No log 25.1765 428 0.4415 0.6542 0.4415 0.6644
No log 25.2941 430 0.4597 0.6168 0.4597 0.6780
No log 25.4118 432 0.4630 0.5714 0.4630 0.6804
No log 25.5294 434 0.4552 0.5846 0.4552 0.6747
No log 25.6471 436 0.4724 0.5897 0.4724 0.6873
No log 25.7647 438 0.5761 0.5096 0.5761 0.7590
No log 25.8824 440 0.6618 0.5251 0.6618 0.8135
No log 26.0 442 0.6268 0.5147 0.6268 0.7917
No log 26.1176 444 0.5291 0.5206 0.5291 0.7274
No log 26.2353 446 0.4756 0.5248 0.4756 0.6897
No log 26.3529 448 0.4653 0.5248 0.4653 0.6821
No log 26.4706 450 0.4687 0.5657 0.4687 0.6846
No log 26.5882 452 0.5000 0.5111 0.5000 0.7071
No log 26.7059 454 0.5350 0.4684 0.5350 0.7314
No log 26.8235 456 0.5646 0.4997 0.5646 0.7514
No log 26.9412 458 0.5440 0.5206 0.5440 0.7375
No log 27.0588 460 0.4946 0.5867 0.4946 0.7032
No log 27.1765 462 0.4824 0.6326 0.4824 0.6946
No log 27.2941 464 0.4874 0.5974 0.4874 0.6982
No log 27.4118 466 0.4853 0.6623 0.4853 0.6967
No log 27.5294 468 0.4752 0.6252 0.4752 0.6894
No log 27.6471 470 0.4638 0.6650 0.4638 0.6810
No log 27.7647 472 0.4713 0.5797 0.4713 0.6865
No log 27.8824 474 0.4778 0.5596 0.4778 0.6912
No log 28.0 476 0.4748 0.5672 0.4748 0.6891
No log 28.1176 478 0.4648 0.6039 0.4648 0.6817
No log 28.2353 480 0.4860 0.5671 0.4860 0.6971
No log 28.3529 482 0.5246 0.5639 0.5246 0.7243
No log 28.4706 484 0.5493 0.5639 0.5493 0.7412
No log 28.5882 486 0.5385 0.5639 0.5385 0.7338
No log 28.7059 488 0.5182 0.5307 0.5182 0.7198
No log 28.8235 490 0.5049 0.5307 0.5049 0.7105
No log 28.9412 492 0.5074 0.5307 0.5074 0.7123
No log 29.0588 494 0.5000 0.5671 0.5000 0.7071
No log 29.1765 496 0.5030 0.5671 0.5030 0.7092
No log 29.2941 498 0.5045 0.4964 0.5045 0.7103
0.2772 29.4118 500 0.4922 0.4983 0.4922 0.7016
0.2772 29.5294 502 0.4824 0.4983 0.4824 0.6945
0.2772 29.6471 504 0.4875 0.5657 0.4875 0.6982
0.2772 29.7647 506 0.5173 0.4684 0.5173 0.7192
0.2772 29.8824 508 0.5791 0.4430 0.5791 0.7610
0.2772 30.0 510 0.6078 0.4759 0.6078 0.7796
0.2772 30.1176 512 0.5557 0.4845 0.5557 0.7455
0.2772 30.2353 514 0.4961 0.5131 0.4961 0.7043
0.2772 30.3529 516 0.4918 0.5248 0.4918 0.7013
0.2772 30.4706 518 0.5006 0.5600 0.5006 0.7076
0.2772 30.5882 520 0.5023 0.5600 0.5023 0.7087
0.2772 30.7059 522 0.5076 0.5248 0.5076 0.7124
0.2772 30.8235 524 0.5358 0.5601 0.5358 0.7320
0.2772 30.9412 526 0.5471 0.5677 0.5471 0.7397
0.2772 31.0588 528 0.5472 0.4614 0.5472 0.7398
0.2772 31.1765 530 0.5406 0.4614 0.5406 0.7353
0.2772 31.2941 532 0.5312 0.4803 0.5312 0.7289
0.2772 31.4118 534 0.5217 0.4538 0.5217 0.7223
0.2772 31.5294 536 0.5142 0.4468 0.5142 0.7171

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
