ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k11_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6197
  • Qwk: 0.4307
  • Mse: 0.6197
  • Rmse: 0.7872
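Loss equals Mse here, which suggests the model is trained as a regressor with an MSE objective, while Qwk (quadratic weighted kappa) is the standard agreement metric for ordinal essay scores. As a hedged, dependency-free sketch of how these three metrics can be computed (the 0–3 score scale and the toy labels below are illustrative assumptions, not values from this run):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK: 1 - (weighted observed disagreement) / (weighted expected disagreement)."""
    # Observed confusion-matrix counts.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_t = [sum(row) for row in O]                                   # gold label counts
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]  # predicted counts
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            E = hist_t[i] * hist_p[j] / n             # expected count under independence
            num += w * O[i][j]
            den += w * E
    return 1.0 - num / den

# Hypothetical gold and predicted organization scores on a 0-3 scale.
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 2, 2, 2, 1]

mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```

The same values can be obtained with `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` and `mean_squared_error`; the pure-Python version is shown only to make the definitions explicit.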

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
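These settings match common Hugging Face Trainer defaults. As a hedged illustration (not the actual training script), the configuration as a plain dict, plus a minimal implementation of the linear decay implied by `lr_scheduler_type: linear`, assuming zero warmup steps since none are listed:

```python
# Hyperparameters as reported above; names are illustrative, not the exact
# TrainingArguments field names.
config = {
    "learning_rate": 2e-5,
    "train_batch_size": 8,
    "eval_batch_size": 8,
    "seed": 42,
    "adam_betas": (0.9, 0.999),
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_epochs": 100,
}

def linear_lr(step, total_steps, base_lr=config["learning_rate"]):
    """Linear schedule with no warmup: decay from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

In the `transformers` library this schedule corresponds to `get_linear_schedule_with_warmup` with `num_warmup_steps=0`.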

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 2.5776 -0.0230 2.5776 1.6055
No log 0.0727 4 1.2860 0.1241 1.2860 1.1340
No log 0.1091 6 0.7795 0.2642 0.7795 0.8829
No log 0.1455 8 0.6051 0.4034 0.6051 0.7779
No log 0.1818 10 0.7668 0.3193 0.7668 0.8757
No log 0.2182 12 1.1712 -0.0231 1.1712 1.0822
No log 0.2545 14 1.2020 -0.2706 1.2020 1.0964
No log 0.2909 16 1.1846 0.0188 1.1846 1.0884
No log 0.3273 18 1.0342 0.2020 1.0342 1.0169
No log 0.3636 20 0.6520 0.4243 0.6520 0.8074
No log 0.4 22 0.6507 0.2430 0.6507 0.8066
No log 0.4364 24 0.7087 0.2494 0.7087 0.8419
No log 0.4727 26 0.6619 0.1714 0.6619 0.8136
No log 0.5091 28 0.6789 0.0798 0.6789 0.8239
No log 0.5455 30 0.8138 0.3099 0.8138 0.9021
No log 0.5818 32 1.0175 0.2240 1.0175 1.0087
No log 0.6182 34 1.0357 0.1944 1.0357 1.0177
No log 0.6545 36 0.9251 0.3140 0.9251 0.9618
No log 0.6909 38 0.7831 0.4002 0.7831 0.8849
No log 0.7273 40 0.6583 0.4493 0.6583 0.8114
No log 0.7636 42 0.6375 0.4358 0.6375 0.7985
No log 0.8 44 0.6391 0.4617 0.6391 0.7994
No log 0.8364 46 0.6516 0.4656 0.6516 0.8072
No log 0.8727 48 0.7110 0.4242 0.7110 0.8432
No log 0.9091 50 0.8742 0.2752 0.8742 0.9350
No log 0.9455 52 0.8160 0.3346 0.8160 0.9033
No log 0.9818 54 0.6717 0.3464 0.6717 0.8196
No log 1.0182 56 0.6389 0.3325 0.6389 0.7993
No log 1.0545 58 0.9201 0.2866 0.9201 0.9592
No log 1.0909 60 1.0096 0.0860 1.0096 1.0048
No log 1.1273 62 1.0698 0.1147 1.0698 1.0343
No log 1.1636 64 1.1264 0.1057 1.1264 1.0613
No log 1.2 66 0.9720 0.2460 0.9720 0.9859
No log 1.2364 68 0.7263 0.2652 0.7263 0.8522
No log 1.2727 70 0.6267 0.2884 0.6267 0.7917
No log 1.3091 72 0.5757 0.4338 0.5757 0.7587
No log 1.3455 74 0.5739 0.4808 0.5739 0.7576
No log 1.3818 76 0.6405 0.4642 0.6405 0.8003
No log 1.4182 78 0.6761 0.4529 0.6761 0.8223
No log 1.4545 80 0.6145 0.4940 0.6145 0.7839
No log 1.4909 82 0.5442 0.5770 0.5442 0.7377
No log 1.5273 84 0.5226 0.5361 0.5226 0.7229
No log 1.5636 86 0.5201 0.5324 0.5201 0.7212
No log 1.6 88 0.5158 0.5574 0.5158 0.7182
No log 1.6364 90 0.5218 0.4763 0.5218 0.7224
No log 1.6727 92 0.5657 0.5920 0.5657 0.7521
No log 1.7091 94 0.6482 0.4940 0.6482 0.8051
No log 1.7455 96 0.6830 0.4874 0.6830 0.8265
No log 1.7818 98 0.6055 0.5147 0.6055 0.7781
No log 1.8182 100 0.5397 0.4924 0.5397 0.7347
No log 1.8545 102 0.5286 0.6092 0.5286 0.7271
No log 1.8909 104 0.5346 0.5840 0.5346 0.7311
No log 1.9273 106 0.5366 0.6210 0.5366 0.7325
No log 1.9636 108 0.5301 0.5756 0.5301 0.7280
No log 2.0 110 0.6043 0.4830 0.6043 0.7773
No log 2.0364 112 0.6852 0.4898 0.6852 0.8278
No log 2.0727 114 0.6377 0.4684 0.6377 0.7986
No log 2.1091 116 0.5227 0.5768 0.5227 0.7229
No log 2.1455 118 0.5124 0.5995 0.5124 0.7158
No log 2.1818 120 0.5973 0.4904 0.5973 0.7728
No log 2.2182 122 0.8213 0.5170 0.8213 0.9062
No log 2.2545 124 0.9032 0.5002 0.9032 0.9504
No log 2.2909 126 0.7299 0.5438 0.7299 0.8543
No log 2.3273 128 0.5908 0.5170 0.5908 0.7687
No log 2.3636 130 0.5747 0.5170 0.5747 0.7581
No log 2.4 132 0.5485 0.4925 0.5485 0.7406
No log 2.4364 134 0.6138 0.5435 0.6138 0.7835
No log 2.4727 136 0.7631 0.4914 0.7631 0.8736
No log 2.5091 138 0.7332 0.4799 0.7332 0.8563
No log 2.5455 140 0.6082 0.4893 0.6082 0.7799
No log 2.5818 142 0.4926 0.5723 0.4926 0.7019
No log 2.6182 144 0.4995 0.5201 0.4995 0.7067
No log 2.6545 146 0.5305 0.5032 0.5305 0.7283
No log 2.6909 148 0.5908 0.5185 0.5908 0.7686
No log 2.7273 150 0.6114 0.5185 0.6114 0.7819
No log 2.7636 152 0.5739 0.5852 0.5739 0.7576
No log 2.8 154 0.5307 0.5324 0.5307 0.7285
No log 2.8364 156 0.5014 0.4526 0.5014 0.7081
No log 2.8727 158 0.4998 0.5909 0.4998 0.7070
No log 2.9091 160 0.5976 0.5246 0.5976 0.7731
No log 2.9455 162 0.7510 0.5043 0.7510 0.8666
No log 2.9818 164 0.7365 0.5294 0.7365 0.8582
No log 3.0182 166 0.6268 0.4819 0.6268 0.7917
No log 3.0545 168 0.6008 0.4392 0.6008 0.7751
No log 3.0909 170 0.6016 0.4824 0.6016 0.7756
No log 3.1273 172 0.6514 0.4943 0.6514 0.8071
No log 3.1636 174 0.6405 0.4943 0.6405 0.8003
No log 3.2 176 0.6609 0.5278 0.6609 0.8129
No log 3.2364 178 0.6437 0.5533 0.6437 0.8023
No log 3.2727 180 0.5696 0.5787 0.5696 0.7547
No log 3.3091 182 0.5847 0.5787 0.5847 0.7646
No log 3.3455 184 0.6498 0.5409 0.6498 0.8061
No log 3.3818 186 0.6572 0.5185 0.6572 0.8107
No log 3.4182 188 0.6605 0.4862 0.6605 0.8127
No log 3.4545 190 0.6525 0.4212 0.6525 0.8078
No log 3.4909 192 0.7414 0.4423 0.7414 0.8611
No log 3.5273 194 0.9489 0.4779 0.9489 0.9741
No log 3.5636 196 0.9101 0.4563 0.9101 0.9540
No log 3.6 198 0.7314 0.4748 0.7314 0.8552
No log 3.6364 200 0.8028 0.4968 0.8028 0.8960
No log 3.6727 202 0.7664 0.4865 0.7664 0.8754
No log 3.7091 204 0.6218 0.5059 0.6218 0.7885
No log 3.7455 206 0.5692 0.5166 0.5692 0.7545
No log 3.7818 208 0.5302 0.5567 0.5302 0.7282
No log 3.8182 210 0.5190 0.5159 0.5190 0.7204
No log 3.8545 212 0.5068 0.5801 0.5068 0.7119
No log 3.8909 214 0.5822 0.5652 0.5822 0.7630
No log 3.9273 216 0.5676 0.6232 0.5676 0.7534
No log 3.9636 218 0.5522 0.6130 0.5522 0.7431
No log 4.0 220 0.5433 0.5915 0.5433 0.7371
No log 4.0364 222 0.5608 0.5771 0.5608 0.7489
No log 4.0727 224 0.5361 0.6305 0.5361 0.7322
No log 4.1091 226 0.5281 0.5118 0.5281 0.7267
No log 4.1455 228 0.5282 0.5915 0.5282 0.7267
No log 4.1818 230 0.5183 0.6683 0.5183 0.7199
No log 4.2182 232 0.5130 0.6210 0.5130 0.7162
No log 4.2545 234 0.4877 0.6241 0.4877 0.6983
No log 4.2909 236 0.6357 0.5568 0.6357 0.7973
No log 4.3273 238 0.8605 0.3381 0.8605 0.9276
No log 4.3636 240 0.8261 0.3110 0.8261 0.9089
No log 4.4 242 0.7008 0.3972 0.7008 0.8371
No log 4.4364 244 0.6284 0.4867 0.6284 0.7927
No log 4.4727 246 0.6217 0.5111 0.6217 0.7885
No log 4.5091 248 0.7132 0.4470 0.7132 0.8445
No log 4.5455 250 0.8352 0.3390 0.8352 0.9139
No log 4.5818 252 0.9437 0.2968 0.9437 0.9714
No log 4.6182 254 0.8548 0.3759 0.8548 0.9246
No log 4.6545 256 0.6496 0.5243 0.6496 0.8060
No log 4.6909 258 0.5078 0.6122 0.5078 0.7126
No log 4.7273 260 0.4993 0.6135 0.4993 0.7066
No log 4.7636 262 0.4739 0.6070 0.4739 0.6884
No log 4.8 264 0.4481 0.6065 0.4481 0.6694
No log 4.8364 266 0.4505 0.6222 0.4505 0.6712
No log 4.8727 268 0.4732 0.6526 0.4732 0.6879
No log 4.9091 270 0.4685 0.6484 0.4685 0.6845
No log 4.9455 272 0.4858 0.5835 0.4858 0.6970
No log 4.9818 274 0.4888 0.6199 0.4888 0.6991
No log 5.0182 276 0.4747 0.6021 0.4747 0.6890
No log 5.0545 278 0.4636 0.6656 0.4636 0.6809
No log 5.0909 280 0.5330 0.6547 0.5330 0.7301
No log 5.1273 282 0.5808 0.5552 0.5808 0.7621
No log 5.1636 284 0.5498 0.6351 0.5498 0.7415
No log 5.2 286 0.5084 0.6307 0.5084 0.7130
No log 5.2364 288 0.6022 0.6080 0.6022 0.7760
No log 5.2727 290 0.6629 0.5658 0.6629 0.8142
No log 5.3091 292 0.6093 0.5323 0.6093 0.7806
No log 5.3455 294 0.5331 0.5647 0.5331 0.7301
No log 5.3818 296 0.5015 0.5184 0.5015 0.7082
No log 5.4182 298 0.5148 0.5798 0.5148 0.7175
No log 5.4545 300 0.5724 0.5933 0.5724 0.7566
No log 5.4909 302 0.5983 0.5761 0.5983 0.7735
No log 5.5273 304 0.6219 0.5735 0.6219 0.7886
No log 5.5636 306 0.5787 0.5512 0.5787 0.7607
No log 5.6 308 0.5258 0.6187 0.5258 0.7251
No log 5.6364 310 0.5582 0.4911 0.5582 0.7472
No log 5.6727 312 0.6451 0.4624 0.6451 0.8032
No log 5.7091 314 0.6434 0.4788 0.6434 0.8021
No log 5.7455 316 0.6756 0.4630 0.6756 0.8219
No log 5.7818 318 0.6093 0.4438 0.6093 0.7806
No log 5.8182 320 0.5231 0.5577 0.5231 0.7233
No log 5.8545 322 0.5446 0.5345 0.5446 0.7380
No log 5.8909 324 0.6108 0.5591 0.6108 0.7815
No log 5.9273 326 0.5727 0.5705 0.5727 0.7568
No log 5.9636 328 0.5103 0.4934 0.5103 0.7144
No log 6.0 330 0.5075 0.5344 0.5075 0.7124
No log 6.0364 332 0.5433 0.6082 0.5433 0.7371
No log 6.0727 334 0.5647 0.5577 0.5647 0.7515
No log 6.1091 336 0.6238 0.5512 0.6238 0.7898
No log 6.1455 338 0.6603 0.4424 0.6603 0.8126
No log 6.1818 340 0.6347 0.4728 0.6347 0.7967
No log 6.2182 342 0.5977 0.5330 0.5977 0.7731
No log 6.2545 344 0.5431 0.5036 0.5431 0.7369
No log 6.2909 346 0.5077 0.5550 0.5077 0.7125
No log 6.3273 348 0.5252 0.5457 0.5252 0.7247
No log 6.3636 350 0.5196 0.5668 0.5196 0.7209
No log 6.4 352 0.5250 0.5991 0.5250 0.7246
No log 6.4364 354 0.5246 0.5772 0.5246 0.7243
No log 6.4727 356 0.5409 0.6404 0.5409 0.7354
No log 6.5091 358 0.5095 0.5201 0.5095 0.7138
No log 6.5455 360 0.5386 0.4929 0.5386 0.7339
No log 6.5818 362 0.5540 0.4684 0.5540 0.7443
No log 6.6182 364 0.5377 0.4524 0.5377 0.7333
No log 6.6545 366 0.5737 0.4451 0.5737 0.7575
No log 6.6909 368 0.6388 0.5200 0.6388 0.7992
No log 6.7273 370 0.5832 0.5394 0.5832 0.7636
No log 6.7636 372 0.4988 0.4847 0.4988 0.7063
No log 6.8 374 0.5011 0.5714 0.5011 0.7079
No log 6.8364 376 0.5301 0.5265 0.5301 0.7281
No log 6.8727 378 0.5075 0.5714 0.5075 0.7124
No log 6.9091 380 0.5041 0.4705 0.5041 0.7100
No log 6.9455 382 0.7137 0.5234 0.7137 0.8448
No log 6.9818 384 0.9237 0.3999 0.9237 0.9611
No log 7.0182 386 0.8822 0.3919 0.8822 0.9393
No log 7.0545 388 0.7124 0.4432 0.7124 0.8440
No log 7.0909 390 0.5451 0.4158 0.5451 0.7383
No log 7.1273 392 0.5146 0.4729 0.5146 0.7174
No log 7.1636 394 0.5133 0.4659 0.5133 0.7164
No log 7.2 396 0.5401 0.4945 0.5401 0.7349
No log 7.2364 398 0.6710 0.4203 0.6710 0.8192
No log 7.2727 400 0.8691 0.4194 0.8691 0.9323
No log 7.3091 402 0.9133 0.4402 0.9133 0.9556
No log 7.3455 404 0.7620 0.5460 0.7620 0.8729
No log 7.3818 406 0.5455 0.5978 0.5455 0.7386
No log 7.4182 408 0.4570 0.5768 0.4570 0.6760
No log 7.4545 410 0.4677 0.5538 0.4677 0.6839
No log 7.4909 412 0.4889 0.5698 0.4889 0.6992
No log 7.5273 414 0.5246 0.5784 0.5246 0.7243
No log 7.5636 416 0.6332 0.5362 0.6332 0.7958
No log 7.6 418 0.7323 0.3807 0.7323 0.8557
No log 7.6364 420 0.7331 0.4828 0.7331 0.8562
No log 7.6727 422 0.6155 0.5358 0.6155 0.7845
No log 7.7091 424 0.4935 0.5845 0.4935 0.7025
No log 7.7455 426 0.4740 0.5965 0.4740 0.6885
No log 7.7818 428 0.4768 0.5965 0.4768 0.6905
No log 7.8182 430 0.4957 0.5337 0.4957 0.7040
No log 7.8545 432 0.5103 0.6063 0.5103 0.7143
No log 7.8909 434 0.5334 0.5839 0.5334 0.7304
No log 7.9273 436 0.5682 0.5678 0.5682 0.7538
No log 7.9636 438 0.5622 0.5730 0.5622 0.7498
No log 8.0 440 0.5222 0.6025 0.5222 0.7226
No log 8.0364 442 0.4932 0.6313 0.4932 0.7023
No log 8.0727 444 0.5030 0.6025 0.5030 0.7092
No log 8.1091 446 0.5569 0.5744 0.5569 0.7462
No log 8.1455 448 0.5441 0.5802 0.5441 0.7377
No log 8.1818 450 0.5041 0.6493 0.5041 0.7100
No log 8.2182 452 0.4803 0.6401 0.4803 0.6931
No log 8.2545 454 0.4884 0.5831 0.4884 0.6988
No log 8.2909 456 0.5237 0.6128 0.5237 0.7237
No log 8.3273 458 0.4930 0.6337 0.4930 0.7021
No log 8.3636 460 0.4471 0.6365 0.4471 0.6686
No log 8.4 462 0.4694 0.6530 0.4694 0.6851
No log 8.4364 464 0.4824 0.6604 0.4824 0.6945
No log 8.4727 466 0.4607 0.6228 0.4607 0.6787
No log 8.5091 468 0.4727 0.6032 0.4727 0.6875
No log 8.5455 470 0.4762 0.5753 0.4762 0.6900
No log 8.5818 472 0.4727 0.6720 0.4727 0.6876
No log 8.6182 474 0.4710 0.6975 0.4710 0.6863
No log 8.6545 476 0.4501 0.6198 0.4501 0.6709
No log 8.6909 478 0.4403 0.5707 0.4403 0.6635
No log 8.7273 480 0.4608 0.6096 0.4608 0.6788
No log 8.7636 482 0.5065 0.6507 0.5065 0.7117
No log 8.8 484 0.5260 0.6498 0.5260 0.7252
No log 8.8364 486 0.5405 0.6154 0.5405 0.7352
No log 8.8727 488 0.5472 0.5953 0.5472 0.7397
No log 8.9091 490 0.5206 0.5858 0.5206 0.7215
No log 8.9455 492 0.5243 0.5553 0.5243 0.7241
No log 8.9818 494 0.5505 0.5858 0.5505 0.7420
No log 9.0182 496 0.5611 0.5858 0.5611 0.7491
No log 9.0545 498 0.6172 0.5827 0.6172 0.7856
0.3373 9.0909 500 0.6314 0.5295 0.6314 0.7946
0.3373 9.1273 502 0.5878 0.5631 0.5878 0.7667
0.3373 9.1636 504 0.5841 0.5439 0.5841 0.7643
0.3373 9.2 506 0.5778 0.5243 0.5778 0.7601
0.3373 9.2364 508 0.5575 0.5659 0.5575 0.7467
0.3373 9.2727 510 0.4809 0.5708 0.4809 0.6935
0.3373 9.3091 512 0.4724 0.6503 0.4724 0.6873
0.3373 9.3455 514 0.4672 0.6721 0.4672 0.6835
0.3373 9.3818 516 0.4797 0.6503 0.4797 0.6926
0.3373 9.4182 518 0.5237 0.6011 0.5237 0.7237
0.3373 9.4545 520 0.5379 0.6226 0.5379 0.7334
0.3373 9.4909 522 0.5006 0.6318 0.5006 0.7076
0.3373 9.5273 524 0.5024 0.5831 0.5024 0.7088
0.3373 9.5636 526 0.4854 0.5801 0.4854 0.6967
0.3373 9.6 528 0.5079 0.5831 0.5079 0.7126
0.3373 9.6364 530 0.5182 0.5801 0.5182 0.7199
0.3373 9.6727 532 0.4921 0.6303 0.4921 0.7015
0.3373 9.7091 534 0.4648 0.6228 0.4648 0.6817
0.3373 9.7455 536 0.4620 0.6542 0.4620 0.6797
0.3373 9.7818 538 0.4921 0.6188 0.4921 0.7015
0.3373 9.8182 540 0.5073 0.5864 0.5073 0.7122
0.3373 9.8545 542 0.4923 0.6606 0.4923 0.7017
0.3373 9.8909 544 0.4648 0.6530 0.4648 0.6818
0.3373 9.9273 546 0.4578 0.6263 0.4578 0.6766
0.3373 9.9636 548 0.4614 0.6263 0.4614 0.6793
0.3373 10.0 550 0.4697 0.5632 0.4697 0.6854
0.3373 10.0364 552 0.4546 0.5770 0.4546 0.6742
0.3373 10.0727 554 0.4637 0.5770 0.4637 0.6810
0.3373 10.1091 556 0.4582 0.5799 0.4582 0.6769
0.3373 10.1455 558 0.4429 0.6467 0.4429 0.6655
0.3373 10.1818 560 0.4511 0.6156 0.4511 0.6716
0.3373 10.2182 562 0.4766 0.6082 0.4766 0.6904
0.3373 10.2545 564 0.5126 0.5882 0.5126 0.7160
0.3373 10.2909 566 0.5707 0.4982 0.5707 0.7554
0.3373 10.3273 568 0.5845 0.4550 0.5845 0.7645
0.3373 10.3636 570 0.5815 0.4550 0.5815 0.7625
0.3373 10.4 572 0.5781 0.5017 0.5781 0.7603
0.3373 10.4364 574 0.6197 0.4307 0.6197 0.7872
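In the table, Validation Loss and Mse are identical and Rmse is the square root of Mse, consistent with an MSE training objective; the final row matches the headline evaluation results above. A quick sanity check on that row:

```python
import math

# Final-row values copied from the table above.
val_loss, qwk, mse, rmse = 0.6197, 0.4307, 0.6197, 0.7872

assert val_loss == mse                     # the loss reported is the MSE itself
assert abs(math.sqrt(mse) - rmse) < 5e-4  # rmse = sqrt(mse), rounded to 4 decimals
```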

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (safetensors, F32 tensors)
