ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k12_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5049
  • Qwk (quadratic weighted kappa): 0.4774
  • Mse (mean squared error): 0.5049
  • Rmse (root mean squared error): 0.7106
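For reference, the reported Rmse is simply the square root of the reported Mse, and Qwk is the quadratic weighted kappa commonly used for ordinal scoring tasks such as essay-organization rating. A minimal NumPy sketch of both checks follows; the function name and the class count are illustrative, not taken from the training code:

```python
import math
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for ordinal labels 0..n_classes-1."""
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    weights = np.array([[(i - j) ** 2 for j in range(n_classes)]
                        for i in range(n_classes)], dtype=float)
    weights /= (n_classes - 1) ** 2
    # Expected agreement matrix under independent marginals.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Perfect agreement gives kappa = 1.0
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], n_classes=4))

# The reported RMSE is the square root of the reported MSE
print(round(math.sqrt(0.5049), 4))  # 0.7106
```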

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
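With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 2e-05 at step 0 down to 0 at the final training step. A minimal sketch of that schedule, assuming zero warmup; the function name and the total step count are illustrative, not from the training script:

```python
def linear_lr(step, total_steps, base_lr=2e-5):
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0, 6000))     # 2e-05 at the start of training
print(linear_lr(3000, 6000))  # 1e-05 halfway through
print(linear_lr(6000, 6000))  # 0.0 at the end
```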

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0333 2 2.7443 -0.0593 2.7443 1.6566
No log 0.0667 4 1.2455 0.1243 1.2455 1.1160
No log 0.1 6 0.7866 0.0937 0.7866 0.8869
No log 0.1333 8 0.7359 0.2012 0.7359 0.8579
No log 0.1667 10 0.7198 0.1587 0.7198 0.8484
No log 0.2 12 0.7381 0.2057 0.7381 0.8591
No log 0.2333 14 0.7210 0.3388 0.7210 0.8491
No log 0.2667 16 0.6292 0.4018 0.6292 0.7932
No log 0.3 18 0.5640 0.4211 0.5640 0.7510
No log 0.3333 20 0.5344 0.4747 0.5344 0.7310
No log 0.3667 22 0.5256 0.4891 0.5256 0.7250
No log 0.4 24 0.6579 0.3484 0.6579 0.8111
No log 0.4333 26 0.6508 0.2880 0.6508 0.8067
No log 0.4667 28 0.5148 0.4538 0.5148 0.7175
No log 0.5 30 0.5906 0.5291 0.5906 0.7685
No log 0.5333 32 0.5816 0.6038 0.5816 0.7626
No log 0.5667 34 0.4766 0.4774 0.4766 0.6903
No log 0.6 36 0.6294 0.5069 0.6294 0.7933
No log 0.6333 38 0.5651 0.4589 0.5651 0.7517
No log 0.6667 40 0.4463 0.5719 0.4463 0.6681
No log 0.7 42 0.5474 0.6660 0.5474 0.7398
No log 0.7333 44 0.4794 0.6070 0.4794 0.6924
No log 0.7667 46 0.4765 0.4928 0.4765 0.6903
No log 0.8 48 0.7770 0.4286 0.7770 0.8815
No log 0.8333 50 0.7177 0.4382 0.7177 0.8472
No log 0.8667 52 0.4910 0.5042 0.4910 0.7007
No log 0.9 54 0.6179 0.5624 0.6179 0.7861
No log 0.9333 56 0.5443 0.5965 0.5443 0.7378
No log 0.9667 58 0.5999 0.4584 0.5999 0.7745
No log 1.0 60 0.9032 0.4171 0.9032 0.9504
No log 1.0333 62 1.0239 0.2701 1.0239 1.0119
No log 1.0667 64 0.7845 0.4382 0.7845 0.8857
No log 1.1 66 0.5105 0.4364 0.5105 0.7145
No log 1.1333 68 0.6063 0.5931 0.6063 0.7787
No log 1.1667 70 0.6528 0.5358 0.6528 0.8079
No log 1.2 72 0.5485 0.5802 0.5485 0.7406
No log 1.2333 74 0.5178 0.4539 0.5178 0.7196
No log 1.2667 76 0.6926 0.3827 0.6926 0.8322
No log 1.3 78 0.7759 0.4088 0.7759 0.8808
No log 1.3333 80 0.6952 0.4238 0.6952 0.8338
No log 1.3667 82 0.6120 0.5152 0.6120 0.7823
No log 1.4 84 0.5489 0.5586 0.5489 0.7409
No log 1.4333 86 0.5985 0.5772 0.5985 0.7737
No log 1.4667 88 0.6982 0.5231 0.6982 0.8356
No log 1.5 90 0.5941 0.5036 0.5941 0.7708
No log 1.5333 92 0.5467 0.4384 0.5467 0.7394
No log 1.5667 94 0.5604 0.4345 0.5604 0.7486
No log 1.6 96 0.5608 0.3779 0.5608 0.7489
No log 1.6333 98 0.5230 0.4918 0.5230 0.7232
No log 1.6667 100 0.5245 0.5555 0.5245 0.7242
No log 1.7 102 0.5121 0.5543 0.5121 0.7156
No log 1.7333 104 0.5200 0.5758 0.5200 0.7211
No log 1.7667 106 0.5061 0.5397 0.5061 0.7114
No log 1.8 108 0.5612 0.5212 0.5612 0.7492
No log 1.8333 110 0.6204 0.5220 0.6204 0.7876
No log 1.8667 112 0.5734 0.5778 0.5734 0.7573
No log 1.9 114 0.5533 0.6314 0.5533 0.7439
No log 1.9333 116 0.6051 0.5819 0.6051 0.7779
No log 1.9667 118 0.5886 0.5675 0.5886 0.7672
No log 2.0 120 0.5254 0.6247 0.5254 0.7248
No log 2.0333 122 0.5135 0.5797 0.5135 0.7166
No log 2.0667 124 0.5246 0.6247 0.5246 0.7243
No log 2.1 126 0.5455 0.5836 0.5455 0.7386
No log 2.1333 128 0.5382 0.5663 0.5382 0.7336
No log 2.1667 130 0.5348 0.5736 0.5348 0.7313
No log 2.2 132 0.4913 0.5159 0.4913 0.7009
No log 2.2333 134 0.4889 0.5577 0.4889 0.6992
No log 2.2667 136 0.5143 0.6034 0.5143 0.7171
No log 2.3 138 0.6536 0.4494 0.6536 0.8084
No log 2.3333 140 0.6594 0.4890 0.6594 0.8120
No log 2.3667 142 0.5156 0.6395 0.5156 0.7181
No log 2.4 144 0.6118 0.5739 0.6118 0.7822
No log 2.4333 146 0.6845 0.5885 0.6845 0.8273
No log 2.4667 148 0.7649 0.5526 0.7649 0.8746
No log 2.5 150 0.6243 0.5802 0.6243 0.7901
No log 2.5333 152 0.5238 0.4576 0.5238 0.7237
No log 2.5667 154 0.5172 0.4918 0.5172 0.7192
No log 2.6 156 0.5310 0.4849 0.5310 0.7287
No log 2.6333 158 0.5453 0.5098 0.5453 0.7384
No log 2.6667 160 0.5309 0.5286 0.5309 0.7286
No log 2.7 162 0.5373 0.5655 0.5373 0.7330
No log 2.7333 164 0.5741 0.5379 0.5741 0.7577
No log 2.7667 166 0.5036 0.5816 0.5036 0.7097
No log 2.8 168 0.4921 0.5335 0.4921 0.7015
No log 2.8333 170 0.5240 0.6092 0.5240 0.7239
No log 2.8667 172 0.4612 0.5523 0.4612 0.6791
No log 2.9 174 0.5436 0.5802 0.5436 0.7373
No log 2.9333 176 0.6266 0.5139 0.6266 0.7916
No log 2.9667 178 0.5155 0.5379 0.5155 0.7180
No log 3.0 180 0.4553 0.6339 0.4553 0.6747
No log 3.0333 182 0.4626 0.5302 0.4626 0.6801
No log 3.0667 184 0.4704 0.6027 0.4704 0.6859
No log 3.1 186 0.4837 0.6895 0.4837 0.6955
No log 3.1333 188 0.5225 0.6078 0.5225 0.7228
No log 3.1667 190 0.4948 0.6329 0.4948 0.7034
No log 3.2 192 0.5000 0.5592 0.5000 0.7071
No log 3.2333 194 0.5921 0.5595 0.5921 0.7695
No log 3.2667 196 0.5466 0.5883 0.5466 0.7393
No log 3.3 198 0.4558 0.5335 0.4558 0.6752
No log 3.3333 200 0.4473 0.6228 0.4473 0.6688
No log 3.3667 202 0.4703 0.6426 0.4703 0.6858
No log 3.4 204 0.4599 0.5719 0.4599 0.6782
No log 3.4333 206 0.4677 0.4991 0.4677 0.6839
No log 3.4667 208 0.4747 0.5195 0.4747 0.6890
No log 3.5 210 0.4834 0.4953 0.4834 0.6953
No log 3.5333 212 0.5203 0.5516 0.5203 0.7213
No log 3.5667 214 0.5242 0.5516 0.5242 0.7240
No log 3.6 216 0.5117 0.5362 0.5117 0.7154
No log 3.6333 218 0.4879 0.6339 0.4879 0.6985
No log 3.6667 220 0.5162 0.5433 0.5162 0.7184
No log 3.7 222 0.5848 0.5323 0.5848 0.7648
No log 3.7333 224 0.5864 0.5323 0.5864 0.7658
No log 3.7667 226 0.5882 0.5323 0.5882 0.7669
No log 3.8 228 0.5128 0.5614 0.5128 0.7161
No log 3.8333 230 0.5263 0.4832 0.5263 0.7254
No log 3.8667 232 0.6349 0.5129 0.6349 0.7968
No log 3.9 234 0.6163 0.4904 0.6163 0.7851
No log 3.9333 236 0.5228 0.4527 0.5228 0.7231
No log 3.9667 238 0.5641 0.5259 0.5641 0.7511
No log 4.0 240 0.6817 0.5080 0.6817 0.8257
No log 4.0333 242 0.6740 0.5080 0.6740 0.8210
No log 4.0667 244 0.5514 0.6261 0.5514 0.7426
No log 4.1 246 0.5288 0.5075 0.5288 0.7272
No log 4.1333 248 0.5408 0.5110 0.5408 0.7354
No log 4.1667 250 0.5043 0.5340 0.5043 0.7101
No log 4.2 252 0.5412 0.5786 0.5412 0.7356
No log 4.2333 254 0.6552 0.4610 0.6552 0.8095
No log 4.2667 256 0.7766 0.5017 0.7766 0.8812
No log 4.3 258 0.6577 0.4801 0.6577 0.8110
No log 4.3333 260 0.5405 0.4835 0.5405 0.7352
No log 4.3667 262 0.5047 0.5698 0.5047 0.7104
No log 4.4 264 0.4839 0.6303 0.4839 0.6957
No log 4.4333 266 0.4886 0.5580 0.4886 0.6990
No log 4.4667 268 0.5816 0.5528 0.5816 0.7626
No log 4.5 270 0.5849 0.5599 0.5849 0.7648
No log 4.5333 272 0.5272 0.4991 0.5272 0.7261
No log 4.5667 274 0.5721 0.4788 0.5721 0.7564
No log 4.6 276 0.6386 0.4759 0.6386 0.7991
No log 4.6333 278 0.6346 0.4451 0.6346 0.7966
No log 4.6667 280 0.6450 0.4451 0.6450 0.8031
No log 4.7 282 0.6267 0.4684 0.6267 0.7916
No log 4.7333 284 0.5800 0.4707 0.5800 0.7616
No log 4.7667 286 0.5736 0.4983 0.5736 0.7574
No log 4.8 288 0.5704 0.5151 0.5704 0.7552
No log 4.8333 290 0.5912 0.5151 0.5912 0.7689
No log 4.8667 292 0.6440 0.4134 0.6440 0.8025
No log 4.9 294 0.6403 0.4186 0.6403 0.8002
No log 4.9333 296 0.5678 0.5584 0.5678 0.7535
No log 4.9667 298 0.5791 0.4770 0.5791 0.7610
No log 5.0 300 0.5728 0.4227 0.5728 0.7569
No log 5.0333 302 0.5464 0.5584 0.5464 0.7392
No log 5.0667 304 0.5817 0.5177 0.5817 0.7627
No log 5.1 306 0.5802 0.5177 0.5802 0.7617
No log 5.1333 308 0.5527 0.5597 0.5527 0.7434
No log 5.1667 310 0.5341 0.5131 0.5341 0.7308
No log 5.2 312 0.5207 0.4322 0.5207 0.7216
No log 5.2333 314 0.5308 0.4590 0.5308 0.7286
No log 5.2667 316 0.5400 0.4717 0.5400 0.7348
No log 5.3 318 0.5205 0.4590 0.5205 0.7215
No log 5.3333 320 0.5192 0.5396 0.5192 0.7205
No log 5.3667 322 0.5486 0.5195 0.5486 0.7407
No log 5.4 324 0.5466 0.5195 0.5466 0.7393
No log 5.4333 326 0.5456 0.4945 0.5456 0.7387
No log 5.4667 328 0.5234 0.5228 0.5234 0.7235
No log 5.5 330 0.4976 0.5404 0.4976 0.7054
No log 5.5333 332 0.4881 0.6096 0.4881 0.6986
No log 5.5667 334 0.4841 0.6431 0.4841 0.6958
No log 5.6 336 0.4811 0.6210 0.4811 0.6936
No log 5.6333 338 0.5199 0.6084 0.5199 0.7210
No log 5.6667 340 0.5307 0.6096 0.5307 0.7285
No log 5.7 342 0.5119 0.5893 0.5119 0.7155
No log 5.7333 344 0.4951 0.6277 0.4951 0.7037
No log 5.7667 346 0.5493 0.6115 0.5493 0.7411
No log 5.8 348 0.5648 0.5577 0.5648 0.7515
No log 5.8333 350 0.6046 0.5310 0.6046 0.7776
No log 5.8667 352 0.5989 0.5310 0.5989 0.7739
No log 5.9 354 0.5327 0.5516 0.5327 0.7299
No log 5.9333 356 0.5266 0.5614 0.5266 0.7257
No log 5.9667 358 0.5177 0.5131 0.5177 0.7195
No log 6.0 360 0.5242 0.5939 0.5242 0.7240
No log 6.0333 362 0.5290 0.5897 0.5290 0.7273
No log 6.0667 364 0.5082 0.5926 0.5082 0.7129
No log 6.1 366 0.4937 0.6060 0.4937 0.7026
No log 6.1333 368 0.4828 0.6060 0.4828 0.6948
No log 6.1667 370 0.4771 0.5750 0.4771 0.6907
No log 6.2 372 0.4769 0.5304 0.4769 0.6906
No log 6.2333 374 0.4811 0.5505 0.4811 0.6936
No log 6.2667 376 0.5552 0.5498 0.5552 0.7451
No log 6.3 378 0.6275 0.5773 0.6275 0.7922
No log 6.3333 380 0.5899 0.5310 0.5899 0.7681
No log 6.3667 382 0.5022 0.5170 0.5022 0.7087
No log 6.4 384 0.4643 0.5522 0.4643 0.6814
No log 6.4333 386 0.4639 0.5493 0.4639 0.6811
No log 6.4667 388 0.4507 0.5714 0.4507 0.6714
No log 6.5 390 0.4686 0.5995 0.4686 0.6846
No log 6.5333 392 0.4928 0.5947 0.4928 0.7020
No log 6.5667 394 0.4615 0.6623 0.4615 0.6793
No log 6.6 396 0.4426 0.5999 0.4426 0.6653
No log 6.6333 398 0.5275 0.5657 0.5275 0.7263
No log 6.6667 400 0.5776 0.5178 0.5776 0.7600
No log 6.7 402 0.5159 0.5748 0.5159 0.7183
No log 6.7333 404 0.4468 0.5522 0.4468 0.6684
No log 6.7667 406 0.5051 0.5831 0.5051 0.7107
No log 6.8 408 0.5587 0.5624 0.5587 0.7475
No log 6.8333 410 0.5665 0.5067 0.5665 0.7527
No log 6.8667 412 0.5056 0.5752 0.5056 0.7111
No log 6.9 414 0.4659 0.4768 0.4659 0.6825
No log 6.9333 416 0.4784 0.5714 0.4784 0.6916
No log 6.9667 418 0.4702 0.5930 0.4702 0.6857
No log 7.0 420 0.4503 0.6214 0.4503 0.6710
No log 7.0333 422 0.4949 0.5671 0.4949 0.7035
No log 7.0667 424 0.5337 0.5882 0.5337 0.7305
No log 7.1 426 0.4766 0.5671 0.4766 0.6904
No log 7.1333 428 0.4475 0.6214 0.4475 0.6690
No log 7.1667 430 0.4708 0.5446 0.4708 0.6862
No log 7.2 432 0.4811 0.5110 0.4811 0.6936
No log 7.2333 434 0.4644 0.6364 0.4644 0.6814
No log 7.2667 436 0.4609 0.6426 0.4609 0.6789
No log 7.3 438 0.4606 0.6736 0.4606 0.6787
No log 7.3333 440 0.4637 0.6530 0.4637 0.6809
No log 7.3667 442 0.4651 0.6530 0.4651 0.6820
No log 7.4 444 0.4685 0.5584 0.4685 0.6845
No log 7.4333 446 0.4779 0.6186 0.4779 0.6913
No log 7.4667 448 0.4907 0.5614 0.4907 0.7005
No log 7.5 450 0.4937 0.5614 0.4937 0.7026
No log 7.5333 452 0.4829 0.5852 0.4829 0.6949
No log 7.5667 454 0.4782 0.5852 0.4782 0.6915
No log 7.6 456 0.4542 0.6096 0.4542 0.6739
No log 7.6333 458 0.4524 0.5831 0.4524 0.6726
No log 7.6667 460 0.4935 0.4977 0.4935 0.7025
No log 7.7 462 0.4981 0.4908 0.4981 0.7058
No log 7.7333 464 0.4670 0.5831 0.4670 0.6834
No log 7.7667 466 0.4413 0.6330 0.4413 0.6643
No log 7.8 468 0.4484 0.6317 0.4484 0.6696
No log 7.8333 470 0.4431 0.6330 0.4431 0.6656
No log 7.8667 472 0.4467 0.6053 0.4467 0.6684
No log 7.9 474 0.4556 0.5985 0.4556 0.6750
No log 7.9333 476 0.4731 0.6013 0.4731 0.6878
No log 7.9667 478 0.5228 0.5178 0.5228 0.7230
No log 8.0 480 0.4875 0.5083 0.4875 0.6982
No log 8.0333 482 0.4400 0.6255 0.4400 0.6633
No log 8.0667 484 0.4409 0.6115 0.4409 0.6640
No log 8.1 486 0.4483 0.6201 0.4483 0.6695
No log 8.1333 488 0.4622 0.5980 0.4622 0.6799
No log 8.1667 490 0.4548 0.6303 0.4548 0.6744
No log 8.2 492 0.4648 0.6082 0.4648 0.6818
No log 8.2333 494 0.4714 0.6082 0.4714 0.6866
No log 8.2667 496 0.4827 0.5786 0.4827 0.6947
No log 8.3 498 0.4749 0.6082 0.4749 0.6891
0.293 8.3333 500 0.4775 0.5648 0.4775 0.6910
0.293 8.3667 502 0.4845 0.5044 0.4845 0.6960
0.293 8.4 504 0.4957 0.4561 0.4957 0.7041
0.293 8.4333 506 0.4827 0.4795 0.4827 0.6948
0.293 8.4667 508 0.4622 0.5457 0.4622 0.6799
0.293 8.5 510 0.4492 0.5930 0.4492 0.6702
0.293 8.5333 512 0.4740 0.5414 0.4740 0.6884
0.293 8.5667 514 0.5326 0.5247 0.5326 0.7298
0.293 8.6 516 0.4920 0.5811 0.4920 0.7015
0.293 8.6333 518 0.4335 0.6330 0.4335 0.6584
0.293 8.6667 520 0.5223 0.5677 0.5223 0.7227
0.293 8.7 522 0.6227 0.4644 0.6227 0.7891
0.293 8.7333 524 0.5945 0.4815 0.5945 0.7711
0.293 8.7667 526 0.5778 0.5233 0.5778 0.7602
0.293 8.8 528 0.5474 0.5233 0.5474 0.7399
0.293 8.8333 530 0.5049 0.4774 0.5049 0.7106
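One detail that can be read off the log: epoch 1.0 falls at step 60, so there were about 60 optimizer steps per epoch, and with a train batch size of 8 this implies roughly 480 training examples. Training stopped after about 8.83 of the configured 100 epochs, presumably due to early stopping. A quick back-of-the-envelope check (assuming no gradient accumulation and no dropped last batch):

```python
# Derived from the training log above: epoch 1.0 corresponds to step 60.
steps_per_epoch = 60
train_batch_size = 8

# Approximate training-set size implied by the log.
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 480

# Sanity check against an early log row: step 2 is reported as epoch 0.0333.
print(round(2 / steps_per_epoch, 4))  # 0.0333
```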

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree

  • Base model: aubmindlab/bert-base-arabertv02
  • This model: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k12_task7_organization