ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are typically computed follows the list):

  • Loss: 0.6103
  • Qwk: 0.4486
  • Mse: 0.6103
  • Rmse: 0.7812
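
Here, Qwk is Cohen's quadratic weighted kappa between predicted and reference scores, Mse is the mean squared error, and Rmse is its square root. Below is a minimal sketch of how such metrics can be reproduced, assuming scikit-learn/NumPy and that continuous predictions are rounded to the integer score scale before computing kappa (an assumption about this run's evaluation code):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    # Regression-style error metrics.
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # Quadratic weighted kappa expects discrete categories, so the
    # continuous outputs are rounded to the nearest integer score
    # (a hypothetical choice; the actual evaluation code may differ).
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```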

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a rough TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
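
As a rough sketch, these settings map onto the Hugging Face TrainingArguments below. Dataset loading, the metric function, and the Trainer call are omitted; the output directory and the single-logit regression head are assumptions, not taken from the card:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# A single regression output is consistent with the MSE/RMSE metrics
# reported above (num_labels=1 is an assumption about this run).
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)

training_args = TrainingArguments(
    output_dir="./results",            # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                    # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```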

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0667 2 2.4928 -0.0109 2.4928 1.5788
No log 0.1333 4 1.2261 0.1248 1.2261 1.1073
No log 0.2 6 0.7822 0.1786 0.7822 0.8844
No log 0.2667 8 0.6526 0.0937 0.6526 0.8078
No log 0.3333 10 0.5926 0.3942 0.5926 0.7698
No log 0.4 12 0.7680 0.3166 0.7680 0.8764
No log 0.4667 14 0.5575 0.5265 0.5575 0.7467
No log 0.5333 16 0.7021 0.3846 0.7021 0.8379
No log 0.6 18 0.9719 0.2589 0.9719 0.9859
No log 0.6667 20 0.9151 0.2316 0.9151 0.9566
No log 0.7333 22 0.8717 0.2471 0.8717 0.9336
No log 0.8 24 0.8149 0.0393 0.8149 0.9027
No log 0.8667 26 0.7647 0.1508 0.7647 0.8745
No log 0.9333 28 0.7886 0.4224 0.7886 0.8880
No log 1.0 30 0.7708 0.3746 0.7708 0.8779
No log 1.0667 32 0.6210 0.1508 0.6210 0.7880
No log 1.1333 34 0.6113 0.2979 0.6113 0.7818
No log 1.2 36 0.5934 0.3302 0.5934 0.7703
No log 1.2667 38 0.5824 0.3890 0.5824 0.7632
No log 1.3333 40 0.5675 0.4077 0.5675 0.7534
No log 1.4 42 0.5751 0.3144 0.5751 0.7583
No log 1.4667 44 0.6323 0.4745 0.6323 0.7952
No log 1.5333 46 0.6054 0.4134 0.6054 0.7781
No log 1.6 48 0.5731 0.4418 0.5731 0.7571
No log 1.6667 50 0.6009 0.4615 0.6009 0.7752
No log 1.7333 52 0.6255 0.4858 0.6255 0.7909
No log 1.8 54 0.7743 0.4811 0.7743 0.8799
No log 1.8667 56 0.8880 0.3671 0.8880 0.9423
No log 1.9333 58 0.7284 0.4868 0.7284 0.8534
No log 2.0 60 0.6514 0.5244 0.6514 0.8071
No log 2.0667 62 0.8552 0.4754 0.8552 0.9248
No log 2.1333 64 0.9121 0.4476 0.9121 0.9550
No log 2.2 66 0.6951 0.4779 0.6951 0.8338
No log 2.2667 68 0.7652 0.3517 0.7652 0.8748
No log 2.3333 70 0.8361 0.3312 0.8361 0.9144
No log 2.4 72 0.7072 0.3907 0.7072 0.8410
No log 2.4667 74 0.6047 0.3183 0.6047 0.7776
No log 2.5333 76 0.6273 0.3471 0.6273 0.7920
No log 2.6 78 0.6452 0.3918 0.6452 0.8033
No log 2.6667 80 0.6017 0.4020 0.6017 0.7757
No log 2.7333 82 0.5615 0.3599 0.5615 0.7493
No log 2.8 84 0.5628 0.4157 0.5628 0.7502
No log 2.8667 86 0.5535 0.4700 0.5535 0.7440
No log 2.9333 88 0.5596 0.5421 0.5596 0.7481
No log 3.0 90 0.5666 0.5189 0.5666 0.7527
No log 3.0667 92 0.5521 0.4703 0.5521 0.7430
No log 3.1333 94 0.5785 0.5457 0.5785 0.7606
No log 3.2 96 0.5541 0.5915 0.5541 0.7444
No log 3.2667 98 0.5767 0.4895 0.5767 0.7594
No log 3.3333 100 0.5513 0.5768 0.5513 0.7425
No log 3.4 102 0.5573 0.5257 0.5573 0.7465
No log 3.4667 104 0.5598 0.5390 0.5598 0.7482
No log 3.5333 106 0.7265 0.4635 0.7265 0.8524
No log 3.6 108 0.6658 0.5184 0.6658 0.8160
No log 3.6667 110 0.5365 0.5413 0.5365 0.7325
No log 3.7333 112 0.7133 0.4705 0.7133 0.8445
No log 3.8 114 0.9460 0.3781 0.9460 0.9726
No log 3.8667 116 0.9540 0.4125 0.9540 0.9767
No log 3.9333 118 0.6812 0.4614 0.6812 0.8254
No log 4.0 120 0.5699 0.5098 0.5699 0.7549
No log 4.0667 122 0.6392 0.4868 0.6392 0.7995
No log 4.1333 124 0.5896 0.4614 0.5896 0.7678
No log 4.2 126 0.5516 0.4958 0.5516 0.7427
No log 4.2667 128 0.6579 0.5195 0.6579 0.8111
No log 4.3333 130 0.6368 0.4893 0.6368 0.7980
No log 4.4 132 0.5510 0.5687 0.5510 0.7423
No log 4.4667 134 0.5115 0.5648 0.5115 0.7152
No log 4.5333 136 0.5087 0.5623 0.5087 0.7132
No log 4.6 138 0.5357 0.6004 0.5357 0.7319
No log 4.6667 140 0.5599 0.5617 0.5599 0.7482
No log 4.7333 142 0.5367 0.5937 0.5367 0.7326
No log 4.8 144 0.5554 0.5589 0.5554 0.7453
No log 4.8667 146 0.5925 0.5543 0.5925 0.7697
No log 4.9333 148 0.6649 0.5559 0.6649 0.8154
No log 5.0 150 0.6898 0.5774 0.6898 0.8306
No log 5.0667 152 0.6924 0.5824 0.6924 0.8321
No log 5.1333 154 0.6271 0.5786 0.6271 0.7919
No log 5.2 156 0.5959 0.4640 0.5959 0.7720
No log 5.2667 158 0.5876 0.4937 0.5876 0.7666
No log 5.3333 160 0.6193 0.4134 0.6193 0.7870
No log 5.4 162 0.6052 0.4828 0.6052 0.7779
No log 5.4667 164 0.6308 0.5009 0.6308 0.7942
No log 5.5333 166 0.7427 0.4562 0.7427 0.8618
No log 5.6 168 0.6972 0.4261 0.6972 0.8350
No log 5.6667 170 0.6285 0.4458 0.6285 0.7928
No log 5.7333 172 0.7789 0.4098 0.7789 0.8826
No log 5.8 174 0.9682 0.3538 0.9682 0.9840
No log 5.8667 176 0.8866 0.3302 0.8866 0.9416
No log 5.9333 178 0.6936 0.4123 0.6936 0.8328
No log 6.0 180 0.6137 0.4845 0.6137 0.7834
No log 6.0667 182 0.5721 0.4966 0.5721 0.7564
No log 6.1333 184 0.5720 0.5028 0.5720 0.7563
No log 6.2 186 0.5922 0.5026 0.5922 0.7695
No log 6.2667 188 0.5882 0.4937 0.5882 0.7669
No log 6.3333 190 0.6658 0.4925 0.6658 0.8160
No log 6.4 192 0.6578 0.5117 0.6578 0.8110
No log 6.4667 194 0.6055 0.4555 0.6055 0.7781
No log 6.5333 196 0.5617 0.4898 0.5617 0.7495
No log 6.6 198 0.5712 0.4960 0.5712 0.7558
No log 6.6667 200 0.5563 0.5152 0.5563 0.7458
No log 6.7333 202 0.5436 0.5323 0.5436 0.7373
No log 6.8 204 0.5857 0.5624 0.5857 0.7653
No log 6.8667 206 0.7459 0.5309 0.7459 0.8636
No log 6.9333 208 0.9034 0.4758 0.9034 0.9505
No log 7.0 210 0.8876 0.4926 0.8876 0.9421
No log 7.0667 212 0.7163 0.4874 0.7163 0.8463
No log 7.1333 214 0.6253 0.5229 0.6253 0.7907
No log 7.2 216 0.6136 0.5572 0.6136 0.7833
No log 7.2667 218 0.6025 0.5572 0.6025 0.7762
No log 7.3333 220 0.5937 0.4408 0.5937 0.7706
No log 7.4 222 0.7036 0.4114 0.7036 0.8388
No log 7.4667 224 0.7274 0.4255 0.7274 0.8529
No log 7.5333 226 0.6318 0.3869 0.6318 0.7948
No log 7.6 228 0.5998 0.4251 0.5998 0.7745
No log 7.6667 230 0.5755 0.4329 0.5755 0.7586
No log 7.7333 232 0.5620 0.4827 0.5620 0.7497
No log 7.8 234 0.5883 0.4964 0.5883 0.7670
No log 7.8667 236 0.5608 0.4898 0.5608 0.7489
No log 7.9333 238 0.5211 0.5237 0.5211 0.7219
No log 8.0 240 0.4898 0.5860 0.4898 0.6999
No log 8.0667 242 0.5007 0.5714 0.5007 0.7076
No log 8.1333 244 0.5093 0.5958 0.5093 0.7137
No log 8.2 246 0.5640 0.6056 0.5640 0.7510
No log 8.2667 248 0.6256 0.5812 0.6256 0.7910
No log 8.3333 250 0.6033 0.5875 0.6033 0.7767
No log 8.4 252 0.5390 0.6022 0.5390 0.7342
No log 8.4667 254 0.5174 0.5640 0.5174 0.7193
No log 8.5333 256 0.5176 0.5840 0.5176 0.7195
No log 8.6 258 0.5446 0.6130 0.5446 0.7380
No log 8.6667 260 0.6512 0.6179 0.6512 0.8070
No log 8.7333 262 0.6817 0.5719 0.6817 0.8256
No log 8.8 264 0.6062 0.5625 0.6062 0.7786
No log 8.8667 266 0.5927 0.5659 0.5927 0.7698
No log 8.9333 268 0.5535 0.5528 0.5535 0.7440
No log 9.0 270 0.5605 0.5403 0.5605 0.7487
No log 9.0667 272 0.5293 0.6135 0.5293 0.7275
No log 9.1333 274 0.4832 0.6564 0.4832 0.6952
No log 9.2 276 0.5024 0.5663 0.5024 0.7088
No log 9.2667 278 0.5001 0.5683 0.5001 0.7072
No log 9.3333 280 0.5151 0.6040 0.5151 0.7177
No log 9.4 282 0.5564 0.5852 0.5564 0.7459
No log 9.4667 284 0.5899 0.6188 0.5899 0.7681
No log 9.5333 286 0.5452 0.5583 0.5452 0.7384
No log 9.6 288 0.5431 0.5472 0.5431 0.7369
No log 9.6667 290 0.5588 0.4934 0.5588 0.7475
No log 9.7333 292 0.5734 0.4919 0.5734 0.7572
No log 9.8 294 0.5969 0.5801 0.5969 0.7726
No log 9.8667 296 0.5827 0.5876 0.5827 0.7633
No log 9.9333 298 0.5521 0.5813 0.5521 0.7430
No log 10.0 300 0.5366 0.5756 0.5366 0.7325
No log 10.0667 302 0.5350 0.5926 0.5350 0.7314
No log 10.1333 304 0.5445 0.5926 0.5445 0.7379
No log 10.2 306 0.5647 0.5666 0.5647 0.7514
No log 10.2667 308 0.5678 0.5543 0.5678 0.7535
No log 10.3333 310 0.5676 0.5566 0.5676 0.7534
No log 10.4 312 0.5801 0.5368 0.5801 0.7616
No log 10.4667 314 0.5976 0.5470 0.5976 0.7731
No log 10.5333 316 0.5913 0.4302 0.5913 0.7690
No log 10.6 318 0.6219 0.4819 0.6219 0.7886
No log 10.6667 320 0.6359 0.4819 0.6359 0.7974
No log 10.7333 322 0.5960 0.5063 0.5960 0.7720
No log 10.8 324 0.5569 0.4681 0.5569 0.7462
No log 10.8667 326 0.5621 0.5119 0.5621 0.7497
No log 10.9333 328 0.5722 0.5119 0.5722 0.7565
No log 11.0 330 0.6089 0.5333 0.6089 0.7803
No log 11.0667 332 0.6116 0.5290 0.6116 0.7820
No log 11.1333 334 0.6000 0.5290 0.6000 0.7746
No log 11.2 336 0.5705 0.5113 0.5705 0.7553
No log 11.2667 338 0.5809 0.6075 0.5809 0.7621
No log 11.3333 340 0.5882 0.6011 0.5882 0.7669
No log 11.4 342 0.6737 0.5452 0.6737 0.8208
No log 11.4667 344 0.8982 0.4087 0.8982 0.9477
No log 11.5333 346 0.9712 0.4110 0.9712 0.9855
No log 11.6 348 0.8262 0.5124 0.8262 0.9089
No log 11.6667 350 0.6356 0.5258 0.6356 0.7972
No log 11.7333 352 0.5499 0.5560 0.5499 0.7415
No log 11.8 354 0.5403 0.4300 0.5403 0.7351
No log 11.8667 356 0.5558 0.3813 0.5558 0.7455
No log 11.9333 358 0.6039 0.4576 0.6039 0.7771
No log 12.0 360 0.7087 0.4587 0.7087 0.8418
No log 12.0667 362 0.7356 0.4805 0.7356 0.8577
No log 12.1333 364 0.7269 0.4805 0.7269 0.8526
No log 12.2 366 0.6799 0.5204 0.6799 0.8246
No log 12.2667 368 0.5765 0.5383 0.5765 0.7592
No log 12.3333 370 0.5304 0.4813 0.5304 0.7283
No log 12.4 372 0.5374 0.4813 0.5374 0.7331
No log 12.4667 374 0.5892 0.5152 0.5892 0.7676
No log 12.5333 376 0.5776 0.5553 0.5776 0.7600
No log 12.6 378 0.5208 0.5151 0.5208 0.7217
No log 12.6667 380 0.5029 0.5926 0.5029 0.7092
No log 12.7333 382 0.4941 0.5926 0.4941 0.7029
No log 12.8 384 0.5012 0.5926 0.5012 0.7080
No log 12.8667 386 0.5163 0.5756 0.5163 0.7186
No log 12.9333 388 0.5311 0.5363 0.5311 0.7288
No log 13.0 390 0.5380 0.5273 0.5380 0.7335
No log 13.0667 392 0.5398 0.5273 0.5398 0.7347
No log 13.1333 394 0.5347 0.5674 0.5347 0.7312
No log 13.2 396 0.5392 0.5165 0.5392 0.7343
No log 13.2667 398 0.5510 0.5622 0.5510 0.7423
No log 13.3333 400 0.5767 0.5958 0.5767 0.7594
No log 13.4 402 0.5851 0.6129 0.5851 0.7649
No log 13.4667 404 0.5634 0.5871 0.5634 0.7506
No log 13.5333 406 0.5683 0.5737 0.5683 0.7539
No log 13.6 408 0.5810 0.5814 0.5810 0.7623
No log 13.6667 410 0.5417 0.5978 0.5417 0.7360
No log 13.7333 412 0.5070 0.5756 0.5070 0.7120
No log 13.8 414 0.4900 0.5306 0.4900 0.7000
No log 13.8667 416 0.5028 0.4895 0.5028 0.7091
No log 13.9333 418 0.5110 0.4801 0.5110 0.7149
No log 14.0 420 0.5311 0.5009 0.5311 0.7287
No log 14.0667 422 0.5549 0.4524 0.5549 0.7449
No log 14.1333 424 0.5967 0.4321 0.5967 0.7725
No log 14.2 426 0.6801 0.4893 0.6801 0.8247
No log 14.2667 428 0.7909 0.4852 0.7909 0.8893
No log 14.3333 430 0.8151 0.4584 0.8151 0.9028
No log 14.4 432 0.7708 0.3740 0.7708 0.8780
No log 14.4667 434 0.6857 0.3918 0.6857 0.8280
No log 14.5333 436 0.6837 0.4165 0.6837 0.8269
No log 14.6 438 0.6748 0.4646 0.6748 0.8215
No log 14.6667 440 0.7232 0.4982 0.7232 0.8504
No log 14.7333 442 0.7130 0.4423 0.7130 0.8444
No log 14.8 444 0.7202 0.4315 0.7202 0.8487
No log 14.8667 446 0.7756 0.4051 0.7756 0.8807
No log 14.9333 448 0.7701 0.3940 0.7701 0.8775
No log 15.0 450 0.7107 0.4307 0.7107 0.8430
No log 15.0667 452 0.6650 0.3092 0.6650 0.8155
No log 15.1333 454 0.6371 0.2787 0.6371 0.7982
No log 15.2 456 0.6199 0.3416 0.6199 0.7874
No log 15.2667 458 0.6214 0.4081 0.6214 0.7883
No log 15.3333 460 0.6732 0.5416 0.6732 0.8205
No log 15.4 462 0.7103 0.5204 0.7103 0.8428
No log 15.4667 464 0.6787 0.5204 0.6787 0.8238
No log 15.5333 466 0.6466 0.5246 0.6466 0.8041
No log 15.6 468 0.5971 0.4126 0.5971 0.7727
No log 15.6667 470 0.5808 0.4635 0.5808 0.7621
No log 15.7333 472 0.5775 0.4715 0.5775 0.7599
No log 15.8 474 0.5756 0.5051 0.5756 0.7587
No log 15.8667 476 0.6035 0.5368 0.6035 0.7769
No log 15.9333 478 0.6348 0.6096 0.6348 0.7967
No log 16.0 480 0.6352 0.6096 0.6352 0.7970
No log 16.0667 482 0.6015 0.5417 0.6015 0.7756
No log 16.1333 484 0.5762 0.4919 0.5762 0.7591
No log 16.2 486 0.5655 0.4992 0.5655 0.7520
No log 16.2667 488 0.5669 0.4767 0.5669 0.7529
No log 16.3333 490 0.5796 0.4360 0.5796 0.7613
No log 16.4 492 0.6047 0.4822 0.6047 0.7776
No log 16.4667 494 0.6475 0.5705 0.6475 0.8047
No log 16.5333 496 0.6898 0.5204 0.6898 0.8306
No log 16.6 498 0.6976 0.5204 0.6976 0.8352
0.2637 16.6667 500 0.6427 0.5692 0.6427 0.8017
0.2637 16.7333 502 0.5917 0.5093 0.5917 0.7692
0.2637 16.8 504 0.5936 0.4547 0.5936 0.7704
0.2637 16.8667 506 0.6085 0.4240 0.6085 0.7801
0.2637 16.9333 508 0.6046 0.4659 0.6046 0.7776
0.2637 17.0 510 0.6103 0.4486 0.6103 0.7812

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
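
With the framework versions above installed, the following is a minimal sketch for loading and running the checkpoint, assuming it is hosted under the repository id below and exposes a single regression logit (consistent with the MSE/RMSE metrics above, but an assumption nonetheless):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = (
    "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
    "FineTuningAraBERT_run3_AugV5_k6_task7_organization"
)
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Score a single Arabic essay; the output is the raw regression logit.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```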