ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are computed follows the list):

  • Loss: 0.5913
  • Qwk: 0.4684
  • Mse: 0.5913
  • Rmse: 0.7689
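
Here Qwk is quadratic weighted Cohen's kappa (chance-corrected agreement on ordinal scores), Mse is mean squared error, and Rmse is its square root (0.7689 ≈ sqrt(0.5913)). Below is a minimal sketch of these metrics on hypothetical scores, assuming scikit-learn and integer ordinal labels; the card does not include the actual evaluation code.

```python
# Sketch of the reported metrics on hypothetical scores; not the card's code.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 3, 2])  # hypothetical gold organization scores
y_pred = np.array([0, 2, 2, 3, 1])  # hypothetical (rounded) predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = float(np.sqrt(mse))                                    # Rmse = sqrt(Mse)
print(qwk, mse, rmse)
```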

Model description

More information needed

Intended uses & limitations

More information needed
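
While the card leaves this section empty, the checkpoint is published on the Hub as MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task7_organization. A minimal inference sketch follows, assuming a single-output regression head for scoring essay organization (consistent with the MSE/RMSE metrics above, but not confirmed by the card).

```python
# Minimal inference sketch; the single-output regression head is an
# assumption, so check model.config.num_labels before trusting the scale.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = ("MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
            "FineTuningAraBERT_run1_AugV5_k7_task7_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # raw predicted score
print(score)
```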

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
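
These settings map directly onto transformers' `TrainingArguments`. A sketch, with the evaluation cadence inferred from the results table below (metrics every 2 steps) rather than stated in the card:

```python
# Hedged reconstruction of the listed hyperparameters for the HF Trainer
# (transformers 4.44.2); output_dir and the eval cadence are assumptions.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,          # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,       # and epsilon=1e-08
    eval_strategy="steps",   # the table evaluates every 2 steps
    eval_steps=2,
)
```

Note that although num_epochs is 100, the results table stops at epoch ~13.8 (step 510), which suggests training was cut short (for example by early stopping), though the card does not say so.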

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
| No log | 0.0541 | 2 | 2.7641 | -0.0951 | 2.7641 | 1.6626 |
| No log | 0.1081 | 4 | 1.4850 | -0.0317 | 1.4850 | 1.2186 |
| No log | 0.1622 | 6 | 0.9154 | 0.0185 | 0.9154 | 0.9568 |
| No log | 0.2162 | 8 | 1.0183 | 0.1323 | 1.0183 | 1.0091 |
| No log | 0.2703 | 10 | 1.1615 | 0.1632 | 1.1615 | 1.0777 |
| No log | 0.3243 | 12 | 0.9440 | 0.1651 | 0.9440 | 0.9716 |
| No log | 0.3784 | 14 | 0.7048 | 0.1232 | 0.7048 | 0.8395 |
| No log | 0.4324 | 16 | 0.7329 | 0.2841 | 0.7329 | 0.8561 |
| No log | 0.4865 | 18 | 0.7147 | 0.3173 | 0.7147 | 0.8454 |
| No log | 0.5405 | 20 | 0.7031 | 0.3173 | 0.7031 | 0.8385 |
| No log | 0.5946 | 22 | 0.7582 | 0.2254 | 0.7582 | 0.8707 |
| No log | 0.6486 | 24 | 0.7907 | 0.0930 | 0.7907 | 0.8892 |
| No log | 0.7027 | 26 | 0.6893 | 0.2745 | 0.6893 | 0.8302 |
| No log | 0.7568 | 28 | 0.6548 | 0.3169 | 0.6548 | 0.8092 |
| No log | 0.8108 | 30 | 0.6856 | 0.3299 | 0.6856 | 0.8280 |
| No log | 0.8649 | 32 | 0.7562 | 0.3257 | 0.7562 | 0.8696 |
| No log | 0.9189 | 34 | 0.7132 | 0.3693 | 0.7132 | 0.8445 |
| No log | 0.9730 | 36 | 0.5981 | 0.3302 | 0.5981 | 0.7734 |
| No log | 1.0270 | 38 | 0.5914 | 0.4608 | 0.5914 | 0.7690 |
| No log | 1.0811 | 40 | 0.6190 | 0.3563 | 0.6190 | 0.7867 |
| No log | 1.1351 | 42 | 0.6409 | 0.3857 | 0.6409 | 0.8006 |
| No log | 1.1892 | 44 | 0.6249 | 0.4171 | 0.6249 | 0.7905 |
| No log | 1.2432 | 46 | 0.6553 | 0.3966 | 0.6553 | 0.8095 |
| No log | 1.2973 | 48 | 0.6144 | 0.4561 | 0.6144 | 0.7838 |
| No log | 1.3514 | 50 | 0.6859 | 0.3451 | 0.6859 | 0.8282 |
| No log | 1.4054 | 52 | 0.6990 | 0.3547 | 0.6990 | 0.8361 |
| No log | 1.4595 | 54 | 0.7568 | 0.2880 | 0.7568 | 0.8700 |
| No log | 1.5135 | 56 | 0.6333 | 0.3829 | 0.6333 | 0.7958 |
| No log | 1.5676 | 58 | 0.6034 | 0.4338 | 0.6034 | 0.7768 |
| No log | 1.6216 | 60 | 0.6866 | 0.2445 | 0.6866 | 0.8286 |
| No log | 1.6757 | 62 | 0.6424 | 0.3116 | 0.6424 | 0.8015 |
| No log | 1.7297 | 64 | 0.5719 | 0.4240 | 0.5719 | 0.7562 |
| No log | 1.7838 | 66 | 0.5709 | 0.5159 | 0.5709 | 0.7556 |
| No log | 1.8378 | 68 | 0.6688 | 0.4294 | 0.6688 | 0.8178 |
| No log | 1.8919 | 70 | 0.8849 | 0.4270 | 0.8849 | 0.9407 |
| No log | 1.9459 | 72 | 0.8275 | 0.4421 | 0.8275 | 0.9097 |
| No log | 2.0 | 74 | 0.6589 | 0.5291 | 0.6589 | 0.8117 |
| No log | 2.0541 | 76 | 0.5358 | 0.5831 | 0.5358 | 0.7320 |
| No log | 2.1081 | 78 | 0.5310 | 0.5248 | 0.5310 | 0.7287 |
| No log | 2.1622 | 80 | 0.5629 | 0.4262 | 0.5629 | 0.7503 |
| No log | 2.2162 | 82 | 0.5735 | 0.4569 | 0.5735 | 0.7573 |
| No log | 2.2703 | 84 | 0.5279 | 0.5133 | 0.5279 | 0.7265 |
| No log | 2.3243 | 86 | 0.5055 | 0.5344 | 0.5055 | 0.7110 |
| No log | 2.3784 | 88 | 0.6070 | 0.5468 | 0.6070 | 0.7791 |
| No log | 2.4324 | 90 | 0.6294 | 0.5677 | 0.6294 | 0.7933 |
| No log | 2.4865 | 92 | 0.5101 | 0.5980 | 0.5101 | 0.7142 |
| No log | 2.5405 | 94 | 0.4585 | 0.5797 | 0.4585 | 0.6771 |
| No log | 2.5946 | 96 | 0.4697 | 0.5812 | 0.4697 | 0.6853 |
| No log | 2.6486 | 98 | 0.5350 | 0.6539 | 0.5350 | 0.7314 |
| No log | 2.7027 | 100 | 0.8115 | 0.4481 | 0.8115 | 0.9008 |
| No log | 2.7568 | 102 | 0.7501 | 0.4734 | 0.7501 | 0.8661 |
| No log | 2.8108 | 104 | 0.5116 | 0.6735 | 0.5116 | 0.7153 |
| No log | 2.8649 | 106 | 0.6227 | 0.5803 | 0.6227 | 0.7891 |
| No log | 2.9189 | 108 | 0.6449 | 0.5455 | 0.6449 | 0.8030 |
| No log | 2.9730 | 110 | 0.5743 | 0.5726 | 0.5743 | 0.7578 |
| No log | 3.0270 | 112 | 0.5364 | 0.6481 | 0.5364 | 0.7324 |
| No log | 3.0811 | 114 | 0.5513 | 0.5926 | 0.5513 | 0.7425 |
| No log | 3.1351 | 116 | 0.5272 | 0.6587 | 0.5272 | 0.7261 |
| No log | 3.1892 | 118 | 0.6472 | 0.5252 | 0.6472 | 0.8045 |
| No log | 3.2432 | 120 | 0.7309 | 0.4458 | 0.7309 | 0.8550 |
| No log | 3.2973 | 122 | 0.5200 | 0.6631 | 0.5200 | 0.7211 |
| No log | 3.3514 | 124 | 0.4827 | 0.6001 | 0.4827 | 0.6948 |
| No log | 3.4054 | 126 | 0.4533 | 0.6277 | 0.4533 | 0.6733 |
| No log | 3.4595 | 128 | 0.5894 | 0.5281 | 0.5894 | 0.7677 |
| No log | 3.5135 | 130 | 0.8377 | 0.4429 | 0.8377 | 0.9153 |
| No log | 3.5676 | 132 | 0.6126 | 0.5234 | 0.6126 | 0.7827 |
| No log | 3.6216 | 134 | 0.4428 | 0.6460 | 0.4428 | 0.6654 |
| No log | 3.6757 | 136 | 0.6186 | 0.5534 | 0.6186 | 0.7865 |
| No log | 3.7297 | 138 | 0.6790 | 0.5306 | 0.6790 | 0.8240 |
| No log | 3.7838 | 140 | 0.5303 | 0.6540 | 0.5303 | 0.7282 |
| No log | 3.8378 | 142 | 0.4839 | 0.6333 | 0.4839 | 0.6956 |
| No log | 3.8919 | 144 | 0.4917 | 0.6427 | 0.4917 | 0.7012 |
| No log | 3.9459 | 146 | 0.5104 | 0.6289 | 0.5104 | 0.7144 |
| No log | 4.0 | 148 | 0.5603 | 0.5604 | 0.5603 | 0.7485 |
| No log | 4.0541 | 150 | 0.5357 | 0.5880 | 0.5357 | 0.7319 |
| No log | 4.1081 | 152 | 0.6308 | 0.5900 | 0.6308 | 0.7942 |
| No log | 4.1622 | 154 | 0.6513 | 0.5131 | 0.6513 | 0.8071 |
| No log | 4.2162 | 156 | 0.6359 | 0.5524 | 0.6359 | 0.7974 |
| No log | 4.2703 | 158 | 0.6500 | 0.5512 | 0.6500 | 0.8062 |
| No log | 4.3243 | 160 | 0.6570 | 0.5767 | 0.6570 | 0.8105 |
| No log | 4.3784 | 162 | 0.6272 | 0.6311 | 0.6272 | 0.7920 |
| No log | 4.4324 | 164 | 0.5998 | 0.6384 | 0.5998 | 0.7745 |
| No log | 4.4865 | 166 | 0.6185 | 0.5716 | 0.6185 | 0.7865 |
| No log | 4.5405 | 168 | 0.5837 | 0.5844 | 0.5837 | 0.7640 |
| No log | 4.5946 | 170 | 0.5024 | 0.6606 | 0.5024 | 0.7088 |
| No log | 4.6486 | 172 | 0.4953 | 0.6121 | 0.4953 | 0.7037 |
| No log | 4.7027 | 174 | 0.5316 | 0.5597 | 0.5316 | 0.7291 |
| No log | 4.7568 | 176 | 0.5218 | 0.5597 | 0.5218 | 0.7224 |
| No log | 4.8108 | 178 | 0.4953 | 0.6084 | 0.4953 | 0.7038 |
| No log | 4.8649 | 180 | 0.6032 | 0.5856 | 0.6032 | 0.7766 |
| No log | 4.9189 | 182 | 0.6447 | 0.5655 | 0.6447 | 0.8029 |
| No log | 4.9730 | 184 | 0.5951 | 0.5742 | 0.5951 | 0.7714 |
| No log | 5.0270 | 186 | 0.6234 | 0.5735 | 0.6234 | 0.7895 |
| No log | 5.0811 | 188 | 0.6575 | 0.5346 | 0.6575 | 0.8109 |
| No log | 5.1351 | 190 | 0.6370 | 0.5864 | 0.6370 | 0.7981 |
| No log | 5.1892 | 192 | 0.7107 | 0.5254 | 0.7107 | 0.8430 |
| No log | 5.2432 | 194 | 0.7588 | 0.4958 | 0.7588 | 0.8711 |
| No log | 5.2973 | 196 | 0.7044 | 0.4513 | 0.7044 | 0.8393 |
| No log | 5.3514 | 198 | 0.6553 | 0.5608 | 0.6553 | 0.8095 |
| No log | 5.4054 | 200 | 0.6028 | 0.5908 | 0.6028 | 0.7764 |
| No log | 5.4595 | 202 | 0.5515 | 0.6083 | 0.5515 | 0.7426 |
| No log | 5.5135 | 204 | 0.5351 | 0.6540 | 0.5351 | 0.7315 |
| No log | 5.5676 | 206 | 0.5037 | 0.6593 | 0.5037 | 0.7097 |
| No log | 5.6216 | 208 | 0.5220 | 0.6667 | 0.5220 | 0.7225 |
| No log | 5.6757 | 210 | 0.5368 | 0.6030 | 0.5368 | 0.7327 |
| No log | 5.7297 | 212 | 0.5507 | 0.6115 | 0.5507 | 0.7421 |
| No log | 5.7838 | 214 | 0.5736 | 0.6497 | 0.5736 | 0.7574 |
| No log | 5.8378 | 216 | 0.6023 | 0.5060 | 0.6023 | 0.7761 |
| No log | 5.8919 | 218 | 0.5786 | 0.5866 | 0.5786 | 0.7606 |
| No log | 5.9459 | 220 | 0.5454 | 0.6351 | 0.5454 | 0.7385 |
| No log | 6.0 | 222 | 0.6129 | 0.5856 | 0.6129 | 0.7829 |
| No log | 6.0541 | 224 | 0.5553 | 0.6229 | 0.5553 | 0.7452 |
| No log | 6.1081 | 226 | 0.4850 | 0.7151 | 0.4850 | 0.6964 |
| No log | 6.1622 | 228 | 0.4588 | 0.6001 | 0.4588 | 0.6774 |
| No log | 6.2162 | 230 | 0.5136 | 0.5692 | 0.5136 | 0.7166 |
| No log | 6.2703 | 232 | 0.5411 | 0.5098 | 0.5411 | 0.7356 |
| No log | 6.3243 | 234 | 0.5151 | 0.5577 | 0.5151 | 0.7177 |
| No log | 6.3784 | 236 | 0.4790 | 0.6018 | 0.4790 | 0.6921 |
| No log | 6.4324 | 238 | 0.4586 | 0.6241 | 0.4586 | 0.6772 |
| No log | 6.4865 | 240 | 0.4624 | 0.6241 | 0.4624 | 0.6800 |
| No log | 6.5405 | 242 | 0.4855 | 0.6145 | 0.4855 | 0.6968 |
| No log | 6.5946 | 244 | 0.4839 | 0.6092 | 0.4839 | 0.6956 |
| No log | 6.6486 | 246 | 0.4708 | 0.6667 | 0.4708 | 0.6861 |
| No log | 6.7027 | 248 | 0.4789 | 0.7142 | 0.4789 | 0.6921 |
| No log | 6.7568 | 250 | 0.4627 | 0.6847 | 0.4627 | 0.6802 |
| No log | 6.8108 | 252 | 0.4890 | 0.6307 | 0.4890 | 0.6993 |
| No log | 6.8649 | 254 | 0.4713 | 0.6743 | 0.4713 | 0.6865 |
| No log | 6.9189 | 256 | 0.4651 | 0.6743 | 0.4651 | 0.6820 |
| No log | 6.9730 | 258 | 0.4556 | 0.7218 | 0.4556 | 0.6750 |
| No log | 7.0270 | 260 | 0.4980 | 0.6627 | 0.4980 | 0.7057 |
| No log | 7.0811 | 262 | 0.5186 | 0.6390 | 0.5186 | 0.7201 |
| No log | 7.1351 | 264 | 0.5013 | 0.6803 | 0.5013 | 0.7080 |
| No log | 7.1892 | 266 | 0.5126 | 0.6631 | 0.5126 | 0.7160 |
| No log | 7.2432 | 268 | 0.4936 | 0.6631 | 0.4936 | 0.7026 |
| No log | 7.2973 | 270 | 0.4605 | 0.5767 | 0.4605 | 0.6786 |
| No log | 7.3514 | 272 | 0.4870 | 0.6018 | 0.4870 | 0.6978 |
| No log | 7.4054 | 274 | 0.5165 | 0.6118 | 0.5165 | 0.7187 |
| No log | 7.4595 | 276 | 0.4827 | 0.5923 | 0.4827 | 0.6948 |
| No log | 7.5135 | 278 | 0.4590 | 0.6402 | 0.4590 | 0.6775 |
| No log | 7.5676 | 280 | 0.4771 | 0.5947 | 0.4771 | 0.6907 |
| No log | 7.6216 | 282 | 0.4555 | 0.6601 | 0.4555 | 0.6749 |
| No log | 7.6757 | 284 | 0.4376 | 0.6467 | 0.4376 | 0.6615 |
| No log | 7.7297 | 286 | 0.4955 | 0.5741 | 0.4955 | 0.7039 |
| No log | 7.7838 | 288 | 0.5068 | 0.5947 | 0.5068 | 0.7119 |
| No log | 7.8378 | 290 | 0.4486 | 0.5711 | 0.4486 | 0.6698 |
| No log | 7.8919 | 292 | 0.4393 | 0.6672 | 0.4393 | 0.6628 |
| No log | 7.9459 | 294 | 0.4440 | 0.6933 | 0.4440 | 0.6663 |
| No log | 8.0 | 296 | 0.4584 | 0.6388 | 0.4584 | 0.6771 |
| No log | 8.0541 | 298 | 0.4871 | 0.5877 | 0.4871 | 0.6980 |
| No log | 8.1081 | 300 | 0.4798 | 0.6100 | 0.4798 | 0.6927 |
| No log | 8.1622 | 302 | 0.4419 | 0.6388 | 0.4419 | 0.6648 |
| No log | 8.2162 | 304 | 0.4466 | 0.6909 | 0.4466 | 0.6683 |
| No log | 8.2703 | 306 | 0.4688 | 0.6371 | 0.4688 | 0.6847 |
| No log | 8.3243 | 308 | 0.4821 | 0.6284 | 0.4821 | 0.6943 |
| No log | 8.3784 | 310 | 0.4792 | 0.6360 | 0.4792 | 0.6922 |
| No log | 8.4324 | 312 | 0.4435 | 0.7091 | 0.4435 | 0.6660 |
| No log | 8.4865 | 314 | 0.4483 | 0.6909 | 0.4483 | 0.6696 |
| No log | 8.5405 | 316 | 0.4785 | 0.6127 | 0.4785 | 0.6917 |
| No log | 8.5946 | 318 | 0.5354 | 0.5411 | 0.5354 | 0.7317 |
| No log | 8.6486 | 320 | 0.4700 | 0.6127 | 0.4700 | 0.6856 |
| No log | 8.7027 | 322 | 0.4463 | 0.6773 | 0.4463 | 0.6681 |
| No log | 8.7568 | 324 | 0.4569 | 0.6613 | 0.4569 | 0.6760 |
| No log | 8.8108 | 326 | 0.4741 | 0.6914 | 0.4741 | 0.6886 |
| No log | 8.8649 | 328 | 0.4882 | 0.6914 | 0.4882 | 0.6987 |
| No log | 8.9189 | 330 | 0.5096 | 0.6643 | 0.5096 | 0.7139 |
| No log | 8.9730 | 332 | 0.4983 | 0.6815 | 0.4983 | 0.7059 |
| No log | 9.0270 | 334 | 0.4860 | 0.7069 | 0.4860 | 0.6971 |
| No log | 9.0811 | 336 | 0.4733 | 0.6937 | 0.4733 | 0.6880 |
| No log | 9.1351 | 338 | 0.4877 | 0.6389 | 0.4877 | 0.6983 |
| No log | 9.1892 | 340 | 0.4740 | 0.6692 | 0.4740 | 0.6885 |
| No log | 9.2432 | 342 | 0.4931 | 0.6474 | 0.4931 | 0.7022 |
| No log | 9.2973 | 344 | 0.5568 | 0.6100 | 0.5568 | 0.7462 |
| No log | 9.3514 | 346 | 0.5577 | 0.5288 | 0.5577 | 0.7468 |
| No log | 9.4054 | 348 | 0.5164 | 0.5725 | 0.5164 | 0.7186 |
| No log | 9.4595 | 350 | 0.4725 | 0.6645 | 0.4725 | 0.6874 |
| No log | 9.5135 | 352 | 0.4452 | 0.7012 | 0.4452 | 0.6672 |
| No log | 9.5676 | 354 | 0.4427 | 0.6407 | 0.4427 | 0.6654 |
| No log | 9.6216 | 356 | 0.4631 | 0.6880 | 0.4631 | 0.6805 |
| No log | 9.6757 | 358 | 0.4434 | 0.6761 | 0.4434 | 0.6659 |
| No log | 9.7297 | 360 | 0.4620 | 0.6828 | 0.4620 | 0.6797 |
| No log | 9.7838 | 362 | 0.5354 | 0.5970 | 0.5354 | 0.7317 |
| No log | 9.8378 | 364 | 0.5888 | 0.5605 | 0.5888 | 0.7673 |
| No log | 9.8919 | 366 | 0.5279 | 0.5897 | 0.5279 | 0.7265 |
| No log | 9.9459 | 368 | 0.4447 | 0.7110 | 0.4447 | 0.6668 |
| No log | 10.0 | 370 | 0.5035 | 0.6120 | 0.5035 | 0.7096 |
| No log | 10.0541 | 372 | 0.5038 | 0.6470 | 0.5038 | 0.7098 |
| No log | 10.1081 | 374 | 0.4718 | 0.6345 | 0.4718 | 0.6869 |
| No log | 10.1622 | 376 | 0.4722 | 0.6582 | 0.4722 | 0.6872 |
| No log | 10.2162 | 378 | 0.4751 | 0.6423 | 0.4751 | 0.6893 |
| No log | 10.2703 | 380 | 0.4925 | 0.6373 | 0.4925 | 0.7018 |
| No log | 10.3243 | 382 | 0.5110 | 0.6373 | 0.5110 | 0.7148 |
| No log | 10.3784 | 384 | 0.4803 | 0.6606 | 0.4803 | 0.6931 |
| No log | 10.4324 | 386 | 0.4704 | 0.6289 | 0.4704 | 0.6859 |
| No log | 10.4865 | 388 | 0.4666 | 0.6289 | 0.4666 | 0.6831 |
| No log | 10.5405 | 390 | 0.4592 | 0.6943 | 0.4592 | 0.6777 |
| No log | 10.5946 | 392 | 0.4526 | 0.6762 | 0.4526 | 0.6728 |
| No log | 10.6486 | 394 | 0.4539 | 0.6158 | 0.4539 | 0.6737 |
| No log | 10.7027 | 396 | 0.4472 | 0.6351 | 0.4472 | 0.6687 |
| No log | 10.7568 | 398 | 0.4502 | 0.6267 | 0.4502 | 0.6710 |
| No log | 10.8108 | 400 | 0.4732 | 0.5517 | 0.4732 | 0.6879 |
| No log | 10.8649 | 402 | 0.5294 | 0.4997 | 0.5294 | 0.7276 |
| No log | 10.9189 | 404 | 0.5683 | 0.5063 | 0.5683 | 0.7539 |
| No log | 10.9730 | 406 | 0.5373 | 0.5223 | 0.5373 | 0.7330 |
| No log | 11.0270 | 408 | 0.5402 | 0.5223 | 0.5402 | 0.7350 |
| No log | 11.0811 | 410 | 0.5695 | 0.5275 | 0.5695 | 0.7547 |
| No log | 11.1351 | 412 | 0.5770 | 0.5323 | 0.5770 | 0.7596 |
| No log | 11.1892 | 414 | 0.6154 | 0.5219 | 0.6154 | 0.7844 |
| No log | 11.2432 | 416 | 0.6353 | 0.4966 | 0.6353 | 0.7971 |
| No log | 11.2973 | 418 | 0.5884 | 0.5291 | 0.5884 | 0.7671 |
| No log | 11.3514 | 420 | 0.5305 | 0.5034 | 0.5305 | 0.7284 |
| No log | 11.4054 | 422 | 0.5069 | 0.5214 | 0.5069 | 0.7120 |
| No log | 11.4595 | 424 | 0.5259 | 0.5036 | 0.5259 | 0.7252 |
| No log | 11.5135 | 426 | 0.5454 | 0.4774 | 0.5454 | 0.7385 |
| No log | 11.5676 | 428 | 0.6140 | 0.3972 | 0.6140 | 0.7836 |
| No log | 11.6216 | 430 | 0.6448 | 0.3972 | 0.6448 | 0.8030 |
| No log | 11.6757 | 432 | 0.5809 | 0.4330 | 0.5809 | 0.7622 |
| No log | 11.7297 | 434 | 0.5008 | 0.5631 | 0.5008 | 0.7076 |
| No log | 11.7838 | 436 | 0.4894 | 0.6140 | 0.4894 | 0.6996 |
| No log | 11.8378 | 438 | 0.5003 | 0.5904 | 0.5003 | 0.7073 |
| No log | 11.8919 | 440 | 0.4829 | 0.6156 | 0.4829 | 0.6949 |
| No log | 11.9459 | 442 | 0.5218 | 0.5770 | 0.5218 | 0.7224 |
| No log | 12.0 | 444 | 0.5882 | 0.5290 | 0.5882 | 0.7670 |
| No log | 12.0541 | 446 | 0.6157 | 0.5103 | 0.6157 | 0.7847 |
| No log | 12.1081 | 448 | 0.5526 | 0.5852 | 0.5526 | 0.7434 |
| No log | 12.1622 | 450 | 0.5012 | 0.6222 | 0.5012 | 0.7080 |
| No log | 12.2162 | 452 | 0.5644 | 0.5844 | 0.5644 | 0.7513 |
| No log | 12.2703 | 454 | 0.6254 | 0.5032 | 0.6254 | 0.7908 |
| No log | 12.3243 | 456 | 0.6295 | 0.5267 | 0.6295 | 0.7934 |
| No log | 12.3784 | 458 | 0.6106 | 0.5612 | 0.6106 | 0.7814 |
| No log | 12.4324 | 460 | 0.6059 | 0.5856 | 0.6059 | 0.7784 |
| No log | 12.4865 | 462 | 0.5698 | 0.6335 | 0.5698 | 0.7549 |
| No log | 12.5405 | 464 | 0.5484 | 0.6671 | 0.5484 | 0.7406 |
| No log | 12.5946 | 466 | 0.5381 | 0.6747 | 0.5381 | 0.7336 |
| No log | 12.6486 | 468 | 0.5536 | 0.5865 | 0.5536 | 0.7440 |
| No log | 12.7027 | 470 | 0.5268 | 0.5988 | 0.5268 | 0.7258 |
| No log | 12.7568 | 472 | 0.4942 | 0.6185 | 0.4942 | 0.7030 |
| No log | 12.8108 | 474 | 0.5043 | 0.7051 | 0.5043 | 0.7102 |
| No log | 12.8649 | 476 | 0.4995 | 0.6761 | 0.4995 | 0.7067 |
| No log | 12.9189 | 478 | 0.5006 | 0.6566 | 0.5006 | 0.7075 |
| No log | 12.9730 | 480 | 0.5142 | 0.5784 | 0.5142 | 0.7171 |
| No log | 13.0270 | 482 | 0.5289 | 0.5988 | 0.5289 | 0.7272 |
| No log | 13.0811 | 484 | 0.5299 | 0.6563 | 0.5299 | 0.7280 |
| No log | 13.1351 | 486 | 0.5425 | 0.5697 | 0.5425 | 0.7366 |
| No log | 13.1892 | 488 | 0.5590 | 0.5528 | 0.5590 | 0.7477 |
| No log | 13.2432 | 490 | 0.6017 | 0.4970 | 0.6017 | 0.7757 |
| No log | 13.2973 | 492 | 0.5774 | 0.5170 | 0.5774 | 0.7599 |
| No log | 13.3514 | 494 | 0.5378 | 0.5853 | 0.5378 | 0.7333 |
| No log | 13.4054 | 496 | 0.5214 | 0.6743 | 0.5214 | 0.7221 |
| No log | 13.4595 | 498 | 0.5099 | 0.6563 | 0.5099 | 0.7140 |
| 0.3643 | 13.5135 | 500 | 0.5095 | 0.6185 | 0.5095 | 0.7138 |
| 0.3643 | 13.5676 | 502 | 0.5301 | 0.5272 | 0.5301 | 0.7281 |
| 0.3643 | 13.6216 | 504 | 0.5631 | 0.5015 | 0.5631 | 0.7504 |
| 0.3643 | 13.6757 | 506 | 0.5681 | 0.4929 | 0.5681 | 0.7537 |
| 0.3643 | 13.7297 | 508 | 0.5813 | 0.4929 | 0.5813 | 0.7624 |
| 0.3643 | 13.7838 | 510 | 0.5913 | 0.4684 | 0.5913 | 0.7689 |
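
Two details worth noting: the training loss column reads "No log" until step 500 because the Trainer only logs training loss every 500 steps by default, and validation Qwk peaks at 0.7218 around epoch 6.97 before declining to the final 0.4684, so the last checkpoint is not the strongest one in the table. Below is a hedged sketch of a compute_metrics hook that would produce the Qwk/Mse/Rmse columns; the actual training script is not part of this card.

```python
# Hedged sketch of a Trainer compute_metrics hook matching the table's
# columns; assumes a single-output regression head, which is unconfirmed.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.asarray(logits).squeeze(-1)
    mse = mean_squared_error(labels, preds)
    return {
        "qwk": cohen_kappa_score(np.rint(labels).astype(int),
                                 np.rint(preds).astype(int),
                                 weights="quadratic"),
        "mse": mse,
        "rmse": float(np.sqrt(mse)),
    }
```

Passing such a hook to Trainer together with load_best_model_at_end=True and metric_for_best_model="qwk" would keep the epoch-6.97 checkpoint rather than the weaker final one.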

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1