ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4981
  • QWK: 0.5687
  • MSE: 0.4981
  • RMSE: 0.7058
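The reported metrics can be reproduced from model predictions with plain Python. The sketch below is generic (the exact evaluation code for this run is not shown in the card): quadratic weighted kappa (QWK) measures ordinal agreement between predicted and gold scores, and MSE/RMSE measure squared error.

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, as commonly used for essay scoring."""
    # observed confusion matrix
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    row = [sum(O[i]) for i in range(n_classes)]
    col = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            num += w * O[i][j]
            den += w * row[i] * col[j] / n            # expected count under independence
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, mse ** 0.5
```

`sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights="quadratic")` computes the same QWK value.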

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
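With a linear scheduler and no warmup steps listed, the learning rate decays linearly from 2e-05 toward 0 over training. A minimal sketch of that schedule (assuming zero warmup, since none appears above; step counts and totals here are illustrative):

```python
BASE_LR = 2e-05  # learning_rate from the hyperparameters above

def linear_lr(step, total_steps, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear decay schedule."""
    if step < warmup_steps:
        # linear ramp-up during warmup (unused when warmup_steps=0)
        return BASE_LR * step / max(1, warmup_steps)
    # linear decay from BASE_LR at the end of warmup to 0 at total_steps
    return BASE_LR * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, halfway through training the rate is half the base value, and it reaches 0 at the final step.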

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0444 2 2.5702 -0.0646 2.5702 1.6032
No log 0.0889 4 1.4284 0.1228 1.4284 1.1952
No log 0.1333 6 0.7223 0.1372 0.7223 0.8499
No log 0.1778 8 0.7626 -0.0127 0.7626 0.8733
No log 0.2222 10 0.8532 0.2756 0.8532 0.9237
No log 0.2667 12 0.7514 0.4080 0.7514 0.8668
No log 0.3111 14 0.5717 0.3649 0.5717 0.7561
No log 0.3556 16 0.5107 0.3580 0.5107 0.7146
No log 0.4 18 0.5364 0.5050 0.5364 0.7324
No log 0.4444 20 0.4662 0.5574 0.4662 0.6828
No log 0.4889 22 0.4818 0.5449 0.4818 0.6941
No log 0.5333 24 0.5988 0.5763 0.5988 0.7738
No log 0.5778 26 0.5844 0.5911 0.5844 0.7645
No log 0.6222 28 0.5480 0.5856 0.5480 0.7402
No log 0.6667 30 0.4559 0.5682 0.4559 0.6752
No log 0.7111 32 0.4367 0.6542 0.4367 0.6608
No log 0.7556 34 0.4329 0.7175 0.4329 0.6580
No log 0.8 36 0.5663 0.5900 0.5663 0.7525
No log 0.8444 38 0.6866 0.5529 0.6866 0.8286
No log 0.8889 40 0.6231 0.5688 0.6231 0.7894
No log 0.9333 42 0.4720 0.7409 0.4720 0.6870
No log 0.9778 44 0.6049 0.6457 0.6049 0.7777
No log 1.0222 46 0.6175 0.6334 0.6175 0.7858
No log 1.0667 48 0.4797 0.7177 0.4797 0.6926
No log 1.1111 50 0.6847 0.4760 0.6847 0.8275
No log 1.1556 52 0.8187 0.4385 0.8187 0.9048
No log 1.2 54 0.6313 0.5234 0.6313 0.7945
No log 1.2444 56 0.4575 0.5888 0.4575 0.6764
No log 1.2889 58 0.6901 0.5402 0.6901 0.8307
No log 1.3333 60 0.6900 0.5364 0.6900 0.8307
No log 1.3778 62 0.5076 0.6025 0.5076 0.7125
No log 1.4222 64 0.4738 0.6118 0.4738 0.6883
No log 1.4667 66 0.4991 0.6620 0.4991 0.7065
No log 1.5111 68 0.5416 0.6303 0.5416 0.7359
No log 1.5556 70 0.4980 0.6280 0.4980 0.7057
No log 1.6 72 0.5653 0.5225 0.5653 0.7519
No log 1.6444 74 0.6272 0.4681 0.6272 0.7919
No log 1.6889 76 0.4715 0.5936 0.4715 0.6867
No log 1.7333 78 0.5121 0.6401 0.5121 0.7156
No log 1.7778 80 0.9030 0.4228 0.9030 0.9503
No log 1.8222 82 0.9652 0.3861 0.9652 0.9825
No log 1.8667 84 0.6222 0.5839 0.6222 0.7888
No log 1.9111 86 0.4302 0.6805 0.4302 0.6559
No log 1.9556 88 0.4503 0.6779 0.4503 0.6710
No log 2.0 90 0.4379 0.6918 0.4379 0.6617
No log 2.0444 92 0.6864 0.5059 0.6864 0.8285
No log 2.0889 94 1.1001 0.3601 1.1001 1.0489
No log 2.1333 96 1.1094 0.3424 1.1094 1.0533
No log 2.1778 98 0.8370 0.4681 0.8370 0.9149
No log 2.2222 100 0.4717 0.5979 0.4717 0.6868
No log 2.2667 102 0.4264 0.6052 0.4264 0.6530
No log 2.3111 104 0.4217 0.6010 0.4217 0.6494
No log 2.3556 106 0.5129 0.5033 0.5129 0.7162
No log 2.4 108 0.5995 0.4556 0.5995 0.7743
No log 2.4444 110 0.5768 0.4926 0.5768 0.7595
No log 2.4889 112 0.4754 0.6013 0.4754 0.6895
No log 2.5333 114 0.4825 0.5779 0.4825 0.6946
No log 2.5778 116 0.6389 0.5281 0.6389 0.7993
No log 2.6222 118 0.6783 0.5281 0.6783 0.8236
No log 2.6667 120 0.5227 0.5513 0.5227 0.7230
No log 2.7111 122 0.4816 0.5476 0.4816 0.6939
No log 2.7556 124 0.4788 0.5373 0.4788 0.6920
No log 2.8 126 0.4739 0.5304 0.4739 0.6884
No log 2.8444 128 0.4919 0.6007 0.4919 0.7014
No log 2.8889 130 0.5355 0.5983 0.5355 0.7318
No log 2.9333 132 0.5285 0.5592 0.5285 0.7270
No log 2.9778 134 0.5401 0.5127 0.5401 0.7349
No log 3.0222 136 0.6252 0.5219 0.6252 0.7907
No log 3.0667 138 0.6783 0.5147 0.6783 0.8236
No log 3.1111 140 0.5652 0.5626 0.5652 0.7518
No log 3.1556 142 0.5421 0.4813 0.5421 0.7363
No log 3.2 144 0.5366 0.4697 0.5366 0.7325
No log 3.2444 146 0.5319 0.5101 0.5319 0.7293
No log 3.2889 148 0.5067 0.5336 0.5067 0.7119
No log 3.3333 150 0.4942 0.6223 0.4942 0.7030
No log 3.3778 152 0.5532 0.6030 0.5532 0.7438
No log 3.4222 154 0.6095 0.5678 0.6095 0.7807
No log 3.4667 156 0.5514 0.5543 0.5514 0.7426
No log 3.5111 158 0.4657 0.6183 0.4657 0.6824
No log 3.5556 160 0.4899 0.5597 0.4899 0.6999
No log 3.6 162 0.5935 0.5364 0.5935 0.7704
No log 3.6444 164 0.5747 0.5627 0.5747 0.7581
No log 3.6889 166 0.4750 0.6805 0.4750 0.6892
No log 3.7333 168 0.6230 0.5401 0.6230 0.7893
No log 3.7778 170 0.7156 0.4304 0.7156 0.8459
No log 3.8222 172 0.5726 0.6218 0.5726 0.7567
No log 3.8667 174 0.4700 0.5812 0.4700 0.6856
No log 3.9111 176 0.5528 0.5190 0.5528 0.7435
No log 3.9556 178 0.5333 0.5450 0.5333 0.7302
No log 4.0 180 0.4784 0.5711 0.4784 0.6916
No log 4.0444 182 0.4381 0.6184 0.4381 0.6619
No log 4.0889 184 0.4430 0.6156 0.4430 0.6656
No log 4.1333 186 0.4466 0.5941 0.4466 0.6683
No log 4.1778 188 0.4348 0.6255 0.4348 0.6594
No log 4.2222 190 0.4215 0.6255 0.4215 0.6492
No log 4.2667 192 0.4373 0.6114 0.4373 0.6613
No log 4.3111 194 0.4759 0.6181 0.4759 0.6898
No log 4.3556 196 0.4481 0.6082 0.4481 0.6694
No log 4.4 198 0.4478 0.4828 0.4478 0.6692
No log 4.4444 200 0.5602 0.4644 0.5602 0.7485
No log 4.4889 202 0.5598 0.4728 0.5598 0.7482
No log 4.5333 204 0.4766 0.5349 0.4766 0.6904
No log 4.5778 206 0.4533 0.5521 0.4533 0.6733
No log 4.6222 208 0.4555 0.6269 0.4555 0.6749
No log 4.6667 210 0.4848 0.5706 0.4848 0.6962
No log 4.7111 212 0.5339 0.6218 0.5339 0.7307
No log 4.7556 214 0.5572 0.6218 0.5572 0.7465
No log 4.8 216 0.5612 0.6218 0.5612 0.7491
No log 4.8444 218 0.4985 0.6053 0.4985 0.7061
No log 4.8889 220 0.4785 0.5142 0.4785 0.6918
No log 4.9333 222 0.5064 0.5528 0.5064 0.7116
No log 4.9778 224 0.5086 0.5254 0.5086 0.7132
No log 5.0222 226 0.4894 0.5485 0.4894 0.6996
No log 5.0667 228 0.4714 0.5020 0.4714 0.6866
No log 5.1111 230 0.4903 0.6197 0.4903 0.7002
No log 5.1556 232 0.4999 0.6317 0.4999 0.7070
No log 5.2 234 0.4885 0.5707 0.4885 0.6989
No log 5.2444 236 0.5443 0.5319 0.5443 0.7377
No log 5.2889 238 0.6445 0.5526 0.6445 0.8028
No log 5.3333 240 0.6805 0.5702 0.6805 0.8249
No log 5.3778 242 0.5315 0.5739 0.5315 0.7291
No log 5.4222 244 0.4536 0.5765 0.4536 0.6735
No log 5.4667 246 0.5219 0.5735 0.5219 0.7224
No log 5.5111 248 0.5343 0.5735 0.5343 0.7310
No log 5.5556 250 0.4892 0.5528 0.4892 0.6994
No log 5.6 252 0.4453 0.6402 0.4453 0.6673
No log 5.6444 254 0.4395 0.6589 0.4395 0.6629
No log 5.6889 256 0.4578 0.6210 0.4578 0.6766
No log 5.7333 258 0.4813 0.5672 0.4813 0.6938
No log 5.7778 260 0.4951 0.5935 0.4951 0.7037
No log 5.8222 262 0.4571 0.6330 0.4571 0.6761
No log 5.8667 264 0.4376 0.6402 0.4376 0.6615
No log 5.9111 266 0.4501 0.6210 0.4501 0.6709
No log 5.9556 268 0.4761 0.5528 0.4761 0.6900
No log 6.0 270 0.4628 0.5733 0.4628 0.6803
No log 6.0444 272 0.4788 0.5855 0.4788 0.6919
No log 6.0889 274 0.5674 0.5595 0.5674 0.7533
No log 6.1333 276 0.6194 0.5543 0.6194 0.7870
No log 6.1778 278 0.5772 0.5749 0.5772 0.7598
No log 6.2222 280 0.5577 0.5331 0.5577 0.7468
No log 6.2667 282 0.4973 0.5135 0.4973 0.7052
No log 6.3111 284 0.4795 0.5883 0.4795 0.6925
No log 6.3556 286 0.5180 0.4292 0.5180 0.7197
No log 6.4 288 0.5339 0.4020 0.5339 0.7307
No log 6.4444 290 0.5078 0.4639 0.5078 0.7126
No log 6.4889 292 0.4862 0.5941 0.4862 0.6973
No log 6.5333 294 0.5148 0.5248 0.5148 0.7175
No log 6.5778 296 0.5249 0.5248 0.5249 0.7245
No log 6.6222 298 0.5060 0.5446 0.5060 0.7113
No log 6.6667 300 0.5253 0.4212 0.5253 0.7248
No log 6.7111 302 0.6489 0.4707 0.6489 0.8055
No log 6.7556 304 0.6484 0.4562 0.6484 0.8053
No log 6.8 306 0.5803 0.4815 0.5803 0.7618
No log 6.8444 308 0.5098 0.4555 0.5098 0.7140
No log 6.8889 310 0.4820 0.5677 0.4820 0.6942
No log 6.9333 312 0.4883 0.5405 0.4883 0.6988
No log 6.9778 314 0.4859 0.5782 0.4859 0.6971
No log 7.0222 316 0.4742 0.5476 0.4742 0.6886
No log 7.0667 318 0.4654 0.5956 0.4654 0.6822
No log 7.1111 320 0.4635 0.5956 0.4635 0.6808
No log 7.1556 322 0.4545 0.5956 0.4545 0.6742
No log 7.2 324 0.4676 0.5860 0.4676 0.6838
No log 7.2444 326 0.5314 0.5552 0.5314 0.7290
No log 7.2889 328 0.5261 0.5973 0.5261 0.7253
No log 7.3333 330 0.4868 0.6560 0.4868 0.6977
No log 7.3778 332 0.5424 0.5997 0.5424 0.7365
No log 7.4222 334 0.5788 0.5862 0.5788 0.7608
No log 7.4667 336 0.5243 0.5970 0.5243 0.7241
No log 7.5111 338 0.4859 0.6267 0.4859 0.6971
No log 7.5556 340 0.5284 0.5501 0.5284 0.7269
No log 7.6 342 0.5379 0.5272 0.5379 0.7334
No log 7.6444 344 0.5112 0.5362 0.5112 0.7150
No log 7.6889 346 0.4948 0.4463 0.4948 0.7034
No log 7.7333 348 0.4912 0.4839 0.4912 0.7009
No log 7.7778 350 0.4940 0.5302 0.4940 0.7028
No log 7.8222 352 0.4965 0.5523 0.4965 0.7046
No log 7.8667 354 0.5027 0.6383 0.5027 0.7090
No log 7.9111 356 0.4937 0.6009 0.4937 0.7026
No log 7.9556 358 0.4733 0.5784 0.4733 0.6880
No log 8.0 360 0.4589 0.5784 0.4589 0.6774
No log 8.0444 362 0.4524 0.6078 0.4524 0.6726
No log 8.0889 364 0.4519 0.5574 0.4519 0.6723
No log 8.1333 366 0.4536 0.5662 0.4536 0.6735
No log 8.1778 368 0.4754 0.6771 0.4754 0.6895
No log 8.2222 370 0.4987 0.6503 0.4987 0.7062
No log 8.2667 372 0.5169 0.5438 0.5169 0.7190
No log 8.3111 374 0.5047 0.5702 0.5047 0.7104
No log 8.3556 376 0.4916 0.5042 0.4916 0.7011
No log 8.4 378 0.4975 0.5665 0.4975 0.7053
No log 8.4444 380 0.5019 0.5472 0.5019 0.7084
No log 8.4889 382 0.4935 0.4634 0.4935 0.7025
No log 8.5333 384 0.4907 0.4126 0.4907 0.7005
No log 8.5778 386 0.4848 0.4384 0.4848 0.6963
No log 8.6222 388 0.4837 0.4569 0.4837 0.6955
No log 8.6667 390 0.4717 0.5248 0.4717 0.6868
No log 8.7111 392 0.4683 0.5899 0.4683 0.6843
No log 8.7556 394 0.4999 0.5723 0.4999 0.7070
No log 8.8 396 0.4946 0.5593 0.4946 0.7033
No log 8.8444 398 0.4630 0.5995 0.4630 0.6804
No log 8.8889 400 0.4641 0.6101 0.4641 0.6813
No log 8.9333 402 0.5386 0.5178 0.5386 0.7339
No log 8.9778 404 0.5746 0.5474 0.5746 0.7580
No log 9.0222 406 0.5376 0.5400 0.5376 0.7332
No log 9.0667 408 0.4880 0.5495 0.4880 0.6986
No log 9.1111 410 0.4771 0.5831 0.4771 0.6907
No log 9.1556 412 0.4958 0.4681 0.4958 0.7041
No log 9.2 414 0.5375 0.5104 0.5375 0.7331
No log 9.2444 416 0.5618 0.4929 0.5618 0.7495
No log 9.2889 418 0.5285 0.5098 0.5285 0.7270
No log 9.3333 420 0.4879 0.6142 0.4879 0.6985
No log 9.3778 422 0.4732 0.6142 0.4732 0.6879
No log 9.4222 424 0.4661 0.6267 0.4661 0.6827
No log 9.4667 426 0.4663 0.6267 0.4663 0.6828
No log 9.5111 428 0.4705 0.5782 0.4705 0.6859
No log 9.5556 430 0.4784 0.5687 0.4784 0.6916
No log 9.6 432 0.4787 0.5831 0.4787 0.6919
No log 9.6444 434 0.4909 0.5472 0.4909 0.7006
No log 9.6889 436 0.5017 0.5232 0.5017 0.7083
No log 9.7333 438 0.4931 0.5472 0.4931 0.7022
No log 9.7778 440 0.4881 0.6111 0.4881 0.6987
No log 9.8222 442 0.4717 0.6542 0.4717 0.6868
No log 9.8667 444 0.4601 0.6655 0.4601 0.6783
No log 9.9111 446 0.4521 0.6566 0.4521 0.6724
No log 9.9556 448 0.4505 0.6377 0.4505 0.6712
No log 10.0 450 0.4467 0.6377 0.4467 0.6684
No log 10.0444 452 0.4500 0.6292 0.4500 0.6708
No log 10.0889 454 0.4669 0.5956 0.4669 0.6833
No log 10.1333 456 0.4596 0.6491 0.4596 0.6779
No log 10.1778 458 0.4576 0.5915 0.4576 0.6765
No log 10.2222 460 0.4811 0.5956 0.4811 0.6936
No log 10.2667 462 0.5333 0.5706 0.5333 0.7303
No log 10.3111 464 0.5333 0.5368 0.5333 0.7303
No log 10.3556 466 0.5042 0.5422 0.5042 0.7101
No log 10.4 468 0.4937 0.5853 0.4937 0.7026
No log 10.4444 470 0.5725 0.5473 0.5725 0.7566
No log 10.4889 472 0.6722 0.5278 0.6722 0.8199
No log 10.5333 474 0.6953 0.5278 0.6953 0.8338
No log 10.5778 476 0.6337 0.4491 0.6337 0.7961
No log 10.6222 478 0.5376 0.4020 0.5376 0.7332
No log 10.6667 480 0.4645 0.5379 0.4645 0.6815
No log 10.7111 482 0.4689 0.5512 0.4689 0.6848
No log 10.7556 484 0.4785 0.5733 0.4785 0.6918
No log 10.8 486 0.4762 0.6505 0.4762 0.6901
No log 10.8444 488 0.4750 0.6329 0.4750 0.6892
No log 10.8889 490 0.4778 0.5765 0.4778 0.6912
No log 10.9333 492 0.5255 0.5299 0.5255 0.7249
No log 10.9778 494 0.5395 0.5112 0.5395 0.7345
No log 11.0222 496 0.5033 0.5368 0.5033 0.7094
No log 11.0667 498 0.4686 0.6007 0.4686 0.6846
0.3077 11.1111 500 0.4648 0.5979 0.4648 0.6818
0.3077 11.1556 502 0.4635 0.5846 0.4635 0.6808
0.3077 11.2 504 0.4621 0.5846 0.4621 0.6798
0.3077 11.2444 506 0.4646 0.6060 0.4646 0.6816
0.3077 11.2889 508 0.4711 0.6060 0.4711 0.6864
0.3077 11.3333 510 0.4769 0.5979 0.4769 0.6906
0.3077 11.3778 512 0.4881 0.6105 0.4881 0.6987
0.3077 11.4222 514 0.4984 0.6105 0.4984 0.7060
0.3077 11.4667 516 0.4947 0.5979 0.4947 0.7034
0.3077 11.5111 518 0.4968 0.5533 0.4968 0.7048
0.3077 11.5556 520 0.4952 0.5609 0.4952 0.7037
0.3077 11.6 522 0.4946 0.5609 0.4946 0.7033
0.3077 11.6444 524 0.4981 0.5687 0.4981 0.7058

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model files

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02