ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5184
  • Qwk: 0.5422
  • Mse: 0.5184
  • Rmse: 0.7200
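
The three metrics are closely related: the reported loss equals the MSE (suggesting a regression-style scoring head), RMSE is its square root, and Qwk is the quadratic weighted kappa between predicted and gold ordinal scores. As a minimal sketch (the actual evaluation code is not part of this card), the metrics could be computed like this:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic Weighted Kappa: agreement between ordinal ratings,
    penalizing disagreements by squared distance between classes."""
    O = np.zeros((n_classes, n_classes))          # observed confusion matrix
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
    # expected matrix under chance agreement (outer product of marginals)
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

# hypothetical gold and predicted ordinal scores, just for illustration
y_true = [0, 1, 2, 2, 3]
y_pred = [0, 1, 1, 2, 3]
mse = float(np.mean((np.array(y_true) - np.array(y_pred)) ** 2))
rmse = mse ** 0.5
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```

Perfect agreement yields a QWK of 1.0; chance-level agreement yields 0.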

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
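
The optimizer and scheduler settings above can be illustrated with a minimal, framework-free sketch: one scalar Adam update plus a linear learning-rate decay. This is illustrative only, not the Trainer internals; `total_steps` is hypothetical, since it depends on the dataset size.

```python
def linear_lr(step, base_lr=2e-5, total_steps=3100):
    """Linear decay from base_lr to 0 (lr_scheduler_type: linear).
    total_steps is a hypothetical placeholder."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

def adam_step(param, grad, state, lr, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter, with bias-corrected moments."""
    t = state["t"] + 1
    m = beta1 * state["m"] + (1 - beta1) * grad       # first moment estimate
    v = beta2 * state["v"] + (1 - beta2) * grad ** 2  # second moment estimate
    m_hat = m / (1 - beta1 ** t)                      # bias correction
    v_hat = v / (1 - beta2 ** t)
    new_param = param - lr * m_hat / (v_hat ** 0.5 + eps)
    return new_param, {"t": t, "m": m, "v": v}

# One update on a toy parameter: with these betas, the very first step moves
# the parameter by roughly lr in the direction opposite the gradient.
p, state = 1.0, {"t": 0, "m": 0.0, "v": 0.0}
p, state = adam_step(p, grad=0.5, state=state, lr=linear_lr(0))
```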

Training results

("No log" in the Training Loss column means the running training loss had not yet been logged; it is first reported at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0645 2 2.6274 -0.1213 2.6274 1.6209
No log 0.1290 4 1.4390 0.0126 1.4390 1.1996
No log 0.1935 6 0.8039 0.0049 0.8039 0.8966
No log 0.2581 8 0.8160 0.2027 0.8160 0.9033
No log 0.3226 10 0.9106 0.2463 0.9106 0.9543
No log 0.3871 12 0.6861 0.2604 0.6861 0.8283
No log 0.4516 14 0.7781 0.3509 0.7781 0.8821
No log 0.5161 16 0.7994 0.3777 0.7994 0.8941
No log 0.5806 18 0.5626 0.4494 0.5626 0.7501
No log 0.6452 20 0.5566 0.4384 0.5566 0.7460
No log 0.7097 22 0.6730 0.3484 0.6730 0.8204
No log 0.7742 24 0.8019 0.3385 0.8019 0.8955
No log 0.8387 26 0.7233 0.3708 0.7233 0.8505
No log 0.9032 28 0.6045 0.1673 0.6045 0.7775
No log 0.9677 30 0.6588 0.2368 0.6588 0.8117
No log 1.0323 32 0.9363 0.3521 0.9363 0.9676
No log 1.0968 34 1.2156 0.2141 1.2156 1.1025
No log 1.1613 36 1.2252 0.2141 1.2252 1.1069
No log 1.2258 38 0.9554 0.2988 0.9554 0.9774
No log 1.2903 40 0.6445 0.1923 0.6445 0.8028
No log 1.3548 42 0.6431 0.3387 0.6431 0.8020
No log 1.4194 44 0.6238 0.3387 0.6238 0.7898
No log 1.4839 46 0.5820 0.1604 0.5820 0.7629
No log 1.5484 48 0.5730 0.2002 0.5730 0.7570
No log 1.6129 50 0.6179 0.2459 0.6179 0.7861
No log 1.6774 52 0.6857 0.3311 0.6857 0.8281
No log 1.7419 54 0.6992 0.3851 0.6992 0.8362
No log 1.8065 56 0.7401 0.3754 0.7401 0.8603
No log 1.8710 58 0.5727 0.4174 0.5727 0.7568
No log 1.9355 60 0.5193 0.4891 0.5193 0.7206
No log 2.0 62 0.5320 0.4150 0.5320 0.7294
No log 2.0645 64 0.6359 0.4258 0.6359 0.7974
No log 2.1290 66 0.7665 0.4818 0.7665 0.8755
No log 2.1935 68 0.6284 0.3683 0.6284 0.7927
No log 2.2581 70 0.5022 0.5223 0.5022 0.7087
No log 2.3226 72 0.5351 0.4704 0.5351 0.7315
No log 2.3871 74 0.5107 0.4795 0.5107 0.7146
No log 2.4516 76 0.5048 0.5223 0.5048 0.7105
No log 2.5161 78 0.5544 0.4150 0.5544 0.7446
No log 2.5806 80 0.5670 0.4092 0.5670 0.7530
No log 2.6452 82 0.4917 0.5348 0.4917 0.7012
No log 2.7097 84 0.5004 0.5819 0.5004 0.7074
No log 2.7742 86 0.4824 0.5710 0.4824 0.6945
No log 2.8387 88 0.4902 0.5028 0.4902 0.7001
No log 2.9032 90 0.5914 0.4747 0.5914 0.7690
No log 2.9677 92 0.6024 0.4747 0.6024 0.7761
No log 3.0323 94 0.4884 0.5571 0.4884 0.6989
No log 3.0968 96 0.4887 0.6370 0.4887 0.6991
No log 3.1613 98 0.5895 0.3976 0.5895 0.7678
No log 3.2258 100 0.7712 0.3019 0.7712 0.8782
No log 3.2903 102 0.7119 0.3849 0.7119 0.8437
No log 3.3548 104 0.5641 0.5208 0.5641 0.7511
No log 3.4194 106 0.5163 0.6154 0.5163 0.7186
No log 3.4839 108 0.5071 0.6555 0.5071 0.7121
No log 3.5484 110 0.5313 0.5841 0.5313 0.7289
No log 3.6129 112 0.5823 0.5378 0.5823 0.7631
No log 3.6774 114 0.5746 0.5378 0.5746 0.7580
No log 3.7419 116 0.5022 0.6223 0.5022 0.7087
No log 3.8065 118 0.4686 0.5476 0.4686 0.6845
No log 3.8710 120 0.4720 0.5389 0.4720 0.6870
No log 3.9355 122 0.4960 0.6114 0.4960 0.7043
No log 4.0 124 0.6733 0.4651 0.6733 0.8206
No log 4.0645 126 0.6407 0.4811 0.6407 0.8004
No log 4.1290 128 0.4941 0.5999 0.4941 0.7030
No log 4.1935 130 0.4846 0.5677 0.4846 0.6961
No log 4.2581 132 0.5008 0.5627 0.5008 0.7077
No log 4.3226 134 0.6628 0.5017 0.6628 0.8141
No log 4.3871 136 0.9793 0.3535 0.9793 0.9896
No log 4.4516 138 0.9326 0.3599 0.9326 0.9657
No log 4.5161 140 0.6443 0.5579 0.6443 0.8027
No log 4.5806 142 0.5303 0.5653 0.5303 0.7282
No log 4.6452 144 0.5453 0.5445 0.5453 0.7385
No log 4.7097 146 0.5132 0.4984 0.5132 0.7164
No log 4.7742 148 0.5286 0.5265 0.5286 0.7270
No log 4.8387 150 0.7169 0.3381 0.7169 0.8467
No log 4.9032 152 0.8434 0.3410 0.8434 0.9183
No log 4.9677 154 0.7615 0.4906 0.7615 0.8726
No log 5.0323 156 0.5764 0.5031 0.5764 0.7592
No log 5.0968 158 0.5440 0.5291 0.5440 0.7376
No log 5.1613 160 0.6671 0.4893 0.6671 0.8168
No log 5.2258 162 0.6572 0.5204 0.6572 0.8107
No log 5.2903 164 0.5572 0.5273 0.5572 0.7464
No log 5.3548 166 0.5828 0.4811 0.5828 0.7634
No log 5.4194 168 0.6298 0.4874 0.6298 0.7936
No log 5.4839 170 0.5932 0.4672 0.5932 0.7702
No log 5.5484 172 0.5101 0.5970 0.5101 0.7142
No log 5.6129 174 0.5773 0.4664 0.5773 0.7598
No log 5.6774 176 0.6818 0.3991 0.6818 0.8257
No log 5.7419 178 0.6583 0.3991 0.6583 0.8113
No log 5.8065 180 0.5377 0.4352 0.5377 0.7333
No log 5.8710 182 0.5152 0.5152 0.5152 0.7178
No log 5.9355 184 0.7497 0.4364 0.7497 0.8659
No log 6.0 186 0.8610 0.2975 0.8610 0.9279
No log 6.0645 188 0.7735 0.4867 0.7735 0.8795
No log 6.1290 190 0.5405 0.6017 0.5405 0.7352
No log 6.1935 192 0.5794 0.5411 0.5794 0.7612
No log 6.2581 194 0.6720 0.4913 0.6720 0.8197
No log 6.3226 196 0.6223 0.5442 0.6223 0.7889
No log 6.3871 198 0.5502 0.5770 0.5502 0.7417
No log 6.4516 200 0.5496 0.6027 0.5496 0.7414
No log 6.5161 202 0.5577 0.6118 0.5577 0.7468
No log 6.5806 204 0.5408 0.6105 0.5408 0.7354
No log 6.6452 206 0.5430 0.5697 0.5430 0.7369
No log 6.7097 208 0.6372 0.5307 0.6372 0.7983
No log 6.7742 210 0.6554 0.5658 0.6554 0.8096
No log 6.8387 212 0.6398 0.5595 0.6398 0.7999
No log 6.9032 214 0.5541 0.5845 0.5541 0.7444
No log 6.9677 216 0.5197 0.5784 0.5197 0.7209
No log 7.0323 218 0.5098 0.5979 0.5098 0.7140
No log 7.0968 220 0.5076 0.5951 0.5076 0.7125
No log 7.1613 222 0.5026 0.5897 0.5026 0.7089
No log 7.2258 224 0.4982 0.6010 0.4982 0.7058
No log 7.2903 226 0.4886 0.5875 0.4886 0.6990
No log 7.3548 228 0.5178 0.5956 0.5178 0.7196
No log 7.4194 230 0.5678 0.5313 0.5678 0.7535
No log 7.4839 232 0.5407 0.5836 0.5407 0.7353
No log 7.5484 234 0.4753 0.6423 0.4753 0.6894
No log 7.6129 236 0.4780 0.6199 0.4780 0.6914
No log 7.6774 238 0.4798 0.6034 0.4798 0.6927
No log 7.7419 240 0.4533 0.6750 0.4533 0.6733
No log 7.8065 242 0.4508 0.7042 0.4508 0.6714
No log 7.8710 244 0.4715 0.6156 0.4715 0.6866
No log 7.9355 246 0.4522 0.6678 0.4522 0.6725
No log 8.0 248 0.4499 0.6078 0.4499 0.6708
No log 8.0645 250 0.4807 0.5528 0.4807 0.6933
No log 8.1290 252 0.4752 0.5528 0.4752 0.6894
No log 8.1935 254 0.4532 0.6124 0.4532 0.6732
No log 8.2581 256 0.4984 0.6709 0.4984 0.7059
No log 8.3226 258 0.6089 0.4821 0.6089 0.7803
No log 8.3871 260 0.6663 0.4768 0.6663 0.8162
No log 8.4516 262 0.5992 0.5312 0.5992 0.7741
No log 8.5161 264 0.4702 0.6419 0.4702 0.6857
No log 8.5806 266 0.4490 0.6946 0.4490 0.6701
No log 8.6452 268 0.4794 0.5897 0.4794 0.6924
No log 8.7097 270 0.5301 0.5206 0.5301 0.7281
No log 8.7742 272 0.5379 0.5426 0.5379 0.7334
No log 8.8387 274 0.4747 0.5897 0.4747 0.6890
No log 8.9032 276 0.4468 0.6087 0.4468 0.6684
No log 8.9677 278 0.5035 0.6349 0.5035 0.7095
No log 9.0323 280 0.5295 0.6017 0.5295 0.7276
No log 9.0968 282 0.4715 0.6526 0.4715 0.6867
No log 9.1613 284 0.4470 0.6860 0.4470 0.6686
No log 9.2258 286 0.4468 0.6860 0.4468 0.6684
No log 9.2903 288 0.4461 0.6683 0.4461 0.6679
No log 9.3548 290 0.4326 0.6672 0.4326 0.6577
No log 9.4194 292 0.4278 0.6672 0.4278 0.6541
No log 9.4839 294 0.4285 0.6020 0.4285 0.6546
No log 9.5484 296 0.4245 0.6839 0.4245 0.6515
No log 9.6129 298 0.4448 0.6914 0.4448 0.6669
No log 9.6774 300 0.4376 0.6839 0.4376 0.6615
No log 9.7419 302 0.4701 0.6047 0.4701 0.6857
No log 9.8065 304 0.5406 0.6169 0.5406 0.7352
No log 9.8710 306 0.5306 0.5970 0.5306 0.7285
No log 9.9355 308 0.4722 0.6034 0.4722 0.6871
No log 10.0 310 0.4567 0.6564 0.4567 0.6758
No log 10.0645 312 0.4656 0.6564 0.4656 0.6823
No log 10.1290 314 0.5009 0.5881 0.5009 0.7078
No log 10.1935 316 0.5394 0.5692 0.5394 0.7344
No log 10.2581 318 0.5269 0.5692 0.5269 0.7259
No log 10.3226 320 0.4868 0.6170 0.4868 0.6977
No log 10.3871 322 0.4814 0.6084 0.4814 0.6939
No log 10.4516 324 0.4827 0.6170 0.4827 0.6948
No log 10.5161 326 0.4518 0.6411 0.4518 0.6721
No log 10.5806 328 0.4282 0.6656 0.4282 0.6544
No log 10.6452 330 0.4260 0.6400 0.4260 0.6527
No log 10.7097 332 0.4636 0.6156 0.4636 0.6809
No log 10.7742 334 0.4611 0.6047 0.4611 0.6790
No log 10.8387 336 0.4710 0.6169 0.4710 0.6863
No log 10.9032 338 0.4276 0.7089 0.4276 0.6539
No log 10.9677 340 0.4261 0.7184 0.4261 0.6528
No log 11.0323 342 0.4431 0.6419 0.4431 0.6657
No log 11.0968 344 0.4568 0.6235 0.4568 0.6759
No log 11.1613 346 0.4404 0.6305 0.4404 0.6636
No log 11.2258 348 0.4319 0.6661 0.4319 0.6572
No log 11.2903 350 0.4363 0.6197 0.4363 0.6605
No log 11.3548 352 0.4782 0.6047 0.4782 0.6915
No log 11.4194 354 0.5068 0.5657 0.5068 0.7119
No log 11.4839 356 0.5676 0.4985 0.5676 0.7534
No log 11.5484 358 0.5915 0.5051 0.5915 0.7691
No log 11.6129 360 0.5309 0.4908 0.5309 0.7286
No log 11.6774 362 0.5020 0.4908 0.5020 0.7085
No log 11.7419 364 0.4554 0.6555 0.4554 0.6748
No log 11.8065 366 0.4442 0.6632 0.4442 0.6665
No log 11.8710 368 0.4456 0.6632 0.4456 0.6675
No log 11.9355 370 0.4491 0.6830 0.4491 0.6702
No log 12.0 372 0.4751 0.6305 0.4751 0.6892
No log 12.0645 374 0.4755 0.6305 0.4755 0.6896
No log 12.1290 376 0.4617 0.5797 0.4617 0.6795
No log 12.1935 378 0.4774 0.6101 0.4774 0.6909
No log 12.2581 380 0.5213 0.5692 0.5213 0.7220
No log 12.3226 382 0.5073 0.5061 0.5073 0.7123
No log 12.3871 384 0.5001 0.5495 0.5001 0.7072
No log 12.4516 386 0.4739 0.6087 0.4739 0.6884
No log 12.5161 388 0.4773 0.6418 0.4773 0.6909
No log 12.5806 390 0.4988 0.6609 0.4988 0.7063
No log 12.6452 392 0.4895 0.6549 0.4895 0.6997
No log 12.7097 394 0.5027 0.6059 0.5027 0.7090
No log 12.7742 396 0.5326 0.5200 0.5326 0.7298
No log 12.8387 398 0.5103 0.5592 0.5103 0.7143
No log 12.9032 400 0.4795 0.6494 0.4795 0.6925
No log 12.9677 402 0.4736 0.6383 0.4736 0.6882
No log 13.0323 404 0.4726 0.6661 0.4726 0.6874
No log 13.0968 406 0.4829 0.5918 0.4829 0.6949
No log 13.1613 408 0.4941 0.5422 0.4941 0.7029
No log 13.2258 410 0.5107 0.5212 0.5107 0.7147
No log 13.2903 412 0.5187 0.4789 0.5187 0.7202
No log 13.3548 414 0.4874 0.6143 0.4874 0.6981
No log 13.4194 416 0.4615 0.6407 0.4615 0.6793
No log 13.4839 418 0.4590 0.6326 0.4590 0.6775
No log 13.5484 420 0.4371 0.6909 0.4371 0.6611
No log 13.6129 422 0.4310 0.6909 0.4310 0.6565
No log 13.6774 424 0.4304 0.6923 0.4304 0.6561
No log 13.7419 426 0.4256 0.6741 0.4256 0.6524
No log 13.8065 428 0.4271 0.6929 0.4271 0.6535
No log 13.8710 430 0.4266 0.6750 0.4266 0.6531
No log 13.9355 432 0.4365 0.6849 0.4365 0.6607
No log 14.0 434 0.4402 0.6849 0.4402 0.6635
No log 14.0645 436 0.4436 0.6849 0.4436 0.6660
No log 14.1290 438 0.4526 0.6305 0.4526 0.6728
No log 14.1935 440 0.4833 0.5855 0.4833 0.6952
No log 14.2581 442 0.5300 0.5178 0.5300 0.7280
No log 14.3226 444 0.5039 0.5528 0.5039 0.7099
No log 14.3871 446 0.4503 0.5904 0.4503 0.6711
No log 14.4516 448 0.4399 0.6252 0.4399 0.6632
No log 14.5161 450 0.4465 0.6446 0.4465 0.6682
No log 14.5806 452 0.4576 0.5853 0.4576 0.6765
No log 14.6452 454 0.5137 0.5584 0.5137 0.7167
No log 14.7097 456 0.5690 0.4909 0.5690 0.7543
No log 14.7742 458 0.5453 0.5095 0.5453 0.7384
No log 14.8387 460 0.4877 0.5571 0.4877 0.6983
No log 14.9032 462 0.4629 0.5649 0.4629 0.6804
No log 14.9677 464 0.4669 0.5649 0.4669 0.6833
No log 15.0323 466 0.4698 0.5649 0.4698 0.6854
No log 15.0968 468 0.4938 0.5587 0.4938 0.7027
No log 15.1613 470 0.5241 0.4931 0.5241 0.7240
No log 15.2258 472 0.5986 0.4783 0.5986 0.7737
No log 15.2903 474 0.6595 0.4142 0.6595 0.8121
No log 15.3548 476 0.6101 0.4842 0.6101 0.7811
No log 15.4194 478 0.5158 0.5452 0.5158 0.7182
No log 15.4839 480 0.4910 0.6382 0.4910 0.7007
No log 15.5484 482 0.5331 0.6450 0.5331 0.7301
No log 15.6129 484 0.5374 0.6608 0.5374 0.7331
No log 15.6774 486 0.5090 0.6611 0.5090 0.7134
No log 15.7419 488 0.4909 0.6304 0.4909 0.7006
No log 15.8065 490 0.4863 0.5782 0.4863 0.6973
No log 15.8710 492 0.4979 0.6183 0.4979 0.7056
No log 15.9355 494 0.5078 0.5319 0.5078 0.7126
No log 16.0 496 0.4852 0.5782 0.4852 0.6966
No log 16.0645 498 0.4706 0.6542 0.4706 0.6860
0.293 16.1290 500 0.4821 0.6632 0.4821 0.6943
0.293 16.1935 502 0.4753 0.6632 0.4753 0.6894
0.293 16.2581 504 0.4738 0.6060 0.4738 0.6883
0.293 16.3226 506 0.4989 0.5781 0.4989 0.7063
0.293 16.3871 508 0.5118 0.5796 0.5118 0.7154
0.293 16.4516 510 0.5184 0.5422 0.5184 0.7200

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
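
To reproduce this environment, the versions above can be pinned; the `+cu118` tag suggests a CUDA 11.8 build of PyTorch (assumed here):

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```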
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree: MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task7_organization, fine-tuned from aubmindlab/bert-base-arabertv02.