ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset field was left unset in this card). It achieves the following results on the evaluation set:

  • Loss: 0.5397
  • Qwk: 0.4986
  • Mse: 0.5397
  • Rmse: 0.7347
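These metrics can be reproduced from raw predictions and gold labels. A minimal stdlib sketch of quadratic weighted kappa (Qwk), MSE, and RMSE follows; the label range and example values below are hypothetical, not taken from this model's data:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Expected matrix under independence, scaled to n ratings
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
         for i in range(n_classes)]
    # Quadratic penalty grows with the squared distance between labels
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical 3-class ordinal labels, one disagreement
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 1]
print(round(quadratic_weighted_kappa(y_true, y_pred, 3), 4))  # → 0.8
print(round(mse(y_true, y_pred), 4))                          # → 0.2
print(round(math.sqrt(mse(y_true, y_pred)), 4))               # → 0.4472
```

Note that Mse equals Loss and Rmse is its square root in the table below, which is consistent with an MSE training objective.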

Model description

More information needed

Intended uses & limitations

More information needed
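Pending fuller documentation, the checkpoint can be loaded as a sequence-classification model. A sketch follows; the single-logit regression head (num_labels=1) is an assumption inferred from the MSE/RMSE metrics, not something this card documents:

```python
# Sketch only: the regression-style head is an assumption, as is the
# intended input (an Arabic essay to be scored for organization,
# inferred from the model name).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k9_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

text = "..."  # an Arabic essay
inputs = tokenizer(text, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```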

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
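These hyperparameters map onto Hugging Face TrainingArguments roughly as follows. This is a hedged sketch: the output path and anything not listed above (logging cadence, exact AdamW variant) are assumptions, not recorded in this card:

```python
from transformers import TrainingArguments

# Sketch only: output_dir is a hypothetical path; the card says "Adam",
# which with Trainer defaults typically means torch AdamW.
args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```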

Training results

In the table below, "No log" in the Training Loss column means no training-loss value had been logged yet at that evaluation step; the first logged value (0.282) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0645 2 2.5661 -0.0568 2.5661 1.6019
No log 0.1290 4 1.3242 0.1565 1.3242 1.1507
No log 0.1935 6 0.9185 0.0535 0.9185 0.9584
No log 0.2581 8 0.8300 0.0968 0.8300 0.9110
No log 0.3226 10 0.6765 0.1184 0.6765 0.8225
No log 0.3871 12 0.8118 0.2942 0.8118 0.9010
No log 0.4516 14 1.2057 0.2199 1.2057 1.0981
No log 0.5161 16 0.9938 0.2846 0.9938 0.9969
No log 0.5806 18 0.6770 0.4011 0.6770 0.8228
No log 0.6452 20 0.6543 0.3093 0.6543 0.8089
No log 0.7097 22 0.6952 0.3477 0.6952 0.8338
No log 0.7742 24 0.7353 0.2490 0.7353 0.8575
No log 0.8387 26 0.8799 0.2435 0.8799 0.9380
No log 0.9032 28 1.0757 0.2097 1.0757 1.0372
No log 0.9677 30 1.0136 0.2784 1.0136 1.0068
No log 1.0323 32 0.8094 0.2494 0.8094 0.8997
No log 1.0968 34 0.7383 0.1739 0.7383 0.8593
No log 1.1613 36 0.7100 0.2430 0.7100 0.8426
No log 1.2258 38 0.6630 0.2361 0.6630 0.8143
No log 1.2903 40 0.6727 0.2036 0.6727 0.8202
No log 1.3548 42 0.6394 0.3462 0.6394 0.7997
No log 1.4194 44 0.6236 0.4569 0.6236 0.7897
No log 1.4839 46 0.6431 0.3679 0.6431 0.8019
No log 1.5484 48 0.8106 0.3129 0.8106 0.9003
No log 1.6129 50 0.9342 0.2460 0.9342 0.9665
No log 1.6774 52 0.8933 0.3119 0.8933 0.9451
No log 1.7419 54 0.8962 0.3051 0.8962 0.9467
No log 1.8065 56 0.7360 0.4518 0.7360 0.8579
No log 1.8710 58 0.6837 0.3762 0.6837 0.8268
No log 1.9355 60 0.7094 0.3235 0.7094 0.8422
No log 2.0 62 0.6623 0.3079 0.6623 0.8138
No log 2.0645 64 0.8550 0.3228 0.8550 0.9247
No log 2.1290 66 0.7760 0.4074 0.7760 0.8809
No log 2.1935 68 0.5806 0.3604 0.5806 0.7620
No log 2.2581 70 0.7916 0.4308 0.7916 0.8897
No log 2.3226 72 0.9476 0.3804 0.9476 0.9735
No log 2.3871 74 0.7221 0.4154 0.7221 0.8498
No log 2.4516 76 0.5374 0.3745 0.5374 0.7331
No log 2.5161 78 0.7582 0.3802 0.7582 0.8707
No log 2.5806 80 0.9429 0.3410 0.9429 0.9710
No log 2.6452 82 0.9838 0.2795 0.9838 0.9919
No log 2.7097 84 0.8770 0.3643 0.8770 0.9365
No log 2.7742 86 0.7310 0.3173 0.7310 0.8550
No log 2.8387 88 0.6572 0.2095 0.6572 0.8107
No log 2.9032 90 0.5962 0.2036 0.5962 0.7721
No log 2.9677 92 0.5554 0.3318 0.5554 0.7452
No log 3.0323 94 0.6020 0.3010 0.6020 0.7759
No log 3.0968 96 0.5955 0.3258 0.5955 0.7717
No log 3.1613 98 0.5836 0.3274 0.5836 0.7639
No log 3.2258 100 0.6654 0.4389 0.6654 0.8157
No log 3.2903 102 0.5688 0.3937 0.5688 0.7542
No log 3.3548 104 0.6182 0.4370 0.6182 0.7862
No log 3.4194 106 0.7270 0.3617 0.7270 0.8526
No log 3.4839 108 0.6493 0.3948 0.6493 0.8058
No log 3.5484 110 0.5373 0.4126 0.5373 0.7330
No log 3.6129 112 0.5144 0.3754 0.5144 0.7172
No log 3.6774 114 0.5158 0.3834 0.5158 0.7182
No log 3.7419 116 0.5551 0.3681 0.5551 0.7450
No log 3.8065 118 0.6962 0.4429 0.6962 0.8344
No log 3.8710 120 0.6208 0.4058 0.6208 0.7879
No log 3.9355 122 0.5196 0.4590 0.5196 0.7208
No log 4.0 124 0.5424 0.4664 0.5424 0.7365
No log 4.0645 126 0.5787 0.4251 0.5787 0.7607
No log 4.1290 128 0.5088 0.5289 0.5088 0.7133
No log 4.1935 130 0.5232 0.4548 0.5232 0.7233
No log 4.2581 132 0.6301 0.4501 0.6301 0.7938
No log 4.3226 134 0.6837 0.4315 0.6837 0.8269
No log 4.3871 136 0.5798 0.5048 0.5798 0.7614
No log 4.4516 138 0.5592 0.5657 0.5592 0.7478
No log 4.5161 140 0.5242 0.5926 0.5242 0.7240
No log 4.5806 142 0.5249 0.5782 0.5249 0.7245
No log 4.6452 144 0.5723 0.4674 0.5723 0.7565
No log 4.7097 146 0.5648 0.4927 0.5648 0.7515
No log 4.7742 148 0.5410 0.5619 0.5410 0.7355
No log 4.8387 150 0.5424 0.5470 0.5424 0.7365
No log 4.9032 152 0.5712 0.5265 0.5712 0.7558
No log 4.9677 154 0.5360 0.5869 0.5360 0.7322
No log 5.0323 156 0.5804 0.5048 0.5804 0.7618
No log 5.0968 158 0.5533 0.5112 0.5533 0.7438
No log 5.1613 160 0.5148 0.4590 0.5148 0.7175
No log 5.2258 162 0.5012 0.4212 0.5012 0.7080
No log 5.2903 164 0.4923 0.4473 0.4923 0.7017
No log 5.3548 166 0.4854 0.4473 0.4854 0.6967
No log 5.4194 168 0.4719 0.5749 0.4719 0.6869
No log 5.4839 170 0.4955 0.5098 0.4955 0.7039
No log 5.5484 172 0.5170 0.5513 0.5170 0.7190
No log 5.6129 174 0.5779 0.5200 0.5779 0.7602
No log 5.6774 176 0.7137 0.3953 0.7137 0.8448
No log 5.7419 178 0.7257 0.3867 0.7257 0.8519
No log 5.8065 180 0.5571 0.5460 0.5571 0.7464
No log 5.8710 182 0.4653 0.4660 0.4653 0.6821
No log 5.9355 184 0.4723 0.4493 0.4723 0.6873
No log 6.0 186 0.4950 0.6082 0.4950 0.7035
No log 6.0645 188 0.6438 0.4414 0.6438 0.8023
No log 6.1290 190 0.7407 0.3826 0.7407 0.8607
No log 6.1935 192 0.6715 0.4026 0.6715 0.8194
No log 6.2581 194 0.4882 0.6210 0.4882 0.6987
No log 6.3226 196 0.4518 0.5897 0.4518 0.6721
No log 6.3871 198 0.4563 0.5195 0.4563 0.6755
No log 6.4516 200 0.4572 0.5305 0.4572 0.6762
No log 6.5161 202 0.5113 0.4997 0.5113 0.7150
No log 6.5806 204 0.5715 0.5051 0.5715 0.7560
No log 6.6452 206 0.5679 0.5119 0.5679 0.7536
No log 6.7097 208 0.5083 0.5544 0.5083 0.7129
No log 6.7742 210 0.4633 0.5800 0.4633 0.6807
No log 6.8387 212 0.4652 0.6065 0.4652 0.6820
No log 6.9032 214 0.4665 0.6114 0.4665 0.6830
No log 6.9677 216 0.5273 0.6030 0.5273 0.7261
No log 7.0323 218 0.5496 0.5957 0.5496 0.7413
No log 7.0968 220 0.5282 0.6195 0.5282 0.7267
No log 7.1613 222 0.5113 0.6275 0.5113 0.7151
No log 7.2258 224 0.5372 0.4568 0.5372 0.7329
No log 7.2903 226 0.5100 0.5166 0.5100 0.7141
No log 7.3548 228 0.4920 0.5440 0.4920 0.7015
No log 7.4194 230 0.4920 0.5189 0.4920 0.7014
No log 7.4839 232 0.4914 0.5404 0.4914 0.7010
No log 7.5484 234 0.5083 0.4721 0.5083 0.7129
No log 7.6129 236 0.4924 0.5533 0.4924 0.7017
No log 7.6774 238 0.5266 0.5332 0.5266 0.7257
No log 7.7419 240 0.6733 0.4651 0.6733 0.8205
No log 7.8065 242 0.7068 0.4268 0.7068 0.8407
No log 7.8710 244 0.6029 0.4807 0.6029 0.7765
No log 7.9355 246 0.4996 0.4970 0.4996 0.7068
No log 8.0 248 0.5531 0.4219 0.5531 0.7437
No log 8.0645 250 0.5777 0.3789 0.5777 0.7601
No log 8.1290 252 0.5355 0.4306 0.5355 0.7317
No log 8.1935 254 0.5259 0.3974 0.5259 0.7252
No log 8.2581 256 0.5590 0.4283 0.5590 0.7477
No log 8.3226 258 0.5741 0.4695 0.5741 0.7577
No log 8.3871 260 0.5400 0.5226 0.5400 0.7348
No log 8.4516 262 0.4781 0.5853 0.4781 0.6914
No log 8.5161 264 0.4818 0.4639 0.4818 0.6941
No log 8.5806 266 0.4794 0.5307 0.4794 0.6924
No log 8.6452 268 0.4855 0.6017 0.4855 0.6968
No log 8.7097 270 0.5879 0.4751 0.5879 0.7668
No log 8.7742 272 0.6354 0.4519 0.6354 0.7971
No log 8.8387 274 0.5736 0.4751 0.5736 0.7574
No log 8.9032 276 0.4880 0.5782 0.4880 0.6986
No log 8.9677 278 0.4743 0.5248 0.4743 0.6887
No log 9.0323 280 0.4792 0.4795 0.4792 0.6922
No log 9.0968 282 0.4600 0.5170 0.4600 0.6782
No log 9.1613 284 0.4666 0.5672 0.4666 0.6831
No log 9.2258 286 0.4860 0.6066 0.4860 0.6971
No log 9.2903 288 0.4806 0.6419 0.4806 0.6932
No log 9.3548 290 0.4833 0.6013 0.4833 0.6952
No log 9.4194 292 0.4746 0.5246 0.4746 0.6889
No log 9.4839 294 0.4729 0.4402 0.4729 0.6877
No log 9.5484 296 0.4830 0.4914 0.4830 0.6950
No log 9.6129 298 0.4952 0.5765 0.4952 0.7037
No log 9.6774 300 0.5454 0.5895 0.5454 0.7385
No log 9.7419 302 0.5313 0.5906 0.5313 0.7289
No log 9.8065 304 0.5046 0.5533 0.5046 0.7104
No log 9.8710 306 0.4899 0.5784 0.4899 0.6999
No log 9.9355 308 0.4875 0.5446 0.4875 0.6982
No log 10.0 310 0.5070 0.4756 0.5070 0.7120
No log 10.0645 312 0.4980 0.4569 0.4980 0.7057
No log 10.1290 314 0.4741 0.4746 0.4741 0.6885
No log 10.1935 316 0.4695 0.4968 0.4695 0.6852
No log 10.2581 318 0.4663 0.4746 0.4663 0.6828
No log 10.3226 320 0.4897 0.4997 0.4897 0.6998
No log 10.3871 322 0.4974 0.4931 0.4974 0.7053
No log 10.4516 324 0.4773 0.4958 0.4773 0.6908
No log 10.5161 326 0.4841 0.4958 0.4841 0.6958
No log 10.5806 328 0.4824 0.4958 0.4824 0.6946
No log 10.6452 330 0.4787 0.5248 0.4787 0.6919
No log 10.7097 332 0.4764 0.5640 0.4764 0.6902
No log 10.7742 334 0.4877 0.6021 0.4877 0.6984
No log 10.8387 336 0.4991 0.5769 0.4991 0.7064
No log 10.9032 338 0.5104 0.6058 0.5104 0.7144
No log 10.9677 340 0.4914 0.5614 0.4914 0.7010
No log 11.0323 342 0.4749 0.5143 0.4749 0.6891
No log 11.0968 344 0.4738 0.5321 0.4738 0.6883
No log 11.1613 346 0.4718 0.5397 0.4718 0.6869
No log 11.2258 348 0.4650 0.4752 0.4650 0.6819
No log 11.2903 350 0.4786 0.4962 0.4786 0.6918
No log 11.3548 352 0.4780 0.5036 0.4780 0.6914
No log 11.4194 354 0.4538 0.5248 0.4538 0.6736
No log 11.4839 356 0.4635 0.6370 0.4635 0.6808
No log 11.5484 358 0.5369 0.5664 0.5369 0.7327
No log 11.6129 360 0.5958 0.5088 0.5958 0.7719
No log 11.6774 362 0.5769 0.5862 0.5769 0.7595
No log 11.7419 364 0.5103 0.6154 0.5103 0.7144
No log 11.8065 366 0.4563 0.5648 0.4563 0.6755
No log 11.8710 368 0.4826 0.5104 0.4826 0.6947
No log 11.9355 370 0.4730 0.5386 0.4730 0.6878
No log 12.0 372 0.4603 0.6655 0.4603 0.6784
No log 12.0645 374 0.5667 0.5266 0.5667 0.7528
No log 12.1290 376 0.6485 0.5103 0.6485 0.8053
No log 12.1935 378 0.6221 0.5415 0.6221 0.7888
No log 12.2581 380 0.5341 0.5515 0.5341 0.7308
No log 12.3226 382 0.4809 0.5869 0.4809 0.6935
No log 12.3871 384 0.4452 0.5800 0.4452 0.6672
No log 12.4516 386 0.4685 0.5758 0.4685 0.6845
No log 12.5161 388 0.4685 0.5957 0.4685 0.6845
No log 12.5806 390 0.4584 0.5419 0.4584 0.6771
No log 12.6452 392 0.4576 0.6370 0.4576 0.6764
No log 12.7097 394 0.4753 0.6442 0.4753 0.6894
No log 12.7742 396 0.4792 0.6248 0.4792 0.6922
No log 12.8387 398 0.4664 0.6168 0.4664 0.6829
No log 12.9032 400 0.4821 0.4524 0.4821 0.6943
No log 12.9677 402 0.4850 0.4459 0.4850 0.6964
No log 13.0323 404 0.4777 0.5584 0.4777 0.6912
No log 13.0968 406 0.4905 0.5991 0.4905 0.7004
No log 13.1613 408 0.5143 0.5908 0.5143 0.7172
No log 13.2258 410 0.4977 0.5706 0.4977 0.7054
No log 13.2903 412 0.4839 0.5706 0.4839 0.6956
No log 13.3548 414 0.4674 0.6017 0.4674 0.6836
No log 13.4194 416 0.4644 0.5784 0.4644 0.6815
No log 13.4839 418 0.4607 0.6154 0.4607 0.6788
No log 13.5484 420 0.4607 0.6125 0.4607 0.6787
No log 13.6129 422 0.4790 0.5897 0.4790 0.6921
No log 13.6774 424 0.4911 0.6014 0.4911 0.7008
No log 13.7419 426 0.5009 0.6411 0.5009 0.7077
No log 13.8065 428 0.4975 0.6676 0.4975 0.7053
No log 13.8710 430 0.4960 0.6384 0.4960 0.7043
No log 13.9355 432 0.4848 0.6690 0.4848 0.6963
No log 14.0 434 0.4636 0.6563 0.4636 0.6809
No log 14.0645 436 0.4596 0.5728 0.4596 0.6779
No log 14.1290 438 0.4784 0.5283 0.4784 0.6917
No log 14.1935 440 0.4957 0.5124 0.4957 0.7040
No log 14.2581 442 0.5221 0.5101 0.5221 0.7225
No log 14.3226 444 0.5130 0.5544 0.5130 0.7162
No log 14.3871 446 0.5054 0.5112 0.5054 0.7109
No log 14.4516 448 0.4954 0.5721 0.4954 0.7038
No log 14.5161 450 0.5059 0.5721 0.5059 0.7113
No log 14.5806 452 0.5286 0.5849 0.5286 0.7271
No log 14.6452 454 0.5475 0.5387 0.5475 0.7399
No log 14.7097 456 0.5491 0.5387 0.5491 0.7410
No log 14.7742 458 0.4973 0.6074 0.4973 0.7052
No log 14.8387 460 0.4460 0.6978 0.4460 0.6678
No log 14.9032 462 0.4242 0.6935 0.4242 0.6513
No log 14.9677 464 0.4304 0.6326 0.4304 0.6560
No log 15.0323 466 0.4288 0.6339 0.4288 0.6548
No log 15.0968 468 0.4476 0.6701 0.4476 0.6690
No log 15.1613 470 0.4860 0.6427 0.4860 0.6972
No log 15.2258 472 0.5173 0.6074 0.5173 0.7193
No log 15.2903 474 0.4837 0.6612 0.4837 0.6955
No log 15.3548 476 0.4629 0.6028 0.4629 0.6803
No log 15.4194 478 0.5233 0.5930 0.5233 0.7234
No log 15.4839 480 0.5815 0.5278 0.5815 0.7626
No log 15.5484 482 0.5273 0.4997 0.5273 0.7261
No log 15.6129 484 0.4578 0.5852 0.4578 0.6766
No log 15.6774 486 0.4608 0.6475 0.4608 0.6788
No log 15.7419 488 0.5012 0.6236 0.5012 0.7080
No log 15.8065 490 0.5654 0.5331 0.5654 0.7519
No log 15.8710 492 0.6176 0.4811 0.6176 0.7859
No log 15.9355 494 0.6585 0.4703 0.6585 0.8115
No log 16.0 496 0.6390 0.4703 0.6390 0.7994
No log 16.0645 498 0.5603 0.5331 0.5603 0.7486
0.282 16.1290 500 0.5148 0.6236 0.5148 0.7175
0.282 16.1935 502 0.5436 0.5400 0.5436 0.7373
0.282 16.2581 504 0.6168 0.4308 0.6168 0.7854
0.282 16.3226 506 0.6295 0.4258 0.6295 0.7934
0.282 16.3871 508 0.5885 0.4466 0.5885 0.7672
0.282 16.4516 510 0.5397 0.4986 0.5397 0.7347

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params
  • Tensor type: F32
  • Format: Safetensors

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k9_task7_organization

  • Fine-tuned from aubmindlab/bert-base-arabertv02