ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k5_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4955
  • Qwk: 0.6414
  • Mse: 0.4955
  • Rmse: 0.7039
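Qwk here denotes quadratic weighted kappa (Cohen's kappa with quadratic weights), a standard agreement metric for ordinal scores such as essay ratings, and Rmse is simply the square root of Mse. A minimal pure-Python sketch of how these metrics are computed (function names are illustrative, not taken from the training code):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    # Observed agreement matrix: counts of (true, predicted) pairs
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error; Rmse is math.sqrt(mse(...))."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

Perfect agreement yields a kappa of 1.0, while chance-level agreement yields 0.0.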

Model description

More information needed

Intended uses & limitations

More information needed
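As a sketch of how this checkpoint could be loaded for inference (assuming it was saved with a sequence-classification head, which the card does not document; the exact head, label count, and score mapping are unverified):

```python
# Hypothetical usage sketch; head type and label mapping are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k5_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Score an Arabic essay (placeholder text)
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits.squeeze()
```

How the logits map to an organization score depends on how the head was configured during fine-tuning, which is not documented here.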

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
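With a linear scheduler and no warmup configured, the learning rate decays linearly from 2e-05 toward zero over the run. A small sketch of that schedule (assuming zero warmup steps, matching the settings above; the function name is illustrative):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to zero."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr at end of warmup to 0 at total_steps
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

At the midpoint of training this gives half the base learning rate; at the final step it reaches zero.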

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0741 2 2.7101 -0.0545 2.7101 1.6462
No log 0.1481 4 1.3825 0.0526 1.3825 1.1758
No log 0.2222 6 0.9914 -0.0970 0.9914 0.9957
No log 0.2963 8 1.0304 -0.0443 1.0304 1.0151
No log 0.3704 10 1.0242 0.1254 1.0242 1.0120
No log 0.4444 12 0.8203 0.3051 0.8203 0.9057
No log 0.5185 14 0.6535 0.1267 0.6535 0.8084
No log 0.5926 16 0.8021 0.3926 0.8021 0.8956
No log 0.6667 18 0.6682 0.2804 0.6682 0.8174
No log 0.7407 20 0.5855 0.3305 0.5855 0.7652
No log 0.8148 22 0.5838 0.3274 0.5838 0.7641
No log 0.8889 24 0.5723 0.4211 0.5723 0.7565
No log 0.9630 26 0.6065 0.4163 0.6065 0.7788
No log 1.0370 28 0.6133 0.4434 0.6133 0.7831
No log 1.1111 30 0.6240 0.4695 0.6240 0.7899
No log 1.1852 32 0.9134 0.3446 0.9134 0.9557
No log 1.2593 34 1.2025 0.1694 1.2025 1.0966
No log 1.3333 36 1.3260 0.1006 1.3260 1.1515
No log 1.4074 38 1.1737 0.1468 1.1737 1.0834
No log 1.4815 40 0.8074 0.3754 0.8074 0.8985
No log 1.5556 42 0.6531 0.4222 0.6531 0.8082
No log 1.6296 44 0.7489 0.4120 0.7489 0.8654
No log 1.7037 46 0.9782 0.2846 0.9782 0.9890
No log 1.7778 48 0.9222 0.3389 0.9222 0.9603
No log 1.8519 50 0.7316 0.4246 0.7316 0.8553
No log 1.9259 52 0.6663 0.4280 0.6663 0.8163
No log 2.0 54 0.6123 0.3857 0.6123 0.7825
No log 2.0741 56 0.6315 0.3857 0.6315 0.7947
No log 2.1481 58 0.6239 0.3857 0.6239 0.7899
No log 2.2222 60 0.6088 0.4474 0.6088 0.7802
No log 2.2963 62 0.5997 0.4198 0.5997 0.7744
No log 2.3704 64 0.5448 0.4817 0.5448 0.7381
No log 2.4444 66 0.5458 0.4875 0.5458 0.7388
No log 2.5185 68 0.5521 0.4190 0.5521 0.7430
No log 2.5926 70 0.5868 0.5231 0.5868 0.7660
No log 2.6667 72 0.8414 0.3312 0.8414 0.9173
No log 2.7407 74 0.9628 0.2702 0.9628 0.9812
No log 2.8148 76 0.7657 0.4238 0.7657 0.8750
No log 2.8889 78 0.7050 0.4395 0.7050 0.8397
No log 2.9630 80 0.7875 0.3739 0.7875 0.8874
No log 3.0370 82 0.8812 0.2853 0.8812 0.9387
No log 3.1111 84 0.7734 0.3739 0.7734 0.8794
No log 3.1852 86 0.6195 0.5921 0.6195 0.7871
No log 3.2593 88 0.6232 0.5544 0.6232 0.7894
No log 3.3333 90 0.6203 0.5407 0.6203 0.7876
No log 3.4074 92 0.5765 0.5682 0.5765 0.7593
No log 3.4815 94 0.5728 0.4908 0.5728 0.7569
No log 3.5556 96 0.6938 0.5189 0.6938 0.8329
No log 3.6296 98 0.8779 0.2928 0.8779 0.9370
No log 3.7037 100 0.8683 0.2958 0.8683 0.9318
No log 3.7778 102 0.7178 0.5195 0.7178 0.8472
No log 3.8519 104 0.5676 0.5356 0.5676 0.7534
No log 3.9259 106 0.5519 0.4937 0.5519 0.7429
No log 4.0 108 0.5521 0.4937 0.5521 0.7431
No log 4.0741 110 0.5393 0.5107 0.5393 0.7343
No log 4.1481 112 0.5277 0.4934 0.5277 0.7264
No log 4.2222 114 0.5357 0.4717 0.5357 0.7319
No log 4.2963 116 0.5745 0.5161 0.5745 0.7579
No log 4.3704 118 0.5774 0.5368 0.5774 0.7599
No log 4.4444 120 0.5401 0.4595 0.5401 0.7349
No log 4.5185 122 0.5123 0.5476 0.5123 0.7157
No log 4.5926 124 0.5139 0.5323 0.5139 0.7169
No log 4.6667 126 0.5433 0.5877 0.5433 0.7371
No log 4.7407 128 0.5639 0.5144 0.5639 0.7509
No log 4.8148 130 0.5337 0.5649 0.5337 0.7306
No log 4.8889 132 0.5642 0.5212 0.5642 0.7511
No log 4.9630 134 0.6024 0.5513 0.6024 0.7762
No log 5.0370 136 0.6807 0.4444 0.6807 0.8250
No log 5.1111 138 0.5795 0.5908 0.5795 0.7612
No log 5.1852 140 0.5277 0.5929 0.5277 0.7264
No log 5.2593 142 0.5392 0.5678 0.5392 0.7343
No log 5.3333 144 0.6746 0.4444 0.6746 0.8214
No log 5.4074 146 0.6981 0.4250 0.6981 0.8355
No log 5.4815 148 0.5780 0.4536 0.5780 0.7603
No log 5.5556 150 0.5458 0.5605 0.5458 0.7388
No log 5.6296 152 0.6259 0.3985 0.6259 0.7911
No log 5.7037 154 0.5816 0.5516 0.5816 0.7627
No log 5.7778 156 0.5736 0.5533 0.5736 0.7574
No log 5.8519 158 0.5558 0.5533 0.5558 0.7455
No log 5.9259 160 0.6323 0.6032 0.6323 0.7952
No log 6.0 162 0.8544 0.3716 0.8544 0.9244
No log 6.0741 164 0.9525 0.3747 0.9525 0.9760
No log 6.1481 166 0.7506 0.3909 0.7506 0.8664
No log 6.2222 168 0.5492 0.5440 0.5492 0.7410
No log 6.2963 170 0.6461 0.4825 0.6461 0.8038
No log 6.3704 172 0.7362 0.4828 0.7362 0.8580
No log 6.4444 174 0.6990 0.5018 0.6990 0.8361
No log 6.5185 176 0.6698 0.5328 0.6698 0.8184
No log 6.5926 178 0.6791 0.5529 0.6791 0.8241
No log 6.6667 180 0.6513 0.5515 0.6513 0.8070
No log 6.7407 182 0.6366 0.5460 0.6366 0.7979
No log 6.8148 184 0.5782 0.5794 0.5782 0.7604
No log 6.8889 186 0.5029 0.4928 0.5029 0.7092
No log 6.9630 188 0.5121 0.5340 0.5121 0.7156
No log 7.0370 190 0.5280 0.5589 0.5280 0.7267
No log 7.1111 192 0.5468 0.5574 0.5468 0.7395
No log 7.1852 194 0.5328 0.5003 0.5328 0.7300
No log 7.2593 196 0.5190 0.5003 0.5190 0.7204
No log 7.3333 198 0.5119 0.5003 0.5119 0.7155
No log 7.4074 200 0.5349 0.5632 0.5349 0.7314
No log 7.4815 202 0.5236 0.5738 0.5236 0.7236
No log 7.5556 204 0.5126 0.5738 0.5126 0.7159
No log 7.6296 206 0.4976 0.5021 0.4976 0.7054
No log 7.7037 208 0.5007 0.5505 0.5007 0.7076
No log 7.7778 210 0.5075 0.5379 0.5075 0.7124
No log 7.8519 212 0.5072 0.5505 0.5072 0.7121
No log 7.9259 214 0.5129 0.5703 0.5129 0.7162
No log 8.0 216 0.5046 0.5902 0.5046 0.7103
No log 8.0741 218 0.5002 0.5826 0.5002 0.7073
No log 8.1481 220 0.5267 0.5751 0.5267 0.7257
No log 8.2222 222 0.6259 0.5513 0.6259 0.7911
No log 8.2963 224 0.7529 0.4528 0.7529 0.8677
No log 8.3704 226 0.6504 0.4909 0.6504 0.8065
No log 8.4444 228 0.5040 0.6020 0.5040 0.7099
No log 8.5185 230 0.5475 0.5961 0.5475 0.7399
No log 8.5926 232 0.6792 0.5008 0.6792 0.8242
No log 8.6667 234 0.6433 0.5354 0.6433 0.8021
No log 8.7407 236 0.5339 0.5840 0.5339 0.7307
No log 8.8148 238 0.5794 0.6169 0.5794 0.7612
No log 8.8889 240 0.6153 0.5751 0.6153 0.7844
No log 8.9630 242 0.5677 0.5956 0.5677 0.7535
No log 9.0370 244 0.5280 0.5270 0.5280 0.7267
No log 9.1111 246 0.5795 0.5934 0.5795 0.7612
No log 9.1852 248 0.5881 0.5934 0.5881 0.7669
No log 9.2593 250 0.5482 0.6160 0.5482 0.7404
No log 9.3333 252 0.5262 0.5592 0.5262 0.7254
No log 9.4074 254 0.5946 0.5683 0.5946 0.7711
No log 9.4815 256 0.5613 0.5822 0.5613 0.7492
No log 9.5556 258 0.5090 0.5452 0.5090 0.7135
No log 9.6296 260 0.5121 0.5563 0.5121 0.7156
No log 9.7037 262 0.5092 0.5368 0.5092 0.7136
No log 9.7778 264 0.5121 0.5452 0.5121 0.7156
No log 9.8519 266 0.5343 0.6330 0.5343 0.7310
No log 9.9259 268 0.5159 0.6330 0.5159 0.7183
No log 10.0 270 0.4892 0.5234 0.4892 0.6994
No log 10.0741 272 0.4890 0.5619 0.4890 0.6993
No log 10.1481 274 0.5057 0.5784 0.5057 0.7111
No log 10.2222 276 0.4966 0.5913 0.4966 0.7047
No log 10.2963 278 0.4800 0.5753 0.4800 0.6928
No log 10.3704 280 0.5326 0.6017 0.5326 0.7298
No log 10.4444 282 0.6385 0.6491 0.6385 0.7990
No log 10.5185 284 0.5922 0.6314 0.5922 0.7695
No log 10.5926 286 0.5342 0.6017 0.5342 0.7309
No log 10.6667 288 0.4863 0.6034 0.4863 0.6974
No log 10.7407 290 0.4705 0.5523 0.4705 0.6860
No log 10.8148 292 0.4689 0.5286 0.4689 0.6848
No log 10.8889 294 0.4791 0.5195 0.4791 0.6922
No log 10.9630 296 0.4812 0.5195 0.4812 0.6937
No log 11.0370 298 0.4775 0.5195 0.4775 0.6910
No log 11.1111 300 0.4794 0.5195 0.4794 0.6924
No log 11.1852 302 0.4771 0.5812 0.4771 0.6907
No log 11.2593 304 0.4930 0.5810 0.4930 0.7021
No log 11.3333 306 0.4991 0.5590 0.4991 0.7064
No log 11.4074 308 0.4861 0.6114 0.4861 0.6972
No log 11.4815 310 0.5284 0.6342 0.5284 0.7269
No log 11.5556 312 0.5314 0.6342 0.5314 0.7290
No log 11.6296 314 0.4803 0.6383 0.4803 0.6930
No log 11.7037 316 0.4584 0.4953 0.4584 0.6771
No log 11.7778 318 0.5364 0.5560 0.5364 0.7324
No log 11.8519 320 0.5788 0.5677 0.5788 0.7608
No log 11.9259 322 0.5095 0.5786 0.5095 0.7138
No log 12.0 324 0.4546 0.5625 0.4546 0.6743
No log 12.0741 326 0.4722 0.6501 0.4722 0.6872
No log 12.1481 328 0.5237 0.5836 0.5237 0.7237
No log 12.2222 330 0.4991 0.6526 0.4991 0.7065
No log 12.2963 332 0.4567 0.6317 0.4567 0.6758
No log 12.3704 334 0.4602 0.5413 0.4602 0.6784
No log 12.4444 336 0.4739 0.6706 0.4739 0.6884
No log 12.5185 338 0.4782 0.6706 0.4782 0.6915
No log 12.5926 340 0.4664 0.5961 0.4664 0.6830
No log 12.6667 342 0.4870 0.6079 0.4870 0.6979
No log 12.7407 344 0.5366 0.6017 0.5366 0.7326
No log 12.8148 346 0.5169 0.6004 0.5169 0.7190
No log 12.8889 348 0.5090 0.6079 0.5090 0.7134
No log 12.9630 350 0.4724 0.6431 0.4724 0.6873
No log 13.0370 352 0.4506 0.5854 0.4506 0.6712
No log 13.1111 354 0.4454 0.6020 0.4454 0.6674
No log 13.1852 356 0.4531 0.6601 0.4531 0.6731
No log 13.2593 358 0.4750 0.6526 0.4750 0.6892
No log 13.3333 360 0.4852 0.6526 0.4852 0.6966
No log 13.4074 362 0.5122 0.6442 0.5122 0.7157
No log 13.4815 364 0.5165 0.6361 0.5165 0.7187
No log 13.5556 366 0.5085 0.6361 0.5085 0.7131
No log 13.6296 368 0.4465 0.6414 0.4465 0.6682
No log 13.7037 370 0.4333 0.6579 0.4333 0.6582
No log 13.7778 372 0.4404 0.6673 0.4404 0.6636
No log 13.8519 374 0.4469 0.6122 0.4469 0.6685
No log 13.9259 376 0.4461 0.6200 0.4461 0.6679
No log 14.0 378 0.4453 0.6200 0.4453 0.6673
No log 14.0741 380 0.4546 0.5902 0.4546 0.6742
No log 14.1481 382 0.4674 0.6047 0.4674 0.6837
No log 14.2222 384 0.4566 0.6047 0.4566 0.6757
No log 14.2963 386 0.4447 0.6661 0.4447 0.6668
No log 14.3704 388 0.4434 0.6950 0.4434 0.6658
No log 14.4444 390 0.4406 0.6863 0.4406 0.6638
No log 14.5185 392 0.4338 0.6863 0.4338 0.6586
No log 14.5926 394 0.4294 0.6661 0.4294 0.6553
No log 14.6667 396 0.4474 0.6782 0.4474 0.6689
No log 14.7407 398 0.4647 0.6141 0.4647 0.6817
No log 14.8148 400 0.4418 0.6477 0.4418 0.6647
No log 14.8889 402 0.4375 0.5738 0.4375 0.6614
No log 14.9630 404 0.4724 0.6406 0.4724 0.6873
No log 15.0370 406 0.4840 0.6411 0.4840 0.6957
No log 15.1111 408 0.4504 0.5926 0.4504 0.6711
No log 15.1852 410 0.4482 0.6395 0.4482 0.6695
No log 15.2593 412 0.4566 0.6318 0.4566 0.6757
No log 15.3333 414 0.4697 0.6637 0.4697 0.6854
No log 15.4074 416 0.4675 0.6475 0.4675 0.6838
No log 15.4815 418 0.4587 0.6234 0.4587 0.6773
No log 15.5556 420 0.4695 0.6235 0.4695 0.6852
No log 15.6296 422 0.5030 0.5908 0.5030 0.7092
No log 15.7037 424 0.4856 0.5855 0.4856 0.6968
No log 15.7778 426 0.4353 0.6678 0.4353 0.6598
No log 15.8519 428 0.4341 0.6630 0.4341 0.6589
No log 15.9259 430 0.4317 0.7135 0.4317 0.6570
No log 16.0 432 0.4301 0.6719 0.4301 0.6558
No log 16.0741 434 0.4469 0.6719 0.4469 0.6685
No log 16.1481 436 0.4653 0.6397 0.4653 0.6821
No log 16.2222 438 0.4861 0.6397 0.4861 0.6972
No log 16.2963 440 0.5096 0.6145 0.5096 0.7138
No log 16.3704 442 0.5362 0.6178 0.5362 0.7323
No log 16.4444 444 0.5125 0.6411 0.5125 0.7159
No log 16.5185 446 0.4798 0.5711 0.4798 0.6926
No log 16.5926 448 0.4798 0.5854 0.4798 0.6927
No log 16.6667 450 0.4900 0.6514 0.4900 0.7000
No log 16.7407 452 0.4775 0.5854 0.4775 0.6910
No log 16.8148 454 0.4708 0.5826 0.4708 0.6862
No log 16.8889 456 0.4667 0.5826 0.4667 0.6831
No log 16.9630 458 0.4701 0.6223 0.4701 0.6857
No log 17.0370 460 0.4788 0.6514 0.4788 0.6920
No log 17.1111 462 0.4689 0.6326 0.4689 0.6848
No log 17.1852 464 0.4666 0.6326 0.4666 0.6831
No log 17.2593 466 0.4637 0.6326 0.4637 0.6810
No log 17.3333 468 0.4572 0.6326 0.4572 0.6762
No log 17.4074 470 0.4574 0.5084 0.4574 0.6763
No log 17.4815 472 0.4578 0.5574 0.4578 0.6766
No log 17.5556 474 0.4512 0.5725 0.4512 0.6717
No log 17.6296 476 0.4529 0.5853 0.4529 0.6730
No log 17.7037 478 0.4522 0.5853 0.4522 0.6724
No log 17.7778 480 0.4488 0.6210 0.4488 0.6699
No log 17.8519 482 0.4450 0.6575 0.4450 0.6671
No log 17.9259 484 0.4405 0.6771 0.4405 0.6637
No log 18.0 486 0.4347 0.6672 0.4347 0.6593
No log 18.0741 488 0.4371 0.5373 0.4371 0.6611
No log 18.1481 490 0.4543 0.5682 0.4543 0.6741
No log 18.2222 492 0.4898 0.6135 0.4898 0.6998
No log 18.2963 494 0.4876 0.5869 0.4876 0.6983
No log 18.3704 496 0.4718 0.6565 0.4718 0.6869
No log 18.4444 498 0.5025 0.6784 0.5025 0.7089
0.2933 18.5185 500 0.5224 0.6567 0.5224 0.7228
0.2933 18.5926 502 0.5024 0.6613 0.5024 0.7088
0.2933 18.6667 504 0.4860 0.6265 0.4860 0.6971
0.2933 18.7407 506 0.4781 0.6253 0.4781 0.6914
0.2933 18.8148 508 0.4698 0.6630 0.4698 0.6854
0.2933 18.8889 510 0.4570 0.5853 0.4570 0.6760
0.2933 18.9630 512 0.4777 0.6414 0.4777 0.6912
0.2933 19.0370 514 0.5265 0.5983 0.5265 0.7256
0.2933 19.1111 516 0.5318 0.6061 0.5318 0.7292
0.2933 19.1852 518 0.4955 0.6414 0.4955 0.7039
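The reported metrics correspond to the final evaluation (epoch 19.1852, step 518), not the best one: the highest validation Qwk in the table is 0.7135 at step 430. If one instead wanted to select the strongest checkpoint by validation Qwk from a log like this, a sketch (row values excerpted from the table above; the helper name is illustrative):

```python
# (epoch, step, val_loss, qwk) rows excerpted from the training log above
rows = [
    (15.9259, 430, 0.4317, 0.7135),
    (18.4444, 498, 0.5025, 0.6784),
    (19.1852, 518, 0.4955, 0.6414),
]

def best_by_qwk(rows):
    """Pick the evaluation row with the highest Qwk."""
    return max(rows, key=lambda r: r[3])

best = best_by_qwk(rows)  # the step-430 row
```

Pairing this with checkpoint saving per evaluation (e.g. `load_best_model_at_end` in the Trainer) would keep that model instead of the last one.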

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (tensor type F32, stored as Safetensors)

Model tree

This model (MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k5_task7_organization) is fine-tuned from aubmindlab/bert-base-arabertv02.