ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4603
  • Qwk (quadratic weighted kappa): 0.5868
  • Mse (mean squared error): 0.4603
  • Rmse (root mean squared error): 0.6784
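For reference, Qwk is quadratic weighted kappa and Rmse is the square root of Mse (0.6784 ≈ √0.4603). A minimal pure-Python sketch of how these metrics can be computed for integer essay scores (the function names are illustrative, not taken from the training script):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_labels):
    """Quadratic weighted kappa between two integer rating lists (labels 0..n_labels-1)."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0.0] * n_labels for _ in range(n_labels)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms; the expected matrix is their outer product divided by n
    hist_true = [y_true.count(k) for k in range(n_labels)]
    hist_pred = [y_pred.count(k) for k in range(n_labels)]
    num = den = 0.0
    for i in range(n_labels):
        for j in range(n_labels):
            w = (i - j) ** 2 / (n_labels - 1) ** 2  # quadratic disagreement penalty
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y_true, y_pred = [0, 1, 2, 2], [0, 1, 1, 2]
print(quadratic_weighted_kappa(y_true, y_pred, 3))  # 0.8
print(math.sqrt(mse(y_true, y_pred)))               # RMSE = 0.5
```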

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
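With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward zero over the scheduled training steps. A small sketch of that schedule, assuming no warmup (the default when no warmup is configured); total_steps=1000 is illustrative, not a value from this run:

```python
def linear_lr(step, base_lr=2e-5, total_steps=1000, warmup_steps=0):
    """Linear decay to zero after an optional linear warmup
    (mirrors a common 'linear' scheduler shape)."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))     # 2e-05 at the start
print(linear_lr(500))   # 1e-05 halfway through
print(linear_lr(1000))  # 0.0 at the end
```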

Training results

The training loss is shown as "No log" until the first logging step (500), where the first logged value is 0.2834.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0952 2 2.4297 -0.0262 2.4297 1.5588
No log 0.1905 4 1.2702 0.0997 1.2702 1.1271
No log 0.2857 6 0.7640 0.0481 0.7640 0.8740
No log 0.3810 8 0.8552 0.1777 0.8552 0.9248
No log 0.4762 10 0.9406 0.2782 0.9406 0.9698
No log 0.5714 12 0.7634 0.3169 0.7634 0.8737
No log 0.6667 14 0.6125 0.3974 0.6125 0.7826
No log 0.7619 16 0.8472 0.2784 0.8472 0.9204
No log 0.8571 18 1.3465 0.1670 1.3465 1.1604
No log 0.9524 20 0.9697 0.3141 0.9697 0.9848
No log 1.0476 22 0.6313 0.4373 0.6313 0.7945
No log 1.1429 24 0.6635 0.4395 0.6635 0.8146
No log 1.2381 26 0.6964 0.3598 0.6964 0.8345
No log 1.3333 28 0.8189 0.2394 0.8189 0.9049
No log 1.4286 30 1.0355 0.2496 1.0355 1.0176
No log 1.5238 32 0.8860 0.2604 0.8860 0.9413
No log 1.6190 34 0.5815 0.4918 0.5815 0.7626
No log 1.7143 36 0.5830 0.4035 0.5830 0.7636
No log 1.8095 38 0.5563 0.3863 0.5563 0.7459
No log 1.9048 40 0.7336 0.4529 0.7336 0.8565
No log 2.0 42 0.8186 0.4214 0.8186 0.9048
No log 2.0952 44 0.6870 0.4518 0.6870 0.8289
No log 2.1905 46 0.5557 0.2923 0.5557 0.7455
No log 2.2857 48 0.5541 0.3546 0.5541 0.7444
No log 2.3810 50 0.5814 0.4035 0.5814 0.7625
No log 2.4762 52 0.5613 0.3546 0.5613 0.7492
No log 2.5714 54 0.5751 0.3477 0.5751 0.7584
No log 2.6667 56 0.6260 0.4315 0.6260 0.7912
No log 2.7619 58 0.7309 0.4743 0.7309 0.8550
No log 2.8571 60 0.6274 0.4904 0.6274 0.7921
No log 2.9524 62 0.6191 0.4909 0.6191 0.7868
No log 3.0476 64 0.8185 0.4367 0.8185 0.9047
No log 3.1429 66 0.8188 0.4462 0.8188 0.9049
No log 3.2381 68 0.5840 0.5469 0.5840 0.7642
No log 3.3333 70 0.6184 0.5119 0.6184 0.7864
No log 3.4286 72 0.6822 0.4471 0.6822 0.8260
No log 3.5238 74 0.6348 0.4467 0.6348 0.7967
No log 3.6190 76 0.6006 0.5021 0.6006 0.7750
No log 3.7143 78 0.6108 0.4715 0.6108 0.7815
No log 3.8095 80 0.5955 0.4300 0.5955 0.7717
No log 3.9048 82 0.6830 0.3817 0.6830 0.8265
No log 4.0 84 0.6044 0.4206 0.6044 0.7774
No log 4.0952 86 0.5512 0.4595 0.5512 0.7424
No log 4.1905 88 0.6371 0.4914 0.6371 0.7982
No log 4.2857 90 0.5917 0.4914 0.5917 0.7692
No log 4.3810 92 0.5493 0.5784 0.5493 0.7411
No log 4.4762 94 0.5570 0.5426 0.5570 0.7463
No log 4.5714 96 0.5631 0.5013 0.5631 0.7504
No log 4.6667 98 0.6631 0.5190 0.6631 0.8143
No log 4.7619 100 0.7011 0.5326 0.7011 0.8373
No log 4.8571 102 0.6482 0.5116 0.6482 0.8051
No log 4.9524 104 0.6807 0.5387 0.6807 0.8250
No log 5.0476 106 0.7744 0.4859 0.7744 0.8800
No log 5.1429 108 0.7380 0.5017 0.7380 0.8591
No log 5.2381 110 0.6054 0.5552 0.6054 0.7781
No log 5.3333 112 0.6089 0.5489 0.6089 0.7803
No log 5.4286 114 0.5818 0.5297 0.5818 0.7628
No log 5.5238 116 0.5444 0.5692 0.5444 0.7378
No log 5.6190 118 0.5265 0.5627 0.5265 0.7256
No log 5.7143 120 0.6074 0.5180 0.6074 0.7794
No log 5.8095 122 0.7914 0.5032 0.7914 0.8896
No log 5.9048 124 0.7630 0.5032 0.7630 0.8735
No log 6.0 126 0.4982 0.5569 0.4982 0.7058
No log 6.0952 128 0.6046 0.5591 0.6046 0.7776
No log 6.1905 130 0.7159 0.4531 0.7159 0.8461
No log 6.2857 132 0.5583 0.5115 0.5583 0.7472
No log 6.3810 134 0.4806 0.6479 0.4806 0.6933
No log 6.4762 136 0.5081 0.6349 0.5081 0.7128
No log 6.5714 138 0.4908 0.5888 0.4908 0.7006
No log 6.6667 140 0.5343 0.5543 0.5343 0.7310
No log 6.7619 142 0.5480 0.5457 0.5480 0.7403
No log 6.8571 144 0.4941 0.6046 0.4941 0.7029
No log 6.9524 146 0.5727 0.6104 0.5727 0.7568
No log 7.0476 148 0.7937 0.3924 0.7937 0.8909
No log 7.1429 150 0.7572 0.4601 0.7572 0.8701
No log 7.2381 152 0.5457 0.6271 0.5457 0.7387
No log 7.3333 154 0.5306 0.5983 0.5306 0.7284
No log 7.4286 156 0.6217 0.6039 0.6217 0.7885
No log 7.5238 158 0.5339 0.6771 0.5339 0.7307
No log 7.6190 160 0.6101 0.4874 0.6101 0.7811
No log 7.7143 162 0.6721 0.4379 0.6721 0.8198
No log 7.8095 164 0.5445 0.5895 0.5445 0.7379
No log 7.9048 166 0.5870 0.5670 0.5870 0.7661
No log 8.0 168 0.7441 0.3998 0.7441 0.8626
No log 8.0952 170 0.6249 0.5538 0.6249 0.7905
No log 8.1905 172 0.4859 0.6129 0.4859 0.6971
No log 8.2857 174 0.6163 0.4874 0.6163 0.7851
No log 8.3810 176 0.7985 0.4304 0.7985 0.8936
No log 8.4762 178 0.7290 0.4733 0.7290 0.8538
No log 8.5714 180 0.5414 0.6004 0.5414 0.7358
No log 8.6667 182 0.5317 0.6078 0.5317 0.7292
No log 8.7619 184 0.5624 0.4874 0.5624 0.7500
No log 8.8571 186 0.5465 0.5269 0.5465 0.7393
No log 8.9524 188 0.5037 0.6566 0.5037 0.7097
No log 9.0476 190 0.5014 0.5841 0.5014 0.7081
No log 9.1429 192 0.4799 0.6114 0.4799 0.6927
No log 9.2381 194 0.4866 0.5218 0.4866 0.6976
No log 9.3333 196 0.6127 0.5510 0.6127 0.7828
No log 9.4286 198 0.6424 0.5471 0.6424 0.8015
No log 9.5238 200 0.5309 0.6022 0.5309 0.7286
No log 9.6190 202 0.5097 0.6860 0.5097 0.7139
No log 9.7143 204 0.5806 0.4562 0.5806 0.7620
No log 9.8095 206 0.6757 0.4890 0.6757 0.8220
No log 9.9048 208 0.5996 0.4834 0.5996 0.7744
No log 10.0 210 0.4899 0.5956 0.4899 0.6999
No log 10.0952 212 0.5686 0.5327 0.5686 0.7540
No log 10.1905 214 0.6456 0.5131 0.6456 0.8035
No log 10.2857 216 0.6218 0.4911 0.6218 0.7886
No log 10.3810 218 0.5267 0.5577 0.5267 0.7257
No log 10.4762 220 0.5290 0.5577 0.5290 0.7273
No log 10.5714 222 0.5109 0.5753 0.5109 0.7148
No log 10.6667 224 0.5109 0.6171 0.5109 0.7148
No log 10.7619 226 0.5239 0.5784 0.5239 0.7238
No log 10.8571 228 0.5277 0.5770 0.5277 0.7265
No log 10.9524 230 0.5380 0.5473 0.5380 0.7335
No log 11.0476 232 0.5064 0.5974 0.5064 0.7116
No log 11.1429 234 0.4799 0.5768 0.4799 0.6928
No log 11.2381 236 0.4759 0.6479 0.4759 0.6898
No log 11.3333 238 0.5116 0.6182 0.5116 0.7152
No log 11.4286 240 0.5199 0.6108 0.5199 0.7210
No log 11.5238 242 0.4781 0.6587 0.4781 0.6915
No log 11.6190 244 0.4803 0.6870 0.4803 0.6930
No log 11.7143 246 0.4908 0.6066 0.4908 0.7006
No log 11.8095 248 0.4856 0.6419 0.4856 0.6968
No log 11.9048 250 0.4742 0.6960 0.4742 0.6886
No log 12.0 252 0.4625 0.6435 0.4625 0.6801
No log 12.0952 254 0.4635 0.6344 0.4635 0.6808
No log 12.1905 256 0.4802 0.6197 0.4802 0.6929
No log 12.2857 258 0.5810 0.5460 0.5810 0.7622
No log 12.3810 260 0.5485 0.5599 0.5485 0.7406
No log 12.4762 262 0.4882 0.5888 0.4882 0.6987
No log 12.5714 264 0.5299 0.5894 0.5299 0.7280
No log 12.6667 266 0.5455 0.5291 0.5455 0.7386
No log 12.7619 268 0.4884 0.6046 0.4884 0.6988
No log 12.8571 270 0.5651 0.4949 0.5651 0.7517
No log 12.9524 272 0.6203 0.4598 0.6203 0.7876
No log 13.0476 274 0.6118 0.4598 0.6118 0.7822
No log 13.1429 276 0.5214 0.5811 0.5214 0.7221
No log 13.2381 278 0.4829 0.6168 0.4829 0.6949
No log 13.3333 280 0.4745 0.6068 0.4745 0.6888
No log 13.4286 282 0.4751 0.5846 0.4751 0.6893
No log 13.5238 284 0.4755 0.6053 0.4755 0.6896
No log 13.6190 286 0.4954 0.6197 0.4954 0.7039
No log 13.7143 288 0.5230 0.6047 0.5230 0.7232
No log 13.8095 290 0.4942 0.6114 0.4942 0.7030
No log 13.9048 292 0.4821 0.5883 0.4821 0.6944
No log 14.0 294 0.5328 0.5086 0.5328 0.7299
No log 14.0952 296 0.6564 0.4844 0.6564 0.8102
No log 14.1905 298 0.6467 0.5185 0.6467 0.8042
No log 14.2857 300 0.5567 0.4827 0.5567 0.7461
No log 14.3810 302 0.4559 0.5915 0.4559 0.6752
No log 14.4762 304 0.4862 0.6034 0.4862 0.6973
No log 14.5714 306 0.4764 0.6114 0.4764 0.6902
No log 14.6667 308 0.4461 0.6479 0.4461 0.6679
No log 14.7619 310 0.4502 0.6395 0.4502 0.6709
No log 14.8571 312 0.4646 0.6763 0.4646 0.6816
No log 14.9524 314 0.5313 0.6311 0.5313 0.7289
No log 15.0476 316 0.6361 0.5729 0.6361 0.7976
No log 15.1429 318 0.6057 0.5564 0.6057 0.7783
No log 15.2381 320 0.5834 0.5625 0.5834 0.7638
No log 15.3333 322 0.5507 0.5312 0.5507 0.7421
No log 15.4286 324 0.4752 0.6402 0.4752 0.6893
No log 15.5238 326 0.4535 0.6242 0.4535 0.6734
No log 15.6190 328 0.4761 0.5472 0.4761 0.6900
No log 15.7143 330 0.4695 0.5625 0.4695 0.6852
No log 15.8095 332 0.4738 0.6289 0.4738 0.6883
No log 15.9048 334 0.4870 0.6772 0.4870 0.6978
No log 16.0 336 0.5018 0.6449 0.5018 0.7084
No log 16.0952 338 0.5008 0.5751 0.5008 0.7076
No log 16.1905 340 0.4643 0.6423 0.4643 0.6814
No log 16.2857 342 0.4562 0.6133 0.4562 0.6754
No log 16.3810 344 0.4477 0.6210 0.4477 0.6691
No log 16.4762 346 0.4662 0.6092 0.4662 0.6828
No log 16.5714 348 0.4786 0.6092 0.4786 0.6918
No log 16.6667 350 0.4699 0.6127 0.4699 0.6855
No log 16.7619 352 0.4366 0.6269 0.4366 0.6608
No log 16.8571 354 0.4522 0.5533 0.4522 0.6725
No log 16.9524 356 0.4495 0.5455 0.4495 0.6704
No log 17.0476 358 0.4460 0.5784 0.4460 0.6678
No log 17.1429 360 0.4349 0.5831 0.4349 0.6594
No log 17.2381 362 0.4478 0.6592 0.4478 0.6692
No log 17.3333 364 0.4805 0.6501 0.4805 0.6932
No log 17.4286 366 0.4754 0.6592 0.4754 0.6895
No log 17.5238 368 0.5095 0.6223 0.5095 0.7138
No log 17.6190 370 0.5096 0.6223 0.5096 0.7139
No log 17.7143 372 0.4656 0.6383 0.4656 0.6823
No log 17.8095 374 0.4435 0.5267 0.4435 0.6660
No log 17.9048 376 0.4683 0.6052 0.4683 0.6843
No log 18.0 378 0.4615 0.5852 0.4615 0.6793
No log 18.0952 380 0.4376 0.6770 0.4376 0.6615
No log 18.1905 382 0.4471 0.6501 0.4471 0.6687
No log 18.2857 384 0.4502 0.6501 0.4502 0.6710
No log 18.3810 386 0.4436 0.6168 0.4436 0.6660
No log 18.4762 388 0.4438 0.5986 0.4438 0.6662
No log 18.5714 390 0.4368 0.6464 0.4368 0.6609
No log 18.6667 392 0.4368 0.6383 0.4368 0.6609
No log 18.7619 394 0.4618 0.6330 0.4618 0.6796
No log 18.8571 396 0.4537 0.6414 0.4537 0.6736
No log 18.9524 398 0.4334 0.6402 0.4334 0.6584
No log 19.0476 400 0.4326 0.6068 0.4326 0.6577
No log 19.1429 402 0.4364 0.6648 0.4364 0.6606
No log 19.2381 404 0.4467 0.5214 0.4467 0.6684
No log 19.3333 406 0.4495 0.5631 0.4495 0.6704
No log 19.4286 408 0.4489 0.5503 0.4489 0.6700
No log 19.5238 410 0.4495 0.6154 0.4495 0.6704
No log 19.6190 412 0.4748 0.6501 0.4748 0.6890
No log 19.7143 414 0.5137 0.5836 0.5137 0.7168
No log 19.8095 416 0.5094 0.5765 0.5094 0.7137
No log 19.9048 418 0.5188 0.5765 0.5188 0.7203
No log 20.0 420 0.4655 0.6526 0.4655 0.6823
No log 20.0952 422 0.4359 0.6235 0.4359 0.6602
No log 20.1905 424 0.4244 0.6402 0.4244 0.6515
No log 20.2857 426 0.4261 0.5815 0.4261 0.6528
No log 20.3810 428 0.4283 0.5732 0.4283 0.6545
No log 20.4762 430 0.4872 0.5544 0.4872 0.6980
No log 20.5714 432 0.5217 0.5219 0.5217 0.7223
No log 20.6667 434 0.4722 0.5544 0.4722 0.6872
No log 20.7619 436 0.4252 0.5267 0.4252 0.6521
No log 20.8571 438 0.4425 0.5731 0.4425 0.6652
No log 20.9524 440 0.4795 0.6223 0.4795 0.6925
No log 21.0476 442 0.4549 0.6013 0.4549 0.6745
No log 21.1429 444 0.4134 0.6282 0.4134 0.6430
No log 21.2381 446 0.4518 0.5692 0.4518 0.6721
No log 21.3333 448 0.5425 0.5455 0.5425 0.7365
No log 21.4286 450 0.5428 0.5455 0.5428 0.7368
No log 21.5238 452 0.4761 0.5692 0.4761 0.6900
No log 21.6190 454 0.4221 0.5567 0.4221 0.6497
No log 21.7143 456 0.4478 0.6308 0.4478 0.6692
No log 21.8095 458 0.4755 0.6223 0.4755 0.6896
No log 21.9048 460 0.4585 0.6223 0.4585 0.6771
No log 22.0 462 0.4348 0.6477 0.4348 0.6594
No log 22.0952 464 0.4351 0.6377 0.4351 0.6596
No log 22.1905 466 0.4392 0.6771 0.4392 0.6627
No log 22.2857 468 0.4476 0.6782 0.4476 0.6691
No log 22.3810 470 0.4445 0.6683 0.4445 0.6667
No log 22.4762 472 0.4412 0.6771 0.4412 0.6642
No log 22.5714 474 0.4473 0.6589 0.4473 0.6688
No log 22.6667 476 0.4748 0.6501 0.4748 0.6890
No log 22.7619 478 0.5012 0.6223 0.5012 0.7080
No log 22.8571 480 0.4727 0.6501 0.4727 0.6875
No log 22.9524 482 0.4492 0.6501 0.4492 0.6703
No log 23.0476 484 0.4354 0.6255 0.4354 0.6599
No log 23.1429 486 0.4355 0.6464 0.4355 0.6600
No log 23.2381 488 0.4462 0.6501 0.4462 0.6680
No log 23.3333 490 0.4428 0.6282 0.4428 0.6654
No log 23.4286 492 0.4455 0.6530 0.4455 0.6674
No log 23.5238 494 0.4481 0.6530 0.4481 0.6694
No log 23.6190 496 0.4394 0.6364 0.4394 0.6629
No log 23.7143 498 0.4540 0.6514 0.4540 0.6738
0.2834 23.8095 500 0.4647 0.6431 0.4647 0.6817
0.2834 23.9048 502 0.4597 0.6431 0.4597 0.6780
0.2834 24.0 504 0.4583 0.6431 0.4583 0.6770
0.2834 24.0952 506 0.4458 0.6223 0.4458 0.6677
0.2834 24.1905 508 0.4340 0.5831 0.4340 0.6588
0.2834 24.2857 510 0.4603 0.5868 0.4603 0.6784

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

0.1B parameters (Safetensors, F32 tensors)
Full model ID: MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k4_task7_organization, fine-tuned from aubmindlab/bert-base-arabertv02.