ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k12_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7642
  • Qwk: 0.5506
  • Mse: 0.7642
  • Rmse: 0.8742
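For reference, Qwk is Cohen's kappa with quadratic weights (standard for ordinal scoring tasks like this one), and Rmse is simply the square root of Mse — which is why the Loss and Mse values above coincide for an MSE-trained regressor. A minimal pure-Python sketch of all three metrics on toy scores (the example data is hypothetical, for illustration only):

```python
from math import sqrt

def mse(y_true, y_pred):
    """Mean squared error over paired score lists."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    n = len(y_true)
    # Observed contingency table of (true, predicted) label pairs.
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Marginal histograms, used to build the expected table under independence.
    hist_t = [sum(1 for t in y_true if t == i) for i in range(n_classes)]
    hist_p = [sum(1 for p in y_pred if p == i) for i in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * obs[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

# Toy 5-point essay scores (hypothetical).
y_true = [0, 1, 2, 3, 4, 4, 3, 2]
y_pred = [0, 1, 2, 2, 4, 3, 3, 2]
m = mse(y_true, y_pred)
print(round(m, 4), round(sqrt(m), 4),
      round(quadratic_weighted_kappa(y_true, y_pred, 5), 4))
```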

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
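With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 to 0 over the run (610 optimizer steps, per the results table below). A minimal sketch of that schedule — assuming zero warmup steps, which the hyperparameter list above does not state explicitly:

```python
def linear_lr(step, base_lr=2e-05, warmup_steps=0, total_steps=610):
    """Linear schedule: ramp up over warmup_steps, then decay to 0 at total_steps.

    total_steps=610 matches the final step in the results table.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, midpoint, and end of the 610-step run.
print(linear_lr(0), linear_lr(305), linear_lr(610))
```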

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0328 2 3.9815 -0.0279 3.9815 1.9954
No log 0.0656 4 2.1899 0.0138 2.1899 1.4798
No log 0.0984 6 1.2661 0.0602 1.2661 1.1252
No log 0.1311 8 1.2746 0.0220 1.2746 1.1290
No log 0.1639 10 1.0163 0.0203 1.0163 1.0081
No log 0.1967 12 0.8530 0.0464 0.8530 0.9236
No log 0.2295 14 0.7637 0.1615 0.7637 0.8739
No log 0.2623 16 0.6939 0.2250 0.6939 0.8330
No log 0.2951 18 0.6916 0.2333 0.6916 0.8316
No log 0.3279 20 0.7648 0.2171 0.7648 0.8745
No log 0.3607 22 1.1673 0.0445 1.1673 1.0804
No log 0.3934 24 2.1401 0.0578 2.1401 1.4629
No log 0.4262 26 2.3495 0.0410 2.3495 1.5328
No log 0.4590 28 1.7713 0.0854 1.7713 1.3309
No log 0.4918 30 1.4351 0.0296 1.4351 1.1979
No log 0.5246 32 1.4876 0.0678 1.4876 1.2197
No log 0.5574 34 1.2792 0.0296 1.2792 1.1310
No log 0.5902 36 0.8276 0.0873 0.8276 0.9097
No log 0.6230 38 0.7079 0.2333 0.7079 0.8414
No log 0.6557 40 0.6492 0.3162 0.6492 0.8057
No log 0.6885 42 0.6720 0.3140 0.6720 0.8198
No log 0.7213 44 0.6882 0.2812 0.6882 0.8296
No log 0.7541 46 1.0019 0.1312 1.0019 1.0009
No log 0.7869 48 1.6063 0.1855 1.6063 1.2674
No log 0.8197 50 1.6743 0.2213 1.6743 1.2939
No log 0.8525 52 1.3322 0.1395 1.3322 1.1542
No log 0.8852 54 0.9020 0.2210 0.9020 0.9498
No log 0.9180 56 0.6915 0.2453 0.6915 0.8316
No log 0.9508 58 0.6121 0.3645 0.6121 0.7824
No log 0.9836 60 0.6399 0.3223 0.6399 0.7999
No log 1.0164 62 0.6654 0.3140 0.6654 0.8157
No log 1.0492 64 0.6561 0.3140 0.6561 0.8100
No log 1.0820 66 0.6696 0.3259 0.6696 0.8183
No log 1.1148 68 0.6969 0.2571 0.6969 0.8348
No log 1.1475 70 0.7483 0.1981 0.7483 0.8650
No log 1.1803 72 0.8261 0.1717 0.8261 0.9089
No log 1.2131 74 0.9460 0.1475 0.9460 0.9726
No log 1.2459 76 0.9860 0.1260 0.9860 0.9930
No log 1.2787 78 0.8754 0.1988 0.8754 0.9356
No log 1.3115 80 0.8253 0.2062 0.8253 0.9085
No log 1.3443 82 0.7190 0.2711 0.7190 0.8479
No log 1.3770 84 0.6020 0.3639 0.6020 0.7759
No log 1.4098 86 0.5730 0.4839 0.5730 0.7570
No log 1.4426 88 0.5933 0.4442 0.5933 0.7702
No log 1.4754 90 0.6085 0.4018 0.6085 0.7801
No log 1.5082 92 0.6180 0.4548 0.6180 0.7861
No log 1.5410 94 0.6542 0.4241 0.6542 0.8088
No log 1.5738 96 0.7061 0.3962 0.7061 0.8403
No log 1.6066 98 0.7910 0.4105 0.7910 0.8894
No log 1.6393 100 0.7734 0.3839 0.7734 0.8794
No log 1.6721 102 0.6866 0.4651 0.6866 0.8286
No log 1.7049 104 0.6345 0.4781 0.6345 0.7965
No log 1.7377 106 0.5905 0.5361 0.5905 0.7685
No log 1.7705 108 0.6151 0.5407 0.6151 0.7843
No log 1.8033 110 0.7651 0.4388 0.7651 0.8747
No log 1.8361 112 0.9266 0.3780 0.9266 0.9626
No log 1.8689 114 1.0047 0.3710 1.0047 1.0023
No log 1.9016 116 0.9897 0.4034 0.9897 0.9948
No log 1.9344 118 0.8662 0.4538 0.8662 0.9307
No log 1.9672 120 0.8628 0.4633 0.8628 0.9289
No log 2.0 122 0.8223 0.4298 0.8223 0.9068
No log 2.0328 124 0.7656 0.4521 0.7656 0.8750
No log 2.0656 126 0.7145 0.4760 0.7145 0.8453
No log 2.0984 128 0.7109 0.5481 0.7109 0.8431
No log 2.1311 130 0.7140 0.5657 0.7140 0.8450
No log 2.1639 132 0.7114 0.5506 0.7114 0.8435
No log 2.1967 134 0.6777 0.5077 0.6777 0.8232
No log 2.2295 136 0.6632 0.4693 0.6632 0.8143
No log 2.2623 138 0.6685 0.4807 0.6685 0.8176
No log 2.2951 140 0.6671 0.4738 0.6671 0.8168
No log 2.3279 142 0.6725 0.4663 0.6725 0.8201
No log 2.3607 144 0.6778 0.4779 0.6778 0.8233
No log 2.3934 146 0.6724 0.4608 0.6724 0.8200
No log 2.4262 148 0.6735 0.4873 0.6735 0.8207
No log 2.4590 150 0.6741 0.5021 0.6741 0.8210
No log 2.4918 152 0.7020 0.4973 0.7020 0.8378
No log 2.5246 154 0.7446 0.5131 0.7446 0.8629
No log 2.5574 156 0.7589 0.4978 0.7589 0.8712
No log 2.5902 158 0.7896 0.4671 0.7896 0.8886
No log 2.6230 160 0.7306 0.5118 0.7306 0.8548
No log 2.6557 162 0.6967 0.4849 0.6967 0.8347
No log 2.6885 164 0.6854 0.5248 0.6854 0.8279
No log 2.7213 166 0.7114 0.5240 0.7114 0.8435
No log 2.7541 168 0.8100 0.4860 0.8100 0.9000
No log 2.7869 170 0.8054 0.4720 0.8054 0.8974
No log 2.8197 172 0.7484 0.4764 0.7484 0.8651
No log 2.8525 174 0.7273 0.4832 0.7273 0.8528
No log 2.8852 176 0.7417 0.5269 0.7417 0.8612
No log 2.9180 178 0.7569 0.4970 0.7569 0.8700
No log 2.9508 180 0.8028 0.4941 0.8028 0.8960
No log 2.9836 182 0.8990 0.4070 0.8990 0.9482
No log 3.0164 184 0.9300 0.3877 0.9300 0.9644
No log 3.0492 186 0.8755 0.4502 0.8755 0.9357
No log 3.0820 188 0.7765 0.5203 0.7765 0.8812
No log 3.1148 190 0.7445 0.5020 0.7445 0.8628
No log 3.1475 192 0.7586 0.4984 0.7586 0.8710
No log 3.1803 194 0.7754 0.4939 0.7754 0.8806
No log 3.2131 196 0.8136 0.4554 0.8136 0.9020
No log 3.2459 198 0.9143 0.3988 0.9143 0.9562
No log 3.2787 200 0.9565 0.4192 0.9565 0.9780
No log 3.3115 202 0.8978 0.4471 0.8978 0.9475
No log 3.3443 204 0.8512 0.4897 0.8512 0.9226
No log 3.3770 206 0.8140 0.4885 0.8140 0.9022
No log 3.4098 208 0.7963 0.4972 0.7963 0.8923
No log 3.4426 210 0.7847 0.5013 0.7847 0.8858
No log 3.4754 212 0.7498 0.5495 0.7498 0.8659
No log 3.5082 214 0.6836 0.5113 0.6836 0.8268
No log 3.5410 216 0.6577 0.5214 0.6577 0.8110
No log 3.5738 218 0.6592 0.5441 0.6592 0.8119
No log 3.6066 220 0.6839 0.5271 0.6839 0.8270
No log 3.6393 222 0.7249 0.5347 0.7249 0.8514
No log 3.6721 224 0.7285 0.5256 0.7285 0.8535
No log 3.7049 226 0.7307 0.5415 0.7307 0.8548
No log 3.7377 228 0.7539 0.5288 0.7539 0.8683
No log 3.7705 230 0.7598 0.5041 0.7598 0.8717
No log 3.8033 232 0.7414 0.5514 0.7414 0.8610
No log 3.8361 234 0.7305 0.5360 0.7305 0.8547
No log 3.8689 236 0.7303 0.5474 0.7303 0.8546
No log 3.9016 238 0.7604 0.4680 0.7604 0.8720
No log 3.9344 240 0.7631 0.4680 0.7631 0.8736
No log 3.9672 242 0.7347 0.5528 0.7347 0.8571
No log 4.0 244 0.7332 0.5511 0.7332 0.8563
No log 4.0328 246 0.7725 0.5455 0.7725 0.8789
No log 4.0656 248 0.8359 0.5293 0.8359 0.9143
No log 4.0984 250 0.8268 0.5426 0.8268 0.9093
No log 4.1311 252 0.7533 0.5561 0.7533 0.8680
No log 4.1639 254 0.6925 0.5274 0.6925 0.8322
No log 4.1967 256 0.6719 0.5033 0.6719 0.8197
No log 4.2295 258 0.6674 0.5113 0.6674 0.8169
No log 4.2623 260 0.6454 0.5114 0.6454 0.8034
No log 4.2951 262 0.6368 0.5302 0.6368 0.7980
No log 4.3279 264 0.7168 0.5422 0.7168 0.8467
No log 4.3607 266 0.8461 0.5481 0.8461 0.9198
No log 4.3934 268 0.8851 0.5078 0.8851 0.9408
No log 4.4262 270 0.8610 0.5071 0.8610 0.9279
No log 4.4590 272 0.8361 0.5097 0.8361 0.9144
No log 4.4918 274 0.8714 0.4876 0.8714 0.9335
No log 4.5246 276 0.8920 0.4595 0.8920 0.9445
No log 4.5574 278 0.8623 0.4948 0.8623 0.9286
No log 4.5902 280 0.8183 0.5274 0.8183 0.9046
No log 4.6230 282 0.7891 0.5073 0.7891 0.8883
No log 4.6557 284 0.7748 0.5481 0.7748 0.8802
No log 4.6885 286 0.7799 0.5320 0.7799 0.8831
No log 4.7213 288 0.7872 0.5241 0.7872 0.8873
No log 4.7541 290 0.7819 0.5241 0.7819 0.8843
No log 4.7869 292 0.7586 0.5357 0.7586 0.8710
No log 4.8197 294 0.7504 0.5458 0.7504 0.8663
No log 4.8525 296 0.7661 0.5065 0.7661 0.8753
No log 4.8852 298 0.7353 0.4938 0.7353 0.8575
No log 4.9180 300 0.6818 0.5365 0.6818 0.8257
No log 4.9508 302 0.6386 0.5245 0.6386 0.7991
No log 4.9836 304 0.6638 0.4503 0.6638 0.8147
No log 5.0164 306 0.7202 0.4384 0.7202 0.8486
No log 5.0492 308 0.7244 0.4384 0.7244 0.8511
No log 5.0820 310 0.7017 0.4785 0.7017 0.8377
No log 5.1148 312 0.6962 0.4547 0.6962 0.8344
No log 5.1475 314 0.7027 0.5203 0.7027 0.8383
No log 5.1803 316 0.7202 0.5189 0.7202 0.8487
No log 5.2131 318 0.7313 0.5187 0.7313 0.8551
No log 5.2459 320 0.7375 0.5410 0.7375 0.8588
No log 5.2787 322 0.7312 0.5109 0.7312 0.8551
No log 5.3115 324 0.7521 0.5046 0.7521 0.8672
No log 5.3443 326 0.7818 0.4721 0.7818 0.8842
No log 5.3770 328 0.7890 0.4588 0.7890 0.8882
No log 5.4098 330 0.7629 0.4635 0.7629 0.8734
No log 5.4426 332 0.7327 0.5620 0.7327 0.8560
No log 5.4754 334 0.7247 0.5031 0.7247 0.8513
No log 5.5082 336 0.7173 0.5031 0.7173 0.8469
No log 5.5410 338 0.7044 0.5214 0.7044 0.8393
No log 5.5738 340 0.6948 0.5327 0.6948 0.8336
No log 5.6066 342 0.6850 0.5185 0.6850 0.8276
No log 5.6393 344 0.6849 0.5505 0.6849 0.8276
No log 5.6721 346 0.6985 0.5197 0.6985 0.8358
No log 5.7049 348 0.7081 0.5229 0.7081 0.8415
No log 5.7377 350 0.7168 0.5229 0.7168 0.8467
No log 5.7705 352 0.7265 0.5277 0.7265 0.8524
No log 5.8033 354 0.7285 0.5277 0.7285 0.8535
No log 5.8361 356 0.7406 0.5425 0.7406 0.8606
No log 5.8689 358 0.7497 0.5507 0.7497 0.8659
No log 5.9016 360 0.7482 0.5371 0.7482 0.8650
No log 5.9344 362 0.7425 0.5143 0.7425 0.8617
No log 5.9672 364 0.7462 0.5145 0.7462 0.8638
No log 6.0 366 0.7487 0.5323 0.7487 0.8653
No log 6.0328 368 0.7527 0.5025 0.7527 0.8676
No log 6.0656 370 0.7685 0.4545 0.7685 0.8766
No log 6.0984 372 0.8009 0.4894 0.8009 0.8949
No log 6.1311 374 0.8054 0.4651 0.8054 0.8974
No log 6.1639 376 0.8040 0.4816 0.8040 0.8966
No log 6.1967 378 0.7746 0.5208 0.7746 0.8801
No log 6.2295 380 0.7414 0.5490 0.7414 0.8610
No log 6.2623 382 0.7279 0.4988 0.7279 0.8532
No log 6.2951 384 0.7268 0.4880 0.7268 0.8525
No log 6.3279 386 0.7334 0.4898 0.7334 0.8564
No log 6.3607 388 0.7277 0.5064 0.7277 0.8531
No log 6.3934 390 0.7344 0.5200 0.7344 0.8570
No log 6.4262 392 0.7628 0.5271 0.7628 0.8734
No log 6.4590 394 0.8081 0.5365 0.8081 0.8989
No log 6.4918 396 0.8372 0.4971 0.8372 0.9150
No log 6.5246 398 0.8491 0.4842 0.8491 0.9215
No log 6.5574 400 0.8242 0.4773 0.8242 0.9079
No log 6.5902 402 0.7931 0.5257 0.7931 0.8906
No log 6.6230 404 0.7767 0.5295 0.7767 0.8813
No log 6.6557 406 0.7574 0.5297 0.7574 0.8703
No log 6.6885 408 0.7510 0.4969 0.7510 0.8666
No log 6.7213 410 0.7530 0.5246 0.7530 0.8678
No log 6.7541 412 0.7467 0.5358 0.7467 0.8641
No log 6.7869 414 0.7399 0.5214 0.7399 0.8601
No log 6.8197 416 0.7249 0.4890 0.7249 0.8514
No log 6.8525 418 0.7200 0.4820 0.7200 0.8485
No log 6.8852 420 0.7249 0.4695 0.7249 0.8514
No log 6.9180 422 0.7362 0.4790 0.7362 0.8580
No log 6.9508 424 0.7486 0.4839 0.7486 0.8652
No log 6.9836 426 0.7650 0.5273 0.7650 0.8747
No log 7.0164 428 0.7757 0.5193 0.7757 0.8808
No log 7.0492 430 0.7814 0.5410 0.7814 0.8840
No log 7.0820 432 0.7711 0.5242 0.7711 0.8781
No log 7.1148 434 0.7551 0.5392 0.7551 0.8690
No log 7.1475 436 0.7543 0.5392 0.7543 0.8685
No log 7.1803 438 0.7608 0.5348 0.7608 0.8722
No log 7.2131 440 0.7824 0.5337 0.7824 0.8845
No log 7.2459 442 0.8226 0.4950 0.8226 0.9070
No log 7.2787 444 0.8514 0.4795 0.8514 0.9227
No log 7.3115 446 0.8827 0.4565 0.8827 0.9395
No log 7.3443 448 0.8856 0.4626 0.8856 0.9411
No log 7.3770 450 0.8595 0.4582 0.8595 0.9271
No log 7.4098 452 0.8577 0.4651 0.8577 0.9261
No log 7.4426 454 0.8636 0.4711 0.8636 0.9293
No log 7.4754 456 0.8483 0.4737 0.8483 0.9210
No log 7.5082 458 0.8381 0.5118 0.8381 0.9155
No log 7.5410 460 0.8411 0.5032 0.8411 0.9171
No log 7.5738 462 0.8594 0.4953 0.8594 0.9271
No log 7.6066 464 0.8630 0.4642 0.8630 0.9290
No log 7.6393 466 0.8384 0.4778 0.8384 0.9156
No log 7.6721 468 0.7999 0.5010 0.7999 0.8944
No log 7.7049 470 0.7587 0.4952 0.7587 0.8711
No log 7.7377 472 0.7255 0.5143 0.7255 0.8518
No log 7.7705 474 0.7128 0.5302 0.7128 0.8443
No log 7.8033 476 0.7147 0.5214 0.7147 0.8454
No log 7.8361 478 0.7206 0.5031 0.7206 0.8489
No log 7.8689 480 0.7219 0.5031 0.7219 0.8496
No log 7.9016 482 0.7210 0.5272 0.7210 0.8491
No log 7.9344 484 0.7202 0.5217 0.7202 0.8486
No log 7.9672 486 0.7202 0.5169 0.7202 0.8487
No log 8.0 488 0.7296 0.5214 0.7296 0.8542
No log 8.0328 490 0.7416 0.5355 0.7416 0.8612
No log 8.0656 492 0.7551 0.5355 0.7551 0.8690
No log 8.0984 494 0.7717 0.5124 0.7717 0.8785
No log 8.1311 496 0.7785 0.5043 0.7785 0.8823
No log 8.1639 498 0.7716 0.5051 0.7716 0.8784
0.4608 8.1967 500 0.7588 0.5448 0.7588 0.8711
0.4608 8.2295 502 0.7588 0.5337 0.7588 0.8711
0.4608 8.2623 504 0.7624 0.5282 0.7624 0.8732
0.4608 8.2951 506 0.7760 0.5337 0.7760 0.8809
0.4608 8.3279 508 0.7914 0.4944 0.7914 0.8896
0.4608 8.3607 510 0.8008 0.5121 0.8008 0.8949
0.4608 8.3934 512 0.8189 0.4966 0.8189 0.9049
0.4608 8.4262 514 0.8317 0.5011 0.8317 0.9120
0.4608 8.4590 516 0.8453 0.5226 0.8453 0.9194
0.4608 8.4918 518 0.8452 0.5226 0.8452 0.9193
0.4608 8.5246 520 0.8299 0.5178 0.8299 0.9110
0.4608 8.5574 522 0.8054 0.5019 0.8054 0.8974
0.4608 8.5902 524 0.7774 0.5235 0.7774 0.8817
0.4608 8.6230 526 0.7614 0.5246 0.7614 0.8726
0.4608 8.6557 528 0.7585 0.5532 0.7585 0.8709
0.4608 8.6885 530 0.7521 0.5532 0.7521 0.8672
0.4608 8.7213 532 0.7518 0.5532 0.7518 0.8671
0.4608 8.7541 534 0.7539 0.5532 0.7539 0.8683
0.4608 8.7869 536 0.7607 0.5460 0.7607 0.8722
0.4608 8.8197 538 0.7710 0.5052 0.7710 0.8781
0.4608 8.8525 540 0.7728 0.5052 0.7728 0.8791
0.4608 8.8852 542 0.7656 0.5206 0.7656 0.8750
0.4608 8.9180 544 0.7602 0.5418 0.7602 0.8719
0.4608 8.9508 546 0.7589 0.5346 0.7589 0.8711
0.4608 8.9836 548 0.7596 0.5490 0.7596 0.8716
0.4608 9.0164 550 0.7591 0.5374 0.7591 0.8713
0.4608 9.0492 552 0.7576 0.5374 0.7576 0.8704
0.4608 9.0820 554 0.7584 0.5374 0.7584 0.8708
0.4608 9.1148 556 0.7625 0.5505 0.7625 0.8732
0.4608 9.1475 558 0.7687 0.5505 0.7687 0.8767
0.4608 9.1803 560 0.7776 0.5005 0.7776 0.8818
0.4608 9.2131 562 0.7851 0.5051 0.7851 0.8861
0.4608 9.2459 564 0.7876 0.5043 0.7876 0.8875
0.4608 9.2787 566 0.7901 0.5043 0.7901 0.8889
0.4608 9.3115 568 0.7881 0.5043 0.7881 0.8878
0.4608 9.3443 570 0.7840 0.5051 0.7840 0.8854
0.4608 9.3770 572 0.7797 0.5180 0.7797 0.8830
0.4608 9.4098 574 0.7741 0.4890 0.7741 0.8798
0.4608 9.4426 576 0.7698 0.4959 0.7698 0.8774
0.4608 9.4754 578 0.7693 0.4959 0.7693 0.8771
0.4608 9.5082 580 0.7691 0.4959 0.7691 0.8770
0.4608 9.5410 582 0.7690 0.4959 0.7690 0.8769
0.4608 9.5738 584 0.7677 0.4968 0.7677 0.8762
0.4608 9.6066 586 0.7657 0.5227 0.7657 0.8751
0.4608 9.6393 588 0.7655 0.5297 0.7655 0.8749
0.4608 9.6721 590 0.7661 0.5297 0.7661 0.8753
0.4608 9.7049 592 0.7656 0.5367 0.7656 0.8750
0.4608 9.7377 594 0.7654 0.5506 0.7654 0.8749
0.4608 9.7705 596 0.7651 0.5506 0.7651 0.8747
0.4608 9.8033 598 0.7642 0.5506 0.7642 0.8742
0.4608 9.8361 600 0.7633 0.5392 0.7633 0.8737
0.4608 9.8689 602 0.7633 0.5392 0.7633 0.8737
0.4608 9.9016 604 0.7636 0.5392 0.7636 0.8738
0.4608 9.9344 606 0.7639 0.5392 0.7639 0.8740
0.4608 9.9672 608 0.7642 0.5506 0.7642 0.8742
0.4608 10.0 610 0.7642 0.5506 0.7642 0.8742
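Note that the final checkpoint (step 610) is not the best one by either metric: validation Qwk peaks at 0.5657 around step 130 (epoch ~2.13), and validation loss bottoms out at 0.5730 near step 86 (epoch ~1.41). A sketch of picking the best row, over a small hand-copied excerpt of the table above:

```python
# Excerpt of (step, validation_loss, qwk) rows from the results table above.
rows = [
    (86, 0.5730, 0.4839),
    (130, 0.7140, 0.5657),
    (214, 0.6836, 0.5113),
    (332, 0.7327, 0.5620),
    (610, 0.7642, 0.5506),
]

best_by_qwk = max(rows, key=lambda r: r[2])   # highest quadratic weighted kappa
best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
print(best_by_qwk[0], best_by_loss[0])
```

Early stopping (or loading the best checkpoint by Qwk) would likely give a stronger model than keeping the last epoch.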

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B parameters (F32 tensors, Safetensors format)