ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k12_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5590
  • Qwk: 0.3806
  • Mse: 0.5590
  • Rmse: 0.7477
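Note that Loss and Mse coincide, consistent with a mean-squared-error (regression) objective, and Rmse is simply √Mse (√0.5590 ≈ 0.7477). A minimal sketch of how these metrics can be reproduced with scikit-learn (rounding predictions to integers before computing QWK is an assumption, not documented in this card):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(preds, labels):
    """Reproduce the card's metrics from raw model outputs.

    preds: continuous regression-head outputs; labels: gold scores.
    QWK is computed on rounded values (assumption); MSE/RMSE on raw values.
    """
    mse = float(mean_squared_error(labels, preds))
    rmse = float(np.sqrt(mse))  # RMSE is just the square root of MSE
    qwk = float(cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",    # quadratic-weighted kappa
    ))
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# toy check: identical predictions give perfect agreement and zero error
print(eval_metrics(np.array([1.0, 2.0, 3.0]), np.array([1, 2, 3])))
# → {'qwk': 1.0, 'mse': 0.0, 'rmse': 0.0}
```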

Model description

More information needed

Intended uses & limitations

More information needed
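No usage guidance is given. Purely as an illustration, a hedged sketch of scoring a single essay, assuming (not confirmed by this card) that the checkpoint exposes a single-logit regression head via AutoModelForSequenceClassification:

```python
import torch

def score_essay(essay: str, tokenizer, model) -> float:
    """Return a scalar organization score for one essay.

    Assumes a single-logit regression head; this is inferred from the
    card's MSE-style metrics, not a documented fact.
    """
    inputs = tokenizer(essay, truncation=True, max_length=512,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # shape (1, 1) under the assumption
    return logits.squeeze().item()

# Hypothetical usage (downloads the checkpoint from the Hub):
# from transformers import AutoTokenizer, AutoModelForSequenceClassification
# repo = ("MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
#         "FineTuningAraBERT_run2_AugV5_k12_task7_organization")
# tok = AutoTokenizer.from_pretrained(repo)
# mdl = AutoModelForSequenceClassification.from_pretrained(repo)
# print(score_essay("نص المقال هنا", tok, mdl))
```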

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
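The list above maps onto Hugging Face TrainingArguments keywords roughly as follows (a sketch: output_dir is a placeholder, train_batch_size is assumed to be per-device, and anything not listed keeps the Trainer default):

```python
# Hyperparameters above, expressed as TrainingArguments keywords.
# output_dir is a placeholder; unlisted settings keep Trainer defaults.
training_kwargs = dict(
    output_dir="out",              # placeholder, not from the card
    learning_rate=2e-05,
    per_device_train_batch_size=8, # assumed per-device
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# usage (requires transformers):
# from transformers import TrainingArguments
# args = TrainingArguments(**training_kwargs)
```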

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0323 2 2.6640 -0.0829 2.6640 1.6322
No log 0.0645 4 1.3102 0.0433 1.3102 1.1446
No log 0.0968 6 0.8087 0.0501 0.8087 0.8993
No log 0.1290 8 0.8015 0.1737 0.8015 0.8953
No log 0.1613 10 0.9918 0.1827 0.9918 0.9959
No log 0.1935 12 1.0051 0.1869 1.0051 1.0026
No log 0.2258 14 0.8313 0.1718 0.8313 0.9118
No log 0.2581 16 0.7197 0.0444 0.7197 0.8483
No log 0.2903 18 0.7514 0.1786 0.7514 0.8668
No log 0.3226 20 0.7868 0.1786 0.7868 0.8870
No log 0.3548 22 0.7410 0.1372 0.7410 0.8608
No log 0.3871 24 0.7116 0.0937 0.7116 0.8436
No log 0.4194 26 0.6876 0.1282 0.6876 0.8292
No log 0.4516 28 0.7618 0.0541 0.7618 0.8728
No log 0.4839 30 0.7994 0.2193 0.7994 0.8941
No log 0.5161 32 0.7409 0.3384 0.7409 0.8607
No log 0.5484 34 0.6461 0.3886 0.6461 0.8038
No log 0.5806 36 0.6457 0.4364 0.6457 0.8035
No log 0.6129 38 0.6898 0.3329 0.6898 0.8305
No log 0.6452 40 0.7463 0.3450 0.7463 0.8639
No log 0.6774 42 0.7413 0.3280 0.7413 0.8610
No log 0.7097 44 0.6439 0.5003 0.6439 0.8024
No log 0.7419 46 0.6771 0.4260 0.6771 0.8229
No log 0.7742 48 0.7559 0.4153 0.7559 0.8694
No log 0.8065 50 0.6414 0.4819 0.6414 0.8009
No log 0.8387 52 0.6147 0.5663 0.6147 0.7840
No log 0.8710 54 0.5852 0.5039 0.5852 0.7650
No log 0.9032 56 0.6278 0.3615 0.6278 0.7924
No log 0.9355 58 0.7649 0.4230 0.7649 0.8746
No log 0.9677 60 1.0088 0.2683 1.0088 1.0044
No log 1.0 62 1.0976 0.2589 1.0976 1.0477
No log 1.0323 64 0.9123 0.3395 0.9123 0.9551
No log 1.0645 66 0.6804 0.3329 0.6804 0.8249
No log 1.0968 68 0.6931 0.4466 0.6931 0.8325
No log 1.1290 70 0.6834 0.4006 0.6834 0.8267
No log 1.1613 72 0.8086 0.3297 0.8086 0.8992
No log 1.1935 74 0.9298 0.3735 0.9298 0.9643
No log 1.2258 76 0.9383 0.2951 0.9383 0.9687
No log 1.2581 78 0.6904 0.4036 0.6904 0.8309
No log 1.2903 80 0.8048 0.3696 0.8048 0.8971
No log 1.3226 82 0.7352 0.4282 0.7352 0.8574
No log 1.3548 84 0.7465 0.2604 0.7465 0.8640
No log 1.3871 86 1.3041 0.2619 1.3041 1.1420
No log 1.4194 88 1.4378 0.2133 1.4378 1.1991
No log 1.4516 90 1.0859 0.2545 1.0859 1.0421
No log 1.4839 92 0.6468 0.4027 0.6468 0.8042
No log 1.5161 94 0.6289 0.4307 0.6289 0.7930
No log 1.5484 96 0.6069 0.3806 0.6069 0.7791
No log 1.5806 98 0.6277 0.3840 0.6277 0.7922
No log 1.6129 100 0.7870 0.3582 0.7870 0.8871
No log 1.6452 102 0.8143 0.3803 0.8143 0.9024
No log 1.6774 104 0.7082 0.3963 0.7082 0.8415
No log 1.7097 106 0.6000 0.5379 0.6000 0.7746
No log 1.7419 108 0.6071 0.5235 0.6071 0.7792
No log 1.7742 110 0.5993 0.5389 0.5993 0.7742
No log 1.8065 112 0.5958 0.5619 0.5958 0.7719
No log 1.8387 114 0.6017 0.4675 0.6017 0.7757
No log 1.8710 116 0.6437 0.5244 0.6437 0.8023
No log 1.9032 118 0.9048 0.4378 0.9048 0.9512
No log 1.9355 120 0.8661 0.4305 0.8661 0.9307
No log 1.9677 122 0.6552 0.5447 0.6552 0.8094
No log 2.0 124 0.6141 0.5270 0.6141 0.7837
No log 2.0323 126 0.5899 0.5037 0.5899 0.7681
No log 2.0645 128 0.5915 0.5647 0.5915 0.7691
No log 2.0968 130 0.5885 0.4743 0.5885 0.7672
No log 2.1290 132 0.6563 0.4444 0.6563 0.8101
No log 2.1613 134 0.8819 0.4265 0.8819 0.9391
No log 2.1935 136 0.7993 0.4548 0.7993 0.8940
No log 2.2258 138 0.5587 0.6334 0.5587 0.7475
No log 2.2581 140 0.6347 0.5217 0.6347 0.7967
No log 2.2903 142 0.6625 0.5030 0.6625 0.8140
No log 2.3226 144 0.5615 0.5634 0.5615 0.7493
No log 2.3548 146 0.5624 0.5377 0.5624 0.7499
No log 2.3871 148 0.5711 0.6272 0.5711 0.7557
No log 2.4194 150 0.5280 0.6065 0.5280 0.7266
No log 2.4516 152 0.4968 0.5782 0.4968 0.7048
No log 2.4839 154 0.5266 0.5352 0.5266 0.7256
No log 2.5161 156 0.4947 0.5143 0.4947 0.7034
No log 2.5484 158 0.5104 0.6101 0.5104 0.7144
No log 2.5806 160 0.5384 0.6087 0.5384 0.7338
No log 2.6129 162 0.4979 0.5476 0.4979 0.7056
No log 2.6452 164 0.5185 0.5117 0.5185 0.7201
No log 2.6774 166 0.6137 0.4204 0.6137 0.7834
No log 2.7097 168 0.5894 0.5015 0.5894 0.7677
No log 2.7419 170 0.5132 0.5677 0.5132 0.7164
No log 2.7742 172 0.7623 0.3970 0.7623 0.8731
No log 2.8065 174 0.9506 0.3284 0.9506 0.9750
No log 2.8387 176 0.7939 0.3909 0.7939 0.8910
No log 2.8710 178 0.5548 0.6087 0.5548 0.7449
No log 2.9032 180 0.5512 0.5267 0.5512 0.7425
No log 2.9355 182 0.6022 0.5368 0.6022 0.7760
No log 2.9677 184 0.5388 0.5087 0.5388 0.7340
No log 3.0 186 0.6090 0.5237 0.6090 0.7804
No log 3.0323 188 0.8247 0.3849 0.8247 0.9081
No log 3.0645 190 0.8771 0.4253 0.8771 0.9365
No log 3.0968 192 0.7489 0.4917 0.7489 0.8654
No log 3.1290 194 0.5940 0.5223 0.5940 0.7707
No log 3.1613 196 0.5598 0.4555 0.5598 0.7482
No log 3.1935 198 0.5568 0.4678 0.5568 0.7462
No log 3.2258 200 0.5787 0.5237 0.5787 0.7607
No log 3.2581 202 0.7214 0.4366 0.7214 0.8494
No log 3.2903 204 0.9256 0.4149 0.9256 0.9621
No log 3.3226 206 0.9664 0.3450 0.9664 0.9831
No log 3.3548 208 0.8105 0.4208 0.8105 0.9003
No log 3.3871 210 0.6001 0.5395 0.6001 0.7747
No log 3.4194 212 0.6257 0.5168 0.6257 0.7910
No log 3.4516 214 0.6950 0.4618 0.6950 0.8337
No log 3.4839 216 0.6354 0.5106 0.6354 0.7971
No log 3.5161 218 0.5754 0.5411 0.5754 0.7586
No log 3.5484 220 0.7911 0.4630 0.7911 0.8894
No log 3.5806 222 0.9074 0.3807 0.9074 0.9526
No log 3.6129 224 0.7592 0.4556 0.7592 0.8713
No log 3.6452 226 0.5787 0.4763 0.5787 0.7607
No log 3.6774 228 0.5889 0.5467 0.5889 0.7674
No log 3.7097 230 0.6048 0.5283 0.6048 0.7777
No log 3.7419 232 0.5767 0.4782 0.5767 0.7594
No log 3.7742 234 0.6094 0.4864 0.6094 0.7807
No log 3.8065 236 0.6675 0.4335 0.6675 0.8170
No log 3.8387 238 0.6659 0.4335 0.6659 0.8160
No log 3.8710 240 0.6182 0.4397 0.6182 0.7863
No log 3.9032 242 0.6271 0.3840 0.6271 0.7919
No log 3.9355 244 0.7117 0.3723 0.7117 0.8436
No log 3.9677 246 0.8276 0.3889 0.8276 0.9097
No log 4.0 248 0.8141 0.4296 0.8141 0.9023
No log 4.0323 250 0.7035 0.4587 0.7035 0.8388
No log 4.0645 252 0.5749 0.4847 0.5749 0.7582
No log 4.0968 254 0.5848 0.4171 0.5848 0.7647
No log 4.1290 256 0.5782 0.4149 0.5782 0.7604
No log 4.1613 258 0.5717 0.4914 0.5717 0.7561
No log 4.1935 260 0.6925 0.4587 0.6925 0.8321
No log 4.2258 262 0.7781 0.4723 0.7781 0.8821
No log 4.2581 264 0.7626 0.4801 0.7626 0.8733
No log 4.2903 266 0.6307 0.4167 0.6307 0.7942
No log 4.3226 268 0.5509 0.2685 0.5509 0.7423
No log 4.3548 270 0.5509 0.3011 0.5509 0.7422
No log 4.3871 272 0.5690 0.3296 0.5690 0.7543
No log 4.4194 274 0.6379 0.3723 0.6379 0.7987
No log 4.4516 276 0.6536 0.4587 0.6536 0.8085
No log 4.4839 278 0.6044 0.3891 0.6044 0.7774
No log 4.5161 280 0.5723 0.4137 0.5723 0.7565
No log 4.5484 282 0.6089 0.3779 0.6089 0.7803
No log 4.5806 284 0.6356 0.4074 0.6356 0.7973
No log 4.6129 286 0.6322 0.3522 0.6322 0.7951
No log 4.6452 288 0.6065 0.3001 0.6065 0.7788
No log 4.6774 290 0.5964 0.3258 0.5964 0.7723
No log 4.7097 292 0.6142 0.2981 0.6142 0.7837
No log 4.7419 294 0.6381 0.4190 0.6381 0.7988
No log 4.7742 296 0.6293 0.3615 0.6293 0.7933
No log 4.8065 298 0.6144 0.3891 0.6144 0.7838
No log 4.8387 300 0.6137 0.4448 0.6137 0.7834
No log 4.8710 302 0.5962 0.4444 0.5962 0.7721
No log 4.9032 304 0.5659 0.4288 0.5659 0.7522
No log 4.9355 306 0.5523 0.4364 0.5523 0.7432
No log 4.9677 308 0.5575 0.4938 0.5575 0.7467
No log 5.0 310 0.5542 0.4569 0.5542 0.7444
No log 5.0323 312 0.5691 0.3504 0.5691 0.7544
No log 5.0645 314 0.5807 0.3506 0.5807 0.7621
No log 5.0968 316 0.5754 0.3465 0.5754 0.7586
No log 5.1290 318 0.5593 0.4029 0.5593 0.7479
No log 5.1613 320 0.5551 0.4182 0.5551 0.7451
No log 5.1935 322 0.5547 0.3616 0.5547 0.7448
No log 5.2258 324 0.6202 0.4980 0.6202 0.7875
No log 5.2581 326 0.7866 0.5177 0.7866 0.8869
No log 5.2903 328 0.9957 0.4491 0.9957 0.9979
No log 5.3226 330 0.9026 0.4758 0.9026 0.9501
No log 5.3548 332 0.6774 0.4819 0.6774 0.8230
No log 5.3871 334 0.5833 0.3545 0.5833 0.7637
No log 5.4194 336 0.5493 0.3599 0.5493 0.7411
No log 5.4516 338 0.5745 0.3840 0.5745 0.7579
No log 5.4839 340 0.6080 0.4602 0.6080 0.7797
No log 5.5161 342 0.6253 0.4522 0.6253 0.7908
No log 5.5484 344 0.5821 0.5081 0.5821 0.7630
No log 5.5806 346 0.5627 0.4746 0.5627 0.7501
No log 5.6129 348 0.6057 0.4489 0.6057 0.7782
No log 5.6452 350 0.5954 0.4468 0.5954 0.7716
No log 5.6774 352 0.5686 0.4742 0.5686 0.7540
No log 5.7097 354 0.5941 0.4206 0.5941 0.7707
No log 5.7419 356 0.6209 0.3688 0.6209 0.7880
No log 5.7742 358 0.6281 0.3545 0.6281 0.7925
No log 5.8065 360 0.6014 0.3545 0.6014 0.7755
No log 5.8387 362 0.5631 0.4338 0.5631 0.7504
No log 5.8710 364 0.5573 0.4019 0.5573 0.7465
No log 5.9032 366 0.5570 0.4429 0.5570 0.7463
No log 5.9355 368 0.5530 0.4019 0.5530 0.7437
No log 5.9677 370 0.5828 0.4234 0.5828 0.7634
No log 6.0 372 0.5876 0.3769 0.5876 0.7666
No log 6.0323 374 0.5623 0.4314 0.5623 0.7499
No log 6.0645 376 0.5549 0.4322 0.5549 0.7449
No log 6.0968 378 0.5899 0.4412 0.5899 0.7680
No log 6.1290 380 0.6257 0.4732 0.6257 0.7910
No log 6.1613 382 0.5924 0.4633 0.5924 0.7697
No log 6.1935 384 0.5476 0.5430 0.5476 0.7400
No log 6.2258 386 0.5632 0.5752 0.5632 0.7505
No log 6.2581 388 0.5575 0.4964 0.5575 0.7467
No log 6.2903 390 0.5506 0.5170 0.5506 0.7420
No log 6.3226 392 0.5686 0.3840 0.5686 0.7541
No log 6.3548 394 0.5741 0.3840 0.5741 0.7577
No log 6.3871 396 0.5612 0.4354 0.5612 0.7491
No log 6.4194 398 0.5606 0.4354 0.5606 0.7488
No log 6.4516 400 0.5703 0.3919 0.5703 0.7552
No log 6.4839 402 0.5668 0.4044 0.5668 0.7529
No log 6.5161 404 0.5470 0.5114 0.5470 0.7396
No log 6.5484 406 0.5567 0.4452 0.5567 0.7461
No log 6.5806 408 0.5656 0.4198 0.5656 0.7521
No log 6.6129 410 0.5553 0.4569 0.5553 0.7452
No log 6.6452 412 0.5775 0.4883 0.5775 0.7599
No log 6.6774 414 0.5789 0.4883 0.5789 0.7609
No log 6.7097 416 0.5665 0.4838 0.5665 0.7527
No log 6.7419 418 0.5784 0.4227 0.5784 0.7606
No log 6.7742 420 0.6104 0.3854 0.6104 0.7813
No log 6.8065 422 0.6571 0.4225 0.6571 0.8106
No log 6.8387 424 0.6265 0.3754 0.6265 0.7915
No log 6.8710 426 0.5769 0.4345 0.5769 0.7595
No log 6.9032 428 0.6041 0.4835 0.6041 0.7773
No log 6.9355 430 0.7302 0.4592 0.7302 0.8545
No log 6.9677 432 0.7549 0.4592 0.7549 0.8689
No log 7.0 434 0.7521 0.4666 0.7521 0.8672
No log 7.0323 436 0.7703 0.4286 0.7703 0.8777
No log 7.0645 438 0.7254 0.4738 0.7254 0.8517
No log 7.0968 440 0.6159 0.4978 0.6159 0.7848
No log 7.1290 442 0.5702 0.3762 0.5702 0.7551
No log 7.1613 444 0.5626 0.4194 0.5626 0.7501
No log 7.1935 446 0.5824 0.4234 0.5824 0.7632
No log 7.2258 448 0.6469 0.4741 0.6469 0.8043
No log 7.2581 450 0.7013 0.4812 0.7013 0.8374
No log 7.2903 452 0.6725 0.4741 0.6725 0.8200
No log 7.3226 454 0.5940 0.4663 0.5940 0.7707
No log 7.3548 456 0.5643 0.4547 0.5643 0.7512
No log 7.3871 458 0.5540 0.3530 0.5540 0.7443
No log 7.4194 460 0.5530 0.4837 0.5530 0.7436
No log 7.4516 462 0.5707 0.5086 0.5707 0.7554
No log 7.4839 464 0.6165 0.4745 0.6165 0.7852
No log 7.5161 466 0.6170 0.4745 0.6170 0.7855
No log 7.5484 468 0.5626 0.5291 0.5626 0.7501
No log 7.5806 470 0.5394 0.5860 0.5394 0.7344
No log 7.6129 472 0.5625 0.5200 0.5625 0.7500
No log 7.6452 474 0.5727 0.5414 0.5727 0.7567
No log 7.6774 476 0.5584 0.5200 0.5584 0.7472
No log 7.7097 478 0.5266 0.5252 0.5266 0.7257
No log 7.7419 480 0.5460 0.5308 0.5460 0.7389
No log 7.7742 482 0.5878 0.5291 0.5878 0.7667
No log 7.8065 484 0.6486 0.4964 0.6486 0.8054
No log 7.8387 486 0.5971 0.5342 0.5971 0.7728
No log 7.8710 488 0.5331 0.5937 0.5331 0.7301
No log 7.9032 490 0.5499 0.5087 0.5499 0.7415
No log 7.9355 492 0.5776 0.5217 0.5776 0.7600
No log 7.9677 494 0.5710 0.4913 0.5710 0.7557
No log 8.0 496 0.5418 0.4847 0.5418 0.7361
No log 8.0323 498 0.5756 0.5373 0.5756 0.7587
0.3189 8.0645 500 0.6319 0.4550 0.6319 0.7949
0.3189 8.0968 502 0.6048 0.5373 0.6048 0.7777
0.3189 8.1290 504 0.5599 0.5617 0.5599 0.7483
0.3189 8.1613 506 0.5563 0.4322 0.5563 0.7458
0.3189 8.1935 508 0.5734 0.4367 0.5734 0.7572
0.3189 8.2258 510 0.5705 0.4092 0.5705 0.7553
0.3189 8.2581 512 0.5600 0.4068 0.5600 0.7483
0.3189 8.2903 514 0.5590 0.3806 0.5590 0.7477
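Two features of the table are worth decoding. "No log" in the Training Loss column means no training-loss logging had happened yet: the first logged value, 0.3189, appears at step 500, which matches the Trainer's default logging interval. The headline metrics at the top of the card are simply the final-step (514) evaluation row. The Epoch/Step columns also imply the dataset size; a quick sanity check of that arithmetic (the ≈-size is an inference, not stated in the card):

```python
# Reading the table: evaluation ran every 2 optimizer steps, and
# epoch 1.0 lands on step 62, so with train_batch_size=8 the training
# set holds at most 62 * 8 = 496 examples (an inference, not stated).
batch_size = 8
steps_per_epoch = 62
max_train_examples = steps_per_epoch * batch_size
print(max_train_examples)  # → 496

# Cross-check against other rows: epoch 2.0 ↔ step 124,
# epoch 8.0645 ↔ step 500.
assert 124 / steps_per_epoch == 2.0
assert round(500 / steps_per_epoch, 4) == 8.0645
```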

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (F32, Safetensors)