ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k5_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4690
  • QWK (quadratic weighted kappa): 0.4752
  • MSE: 0.4690
  • RMSE: 0.6848
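QWK here is the quadratic weighted kappa commonly used for ordinal essay-scoring tasks, and RMSE is simply the square root of the reported MSE. The card does not include the evaluation code; the following is a minimal pure-Python sketch of the metric, assuming integer labels in 0..num_classes-1:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic disagreement weights over integer labels."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under independence of the two label distributions.
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(num_classes)) for j in range(num_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(num_classes)]
                for i in range(num_classes)]
    # Quadratic weights: disagreement grows with the squared label distance.
    w = [[(i - j) ** 2 / (num_classes - 1) ** 2 for j in range(num_classes)]
         for i in range(num_classes)]
    num = sum(w[i][j] * observed[i][j] for i in range(num_classes) for j in range(num_classes))
    den = sum(w[i][j] * expected[i][j] for i in range(num_classes) for j in range(num_classes))
    return 1.0 - num / den

# Perfect agreement yields 1.0; RMSE is the square root of MSE.
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))  # 1.0
print(round(math.sqrt(0.4690), 4))  # 0.6848
```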

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
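With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward 0 over the total number of training steps. A minimal sketch of that schedule, assuming zero warmup steps (the card does not state a warmup setting):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    remaining = max(0.0, 1.0 - step / total_steps)
    return base_lr * remaining

total = 510  # final optimizer step reached in the training log
print(linear_lr(0, total))      # 2e-05 at the start
print(linear_lr(total, total))  # 0.0 at the end
```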

Training results

Evaluation ran every 2 steps, while the training loss was only logged every 500 steps, hence the "No log" entries before step 500.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0741 2 2.4977 -0.0449 2.4977 1.5804
No log 0.1481 4 1.2492 0.1268 1.2492 1.1177
No log 0.2222 6 0.9713 -0.0660 0.9713 0.9855
No log 0.2963 8 0.9303 0.1575 0.9303 0.9645
No log 0.3704 10 0.9918 0.2032 0.9918 0.9959
No log 0.4444 12 0.7417 0.2652 0.7417 0.8612
No log 0.5185 14 0.7136 0.1539 0.7136 0.8448
No log 0.5926 16 0.6918 0.1539 0.6918 0.8318
No log 0.6667 18 0.6898 0.0810 0.6898 0.8305
No log 0.7407 20 0.6767 0.2280 0.6767 0.8226
No log 0.8148 22 0.7654 0.1700 0.7654 0.8749
No log 0.8889 24 0.8407 0.1273 0.8407 0.9169
No log 0.9630 26 0.7601 0.2015 0.7601 0.8718
No log 1.0370 28 0.7827 0.1255 0.7827 0.8847
No log 1.1111 30 0.8380 0.2066 0.8380 0.9154
No log 1.1852 32 0.7116 0.1263 0.7116 0.8436
No log 1.2593 34 0.7100 0.1818 0.7100 0.8426
No log 1.3333 36 0.7017 0.2811 0.7017 0.8377
No log 1.4074 38 0.7185 0.2561 0.7185 0.8477
No log 1.4815 40 0.7688 0.3305 0.7688 0.8768
No log 1.5556 42 0.7386 0.3888 0.7386 0.8594
No log 1.6296 44 0.6938 0.2973 0.6938 0.8329
No log 1.7037 46 0.6561 0.2923 0.6561 0.8100
No log 1.7778 48 0.6868 0.2259 0.6868 0.8288
No log 1.8519 50 0.6790 0.3373 0.6790 0.8240
No log 1.9259 52 0.6657 0.2449 0.6657 0.8159
No log 2.0 54 0.6668 0.3817 0.6668 0.8166
No log 2.0741 56 0.6225 0.3155 0.6225 0.7890
No log 2.1481 58 0.5925 0.4111 0.5925 0.7697
No log 2.2222 60 0.5902 0.3919 0.5902 0.7682
No log 2.2963 62 0.6263 0.3372 0.6263 0.7914
No log 2.3704 64 0.6701 0.4175 0.6701 0.8186
No log 2.4444 66 0.5627 0.4534 0.5627 0.7501
No log 2.5185 68 0.5256 0.5305 0.5256 0.7250
No log 2.5926 70 0.5831 0.4751 0.5831 0.7636
No log 2.6667 72 0.5331 0.5892 0.5331 0.7301
No log 2.7407 74 0.5209 0.5831 0.5209 0.7217
No log 2.8148 76 0.5463 0.5846 0.5463 0.7391
No log 2.8889 78 0.5994 0.5642 0.5994 0.7742
No log 2.9630 80 0.8764 0.2846 0.8764 0.9362
No log 3.0370 82 0.8557 0.2900 0.8557 0.9250
No log 3.1111 84 0.5694 0.4329 0.5694 0.7546
No log 3.1852 86 0.5248 0.5693 0.5248 0.7244
No log 3.2593 88 0.5350 0.5451 0.5350 0.7314
No log 3.3333 90 0.5478 0.4463 0.5478 0.7401
No log 3.4074 92 0.6469 0.4764 0.6469 0.8043
No log 3.4815 94 0.7566 0.4877 0.7566 0.8698
No log 3.5556 96 0.6346 0.4085 0.6346 0.7966
No log 3.6296 98 0.5686 0.4295 0.5686 0.7541
No log 3.7037 100 0.6845 0.4448 0.6845 0.8273
No log 3.7778 102 0.7482 0.3933 0.7482 0.8650
No log 3.8519 104 0.6394 0.4672 0.6394 0.7996
No log 3.9259 106 0.6206 0.4562 0.6206 0.7878
No log 4.0 108 0.6267 0.4562 0.6267 0.7916
No log 4.0741 110 0.6590 0.4543 0.6590 0.8118
No log 4.1481 112 0.8509 0.4521 0.8509 0.9224
No log 4.2222 114 0.9683 0.3970 0.9683 0.9840
No log 4.2963 116 0.8380 0.4482 0.8380 0.9154
No log 4.3704 118 0.6405 0.3995 0.6405 0.8003
No log 4.4444 120 0.6436 0.4800 0.6436 0.8023
No log 4.5185 122 0.5893 0.5041 0.5893 0.7676
No log 4.5926 124 0.5672 0.5428 0.5672 0.7531
No log 4.6667 126 0.5209 0.5902 0.5209 0.7218
No log 4.7407 128 0.5933 0.4723 0.5933 0.7703
No log 4.8148 130 0.8880 0.2975 0.8880 0.9423
No log 4.8889 132 0.9442 0.3247 0.9442 0.9717
No log 4.9630 134 0.7211 0.4364 0.7211 0.8492
No log 5.0370 136 0.5188 0.6782 0.5188 0.7203
No log 5.1111 138 0.5125 0.5619 0.5125 0.7159
No log 5.1852 140 0.5461 0.6330 0.5461 0.7390
No log 5.2593 142 0.7266 0.4007 0.7266 0.8524
No log 5.3333 144 0.7064 0.4364 0.7064 0.8405
No log 5.4074 146 0.5487 0.6271 0.5487 0.7408
No log 5.4815 148 0.5792 0.5291 0.5792 0.7610
No log 5.5556 150 0.6039 0.5059 0.6039 0.7771
No log 5.6296 152 0.5369 0.5021 0.5369 0.7327
No log 5.7037 154 0.5353 0.5306 0.5353 0.7316
No log 5.7778 156 0.5444 0.5272 0.5444 0.7378
No log 5.8519 158 0.5651 0.4864 0.5651 0.7517
No log 5.9259 160 0.5365 0.5238 0.5365 0.7325
No log 6.0 162 0.5384 0.5111 0.5384 0.7337
No log 6.0741 164 0.6366 0.4777 0.6366 0.7978
No log 6.1481 166 0.7139 0.5139 0.7139 0.8449
No log 6.2222 168 0.6209 0.4845 0.6209 0.7880
No log 6.2963 170 0.5684 0.4975 0.5684 0.7539
No log 6.3704 172 0.5830 0.4915 0.5830 0.7635
No log 6.4444 174 0.6076 0.4227 0.6076 0.7795
No log 6.5185 176 0.6402 0.4484 0.6402 0.8001
No log 6.5926 178 0.6296 0.4155 0.6296 0.7934
No log 6.6667 180 0.6603 0.4980 0.6603 0.8126
No log 6.7407 182 0.6938 0.4219 0.6938 0.8329
No log 6.8148 184 0.6096 0.4675 0.6096 0.7808
No log 6.8889 186 0.5792 0.5232 0.5792 0.7611
No log 6.9630 188 0.6266 0.4764 0.6266 0.7916
No log 7.0370 190 0.5915 0.4758 0.5915 0.7691
No log 7.1111 192 0.5870 0.4599 0.5870 0.7661
No log 7.1852 194 0.5798 0.4599 0.5798 0.7615
No log 7.2593 196 0.5656 0.5021 0.5656 0.7521
No log 7.3333 198 0.6110 0.4664 0.6110 0.7817
No log 7.4074 200 0.6138 0.4905 0.6138 0.7835
No log 7.4815 202 0.6514 0.4387 0.6514 0.8071
No log 7.5556 204 0.6943 0.3869 0.6943 0.8332
No log 7.6296 206 0.6589 0.4329 0.6589 0.8117
No log 7.7037 208 0.6157 0.4473 0.6157 0.7847
No log 7.7778 210 0.6008 0.4555 0.6008 0.7751
No log 7.8519 212 0.5794 0.4945 0.5794 0.7612
No log 7.9259 214 0.5526 0.5017 0.5526 0.7434
No log 8.0 216 0.5532 0.5252 0.5532 0.7437
No log 8.0741 218 0.5166 0.5022 0.5166 0.7187
No log 8.1481 220 0.5198 0.4547 0.5198 0.7209
No log 8.2222 222 0.5164 0.4086 0.5164 0.7186
No log 8.2963 224 0.5319 0.5091 0.5319 0.7293
No log 8.3704 226 0.5195 0.5228 0.5195 0.7208
No log 8.4444 228 0.5393 0.5110 0.5393 0.7344
No log 8.5185 230 0.5323 0.4635 0.5323 0.7296
No log 8.5926 232 0.5294 0.4964 0.5294 0.7276
No log 8.6667 234 0.5573 0.4555 0.5574 0.7466
No log 8.7407 236 0.6076 0.4835 0.6076 0.7795
No log 8.8148 238 0.5611 0.4705 0.5611 0.7491
No log 8.8889 240 0.5656 0.5406 0.5656 0.7520
No log 8.9630 242 0.6126 0.4100 0.6126 0.7827
No log 9.0370 244 0.5410 0.5335 0.5410 0.7355
No log 9.1111 246 0.4899 0.5135 0.4899 0.7000
No log 9.1852 248 0.5063 0.6187 0.5063 0.7116
No log 9.2593 250 0.5397 0.4997 0.5397 0.7346
No log 9.3333 252 0.5233 0.5016 0.5233 0.7234
No log 9.4074 254 0.4870 0.5580 0.4870 0.6978
No log 9.4815 256 0.5655 0.4482 0.5655 0.7520
No log 9.5556 258 0.5641 0.4807 0.5641 0.7510
No log 9.6296 260 0.4955 0.5250 0.4955 0.7039
No log 9.7037 262 0.4909 0.5752 0.4909 0.7006
No log 9.7778 264 0.5557 0.5206 0.5557 0.7455
No log 9.8519 266 0.5015 0.4774 0.5015 0.7082
No log 9.9259 268 0.5314 0.4631 0.5314 0.7290
No log 10.0 270 0.5940 0.3854 0.5940 0.7707
No log 10.0741 272 0.5583 0.4083 0.5583 0.7472
No log 10.1481 274 0.4934 0.4847 0.4934 0.7024
No log 10.2222 276 0.5678 0.5206 0.5678 0.7535
No log 10.2963 278 0.6390 0.5520 0.6390 0.7994
No log 10.3704 280 0.6277 0.5215 0.6277 0.7923
No log 10.4444 282 0.5237 0.5237 0.5237 0.7237
No log 10.5185 284 0.4823 0.6010 0.4823 0.6945
No log 10.5926 286 0.4782 0.5648 0.4782 0.6915
No log 10.6667 288 0.4895 0.4614 0.4895 0.6996
No log 10.7407 290 0.4957 0.4614 0.4957 0.7041
No log 10.8148 292 0.4883 0.5060 0.4883 0.6988
No log 10.8889 294 0.5171 0.4243 0.5171 0.7191
No log 10.9630 296 0.5291 0.4855 0.5291 0.7274
No log 11.0370 298 0.5195 0.4867 0.5195 0.7207
No log 11.1111 300 0.4885 0.5329 0.4885 0.6989
No log 11.1852 302 0.4842 0.5943 0.4842 0.6959
No log 11.2593 304 0.4892 0.5715 0.4892 0.6994
No log 11.3333 306 0.5014 0.5736 0.5014 0.7081
No log 11.4074 308 0.5110 0.5736 0.5110 0.7149
No log 11.4815 310 0.5055 0.5413 0.5055 0.7110
No log 11.5556 312 0.5159 0.5413 0.5159 0.7182
No log 11.6296 314 0.5300 0.5309 0.5300 0.7280
No log 11.7037 316 0.5382 0.5119 0.5382 0.7336
No log 11.7778 318 0.5151 0.5161 0.5151 0.7177
No log 11.8519 320 0.4986 0.5182 0.4986 0.7061
No log 11.9259 322 0.4929 0.5203 0.4929 0.7021
No log 12.0 324 0.5140 0.5171 0.5140 0.7170
No log 12.0741 326 0.5416 0.4868 0.5416 0.7359
No log 12.1481 328 0.5614 0.4613 0.5614 0.7493
No log 12.2222 330 0.5295 0.3885 0.5295 0.7276
No log 12.2963 332 0.5064 0.4924 0.5064 0.7116
No log 12.3704 334 0.5106 0.4634 0.5106 0.7145
No log 12.4444 336 0.5128 0.4634 0.5128 0.7161
No log 12.5185 338 0.5153 0.4357 0.5153 0.7179
No log 12.5926 340 0.5018 0.5161 0.5018 0.7084
No log 12.6667 342 0.5201 0.5111 0.5201 0.7212
No log 12.7407 344 0.5035 0.5177 0.5035 0.7096
No log 12.8148 346 0.4702 0.5056 0.4702 0.6857
No log 12.8889 348 0.4613 0.5995 0.4613 0.6792
No log 12.9630 350 0.4532 0.6455 0.4532 0.6732
No log 13.0370 352 0.4827 0.5751 0.4827 0.6947
No log 13.1111 354 0.5402 0.5077 0.5402 0.7350
No log 13.1852 356 0.5315 0.5077 0.5315 0.7290
No log 13.2593 358 0.4863 0.6248 0.4863 0.6973
No log 13.3333 360 0.4499 0.6554 0.4499 0.6708
No log 13.4074 362 0.5026 0.5357 0.5026 0.7089
No log 13.4815 364 0.5291 0.4909 0.5291 0.7274
No log 13.5556 366 0.4790 0.5816 0.4790 0.6921
No log 13.6296 368 0.4820 0.5538 0.4820 0.6943
No log 13.7037 370 0.4982 0.5538 0.4982 0.7058
No log 13.7778 372 0.4846 0.5734 0.4846 0.6961
No log 13.8519 374 0.4972 0.5485 0.4972 0.7051
No log 13.9259 376 0.5766 0.4684 0.5766 0.7593
No log 14.0 378 0.5852 0.4243 0.5852 0.7650
No log 14.0741 380 0.5372 0.4597 0.5372 0.7329
No log 14.1481 382 0.5017 0.5569 0.5017 0.7083
No log 14.2222 384 0.5188 0.5305 0.5188 0.7203
No log 14.2963 386 0.5289 0.5305 0.5289 0.7273
No log 14.3704 388 0.5212 0.5044 0.5212 0.7220
No log 14.4444 390 0.5134 0.5161 0.5134 0.7165
No log 14.5185 392 0.5193 0.4902 0.5193 0.7206
No log 14.5926 394 0.5285 0.5367 0.5285 0.7270
No log 14.6667 396 0.5513 0.5687 0.5513 0.7425
No log 14.7407 398 0.5841 0.3953 0.5841 0.7643
No log 14.8148 400 0.5728 0.4737 0.5728 0.7568
No log 14.8889 402 0.5429 0.4949 0.5429 0.7368
No log 14.9630 404 0.5610 0.4534 0.5610 0.7490
No log 15.0370 406 0.5883 0.4218 0.5883 0.7670
No log 15.1111 408 0.5584 0.4769 0.5584 0.7473
No log 15.1852 410 0.5278 0.4701 0.5278 0.7265
No log 15.2593 412 0.5452 0.4737 0.5452 0.7384
No log 15.3333 414 0.5615 0.3953 0.5615 0.7493
No log 15.4074 416 0.5373 0.4737 0.5373 0.7330
No log 15.4815 418 0.4950 0.5710 0.4950 0.7035
No log 15.5556 420 0.4854 0.5036 0.4854 0.6967
No log 15.6296 422 0.4845 0.5036 0.4845 0.6961
No log 15.7037 424 0.4935 0.5623 0.4935 0.7025
No log 15.7778 426 0.5148 0.4150 0.5148 0.7175
No log 15.8519 428 0.5382 0.4198 0.5382 0.7336
No log 15.9259 430 0.5245 0.4389 0.5245 0.7242
No log 16.0 432 0.4987 0.4938 0.4987 0.7062
No log 16.0741 434 0.4749 0.5693 0.4749 0.6891
No log 16.1481 436 0.4714 0.5927 0.4714 0.6866
No log 16.2222 438 0.4669 0.6648 0.4669 0.6833
No log 16.2963 440 0.4642 0.6448 0.4642 0.6813
No log 16.3704 442 0.4665 0.6357 0.4665 0.6830
No log 16.4444 444 0.4609 0.6448 0.4609 0.6789
No log 16.5185 446 0.4589 0.6555 0.4589 0.6775
No log 16.5926 448 0.4643 0.6183 0.4643 0.6814
No log 16.6667 450 0.4777 0.6183 0.4777 0.6911
No log 16.7407 452 0.4713 0.6183 0.4713 0.6865
No log 16.8148 454 0.4749 0.5915 0.4749 0.6891
No log 16.8889 456 0.4757 0.6184 0.4757 0.6897
No log 16.9630 458 0.4750 0.5979 0.4750 0.6892
No log 17.0370 460 0.4944 0.5034 0.4945 0.7032
No log 17.1111 462 0.5045 0.4788 0.5045 0.7103
No log 17.1852 464 0.4934 0.4788 0.4934 0.7024
No log 17.2593 466 0.5013 0.4788 0.5013 0.7080
No log 17.3333 468 0.5226 0.4788 0.5226 0.7229
No log 17.4074 470 0.5479 0.4933 0.5479 0.7402
No log 17.4815 472 0.5129 0.5034 0.5129 0.7162
No log 17.5556 474 0.5016 0.4788 0.5016 0.7083
No log 17.6296 476 0.4935 0.4788 0.4935 0.7025
No log 17.7037 478 0.4730 0.5467 0.4730 0.6877
No log 17.7778 480 0.4687 0.5081 0.4687 0.6846
No log 17.8519 482 0.4879 0.4990 0.4879 0.6985
No log 17.9259 484 0.5037 0.4569 0.5037 0.7097
No log 18.0 486 0.4977 0.4756 0.4977 0.7055
No log 18.0741 488 0.4688 0.5782 0.4688 0.6847
No log 18.1481 490 0.4571 0.5819 0.4571 0.6761
No log 18.2222 492 0.4708 0.5516 0.4708 0.6861
No log 18.2963 494 0.4629 0.5752 0.4629 0.6804
No log 18.3704 496 0.4422 0.5533 0.4422 0.6650
No log 18.4444 498 0.4380 0.5819 0.4380 0.6618
0.3088 18.5185 500 0.4354 0.6200 0.4354 0.6599
0.3088 18.5926 502 0.4642 0.5796 0.4642 0.6813
0.3088 18.6667 504 0.4650 0.5283 0.4650 0.6819
0.3088 18.7407 506 0.4540 0.5539 0.4540 0.6738
0.3088 18.8148 508 0.4597 0.5587 0.4597 0.6780
0.3088 18.8889 510 0.4690 0.4752 0.4690 0.6848
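Two details can be read off the log above, though the card does not state them directly: 2 optimizer steps correspond to 0.0741 epochs, implying roughly 27 steps per epoch and therefore about 216 training examples at batch size 8; and training ends at epoch 18.89 (step 510), well short of the configured 100 epochs, which suggests early stopping. A quick check of the arithmetic:

```python
steps = 2          # steps at the first logged evaluation
epoch = 0.0741     # fractional epoch at that evaluation
batch_size = 8

steps_per_epoch = round(steps / epoch)
approx_train_examples = steps_per_epoch * batch_size
print(steps_per_epoch, approx_train_examples)  # 27 216
```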

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: ~0.1B parameters (F32, stored as safetensors)