ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k5_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5402
  • Qwk (quadratic weighted kappa): 0.5433
  • Mse (mean squared error): 0.5402
  • Rmse (root mean squared error): 0.7350
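
The evaluation script is not published with this card, but these three metrics can be reproduced from model predictions with scikit-learn. The sketch below is an assumption about how they were computed; in particular, rounding continuous predictions to integer scores before the kappa computation is a guess, not a documented detail.

```python
# Hedged sketch of how the reported metrics (QWK, MSE, RMSE) can be
# computed. Rounding to integer score levels for kappa is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        labels.round().astype(int),
        preds.round().astype(int),
        weights="quadratic",  # "quadratic" weighting gives the quadratic weighted kappa
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```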

Model description

More information needed

Intended uses & limitations

More information needed
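
Pending fuller documentation, the checkpoint can be loaded like any AraBERT-based sequence-classification model. A minimal inference sketch follows; the repo id is taken from this page, and the single-logit regression head is an assumption based on the MSE/RMSE metrics reported above.

```python
# Minimal loading/inference sketch. The single regression output is an
# assumption inferred from the reported MSE/RMSE, not a documented detail.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k5_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```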

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
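
These settings map directly onto transformers.TrainingArguments. The sketch below reproduces the configuration; the output path and the num_labels=1 regression head are illustrative assumptions, and the training data itself is not documented in this card.

```python
# Sketch mapping the listed hyperparameters onto the Trainer API.
# output_dir is a placeholder; num_labels=1 (regression) is an assumption.
from transformers import AutoModelForSequenceClassification, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,
)

args = TrainingArguments(
    output_dir="arabert_task7_organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,      # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```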

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0741 2 2.6277 -0.0593 2.6277 1.6210
No log 0.1481 4 1.3892 0.0990 1.3892 1.1786
No log 0.2222 6 1.1064 -0.1408 1.1064 1.0518
No log 0.2963 8 0.8739 0.0490 0.8739 0.9348
No log 0.3704 10 0.8837 0.2032 0.8837 0.9401
No log 0.4444 12 0.7903 0.2662 0.7903 0.8890
No log 0.5185 14 0.7864 0.2913 0.7864 0.8868
No log 0.5926 16 0.7227 0.0846 0.7227 0.8501
No log 0.6667 18 0.7165 0.1277 0.7165 0.8464
No log 0.7407 20 0.7151 0.1550 0.7151 0.8457
No log 0.8148 22 0.7149 0.1744 0.7149 0.8455
No log 0.8889 24 0.8634 0.1673 0.8634 0.9292
No log 0.9630 26 1.1161 0.1746 1.1161 1.0565
No log 1.0370 28 1.1495 0.1484 1.1495 1.0722
No log 1.1111 30 0.8530 0.2163 0.8530 0.9236
No log 1.1852 32 0.6716 0.4124 0.6716 0.8195
No log 1.2593 34 0.6868 0.4044 0.6868 0.8287
No log 1.3333 36 0.7010 0.3612 0.7010 0.8372
No log 1.4074 38 0.7283 0.3515 0.7283 0.8534
No log 1.4815 40 0.6862 0.3664 0.6862 0.8284
No log 1.5556 42 0.6768 0.2947 0.6768 0.8227
No log 1.6296 44 0.7438 0.3071 0.7438 0.8624
No log 1.7037 46 0.7208 0.3388 0.7208 0.8490
No log 1.7778 48 0.6570 0.2113 0.6570 0.8106
No log 1.8519 50 0.6522 0.3599 0.6522 0.8076
No log 1.9259 52 0.6449 0.4139 0.6449 0.8031
No log 2.0 54 0.6572 0.3779 0.6572 0.8107
No log 2.0741 56 0.6868 0.4097 0.6868 0.8287
No log 2.1481 58 0.7810 0.3228 0.7810 0.8837
No log 2.2222 60 0.7055 0.3895 0.7055 0.8399
No log 2.2963 62 0.5579 0.5430 0.5579 0.7469
No log 2.3704 64 0.6987 0.4072 0.6987 0.8359
No log 2.4444 66 0.8802 0.3638 0.8802 0.9382
No log 2.5185 68 0.7972 0.3638 0.7972 0.8929
No log 2.5926 70 0.7591 0.3732 0.7591 0.8713
No log 2.6667 72 0.6457 0.3746 0.6457 0.8035
No log 2.7407 74 0.5806 0.5092 0.5806 0.7619
No log 2.8148 76 0.6592 0.4969 0.6592 0.8119
No log 2.8889 78 0.5866 0.5678 0.5866 0.7659
No log 2.9630 80 0.5446 0.5904 0.5446 0.7380
No log 3.0370 82 0.5158 0.5430 0.5158 0.7182
No log 3.1111 84 0.5157 0.5095 0.5157 0.7181
No log 3.1852 86 0.5374 0.5960 0.5374 0.7331
No log 3.2593 88 0.5980 0.4740 0.5980 0.7733
No log 3.3333 90 0.5224 0.6200 0.5224 0.7228
No log 3.4074 92 0.5221 0.5976 0.5221 0.7225
No log 3.4815 94 0.5394 0.4504 0.5394 0.7344
No log 3.5556 96 0.5232 0.5071 0.5232 0.7233
No log 3.6296 98 0.5574 0.5117 0.5574 0.7466
No log 3.7037 100 0.7482 0.3538 0.7482 0.8650
No log 3.7778 102 0.8760 0.2797 0.8760 0.9359
No log 3.8519 104 0.7079 0.3640 0.7079 0.8413
No log 3.9259 106 0.5404 0.5446 0.5404 0.7351
No log 4.0 108 0.5411 0.5235 0.5411 0.7356
No log 4.0741 110 0.5688 0.4945 0.5688 0.7542
No log 4.1481 112 0.7251 0.3697 0.7251 0.8515
No log 4.2222 114 0.8801 0.3156 0.8801 0.9382
No log 4.2963 116 0.8864 0.3156 0.8864 0.9415
No log 4.3704 118 0.7407 0.5401 0.7407 0.8606
No log 4.4444 120 0.6891 0.5298 0.6891 0.8301
No log 4.5185 122 0.6628 0.4997 0.6628 0.8141
No log 4.5926 124 0.6294 0.4770 0.6294 0.7933
No log 4.6667 126 0.5819 0.5320 0.5819 0.7628
No log 4.7407 128 0.5682 0.5084 0.5682 0.7538
No log 4.8148 130 0.5599 0.5101 0.5599 0.7483
No log 4.8889 132 0.5678 0.5168 0.5678 0.7535
No log 4.9630 134 0.5391 0.5503 0.5391 0.7342
No log 5.0370 136 0.6542 0.5275 0.6542 0.8088
No log 5.1111 138 0.8175 0.4455 0.8175 0.9041
No log 5.1852 140 0.6756 0.4457 0.6756 0.8220
No log 5.2593 142 0.5824 0.5308 0.5824 0.7632
No log 5.3333 144 0.5207 0.5574 0.5207 0.7216
No log 5.4074 146 0.5446 0.5751 0.5446 0.7380
No log 5.4815 148 0.5675 0.5268 0.5675 0.7533
No log 5.5556 150 0.6322 0.5538 0.6322 0.7951
No log 5.6296 152 0.5775 0.5678 0.5775 0.7599
No log 5.7037 154 0.5524 0.5320 0.5524 0.7433
No log 5.7778 156 0.5404 0.5368 0.5404 0.7351
No log 5.8519 158 0.5636 0.5979 0.5636 0.7507
No log 5.9259 160 0.7932 0.3984 0.7932 0.8906
No log 6.0 162 0.9166 0.3003 0.9166 0.9574
No log 6.0741 164 0.9034 0.3003 0.9034 0.9505
No log 6.1481 166 0.7930 0.4597 0.7930 0.8905
No log 6.2222 168 0.7573 0.4597 0.7573 0.8702
No log 6.2963 170 0.7935 0.4088 0.7935 0.8908
No log 6.3704 172 0.8204 0.3620 0.8204 0.9058
No log 6.4444 174 0.7641 0.4617 0.7641 0.8741
No log 6.5185 176 0.5906 0.5180 0.5906 0.7685
No log 6.5926 178 0.5846 0.5640 0.5846 0.7646
No log 6.6667 180 0.5877 0.5899 0.5877 0.7666
No log 6.7407 182 0.5933 0.5436 0.5933 0.7703
No log 6.8148 184 0.6958 0.4344 0.6958 0.8342
No log 6.8889 186 0.7271 0.4568 0.7271 0.8527
No log 6.9630 188 0.7143 0.4890 0.7143 0.8451
No log 7.0370 190 0.6351 0.5070 0.6351 0.7969
No log 7.1111 192 0.5618 0.5269 0.5618 0.7495
No log 7.1852 194 0.5678 0.5711 0.5678 0.7535
No log 7.2593 196 0.5472 0.5368 0.5472 0.7398
No log 7.3333 198 0.5535 0.5217 0.5535 0.7440
No log 7.4074 200 0.6097 0.5581 0.6097 0.7808
No log 7.4815 202 0.6061 0.5636 0.6061 0.7785
No log 7.5556 204 0.6308 0.5046 0.6308 0.7943
No log 7.6296 206 0.5876 0.5104 0.5876 0.7665
No log 7.7037 208 0.5773 0.4656 0.5773 0.7598
No log 7.7778 210 0.5912 0.4734 0.5912 0.7689
No log 7.8519 212 0.5554 0.4715 0.5554 0.7452
No log 7.9259 214 0.5611 0.5600 0.5611 0.7491
No log 8.0 216 0.5607 0.5600 0.5607 0.7488
No log 8.0741 218 0.5711 0.5600 0.5711 0.7557
No log 8.1481 220 0.6044 0.4406 0.6044 0.7774
No log 8.2222 222 0.6989 0.3783 0.6989 0.8360
No log 8.2963 224 0.6762 0.3546 0.6762 0.8223
No log 8.3704 226 0.5977 0.4466 0.5977 0.7731
No log 8.4444 228 0.5561 0.5361 0.5561 0.7457
No log 8.5185 230 0.5510 0.5698 0.5510 0.7423
No log 8.5926 232 0.5494 0.5304 0.5494 0.7412
No log 8.6667 234 0.5880 0.5379 0.5880 0.7668
No log 8.7407 236 0.5962 0.5379 0.5962 0.7722
No log 8.8148 238 0.5637 0.5446 0.5637 0.7508
No log 8.8889 240 0.5616 0.5446 0.5616 0.7494
No log 8.9630 242 0.5579 0.5446 0.5579 0.7469
No log 9.0370 244 0.5585 0.5446 0.5585 0.7474
No log 9.1111 246 0.5509 0.5831 0.5509 0.7422
No log 9.1852 248 0.5624 0.5092 0.5624 0.7500
No log 9.2593 250 0.5894 0.4931 0.5894 0.7678
No log 9.3333 252 0.5590 0.5212 0.5590 0.7477
No log 9.4074 254 0.5203 0.5656 0.5203 0.7213
No log 9.4815 256 0.5197 0.5656 0.5197 0.7209
No log 9.5556 258 0.5283 0.5319 0.5283 0.7269
No log 9.6296 260 0.6471 0.5543 0.6471 0.8044
No log 9.7037 262 0.7623 0.4045 0.7623 0.8731
No log 9.7778 264 0.6921 0.4545 0.6921 0.8320
No log 9.8519 266 0.6496 0.4484 0.6496 0.8060
No log 9.9259 268 0.5756 0.5048 0.5756 0.7587
No log 10.0 270 0.5332 0.5587 0.5332 0.7302
No log 10.0741 272 0.4961 0.5885 0.4961 0.7044
No log 10.1481 274 0.5068 0.5658 0.5068 0.7119
No log 10.2222 276 0.5079 0.5763 0.5079 0.7126
No log 10.2963 278 0.5039 0.5114 0.5039 0.7098
No log 10.3704 280 0.5204 0.5992 0.5204 0.7214
No log 10.4444 282 0.5751 0.5065 0.5751 0.7583
No log 10.5185 284 0.5724 0.5065 0.5724 0.7566
No log 10.5926 286 0.5461 0.5902 0.5461 0.7390
No log 10.6667 288 0.5401 0.5379 0.5401 0.7349
No log 10.7407 290 0.5599 0.4923 0.5599 0.7482
No log 10.8148 292 0.5885 0.4901 0.5885 0.7671
No log 10.8889 294 0.5898 0.4883 0.5898 0.7680
No log 10.9630 296 0.5979 0.5352 0.5979 0.7732
No log 11.0370 298 0.6137 0.5398 0.6137 0.7834
No log 11.1111 300 0.6456 0.4901 0.6456 0.8035
No log 11.1852 302 0.6163 0.5151 0.6163 0.7850
No log 11.2593 304 0.6103 0.5160 0.6103 0.7812
No log 11.3333 306 0.6078 0.5231 0.6078 0.7796
No log 11.4074 308 0.6069 0.5231 0.6069 0.7790
No log 11.4815 310 0.6043 0.4972 0.6043 0.7774
No log 11.5556 312 0.6061 0.4953 0.6061 0.7785
No log 11.6296 314 0.6082 0.4444 0.6082 0.7799
No log 11.7037 316 0.6134 0.4635 0.6134 0.7832
No log 11.7778 318 0.6053 0.4635 0.6053 0.7780
No log 11.8519 320 0.5998 0.4829 0.5998 0.7745
No log 11.9259 322 0.5920 0.5268 0.5920 0.7694
No log 12.0 324 0.5943 0.5177 0.5943 0.7709
No log 12.0741 326 0.5958 0.4829 0.5958 0.7719
No log 12.1481 328 0.6618 0.4161 0.6618 0.8135
No log 12.2222 330 0.6917 0.3487 0.6917 0.8317
No log 12.2963 332 0.6286 0.4813 0.6286 0.7928
No log 12.3704 334 0.6173 0.4813 0.6173 0.7857
No log 12.4444 336 0.6278 0.4693 0.6278 0.7923
No log 12.5185 338 0.6987 0.4009 0.6987 0.8359
No log 12.5926 340 0.7761 0.3615 0.7761 0.8809
No log 12.6667 342 0.6881 0.4575 0.6881 0.8295
No log 12.7407 344 0.6048 0.4544 0.6048 0.7777
No log 12.8148 346 0.5872 0.5352 0.5872 0.7663
No log 12.8889 348 0.5853 0.5687 0.5853 0.7651
No log 12.9630 350 0.5732 0.5563 0.5732 0.7571
No log 13.0370 352 0.5889 0.4883 0.5889 0.7674
No log 13.1111 354 0.6411 0.4935 0.6411 0.8007
No log 13.1852 356 0.5985 0.4883 0.5985 0.7737
No log 13.2593 358 0.5844 0.5853 0.5844 0.7645
No log 13.3333 360 0.5997 0.5878 0.5997 0.7744
No log 13.4074 362 0.6044 0.5491 0.6044 0.7774
No log 13.4815 364 0.5806 0.5826 0.5806 0.7620
No log 13.5556 366 0.5762 0.5092 0.5762 0.7591
No log 13.6296 368 0.5718 0.4858 0.5718 0.7562
No log 13.7037 370 0.5995 0.5647 0.5995 0.7742
No log 13.7778 372 0.6913 0.4982 0.6913 0.8315
No log 13.8519 374 0.7103 0.4898 0.7103 0.8428
No log 13.9259 376 0.6345 0.5633 0.6345 0.7965
No log 14.0 378 0.5587 0.4858 0.5587 0.7474
No log 14.0741 380 0.5749 0.5058 0.5749 0.7582
No log 14.1481 382 0.5629 0.5058 0.5629 0.7503
No log 14.2222 384 0.5572 0.5529 0.5572 0.7464
No log 14.2963 386 0.6993 0.5529 0.6993 0.8363
No log 14.3704 388 0.7823 0.5190 0.7823 0.8845
No log 14.4444 390 0.7295 0.5190 0.7295 0.8541
No log 14.5185 392 0.6425 0.5670 0.6425 0.8015
No log 14.5926 394 0.5520 0.5756 0.5520 0.7430
No log 14.6667 396 0.5390 0.5373 0.5390 0.7342
No log 14.7407 398 0.5491 0.5110 0.5491 0.7410
No log 14.8148 400 0.5471 0.4876 0.5471 0.7397
No log 14.8889 402 0.5498 0.4876 0.5498 0.7415
No log 14.9630 404 0.5698 0.4832 0.5698 0.7548
No log 15.0370 406 0.5877 0.4832 0.5877 0.7666
No log 15.1111 408 0.6426 0.4747 0.6426 0.8016
No log 15.1852 410 0.6264 0.4411 0.6264 0.7914
No log 15.2593 412 0.5988 0.3902 0.5988 0.7738
No log 15.3333 414 0.5638 0.4717 0.5638 0.7509
No log 15.4074 416 0.5676 0.4171 0.5676 0.7534
No log 15.4815 418 0.5919 0.3635 0.5919 0.7694
No log 15.5556 420 0.5993 0.3590 0.5993 0.7742
No log 15.6296 422 0.5589 0.4171 0.5589 0.7476
No log 15.7037 424 0.5471 0.5344 0.5471 0.7396
No log 15.7778 426 0.5491 0.5784 0.5491 0.7410
No log 15.8519 428 0.5335 0.5476 0.5335 0.7304
No log 15.9259 430 0.5301 0.5739 0.5301 0.7281
No log 16.0 432 0.5365 0.5767 0.5365 0.7325
No log 16.0741 434 0.5430 0.5767 0.5430 0.7369
No log 16.1481 436 0.5279 0.5563 0.5279 0.7266
No log 16.2222 438 0.5587 0.5594 0.5587 0.7475
No log 16.2963 440 0.6075 0.5353 0.6075 0.7795
No log 16.3704 442 0.5978 0.5577 0.5978 0.7732
No log 16.4444 444 0.5622 0.5567 0.5622 0.7498
No log 16.5185 446 0.5342 0.5042 0.5342 0.7309
No log 16.5926 448 0.5171 0.5446 0.5171 0.7191
No log 16.6667 450 0.5112 0.5446 0.5112 0.7150
No log 16.7407 452 0.5342 0.5995 0.5342 0.7309
No log 16.8148 454 0.6196 0.5659 0.6196 0.7871
No log 16.8889 456 0.7116 0.4684 0.7116 0.8436
No log 16.9630 458 0.6503 0.5140 0.6503 0.8064
No log 17.0370 460 0.5439 0.5800 0.5439 0.7375
No log 17.1111 462 0.5346 0.5554 0.5346 0.7312
No log 17.1852 464 0.6306 0.5184 0.6306 0.7941
No log 17.2593 466 0.6997 0.4222 0.6997 0.8365
No log 17.3333 468 0.6470 0.4429 0.6470 0.8044
No log 17.4074 470 0.5340 0.4979 0.5340 0.7307
No log 17.4815 472 0.5119 0.5584 0.5119 0.7155
No log 17.5556 474 0.5546 0.5166 0.5546 0.7447
No log 17.6296 476 0.5471 0.4875 0.5471 0.7397
No log 17.7037 478 0.5188 0.5267 0.5188 0.7203
No log 17.7778 480 0.5146 0.5208 0.5146 0.7173
No log 17.8519 482 0.5152 0.5745 0.5152 0.7178
No log 17.9259 484 0.5122 0.5745 0.5122 0.7157
No log 18.0 486 0.5184 0.6363 0.5184 0.7200
No log 18.0741 488 0.5249 0.6295 0.5249 0.7245
No log 18.1481 490 0.5433 0.5483 0.5433 0.7371
No log 18.2222 492 0.5256 0.5663 0.5256 0.7250
No log 18.2963 494 0.5159 0.5840 0.5159 0.7182
No log 18.3704 496 0.5193 0.5784 0.5193 0.7206
No log 18.4444 498 0.5171 0.6112 0.5171 0.7191
0.3428 18.5185 500 0.4948 0.6027 0.4948 0.7034
0.3428 18.5926 502 0.5319 0.6223 0.5319 0.7293
0.3428 18.6667 504 0.6160 0.5313 0.6160 0.7848
0.3428 18.7407 506 0.6514 0.4884 0.6514 0.8071
0.3428 18.8148 508 0.5712 0.5368 0.5712 0.7558
0.3428 18.8889 510 0.4869 0.5861 0.4869 0.6978
0.3428 18.9630 512 0.5402 0.5433 0.5402 0.7350
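
The headline evaluation results above correspond to the final logged evaluation (epoch 18.9630, step 512); the lowest validation loss in the log, 0.4869, occurred two steps earlier at step 510. Note that although num_epochs was set to 100, the log ends just before epoch 19.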

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
  • Model size: 0.1B parameters (F32, safetensors)