ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5623
  • QWK (quadratic weighted kappa): 0.3072
  • MSE: 0.5623
  • RMSE: 0.7499
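For reference, QWK is Cohen's kappa with quadratic disagreement weights, the standard agreement metric for ordinal essay scores; MSE and RMSE are the usual squared-error metrics. A self-contained NumPy sketch of how all three are computed, on toy integer ratings (not the card's evaluation data):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Cohen's kappa with quadratic weights for integer ratings in [0, num_labels)."""
    conf = np.zeros((num_labels, num_labels))
    for t, p in zip(y_true, y_pred):
        conf[t, p] += 1
    # Quadratic disagreement weights, normalised to [0, 1]
    idx = np.arange(num_labels)
    w = (idx[:, None] - idx[None, :]) ** 2 / (num_labels - 1) ** 2
    # Expected counts if true and predicted ratings were independent
    expected = np.outer(conf.sum(axis=1), conf.sum(axis=0)) / conf.sum()
    return 1.0 - (w * conf).sum() / (w * expected).sum()

# Toy ratings, chosen only to make the arithmetic easy to check by hand
y_true = np.array([0, 1, 2, 2, 1])
y_pred = np.array([0, 1, 1, 2, 1])

mse = float(np.mean((y_true - y_pred) ** 2))                  # 0.2
rmse = mse ** 0.5                                             # ~0.447
qwk = quadratic_weighted_kappa(y_true, y_pred, num_labels=3)  # 0.8
```

Note that on this card Loss and MSE are identical (0.5623), which suggests the model was trained with an MSE objective on the score directly.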

Model description

More information needed

Intended uses & limitations

More information needed
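Although usage details are not documented, the checkpoint can presumably be loaded with the standard transformers Auto classes. A minimal sketch, assuming a sequence-classification head (the label semantics for the "organization" trait are not documented and are an assumption here):

```python
MODEL_ID = (
    "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
    "FineTuningAraBERT_run2_AugV5_k6_task7_organization"
)

def load_scorer():
    """Fetch the tokenizer and model from the Hugging Face Hub.

    Imports live inside the function so this sketch stays importable
    even where transformers/torch are not installed.
    """
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    return tokenizer, model
```

Usage would then be along the lines of `tok, model = load_scorer()` followed by `model(**tok(essay_text, return_tensors="pt")).logits`, with the logits interpreted as the organization score.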

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
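The same settings can be written out using the keyword names of the Hugging Face Trainer API (the Adam betas and epsilon listed above are the Trainer defaults; `output_dir` and other paths are omitted):

```python
# Hyperparameters from the card, keyed by transformers.TrainingArguments names;
# usable as TrainingArguments(output_dir="...", **training_args).
training_args = dict(
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",  # linear decay to 0 over all training steps
    num_train_epochs=100,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```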

Training results

Evaluation ran every 2 steps (32 steps per epoch). "No log" in the training-loss column means the Trainer had not yet reached its first logging step (500); the first logged training loss, 0.2766, appears at step 500.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0625 2 2.6552 -0.0702 2.6552 1.6295
No log 0.125 4 1.3937 0.1265 1.3937 1.1806
No log 0.1875 6 1.2589 -0.1740 1.2589 1.1220
No log 0.25 8 1.0283 -0.0215 1.0283 1.0140
No log 0.3125 10 0.7676 0.1901 0.7676 0.8761
No log 0.375 12 0.7238 0.3019 0.7238 0.8508
No log 0.4375 14 0.6892 0.2574 0.6892 0.8302
No log 0.5 16 0.7663 0.3042 0.7663 0.8754
No log 0.5625 18 0.9573 0.2782 0.9573 0.9784
No log 0.625 20 0.7705 0.3032 0.7705 0.8778
No log 0.6875 22 0.7154 0.1744 0.7154 0.8458
No log 0.75 24 0.7946 0.1729 0.7946 0.8914
No log 0.8125 26 0.7713 0.1729 0.7713 0.8782
No log 0.875 28 0.7437 0.1687 0.7437 0.8624
No log 0.9375 30 0.7639 0.1766 0.7639 0.8740
No log 1.0 32 0.7393 0.2149 0.7393 0.8598
No log 1.0625 34 0.7099 0.2607 0.7099 0.8426
No log 1.125 36 0.7041 0.3558 0.7041 0.8391
No log 1.1875 38 0.7599 0.3948 0.7599 0.8717
No log 1.25 40 0.7341 0.3876 0.7341 0.8568
No log 1.3125 42 0.6748 0.3391 0.6748 0.8215
No log 1.375 44 0.9834 0.4125 0.9834 0.9917
No log 1.4375 46 0.9691 0.4226 0.9691 0.9845
No log 1.5 48 0.6673 0.3942 0.6673 0.8169
No log 1.5625 50 0.6270 0.4044 0.6270 0.7918
No log 1.625 52 0.6701 0.3996 0.6701 0.8186
No log 1.6875 54 0.6928 0.3996 0.6928 0.8323
No log 1.75 56 0.6873 0.3996 0.6873 0.8290
No log 1.8125 58 0.7701 0.3582 0.7701 0.8776
No log 1.875 60 0.6763 0.3471 0.6763 0.8224
No log 1.9375 62 0.6425 0.3780 0.6425 0.8015
No log 2.0 64 0.6492 0.4174 0.6492 0.8057
No log 2.0625 66 0.6952 0.4528 0.6952 0.8338
No log 2.125 68 0.6594 0.4205 0.6594 0.8120
No log 2.1875 70 0.7537 0.4207 0.7537 0.8682
No log 2.25 72 0.8412 0.4234 0.8412 0.9172
No log 2.3125 74 0.7636 0.4433 0.7636 0.8739
No log 2.375 76 0.6894 0.4815 0.6894 0.8303
No log 2.4375 78 0.7287 0.4579 0.7287 0.8536
No log 2.5 80 0.7357 0.4468 0.7357 0.8577
No log 2.5625 82 0.6249 0.4504 0.6249 0.7905
No log 2.625 84 0.5867 0.5028 0.5867 0.7660
No log 2.6875 86 0.5756 0.4801 0.5756 0.7587
No log 2.75 88 0.7803 0.3847 0.7803 0.8834
No log 2.8125 90 0.7930 0.3847 0.7930 0.8905
No log 2.875 92 0.5946 0.5286 0.5946 0.7711
No log 2.9375 94 0.8136 0.3928 0.8136 0.9020
No log 3.0 96 0.7948 0.4057 0.7948 0.8915
No log 3.0625 98 0.6212 0.5286 0.6212 0.7881
No log 3.125 100 0.6819 0.4107 0.6819 0.8257
No log 3.1875 102 0.6986 0.4002 0.6986 0.8358
No log 3.25 104 0.6097 0.3002 0.6097 0.7808
No log 3.3125 106 0.6799 0.4548 0.6799 0.8246
No log 3.375 108 0.6782 0.4548 0.6782 0.8235
No log 3.4375 110 0.5983 0.5010 0.5983 0.7735
No log 3.5 112 0.6144 0.4282 0.6144 0.7838
No log 3.5625 114 0.5822 0.4828 0.5822 0.7630
No log 3.625 116 0.6336 0.4776 0.6336 0.7960
No log 3.6875 118 0.6899 0.4429 0.6899 0.8306
No log 3.75 120 0.5932 0.4850 0.5932 0.7702
No log 3.8125 122 0.5352 0.5549 0.5352 0.7316
No log 3.875 124 0.5236 0.5533 0.5236 0.7236
No log 3.9375 126 0.5297 0.4958 0.5297 0.7278
No log 4.0 128 0.5731 0.4653 0.5731 0.7570
No log 4.0625 130 0.5354 0.5492 0.5354 0.7317
No log 4.125 132 0.6429 0.4702 0.6429 0.8018
No log 4.1875 134 0.8085 0.4133 0.8085 0.8992
No log 4.25 136 0.8456 0.3747 0.8456 0.9196
No log 4.3125 138 0.7310 0.4436 0.7310 0.8550
No log 4.375 140 0.6245 0.5436 0.6245 0.7903
No log 4.4375 142 0.6987 0.4302 0.6987 0.8359
No log 4.5 144 0.6731 0.4037 0.6731 0.8204
No log 4.5625 146 0.6269 0.5533 0.6269 0.7918
No log 4.625 148 0.6519 0.4157 0.6519 0.8074
No log 4.6875 150 0.6120 0.5266 0.6120 0.7823
No log 4.75 152 0.6839 0.3976 0.6839 0.8270
No log 4.8125 154 0.6336 0.4589 0.6336 0.7960
No log 4.875 156 0.6361 0.4589 0.6361 0.7976
No log 4.9375 158 0.6306 0.3930 0.6306 0.7941
No log 5.0 160 0.6292 0.5472 0.6292 0.7932
No log 5.0625 162 0.6569 0.4562 0.6569 0.8105
No log 5.125 164 0.6536 0.4473 0.6536 0.8085
No log 5.1875 166 0.6323 0.4171 0.6323 0.7951
No log 5.25 168 0.6077 0.3603 0.6077 0.7796
No log 5.3125 170 0.6138 0.4875 0.6138 0.7835
No log 5.375 172 0.5774 0.5422 0.5774 0.7599
No log 5.4375 174 0.5912 0.4569 0.5912 0.7689
No log 5.5 176 0.5845 0.4171 0.5845 0.7645
No log 5.5625 178 0.5697 0.4276 0.5697 0.7548
No log 5.625 180 0.5796 0.5076 0.5796 0.7613
No log 5.6875 182 0.6321 0.4875 0.6321 0.7951
No log 5.75 184 0.7068 0.3754 0.7068 0.8407
No log 5.8125 186 0.6771 0.3544 0.6771 0.8229
No log 5.875 188 0.5823 0.4634 0.5823 0.7631
No log 5.9375 190 0.7524 0.3873 0.7524 0.8674
No log 6.0 192 0.7648 0.4024 0.7648 0.8745
No log 6.0625 194 0.6072 0.3976 0.6072 0.7792
No log 6.125 196 0.5823 0.5131 0.5823 0.7631
No log 6.1875 198 0.6289 0.4430 0.6289 0.7930
No log 6.25 200 0.5796 0.4637 0.5796 0.7613
No log 6.3125 202 0.6564 0.4106 0.6564 0.8102
No log 6.375 204 0.7699 0.3493 0.7699 0.8774
No log 6.4375 206 0.7005 0.3707 0.7005 0.8370
No log 6.5 208 0.5966 0.4949 0.5966 0.7724
No log 6.5625 210 0.6782 0.4124 0.6782 0.8236
No log 6.625 212 0.6663 0.4335 0.6663 0.8163
No log 6.6875 214 0.6133 0.4361 0.6133 0.7831
No log 6.75 216 0.6899 0.4237 0.6899 0.8306
No log 6.8125 218 0.7237 0.4489 0.7237 0.8507
No log 6.875 220 0.6527 0.4028 0.6527 0.8079
No log 6.9375 222 0.6428 0.4380 0.6428 0.8017
No log 7.0 224 0.6734 0.5032 0.6734 0.8206
No log 7.0625 226 0.6517 0.4136 0.6517 0.8073
No log 7.125 228 0.6312 0.3636 0.6312 0.7945
No log 7.1875 230 0.6271 0.3348 0.6271 0.7919
No log 7.25 232 0.6179 0.3690 0.6179 0.7861
No log 7.3125 234 0.6342 0.4264 0.6342 0.7964
No log 7.375 236 0.6484 0.4148 0.6484 0.8052
No log 7.4375 238 0.6276 0.4825 0.6276 0.7922
No log 7.5 240 0.5969 0.3636 0.5969 0.7726
No log 7.5625 242 0.6512 0.4293 0.6512 0.8070
No log 7.625 244 0.6156 0.4351 0.6156 0.7846
No log 7.6875 246 0.5620 0.4898 0.5620 0.7497
No log 7.75 248 0.6072 0.4933 0.6072 0.7792
No log 7.8125 250 0.7618 0.4133 0.7618 0.8728
No log 7.875 252 0.7096 0.4072 0.7096 0.8424
No log 7.9375 254 0.6403 0.4350 0.6403 0.8002
No log 8.0 256 0.5981 0.4134 0.5981 0.7734
No log 8.0625 258 0.5893 0.4618 0.5893 0.7676
No log 8.125 260 0.5695 0.4895 0.5695 0.7546
No log 8.1875 262 0.5433 0.4703 0.5433 0.7371
No log 8.25 264 0.5440 0.4703 0.5440 0.7376
No log 8.3125 266 0.5527 0.5286 0.5527 0.7435
No log 8.375 268 0.5563 0.4448 0.5563 0.7458
No log 8.4375 270 0.5591 0.5114 0.5591 0.7477
No log 8.5 272 0.5947 0.3867 0.5947 0.7712
No log 8.5625 274 0.6472 0.4272 0.6472 0.8045
No log 8.625 276 0.6380 0.4272 0.6380 0.7988
No log 8.6875 278 0.5890 0.3894 0.5890 0.7675
No log 8.75 280 0.5633 0.4357 0.5633 0.7505
No log 8.8125 282 0.5683 0.3866 0.5683 0.7538
No log 8.875 284 0.5748 0.3545 0.5748 0.7581
No log 8.9375 286 0.5557 0.4060 0.5557 0.7455
No log 9.0 288 0.5263 0.3947 0.5263 0.7255
No log 9.0625 290 0.5228 0.3661 0.5228 0.7231
No log 9.125 292 0.5549 0.4941 0.5549 0.7449
No log 9.1875 294 0.5724 0.4941 0.5724 0.7566
No log 9.25 296 0.5417 0.4634 0.5417 0.7360
No log 9.3125 298 0.5487 0.4211 0.5487 0.7407
No log 9.375 300 0.5678 0.3664 0.5678 0.7535
No log 9.4375 302 0.5857 0.3612 0.5857 0.7653
No log 9.5 304 0.5459 0.4471 0.5459 0.7388
No log 9.5625 306 0.5628 0.4883 0.5628 0.7502
No log 9.625 308 0.7055 0.4468 0.7055 0.8399
No log 9.6875 310 0.6913 0.4887 0.6913 0.8315
No log 9.75 312 0.5607 0.4704 0.5607 0.7488
No log 9.8125 314 0.5333 0.4547 0.5333 0.7303
No log 9.875 316 0.5355 0.4724 0.5355 0.7318
No log 9.9375 318 0.5351 0.4052 0.5351 0.7315
No log 10.0 320 0.5356 0.4086 0.5356 0.7319
No log 10.0625 322 0.5273 0.5189 0.5273 0.7262
No log 10.125 324 0.5275 0.4407 0.5275 0.7263
No log 10.1875 326 0.5561 0.5212 0.5561 0.7457
No log 10.25 328 0.5924 0.5015 0.5924 0.7697
No log 10.3125 330 0.5699 0.5015 0.5699 0.7549
No log 10.375 332 0.4996 0.5633 0.4996 0.7068
No log 10.4375 334 0.4942 0.5640 0.4942 0.7030
No log 10.5 336 0.5131 0.5512 0.5131 0.7163
No log 10.5625 338 0.5928 0.4569 0.5928 0.7699
No log 10.625 340 0.5902 0.4795 0.5902 0.7683
No log 10.6875 342 0.5425 0.4698 0.5425 0.7365
No log 10.75 344 0.5376 0.4970 0.5376 0.7332
No log 10.8125 346 0.5523 0.5286 0.5523 0.7432
No log 10.875 348 0.5591 0.4444 0.5591 0.7477
No log 10.9375 350 0.5964 0.4821 0.5964 0.7723
No log 11.0 352 0.6080 0.4895 0.6080 0.7797
No log 11.0625 354 0.5808 0.5518 0.5808 0.7621
No log 11.125 356 0.7362 0.4845 0.7362 0.8580
No log 11.1875 358 0.9059 0.3724 0.9059 0.9518
No log 11.25 360 0.8431 0.4382 0.8431 0.9182
No log 11.3125 362 0.6456 0.4448 0.6456 0.8035
No log 11.375 364 0.5812 0.4885 0.5812 0.7624
No log 11.4375 366 0.5710 0.4885 0.5710 0.7556
No log 11.5 368 0.5557 0.4953 0.5557 0.7455
No log 11.5625 370 0.5396 0.3889 0.5396 0.7345
No log 11.625 372 0.5384 0.4425 0.5384 0.7338
No log 11.6875 374 0.5501 0.4402 0.5501 0.7417
No log 11.75 376 0.5702 0.4419 0.5702 0.7551
No log 11.8125 378 0.5932 0.4895 0.5932 0.7702
No log 11.875 380 0.5638 0.5071 0.5638 0.7509
No log 11.9375 382 0.5461 0.4555 0.5461 0.7390
No log 12.0 384 0.5616 0.5141 0.5616 0.7494
No log 12.0625 386 0.6201 0.4997 0.6201 0.7875
No log 12.125 388 0.7189 0.3913 0.7189 0.8479
No log 12.1875 390 0.6725 0.4349 0.6725 0.8201
No log 12.25 392 0.5464 0.4726 0.5464 0.7392
No log 12.3125 394 0.5356 0.4068 0.5356 0.7318
No log 12.375 396 0.5636 0.3779 0.5636 0.7507
No log 12.4375 398 0.5500 0.3806 0.5500 0.7416
No log 12.5 400 0.5349 0.4289 0.5349 0.7314
No log 12.5625 402 0.5589 0.4514 0.5589 0.7476
No log 12.625 404 0.5740 0.3976 0.5740 0.7576
No log 12.6875 406 0.5538 0.4514 0.5538 0.7442
No log 12.75 408 0.5397 0.3305 0.5397 0.7346
No log 12.8125 410 0.5414 0.3305 0.5414 0.7358
No log 12.875 412 0.5504 0.3354 0.5504 0.7419
No log 12.9375 414 0.5500 0.3659 0.5500 0.7416
No log 13.0 416 0.5905 0.4663 0.5905 0.7684
No log 13.0625 418 0.6571 0.4295 0.6571 0.8106
No log 13.125 420 0.6411 0.4385 0.6411 0.8007
No log 13.1875 422 0.5814 0.4883 0.5814 0.7625
No log 13.25 424 0.5782 0.4883 0.5782 0.7604
No log 13.3125 426 0.5631 0.5232 0.5631 0.7504
No log 13.375 428 0.5787 0.4437 0.5787 0.7607
No log 13.4375 430 0.5659 0.4354 0.5659 0.7523
No log 13.5 432 0.5486 0.4126 0.5486 0.7407
No log 13.5625 434 0.5549 0.3860 0.5549 0.7449
No log 13.625 436 0.5635 0.4116 0.5635 0.7507
No log 13.6875 438 0.5502 0.4724 0.5502 0.7418
No log 13.75 440 0.5646 0.3915 0.5646 0.7514
No log 13.8125 442 0.5722 0.3551 0.5722 0.7564
No log 13.875 444 0.5914 0.3622 0.5914 0.7690
No log 13.9375 446 0.6115 0.3622 0.6115 0.7820
No log 14.0 448 0.5968 0.3416 0.5968 0.7726
No log 14.0625 450 0.6006 0.3416 0.6006 0.7750
No log 14.125 452 0.6163 0.3701 0.6163 0.7850
No log 14.1875 454 0.5850 0.3659 0.5850 0.7648
No log 14.25 456 0.5662 0.3585 0.5662 0.7525
No log 14.3125 458 0.5738 0.3302 0.5738 0.7575
No log 14.375 460 0.5616 0.3318 0.5616 0.7494
No log 14.4375 462 0.5943 0.3701 0.5943 0.7709
No log 14.5 464 0.6565 0.4190 0.6565 0.8102
No log 14.5625 466 0.6461 0.4190 0.6461 0.8038
No log 14.625 468 0.5864 0.3701 0.5864 0.7658
No log 14.6875 470 0.5535 0.3920 0.5535 0.7440
No log 14.75 472 0.5447 0.4186 0.5447 0.7381
No log 14.8125 474 0.5497 0.4199 0.5497 0.7414
No log 14.875 476 0.5456 0.4515 0.5456 0.7386
No log 14.9375 478 0.5498 0.4224 0.5498 0.7415
No log 15.0 480 0.5832 0.4267 0.5832 0.7637
No log 15.0625 482 0.6219 0.3688 0.6219 0.7886
No log 15.125 484 0.6820 0.3688 0.6820 0.8258
No log 15.1875 486 0.7024 0.2932 0.7024 0.8381
No log 15.25 488 0.7297 0.4329 0.7297 0.8542
No log 15.3125 490 0.7010 0.3794 0.7010 0.8373
No log 15.375 492 0.6526 0.3545 0.6526 0.8078
No log 15.4375 494 0.6440 0.4076 0.6440 0.8025
No log 15.5 496 0.6002 0.3224 0.6002 0.7747
No log 15.5625 498 0.5663 0.4441 0.5663 0.7526
0.2766 15.625 500 0.5594 0.4634 0.5594 0.7480
0.2766 15.6875 502 0.5604 0.4703 0.5604 0.7486
0.2766 15.75 504 0.5823 0.4148 0.5823 0.7631
0.2766 15.8125 506 0.6119 0.4753 0.6119 0.7822
0.2766 15.875 508 0.5899 0.4514 0.5899 0.7680
0.2766 15.9375 510 0.5892 0.3575 0.5892 0.7676
0.2766 16.0 512 0.5808 0.3688 0.5808 0.7621
0.2766 16.0625 514 0.5713 0.3980 0.5713 0.7558
0.2766 16.125 516 0.5623 0.3072 0.5623 0.7499

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
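To reproduce the environment, the pinned versions above can be installed directly; the `+cu118` suffix indicates a CUDA 11.8 build of PyTorch, so add the matching index URL if you need that GPU build:

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
# Plain torch==2.4.0 installs the default build; for the CUDA 11.8 build
# noted on the card, append: --index-url https://download.pytorch.org/whl/cu118
pip install "torch==2.4.0"
```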
Model size and format

  • Format: Safetensors
  • Parameters: ~0.1B
  • Tensor type: F32
