ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.8781
  • QWK: 0.3481
  • MSE: 0.8781
  • RMSE: 0.9371
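Note that the reported Loss and MSE coincide, suggesting an MSE training objective with the quality metrics derived from the same predictions. For reference, QWK (quadratic weighted kappa) and RMSE can be computed from predicted and gold ordinal scores; a minimal pure-Python sketch (the function names are illustrative, not from the training code):

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length score lists."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for ordinal labels 0..n_classes-1."""
    n = len(y_true)
    # Observed agreement matrix O[i][j]: gold label i, predicted label j
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1.0
    # Marginal histograms and expected matrix under chance agreement
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    exp = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
           for i in range(n_classes)]
    # Quadratic disagreement weights: penalty grows with squared distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * obs[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * exp[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den
```

Perfect agreement yields a kappa of 1.0; chance-level agreement yields roughly 0.0, so the 0.3481 above indicates moderate agreement with the gold organization scores.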

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
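The settings above map onto Hugging Face `TrainingArguments` roughly as follows. This is a hedged sketch: the argument values come from the list above, while `output_dir` and anything else is an illustrative assumption, not taken from the original training script.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters expressed as TrainingArguments.
# output_dir is a hypothetical path; evaluation/save behavior of the
# original run is unknown and therefore left at library defaults.
training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Matches the listed Adam settings
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

Note that although `num_epochs` was set to 100, the results table below stops at epoch 9.56 (step 526), so the run evidently ended well before that limit.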

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0364 2 2.6122 -0.0924 2.6122 1.6162
No log 0.0727 4 1.1697 0.1797 1.1697 1.0815
No log 0.1091 6 0.6923 0.2486 0.6923 0.8320
No log 0.1455 8 1.0369 0.2141 1.0369 1.0183
No log 0.1818 10 1.3344 0.0599 1.3344 1.1552
No log 0.2182 12 1.4565 -0.0369 1.4565 1.2069
No log 0.2545 14 1.4415 -0.0605 1.4415 1.2006
No log 0.2909 16 1.3658 0.1067 1.3658 1.1687
No log 0.3273 18 1.1264 0.3118 1.1264 1.0613
No log 0.3636 20 0.9641 0.3395 0.9641 0.9819
No log 0.4 22 0.8929 0.3579 0.8929 0.9449
No log 0.4364 24 0.7637 0.2904 0.7637 0.8739
No log 0.4727 26 0.7344 0.2904 0.7344 0.8570
No log 0.5091 28 0.6335 0.4489 0.6335 0.7959
No log 0.5455 30 0.6000 0.3336 0.6000 0.7746
No log 0.5818 32 0.6114 0.4273 0.6114 0.7819
No log 0.6182 34 0.6336 0.2987 0.6336 0.7960
No log 0.6545 36 0.7208 0.3396 0.7208 0.8490
No log 0.6909 38 0.7567 0.1561 0.7567 0.8699
No log 0.7273 40 0.7103 0.1508 0.7103 0.8428
No log 0.7636 42 0.6573 0.2145 0.6573 0.8107
No log 0.8 44 0.6616 0.3238 0.6616 0.8134
No log 0.8364 46 0.8007 0.3889 0.8007 0.8948
No log 0.8727 48 1.0384 0.3436 1.0384 1.0190
No log 0.9091 50 1.1907 0.0471 1.1907 1.0912
No log 0.9455 52 1.1490 0.0390 1.1490 1.0719
No log 0.9818 54 1.0745 0.1243 1.0745 1.0366
No log 1.0182 56 1.0408 0.2703 1.0408 1.0202
No log 1.0545 58 0.9044 0.3359 0.9044 0.9510
No log 1.0909 60 0.7449 0.3675 0.7449 0.8631
No log 1.1273 62 0.6103 0.3524 0.6103 0.7812
No log 1.1636 64 0.6343 0.4174 0.6343 0.7964
No log 1.2 66 0.6888 0.3730 0.6888 0.8300
No log 1.2364 68 0.7023 0.3730 0.7023 0.8380
No log 1.2727 70 0.6482 0.4262 0.6482 0.8051
No log 1.3091 72 0.6354 0.3910 0.6354 0.7971
No log 1.3455 74 0.7545 0.5101 0.7545 0.8686
No log 1.3818 76 0.9570 0.3524 0.9570 0.9783
No log 1.4182 78 1.0473 0.2939 1.0473 1.0234
No log 1.4545 80 1.1210 0.2939 1.1210 1.0588
No log 1.4909 82 1.0275 0.2909 1.0275 1.0137
No log 1.5273 84 0.9337 0.3376 0.9337 0.9663
No log 1.5636 86 0.8626 0.3174 0.8626 0.9287
No log 1.6 88 0.7502 0.4413 0.7502 0.8661
No log 1.6364 90 0.6696 0.4597 0.6696 0.8183
No log 1.6727 92 0.6661 0.4597 0.6661 0.8162
No log 1.7091 94 0.7071 0.4104 0.7071 0.8409
No log 1.7455 96 0.7936 0.4246 0.7936 0.8909
No log 1.7818 98 0.7416 0.4683 0.7416 0.8611
No log 1.8182 100 0.7849 0.4230 0.7849 0.8860
No log 1.8545 102 0.8441 0.4332 0.8441 0.9188
No log 1.8909 104 0.7952 0.4203 0.7952 0.8917
No log 1.9273 106 0.7597 0.3981 0.7597 0.8716
No log 1.9636 108 0.7919 0.4851 0.7919 0.8899
No log 2.0 110 0.8469 0.5365 0.8469 0.9203
No log 2.0364 112 0.8359 0.4754 0.8359 0.9143
No log 2.0727 114 0.7248 0.4705 0.7248 0.8513
No log 2.1091 116 0.6599 0.4749 0.6599 0.8123
No log 2.1455 118 0.6503 0.4978 0.6503 0.8064
No log 2.1818 120 0.6656 0.4808 0.6656 0.8158
No log 2.2182 122 0.6008 0.5063 0.6008 0.7751
No log 2.2545 124 0.5656 0.5512 0.5656 0.7520
No log 2.2909 126 0.5398 0.4966 0.5398 0.7347
No log 2.3273 128 0.5633 0.5395 0.5633 0.7506
No log 2.3636 130 0.6468 0.4749 0.6468 0.8043
No log 2.4 132 0.8224 0.3727 0.8224 0.9068
No log 2.4364 134 1.0126 0.3721 1.0126 1.0063
No log 2.4727 136 1.0172 0.2838 1.0172 1.0085
No log 2.5091 138 0.9692 0.2827 0.9692 0.9845
No log 2.5455 140 0.9310 0.2710 0.9310 0.9649
No log 2.5818 142 0.8533 0.3346 0.8533 0.9237
No log 2.6182 144 0.8222 0.3806 0.8222 0.9068
No log 2.6545 146 0.6908 0.4057 0.6908 0.8311
No log 2.6909 148 0.5960 0.5422 0.5960 0.7720
No log 2.7273 150 0.5992 0.4299 0.5992 0.7741
No log 2.7636 152 0.6048 0.4299 0.6048 0.7777
No log 2.8 154 0.6073 0.4569 0.6073 0.7793
No log 2.8364 156 0.5926 0.4201 0.5926 0.7698
No log 2.8727 158 0.6114 0.4801 0.6114 0.7819
No log 2.9091 160 0.6709 0.3781 0.6709 0.8191
No log 2.9455 162 0.6804 0.4295 0.6804 0.8249
No log 2.9818 164 0.7180 0.4038 0.7180 0.8474
No log 3.0182 166 0.7179 0.4385 0.7179 0.8473
No log 3.0545 168 0.7106 0.4738 0.7106 0.8430
No log 3.0909 170 0.6757 0.5059 0.6757 0.8220
No log 3.1273 172 0.6021 0.5015 0.6021 0.7760
No log 3.1636 174 0.5800 0.5362 0.5800 0.7616
No log 3.2 176 0.5869 0.5411 0.5869 0.7661
No log 3.2364 178 0.6644 0.5561 0.6644 0.8151
No log 3.2727 180 0.7467 0.4839 0.7467 0.8641
No log 3.3091 182 0.7382 0.4353 0.7382 0.8592
No log 3.3455 184 0.6342 0.4218 0.6342 0.7964
No log 3.3818 186 0.5825 0.4330 0.5825 0.7632
No log 3.4182 188 0.5717 0.4420 0.5717 0.7561
No log 3.4545 190 0.5574 0.4888 0.5574 0.7466
No log 3.4909 192 0.5830 0.5016 0.5830 0.7635
No log 3.5273 194 0.6209 0.4764 0.6209 0.7880
No log 3.5636 196 0.7006 0.4418 0.7006 0.8370
No log 3.6 198 0.7090 0.4051 0.7090 0.8420
No log 3.6364 200 0.7461 0.4409 0.7461 0.8638
No log 3.6727 202 0.7868 0.3688 0.7868 0.8870
No log 3.7091 204 0.7400 0.4014 0.7400 0.8602
No log 3.7455 206 0.6557 0.5252 0.6557 0.8097
No log 3.7818 208 0.5806 0.5736 0.5806 0.7620
No log 3.8182 210 0.5685 0.6300 0.5685 0.7540
No log 3.8545 212 0.5741 0.6025 0.5741 0.7577
No log 3.8909 214 0.6001 0.5934 0.6001 0.7746
No log 3.9273 216 0.5583 0.5831 0.5583 0.7472
No log 3.9636 218 0.5575 0.5741 0.5575 0.7466
No log 4.0 220 0.5972 0.5772 0.5972 0.7728
No log 4.0364 222 0.5775 0.5831 0.5775 0.7599
No log 4.0727 224 0.5289 0.6313 0.5289 0.7273
No log 4.1091 226 0.5273 0.6032 0.5273 0.7262
No log 4.1455 228 0.5560 0.5708 0.5560 0.7457
No log 4.1818 230 0.5738 0.5254 0.5738 0.7575
No log 4.2182 232 0.6230 0.4892 0.6230 0.7893
No log 4.2545 234 0.7497 0.3887 0.7497 0.8659
No log 4.2909 236 0.8397 0.4489 0.8397 0.9163
No log 4.3273 238 0.7863 0.4166 0.7863 0.8867
No log 4.3636 240 0.6486 0.4961 0.6486 0.8054
No log 4.4 242 0.5744 0.5817 0.5744 0.7579
No log 4.4364 244 0.5609 0.5173 0.5609 0.7489
No log 4.4727 246 0.5471 0.5473 0.5471 0.7397
No log 4.5091 248 0.5328 0.6115 0.5328 0.7300
No log 4.5455 250 0.5606 0.5567 0.5606 0.7488
No log 4.5818 252 0.5863 0.4895 0.5863 0.7657
No log 4.6182 254 0.6662 0.4144 0.6662 0.8162
No log 4.6545 256 0.7954 0.4165 0.7954 0.8919
No log 4.6909 258 0.8490 0.4511 0.8490 0.9214
No log 4.7273 260 0.8220 0.4683 0.8220 0.9066
No log 4.7636 262 0.7111 0.4265 0.7111 0.8433
No log 4.8 264 0.6175 0.4484 0.6175 0.7858
No log 4.8364 266 0.5687 0.5272 0.5687 0.7541
No log 4.8727 268 0.5597 0.5756 0.5597 0.7482
No log 4.9091 270 0.5891 0.5383 0.5891 0.7675
No log 4.9455 272 0.6351 0.5751 0.6351 0.7969
No log 4.9818 274 0.6240 0.5639 0.6240 0.7899
No log 5.0182 276 0.6382 0.5176 0.6382 0.7989
No log 5.0545 278 0.5832 0.5639 0.5832 0.7637
No log 5.0909 280 0.5524 0.5980 0.5524 0.7432
No log 5.1273 282 0.5503 0.5752 0.5503 0.7418
No log 5.1636 284 0.5552 0.5254 0.5552 0.7451
No log 5.2 286 0.6099 0.5223 0.6099 0.7810
No log 5.2364 288 0.6469 0.4646 0.6469 0.8043
No log 5.2727 290 0.6797 0.4666 0.6797 0.8244
No log 5.3091 292 0.6328 0.5484 0.6328 0.7955
No log 5.3455 294 0.5682 0.5373 0.5682 0.7538
No log 5.3818 296 0.5376 0.5647 0.5376 0.7332
No log 5.4182 298 0.5451 0.5711 0.5451 0.7383
No log 5.4545 300 0.5701 0.5646 0.5701 0.7551
No log 5.4909 302 0.5773 0.5646 0.5773 0.7598
No log 5.5273 304 0.5568 0.5827 0.5568 0.7462
No log 5.5636 306 0.5369 0.5784 0.5369 0.7327
No log 5.6 308 0.5232 0.5159 0.5232 0.7233
No log 5.6364 310 0.5533 0.5141 0.5533 0.7438
No log 5.6727 312 0.6240 0.5237 0.6240 0.7899
No log 5.7091 314 0.7192 0.4667 0.7192 0.8481
No log 5.7455 316 0.7684 0.4039 0.7684 0.8766
No log 5.7818 318 0.7094 0.4072 0.7094 0.8423
No log 5.8182 320 0.6417 0.4705 0.6417 0.8010
No log 5.8545 322 0.6233 0.4964 0.6233 0.7895
No log 5.8909 324 0.6117 0.4726 0.6117 0.7821
No log 5.9273 326 0.6271 0.5237 0.6271 0.7919
No log 5.9636 328 0.6531 0.5133 0.6531 0.8082
No log 6.0 330 0.5967 0.5098 0.5967 0.7724
No log 6.0364 332 0.5580 0.4934 0.5580 0.7470
No log 6.0727 334 0.5716 0.4613 0.5716 0.7560
No log 6.1091 336 0.5853 0.4681 0.5853 0.7650
No log 6.1455 338 0.6314 0.4933 0.6314 0.7946
No log 6.1818 340 0.7320 0.4366 0.7320 0.8556
No log 6.2182 342 0.8060 0.3946 0.8060 0.8978
No log 6.2545 344 0.8015 0.4432 0.8015 0.8953
No log 6.2909 346 0.7166 0.4438 0.7166 0.8465
No log 6.3273 348 0.6360 0.5206 0.6360 0.7975
No log 6.3636 350 0.5817 0.5157 0.5817 0.7627
No log 6.4 352 0.5522 0.5239 0.5522 0.7431
No log 6.4364 354 0.5467 0.5528 0.5467 0.7394
No log 6.4727 356 0.5680 0.5324 0.5680 0.7537
No log 6.5091 358 0.5803 0.5437 0.5803 0.7618
No log 6.5455 360 0.6032 0.4685 0.6032 0.7767
No log 6.5818 362 0.6260 0.4060 0.6260 0.7912
No log 6.6182 364 0.6575 0.5310 0.6575 0.8109
No log 6.6545 366 0.7324 0.4114 0.7324 0.8558
No log 6.6909 368 0.8136 0.4953 0.8136 0.9020
No log 6.7273 370 0.8248 0.4511 0.8248 0.9082
No log 6.7636 372 0.7524 0.4867 0.7524 0.8674
No log 6.8 374 0.6601 0.4606 0.6601 0.8125
No log 6.8364 376 0.6267 0.5467 0.6267 0.7917
No log 6.8727 378 0.6133 0.5386 0.6133 0.7832
No log 6.9091 380 0.5997 0.4933 0.5997 0.7744
No log 6.9455 382 0.6151 0.4933 0.6151 0.7843
No log 6.9818 384 0.6021 0.4933 0.6021 0.7759
No log 7.0182 386 0.5978 0.4721 0.5978 0.7732
No log 7.0545 388 0.6159 0.4484 0.6159 0.7848
No log 7.0909 390 0.6251 0.4622 0.6251 0.7906
No log 7.1273 392 0.6351 0.4484 0.6351 0.7970
No log 7.1636 394 0.6524 0.4916 0.6524 0.8077
No log 7.2 396 0.6276 0.4997 0.6276 0.7922
No log 7.2364 398 0.6258 0.4721 0.6258 0.7911
No log 7.2727 400 0.6598 0.4980 0.6598 0.8123
No log 7.3091 402 0.6979 0.4893 0.6979 0.8354
No log 7.3455 404 0.6818 0.4531 0.6818 0.8257
No log 7.3818 406 0.6310 0.4294 0.6310 0.7944
No log 7.4182 408 0.6046 0.4059 0.6046 0.7776
No log 7.4545 410 0.6191 0.4036 0.6191 0.7868
No log 7.4909 412 0.6737 0.4272 0.6737 0.8208
No log 7.5273 414 0.7559 0.3638 0.7559 0.8694
No log 7.5636 416 0.8002 0.3638 0.8002 0.8945
No log 7.6 418 0.7882 0.3803 0.7882 0.8878
No log 7.6364 420 0.7698 0.3638 0.7698 0.8774
No log 7.6727 422 0.8063 0.3980 0.8063 0.8980
No log 7.7091 424 0.8200 0.4363 0.8200 0.9056
No log 7.7455 426 0.7408 0.3638 0.7408 0.8607
No log 7.7818 428 0.6491 0.4749 0.6491 0.8057
No log 7.8182 430 0.6632 0.4491 0.6632 0.8144
No log 7.8545 432 0.7148 0.3940 0.7148 0.8454
No log 7.8909 434 0.7582 0.3940 0.7582 0.8708
No log 7.9273 436 0.8332 0.3439 0.8332 0.9128
No log 7.9636 438 0.9564 0.3253 0.9564 0.9780
No log 8.0 440 1.1334 0.3082 1.1334 1.0646
No log 8.0364 442 1.2071 0.3103 1.2071 1.0987
No log 8.0727 444 1.0299 0.3221 1.0299 1.0148
No log 8.1091 446 0.7724 0.3433 0.7724 0.8788
No log 8.1455 448 0.6366 0.4247 0.6366 0.7979
No log 8.1818 450 0.5897 0.5177 0.5897 0.7679
No log 8.2182 452 0.6094 0.4371 0.6094 0.7806
No log 8.2545 454 0.6377 0.4522 0.6377 0.7986
No log 8.2909 456 0.7083 0.3934 0.7083 0.8416
No log 8.3273 458 0.7033 0.3934 0.7033 0.8386
No log 8.3636 460 0.6453 0.4444 0.6453 0.8033
No log 8.4 462 0.5889 0.4769 0.5889 0.7674
No log 8.4364 464 0.5934 0.4534 0.5934 0.7704
No log 8.4727 466 0.6054 0.4059 0.6054 0.7781
No log 8.5091 468 0.6715 0.3914 0.6715 0.8195
No log 8.5455 470 0.7058 0.3630 0.7058 0.8401
No log 8.5818 472 0.7517 0.3630 0.7517 0.8670
No log 8.6182 474 0.7260 0.3630 0.7260 0.8520
No log 8.6545 476 0.7035 0.3699 0.7035 0.8387
No log 8.6909 478 0.6374 0.4430 0.6374 0.7984
No log 8.7273 480 0.6105 0.4602 0.6105 0.7813
No log 8.7636 482 0.5897 0.4542 0.5897 0.7679
No log 8.8 484 0.5885 0.4484 0.5885 0.7671
No log 8.8364 486 0.6167 0.4389 0.6167 0.7853
No log 8.8727 488 0.6383 0.4745 0.6383 0.7990
No log 8.9091 490 0.6450 0.4444 0.6450 0.8031
No log 8.9455 492 0.6569 0.4444 0.6569 0.8105
No log 8.9818 494 0.7066 0.4153 0.7066 0.8406
No log 9.0182 496 0.7481 0.4085 0.7481 0.8649
No log 9.0545 498 0.7680 0.4180 0.7680 0.8764
0.3313 9.0909 500 0.6969 0.4512 0.6969 0.8348
0.3313 9.1273 502 0.6103 0.4451 0.6103 0.7812
0.3313 9.1636 504 0.5726 0.4413 0.5726 0.7567
0.3313 9.2 506 0.5600 0.4968 0.5600 0.7483
0.3313 9.2364 508 0.5509 0.5272 0.5509 0.7422
0.3313 9.2727 510 0.5447 0.4892 0.5447 0.7380
0.3313 9.3091 512 0.5567 0.4619 0.5567 0.7461
0.3313 9.3455 514 0.5525 0.4801 0.5525 0.7433
0.3313 9.3818 516 0.5636 0.4699 0.5636 0.7508
0.3313 9.4182 518 0.6385 0.3985 0.6385 0.7990
0.3313 9.4545 520 0.7336 0.3297 0.7336 0.8565
0.3313 9.4909 522 0.8199 0.3521 0.8199 0.9055
0.3313 9.5273 524 0.9109 0.3317 0.9109 0.9544
0.3313 9.5636 526 0.8781 0.3481 0.8781 0.9371

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32)