ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.5310
  • Qwk: 0.3661
  • Mse: 0.5310
  • Rmse: 0.7287
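
Since the card carries no usage instructions, here is a minimal loading sketch. It assumes the checkpoint exposes a single-logit regression head (consistent with the MSE/RMSE metrics above); the hub ID is this repository's, and the essay text is a placeholder.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hub ID of this repository; assumes a single-logit regression head,
# consistent with the MSE objective reported above.
MODEL_ID = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

essay = "..."  # placeholder: an Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```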

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
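
A minimal sketch reproducing these settings with transformers.TrainingArguments (version 4.44.2, per the framework list below); the output directory is a placeholder, and the Adam betas/epsilon listed above are already the library defaults, so they need no explicit overrides.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. adam_beta1=0.9, adam_beta2=0.999,
# and adam_epsilon=1e-8 are the transformers defaults, as is the linear scheduler.
args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```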

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:---|:---|:---|:---|:---|:---|:---|
| No log | 0.0909 | 2 | 2.4916 | -0.0593 | 2.4916 | 1.5785 |
| No log | 0.1818 | 4 | 1.3797 | 0.0412 | 1.3797 | 1.1746 |
| No log | 0.2727 | 6 | 0.9256 | 0.0101 | 0.9256 | 0.9621 |
| No log | 0.3636 | 8 | 0.7246 | 0.1009 | 0.7246 | 0.8512 |
| No log | 0.4545 | 10 | 0.8729 | 0.2886 | 0.8729 | 0.9343 |
| No log | 0.5455 | 12 | 0.7715 | 0.1867 | 0.7715 | 0.8783 |
| No log | 0.6364 | 14 | 0.7103 | 0.2709 | 0.7103 | 0.8428 |
| No log | 0.7273 | 16 | 0.7454 | 0.2430 | 0.7454 | 0.8634 |
| No log | 0.8182 | 18 | 0.7411 | 0.2490 | 0.7411 | 0.8609 |
| No log | 0.9091 | 20 | 0.8235 | 0.2494 | 0.8235 | 0.9075 |
| No log | 1.0 | 22 | 0.7384 | 0.0893 | 0.7384 | 0.8593 |
| No log | 1.0909 | 24 | 0.7241 | 0.1508 | 0.7241 | 0.8510 |
| No log | 1.1818 | 26 | 0.7325 | 0.1508 | 0.7325 | 0.8559 |
| No log | 1.2727 | 28 | 0.8586 | 0.2526 | 0.8586 | 0.9266 |
| No log | 1.3636 | 30 | 1.1151 | 0.2037 | 1.1151 | 1.0560 |
| No log | 1.4545 | 32 | 0.9870 | 0.1609 | 0.9870 | 0.9935 |
| No log | 1.5455 | 34 | 0.7412 | 0.1228 | 0.7412 | 0.8609 |
| No log | 1.6364 | 36 | 0.9473 | 0.2316 | 0.9473 | 0.9733 |
| No log | 1.7273 | 38 | 1.3972 | 0.1489 | 1.3972 | 1.1820 |
| No log | 1.8182 | 40 | 1.3244 | 0.1489 | 1.3244 | 1.1508 |
| No log | 1.9091 | 42 | 1.0005 | 0.1870 | 1.0005 | 1.0003 |
| No log | 2.0 | 44 | 0.7770 | 0.2027 | 0.7770 | 0.8815 |
| No log | 2.0909 | 46 | 0.7236 | 0.0846 | 0.7236 | 0.8507 |
| No log | 2.1818 | 48 | 0.7440 | 0.0937 | 0.7440 | 0.8626 |
| No log | 2.2727 | 50 | 0.7136 | 0.0889 | 0.7136 | 0.8447 |
| No log | 2.3636 | 52 | 0.7124 | 0.1321 | 0.7124 | 0.8440 |
| No log | 2.4545 | 54 | 0.6909 | 0.3398 | 0.6909 | 0.8312 |
| No log | 2.5455 | 56 | 0.6980 | 0.3011 | 0.6980 | 0.8355 |
| No log | 2.6364 | 58 | 0.6769 | 0.3442 | 0.6769 | 0.8227 |
| No log | 2.7273 | 60 | 0.6662 | 0.3517 | 0.6662 | 0.8162 |
| No log | 2.8182 | 62 | 0.8443 | 0.2784 | 0.8443 | 0.9188 |
| No log | 2.9091 | 64 | 0.9341 | 0.3173 | 0.9341 | 0.9665 |
| No log | 3.0 | 66 | 0.7115 | 0.2525 | 0.7115 | 0.8435 |
| No log | 3.0909 | 68 | 0.7508 | 0.3615 | 0.7508 | 0.8665 |
| No log | 3.1818 | 70 | 0.8899 | 0.3799 | 0.8899 | 0.9433 |
| No log | 3.2727 | 72 | 0.7616 | 0.3569 | 0.7616 | 0.8727 |
| No log | 3.3636 | 74 | 0.7142 | 0.2867 | 0.7142 | 0.8451 |
| No log | 3.4545 | 76 | 0.8702 | 0.2183 | 0.8702 | 0.9328 |
| No log | 3.5455 | 78 | 0.8004 | 0.2476 | 0.8004 | 0.8947 |
| No log | 3.6364 | 80 | 0.7409 | 0.3189 | 0.7409 | 0.8608 |
| No log | 3.7273 | 82 | 0.6552 | 0.2837 | 0.6552 | 0.8094 |
| No log | 3.8182 | 84 | 0.6300 | 0.4468 | 0.6300 | 0.7937 |
| No log | 3.9091 | 86 | 0.7033 | 0.4728 | 0.7033 | 0.8386 |
| No log | 4.0 | 88 | 0.6317 | 0.4547 | 0.6317 | 0.7948 |
| No log | 4.0909 | 90 | 0.6478 | 0.3523 | 0.6478 | 0.8049 |
| No log | 4.1818 | 92 | 0.7094 | 0.3425 | 0.7094 | 0.8423 |
| No log | 4.2727 | 94 | 0.6164 | 0.3785 | 0.6164 | 0.7851 |
| No log | 4.3636 | 96 | 0.5911 | 0.5671 | 0.5911 | 0.7689 |
| No log | 4.4545 | 98 | 0.5939 | 0.5437 | 0.5939 | 0.7706 |
| No log | 4.5455 | 100 | 0.5584 | 0.4681 | 0.5584 | 0.7473 |
| No log | 4.6364 | 102 | 0.6163 | 0.4412 | 0.6163 | 0.7851 |
| No log | 4.7273 | 104 | 0.5830 | 0.3831 | 0.5830 | 0.7636 |
| No log | 4.8182 | 106 | 0.5618 | 0.3860 | 0.5618 | 0.7495 |
| No log | 4.9091 | 108 | 0.5914 | 0.3936 | 0.5914 | 0.7691 |
| No log | 5.0 | 110 | 0.5772 | 0.4452 | 0.5772 | 0.7597 |
| No log | 5.0909 | 112 | 0.5500 | 0.4809 | 0.5500 | 0.7416 |
| No log | 5.1818 | 114 | 0.5817 | 0.3831 | 0.5817 | 0.7627 |
| No log | 5.2727 | 116 | 0.7751 | 0.3119 | 0.7751 | 0.8804 |
| No log | 5.3636 | 118 | 0.7284 | 0.3663 | 0.7284 | 0.8534 |
| No log | 5.4545 | 120 | 0.5507 | 0.4788 | 0.5507 | 0.7421 |
| No log | 5.5455 | 122 | 0.5647 | 0.5095 | 0.5647 | 0.7514 |
| No log | 5.6364 | 124 | 0.5737 | 0.4484 | 0.5737 | 0.7574 |
| No log | 5.7273 | 126 | 0.7109 | 0.2406 | 0.7109 | 0.8432 |
| No log | 5.8182 | 128 | 0.7351 | 0.2626 | 0.7351 | 0.8574 |
| No log | 5.9091 | 130 | 0.6103 | 0.3679 | 0.6103 | 0.7812 |
| No log | 6.0 | 132 | 0.6313 | 0.4437 | 0.6313 | 0.7945 |
| No log | 6.0909 | 134 | 0.6433 | 0.3737 | 0.6433 | 0.8021 |
| No log | 6.1818 | 136 | 0.6092 | 0.4888 | 0.6092 | 0.7805 |
| No log | 6.2727 | 138 | 0.6098 | 0.4569 | 0.6098 | 0.7809 |
| No log | 6.3636 | 140 | 0.6179 | 0.4504 | 0.6179 | 0.7861 |
| No log | 6.4545 | 142 | 0.6603 | 0.3793 | 0.6603 | 0.8126 |
| No log | 6.5455 | 144 | 0.6551 | 0.4073 | 0.6551 | 0.8094 |
| No log | 6.6364 | 146 | 0.6529 | 0.4158 | 0.6529 | 0.8080 |
| No log | 6.7273 | 148 | 0.6591 | 0.4158 | 0.6591 | 0.8118 |
| No log | 6.8182 | 150 | 0.6935 | 0.4051 | 0.6935 | 0.8327 |
| No log | 6.9091 | 152 | 0.6772 | 0.2929 | 0.6772 | 0.8229 |
| No log | 7.0 | 154 | 0.7501 | 0.3851 | 0.7501 | 0.8661 |
| No log | 7.0909 | 156 | 0.7384 | 0.3896 | 0.7384 | 0.8593 |
| No log | 7.1818 | 158 | 0.6583 | 0.3645 | 0.6583 | 0.8113 |
| No log | 7.2727 | 160 | 0.6429 | 0.3933 | 0.6429 | 0.8018 |
| No log | 7.3636 | 162 | 0.6890 | 0.4230 | 0.6890 | 0.8300 |
| No log | 7.4545 | 164 | 0.7573 | 0.3636 | 0.7573 | 0.8703 |
| No log | 7.5455 | 166 | 0.7600 | 0.3636 | 0.7600 | 0.8718 |
| No log | 7.6364 | 168 | 0.7270 | 0.3718 | 0.7270 | 0.8526 |
| No log | 7.7273 | 170 | 0.5830 | 0.5042 | 0.5830 | 0.7635 |
| No log | 7.8182 | 172 | 0.5754 | 0.4576 | 0.5754 | 0.7586 |
| No log | 7.9091 | 174 | 0.6681 | 0.4606 | 0.6681 | 0.8174 |
| No log | 8.0 | 176 | 0.6119 | 0.4542 | 0.6119 | 0.7822 |
| No log | 8.0909 | 178 | 0.5689 | 0.4194 | 0.5689 | 0.7543 |
| No log | 8.1818 | 180 | 0.5813 | 0.4547 | 0.5813 | 0.7624 |
| No log | 8.2727 | 182 | 0.6494 | 0.5342 | 0.6494 | 0.8058 |
| No log | 8.3636 | 184 | 0.6900 | 0.4952 | 0.6900 | 0.8306 |
| No log | 8.4545 | 186 | 0.5960 | 0.4724 | 0.5960 | 0.7720 |
| No log | 8.5455 | 188 | 0.6012 | 0.4698 | 0.6012 | 0.7753 |
| No log | 8.6364 | 190 | 0.6606 | 0.3659 | 0.6606 | 0.8128 |
| No log | 8.7273 | 192 | 0.6268 | 0.4474 | 0.6268 | 0.7917 |
| No log | 8.8182 | 194 | 0.5809 | 0.4111 | 0.5809 | 0.7622 |
| No log | 8.9091 | 196 | 0.6027 | 0.4124 | 0.6027 | 0.7764 |
| No log | 9.0 | 198 | 0.6044 | 0.4044 | 0.6044 | 0.7774 |
| No log | 9.0909 | 200 | 0.5975 | 0.4240 | 0.5975 | 0.7730 |
| No log | 9.1818 | 202 | 0.7707 | 0.3337 | 0.7707 | 0.8779 |
| No log | 9.2727 | 204 | 0.8046 | 0.3029 | 0.8046 | 0.8970 |
| No log | 9.3636 | 206 | 0.7495 | 0.3829 | 0.7495 | 0.8658 |
| No log | 9.4545 | 208 | 0.6432 | 0.4963 | 0.6432 | 0.8020 |
| No log | 9.5455 | 210 | 0.6239 | 0.5058 | 0.6239 | 0.7899 |
| No log | 9.6364 | 212 | 0.7513 | 0.3481 | 0.7513 | 0.8668 |
| No log | 9.7273 | 214 | 0.8242 | 0.3290 | 0.8242 | 0.9078 |
| No log | 9.8182 | 216 | 0.7450 | 0.3576 | 0.7450 | 0.8631 |
| No log | 9.9091 | 218 | 0.7643 | 0.3576 | 0.7643 | 0.8742 |
| No log | 10.0 | 220 | 0.6209 | 0.4789 | 0.6209 | 0.7880 |
| No log | 10.0909 | 222 | 0.5936 | 0.4386 | 0.5936 | 0.7704 |
| No log | 10.1818 | 224 | 0.6280 | 0.4005 | 0.6280 | 0.7925 |
| No log | 10.2727 | 226 | 0.6242 | 0.4307 | 0.6242 | 0.7901 |
| No log | 10.3636 | 228 | 0.5903 | 0.3552 | 0.5903 | 0.7683 |
| No log | 10.4545 | 230 | 0.5890 | 0.3754 | 0.5890 | 0.7675 |
| No log | 10.5455 | 232 | 0.6091 | 0.3976 | 0.6091 | 0.7804 |
| No log | 10.6364 | 234 | 0.5996 | 0.4243 | 0.5996 | 0.7743 |
| No log | 10.7273 | 236 | 0.5675 | 0.3754 | 0.5675 | 0.7533 |
| No log | 10.8182 | 238 | 0.6043 | 0.3417 | 0.6043 | 0.7774 |
| No log | 10.9091 | 240 | 0.6599 | 0.3730 | 0.6599 | 0.8124 |
| No log | 11.0 | 242 | 0.5935 | 0.3704 | 0.5935 | 0.7704 |
| No log | 11.0909 | 244 | 0.5812 | 0.5367 | 0.5812 | 0.7624 |
| No log | 11.1818 | 246 | 0.6441 | 0.4513 | 0.6441 | 0.8025 |
| No log | 11.2727 | 248 | 0.5744 | 0.5131 | 0.5744 | 0.7579 |
| No log | 11.3636 | 250 | 0.5969 | 0.3390 | 0.5969 | 0.7726 |
| No log | 11.4545 | 252 | 0.6297 | 0.2833 | 0.6297 | 0.7935 |
| No log | 11.5455 | 254 | 0.7092 | 0.3051 | 0.7092 | 0.8422 |
| No log | 11.6364 | 256 | 0.6637 | 0.2737 | 0.6637 | 0.8147 |
| No log | 11.7273 | 258 | 0.6382 | 0.3042 | 0.6382 | 0.7988 |
| No log | 11.8182 | 260 | 0.6091 | 0.3808 | 0.6091 | 0.7805 |
| No log | 11.9091 | 262 | 0.5876 | 0.4086 | 0.5876 | 0.7665 |
| No log | 12.0 | 264 | 0.5895 | 0.4354 | 0.5895 | 0.7678 |
| No log | 12.0909 | 266 | 0.5956 | 0.2955 | 0.5956 | 0.7717 |
| No log | 12.1818 | 268 | 0.5960 | 0.3258 | 0.5960 | 0.7720 |
| No log | 12.2727 | 270 | 0.5960 | 0.3153 | 0.5960 | 0.7720 |
| No log | 12.3636 | 272 | 0.6132 | 0.4354 | 0.6132 | 0.7831 |
| No log | 12.4545 | 274 | 0.6623 | 0.3840 | 0.6623 | 0.8138 |
| No log | 12.5455 | 276 | 0.6301 | 0.4253 | 0.6301 | 0.7938 |
| No log | 12.6364 | 278 | 0.6228 | 0.4160 | 0.6228 | 0.7892 |
| No log | 12.7273 | 280 | 0.6757 | 0.4373 | 0.6757 | 0.8220 |
| No log | 12.8182 | 282 | 0.7024 | 0.4038 | 0.7024 | 0.8381 |
| No log | 12.9091 | 284 | 0.6429 | 0.4654 | 0.6429 | 0.8018 |
| No log | 13.0 | 286 | 0.5773 | 0.4991 | 0.5773 | 0.7598 |
| No log | 13.0909 | 288 | 0.5644 | 0.4973 | 0.5644 | 0.7513 |
| No log | 13.1818 | 290 | 0.5524 | 0.4738 | 0.5524 | 0.7432 |
| No log | 13.2727 | 292 | 0.5756 | 0.5463 | 0.5756 | 0.7587 |
| No log | 13.3636 | 294 | 0.6397 | 0.4355 | 0.6397 | 0.7998 |
| No log | 13.4545 | 296 | 0.6773 | 0.4003 | 0.6773 | 0.8230 |
| No log | 13.5455 | 298 | 0.6191 | 0.5135 | 0.6191 | 0.7868 |
| No log | 13.6364 | 300 | 0.5858 | 0.5286 | 0.5858 | 0.7653 |
| No log | 13.7273 | 302 | 0.5982 | 0.3865 | 0.5982 | 0.7734 |
| No log | 13.8182 | 304 | 0.5917 | 0.4147 | 0.5917 | 0.7692 |
| No log | 13.9091 | 306 | 0.5884 | 0.4012 | 0.5884 | 0.7670 |
| No log | 14.0 | 308 | 0.6511 | 0.4152 | 0.6511 | 0.8069 |
| No log | 14.0909 | 310 | 0.7690 | 0.3466 | 0.7690 | 0.8769 |
| No log | 14.1818 | 312 | 0.8402 | 0.3686 | 0.8402 | 0.9166 |
| No log | 14.2727 | 314 | 0.7883 | 0.3686 | 0.7882 | 0.8878 |
| No log | 14.3636 | 316 | 0.6454 | 0.4967 | 0.6454 | 0.8034 |
| No log | 14.4545 | 318 | 0.5677 | 0.5177 | 0.5677 | 0.7535 |
| No log | 14.5455 | 320 | 0.6483 | 0.4556 | 0.6483 | 0.8052 |
| No log | 14.6364 | 322 | 0.6151 | 0.4920 | 0.6151 | 0.7843 |
| No log | 14.7273 | 324 | 0.5416 | 0.5625 | 0.5416 | 0.7359 |
| No log | 14.8182 | 326 | 0.5686 | 0.5212 | 0.5686 | 0.7541 |
| No log | 14.9091 | 328 | 0.6315 | 0.4825 | 0.6315 | 0.7947 |
| No log | 15.0 | 330 | 0.6290 | 0.4904 | 0.6290 | 0.7931 |
| No log | 15.0909 | 332 | 0.5979 | 0.4841 | 0.5979 | 0.7732 |
| No log | 15.1818 | 334 | 0.5540 | 0.5171 | 0.5540 | 0.7443 |
| No log | 15.2727 | 336 | 0.5588 | 0.4504 | 0.5588 | 0.7476 |
| No log | 15.3636 | 338 | 0.5899 | 0.4044 | 0.5899 | 0.7681 |
| No log | 15.4545 | 340 | 0.6131 | 0.4100 | 0.6131 | 0.7830 |
| No log | 15.5455 | 342 | 0.6031 | 0.3945 | 0.6031 | 0.7766 |
| No log | 15.6364 | 344 | 0.6203 | 0.4448 | 0.6203 | 0.7876 |
| No log | 15.7273 | 346 | 0.6614 | 0.3754 | 0.6614 | 0.8133 |
| No log | 15.8182 | 348 | 0.6740 | 0.3754 | 0.6740 | 0.8210 |
| No log | 15.9091 | 350 | 0.6554 | 0.4005 | 0.6554 | 0.8096 |
| No log | 16.0 | 352 | 0.6205 | 0.3958 | 0.6205 | 0.7877 |
| No log | 16.0909 | 354 | 0.5981 | 0.4762 | 0.5981 | 0.7734 |
| No log | 16.1818 | 356 | 0.5958 | 0.4717 | 0.5958 | 0.7719 |
| No log | 16.2727 | 358 | 0.5815 | 0.4722 | 0.5815 | 0.7625 |
| No log | 16.3636 | 360 | 0.5771 | 0.4722 | 0.5771 | 0.7597 |
| No log | 16.4545 | 362 | 0.5834 | 0.4067 | 0.5834 | 0.7638 |
| No log | 16.5455 | 364 | 0.5913 | 0.4067 | 0.5913 | 0.7689 |
| No log | 16.6364 | 366 | 0.6001 | 0.3198 | 0.6001 | 0.7746 |
| No log | 16.7273 | 368 | 0.6041 | 0.3198 | 0.6041 | 0.7773 |
| No log | 16.8182 | 370 | 0.6150 | 0.3618 | 0.6150 | 0.7842 |
| No log | 16.9091 | 372 | 0.6150 | 0.3763 | 0.6150 | 0.7842 |
| No log | 17.0 | 374 | 0.6135 | 0.3891 | 0.6135 | 0.7833 |
| No log | 17.0909 | 376 | 0.5994 | 0.3763 | 0.5994 | 0.7742 |
| No log | 17.1818 | 378 | 0.5715 | 0.3990 | 0.5715 | 0.7560 |
| No log | 17.2727 | 380 | 0.5638 | 0.3886 | 0.5638 | 0.7508 |
| No log | 17.3636 | 382 | 0.5627 | 0.3995 | 0.5627 | 0.7501 |
| No log | 17.4545 | 384 | 0.5545 | 0.4569 | 0.5545 | 0.7447 |
| No log | 17.5455 | 386 | 0.5499 | 0.4569 | 0.5499 | 0.7416 |
| No log | 17.6364 | 388 | 0.5519 | 0.4515 | 0.5519 | 0.7429 |
| No log | 17.7273 | 390 | 0.5677 | 0.3860 | 0.5677 | 0.7535 |
| No log | 17.8182 | 392 | 0.6024 | 0.4389 | 0.6024 | 0.7761 |
| No log | 17.9091 | 394 | 0.5887 | 0.4036 | 0.5887 | 0.7673 |
| No log | 18.0 | 396 | 0.5952 | 0.4036 | 0.5952 | 0.7715 |
| No log | 18.0909 | 398 | 0.5929 | 0.4345 | 0.5929 | 0.7700 |
| No log | 18.1818 | 400 | 0.5799 | 0.3677 | 0.5799 | 0.7615 |
| No log | 18.2727 | 402 | 0.6050 | 0.3894 | 0.6050 | 0.7778 |
| No log | 18.3636 | 404 | 0.6121 | 0.3894 | 0.6121 | 0.7824 |
| No log | 18.4545 | 406 | 0.5892 | 0.4504 | 0.5892 | 0.7676 |
| No log | 18.5455 | 408 | 0.5849 | 0.4678 | 0.5849 | 0.7648 |
| No log | 18.6364 | 410 | 0.5752 | 0.4495 | 0.5752 | 0.7584 |
| No log | 18.7273 | 412 | 0.5754 | 0.4147 | 0.5754 | 0.7586 |
| No log | 18.8182 | 414 | 0.6021 | 0.3814 | 0.6021 | 0.7759 |
| No log | 18.9091 | 416 | 0.5864 | 0.4375 | 0.5864 | 0.7658 |
| No log | 19.0 | 418 | 0.5781 | 0.4222 | 0.5781 | 0.7603 |
| No log | 19.0909 | 420 | 0.5753 | 0.4458 | 0.5753 | 0.7585 |
| No log | 19.1818 | 422 | 0.5867 | 0.4866 | 0.5867 | 0.7660 |
| No log | 19.2727 | 424 | 0.5924 | 0.4638 | 0.5924 | 0.7697 |
| No log | 19.3636 | 426 | 0.5965 | 0.4700 | 0.5965 | 0.7723 |
| No log | 19.4545 | 428 | 0.5930 | 0.4514 | 0.5930 | 0.7700 |
| No log | 19.5455 | 430 | 0.5788 | 0.4576 | 0.5788 | 0.7608 |
| No log | 19.6364 | 432 | 0.5713 | 0.4278 | 0.5713 | 0.7558 |
| No log | 19.7273 | 434 | 0.5659 | 0.4240 | 0.5659 | 0.7522 |
| No log | 19.8182 | 436 | 0.5782 | 0.4802 | 0.5782 | 0.7604 |
| No log | 19.9091 | 438 | 0.5780 | 0.4802 | 0.5780 | 0.7603 |
| No log | 20.0 | 440 | 0.5743 | 0.4484 | 0.5743 | 0.7578 |
| No log | 20.0909 | 442 | 0.5799 | 0.4898 | 0.5799 | 0.7615 |
| No log | 20.1818 | 444 | 0.5865 | 0.4802 | 0.5865 | 0.7658 |
| No log | 20.2727 | 446 | 0.6344 | 0.4610 | 0.6344 | 0.7965 |
| No log | 20.3636 | 448 | 0.7131 | 0.4204 | 0.7131 | 0.8444 |
| No log | 20.4545 | 450 | 0.7115 | 0.3948 | 0.7115 | 0.8435 |
| No log | 20.5455 | 452 | 0.6299 | 0.4351 | 0.6299 | 0.7937 |
| No log | 20.6364 | 454 | 0.5993 | 0.4425 | 0.5993 | 0.7742 |
| No log | 20.7273 | 456 | 0.6620 | 0.3840 | 0.6620 | 0.8136 |
| No log | 20.8182 | 458 | 0.6642 | 0.3763 | 0.6642 | 0.8150 |
| No log | 20.9091 | 460 | 0.6182 | 0.3445 | 0.6182 | 0.7862 |
| No log | 21.0 | 462 | 0.6163 | 0.4087 | 0.6163 | 0.7851 |
| No log | 21.0909 | 464 | 0.6656 | 0.3299 | 0.6656 | 0.8158 |
| No log | 21.1818 | 466 | 0.7025 | 0.3827 | 0.7025 | 0.8382 |
| No log | 21.2727 | 468 | 0.6884 | 0.3827 | 0.6884 | 0.8297 |
| No log | 21.3636 | 470 | 0.6481 | 0.4373 | 0.6481 | 0.8051 |
| No log | 21.4545 | 472 | 0.6453 | 0.4373 | 0.6453 | 0.8033 |
| No log | 21.5455 | 474 | 0.6720 | 0.4630 | 0.6720 | 0.8198 |
| No log | 21.6364 | 476 | 0.6125 | 0.4674 | 0.6125 | 0.7826 |
| No log | 21.7273 | 478 | 0.5717 | 0.4538 | 0.5717 | 0.7561 |
| No log | 21.8182 | 480 | 0.6017 | 0.3919 | 0.6017 | 0.7757 |
| No log | 21.9091 | 482 | 0.6018 | 0.3919 | 0.6018 | 0.7757 |
| No log | 22.0 | 484 | 0.5846 | 0.4354 | 0.5846 | 0.7646 |
| No log | 22.0909 | 486 | 0.5650 | 0.4199 | 0.5650 | 0.7516 |
| No log | 22.1818 | 488 | 0.5584 | 0.4289 | 0.5584 | 0.7472 |
| No log | 22.2727 | 490 | 0.5545 | 0.4199 | 0.5545 | 0.7447 |
| No log | 22.3636 | 492 | 0.5570 | 0.4199 | 0.5570 | 0.7463 |
| No log | 22.4545 | 494 | 0.5568 | 0.4199 | 0.5568 | 0.7462 |
| No log | 22.5455 | 496 | 0.5652 | 0.4264 | 0.5652 | 0.7518 |
| No log | 22.6364 | 498 | 0.5740 | 0.4264 | 0.5740 | 0.7577 |
| 0.2755 | 22.7273 | 500 | 0.5740 | 0.3980 | 0.5740 | 0.7576 |
| 0.2755 | 22.8182 | 502 | 0.5703 | 0.4070 | 0.5703 | 0.7552 |
| 0.2755 | 22.9091 | 504 | 0.5640 | 0.4061 | 0.5640 | 0.7510 |
| 0.2755 | 23.0 | 506 | 0.5569 | 0.4061 | 0.5569 | 0.7462 |
| 0.2755 | 23.0909 | 508 | 0.5529 | 0.4061 | 0.5529 | 0.7436 |
| 0.2755 | 23.1818 | 510 | 0.5509 | 0.4515 | 0.5509 | 0.7422 |
| 0.2755 | 23.2727 | 512 | 0.5448 | 0.3974 | 0.5448 | 0.7381 |
| 0.2755 | 23.3636 | 514 | 0.5374 | 0.3690 | 0.5374 | 0.7331 |
| 0.2755 | 23.4545 | 516 | 0.5387 | 0.3967 | 0.5387 | 0.7339 |
| 0.2755 | 23.5455 | 518 | 0.5487 | 0.3911 | 0.5487 | 0.7407 |
| 0.2755 | 23.6364 | 520 | 0.5563 | 0.4222 | 0.5563 | 0.7459 |
| 0.2755 | 23.7273 | 522 | 0.5434 | 0.3691 | 0.5434 | 0.7371 |
| 0.2755 | 23.8182 | 524 | 0.5416 | 0.3690 | 0.5416 | 0.7359 |
| 0.2755 | 23.9091 | 526 | 0.5487 | 0.3688 | 0.5487 | 0.7407 |
| 0.2755 | 24.0 | 528 | 0.5410 | 0.3980 | 0.5410 | 0.7355 |
| 0.2755 | 24.0909 | 530 | 0.5310 | 0.3661 | 0.5310 | 0.7287 |
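
In this table, "No log" presumably means the training loss had not yet been logged (the Trainer's default logging interval is 500 steps, and the first logged value appears at step 500). Qwk is taken to be quadratic weighted kappa, Mse mean squared error, and Rmse its square root; the Validation Loss and Mse columns match because the model is trained with an MSE objective. Below is a hedged sketch of how such metrics are commonly computed with scikit-learn; rounding the regression outputs to discrete scores before computing kappa is an assumption, since kappa requires categorical labels.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(labels, preds):
    """Qwk/Mse/Rmse as reported above, under the assumed definitions."""
    mse = mean_squared_error(labels, preds)
    # Quadratic weighted kappa needs discrete classes; rounding regression
    # outputs to the nearest integer score is a common (assumed) choice.
    qwk = cohen_kappa_score(np.rint(labels), np.rint(preds), weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

print(eval_metrics([1, 2, 3, 2], [1.2, 1.8, 2.6, 2.4]))  # toy example
```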

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1