ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k19_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.8057
  • Qwk: 0.6974
  • Mse: 0.8057
  • Rmse: 0.8976
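Here Qwk is the quadratic weighted kappa between predicted and gold scores, and Rmse is the square root of Mse. A minimal, dependency-free sketch of how these metrics are computed (the helper names and example labels are illustrative, not from this run):

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under chance agreement, scaled to n
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic penalty grows with the squared distance between scores
    weights = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
               for i in range(n_classes)]
    num = sum(weights[i][j] * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(weights[i][j] * expected[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5
```

Perfect agreement gives a kappa of 1.0, chance-level agreement 0.0, and systematic disagreement negative values.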

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
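These settings map directly onto the Hugging Face `Trainer` API; the betas and epsilon listed above are the library's Adam defaults. A minimal sketch (the `output_dir` path is a placeholder, not taken from this card):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters above; Adam's betas=(0.9, 0.999) and
# epsilon=1e-08 are the Trainer defaults, so they need not be set explicitly.
training_args = TrainingArguments(
    output_dir="./results",        # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```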

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0141 2 7.1012 0.0169 7.1012 2.6648
No log 0.0282 4 4.5838 0.0690 4.5838 2.1410
No log 0.0423 6 3.2876 0.0984 3.2876 1.8132
No log 0.0563 8 2.0251 0.2308 2.0251 1.4231
No log 0.0704 10 1.7906 0.3333 1.7906 1.3381
No log 0.0845 12 1.6665 0.1538 1.6665 1.2909
No log 0.0986 14 1.7818 0.0962 1.7818 1.3349
No log 0.1127 16 1.6864 0.0784 1.6864 1.2986
No log 0.1268 18 1.6422 0.0784 1.6422 1.2815
No log 0.1408 20 1.6095 0.2018 1.6095 1.2686
No log 0.1549 22 1.8414 0.0769 1.8414 1.3570
No log 0.1690 24 2.0919 0.1368 2.0919 1.4463
No log 0.1831 26 1.9291 0.2124 1.9291 1.3889
No log 0.1972 28 1.6220 0.1982 1.6220 1.2736
No log 0.2113 30 1.4932 0.3276 1.4932 1.2220
No log 0.2254 32 1.4233 0.4132 1.4233 1.1930
No log 0.2394 34 1.4439 0.3898 1.4439 1.2016
No log 0.2535 36 1.4170 0.3590 1.4170 1.1904
No log 0.2676 38 1.2235 0.5082 1.2235 1.1061
No log 0.2817 40 1.0858 0.4923 1.0858 1.0420
No log 0.2958 42 1.0278 0.5522 1.0278 1.0138
No log 0.3099 44 1.0304 0.5954 1.0304 1.0151
No log 0.3239 46 1.1075 0.6619 1.1075 1.0524
No log 0.3380 48 1.2333 0.5957 1.2333 1.1105
No log 0.3521 50 1.1745 0.6056 1.1745 1.0837
No log 0.3662 52 0.9866 0.6667 0.9866 0.9933
No log 0.3803 54 0.9110 0.6571 0.9110 0.9545
No log 0.3944 56 0.8876 0.6176 0.8876 0.9421
No log 0.4085 58 0.9070 0.6429 0.9070 0.9524
No log 0.4225 60 0.9524 0.6479 0.9524 0.9759
No log 0.4366 62 0.9651 0.6232 0.9651 0.9824
No log 0.4507 64 0.9852 0.6131 0.9852 0.9926
No log 0.4648 66 1.0616 0.5882 1.0616 1.0303
No log 0.4789 68 0.9085 0.6232 0.9085 0.9531
No log 0.4930 70 0.8909 0.6232 0.8909 0.9439
No log 0.5070 72 0.8992 0.6571 0.8992 0.9483
No log 0.5211 74 0.8962 0.6429 0.8962 0.9467
No log 0.5352 76 0.8662 0.6897 0.8662 0.9307
No log 0.5493 78 0.9158 0.6849 0.9158 0.9570
No log 0.5634 80 0.9907 0.6207 0.9907 0.9954
No log 0.5775 82 0.9829 0.6712 0.9829 0.9914
No log 0.5915 84 0.9869 0.6395 0.9869 0.9935
No log 0.6056 86 0.9526 0.6575 0.9526 0.9760
No log 0.6197 88 0.9608 0.6577 0.9608 0.9802
No log 0.6338 90 0.9390 0.6389 0.9390 0.9690
No log 0.6479 92 0.9708 0.6358 0.9708 0.9853
No log 0.6620 94 1.0131 0.6104 1.0131 1.0065
No log 0.6761 96 0.9482 0.6788 0.9482 0.9737
No log 0.6901 98 1.5330 0.5476 1.5330 1.2381
No log 0.7042 100 2.1741 0.3978 2.1741 1.4745
No log 0.7183 102 1.9323 0.4343 1.9323 1.3901
No log 0.7324 104 0.9557 0.6483 0.9557 0.9776
No log 0.7465 106 0.8978 0.7059 0.8978 0.9475
No log 0.7606 108 1.0438 0.5970 1.0438 1.0217
No log 0.7746 110 1.0603 0.5426 1.0603 1.0297
No log 0.7887 112 1.0843 0.5778 1.0843 1.0413
No log 0.8028 114 1.2337 0.5605 1.2337 1.1107
No log 0.8169 116 1.2044 0.5556 1.2044 1.0974
No log 0.8310 118 1.2870 0.4615 1.2870 1.1345
No log 0.8451 120 1.4873 0.3973 1.4873 1.2196
No log 0.8592 122 1.4513 0.4138 1.4513 1.2047
No log 0.8732 124 1.3648 0.4730 1.3648 1.1682
No log 0.8873 126 1.1569 0.5850 1.1569 1.0756
No log 0.9014 128 0.8885 0.6939 0.8885 0.9426
No log 0.9155 130 0.8599 0.6667 0.8599 0.9273
No log 0.9296 132 0.9090 0.6753 0.9090 0.9534
No log 0.9437 134 0.8558 0.6575 0.8558 0.9251
No log 0.9577 136 1.0568 0.6234 1.0568 1.0280
No log 0.9718 138 1.3716 0.5635 1.3716 1.1712
No log 0.9859 140 1.2822 0.5778 1.2822 1.1323
No log 1.0 142 0.9574 0.6842 0.9574 0.9785
No log 1.0141 144 0.7339 0.7403 0.7339 0.8567
No log 1.0282 146 0.8038 0.7421 0.8038 0.8966
No log 1.0423 148 0.8293 0.7389 0.8293 0.9107
No log 1.0563 150 0.7413 0.7285 0.7413 0.8610
No log 1.0704 152 0.8138 0.7133 0.8138 0.9021
No log 1.0845 154 1.1026 0.5732 1.1026 1.0500
No log 1.0986 156 1.4036 0.6211 1.4036 1.1847
No log 1.1127 158 1.3357 0.6387 1.3357 1.1557
No log 1.1268 160 1.1445 0.6298 1.1445 1.0698
No log 1.1408 162 0.9706 0.6415 0.9706 0.9852
No log 1.1549 164 0.9153 0.6443 0.9153 0.9567
No log 1.1690 166 0.9186 0.6452 0.9186 0.9584
No log 1.1831 168 0.8719 0.7273 0.8719 0.9337
No log 1.1972 170 0.8384 0.7333 0.8384 0.9157
No log 1.2113 172 0.7911 0.7211 0.7911 0.8894
No log 1.2254 174 0.6665 0.7162 0.6665 0.8164
No log 1.2394 176 0.6302 0.7733 0.6302 0.7939
No log 1.2535 178 0.6685 0.7237 0.6685 0.8176
No log 1.2676 180 0.8166 0.7089 0.8166 0.9037
No log 1.2817 182 0.9207 0.6957 0.9207 0.9595
No log 1.2958 184 0.8051 0.7273 0.8051 0.8973
No log 1.3099 186 0.6345 0.7467 0.6345 0.7965
No log 1.3239 188 0.6613 0.76 0.6613 0.8132
No log 1.3380 190 0.7003 0.7671 0.7003 0.8368
No log 1.3521 192 0.8167 0.6950 0.8167 0.9037
No log 1.3662 194 1.0653 0.6486 1.0653 1.0321
No log 1.3803 196 1.1784 0.5714 1.1784 1.0855
No log 1.3944 198 1.1491 0.6154 1.1491 1.0720
No log 1.4085 200 1.2020 0.5432 1.2020 1.0964
No log 1.4225 202 1.1693 0.5839 1.1693 1.0813
No log 1.4366 204 1.0017 0.6623 1.0017 1.0008
No log 1.4507 206 0.8426 0.6714 0.8426 0.9179
No log 1.4648 208 0.8591 0.6853 0.8591 0.9269
No log 1.4789 210 1.0080 0.6383 1.0080 1.0040
No log 1.4930 212 1.2832 0.5065 1.2832 1.1328
No log 1.5070 214 1.3817 0.4586 1.3817 1.1755
No log 1.5211 216 1.2523 0.5325 1.2523 1.1190
No log 1.5352 218 1.1136 0.6174 1.1136 1.0553
No log 1.5493 220 1.0656 0.6174 1.0656 1.0323
No log 1.5634 222 1.0164 0.6216 1.0164 1.0081
No log 1.5775 224 0.8481 0.7211 0.8481 0.9209
No log 1.5915 226 0.7332 0.7432 0.7332 0.8563
No log 1.6056 228 0.6226 0.7582 0.6226 0.7891
No log 1.6197 230 0.5924 0.7582 0.5924 0.7697
No log 1.6338 232 0.6320 0.7547 0.6320 0.7950
No log 1.6479 234 0.6936 0.7453 0.6936 0.8328
No log 1.6620 236 1.0158 0.7174 1.0158 1.0079
No log 1.6761 238 1.2400 0.6800 1.2400 1.1136
No log 1.6901 240 1.0574 0.7128 1.0574 1.0283
No log 1.7042 242 0.8327 0.7170 0.8327 0.9125
No log 1.7183 244 0.8473 0.7114 0.8473 0.9205
No log 1.7324 246 0.9681 0.6577 0.9681 0.9839
No log 1.7465 248 1.0281 0.6624 1.0281 1.0140
No log 1.7606 250 0.9553 0.6711 0.9553 0.9774
No log 1.7746 252 0.9330 0.6797 0.9330 0.9659
No log 1.7887 254 1.0018 0.6835 1.0018 1.0009
No log 1.8028 256 1.1092 0.6467 1.1092 1.0532
No log 1.8169 258 1.0483 0.6545 1.0483 1.0239
No log 1.8310 260 0.8562 0.7152 0.8562 0.9253
No log 1.8451 262 0.7807 0.7516 0.7807 0.8835
No log 1.8592 264 0.8677 0.7314 0.8677 0.9315
No log 1.8732 266 1.1141 0.6632 1.1141 1.0555
No log 1.8873 268 1.2733 0.6154 1.2733 1.1284
No log 1.9014 270 1.0912 0.6489 1.0912 1.0446
No log 1.9155 272 0.8365 0.75 0.8365 0.9146
No log 1.9296 274 0.6520 0.7683 0.6520 0.8075
No log 1.9437 276 0.6320 0.7692 0.6320 0.7950
No log 1.9577 278 0.6349 0.7742 0.6349 0.7968
No log 1.9718 280 0.7048 0.7643 0.7048 0.8395
No log 1.9859 282 0.9932 0.7 0.9932 0.9966
No log 2.0 284 1.1922 0.6848 1.1922 1.0919
No log 2.0141 286 1.0602 0.7 1.0602 1.0296
No log 2.0282 288 0.7521 0.7170 0.7521 0.8672
No log 2.0423 290 0.6735 0.7682 0.6735 0.8207
No log 2.0563 292 0.7111 0.7682 0.7111 0.8433
No log 2.0704 294 0.8486 0.6980 0.8486 0.9212
No log 2.0845 296 0.8904 0.6757 0.8904 0.9436
No log 2.0986 298 0.8872 0.6757 0.8872 0.9419
No log 2.1127 300 0.7633 0.7162 0.7633 0.8737
No log 2.1268 302 0.6614 0.7867 0.6614 0.8133
No log 2.1408 304 0.6276 0.7662 0.6276 0.7922
No log 2.1549 306 0.6660 0.7889 0.6660 0.8161
No log 2.1690 308 0.7668 0.7826 0.7668 0.8757
No log 2.1831 310 0.7584 0.7650 0.7584 0.8708
No log 2.1972 312 0.6705 0.7978 0.6705 0.8188
No log 2.2113 314 0.6824 0.7735 0.6824 0.8261
No log 2.2254 316 0.6784 0.7797 0.6784 0.8237
No log 2.2394 318 0.7720 0.7093 0.7720 0.8787
No log 2.2535 320 0.7684 0.7 0.7684 0.8766
No log 2.2676 322 0.7067 0.7179 0.7067 0.8406
No log 2.2817 324 0.6194 0.7785 0.6194 0.7870
No log 2.2958 326 0.6062 0.7702 0.6062 0.7786
No log 2.3099 328 0.6780 0.7735 0.6780 0.8234
No log 2.3239 330 0.7088 0.7735 0.7088 0.8419
No log 2.3380 332 0.8981 0.7391 0.8981 0.9477
No log 2.3521 334 1.0613 0.6952 1.0613 1.0302
No log 2.3662 336 1.0216 0.6624 1.0216 1.0107
No log 2.3803 338 0.9972 0.6351 0.9972 0.9986
No log 2.3944 340 0.9022 0.6331 0.9022 0.9498
No log 2.4085 342 0.8639 0.6525 0.8639 0.9295
No log 2.4225 344 0.8289 0.6974 0.8289 0.9104
No log 2.4366 346 0.9049 0.6795 0.9049 0.9513
No log 2.4507 348 1.0400 0.6867 1.0400 1.0198
No log 2.4648 350 1.1110 0.6932 1.1110 1.0540
No log 2.4789 352 1.0192 0.6584 1.0192 1.0096
No log 2.4930 354 0.9504 0.6710 0.9504 0.9749
No log 2.5070 356 0.8781 0.6974 0.8781 0.9371
No log 2.5211 358 0.8433 0.6974 0.8433 0.9183
No log 2.5352 360 0.8114 0.6974 0.8114 0.9008
No log 2.5493 362 0.8583 0.6974 0.8583 0.9264
No log 2.5634 364 0.8653 0.6944 0.8653 0.9302
No log 2.5775 366 0.9137 0.6667 0.9137 0.9559
No log 2.5915 368 0.9280 0.6577 0.9280 0.9633
No log 2.6056 370 0.9069 0.6839 0.9069 0.9523
No log 2.6197 372 0.7941 0.7179 0.7941 0.8911
No log 2.6338 374 0.7160 0.7152 0.7160 0.8462
No log 2.6479 376 0.7236 0.7152 0.7236 0.8506
No log 2.6620 378 0.8117 0.7059 0.8117 0.9010
No log 2.6761 380 0.8812 0.6974 0.8812 0.9387
No log 2.6901 382 0.9404 0.6623 0.9404 0.9698
No log 2.7042 384 0.9205 0.6623 0.9205 0.9594
No log 2.7183 386 0.8106 0.6797 0.8106 0.9003
No log 2.7324 388 0.7105 0.7190 0.7105 0.8429
No log 2.7465 390 0.6888 0.7484 0.6888 0.8300
No log 2.7606 392 0.7937 0.7081 0.7937 0.8909
No log 2.7746 394 0.9861 0.6864 0.9861 0.9930
No log 2.7887 396 1.0070 0.6709 1.0070 1.0035
No log 2.8028 398 0.8783 0.6579 0.8783 0.9372
No log 2.8169 400 0.7835 0.7172 0.7835 0.8852
No log 2.8310 402 0.7885 0.7172 0.7885 0.8880
No log 2.8451 404 0.8332 0.6757 0.8332 0.9128
No log 2.8592 406 0.9684 0.6624 0.9684 0.9841
No log 2.8732 408 1.0219 0.6624 1.0219 1.0109
No log 2.8873 410 1.0549 0.6667 1.0549 1.0271
No log 2.9014 412 0.9569 0.6835 0.9569 0.9782
No log 2.9155 414 0.8845 0.7089 0.8845 0.9405
No log 2.9296 416 0.8728 0.7089 0.8728 0.9342
No log 2.9437 418 0.8245 0.7013 0.8245 0.9080
No log 2.9577 420 0.8158 0.6887 0.8158 0.9032
No log 2.9718 422 0.7380 0.7190 0.7380 0.8591
No log 2.9859 424 0.6598 0.7342 0.6598 0.8123
No log 3.0 426 0.5849 0.8205 0.5849 0.7648
No log 3.0141 428 0.5801 0.8205 0.5801 0.7616
No log 3.0282 430 0.6595 0.7342 0.6595 0.8121
No log 3.0423 432 0.8432 0.7108 0.8432 0.9183
No log 3.0563 434 0.8807 0.7108 0.8807 0.9384
No log 3.0704 436 0.7533 0.7152 0.7533 0.8679
No log 3.0845 438 0.7008 0.7260 0.7008 0.8371
No log 3.0986 440 0.6825 0.7260 0.6825 0.8262
No log 3.1127 442 0.7217 0.7097 0.7217 0.8495
No log 3.1268 444 0.8791 0.6914 0.8791 0.9376
No log 3.1408 446 1.0777 0.6784 1.0777 1.0381
No log 3.1549 448 1.0267 0.6786 1.0267 1.0133
No log 3.1690 450 0.8969 0.6623 0.8969 0.9471
No log 3.1831 452 0.8913 0.6533 0.8913 0.9441
No log 3.1972 454 0.8753 0.6232 0.8753 0.9356
No log 3.2113 456 0.8033 0.6479 0.8033 0.8963
No log 3.2254 458 0.8118 0.7097 0.8118 0.9010
No log 3.2394 460 0.7611 0.7190 0.7611 0.8724
No log 3.2535 462 0.6957 0.7190 0.6957 0.8341
No log 3.2676 464 0.6886 0.7389 0.6886 0.8298
No log 3.2817 466 0.7357 0.7160 0.7357 0.8577
No log 3.2958 468 0.7938 0.7241 0.7938 0.8910
No log 3.3099 470 0.7095 0.75 0.7095 0.8423
No log 3.3239 472 0.7021 0.7682 0.7021 0.8379
No log 3.3380 474 0.7487 0.7152 0.7487 0.8652
No log 3.3521 476 0.8643 0.6879 0.8643 0.9297
No log 3.3662 478 0.9279 0.6795 0.9279 0.9633
No log 3.3803 480 0.8612 0.6795 0.8612 0.9280
No log 3.3944 482 0.7632 0.6621 0.7632 0.8736
No log 3.4085 484 0.7445 0.6849 0.7445 0.8628
No log 3.4225 486 0.7710 0.6712 0.7710 0.8781
No log 3.4366 488 0.8049 0.6879 0.8049 0.8971
No log 3.4507 490 0.8529 0.6879 0.8529 0.9235
No log 3.4648 492 0.9369 0.6752 0.9369 0.9679
No log 3.4789 494 0.9218 0.6395 0.9218 0.9601
No log 3.4930 496 0.8597 0.6483 0.8597 0.9272
No log 3.5070 498 0.7858 0.6712 0.7858 0.8864
0.4804 3.5211 500 0.7281 0.7059 0.7281 0.8533
0.4804 3.5352 502 0.7011 0.7261 0.7011 0.8373
0.4804 3.5493 504 0.7410 0.7205 0.7410 0.8608
0.4804 3.5634 506 0.7703 0.7125 0.7703 0.8777
0.4804 3.5775 508 0.7621 0.7226 0.7621 0.8730
0.4804 3.5915 510 0.7943 0.7226 0.7943 0.8912
0.4804 3.6056 512 0.8123 0.72 0.8123 0.9013
0.4804 3.6197 514 0.8487 0.6962 0.8487 0.9212
0.4804 3.6338 516 0.9293 0.7024 0.9293 0.9640
0.4804 3.6479 518 1.0252 0.6744 1.0252 1.0125
0.4804 3.6620 520 0.9166 0.6988 0.9166 0.9574
0.4804 3.6761 522 0.8057 0.6974 0.8057 0.8976
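In every row, Rmse is the square root of Mse, and the Validation Loss column coincides with Mse because mean squared error is the training objective. A quick check against the final row:

```python
import math

mse = 0.8057  # Mse from the final row of the table above
print(round(math.sqrt(mse), 4))  # 0.8976, the reported Rmse
```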

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params, stored as Safetensors (F32 tensors)
