ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8259
  • Qwk (quadratic weighted kappa): 0.5470
  • Mse (mean squared error): 0.8259
  • Rmse (root mean squared error): 0.9088
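
A minimal inference sketch, assuming the checkpoint exposes a single-logit regression head (suggested by the MSE/RMSE metrics, but not confirmed in this card); the essay text is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id taken from this card; the regression-head assumption is inferred
# from the metrics above, not stated by the author.
model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization (placeholder)

inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# With a single-logit regression head, the raw logit is the predicted score.
print(logits.squeeze().item())
```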

Model description

More information needed. Based on the model name and the regression metrics above, this appears to be AraBERT v02 fine-tuned to score the organization trait of Arabic essays (task 2); the checkpoint holds roughly 0.1B parameters stored as F32 safetensors.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
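
These settings map onto transformers.TrainingArguments roughly as sketched below; output_dir is a hypothetical placeholder, not the author's actual training script:

```python
from transformers import TrainingArguments

# A sketch reproducing the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```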

Training results

Training loss is first logged at step 500, so earlier rows show "No log". Columns: training loss, epoch, step, validation loss, quadratic weighted kappa (Qwk), mean squared error (Mse), and root mean squared error (Rmse).

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0177 2 4.5331 -0.0103 4.5331 2.1291
No log 0.0354 4 3.0722 0.0048 3.0722 1.7528
No log 0.0531 6 1.5891 0.0504 1.5891 1.2606
No log 0.0708 8 1.4019 0.0353 1.4019 1.1840
No log 0.0885 10 1.1848 0.0671 1.1848 1.0885
No log 0.1062 12 1.1921 0.0628 1.1921 1.0918
No log 0.1239 14 1.1616 0.0628 1.1616 1.0778
No log 0.1416 16 1.1063 0.1214 1.1063 1.0518
No log 0.1593 18 1.0318 0.3124 1.0318 1.0158
No log 0.1770 20 1.1120 0.3276 1.1120 1.0545
No log 0.1947 22 1.3889 0.0806 1.3889 1.1785
No log 0.2124 24 1.5334 0.1763 1.5334 1.2383
No log 0.2301 26 1.4053 0.1833 1.4053 1.1854
No log 0.2478 28 1.1221 0.2202 1.1221 1.0593
No log 0.2655 30 1.0209 0.4214 1.0209 1.0104
No log 0.2832 32 0.9914 0.4367 0.9914 0.9957
No log 0.3009 34 0.9727 0.4221 0.9727 0.9863
No log 0.3186 36 0.9438 0.3326 0.9438 0.9715
No log 0.3363 38 0.9404 0.3174 0.9404 0.9697
No log 0.3540 40 0.9281 0.3278 0.9281 0.9634
No log 0.3717 42 0.9079 0.4466 0.9079 0.9528
No log 0.3894 44 0.8961 0.4599 0.8961 0.9466
No log 0.4071 46 0.9690 0.4341 0.9690 0.9844
No log 0.4248 48 1.2089 0.3254 1.2089 1.0995
No log 0.4425 50 1.1972 0.3254 1.1972 1.0942
No log 0.4602 52 1.0502 0.3818 1.0502 1.0248
No log 0.4779 54 0.9258 0.5422 0.9258 0.9622
No log 0.4956 56 0.9118 0.5481 0.9118 0.9549
No log 0.5133 58 0.9072 0.5114 0.9072 0.9525
No log 0.5310 60 0.9131 0.4440 0.9131 0.9556
No log 0.5487 62 0.9078 0.5131 0.9078 0.9528
No log 0.5664 64 0.9716 0.5240 0.9716 0.9857
No log 0.5841 66 1.3070 0.4094 1.3070 1.1432
No log 0.6018 68 1.7774 0.3328 1.7774 1.3332
No log 0.6195 70 2.1783 0.2494 2.1783 1.4759
No log 0.6372 72 2.0173 0.2518 2.0173 1.4203
No log 0.6549 74 1.6582 0.4116 1.6582 1.2877
No log 0.6726 76 1.2196 0.3459 1.2196 1.1043
No log 0.6903 78 0.8487 0.5484 0.8487 0.9212
No log 0.7080 80 0.9261 0.5952 0.9261 0.9623
No log 0.7257 82 0.9573 0.5550 0.9573 0.9784
No log 0.7434 84 0.9100 0.5091 0.9100 0.9539
No log 0.7611 86 0.8462 0.5501 0.8462 0.9199
No log 0.7788 88 0.8785 0.5342 0.8785 0.9373
No log 0.7965 90 0.8588 0.5540 0.8588 0.9267
No log 0.8142 92 0.8034 0.5431 0.8034 0.8963
No log 0.8319 94 0.8288 0.5563 0.8288 0.9104
No log 0.8496 96 0.8961 0.5670 0.8961 0.9466
No log 0.8673 98 0.8741 0.5797 0.8741 0.9349
No log 0.8850 100 0.7990 0.5725 0.7990 0.8939
No log 0.9027 102 0.7882 0.5204 0.7882 0.8878
No log 0.9204 104 0.8207 0.4949 0.8207 0.9059
No log 0.9381 106 0.8068 0.4864 0.8068 0.8982
No log 0.9558 108 0.8569 0.5926 0.8569 0.9257
No log 0.9735 110 0.9075 0.5976 0.9075 0.9526
No log 0.9912 112 0.8065 0.5561 0.8065 0.8980
No log 1.0088 114 0.7662 0.4512 0.7662 0.8753
No log 1.0265 116 0.7639 0.4122 0.7639 0.8740
No log 1.0442 118 0.7380 0.4798 0.7380 0.8591
No log 1.0619 120 0.7488 0.5198 0.7488 0.8653
No log 1.0796 122 0.7829 0.5424 0.7829 0.8848
No log 1.0973 124 0.7526 0.5763 0.7526 0.8675
No log 1.1150 126 0.7825 0.5570 0.7825 0.8846
No log 1.1327 128 0.8635 0.5094 0.8635 0.9293
No log 1.1504 130 0.9176 0.5493 0.9176 0.9579
No log 1.1681 132 0.8972 0.5532 0.8972 0.9472
No log 1.1858 134 0.7979 0.5131 0.7979 0.8933
No log 1.2035 136 0.7745 0.5345 0.7745 0.8800
No log 1.2212 138 0.7716 0.4785 0.7716 0.8784
No log 1.2389 140 0.7653 0.5359 0.7653 0.8748
No log 1.2566 142 0.7651 0.5148 0.7651 0.8747
No log 1.2743 144 0.7516 0.5713 0.7516 0.8669
No log 1.2920 146 0.7669 0.5344 0.7669 0.8757
No log 1.3097 148 0.8466 0.5893 0.8466 0.9201
No log 1.3274 150 1.0390 0.4954 1.0390 1.0193
No log 1.3451 152 0.9643 0.4631 0.9643 0.9820
No log 1.3628 154 0.7733 0.6391 0.7733 0.8794
No log 1.3805 156 0.7833 0.6296 0.7833 0.8850
No log 1.3982 158 0.7789 0.6316 0.7789 0.8826
No log 1.4159 160 0.7314 0.6160 0.7314 0.8552
No log 1.4336 162 0.7698 0.6324 0.7698 0.8774
No log 1.4513 164 0.8656 0.5877 0.8656 0.9304
No log 1.4690 166 0.8039 0.6209 0.8039 0.8966
No log 1.4867 168 0.8132 0.5956 0.8132 0.9018
No log 1.5044 170 0.7916 0.5465 0.7916 0.8897
No log 1.5221 172 0.7856 0.4923 0.7856 0.8864
No log 1.5398 174 0.7774 0.5420 0.7774 0.8817
No log 1.5575 176 0.7798 0.4996 0.7798 0.8830
No log 1.5752 178 0.7883 0.5131 0.7883 0.8879
No log 1.5929 180 0.8413 0.5141 0.8413 0.9172
No log 1.6106 182 0.8199 0.5750 0.8199 0.9055
No log 1.6283 184 0.7758 0.6097 0.7758 0.8808
No log 1.6460 186 0.7701 0.6332 0.7701 0.8776
No log 1.6637 188 0.7570 0.6032 0.7570 0.8701
No log 1.6814 190 0.7612 0.6023 0.7612 0.8724
No log 1.6991 192 0.7809 0.5728 0.7809 0.8837
No log 1.7168 194 0.8467 0.5160 0.8467 0.9202
No log 1.7345 196 0.8600 0.5042 0.8600 0.9273
No log 1.7522 198 0.7680 0.5819 0.7680 0.8763
No log 1.7699 200 0.7322 0.5553 0.7322 0.8557
No log 1.7876 202 0.7425 0.5596 0.7425 0.8617
No log 1.8053 204 0.8012 0.5164 0.8012 0.8951
No log 1.8230 206 0.8308 0.5353 0.8308 0.9115
No log 1.8407 208 0.8855 0.5173 0.8855 0.9410
No log 1.8584 210 0.9024 0.5086 0.9024 0.9500
No log 1.8761 212 0.8857 0.4703 0.8857 0.9411
No log 1.8938 214 0.8235 0.4334 0.8235 0.9075
No log 1.9115 216 0.8578 0.4297 0.8578 0.9262
No log 1.9292 218 0.9125 0.5078 0.9125 0.9553
No log 1.9469 220 0.7713 0.5477 0.7713 0.8782
No log 1.9646 222 0.8036 0.4942 0.8036 0.8964
No log 1.9823 224 0.9158 0.5176 0.9158 0.9570
No log 2.0 226 1.0571 0.4035 1.0571 1.0282
No log 2.0177 228 1.0612 0.4407 1.0612 1.0301
No log 2.0354 230 0.9228 0.4862 0.9228 0.9606
No log 2.0531 232 0.8200 0.5815 0.8200 0.9056
No log 2.0708 234 0.8148 0.5763 0.8148 0.9027
No log 2.0885 236 0.8366 0.5313 0.8366 0.9146
No log 2.1062 238 0.7822 0.5987 0.7822 0.8844
No log 2.1239 240 0.7283 0.5973 0.7283 0.8534
No log 2.1416 242 0.7296 0.6007 0.7296 0.8542
No log 2.1593 244 0.7208 0.5866 0.7208 0.8490
No log 2.1770 246 0.8871 0.5322 0.8871 0.9418
No log 2.1947 248 1.0871 0.4927 1.0871 1.0426
No log 2.2124 250 1.0299 0.5168 1.0299 1.0148
No log 2.2301 252 0.8106 0.5601 0.8106 0.9004
No log 2.2478 254 0.7499 0.5716 0.7499 0.8660
No log 2.2655 256 0.7558 0.5136 0.7558 0.8694
No log 2.2832 258 0.8277 0.5245 0.8277 0.9098
No log 2.3009 260 1.0148 0.5083 1.0148 1.0074
No log 2.3186 262 1.0835 0.4477 1.0835 1.0409
No log 2.3363 264 0.9425 0.4733 0.9425 0.9708
No log 2.3540 266 0.8683 0.5014 0.8683 0.9319
No log 2.3717 268 0.8024 0.5528 0.8024 0.8958
No log 2.3894 270 0.8004 0.4860 0.8004 0.8947
No log 2.4071 272 0.8006 0.5553 0.8006 0.8948
No log 2.4248 274 0.8347 0.5147 0.8347 0.9136
No log 2.4425 276 0.9093 0.5250 0.9093 0.9536
No log 2.4602 278 0.8821 0.4703 0.8821 0.9392
No log 2.4779 280 0.8355 0.4820 0.8355 0.9140
No log 2.4956 282 0.8331 0.3896 0.8331 0.9127
No log 2.5133 284 0.8460 0.4202 0.8460 0.9198
No log 2.5310 286 0.9041 0.4382 0.9041 0.9508
No log 2.5487 288 0.8883 0.4386 0.8883 0.9425
No log 2.5664 290 0.8309 0.4653 0.8309 0.9115
No log 2.5841 292 0.8537 0.4396 0.8537 0.9239
No log 2.6018 294 0.9379 0.4464 0.9379 0.9685
No log 2.6195 296 0.8672 0.4140 0.8672 0.9312
No log 2.6372 298 0.8997 0.4743 0.8997 0.9486
No log 2.6549 300 0.9861 0.5015 0.9861 0.9930
No log 2.6726 302 0.9312 0.5000 0.9312 0.9650
No log 2.6903 304 0.8723 0.4006 0.8723 0.9340
No log 2.7080 306 0.8648 0.4142 0.8648 0.9299
No log 2.7257 308 0.8752 0.4105 0.8752 0.9355
No log 2.7434 310 0.9693 0.4426 0.9693 0.9846
No log 2.7611 312 0.9897 0.4806 0.9897 0.9948
No log 2.7788 314 0.9182 0.4694 0.9182 0.9582
No log 2.7965 316 0.8370 0.4366 0.8370 0.9149
No log 2.8142 318 0.8038 0.4534 0.8038 0.8966
No log 2.8319 320 0.7873 0.5011 0.7873 0.8873
No log 2.8496 322 0.8077 0.4907 0.8077 0.8987
No log 2.8673 324 0.8614 0.4631 0.8614 0.9281
No log 2.8850 326 0.8433 0.4425 0.8433 0.9183
No log 2.9027 328 0.8272 0.4465 0.8272 0.9095
No log 2.9204 330 0.8344 0.4465 0.8344 0.9135
No log 2.9381 332 0.8902 0.4291 0.8902 0.9435
No log 2.9558 334 0.9397 0.5071 0.9397 0.9694
No log 2.9735 336 0.9754 0.5140 0.9754 0.9876
No log 2.9912 338 0.9376 0.5140 0.9376 0.9683
No log 3.0088 340 0.8447 0.5331 0.8447 0.9191
No log 3.0265 342 0.8116 0.5262 0.8116 0.9009
No log 3.0442 344 0.8071 0.5536 0.8071 0.8984
No log 3.0619 346 0.8051 0.5386 0.8051 0.8973
No log 3.0796 348 0.8669 0.5968 0.8669 0.9311
No log 3.0973 350 0.9265 0.5736 0.9265 0.9625
No log 3.1150 352 0.8576 0.5781 0.8576 0.9261
No log 3.1327 354 0.8212 0.5722 0.8212 0.9062
No log 3.1504 356 0.8294 0.5892 0.8294 0.9107
No log 3.1681 358 0.8483 0.5756 0.8483 0.9210
No log 3.1858 360 0.8687 0.5272 0.8687 0.9320
No log 3.2035 362 0.8819 0.5253 0.8819 0.9391
No log 3.2212 364 0.8439 0.5611 0.8439 0.9186
No log 3.2389 366 0.7967 0.5596 0.7967 0.8926
No log 3.2566 368 0.7912 0.5726 0.7912 0.8895
No log 3.2743 370 0.7990 0.4932 0.7990 0.8939
No log 3.2920 372 0.7951 0.5382 0.7951 0.8917
No log 3.3097 374 0.8043 0.5958 0.8043 0.8969
No log 3.3274 376 0.9338 0.5552 0.9338 0.9663
No log 3.3451 378 1.1148 0.5027 1.1148 1.0559
No log 3.3628 380 1.0825 0.4856 1.0825 1.0404
No log 3.3805 382 0.9349 0.4898 0.9349 0.9669
No log 3.3982 384 0.8447 0.4703 0.8447 0.9191
No log 3.4159 386 0.7766 0.5250 0.7766 0.8812
No log 3.4336 388 0.7496 0.5059 0.7496 0.8658
No log 3.4513 390 0.7503 0.5059 0.7503 0.8662
No log 3.4690 392 0.7904 0.5359 0.7904 0.8891
No log 3.4867 394 0.9117 0.5384 0.9117 0.9548
No log 3.5044 396 0.9374 0.5445 0.9374 0.9682
No log 3.5221 398 0.8581 0.5592 0.8581 0.9263
No log 3.5398 400 0.7833 0.5722 0.7833 0.8850
No log 3.5575 402 0.7477 0.5686 0.7477 0.8647
No log 3.5752 404 0.7545 0.5315 0.7545 0.8686
No log 3.5929 406 0.7836 0.5012 0.7836 0.8852
No log 3.6106 408 0.8774 0.4869 0.8774 0.9367
No log 3.6283 410 0.9461 0.4824 0.9461 0.9727
No log 3.6460 412 0.9034 0.4869 0.9034 0.9505
No log 3.6637 414 0.7915 0.5013 0.7915 0.8897
No log 3.6814 416 0.7666 0.5216 0.7666 0.8756
No log 3.6991 418 0.7929 0.5618 0.7929 0.8905
No log 3.7168 420 0.7691 0.5415 0.7691 0.8770
No log 3.7345 422 0.7744 0.5028 0.7744 0.8800
No log 3.7522 424 0.8336 0.5042 0.8336 0.9130
No log 3.7699 426 0.8809 0.4685 0.8809 0.9386
No log 3.7876 428 0.8558 0.4914 0.8558 0.9251
No log 3.8053 430 0.8008 0.5041 0.8008 0.8949
No log 3.8230 432 0.8076 0.6059 0.8076 0.8987
No log 3.8407 434 0.8694 0.5339 0.8694 0.9324
No log 3.8584 436 0.8349 0.5427 0.8349 0.9137
No log 3.8761 438 0.7806 0.5580 0.7806 0.8835
No log 3.8938 440 0.8251 0.4940 0.8251 0.9083
No log 3.9115 442 0.8777 0.4828 0.8777 0.9368
No log 3.9292 444 0.8593 0.5044 0.8593 0.9270
No log 3.9469 446 0.8516 0.5044 0.8516 0.9228
No log 3.9646 448 0.8469 0.5354 0.8469 0.9202
No log 3.9823 450 0.8283 0.5268 0.8283 0.9101
No log 4.0 452 0.8082 0.4927 0.8082 0.8990
No log 4.0177 454 0.8205 0.5294 0.8205 0.9058
No log 4.0354 456 0.9032 0.4918 0.9032 0.9504
No log 4.0531 458 1.0507 0.4845 1.0507 1.0250
No log 4.0708 460 1.1257 0.4891 1.1257 1.0610
No log 4.0885 462 1.0746 0.5311 1.0746 1.0366
No log 4.1062 464 0.9396 0.5002 0.9396 0.9693
No log 4.1239 466 0.8224 0.4838 0.8224 0.9069
No log 4.1416 468 0.7808 0.5131 0.7808 0.8836
No log 4.1593 470 0.7776 0.4826 0.7776 0.8818
No log 4.1770 472 0.7734 0.4979 0.7734 0.8794
No log 4.1947 474 0.7931 0.5759 0.7931 0.8905
No log 4.2124 476 0.8586 0.5712 0.8586 0.9266
No log 4.2301 478 0.9410 0.4834 0.9410 0.9700
No log 4.2478 480 1.1144 0.4285 1.1144 1.0557
No log 4.2655 482 1.2020 0.4362 1.2020 1.0964
No log 4.2832 484 1.1186 0.4362 1.1186 1.0576
No log 4.3009 486 0.9264 0.4101 0.9264 0.9625
No log 4.3186 488 0.8290 0.5171 0.8290 0.9105
No log 4.3363 490 0.7999 0.5702 0.7999 0.8944
No log 4.3540 492 0.7920 0.5610 0.7920 0.8899
No log 4.3717 494 0.7659 0.5239 0.7659 0.8751
No log 4.3894 496 0.7492 0.5451 0.7492 0.8655
No log 4.4071 498 0.7418 0.5259 0.7418 0.8613
0.3669 4.4248 500 0.7953 0.5847 0.7953 0.8918
0.3669 4.4425 502 0.8523 0.5553 0.8523 0.9232
0.3669 4.4602 504 0.8619 0.5444 0.8619 0.9284
0.3669 4.4779 506 0.8021 0.5493 0.8021 0.8956
0.3669 4.4956 508 0.8031 0.5107 0.8031 0.8962
0.3669 4.5133 510 0.8259 0.5470 0.8259 0.9088
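
The Qwk, Mse, and Rmse columns can be reproduced from model predictions and gold scores roughly as follows; rounding continuous predictions to integers before computing the kappa is an assumption of this sketch, and the example arrays are invented:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)

    mse = mean_squared_error(labels, preds)
    rmse = np.sqrt(mse)
    # Quadratic weighted kappa needs discrete labels, so continuous
    # regression outputs are rounded first (an assumption of this sketch).
    qwk = cohen_kappa_score(
        labels.astype(int), np.rint(preds).astype(int), weights="quadratic"
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Hypothetical predictions and gold scores:
print(evaluate([2.2, 3.8, 1.1], [2, 4, 1]))
```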

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1