ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card does not name it). It achieves the following results on the evaluation set, which correspond to the final logged evaluation at epoch 8.8 (step 528); a sketch of how these metrics can be computed follows the list:

  • Loss: 0.8534
  • QWK (quadratic weighted kappa): 0.3360
  • MSE (mean squared error): 0.8534
  • RMSE (root mean squared error): 0.9238
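
Loss and MSE coincide here, consistent with an MSE training objective, and QWK is Cohen's kappa with quadratic weights, the usual agreement metric for ordinal essay scores. A minimal sketch of how these metrics can be computed with scikit-learn; rounding regression outputs to integer rubric scores before kappa is an assumption, since the card does not document the label format:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels):
    """QWK, MSE, and RMSE for essay-score predictions.

    Rounding continuous outputs to integer scores for kappa is an
    assumption about the setup, not documented in this card.
    """
    preds = np.asarray(preds).squeeze()
    labels = np.asarray(labels).squeeze()
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        labels.round().astype(int),
        preds.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```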

Model description

More information needed

Intended uses & limitations

More information needed
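
No usage guidance is documented. Below is a minimal inference sketch; that the checkpoint loads as a single-output sequence-classification (regression-style) head scoring the organization trait of Arabic essays is an assumption inferred from the repository name, not confirmed by the card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

essay = "..."  # placeholder: an Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    # assumed single-output head: the logit is read as the organization score
    score = model(**inputs).logits.squeeze().item()
print(score)
```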

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
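
For reproducibility, here is a sketch of how the listed values map onto transformers.TrainingArguments; output_dir is an illustrative placeholder, and the evaluation/saving cadence is not documented, so only the listed values are set:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the training configuration from the
# hyperparameters above; output_dir is an assumption, not a documented value.
training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```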

Training results

In the table below, "No log" in the training-loss column means the Trainer had not yet recorded a training loss at that evaluation step; the first logged value (0.2637) appears at step 500. Note also that although num_epochs was set to 100, logging ends at epoch 8.8 (step 528); the card does not explain why (early stopping is plausible but undocumented).

| Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0333 | 2 | 3.8732 | -0.0134 | 3.8732 | 1.9680 |
| No log | 0.0667 | 4 | 2.0080 | 0.0435 | 2.0080 | 1.4170 |
| No log | 0.1 | 6 | 2.1485 | -0.0086 | 2.1485 | 1.4658 |
| No log | 0.1333 | 8 | 1.9538 | 0.0142 | 1.9538 | 1.3978 |
| No log | 0.1667 | 10 | 1.4849 | 0.0613 | 1.4849 | 1.2186 |
| No log | 0.2 | 12 | 1.4659 | 0.0466 | 1.4659 | 1.2108 |
| No log | 0.2333 | 14 | 1.5843 | 0.0834 | 1.5843 | 1.2587 |
| No log | 0.2667 | 16 | 1.8604 | 0.0536 | 1.8604 | 1.3640 |
| No log | 0.3 | 18 | 2.0805 | 0.0790 | 2.0805 | 1.4424 |
| No log | 0.3333 | 20 | 1.5869 | 0.1659 | 1.5869 | 1.2597 |
| No log | 0.3667 | 22 | 1.0519 | 0.1805 | 1.0519 | 1.0256 |
| No log | 0.4 | 24 | 0.9972 | 0.2140 | 0.9972 | 0.9986 |
| No log | 0.4333 | 26 | 1.0004 | 0.3071 | 1.0004 | 1.0002 |
| No log | 0.4667 | 28 | 1.0164 | 0.2243 | 1.0164 | 1.0082 |
| No log | 0.5 | 30 | 1.1727 | 0.1176 | 1.1727 | 1.0829 |
| No log | 0.5333 | 32 | 1.1364 | 0.1296 | 1.1364 | 1.0660 |
| No log | 0.5667 | 34 | 0.9386 | 0.2643 | 0.9386 | 0.9688 |
| No log | 0.6 | 36 | 0.9683 | 0.2467 | 0.9683 | 0.9840 |
| No log | 0.6333 | 38 | 0.9579 | 0.2770 | 0.9579 | 0.9787 |
| No log | 0.6667 | 40 | 0.9888 | 0.2441 | 0.9888 | 0.9944 |
| No log | 0.7 | 42 | 1.1550 | 0.2045 | 1.1550 | 1.0747 |
| No log | 0.7333 | 44 | 1.1415 | 0.2045 | 1.1415 | 1.0684 |
| No log | 0.7667 | 46 | 0.9475 | 0.2967 | 0.9475 | 0.9734 |
| No log | 0.8 | 48 | 0.9103 | 0.3326 | 0.9103 | 0.9541 |
| No log | 0.8333 | 50 | 0.9165 | 0.3326 | 0.9165 | 0.9573 |
| No log | 0.8667 | 52 | 1.1122 | 0.2543 | 1.1122 | 1.0546 |
| No log | 0.9 | 54 | 1.3151 | 0.1815 | 1.3151 | 1.1468 |
| No log | 0.9333 | 56 | 1.3107 | 0.1966 | 1.3107 | 1.1449 |
| No log | 0.9667 | 58 | 1.2015 | 0.2534 | 1.2015 | 1.0961 |
| No log | 1.0 | 60 | 1.1149 | 0.2995 | 1.1149 | 1.0559 |
| No log | 1.0333 | 62 | 1.0577 | 0.4408 | 1.0577 | 1.0284 |
| No log | 1.0667 | 64 | 1.0127 | 0.4627 | 1.0127 | 1.0063 |
| No log | 1.1 | 66 | 1.0160 | 0.4763 | 1.0160 | 1.0080 |
| No log | 1.1333 | 68 | 0.9951 | 0.4524 | 0.9951 | 0.9975 |
| No log | 1.1667 | 70 | 0.9869 | 0.3804 | 0.9869 | 0.9934 |
| No log | 1.2 | 72 | 1.0200 | 0.3618 | 1.0200 | 1.0099 |
| No log | 1.2333 | 74 | 1.0105 | 0.3525 | 1.0105 | 1.0052 |
| No log | 1.2667 | 76 | 0.9997 | 0.3642 | 0.9997 | 0.9999 |
| No log | 1.3 | 78 | 0.9704 | 0.3378 | 0.9704 | 0.9851 |
| No log | 1.3333 | 80 | 0.9867 | 0.4503 | 0.9867 | 0.9933 |
| No log | 1.3667 | 82 | 0.9417 | 0.4244 | 0.9417 | 0.9704 |
| No log | 1.4 | 84 | 0.9221 | 0.4122 | 0.9221 | 0.9602 |
| No log | 1.4333 | 86 | 0.9438 | 0.3925 | 0.9438 | 0.9715 |
| No log | 1.4667 | 88 | 0.9803 | 0.4628 | 0.9803 | 0.9901 |
| No log | 1.5 | 90 | 0.9662 | 0.4045 | 0.9662 | 0.9830 |
| No log | 1.5333 | 92 | 0.9604 | 0.4524 | 0.9604 | 0.9800 |
| No log | 1.5667 | 94 | 0.9789 | 0.3590 | 0.9789 | 0.9894 |
| No log | 1.6 | 96 | 0.8357 | 0.5218 | 0.8357 | 0.9142 |
| No log | 1.6333 | 98 | 0.8138 | 0.5231 | 0.8138 | 0.9021 |
| No log | 1.6667 | 100 | 0.7472 | 0.4676 | 0.7472 | 0.8644 |
| No log | 1.7 | 102 | 0.7386 | 0.5161 | 0.7386 | 0.8594 |
| No log | 1.7333 | 104 | 0.7525 | 0.6092 | 0.7525 | 0.8675 |
| No log | 1.7667 | 106 | 0.8542 | 0.5222 | 0.8542 | 0.9242 |
| No log | 1.8 | 108 | 0.9170 | 0.4802 | 0.9170 | 0.9576 |
| No log | 1.8333 | 110 | 0.9037 | 0.5027 | 0.9037 | 0.9507 |
| No log | 1.8667 | 112 | 0.8325 | 0.4562 | 0.8325 | 0.9124 |
| No log | 1.9 | 114 | 0.7533 | 0.3961 | 0.7533 | 0.8679 |
| No log | 1.9333 | 116 | 0.7816 | 0.4186 | 0.7816 | 0.8841 |
| No log | 1.9667 | 118 | 0.8153 | 0.3025 | 0.8153 | 0.9029 |
| No log | 2.0 | 120 | 0.9298 | 0.2672 | 0.9298 | 0.9643 |
| No log | 2.0333 | 122 | 0.9030 | 0.2978 | 0.9030 | 0.9503 |
| No log | 2.0667 | 124 | 0.8132 | 0.4381 | 0.8132 | 0.9018 |
| No log | 2.1 | 126 | 0.8362 | 0.4346 | 0.8362 | 0.9145 |
| No log | 2.1333 | 128 | 0.8234 | 0.4547 | 0.8234 | 0.9074 |
| No log | 2.1667 | 130 | 0.8154 | 0.5949 | 0.8154 | 0.9030 |
| No log | 2.2 | 132 | 0.8177 | 0.5898 | 0.8177 | 0.9043 |
| No log | 2.2333 | 134 | 0.8164 | 0.5958 | 0.8164 | 0.9036 |
| No log | 2.2667 | 136 | 0.8384 | 0.4949 | 0.8384 | 0.9157 |
| No log | 2.3 | 138 | 0.8629 | 0.4825 | 0.8629 | 0.9289 |
| No log | 2.3333 | 140 | 0.9048 | 0.4279 | 0.9048 | 0.9512 |
| No log | 2.3667 | 142 | 0.7847 | 0.5178 | 0.7847 | 0.8858 |
| No log | 2.4 | 144 | 0.7609 | 0.5759 | 0.7609 | 0.8723 |
| No log | 2.4333 | 146 | 0.7736 | 0.5089 | 0.7736 | 0.8796 |
| No log | 2.4667 | 148 | 0.7624 | 0.5024 | 0.7624 | 0.8732 |
| No log | 2.5 | 150 | 0.7849 | 0.5359 | 0.7849 | 0.8859 |
| No log | 2.5333 | 152 | 0.8208 | 0.4850 | 0.8208 | 0.9060 |
| No log | 2.5667 | 154 | 0.8547 | 0.4751 | 0.8547 | 0.9245 |
| No log | 2.6 | 156 | 0.9148 | 0.4536 | 0.9148 | 0.9565 |
| No log | 2.6333 | 158 | 0.9518 | 0.4002 | 0.9518 | 0.9756 |
| No log | 2.6667 | 160 | 0.8748 | 0.4305 | 0.8748 | 0.9353 |
| No log | 2.7 | 162 | 0.8733 | 0.5002 | 0.8733 | 0.9345 |
| No log | 2.7333 | 164 | 0.8763 | 0.4770 | 0.8763 | 0.9361 |
| No log | 2.7667 | 166 | 0.9401 | 0.3861 | 0.9401 | 0.9696 |
| No log | 2.8 | 168 | 1.0376 | 0.3546 | 1.0376 | 1.0186 |
| No log | 2.8333 | 170 | 0.9847 | 0.3511 | 0.9847 | 0.9923 |
| No log | 2.8667 | 172 | 0.8589 | 0.4244 | 0.8589 | 0.9268 |
| No log | 2.9 | 174 | 0.8472 | 0.3896 | 0.8472 | 0.9204 |
| No log | 2.9333 | 176 | 0.8381 | 0.3896 | 0.8381 | 0.9155 |
| No log | 2.9667 | 178 | 0.8513 | 0.3536 | 0.8513 | 0.9226 |
| No log | 3.0 | 180 | 0.8626 | 0.3583 | 0.8626 | 0.9288 |
| No log | 3.0333 | 182 | 0.8379 | 0.3631 | 0.8379 | 0.9154 |
| No log | 3.0667 | 184 | 0.8223 | 0.3877 | 0.8223 | 0.9068 |
| No log | 3.1 | 186 | 0.8575 | 0.4192 | 0.8575 | 0.9260 |
| No log | 3.1333 | 188 | 0.8519 | 0.4456 | 0.8519 | 0.9230 |
| No log | 3.1667 | 190 | 0.8071 | 0.5046 | 0.8071 | 0.8984 |
| No log | 3.2 | 192 | 0.7988 | 0.5331 | 0.7988 | 0.8938 |
| No log | 3.2333 | 194 | 0.7967 | 0.4869 | 0.7967 | 0.8926 |
| No log | 3.2667 | 196 | 0.7910 | 0.4428 | 0.7910 | 0.8894 |
| No log | 3.3 | 198 | 0.8053 | 0.4466 | 0.8053 | 0.8974 |
| No log | 3.3333 | 200 | 0.8726 | 0.4021 | 0.8726 | 0.9341 |
| No log | 3.3667 | 202 | 0.8671 | 0.4370 | 0.8671 | 0.9312 |
| No log | 3.4 | 204 | 0.8449 | 0.4069 | 0.8449 | 0.9192 |
| No log | 3.4333 | 206 | 0.8556 | 0.4069 | 0.8556 | 0.9250 |
| No log | 3.4667 | 208 | 0.8625 | 0.3873 | 0.8625 | 0.9287 |
| No log | 3.5 | 210 | 0.8911 | 0.4130 | 0.8911 | 0.9440 |
| No log | 3.5333 | 212 | 0.8736 | 0.4486 | 0.8736 | 0.9347 |
| No log | 3.5667 | 214 | 0.8828 | 0.4490 | 0.8828 | 0.9396 |
| No log | 3.6 | 216 | 0.9005 | 0.4733 | 0.9005 | 0.9489 |
| No log | 3.6333 | 218 | 0.9082 | 0.4734 | 0.9082 | 0.9530 |
| No log | 3.6667 | 220 | 0.9024 | 0.4838 | 0.9024 | 0.9499 |
| No log | 3.7 | 222 | 0.8794 | 0.4907 | 0.8794 | 0.9378 |
| No log | 3.7333 | 224 | 0.8729 | 0.4751 | 0.8729 | 0.9343 |
| No log | 3.7667 | 226 | 0.8451 | 0.4989 | 0.8451 | 0.9193 |
| No log | 3.8 | 228 | 0.8301 | 0.5023 | 0.8301 | 0.9111 |
| No log | 3.8333 | 230 | 0.8249 | 0.4871 | 0.8249 | 0.9082 |
| No log | 3.8667 | 232 | 0.8301 | 0.5340 | 0.8301 | 0.9111 |
| No log | 3.9 | 234 | 0.7927 | 0.5909 | 0.7927 | 0.8903 |
| No log | 3.9333 | 236 | 0.8088 | 0.5678 | 0.8088 | 0.8993 |
| No log | 3.9667 | 238 | 0.8651 | 0.4584 | 0.8651 | 0.9301 |
| No log | 4.0 | 240 | 0.7968 | 0.5298 | 0.7968 | 0.8926 |
| No log | 4.0333 | 242 | 0.7279 | 0.6597 | 0.7279 | 0.8532 |
| No log | 4.0667 | 244 | 0.7312 | 0.5810 | 0.7312 | 0.8551 |
| No log | 4.1 | 246 | 0.7008 | 0.5809 | 0.7008 | 0.8371 |
| No log | 4.1333 | 248 | 0.7665 | 0.4686 | 0.7665 | 0.8755 |
| No log | 4.1667 | 250 | 0.8644 | 0.4942 | 0.8644 | 0.9297 |
| No log | 4.2 | 252 | 0.8246 | 0.4926 | 0.8246 | 0.9081 |
| No log | 4.2333 | 254 | 0.7331 | 0.5179 | 0.7331 | 0.8562 |
| No log | 4.2667 | 256 | 0.7612 | 0.5477 | 0.7612 | 0.8725 |
| No log | 4.3 | 258 | 0.8132 | 0.5046 | 0.8132 | 0.9018 |
| No log | 4.3333 | 260 | 0.7828 | 0.5305 | 0.7828 | 0.8847 |
| No log | 4.3667 | 262 | 0.7450 | 0.4643 | 0.7450 | 0.8632 |
| No log | 4.4 | 264 | 0.7704 | 0.4510 | 0.7704 | 0.8777 |
| No log | 4.4333 | 266 | 0.8144 | 0.4686 | 0.8144 | 0.9025 |
| No log | 4.4667 | 268 | 0.7945 | 0.4563 | 0.7945 | 0.8913 |
| No log | 4.5 | 270 | 0.7498 | 0.5714 | 0.7498 | 0.8659 |
| No log | 4.5333 | 272 | 0.7884 | 0.5917 | 0.7884 | 0.8879 |
| No log | 4.5667 | 274 | 0.8044 | 0.6082 | 0.8044 | 0.8969 |
| No log | 4.6 | 276 | 0.7978 | 0.5558 | 0.7978 | 0.8932 |
| No log | 4.6333 | 278 | 0.7605 | 0.5436 | 0.7605 | 0.8721 |
| No log | 4.6667 | 280 | 0.7395 | 0.5587 | 0.7395 | 0.8599 |
| No log | 4.7 | 282 | 0.7372 | 0.5587 | 0.7372 | 0.8586 |
| No log | 4.7333 | 284 | 0.7532 | 0.5331 | 0.7532 | 0.8679 |
| No log | 4.7667 | 286 | 0.7799 | 0.5413 | 0.7799 | 0.8831 |
| No log | 4.8 | 288 | 0.7644 | 0.5552 | 0.7644 | 0.8743 |
| No log | 4.8333 | 290 | 0.7421 | 0.4511 | 0.7421 | 0.8615 |
| No log | 4.8667 | 292 | 0.7271 | 0.3569 | 0.7271 | 0.8527 |
| No log | 4.9 | 294 | 0.7279 | 0.3548 | 0.7279 | 0.8532 |
| No log | 4.9333 | 296 | 0.7324 | 0.5559 | 0.7324 | 0.8558 |
| No log | 4.9667 | 298 | 0.6967 | 0.5359 | 0.6967 | 0.8347 |
| No log | 5.0 | 300 | 0.6645 | 0.5287 | 0.6645 | 0.8152 |
| No log | 5.0333 | 302 | 0.6654 | 0.5536 | 0.6654 | 0.8157 |
| No log | 5.0667 | 304 | 0.6723 | 0.5314 | 0.6723 | 0.8199 |
| No log | 5.1 | 306 | 0.6685 | 0.5314 | 0.6685 | 0.8176 |
| No log | 5.1333 | 308 | 0.6710 | 0.5672 | 0.6710 | 0.8192 |
| No log | 5.1667 | 310 | 0.6696 | 0.6195 | 0.6696 | 0.8183 |
| No log | 5.2 | 312 | 0.6642 | 0.5680 | 0.6642 | 0.8150 |
| No log | 5.2333 | 314 | 0.6763 | 0.5785 | 0.6763 | 0.8224 |
| No log | 5.2667 | 316 | 0.7082 | 0.5674 | 0.7082 | 0.8416 |
| No log | 5.3 | 318 | 0.7030 | 0.5315 | 0.7030 | 0.8385 |
| No log | 5.3333 | 320 | 0.7039 | 0.4156 | 0.7039 | 0.8390 |
| No log | 5.3667 | 322 | 0.7149 | 0.4156 | 0.7149 | 0.8455 |
| No log | 5.4 | 324 | 0.7456 | 0.5328 | 0.7456 | 0.8635 |
| No log | 5.4333 | 326 | 0.7674 | 0.5005 | 0.7674 | 0.8760 |
| No log | 5.4667 | 328 | 0.7170 | 0.5093 | 0.7170 | 0.8468 |
| No log | 5.5 | 330 | 0.6801 | 0.5905 | 0.6801 | 0.8247 |
| No log | 5.5333 | 332 | 0.7038 | 0.5626 | 0.7038 | 0.8390 |
| No log | 5.5667 | 334 | 0.6904 | 0.5626 | 0.6904 | 0.8309 |
| No log | 5.6 | 336 | 0.6881 | 0.5577 | 0.6881 | 0.8295 |
| No log | 5.6333 | 338 | 0.6872 | 0.4776 | 0.6872 | 0.8290 |
| No log | 5.6667 | 340 | 0.6981 | 0.5054 | 0.6981 | 0.8355 |
| No log | 5.7 | 342 | 0.7004 | 0.5054 | 0.7004 | 0.8369 |
| No log | 5.7333 | 344 | 0.7283 | 0.5678 | 0.7283 | 0.8534 |
| No log | 5.7667 | 346 | 0.8332 | 0.5436 | 0.8332 | 0.9128 |
| No log | 5.8 | 348 | 0.9019 | 0.5317 | 0.9019 | 0.9497 |
| No log | 5.8333 | 350 | 0.8504 | 0.5543 | 0.8504 | 0.9222 |
| No log | 5.8667 | 352 | 0.7910 | 0.5366 | 0.7910 | 0.8894 |
| No log | 5.9 | 354 | 0.7200 | 0.6207 | 0.7200 | 0.8485 |
| No log | 5.9333 | 356 | 0.7218 | 0.5179 | 0.7218 | 0.8496 |
| No log | 5.9667 | 358 | 0.7238 | 0.5179 | 0.7238 | 0.8508 |
| No log | 6.0 | 360 | 0.7296 | 0.6288 | 0.7296 | 0.8542 |
| No log | 6.0333 | 362 | 0.7479 | 0.6110 | 0.7479 | 0.8648 |
| No log | 6.0667 | 364 | 0.7339 | 0.6022 | 0.7339 | 0.8567 |
| No log | 6.1 | 366 | 0.7301 | 0.6129 | 0.7301 | 0.8545 |
| No log | 6.1333 | 368 | 0.7229 | 0.6112 | 0.7229 | 0.8502 |
| No log | 6.1667 | 370 | 0.7183 | 0.6634 | 0.7183 | 0.8476 |
| No log | 6.2 | 372 | 0.7126 | 0.6229 | 0.7126 | 0.8442 |
| No log | 6.2333 | 374 | 0.7363 | 0.5153 | 0.7363 | 0.8581 |
| No log | 6.2667 | 376 | 0.7584 | 0.4295 | 0.7584 | 0.8709 |
| No log | 6.3 | 378 | 0.7692 | 0.4477 | 0.7692 | 0.8770 |
| No log | 6.3333 | 380 | 0.7427 | 0.4592 | 0.7427 | 0.8618 |
| No log | 6.3667 | 382 | 0.7113 | 0.5166 | 0.7113 | 0.8434 |
| No log | 6.4 | 384 | 0.7395 | 0.6247 | 0.7395 | 0.8599 |
| No log | 6.4333 | 386 | 0.8074 | 0.6260 | 0.8074 | 0.8986 |
| No log | 6.4667 | 388 | 0.7949 | 0.6198 | 0.7949 | 0.8916 |
| No log | 6.5 | 390 | 0.7828 | 0.6318 | 0.7828 | 0.8848 |
| No log | 6.5333 | 392 | 0.7452 | 0.6188 | 0.7452 | 0.8632 |
| No log | 6.5667 | 394 | 0.7295 | 0.5808 | 0.7295 | 0.8541 |
| No log | 6.6 | 396 | 0.7441 | 0.5375 | 0.7441 | 0.8626 |
| No log | 6.6333 | 398 | 0.7745 | 0.4987 | 0.7745 | 0.8801 |
| No log | 6.6667 | 400 | 0.8245 | 0.4025 | 0.8245 | 0.9080 |
| No log | 6.7 | 402 | 0.8315 | 0.4025 | 0.8315 | 0.9119 |
| No log | 6.7333 | 404 | 0.7928 | 0.4715 | 0.7928 | 0.8904 |
| No log | 6.7667 | 406 | 0.7590 | 0.4776 | 0.7590 | 0.8712 |
| No log | 6.8 | 408 | 0.7635 | 0.4594 | 0.7635 | 0.8738 |
| No log | 6.8333 | 410 | 0.7613 | 0.5179 | 0.7613 | 0.8725 |
| No log | 6.8667 | 412 | 0.7659 | 0.4838 | 0.7659 | 0.8752 |
| No log | 6.9 | 414 | 0.7871 | 0.5500 | 0.7871 | 0.8872 |
| No log | 6.9333 | 416 | 0.7997 | 0.5763 | 0.7997 | 0.8943 |
| No log | 6.9667 | 418 | 0.7762 | 0.4898 | 0.7762 | 0.8810 |
| No log | 7.0 | 420 | 0.7729 | 0.4691 | 0.7729 | 0.8792 |
| No log | 7.0333 | 422 | 0.7768 | 0.4941 | 0.7768 | 0.8813 |
| No log | 7.0667 | 424 | 0.7857 | 0.4719 | 0.7857 | 0.8864 |
| No log | 7.1 | 426 | 0.8169 | 0.5331 | 0.8169 | 0.9038 |
| No log | 7.1333 | 428 | 0.8712 | 0.4821 | 0.8712 | 0.9334 |
| No log | 7.1667 | 430 | 0.8486 | 0.4343 | 0.8486 | 0.9212 |
| No log | 7.2 | 432 | 0.8093 | 0.4220 | 0.8093 | 0.8996 |
| No log | 7.2333 | 434 | 0.7696 | 0.4878 | 0.7696 | 0.8773 |
| No log | 7.2667 | 436 | 0.7606 | 0.4722 | 0.7606 | 0.8721 |
| No log | 7.3 | 438 | 0.7535 | 0.4936 | 0.7535 | 0.8681 |
| No log | 7.3333 | 440 | 0.7846 | 0.5690 | 0.7846 | 0.8858 |
| No log | 7.3667 | 442 | 0.8928 | 0.5224 | 0.8928 | 0.9449 |
| No log | 7.4 | 444 | 0.9639 | 0.4641 | 0.9639 | 0.9818 |
| No log | 7.4333 | 446 | 0.9254 | 0.5023 | 0.9254 | 0.9620 |
| No log | 7.4667 | 448 | 0.7936 | 0.6487 | 0.7936 | 0.8909 |
| No log | 7.5 | 450 | 0.6927 | 0.6493 | 0.6927 | 0.8323 |
| No log | 7.5333 | 452 | 0.6944 | 0.5434 | 0.6944 | 0.8333 |
| No log | 7.5667 | 454 | 0.7139 | 0.5353 | 0.7139 | 0.8449 |
| No log | 7.6 | 456 | 0.7001 | 0.5131 | 0.7001 | 0.8367 |
| No log | 7.6333 | 458 | 0.6917 | 0.5635 | 0.6917 | 0.8317 |
| No log | 7.6667 | 460 | 0.7447 | 0.5131 | 0.7447 | 0.8629 |
| No log | 7.7 | 462 | 0.7970 | 0.5766 | 0.7970 | 0.8928 |
| No log | 7.7333 | 464 | 0.7959 | 0.6151 | 0.7959 | 0.8922 |
| No log | 7.7667 | 466 | 0.7106 | 0.6142 | 0.7106 | 0.8430 |
| No log | 7.8 | 468 | 0.6778 | 0.5981 | 0.6778 | 0.8233 |
| No log | 7.8333 | 470 | 0.6696 | 0.5835 | 0.6696 | 0.8183 |
| No log | 7.8667 | 472 | 0.6387 | 0.6476 | 0.6387 | 0.7992 |
| No log | 7.9 | 474 | 0.6564 | 0.5963 | 0.6564 | 0.8102 |
| No log | 7.9333 | 476 | 0.6833 | 0.5787 | 0.6833 | 0.8266 |
| No log | 7.9667 | 478 | 0.6721 | 0.5248 | 0.6721 | 0.8198 |
| No log | 8.0 | 480 | 0.6802 | 0.5706 | 0.6802 | 0.8248 |
| No log | 8.0333 | 482 | 0.7351 | 0.5141 | 0.7351 | 0.8574 |
| No log | 8.0667 | 484 | 0.8316 | 0.4481 | 0.8316 | 0.9119 |
| No log | 8.1 | 486 | 0.8060 | 0.5270 | 0.8060 | 0.8978 |
| No log | 8.1333 | 488 | 0.7043 | 0.5493 | 0.7043 | 0.8392 |
| No log | 8.1667 | 490 | 0.6670 | 0.5530 | 0.6670 | 0.8167 |
| No log | 8.2 | 492 | 0.6909 | 0.5877 | 0.6909 | 0.8312 |
| No log | 8.2333 | 494 | 0.7202 | 0.6554 | 0.7202 | 0.8486 |
| No log | 8.2667 | 496 | 0.7161 | 0.6293 | 0.7161 | 0.8462 |
| No log | 8.3 | 498 | 0.7869 | 0.6326 | 0.7869 | 0.8871 |
| 0.2637 | 8.3333 | 500 | 0.9139 | 0.5013 | 0.9139 | 0.9560 |
| 0.2637 | 8.3667 | 502 | 0.9650 | 0.3881 | 0.9650 | 0.9823 |
| 0.2637 | 8.4 | 504 | 0.9133 | 0.4208 | 0.9133 | 0.9557 |
| 0.2637 | 8.4333 | 506 | 0.8102 | 0.4044 | 0.8102 | 0.9001 |
| 0.2637 | 8.4667 | 508 | 0.7391 | 0.5260 | 0.7391 | 0.8597 |
| 0.2637 | 8.5 | 510 | 0.7211 | 0.5260 | 0.7211 | 0.8491 |
| 0.2637 | 8.5333 | 512 | 0.7511 | 0.5637 | 0.7511 | 0.8667 |
| 0.2637 | 8.5667 | 514 | 0.8472 | 0.5044 | 0.8472 | 0.9204 |
| 0.2637 | 8.6 | 516 | 0.9148 | 0.5119 | 0.9148 | 0.9565 |
| 0.2637 | 8.6333 | 518 | 0.9354 | 0.5219 | 0.9354 | 0.9672 |
| 0.2637 | 8.6667 | 520 | 0.9819 | 0.4681 | 0.9819 | 0.9909 |
| 0.2637 | 8.7 | 522 | 0.9180 | 0.4815 | 0.9180 | 0.9581 |
| 0.2637 | 8.7333 | 524 | 0.8448 | 0.4216 | 0.8448 | 0.9192 |
| 0.2637 | 8.7667 | 526 | 0.8285 | 0.3941 | 0.8285 | 0.9102 |
| 0.2637 | 8.8 | 528 | 0.8534 | 0.3360 | 0.8534 | 0.9238 |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1