ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.8781
  • Qwk (quadratic weighted kappa): 0.3125
  • Mse (mean squared error): 0.8781
  • Rmse (root mean squared error): 0.9371
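The metrics above are related: Rmse is the square root of Mse (√0.8781 ≈ 0.9371), and Qwk is Cohen's kappa with quadratic weights computed on the ordinal scores. A minimal from-scratch sketch of these metrics, for illustration only (this is not the evaluation code used for this model):

```python
from collections import Counter

def mse(y_true, y_pred):
    """Mean squared error between two equal-length score lists."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    n = len(y_true)
    # Observed agreement matrix
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Expected matrix from the marginal label distributions, scaled to n
    ht, hp = Counter(y_true), Counter(y_pred)
    exp = [[ht[i] * hp[j] / n for j in range(n_classes)]
           for i in range(n_classes)]
    # Quadratic weights: penalty grows with the squared distance i - j
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * obs[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * exp[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

y_true, y_pred = [0, 1, 2, 2], [0, 1, 1, 2]
m = mse(y_true, y_pred)                                     # 0.25
r = m ** 0.5                                                # 0.5
k = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)   # 0.8
```

Note that Qwk rewards getting *close* on the ordinal scale: the single off-by-one error above costs far less kappa than an off-by-two error would.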

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
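With `lr_scheduler_type: linear` and no warmup steps configured, the learning rate decays linearly from 2e-05 to 0 over the total number of optimizer steps (the results table below implies 11 steps per epoch). A minimal sketch of that schedule, assuming zero warmup (the function name here is illustrative, not a Transformers API):

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linearly warm up to base_lr, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 11 * 100  # ~11 optimizer steps per epoch for 100 configured epochs
lrs = [linear_schedule_lr(s, total) for s in range(total + 1)]
# lrs[0] == 2e-05, lrs[550] == 1e-05, lrs[1100] == 0.0
```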

Training results

Training loss appears as "No log" until the first logging step (step 500). Note that although num_epochs was set to 100, the log ends at epoch 48 (step 528).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1818 2 2.6997 -0.0262 2.6997 1.6431
No log 0.3636 4 1.3520 0.0495 1.3520 1.1628
No log 0.5455 6 1.0319 -0.0970 1.0319 1.0158
No log 0.7273 8 1.1967 -0.1307 1.1967 1.0940
No log 0.9091 10 1.1516 -0.0735 1.1516 1.0731
No log 1.0909 12 1.3282 0.1312 1.3282 1.1525
No log 1.2727 14 1.3216 0.1371 1.3216 1.1496
No log 1.4545 16 1.0568 0.1935 1.0568 1.0280
No log 1.6364 18 0.8701 0.3060 0.8701 0.9328
No log 1.8182 20 0.7983 0.1582 0.7983 0.8935
No log 2.0 22 0.7756 0.1548 0.7756 0.8807
No log 2.1818 24 0.7474 0.2182 0.7474 0.8645
No log 2.3636 26 0.7631 0.1598 0.7631 0.8736
No log 2.5455 28 0.7545 0.1953 0.7545 0.8686
No log 2.7273 30 0.7840 0.2817 0.7840 0.8855
No log 2.9091 32 0.8941 0.2982 0.8941 0.9456
No log 3.0909 34 0.8499 0.3343 0.8499 0.9219
No log 3.2727 36 0.8794 0.3141 0.8794 0.9378
No log 3.4545 38 0.9229 0.3039 0.9229 0.9607
No log 3.6364 40 0.8965 0.3567 0.8965 0.9468
No log 3.8182 42 0.8393 0.4290 0.8393 0.9161
No log 4.0 44 0.8515 0.3285 0.8515 0.9227
No log 4.1818 46 0.9546 0.2363 0.9546 0.9770
No log 4.3636 48 0.8953 0.3294 0.8953 0.9462
No log 4.5455 50 0.7970 0.3155 0.7970 0.8927
No log 4.7273 52 0.7470 0.1720 0.7470 0.8643
No log 4.9091 54 0.8536 0.2233 0.8536 0.9239
No log 5.0909 56 0.8626 0.2559 0.8626 0.9288
No log 5.2727 58 0.7661 0.1546 0.7661 0.8752
No log 5.4545 60 0.7046 0.2447 0.7046 0.8394
No log 5.6364 62 0.7328 0.2285 0.7328 0.8560
No log 5.8182 64 0.7825 0.3794 0.7825 0.8846
No log 6.0 66 0.7524 0.3677 0.7524 0.8674
No log 6.1818 68 0.6764 0.4086 0.6764 0.8225
No log 6.3636 70 0.6763 0.3450 0.6763 0.8224
No log 6.5455 72 0.6803 0.4116 0.6803 0.8248
No log 6.7273 74 0.7579 0.4157 0.7579 0.8705
No log 6.9091 76 0.7794 0.4007 0.7794 0.8828
No log 7.0909 78 0.8299 0.3231 0.8299 0.9110
No log 7.2727 80 0.8941 0.2804 0.8941 0.9456
No log 7.4545 82 1.0201 0.2526 1.0201 1.0100
No log 7.6364 84 1.0891 0.1922 1.0891 1.0436
No log 7.8182 86 1.1379 0.2358 1.1379 1.0667
No log 8.0 88 1.0337 0.3099 1.0337 1.0167
No log 8.1818 90 0.9252 0.2952 0.9252 0.9619
No log 8.3636 92 0.8341 0.4778 0.8341 0.9133
No log 8.5455 94 0.8978 0.3631 0.8978 0.9475
No log 8.7273 96 0.7806 0.4783 0.7806 0.8835
No log 8.9091 98 0.9262 0.2938 0.9262 0.9624
No log 9.0909 100 1.1784 0.2366 1.1784 1.0855
No log 9.2727 102 1.2189 0.2366 1.2189 1.1041
No log 9.4545 104 1.0006 0.1872 1.0006 1.0003
No log 9.6364 106 0.8888 0.2557 0.8888 0.9428
No log 9.8182 108 0.8773 0.3526 0.8773 0.9366
No log 10.0 110 0.8973 0.3037 0.8973 0.9473
No log 10.1818 112 0.9208 0.2797 0.9208 0.9596
No log 10.3636 114 0.9195 0.3363 0.9195 0.9589
No log 10.5455 116 0.9835 0.3783 0.9835 0.9917
No log 10.7273 118 1.0350 0.2950 1.0350 1.0173
No log 10.9091 120 1.1033 0.2935 1.1033 1.0504
No log 11.0909 122 1.3206 0.2990 1.3206 1.1492
No log 11.2727 124 1.4766 0.1932 1.4766 1.2152
No log 11.4545 126 1.3460 0.2734 1.3460 1.1602
No log 11.6364 128 1.0634 0.2971 1.0634 1.0312
No log 11.8182 130 0.9190 0.3492 0.9190 0.9586
No log 12.0 132 0.9225 0.3824 0.9225 0.9605
No log 12.1818 134 0.9748 0.2864 0.9748 0.9873
No log 12.3636 136 1.0345 0.2971 1.0345 1.0171
No log 12.5455 138 1.0308 0.2109 1.0308 1.0153
No log 12.7273 140 0.9299 0.3618 0.9299 0.9643
No log 12.9091 142 0.8396 0.3723 0.8396 0.9163
No log 13.0909 144 0.8380 0.3450 0.8380 0.9154
No log 13.2727 146 0.9955 0.2416 0.9955 0.9977
No log 13.4545 148 1.0523 0.3253 1.0523 1.0258
No log 13.6364 150 0.9215 0.3410 0.9215 0.9600
No log 13.8182 152 0.8384 0.4063 0.8384 0.9156
No log 14.0 154 0.8435 0.3652 0.8435 0.9184
No log 14.1818 156 0.9042 0.3133 0.9042 0.9509
No log 14.3636 158 0.9021 0.3297 0.9021 0.9498
No log 14.5455 160 0.9374 0.2905 0.9374 0.9682
No log 14.7273 162 0.8828 0.2853 0.8828 0.9396
No log 14.9091 164 0.8728 0.3074 0.8728 0.9342
No log 15.0909 166 0.8724 0.3600 0.8724 0.9340
No log 15.2727 168 0.8594 0.3103 0.8594 0.9271
No log 15.4545 170 0.9284 0.2960 0.9284 0.9635
No log 15.6364 172 0.9774 0.3508 0.9774 0.9886
No log 15.8182 174 0.8815 0.3206 0.8815 0.9389
No log 16.0 176 0.8766 0.3257 0.8766 0.9363
No log 16.1818 178 0.8571 0.3111 0.8571 0.9258
No log 16.3636 180 0.8711 0.3195 0.8711 0.9333
No log 16.5455 182 0.9824 0.2965 0.9824 0.9911
No log 16.7273 184 1.2457 0.1984 1.2457 1.1161
No log 16.9091 186 1.3462 0.1885 1.3462 1.1602
No log 17.0909 188 1.1678 0.2421 1.1678 1.0806
No log 17.2727 190 0.9017 0.3719 0.9017 0.9496
No log 17.4545 192 0.8041 0.3571 0.8041 0.8967
No log 17.6364 194 0.7762 0.4222 0.7762 0.8810
No log 17.8182 196 0.7923 0.3775 0.7923 0.8901
No log 18.0 198 0.9640 0.3204 0.9640 0.9819
No log 18.1818 200 1.1619 0.2930 1.1619 1.0779
No log 18.3636 202 1.1879 0.3394 1.1879 1.0899
No log 18.5455 204 1.0204 0.3065 1.0204 1.0102
No log 18.7273 206 0.8550 0.3590 0.8550 0.9247
No log 18.9091 208 0.8233 0.3961 0.8233 0.9074
No log 19.0909 210 0.7865 0.4507 0.7865 0.8868
No log 19.2727 212 0.7805 0.3991 0.7805 0.8834
No log 19.4545 214 0.8842 0.3342 0.8842 0.9403
No log 19.6364 216 1.0303 0.2901 1.0303 1.0150
No log 19.8182 218 1.0591 0.3542 1.0591 1.0291
No log 20.0 220 0.9201 0.3289 0.9201 0.9592
No log 20.1818 222 0.8614 0.3218 0.8614 0.9281
No log 20.3636 224 0.8293 0.3255 0.8293 0.9107
No log 20.5455 226 0.8365 0.3798 0.8365 0.9146
No log 20.7273 228 0.9299 0.2602 0.9299 0.9643
No log 20.9091 230 1.0229 0.2209 1.0229 1.0114
No log 21.0909 232 1.0416 0.2461 1.0416 1.0206
No log 21.2727 234 1.0107 0.2396 1.0107 1.0054
No log 21.4545 236 1.0809 0.2547 1.0809 1.0397
No log 21.6364 238 1.1082 0.3126 1.1082 1.0527
No log 21.8182 240 1.0090 0.3274 1.0090 1.0045
No log 22.0 242 0.8048 0.3775 0.8048 0.8971
No log 22.1818 244 0.7487 0.3715 0.7487 0.8653
No log 22.3636 246 0.7461 0.3669 0.7461 0.8638
No log 22.5455 248 0.7548 0.4699 0.7548 0.8688
No log 22.7273 250 0.8844 0.3790 0.8844 0.9405
No log 22.9091 252 0.9397 0.4260 0.9397 0.9694
No log 23.0909 254 0.8825 0.4074 0.8825 0.9394
No log 23.2727 256 0.8164 0.4012 0.8164 0.9035
No log 23.4545 258 0.7827 0.3976 0.7827 0.8847
No log 23.6364 260 0.8022 0.4012 0.8022 0.8956
No log 23.8182 262 0.9333 0.3615 0.9333 0.9661
No log 24.0 264 1.0322 0.3902 1.0322 1.0160
No log 24.1818 266 0.9966 0.3809 0.9966 0.9983
No log 24.3636 268 0.9876 0.3615 0.9876 0.9938
No log 24.5455 270 0.9299 0.3355 0.9299 0.9643
No log 24.7273 272 0.9906 0.3689 0.9906 0.9953
No log 24.9091 274 1.1271 0.2018 1.1271 1.0617
No log 25.0909 276 1.0844 0.2138 1.0844 1.0414
No log 25.2727 278 1.0177 0.2421 1.0177 1.0088
No log 25.4545 280 0.8680 0.3461 0.8680 0.9317
No log 25.6364 282 0.8613 0.3659 0.8613 0.9280
No log 25.8182 284 0.8345 0.3679 0.8345 0.9135
No log 26.0 286 0.8150 0.3721 0.8150 0.9028
No log 26.1818 288 0.8215 0.3919 0.8215 0.9064
No log 26.3636 290 0.8299 0.3779 0.8299 0.9110
No log 26.5455 292 0.9169 0.3363 0.9169 0.9575
No log 26.7273 294 1.0933 0.2857 1.0933 1.0456
No log 26.9091 296 1.0941 0.2930 1.0941 1.0460
No log 27.0909 298 1.0073 0.3052 1.0073 1.0037
No log 27.2727 300 0.8837 0.3618 0.8837 0.9400
No log 27.4545 302 0.8179 0.4072 0.8179 0.9044
No log 27.6364 304 0.8152 0.4124 0.8152 0.9029
No log 27.8182 306 0.7677 0.4212 0.7677 0.8762
No log 28.0 308 0.7657 0.3842 0.7657 0.8750
No log 28.1818 310 0.7792 0.4070 0.7792 0.8827
No log 28.3636 312 0.8068 0.3309 0.8068 0.8982
No log 28.5455 314 0.8041 0.3661 0.8041 0.8967
No log 28.7273 316 0.7272 0.3775 0.7272 0.8527
No log 28.9091 318 0.6796 0.4600 0.6796 0.8244
No log 29.0909 320 0.6740 0.4158 0.6740 0.8209
No log 29.2727 322 0.6631 0.4475 0.6631 0.8143
No log 29.4545 324 0.7538 0.4531 0.7538 0.8682
No log 29.6364 326 1.0299 0.3759 1.0299 1.0148
No log 29.8182 328 1.1833 0.2534 1.1833 1.0878
No log 30.0 330 1.1024 0.3033 1.1024 1.0500
No log 30.1818 332 0.9285 0.3786 0.9285 0.9636
No log 30.3636 334 0.8701 0.3803 0.8701 0.9328
No log 30.5455 336 0.8309 0.3803 0.8309 0.9115
No log 30.7273 338 0.7908 0.3710 0.7908 0.8892
No log 30.9091 340 0.7460 0.4239 0.7460 0.8637
No log 31.0909 342 0.7840 0.3732 0.7840 0.8855
No log 31.2727 344 0.8506 0.3483 0.8506 0.9223
No log 31.4545 346 0.8786 0.3747 0.8786 0.9373
No log 31.6364 348 0.9544 0.3887 0.9544 0.9769
No log 31.8182 350 0.9549 0.3577 0.9549 0.9772
No log 32.0 352 0.8753 0.3740 0.8753 0.9356
No log 32.1818 354 0.8104 0.4144 0.8104 0.9002
No log 32.3636 356 0.8068 0.4144 0.8068 0.8982
No log 32.5455 358 0.8079 0.4093 0.8079 0.8988
No log 32.7273 360 0.8291 0.3410 0.8291 0.9106
No log 32.9091 362 0.9054 0.4337 0.9054 0.9515
No log 33.0909 364 0.9094 0.4337 0.9094 0.9536
No log 33.2727 366 0.9449 0.3663 0.9449 0.9721
No log 33.4545 368 0.9896 0.3593 0.9896 0.9948
No log 33.6364 370 0.9754 0.3680 0.9754 0.9876
No log 33.8182 372 0.9203 0.3074 0.9203 0.9593
No log 34.0 374 0.9173 0.3067 0.9173 0.9578
No log 34.1818 376 0.8949 0.2952 0.8949 0.9460
No log 34.3636 378 0.8681 0.3251 0.8681 0.9317
No log 34.5455 380 0.8164 0.3307 0.8164 0.9036
No log 34.7273 382 0.8190 0.3468 0.8190 0.9050
No log 34.9091 384 0.8370 0.3719 0.8370 0.9149
No log 35.0909 386 0.7730 0.4124 0.7730 0.8792
No log 35.2727 388 0.6783 0.4413 0.6783 0.8236
No log 35.4545 390 0.6502 0.4575 0.6502 0.8063
No log 35.6364 392 0.6443 0.5326 0.6443 0.8027
No log 35.8182 394 0.6662 0.4473 0.6662 0.8162
No log 36.0 396 0.7135 0.4451 0.7135 0.8447
No log 36.1818 398 0.7679 0.3889 0.7679 0.8763
No log 36.3636 400 0.7613 0.4476 0.7613 0.8725
No log 36.5455 402 0.6961 0.4704 0.6961 0.8343
No log 36.7273 404 0.6879 0.4642 0.6879 0.8294
No log 36.9091 406 0.7066 0.4484 0.7066 0.8406
No log 37.0909 408 0.7582 0.3885 0.7582 0.8707
No log 37.2727 410 0.8165 0.3700 0.8165 0.9036
No log 37.4545 412 0.8930 0.4116 0.8930 0.9450
No log 37.6364 414 0.8887 0.4070 0.8887 0.9427
No log 37.8182 416 0.8042 0.4104 0.8042 0.8968
No log 38.0 418 0.7176 0.4260 0.7176 0.8471
No log 38.1818 420 0.7136 0.4260 0.7136 0.8447
No log 38.3636 422 0.7616 0.3820 0.7616 0.8727
No log 38.5455 424 0.8597 0.3446 0.8597 0.9272
No log 38.7273 426 1.0195 0.3716 1.0195 1.0097
No log 38.9091 428 1.0446 0.3228 1.0446 1.0221
No log 39.0909 430 0.9631 0.4081 0.9631 0.9814
No log 39.2727 432 0.8570 0.4152 0.8570 0.9257
No log 39.4545 434 0.7492 0.4144 0.7492 0.8656
No log 39.6364 436 0.6999 0.4601 0.6999 0.8366
No log 39.8182 438 0.7164 0.4155 0.7164 0.8464
No log 40.0 440 0.7859 0.4125 0.7859 0.8865
No log 40.1818 442 0.9239 0.3393 0.9239 0.9612
No log 40.3636 444 1.0221 0.2941 1.0221 1.0110
No log 40.5455 446 1.0153 0.2941 1.0153 1.0076
No log 40.7273 448 0.9174 0.3528 0.9174 0.9578
No log 40.9091 450 0.7740 0.3968 0.7740 0.8798
No log 41.0909 452 0.7002 0.4341 0.7002 0.8368
No log 41.2727 454 0.6783 0.5195 0.6783 0.8236
No log 41.4545 456 0.6765 0.4866 0.6765 0.8225
No log 41.6364 458 0.7068 0.4085 0.7068 0.8407
No log 41.8182 460 0.8041 0.4045 0.8041 0.8967
No log 42.0 462 0.8869 0.3527 0.8869 0.9418
No log 42.1818 464 0.8613 0.3827 0.8613 0.9280
No log 42.3636 466 0.8030 0.4092 0.8030 0.8961
No log 42.5455 468 0.7328 0.4065 0.7328 0.8560
No log 42.7273 470 0.6921 0.4700 0.6921 0.8319
No log 42.9091 472 0.6860 0.4866 0.6860 0.8282
No log 43.0909 474 0.7121 0.4562 0.7121 0.8438
No log 43.2727 476 0.7548 0.3690 0.7548 0.8688
No log 43.4545 478 0.8591 0.3321 0.8591 0.9269
No log 43.6364 480 0.9642 0.2901 0.9642 0.9819
No log 43.8182 482 0.9618 0.2901 0.9618 0.9807
No log 44.0 484 0.8853 0.3690 0.8853 0.9409
No log 44.1818 486 0.7976 0.3885 0.7976 0.8931
No log 44.3636 488 0.7690 0.3669 0.7690 0.8769
No log 44.5455 490 0.7423 0.4017 0.7423 0.8616
No log 44.7273 492 0.7267 0.4017 0.7267 0.8525
No log 44.9091 494 0.7187 0.3971 0.7187 0.8478
No log 45.0909 496 0.7409 0.4017 0.7409 0.8608
No log 45.2727 498 0.7596 0.3780 0.7596 0.8715
0.2887 45.4545 500 0.7821 0.4063 0.7821 0.8844
0.2887 45.6364 502 0.7754 0.4063 0.7754 0.8806
0.2887 45.8182 504 0.7359 0.4232 0.7359 0.8578
0.2887 46.0 506 0.6979 0.4155 0.6979 0.8354
0.2887 46.1818 508 0.6907 0.4467 0.6907 0.8311
0.2887 46.3636 510 0.7077 0.4085 0.7077 0.8412
0.2887 46.5455 512 0.7333 0.4302 0.7333 0.8563
0.2887 46.7273 514 0.7640 0.4354 0.7640 0.8741
0.2887 46.9091 516 0.7906 0.4423 0.7906 0.8891
0.2887 47.0909 518 0.8231 0.4284 0.8231 0.9072
0.2887 47.2727 520 0.8315 0.3798 0.8315 0.9119
0.2887 47.4545 522 0.8380 0.3440 0.8380 0.9155
0.2887 47.6364 524 0.8523 0.3440 0.8523 0.9232
0.2887 47.8182 526 0.8373 0.3798 0.8373 0.9150
0.2887 48.0 528 0.8781 0.3125 0.8781 0.9371

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32
