ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k8_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. The training dataset is not recorded in this card. It achieves the following results on the evaluation set:

  • Loss: 1.0863
  • Qwk: 0.2141
  • Mse: 1.0863
  • Rmse: 1.0422
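Qwk here is presumably Cohen's quadratically weighted kappa, a standard agreement metric for ordinal essay scores. The sketch below is illustrative only (it is not the evaluation code used for this run, and the label values are made up), but it shows how the three reported metrics relate:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (QWK) for integer labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in O]
    hist_pred = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * O[i][j]                        # observed weighted disagreement
            den += w * hist_true[i] * hist_pred[j] / n  # expected under chance
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Note that the reported Rmse is simply the square root of the reported Mse (√1.0863 ≈ 1.0422).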

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
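The hyperparameters above map onto a Hugging Face `TrainingArguments` configuration along these lines. This is a reconstructed sketch, not the actual training script; `output_dir` and the surrounding `Trainer` wiring are assumptions:

```python
from transformers import TrainingArguments

# Configuration fragment implied by the listed hyperparameters.
# output_dir is a hypothetical path.
training_args = TrainingArguments(
    output_dir="arabert_task7_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```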

Training results

The Trainer logs training loss every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0476 2 2.7689 -0.0651 2.7689 1.6640
No log 0.0952 4 1.6709 -0.0188 1.6709 1.2926
No log 0.1429 6 1.3616 -0.2508 1.3616 1.1669
No log 0.1905 8 1.0781 0.0442 1.0781 1.0383
No log 0.2381 10 1.1667 -0.0165 1.1667 1.0801
No log 0.2857 12 1.2123 -0.0550 1.2123 1.1011
No log 0.3333 14 0.9859 0.0860 0.9859 0.9929
No log 0.3810 16 0.7936 0.1094 0.7936 0.8908
No log 0.4286 18 0.7350 0.0359 0.7350 0.8573
No log 0.4762 20 0.6921 0.0393 0.6921 0.8319
No log 0.5238 22 0.7195 0.1236 0.7195 0.8482
No log 0.5714 24 0.7622 0.0 0.7622 0.8731
No log 0.6190 26 0.7776 0.0481 0.7776 0.8818
No log 0.6667 28 0.7883 0.2879 0.7883 0.8879
No log 0.7143 30 0.8552 0.3192 0.8552 0.9248
No log 0.7619 32 0.8454 0.2908 0.8454 0.9195
No log 0.8095 34 0.7494 0.2841 0.7494 0.8657
No log 0.8571 36 0.7131 0.1365 0.7131 0.8444
No log 0.9048 38 0.7140 0.1365 0.7140 0.8450
No log 0.9524 40 0.6764 0.0893 0.6764 0.8224
No log 1.0 42 0.6687 0.2345 0.6687 0.8177
No log 1.0476 44 0.6811 0.2243 0.6811 0.8253
No log 1.0952 46 0.7468 0.2244 0.7468 0.8642
No log 1.1429 48 0.7549 0.2843 0.7549 0.8688
No log 1.1905 50 0.7087 0.2063 0.7087 0.8418
No log 1.2381 52 0.7067 0.2063 0.7067 0.8406
No log 1.2857 54 0.7333 0.1770 0.7333 0.8563
No log 1.3333 56 0.7131 0.1372 0.7131 0.8445
No log 1.3810 58 0.6990 0.1407 0.6990 0.8361
No log 1.4286 60 0.7793 0.2574 0.7793 0.8828
No log 1.4762 62 0.7758 0.2574 0.7758 0.8808
No log 1.5238 64 0.6650 0.2751 0.6650 0.8155
No log 1.5714 66 0.6585 0.2158 0.6585 0.8115
No log 1.6190 68 0.6625 0.1790 0.6625 0.8139
No log 1.6667 70 0.6702 0.2243 0.6702 0.8187
No log 1.7143 72 0.6888 0.1863 0.6888 0.8299
No log 1.7619 74 0.7046 0.2405 0.7046 0.8394
No log 1.8095 76 0.7083 0.2606 0.7083 0.8416
No log 1.8571 78 0.6988 0.2606 0.6988 0.8360
No log 1.9048 80 0.6938 0.2857 0.6938 0.8329
No log 1.9524 82 0.6987 0.2718 0.6987 0.8359
No log 2.0 84 0.7189 0.2227 0.7189 0.8479
No log 2.0476 86 0.7019 0.2540 0.7019 0.8378
No log 2.0952 88 0.7505 0.1823 0.7505 0.8663
No log 2.1429 90 0.8027 0.1842 0.8027 0.8959
No log 2.1905 92 0.7715 0.1528 0.7715 0.8784
No log 2.2381 94 0.7536 0.1471 0.7536 0.8681
No log 2.2857 96 0.7560 0.1519 0.7560 0.8695
No log 2.3333 98 0.7467 0.2506 0.7467 0.8641
No log 2.3810 100 0.8359 0.2204 0.8359 0.9143
No log 2.4286 102 0.9024 0.1416 0.9024 0.9499
No log 2.4762 104 0.9692 0.0566 0.9692 0.9845
No log 2.5238 106 0.9365 0.1304 0.9365 0.9677
No log 2.5714 108 0.7548 0.2995 0.7548 0.8688
No log 2.6190 110 0.7022 0.2981 0.7022 0.8380
No log 2.6667 112 0.7325 0.2227 0.7325 0.8558
No log 2.7143 114 0.7447 0.2621 0.7447 0.8630
No log 2.7619 116 0.7172 0.3324 0.7172 0.8469
No log 2.8095 118 0.6901 0.3781 0.6901 0.8307
No log 2.8571 120 0.6765 0.3988 0.6765 0.8225
No log 2.9048 122 0.6878 0.3060 0.6878 0.8293
No log 2.9524 124 0.7060 0.3599 0.7060 0.8402
No log 3.0 126 0.7361 0.3865 0.7361 0.8580
No log 3.0476 128 0.7232 0.3211 0.7232 0.8504
No log 3.0952 130 0.7185 0.2806 0.7185 0.8476
No log 3.1429 132 0.7141 0.2806 0.7141 0.8450
No log 3.1905 134 0.7191 0.3618 0.7191 0.8480
No log 3.2381 136 0.7034 0.3198 0.7034 0.8387
No log 3.2857 138 0.7089 0.3122 0.7089 0.8420
No log 3.3333 140 0.7141 0.3186 0.7141 0.8450
No log 3.3810 142 0.7691 0.3866 0.7691 0.8770
No log 3.4286 144 0.7676 0.4260 0.7676 0.8762
No log 3.4762 146 0.7522 0.2862 0.7522 0.8673
No log 3.5238 148 0.8201 0.2074 0.8201 0.9056
No log 3.5714 150 0.8080 0.2074 0.8080 0.8989
No log 3.6190 152 0.7240 0.2965 0.7240 0.8509
No log 3.6667 154 0.7747 0.3498 0.7747 0.8802
No log 3.7143 156 0.7974 0.4007 0.7974 0.8930
No log 3.7619 158 0.7594 0.3931 0.7594 0.8715
No log 3.8095 160 0.7519 0.3326 0.7519 0.8671
No log 3.8571 162 0.7590 0.3605 0.7590 0.8712
No log 3.9048 164 0.7615 0.3950 0.7615 0.8726
No log 3.9524 166 0.7563 0.4037 0.7563 0.8696
No log 4.0 168 0.8427 0.3720 0.8427 0.9180
No log 4.0476 170 1.0637 0.3319 1.0637 1.0313
No log 4.0952 172 1.0981 0.2928 1.0981 1.0479
No log 4.1429 174 0.9223 0.3929 0.9223 0.9604
No log 4.1905 176 0.7059 0.3816 0.7059 0.8402
No log 4.2381 178 0.7150 0.3450 0.7150 0.8455
No log 4.2857 180 0.7942 0.3612 0.7942 0.8912
No log 4.3333 182 0.8219 0.2457 0.8219 0.9066
No log 4.3810 184 0.7705 0.3546 0.7705 0.8778
No log 4.4286 186 0.7688 0.4179 0.7688 0.8768
No log 4.4762 188 0.7687 0.4276 0.7687 0.8767
No log 4.5238 190 0.7662 0.4179 0.7662 0.8753
No log 4.5714 192 0.7630 0.4287 0.7630 0.8735
No log 4.6190 194 0.7742 0.3530 0.7742 0.8799
No log 4.6667 196 0.8387 0.3866 0.8387 0.9158
No log 4.7143 198 0.9760 0.2993 0.9760 0.9879
No log 4.7619 200 0.9208 0.3128 0.9208 0.9596
No log 4.8095 202 0.7494 0.3966 0.7494 0.8657
No log 4.8571 204 0.7097 0.3556 0.7097 0.8424
No log 4.9048 206 0.7139 0.3425 0.7139 0.8449
No log 4.9524 208 0.7167 0.3186 0.7167 0.8466
No log 5.0 210 0.7447 0.4073 0.7447 0.8630
No log 5.0476 212 0.7944 0.3931 0.7944 0.8913
No log 5.0952 214 0.7848 0.3816 0.7848 0.8859
No log 5.1429 216 0.7660 0.3465 0.7660 0.8752
No log 5.1905 218 0.7795 0.3816 0.7795 0.8829
No log 5.2381 220 0.7938 0.3980 0.7938 0.8909
No log 5.2857 222 0.8869 0.3440 0.8869 0.9418
No log 5.3333 224 0.9051 0.3565 0.9051 0.9514
No log 5.3810 226 0.8013 0.3401 0.8013 0.8952
No log 5.4286 228 0.7646 0.4158 0.7646 0.8744
No log 5.4762 230 0.7742 0.4179 0.7742 0.8799
No log 5.5238 232 0.7934 0.4179 0.7934 0.8907
No log 5.5714 234 0.7807 0.3928 0.7807 0.8836
No log 5.6190 236 0.7851 0.3069 0.7851 0.8861
No log 5.6667 238 0.8186 0.3274 0.8186 0.9048
No log 5.7143 240 0.7894 0.3434 0.7894 0.8885
No log 5.7619 242 0.7560 0.3460 0.7560 0.8695
No log 5.8095 244 0.7347 0.3225 0.7347 0.8572
No log 5.8571 246 0.7282 0.3703 0.7282 0.8534
No log 5.9048 248 0.7138 0.3703 0.7138 0.8449
No log 5.9524 250 0.7404 0.3211 0.7404 0.8605
No log 6.0 252 0.8137 0.3662 0.8137 0.9020
No log 6.0476 254 0.7967 0.3662 0.7967 0.8926
No log 6.0952 256 0.7392 0.3296 0.7392 0.8598
No log 6.1429 258 0.7278 0.2965 0.7278 0.8531
No log 6.1905 260 0.7247 0.3628 0.7247 0.8513
No log 6.2381 262 0.7260 0.3677 0.7260 0.8521
No log 6.2857 264 0.7407 0.4114 0.7407 0.8606
No log 6.3333 266 0.7781 0.3689 0.7781 0.8821
No log 6.3810 268 0.8079 0.3769 0.8079 0.8988
No log 6.4286 270 0.8164 0.3640 0.8164 0.9036
No log 6.4762 272 0.8338 0.2960 0.8338 0.9131
No log 6.5238 274 0.9035 0.2697 0.9035 0.9505
No log 6.5714 276 0.9599 0.3274 0.9599 0.9797
No log 6.6190 278 1.0242 0.2509 1.0242 1.0120
No log 6.6667 280 0.9257 0.3280 0.9257 0.9621
No log 6.7143 282 0.8062 0.2806 0.8062 0.8979
No log 6.7619 284 0.7739 0.3556 0.7739 0.8797
No log 6.8095 286 0.7716 0.3198 0.7716 0.8784
No log 6.8571 288 0.8004 0.3478 0.8004 0.8947
No log 6.9048 290 0.9474 0.2916 0.9474 0.9733
No log 6.9524 292 0.9965 0.2682 0.9965 0.9982
No log 7.0 294 0.9020 0.3119 0.9020 0.9497
No log 7.0476 296 0.7798 0.3239 0.7798 0.8830
No log 7.0952 298 0.7459 0.3252 0.7459 0.8636
No log 7.1429 300 0.7628 0.1999 0.7628 0.8734
No log 7.1905 302 0.7636 0.3138 0.7636 0.8738
No log 7.2381 304 0.7482 0.3738 0.7482 0.8650
No log 7.2857 306 0.7441 0.3738 0.7441 0.8626
No log 7.3333 308 0.7667 0.3887 0.7667 0.8756
No log 7.3810 310 0.7508 0.3643 0.7508 0.8665
No log 7.4286 312 0.7286 0.4190 0.7286 0.8536
No log 7.4762 314 0.7264 0.4190 0.7264 0.8523
No log 7.5238 316 0.7425 0.4059 0.7425 0.8617
No log 7.5714 318 0.7645 0.3260 0.7645 0.8743
No log 7.6190 320 0.7930 0.3506 0.7930 0.8905
No log 7.6667 322 0.8408 0.3548 0.8408 0.9170
No log 7.7143 324 0.8271 0.3613 0.8271 0.9094
No log 7.7619 326 0.7684 0.3574 0.7684 0.8766
No log 7.8095 328 0.7600 0.3598 0.7600 0.8718
No log 7.8571 330 0.7724 0.3325 0.7724 0.8788
No log 7.9048 332 0.8260 0.4179 0.8260 0.9088
No log 7.9524 334 0.9421 0.3216 0.9421 0.9706
No log 8.0 336 0.9901 0.2916 0.9901 0.9950
No log 8.0476 338 0.9329 0.3527 0.9329 0.9659
No log 8.0952 340 0.8162 0.3401 0.8162 0.9034
No log 8.1429 342 0.7730 0.3485 0.7730 0.8792
No log 8.1905 344 0.7730 0.3485 0.7730 0.8792
No log 8.2381 346 0.7825 0.2689 0.7825 0.8846
No log 8.2857 348 0.8212 0.3841 0.8212 0.9062
No log 8.3333 350 0.8328 0.3633 0.8328 0.9126
No log 8.3810 352 0.8726 0.3586 0.8726 0.9341
No log 8.4286 354 0.8255 0.3662 0.8255 0.9086
No log 8.4762 356 0.7975 0.3051 0.7975 0.8930
No log 8.5238 358 0.8082 0.3393 0.8082 0.8990
No log 8.5714 360 0.7889 0.3172 0.7889 0.8882
No log 8.6190 362 0.8205 0.3447 0.8205 0.9058
No log 8.6667 364 0.8956 0.3802 0.8956 0.9464
No log 8.7143 366 0.8863 0.3379 0.8863 0.9415
No log 8.7619 368 0.8090 0.3864 0.8090 0.8995
No log 8.8095 370 0.7764 0.3715 0.7764 0.8812
No log 8.8571 372 0.8085 0.3628 0.8085 0.8991
No log 8.9048 374 0.7956 0.3605 0.7956 0.8920
No log 8.9524 376 0.7799 0.4288 0.7799 0.8831
No log 9.0 378 0.8575 0.3780 0.8575 0.9260
No log 9.0476 380 0.8653 0.3309 0.8653 0.9302
No log 9.0952 382 0.8082 0.4262 0.8082 0.8990
No log 9.1429 384 0.7803 0.4086 0.7803 0.8834
No log 9.1905 386 0.7935 0.4074 0.7935 0.8908
No log 9.2381 388 0.8036 0.4022 0.8036 0.8965
No log 9.2857 390 0.8153 0.4022 0.8153 0.9029
No log 9.3333 392 0.8108 0.4253 0.8108 0.9004
No log 9.3810 394 0.7850 0.3442 0.7850 0.8860
No log 9.4286 396 0.7833 0.3643 0.7833 0.8851
No log 9.4762 398 0.8147 0.4029 0.8147 0.9026
No log 9.5238 400 0.8329 0.4186 0.8329 0.9126
No log 9.5714 402 0.7977 0.3990 0.7977 0.8932
No log 9.6190 404 0.7974 0.3713 0.7974 0.8929
No log 9.6667 406 0.8272 0.3688 0.8272 0.9095
No log 9.7143 408 0.8039 0.3713 0.8039 0.8966
No log 9.7619 410 0.7733 0.3995 0.7733 0.8794
No log 9.8095 412 0.7659 0.3702 0.7659 0.8751
No log 9.8571 414 0.7716 0.3702 0.7716 0.8784
No log 9.9048 416 0.7732 0.3970 0.7732 0.8793
No log 9.9524 418 0.7640 0.3308 0.7640 0.8741
No log 10.0 420 0.7535 0.3603 0.7535 0.8681
No log 10.0476 422 0.7493 0.4006 0.7493 0.8656
No log 10.0952 424 0.7375 0.3738 0.7375 0.8588
No log 10.1429 426 0.7328 0.3738 0.7328 0.8560
No log 10.1905 428 0.7310 0.4524 0.7310 0.8550
No log 10.2381 430 0.7318 0.4126 0.7318 0.8555
No log 10.2857 432 0.7338 0.3885 0.7338 0.8566
No log 10.3333 434 0.7845 0.3613 0.7845 0.8857
No log 10.3810 436 0.8341 0.4684 0.8341 0.9133
No log 10.4286 438 0.7704 0.5059 0.7704 0.8777
No log 10.4762 440 0.6853 0.4438 0.6853 0.8278
No log 10.5238 442 0.6826 0.3692 0.6826 0.8262
No log 10.5714 444 0.6772 0.3738 0.6772 0.8230
No log 10.6190 446 0.6870 0.4051 0.6870 0.8289
No log 10.6667 448 0.6956 0.4205 0.6956 0.8340
No log 10.7143 450 0.7334 0.3417 0.7334 0.8564
No log 10.7619 452 0.7514 0.3481 0.7514 0.8668
No log 10.8095 454 0.7112 0.3833 0.7112 0.8433
No log 10.8571 456 0.7058 0.4051 0.7058 0.8401
No log 10.9048 458 0.7427 0.3366 0.7427 0.8618
No log 10.9524 460 0.8388 0.4002 0.8388 0.9158
No log 11.0 462 0.9270 0.2799 0.9270 0.9628
No log 11.0476 464 0.9563 0.2846 0.9563 0.9779
No log 11.0952 466 0.8890 0.3204 0.8890 0.9429
No log 11.1429 468 0.8861 0.3204 0.8861 0.9413
No log 11.1905 470 0.9144 0.3204 0.9144 0.9562
No log 11.2381 472 0.8148 0.4051 0.8148 0.9026
No log 11.2857 474 0.7387 0.3978 0.7387 0.8594
No log 11.3333 476 0.7603 0.4079 0.7603 0.8720
No log 11.3810 478 0.8040 0.2996 0.8040 0.8967
No log 11.4286 480 0.8193 0.2996 0.8193 0.9051
No log 11.4762 482 0.8408 0.2996 0.8408 0.9169
No log 11.5238 484 0.8091 0.3546 0.8091 0.8995
No log 11.5714 486 0.7684 0.3786 0.7684 0.8766
No log 11.6190 488 0.7626 0.3691 0.7626 0.8733
No log 11.6667 490 0.7546 0.3096 0.7546 0.8687
No log 11.7143 492 0.7515 0.3293 0.7515 0.8669
No log 11.7619 494 0.7640 0.3293 0.7640 0.8741
No log 11.8095 496 0.7820 0.3186 0.7820 0.8843
No log 11.8571 498 0.8028 0.3581 0.8028 0.8960
0.3713 11.9048 500 0.8466 0.3417 0.8466 0.9201
0.3713 11.9524 502 0.9237 0.3099 0.9237 0.9611
0.3713 12.0 504 0.9878 0.3214 0.9878 0.9939
0.3713 12.0476 506 1.0097 0.3517 1.0097 1.0048
0.3713 12.0952 508 0.8674 0.3789 0.8674 0.9314
0.3713 12.1429 510 0.7800 0.4074 0.7800 0.8832
0.3713 12.1905 512 0.7607 0.3715 0.7607 0.8722
0.3713 12.2381 514 0.8044 0.3640 0.8044 0.8969
0.3713 12.2857 516 0.9256 0.2578 0.9256 0.9621
0.3713 12.3333 518 0.9485 0.2147 0.9485 0.9739
0.3713 12.3810 520 0.9782 0.2510 0.9782 0.9890
0.3713 12.4286 522 1.0272 0.2703 1.0272 1.0135
0.3713 12.4762 524 1.0863 0.2141 1.0863 1.0422

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
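To reproduce this environment, the versions above can be pinned at install time. A sketch, assuming a CUDA 11.8 system (adjust or drop the PyTorch index URL for other setups):

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```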
Model size

  • 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k8_task7_organization

This model is fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base model).