ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k15_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6508
  • Qwk: 0.4350
  • Mse: 0.6508
  • Rmse: 0.8067
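
For context, QWK (quadratic weighted kappa) measures agreement between predicted and gold scores, penalizing disagreements by their squared distance, while MSE/RMSE are the usual regression errors. A minimal, dependency-free sketch of both metrics (the function names are illustrative, not taken from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK = 1 - (observed weighted disagreement) / (expected weighted
    disagreement under independent marginals), with quadratic weights."""
    n = len(y_true)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes))
                 for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Perfect agreement gives QWK = 1.0 and chance-level agreement gives roughly 0, so the reported 0.4350 indicates moderate agreement between predicted and reference scores.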

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
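
With lr_scheduler_type: linear and no warmup configured, the learning rate starts at 2e-05 and decays linearly to zero over the scheduled number of steps. A minimal sketch of that schedule, mirroring the behavior of the linear schedule in Transformers (the function name is illustrative):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linearly ramp up over warmup_steps, then decay to zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# With no warmup the rate starts at 2e-05, halves at the midpoint,
# and reaches zero at the final step.
print(linear_lr(0, 1000), linear_lr(500, 1000), linear_lr(1000, 1000))
```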

Training results

Training loss is logged every 500 optimizer steps, so rows before step 500 show "No log" in the Training Loss column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0260 2 2.5593 -0.0924 2.5593 1.5998
No log 0.0519 4 1.2376 0.0686 1.2376 1.1125
No log 0.0779 6 0.8719 0.0822 0.8719 0.9337
No log 0.1039 8 1.0805 0.2308 1.0805 1.0395
No log 0.1299 10 1.3208 0.0263 1.3208 1.1493
No log 0.1558 12 1.2656 0.0578 1.2656 1.1250
No log 0.1818 14 0.9891 0.2504 0.9891 0.9945
No log 0.2078 16 0.8874 0.2063 0.8874 0.9420
No log 0.2338 18 0.8646 0.0966 0.8646 0.9298
No log 0.2597 20 0.8894 0.1598 0.8894 0.9431
No log 0.2857 22 0.8761 0.2883 0.8761 0.9360
No log 0.3117 24 0.8378 0.2950 0.8378 0.9153
No log 0.3377 26 0.8161 0.1359 0.8161 0.9034
No log 0.3636 28 0.8099 -0.0483 0.8099 0.8999
No log 0.3896 30 0.8055 0.0481 0.8055 0.8975
No log 0.4156 32 0.7900 0.0481 0.7900 0.8888
No log 0.4416 34 0.7838 0.0757 0.7838 0.8853
No log 0.4675 36 0.7812 0.0798 0.7812 0.8839
No log 0.4935 38 0.7376 0.1863 0.7376 0.8589
No log 0.5195 40 0.7173 0.2652 0.7173 0.8469
No log 0.5455 42 0.7231 0.0265 0.7231 0.8504
No log 0.5714 44 0.7293 0.0265 0.7293 0.8540
No log 0.5974 46 0.7005 0.1456 0.7005 0.8369
No log 0.6234 48 0.7278 0.1139 0.7278 0.8531
No log 0.6494 50 0.7201 0.1007 0.7201 0.8486
No log 0.6753 52 0.7146 0.1225 0.7146 0.8454
No log 0.7013 54 0.7775 0.2754 0.7775 0.8818
No log 0.7273 56 0.8052 0.3105 0.8052 0.8973
No log 0.7532 58 0.7485 0.3032 0.7485 0.8652
No log 0.7792 60 0.7224 0.3099 0.7224 0.8499
No log 0.8052 62 0.7961 0.3606 0.7961 0.8922
No log 0.8312 64 0.7115 0.3662 0.7115 0.8435
No log 0.8571 66 0.6522 0.3153 0.6522 0.8076
No log 0.8831 68 0.6607 0.3426 0.6607 0.8128
No log 0.9091 70 0.6901 0.3399 0.6901 0.8307
No log 0.9351 72 0.9294 0.3889 0.9294 0.9641
No log 0.9610 74 0.9612 0.3889 0.9612 0.9804
No log 0.9870 76 0.7482 0.3843 0.7482 0.8650
No log 1.0130 78 0.6676 0.2261 0.6676 0.8171
No log 1.0390 80 0.7065 0.2847 0.7065 0.8405
No log 1.0649 82 0.7792 0.3894 0.7792 0.8827
No log 1.0909 84 0.8530 0.3991 0.8530 0.9236
No log 1.1169 86 0.8638 0.3918 0.8638 0.9294
No log 1.1429 88 0.8082 0.3770 0.8082 0.8990
No log 1.1688 90 0.7270 0.3794 0.7270 0.8526
No log 1.1948 92 0.7476 0.4036 0.7476 0.8646
No log 1.2208 94 0.8148 0.4315 0.8148 0.9027
No log 1.2468 96 0.8265 0.4597 0.8265 0.9091
No log 1.2727 98 0.8640 0.4437 0.8640 0.9295
No log 1.2987 100 0.8531 0.4262 0.8531 0.9236
No log 1.3247 102 0.7843 0.3944 0.7843 0.8856
No log 1.3506 104 0.7790 0.5683 0.7790 0.8826
No log 1.3766 106 0.7766 0.5494 0.7766 0.8812
No log 1.4026 108 0.7510 0.4244 0.7510 0.8666
No log 1.4286 110 0.8535 0.4200 0.8535 0.9238
No log 1.4545 112 0.8428 0.4200 0.8428 0.9180
No log 1.4805 114 0.6884 0.4550 0.6884 0.8297
No log 1.5065 116 0.7425 0.4302 0.7425 0.8617
No log 1.5325 118 0.7680 0.4302 0.7680 0.8763
No log 1.5584 120 0.6876 0.3603 0.6876 0.8292
No log 1.5844 122 0.7120 0.3746 0.7120 0.8438
No log 1.6104 124 0.7855 0.3776 0.7855 0.8863
No log 1.6364 126 0.8018 0.3604 0.8018 0.8955
No log 1.6623 128 0.7518 0.3963 0.7518 0.8671
No log 1.6883 130 0.6855 0.2981 0.6855 0.8280
No log 1.7143 132 0.6817 0.3599 0.6817 0.8257
No log 1.7403 134 0.7018 0.3425 0.7018 0.8377
No log 1.7662 136 0.8145 0.3776 0.8145 0.9025
No log 1.7922 138 0.7996 0.3379 0.7996 0.8942
No log 1.8182 140 0.7439 0.3820 0.7439 0.8625
No log 1.8442 142 0.7272 0.4393 0.7272 0.8528
No log 1.8701 144 0.7152 0.4029 0.7152 0.8457
No log 1.8961 146 0.7305 0.4059 0.7305 0.8547
No log 1.9221 148 0.7569 0.4036 0.7569 0.8700
No log 1.9481 150 0.7804 0.4197 0.7804 0.8834
No log 1.9740 152 0.8323 0.4153 0.8323 0.9123
No log 2.0 154 0.7836 0.4123 0.7836 0.8852
No log 2.0260 156 0.6623 0.3688 0.6623 0.8138
No log 2.0519 158 0.6391 0.4308 0.6391 0.7995
No log 2.0779 160 0.6544 0.4044 0.6544 0.8089
No log 2.1039 162 0.6502 0.4675 0.6502 0.8064
No log 2.1299 164 0.6676 0.4981 0.6676 0.8171
No log 2.1558 166 0.6565 0.5333 0.6565 0.8102
No log 2.1818 168 0.6760 0.4924 0.6760 0.8222
No log 2.2078 170 0.7735 0.4476 0.7735 0.8795
No log 2.2338 172 0.8160 0.4592 0.8160 0.9033
No log 2.2597 174 0.6988 0.4587 0.6988 0.8360
No log 2.2857 176 0.6760 0.4582 0.6760 0.8222
No log 2.3117 178 0.7373 0.4664 0.7373 0.8587
No log 2.3377 180 0.7941 0.4943 0.7941 0.8911
No log 2.3636 182 0.7455 0.4464 0.7455 0.8634
No log 2.3896 184 0.7467 0.4408 0.7467 0.8641
No log 2.4156 186 0.8125 0.4057 0.8125 0.9014
No log 2.4416 188 0.8230 0.4438 0.8230 0.9072
No log 2.4675 190 0.7504 0.4315 0.7504 0.8663
No log 2.4935 192 0.6695 0.4328 0.6695 0.8182
No log 2.5195 194 0.6525 0.3923 0.6525 0.8078
No log 2.5455 196 0.6383 0.4434 0.6383 0.7989
No log 2.5714 198 0.7259 0.4745 0.7259 0.8520
No log 2.5974 200 0.9152 0.4363 0.9152 0.9567
No log 2.6234 202 1.0286 0.3697 1.0286 1.0142
No log 2.6494 204 0.8876 0.4032 0.8876 0.9421
No log 2.6753 206 0.6939 0.4294 0.6939 0.8330
No log 2.7013 208 0.6492 0.4016 0.6492 0.8057
No log 2.7273 210 0.6495 0.4016 0.6495 0.8059
No log 2.7532 212 0.6787 0.3891 0.6787 0.8238
No log 2.7792 214 0.7994 0.2518 0.7994 0.8941
No log 2.8052 216 0.8272 0.2518 0.8272 0.9095
No log 2.8312 218 0.8621 0.2518 0.8621 0.9285
No log 2.8571 220 0.8959 0.2892 0.8959 0.9465
No log 2.8831 222 0.9109 0.2463 0.9109 0.9544
No log 2.9091 224 0.8384 0.3372 0.8384 0.9156
No log 2.9351 226 0.7839 0.2754 0.7839 0.8854
No log 2.9610 228 0.7526 0.2847 0.7526 0.8675
No log 2.9870 230 0.7467 0.3399 0.7467 0.8641
No log 3.0130 232 0.7217 0.3399 0.7217 0.8496
No log 3.0390 234 0.7083 0.3355 0.7083 0.8416
No log 3.0649 236 0.7079 0.4147 0.7079 0.8414
No log 3.0909 238 0.7309 0.4472 0.7309 0.8549
No log 3.1169 240 0.7866 0.4205 0.7866 0.8869
No log 3.1429 242 0.7836 0.4272 0.7836 0.8852
No log 3.1688 244 0.6987 0.4638 0.6987 0.8359
No log 3.1948 246 0.7078 0.3723 0.7078 0.8413
No log 3.2208 248 0.8832 0.3371 0.8832 0.9398
No log 3.2468 250 1.0360 0.3233 1.0360 1.0178
No log 3.2727 252 1.0110 0.3114 1.0110 1.0055
No log 3.2987 254 0.8221 0.3846 0.8221 0.9067
No log 3.3247 256 0.7128 0.4350 0.7128 0.8443
No log 3.3506 258 0.7249 0.4294 0.7249 0.8514
No log 3.3766 260 0.8665 0.3409 0.8665 0.9309
No log 3.4026 262 1.0434 0.3114 1.0434 1.0215
No log 3.4286 264 1.1135 0.2612 1.1135 1.0552
No log 3.4545 266 0.9625 0.2939 0.9625 0.9811
No log 3.4805 268 0.7763 0.4167 0.7763 0.8811
No log 3.5065 270 0.7401 0.4007 0.7401 0.8603
No log 3.5325 272 0.7342 0.3769 0.7342 0.8568
No log 3.5584 274 0.7173 0.4887 0.7173 0.8469
No log 3.5844 276 0.7159 0.4887 0.7159 0.8461
No log 3.6104 278 0.6931 0.4562 0.6931 0.8325
No log 3.6364 280 0.6896 0.4244 0.6896 0.8304
No log 3.6623 282 0.6755 0.4244 0.6755 0.8219
No log 3.6883 284 0.6840 0.3035 0.6840 0.8270
No log 3.7143 286 0.7132 0.2706 0.7132 0.8445
No log 3.7403 288 0.7353 0.2763 0.7353 0.8575
No log 3.7662 290 0.7230 0.3739 0.7230 0.8503
No log 3.7922 292 0.7139 0.4425 0.7139 0.8449
No log 3.8182 294 0.6543 0.4472 0.6543 0.8089
No log 3.8442 296 0.6227 0.5143 0.6227 0.7891
No log 3.8701 298 0.6170 0.5143 0.6170 0.7855
No log 3.8961 300 0.6103 0.5353 0.6103 0.7812
No log 3.9221 302 0.6841 0.3914 0.6841 0.8271
No log 3.9481 304 0.7325 0.3450 0.7325 0.8559
No log 3.9740 306 0.7111 0.3843 0.7111 0.8432
No log 4.0 308 0.6645 0.4430 0.6645 0.8152
No log 4.0260 310 0.6662 0.4801 0.6662 0.8162
No log 4.0519 312 0.6962 0.4568 0.6962 0.8344
No log 4.0779 314 0.7419 0.4197 0.7419 0.8613
No log 4.1039 316 0.7885 0.4153 0.7885 0.8880
No log 4.1299 318 0.7393 0.4197 0.7393 0.8598
No log 4.1558 320 0.6665 0.4622 0.6665 0.8164
No log 4.1818 322 0.6434 0.5190 0.6434 0.8021
No log 4.2078 324 0.6730 0.4968 0.6730 0.8204
No log 4.2338 326 0.7502 0.4107 0.7502 0.8661
No log 4.2597 328 0.7908 0.3826 0.7908 0.8893
No log 4.2857 330 0.7789 0.3461 0.7789 0.8825
No log 4.3117 332 0.6899 0.3723 0.6899 0.8306
No log 4.3377 334 0.6483 0.4362 0.6483 0.8052
No log 4.3636 336 0.6579 0.4831 0.6579 0.8111
No log 4.3896 338 0.6668 0.5238 0.6668 0.8166
No log 4.4156 340 0.6919 0.4562 0.6919 0.8318
No log 4.4416 342 0.6847 0.4322 0.6847 0.8275
No log 4.4675 344 0.6750 0.4562 0.6750 0.8216
No log 4.4935 346 0.6771 0.3628 0.6771 0.8228
No log 4.5195 348 0.6688 0.4719 0.6688 0.8178
No log 4.5455 350 0.7114 0.3594 0.7114 0.8434
No log 4.5714 352 0.7823 0.3440 0.7823 0.8845
No log 4.5974 354 0.7725 0.3207 0.7725 0.8789
No log 4.6234 356 0.7203 0.3793 0.7203 0.8487
No log 4.6494 358 0.7099 0.3891 0.7099 0.8425
No log 4.6753 360 0.7028 0.3841 0.7028 0.8383
No log 4.7013 362 0.7264 0.3500 0.7264 0.8523
No log 4.7273 364 0.8161 0.3497 0.8161 0.9034
No log 4.7532 366 0.9276 0.3193 0.9276 0.9631
No log 4.7792 368 1.0180 0.2756 1.0180 1.0089
No log 4.8052 370 1.0909 0.2889 1.0909 1.0445
No log 4.8312 372 0.9931 0.2910 0.9931 0.9965
No log 4.8571 374 0.9157 0.3433 0.9157 0.9569
No log 4.8831 376 0.8451 0.3450 0.8451 0.9193
No log 4.9091 378 0.7821 0.3723 0.7821 0.8844
No log 4.9351 380 0.7334 0.4281 0.7334 0.8564
No log 4.9610 382 0.7091 0.4924 0.7091 0.8421
No log 4.9870 384 0.7097 0.5174 0.7097 0.8424
No log 5.0130 386 0.7172 0.5368 0.7172 0.8469
No log 5.0390 388 0.7452 0.4715 0.7452 0.8632
No log 5.0649 390 0.7783 0.4262 0.7783 0.8822
No log 5.0909 392 0.7523 0.4715 0.7523 0.8674
No log 5.1169 394 0.7051 0.4856 0.7051 0.8397
No log 5.1429 396 0.6810 0.4103 0.6810 0.8252
No log 5.1688 398 0.6461 0.4576 0.6461 0.8038
No log 5.1948 400 0.6391 0.4091 0.6391 0.7994
No log 5.2208 402 0.6448 0.4576 0.6448 0.8030
No log 5.2468 404 0.6728 0.4179 0.6728 0.8203
No log 5.2727 406 0.6898 0.4179 0.6898 0.8305
No log 5.2987 408 0.6979 0.5021 0.6979 0.8354
No log 5.3247 410 0.7238 0.5434 0.7238 0.8508
No log 5.3506 412 0.7758 0.3754 0.7758 0.8808
No log 5.3766 414 0.8264 0.3930 0.8264 0.9091
No log 5.4026 416 0.8066 0.4059 0.8066 0.8981
No log 5.4286 418 0.7801 0.3770 0.7801 0.8832
No log 5.4545 420 0.7822 0.3819 0.7822 0.8844
No log 5.4805 422 0.8016 0.3819 0.8016 0.8953
No log 5.5065 424 0.8450 0.3819 0.8450 0.9192
No log 5.5325 426 0.9236 0.3688 0.9236 0.9610
No log 5.5584 428 0.9145 0.3520 0.9145 0.9563
No log 5.5844 430 0.8945 0.3243 0.8945 0.9458
No log 5.6104 432 0.8930 0.3005 0.8930 0.9450
No log 5.6364 434 0.9175 0.3005 0.9175 0.9579
No log 5.6623 436 0.8753 0.2975 0.8753 0.9356
No log 5.6883 438 0.8842 0.3195 0.8842 0.9403
No log 5.7143 440 0.8259 0.3402 0.8259 0.9088
No log 5.7403 442 0.7577 0.3076 0.7577 0.8705
No log 5.7662 444 0.7655 0.3971 0.7655 0.8749
No log 5.7922 446 0.7968 0.4183 0.7968 0.8926
No log 5.8182 448 0.8354 0.3700 0.8354 0.9140
No log 5.8442 450 0.7809 0.4340 0.7809 0.8837
No log 5.8701 452 0.7328 0.4127 0.7328 0.8560
No log 5.8961 454 0.7354 0.4286 0.7354 0.8576
No log 5.9221 456 0.7256 0.4093 0.7256 0.8518
No log 5.9481 458 0.7079 0.4090 0.7079 0.8414
No log 5.9740 460 0.7165 0.4327 0.7165 0.8465
No log 6.0 462 0.7091 0.4540 0.7091 0.8421
No log 6.0260 464 0.6934 0.4465 0.6934 0.8327
No log 6.0519 466 0.6974 0.5024 0.6974 0.8351
No log 6.0779 468 0.6955 0.5024 0.6955 0.8340
No log 6.1039 470 0.6725 0.5143 0.6725 0.8201
No log 6.1299 472 0.6689 0.5434 0.6689 0.8179
No log 6.1558 474 0.6942 0.4767 0.6942 0.8332
No log 6.1818 476 0.7195 0.4747 0.7195 0.8482
No log 6.2078 478 0.6705 0.4912 0.6705 0.8188
No log 6.2338 480 0.6198 0.4924 0.6198 0.7873
No log 6.2597 482 0.6066 0.5050 0.6066 0.7788
No log 6.2857 484 0.5975 0.4486 0.5975 0.7730
No log 6.3117 486 0.6118 0.4413 0.6118 0.7822
No log 6.3377 488 0.6798 0.4997 0.6798 0.8245
No log 6.3636 490 0.7569 0.4819 0.7569 0.8700
No log 6.3896 492 0.7261 0.5259 0.7261 0.8521
No log 6.4156 494 0.6877 0.4681 0.6877 0.8293
No log 6.4416 496 0.6920 0.4681 0.6920 0.8318
No log 6.4675 498 0.6807 0.4681 0.6807 0.8250
0.3447 6.4935 500 0.6543 0.4681 0.6543 0.8089
0.3447 6.5195 502 0.6350 0.4758 0.6350 0.7969
0.3447 6.5455 504 0.6474 0.4801 0.6474 0.8046
0.3447 6.5714 506 0.6618 0.4315 0.6618 0.8135
0.3447 6.5974 508 0.6606 0.4451 0.6606 0.8127
0.3447 6.6234 510 0.6501 0.4413 0.6501 0.8063
0.3447 6.6494 512 0.6711 0.4883 0.6711 0.8192
0.3447 6.6753 514 0.6907 0.4681 0.6907 0.8311
0.3447 6.7013 516 0.6849 0.4627 0.6849 0.8276
0.3447 6.7273 518 0.7049 0.4321 0.7049 0.8396
0.3447 6.7532 520 0.6772 0.4625 0.6772 0.8229
0.3447 6.7792 522 0.6508 0.4350 0.6508 0.8067
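
Note that in every row the reported loss equals the Mse column and Rmse is its square root, which suggests (though the card does not state it) that the model was trained with an MSE objective. For the final evaluation row:

```python
import math

# Final-row check: Rmse should be the square root of the Mse/Loss value.
mse = 0.6508
rmse = math.sqrt(mse)
print(round(rmse, 4))  # 0.8067, matching the reported Rmse
```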

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
