ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7279
  • Qwk: 0.3637
  • Mse: 0.7279
  • Rmse: 0.8532
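For context: Qwk is Cohen's quadratically weighted kappa (standard for ordinal essay scores), and Rmse is the square root of Mse; Loss and Mse coincide because the model is trained with an MSE regression loss. A minimal pure-Python sketch of both metrics (function names and example data are illustrative, not from this repository):

```python
def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Cohen's kappa with quadratic weights over integer ratings."""
    n = max_rating - min_rating + 1
    # Observed agreement (confusion) matrix
    O = [[0.0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        O[t - min_rating][p - min_rating] += 1.0
    total = len(y_true)
    # Marginal histograms of true and predicted ratings
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n)) for j in range(n)]
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2      # quadratic disagreement weight
            E = hist_t[i] * hist_p[j] / total    # expected count under independence
            num += w * O[i][j]
            den += w * E
    return 1.0 - num / den

def rmse(y_true, y_pred):
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 0, 3))  # → 1.0 (perfect agreement)
```

In practice the same numbers come from `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")`; the sketch above just makes the weighting explicit.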

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
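The linear scheduler decays the learning rate from its peak to zero over the full run (after any warmup; none is listed here). A small sketch of that schedule, assuming zero warmup steps (the function name and signature are illustrative):

```python
def linear_lr(step, total_steps, peak_lr=2e-05, warmup_steps=0):
    """Linear warmup to peak_lr, then linear decay to 0 (Hugging Face-style)."""
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return peak_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

# The results table shows ~41 optimizer steps per epoch (epoch 2.0 at step 82),
# so 100 epochs would give roughly 4100 total steps.
total_steps = 41 * 100
print(linear_lr(0, total_steps))     # peak: 2e-05
print(linear_lr(2050, total_steps))  # halfway: 1e-05
```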

Training results

In the table below, "No log" in the training-loss column means no training loss had been logged yet at that step; the first logged value (0.3094) appears at step 500, the Trainer's default logging interval. Although 100 epochs were configured, the log ends at epoch ~13.6, suggesting early stopping or a truncated run.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0488 2 2.7541 -0.0262 2.7541 1.6595
No log 0.0976 4 1.5172 0.0526 1.5172 1.2317
No log 0.1463 6 1.0957 -0.0970 1.0957 1.0467
No log 0.1951 8 1.0551 -0.0784 1.0551 1.0272
No log 0.2439 10 1.1785 0.0391 1.1785 1.0856
No log 0.2927 12 1.1967 0.0365 1.1967 1.0939
No log 0.3415 14 1.2195 0.0391 1.2195 1.1043
No log 0.3902 16 1.0439 -0.0586 1.0439 1.0217
No log 0.4390 18 0.9441 -0.0860 0.9441 0.9717
No log 0.4878 20 0.8739 -0.0462 0.8739 0.9348
No log 0.5366 22 0.9456 0.0584 0.9456 0.9724
No log 0.5854 24 1.0990 -0.0133 1.0990 1.0483
No log 0.6341 26 1.0681 0.1264 1.0681 1.0335
No log 0.6829 28 0.9909 0.2027 0.9909 0.9954
No log 0.7317 30 0.9280 0.1718 0.9280 0.9633
No log 0.7805 32 0.8070 0.1504 0.8070 0.8983
No log 0.8293 34 0.7583 0.0966 0.7583 0.8708
No log 0.8780 36 0.8054 0.2063 0.8054 0.8975
No log 0.9268 38 0.8877 0.3425 0.8877 0.9422
No log 0.9756 40 0.9212 0.3425 0.9212 0.9598
No log 1.0244 42 0.8563 0.3169 0.8563 0.9254
No log 1.0732 44 0.8307 0.2967 0.8307 0.9114
No log 1.1220 46 0.8550 0.2812 0.8550 0.9247
No log 1.1707 48 0.8127 0.2817 0.8127 0.9015
No log 1.2195 50 0.7725 0.3127 0.7725 0.8789
No log 1.2683 52 0.7630 0.2004 0.7630 0.8735
No log 1.3171 54 0.7711 0.2224 0.7711 0.8781
No log 1.3659 56 0.8549 0.3475 0.8549 0.9246
No log 1.4146 58 0.9351 0.3710 0.9351 0.9670
No log 1.4634 60 0.9068 0.4023 0.9068 0.9523
No log 1.5122 62 0.8303 0.3931 0.8303 0.9112
No log 1.5610 64 0.8465 0.3754 0.8465 0.9201
No log 1.6098 66 0.8313 0.4183 0.8313 0.9118
No log 1.6585 68 0.8177 0.4346 0.8177 0.9043
No log 1.7073 70 0.8242 0.3478 0.8242 0.9078
No log 1.7561 72 0.8072 0.3518 0.8072 0.8984
No log 1.8049 74 0.7668 0.2419 0.7668 0.8757
No log 1.8537 76 0.7727 0.3127 0.7727 0.8790
No log 1.9024 78 0.8368 0.2932 0.8368 0.9148
No log 1.9512 80 0.8292 0.3372 0.8292 0.9106
No log 2.0 82 0.8654 0.3372 0.8654 0.9303
No log 2.0488 84 0.9425 0.3169 0.9425 0.9708
No log 2.0976 86 0.9536 0.3231 0.9536 0.9765
No log 2.1463 88 0.8796 0.3688 0.8796 0.9379
No log 2.1951 90 0.8394 0.4369 0.8394 0.9162
No log 2.2439 92 0.8564 0.4193 0.8564 0.9254
No log 2.2927 94 0.9907 0.3941 0.9907 0.9954
No log 2.3415 96 1.1902 0.3237 1.1902 1.0910
No log 2.3902 98 1.1112 0.3424 1.1112 1.0541
No log 2.4390 100 0.9167 0.3456 0.9167 0.9574
No log 2.4878 102 0.9035 0.3194 0.9035 0.9505
No log 2.5366 104 0.9186 0.3194 0.9186 0.9584
No log 2.5854 106 0.8891 0.3844 0.8891 0.9429
No log 2.6341 108 0.9572 0.3100 0.9572 0.9784
No log 2.6829 110 1.0278 0.3236 1.0278 1.0138
No log 2.7317 112 0.9314 0.3456 0.9314 0.9651
No log 2.7805 114 0.7959 0.2215 0.7959 0.8921
No log 2.8293 116 0.8375 0.2300 0.8375 0.9151
No log 2.8780 118 0.9351 0.2603 0.9351 0.9670
No log 2.9268 120 0.8578 0.2787 0.8578 0.9262
No log 2.9756 122 0.7620 0.2053 0.7620 0.8730
No log 3.0244 124 0.7770 0.2215 0.7770 0.8814
No log 3.0732 126 0.8399 0.3127 0.8399 0.9165
No log 3.1220 128 0.8440 0.3127 0.8440 0.9187
No log 3.1707 130 0.9129 0.3008 0.9129 0.9554
No log 3.2195 132 0.9910 0.3076 0.9910 0.9955
No log 3.2683 134 1.1140 0.3003 1.1140 1.0554
No log 3.3171 136 1.0980 0.2613 1.0980 1.0479
No log 3.3659 138 1.0801 0.3052 1.0801 1.0393
No log 3.4146 140 1.0305 0.3892 1.0305 1.0151
No log 3.4634 142 1.1019 0.2613 1.1019 1.0497
No log 3.5122 144 1.0513 0.3290 1.0513 1.0253
No log 3.5610 146 1.0002 0.3782 1.0002 1.0001
No log 3.6098 148 0.9344 0.4030 0.9344 0.9666
No log 3.6585 150 0.9258 0.3960 0.9258 0.9622
No log 3.7073 152 0.8618 0.3675 0.8618 0.9283
No log 3.7561 154 0.8663 0.3231 0.8663 0.9308
No log 3.8049 156 0.8801 0.3425 0.8801 0.9382
No log 3.8537 158 0.8014 0.2116 0.8014 0.8952
No log 3.9024 160 0.7914 0.2116 0.7914 0.8896
No log 3.9512 162 0.7327 0.2527 0.7327 0.8560
No log 4.0 164 0.6887 0.3524 0.6887 0.8299
No log 4.0488 166 0.6644 0.2270 0.6644 0.8151
No log 4.0976 168 0.6896 0.3640 0.6896 0.8304
No log 4.1463 170 0.7857 0.3008 0.7857 0.8864
No log 4.1951 172 0.7920 0.3256 0.7920 0.8899
No log 4.2439 174 0.7363 0.4371 0.7363 0.8581
No log 4.2927 176 0.7682 0.3770 0.7682 0.8765
No log 4.3415 178 0.8174 0.3105 0.8174 0.9041
No log 4.3902 180 0.7966 0.3384 0.7966 0.8925
No log 4.4390 182 0.7253 0.3942 0.7253 0.8516
No log 4.4878 184 0.6879 0.2780 0.6879 0.8294
No log 4.5366 186 0.6848 0.3640 0.6848 0.8275
No log 4.5854 188 0.7187 0.3794 0.7187 0.8478
No log 4.6341 190 0.7384 0.4430 0.7384 0.8593
No log 4.6829 192 0.6838 0.3942 0.6838 0.8269
No log 4.7317 194 0.6445 0.3788 0.6445 0.8028
No log 4.7805 196 0.6352 0.3253 0.6352 0.7970
No log 4.8293 198 0.6392 0.3572 0.6392 0.7995
No log 4.8780 200 0.6735 0.3840 0.6735 0.8207
No log 4.9268 202 0.7517 0.3844 0.7517 0.8670
No log 4.9756 204 0.7671 0.3914 0.7671 0.8758
No log 5.0244 206 0.7470 0.4007 0.7470 0.8643
No log 5.0732 208 0.7456 0.3958 0.7456 0.8635
No log 5.1220 210 0.7522 0.4428 0.7522 0.8673
No log 5.1707 212 0.7606 0.4513 0.7606 0.8721
No log 5.2195 214 0.7138 0.4586 0.7138 0.8449
No log 5.2683 216 0.7019 0.4216 0.7019 0.8378
No log 5.3171 218 0.6881 0.4116 0.6881 0.8295
No log 5.3659 220 0.7017 0.3864 0.7017 0.8377
No log 5.4146 222 0.7587 0.4703 0.7587 0.8710
No log 5.4634 224 0.7516 0.4484 0.7516 0.8669
No log 5.5122 226 0.7373 0.4484 0.7373 0.8587
No log 5.5610 228 0.7353 0.4484 0.7353 0.8575
No log 5.6098 230 0.8206 0.4819 0.8206 0.9058
No log 5.6585 232 1.0114 0.3632 1.0114 1.0057
No log 5.7073 234 1.2126 0.2497 1.2126 1.1012
No log 5.7561 236 1.1018 0.3777 1.1018 1.0497
No log 5.8049 238 0.7892 0.4819 0.7892 0.8884
No log 5.8537 240 0.6648 0.4823 0.6648 0.8154
No log 5.9024 242 0.6549 0.5127 0.6549 0.8092
No log 5.9512 244 0.6539 0.4830 0.6539 0.8087
No log 6.0 246 0.6685 0.5030 0.6685 0.8176
No log 6.0488 248 0.7157 0.5030 0.7157 0.8460
No log 6.0976 250 0.6926 0.4812 0.6926 0.8323
No log 6.1463 252 0.6587 0.4322 0.6587 0.8116
No log 6.1951 254 0.6654 0.4806 0.6654 0.8157
No log 6.2439 256 0.6579 0.5320 0.6579 0.8111
No log 6.2927 258 0.7270 0.4795 0.7270 0.8526
No log 6.3415 260 0.8942 0.3887 0.8942 0.9456
No log 6.3902 262 0.9506 0.2999 0.9506 0.9750
No log 6.4390 264 0.8386 0.4133 0.8386 0.9158
No log 6.4878 266 0.7039 0.3817 0.7039 0.8390
No log 6.5366 268 0.6606 0.4493 0.6606 0.8128
No log 6.5854 270 0.6647 0.4493 0.6647 0.8153
No log 6.6341 272 0.7131 0.4782 0.7131 0.8445
No log 6.6829 274 0.7871 0.4369 0.7871 0.8872
No log 6.7317 276 0.7983 0.4275 0.7983 0.8935
No log 6.7805 278 0.7229 0.4625 0.7229 0.8503
No log 6.8293 280 0.6752 0.4315 0.6752 0.8217
No log 6.8780 282 0.6646 0.4315 0.6646 0.8152
No log 6.9268 284 0.6805 0.4315 0.6805 0.8249
No log 6.9756 286 0.6912 0.4315 0.6912 0.8314
No log 7.0244 288 0.6706 0.3864 0.6706 0.8189
No log 7.0732 290 0.6817 0.3864 0.6817 0.8257
No log 7.1220 292 0.7296 0.4782 0.7296 0.8542
No log 7.1707 294 0.8281 0.4045 0.8281 0.9100
No log 7.2195 296 0.8106 0.4045 0.8106 0.9003
No log 7.2683 298 0.7151 0.4854 0.7151 0.8456
No log 7.3171 300 0.6619 0.3864 0.6619 0.8136
No log 7.3659 302 0.6541 0.4257 0.6541 0.8087
No log 7.4146 304 0.6725 0.4103 0.6725 0.8201
No log 7.4634 306 0.6875 0.3841 0.6875 0.8291
No log 7.5122 308 0.7443 0.3936 0.7443 0.8627
No log 7.5610 310 0.7273 0.3936 0.7273 0.8528
No log 7.6098 312 0.6745 0.4168 0.6745 0.8213
No log 7.6585 314 0.6542 0.3601 0.6542 0.8088
No log 7.7073 316 0.6494 0.3601 0.6494 0.8059
No log 7.7561 318 0.6663 0.2193 0.6663 0.8163
No log 7.8049 320 0.6664 0.2215 0.6664 0.8164
No log 7.8537 322 0.6769 0.1884 0.6769 0.8227
No log 7.9024 324 0.7196 0.3127 0.7196 0.8483
No log 7.9512 326 0.7678 0.3519 0.7678 0.8763
No log 8.0 328 0.7318 0.3918 0.7318 0.8555
No log 8.0488 330 0.6646 0.1901 0.6646 0.8152
No log 8.0976 332 0.6352 0.3460 0.6352 0.7970
No log 8.1463 334 0.6312 0.4719 0.6312 0.7945
No log 8.1951 336 0.6349 0.4937 0.6349 0.7968
No log 8.2439 338 0.6737 0.4103 0.6737 0.8208
No log 8.2927 340 0.7332 0.4123 0.7332 0.8563
No log 8.3415 342 0.6982 0.3770 0.6982 0.8356
No log 8.3902 344 0.6246 0.4356 0.6246 0.7903
No log 8.4390 346 0.6099 0.5753 0.6099 0.7809
No log 8.4878 348 0.6140 0.4985 0.6140 0.7836
No log 8.5366 350 0.6353 0.3867 0.6353 0.7970
No log 8.5854 352 0.7051 0.4251 0.7051 0.8397
No log 8.6341 354 0.7984 0.4175 0.7984 0.8935
No log 8.6829 356 0.8523 0.4275 0.8523 0.9232
No log 8.7317 358 0.8735 0.3913 0.8735 0.9346
No log 8.7805 360 0.7890 0.4175 0.7890 0.8883
No log 8.8293 362 0.7099 0.3746 0.7099 0.8426
No log 8.8780 364 0.7173 0.3794 0.7173 0.8469
No log 8.9268 366 0.7299 0.4502 0.7299 0.8543
No log 8.9756 368 0.7570 0.4014 0.7570 0.8701
No log 9.0244 370 0.7685 0.3770 0.7685 0.8767
No log 9.0732 372 0.7579 0.3770 0.7579 0.8706
No log 9.1220 374 0.7095 0.2847 0.7095 0.8423
No log 9.1707 376 0.6775 0.2204 0.6775 0.8231
No log 9.2195 378 0.6537 0.2530 0.6537 0.8085
No log 9.2683 380 0.6370 0.3425 0.6370 0.7981
No log 9.3171 382 0.6544 0.4393 0.6544 0.8089
No log 9.3659 384 0.6999 0.4272 0.6999 0.8366
No log 9.4146 386 0.7558 0.4275 0.7558 0.8694
No log 9.4634 388 0.7851 0.4366 0.7851 0.8861
No log 9.5122 390 0.7253 0.3891 0.7253 0.8517
No log 9.5610 392 0.6423 0.3116 0.6423 0.8014
No log 9.6098 394 0.6269 0.2652 0.6269 0.7917
No log 9.6585 396 0.6406 0.3224 0.6406 0.8004
No log 9.7073 398 0.7354 0.4144 0.7354 0.8576
No log 9.7561 400 0.7996 0.3709 0.7996 0.8942
No log 9.8049 402 0.7479 0.4251 0.7479 0.8648
No log 9.8537 404 0.6849 0.4502 0.6849 0.8276
No log 9.9024 406 0.6637 0.4272 0.6637 0.8147
No log 9.9512 408 0.6287 0.4371 0.6287 0.7929
No log 10.0 410 0.6197 0.4740 0.6197 0.7872
No log 10.0488 412 0.6015 0.4073 0.6015 0.7755
No log 10.0976 414 0.6030 0.4125 0.6030 0.7765
No log 10.1463 416 0.6392 0.4272 0.6392 0.7995
No log 10.1951 418 0.7090 0.4014 0.7090 0.8420
No log 10.2439 420 0.7087 0.3746 0.7087 0.8418
No log 10.2927 422 0.6784 0.3746 0.6784 0.8237
No log 10.3415 424 0.6742 0.3746 0.6742 0.8211
No log 10.3902 426 0.6277 0.3918 0.6277 0.7923
No log 10.4390 428 0.6108 0.3615 0.6108 0.7815
No log 10.4878 430 0.6030 0.3434 0.6030 0.7765
No log 10.5366 432 0.6089 0.4013 0.6089 0.7803
No log 10.5854 434 0.6196 0.3640 0.6196 0.7872
No log 10.6341 436 0.6322 0.3640 0.6322 0.7951
No log 10.6829 438 0.6529 0.3640 0.6529 0.8081
No log 10.7317 440 0.6729 0.3942 0.6729 0.8203
No log 10.7805 442 0.7195 0.4123 0.7195 0.8482
No log 10.8293 444 0.8339 0.4085 0.8339 0.9132
No log 10.8780 446 0.8479 0.3740 0.8479 0.9208
No log 10.9268 448 0.7308 0.4646 0.7308 0.8548
No log 10.9756 450 0.6364 0.3918 0.6364 0.7978
No log 11.0244 452 0.5793 0.3324 0.5793 0.7611
No log 11.0732 454 0.5753 0.3964 0.5753 0.7585
No log 11.1220 456 0.5735 0.4361 0.5735 0.7573
No log 11.1707 458 0.5749 0.3524 0.5749 0.7582
No log 11.2195 460 0.5827 0.3840 0.5827 0.7633
No log 11.2683 462 0.6165 0.3127 0.6165 0.7852
No log 11.3171 464 0.6480 0.4167 0.6480 0.8050
No log 11.3659 466 0.6695 0.3770 0.6695 0.8182
No log 11.4146 468 0.6750 0.3494 0.6750 0.8216
No log 11.4634 470 0.6834 0.3494 0.6834 0.8267
No log 11.5122 472 0.6854 0.3770 0.6854 0.8279
No log 11.5610 474 0.6644 0.3770 0.6644 0.8151
No log 11.6098 476 0.6318 0.3794 0.6318 0.7949
No log 11.6585 478 0.6117 0.4819 0.6117 0.7821
No log 11.7073 480 0.6233 0.4819 0.6233 0.7895
No log 11.7561 482 0.6812 0.4144 0.6812 0.8253
No log 11.8049 484 0.7111 0.4295 0.7111 0.8432
No log 11.8537 486 0.6847 0.4295 0.6847 0.8274
No log 11.9024 488 0.6134 0.4036 0.6134 0.7832
No log 11.9512 490 0.5667 0.4984 0.5667 0.7528
No log 12.0 492 0.5744 0.4229 0.5744 0.7579
No log 12.0488 494 0.6091 0.4352 0.6091 0.7804
No log 12.0976 496 0.6899 0.3195 0.6899 0.8306
No log 12.1463 498 0.8283 0.3559 0.8283 0.9101
0.3094 12.1951 500 0.9198 0.3832 0.9198 0.9591
0.3094 12.2439 502 0.8838 0.3593 0.8838 0.9401
0.3094 12.2927 504 0.7694 0.3381 0.7694 0.8771
0.3094 12.3415 506 0.6810 0.4014 0.6810 0.8252
0.3094 12.3902 508 0.6454 0.3770 0.6454 0.8034
0.3094 12.4390 510 0.6571 0.3494 0.6571 0.8106
0.3094 12.4878 512 0.6414 0.3770 0.6414 0.8009
0.3094 12.5366 514 0.6689 0.3675 0.6689 0.8179
0.3094 12.5854 516 0.6779 0.3940 0.6779 0.8234
0.3094 12.6341 518 0.6812 0.3940 0.6812 0.8253
0.3094 12.6829 520 0.6997 0.3675 0.6997 0.8365
0.3094 12.7317 522 0.7266 0.3675 0.7266 0.8524
0.3094 12.7805 524 0.7200 0.3675 0.7200 0.8485
0.3094 12.8293 526 0.6747 0.4329 0.6747 0.8214
0.3094 12.8780 528 0.6635 0.3590 0.6635 0.8146
0.3094 12.9268 530 0.7002 0.3940 0.7002 0.8368
0.3094 12.9756 532 0.7832 0.3473 0.7832 0.8850
0.3094 13.0244 534 0.8787 0.3697 0.8787 0.9374
0.3094 13.0732 536 0.8424 0.3938 0.8424 0.9178
0.3094 13.1220 538 0.7251 0.3675 0.7251 0.8515
0.3094 13.1707 540 0.6600 0.4014 0.6600 0.8124
0.3094 13.2195 542 0.6619 0.4014 0.6619 0.8136
0.3094 13.2683 544 0.7072 0.3918 0.7072 0.8409
0.3094 13.3171 546 0.8017 0.4080 0.8017 0.8954
0.3094 13.3659 548 0.9010 0.3938 0.9010 0.9492
0.3094 13.4146 550 0.9369 0.3417 0.9369 0.9680
0.3094 13.4634 552 0.9047 0.3579 0.9047 0.9511
0.3094 13.5122 554 0.8739 0.3347 0.8739 0.9349
0.3094 13.5610 556 0.7896 0.3494 0.7896 0.8886
0.3094 13.6098 558 0.7279 0.3637 0.7279 0.8532

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (safetensors, F32)
