ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k9_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7814
  • Qwk (quadratic weighted kappa): 0.6957
  • Mse (mean squared error): 0.7814
  • Rmse (root mean squared error): 0.8840
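Qwk is Cohen's kappa with quadratic weights, the usual agreement metric for ordinal essay scores, and Rmse is simply the square root of Mse. A minimal sketch of how these three values can be reproduced (the label arrays below are hypothetical, not from this model's evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted organization scores (ordinal labels).
y_true = np.array([0, 1, 2, 3, 4, 3, 2])
y_pred = np.array([0, 1, 2, 3, 3, 2, 2])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5  # Rmse is the square root of Mse, as in the results above.

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```

Note that in the evaluation above Loss equals Mse, which indicates the model was trained as a regressor with an MSE objective rather than as a classifier.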

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
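With a linear scheduler the learning rate decays from 2e-05 at the start of training to zero at the final step. A pure-Python sketch of that schedule (assuming zero warmup steps, the Trainer default; the total step count is inferred from the log below, which shows 41 optimizer steps per epoch):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# 41 steps per epoch x 100 epochs = 4100 total steps for a full run.
total = 4100
print(linear_lr(0, total))           # start of training: the full base_lr
print(linear_lr(total // 2, total))  # halfway: base_lr / 2
print(linear_lr(total, total))       # end of training: 0.0
```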

Training results

The "Training Loss" column reads "No log" until step 500, the trainer's default logging interval, where the first (and only) logged value of 0.3765 appears.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0488 2 7.2532 -0.0274 7.2532 2.6932
No log 0.0976 4 4.7568 0.0784 4.7568 2.1810
No log 0.1463 6 3.2052 0.0848 3.2052 1.7903
No log 0.1951 8 3.3580 0.0 3.3580 1.8325
No log 0.2439 10 2.7689 0.0 2.7689 1.6640
No log 0.2927 12 1.9633 0.1261 1.9633 1.4012
No log 0.3415 14 1.8026 0.1869 1.8026 1.3426
No log 0.3902 16 1.6895 0.1154 1.6895 1.2998
No log 0.4390 18 1.6772 0.1682 1.6772 1.2951
No log 0.4878 20 2.1888 0.1449 2.1888 1.4795
No log 0.5366 22 2.4509 0.1233 2.4509 1.5655
No log 0.5854 24 2.3693 0.1233 2.3693 1.5392
No log 0.6341 26 1.9177 0.2794 1.9177 1.3848
No log 0.6829 28 1.8386 0.3459 1.8386 1.3559
No log 0.7317 30 1.7798 0.3459 1.7798 1.3341
No log 0.7805 32 1.6595 0.4 1.6595 1.2882
No log 0.8293 34 1.8509 0.3056 1.8509 1.3605
No log 0.8780 36 1.8881 0.3718 1.8881 1.3741
No log 0.9268 38 1.4187 0.4861 1.4187 1.1911
No log 0.9756 40 1.2640 0.4091 1.2640 1.1243
No log 1.0244 42 1.2761 0.4296 1.2761 1.1296
No log 1.0732 44 1.3586 0.4762 1.3586 1.1656
No log 1.1220 46 1.5560 0.5031 1.5560 1.2474
No log 1.1707 48 1.8092 0.4588 1.8092 1.3451
No log 1.2195 50 1.8076 0.4706 1.8076 1.3445
No log 1.2683 52 1.8801 0.4229 1.8801 1.3712
No log 1.3171 54 1.7371 0.4625 1.7371 1.3180
No log 1.3659 56 1.5370 0.4595 1.5370 1.2398
No log 1.4146 58 1.7284 0.4528 1.7284 1.3147
No log 1.4634 60 2.0689 0.4088 2.0689 1.4384
No log 1.5122 62 2.3631 0.3261 2.3631 1.5372
No log 1.5610 64 2.0090 0.4457 2.0090 1.4174
No log 1.6098 66 1.8033 0.4483 1.8033 1.3429
No log 1.6585 68 1.2981 0.5241 1.2981 1.1393
No log 1.7073 70 1.1323 0.5286 1.1323 1.0641
No log 1.7561 72 1.2775 0.5034 1.2775 1.1303
No log 1.8049 74 1.1871 0.5106 1.1871 1.0896
No log 1.8537 76 1.0777 0.4531 1.0777 1.0381
No log 1.9024 78 1.0792 0.4918 1.0792 1.0388
No log 1.9512 80 0.9937 0.5846 0.9937 0.9968
No log 2.0 82 0.9300 0.6569 0.9300 0.9643
No log 2.0488 84 0.8950 0.6957 0.8950 0.9461
No log 2.0976 86 0.9089 0.6917 0.9089 0.9534
No log 2.1463 88 0.9088 0.6412 0.9088 0.9533
No log 2.1951 90 0.9675 0.6131 0.9675 0.9836
No log 2.2439 92 1.0088 0.6081 1.0088 1.0044
No log 2.2927 94 0.8483 0.6714 0.8483 0.9210
No log 2.3415 96 0.7905 0.7534 0.7905 0.8891
No log 2.3902 98 0.7710 0.7692 0.7710 0.8781
No log 2.4390 100 0.7706 0.7778 0.7706 0.8778
No log 2.4878 102 0.7426 0.7586 0.7426 0.8617
No log 2.5366 104 0.7424 0.7586 0.7424 0.8616
No log 2.5854 106 0.7657 0.7143 0.7657 0.8750
No log 2.6341 108 0.7733 0.7153 0.7733 0.8794
No log 2.6829 110 0.7939 0.6853 0.7939 0.8910
No log 2.7317 112 0.9663 0.6154 0.9663 0.9830
No log 2.7805 114 0.9150 0.6443 0.9150 0.9565
No log 2.8293 116 0.8062 0.7034 0.8062 0.8979
No log 2.8780 118 0.8192 0.7143 0.8192 0.9051
No log 2.9268 120 0.7784 0.7467 0.7784 0.8822
No log 2.9756 122 0.8857 0.6909 0.8857 0.9411
No log 3.0244 124 1.1058 0.6556 1.1058 1.0516
No log 3.0732 126 1.1233 0.6592 1.1233 1.0598
No log 3.1220 128 0.9495 0.65 0.9495 0.9744
No log 3.1707 130 0.9196 0.7209 0.9196 0.9589
No log 3.2195 132 1.0192 0.6882 1.0192 1.0096
No log 3.2683 134 0.9886 0.7174 0.9886 0.9943
No log 3.3171 136 0.8436 0.7574 0.8436 0.9185
No log 3.3659 138 0.9995 0.6438 0.9995 0.9998
No log 3.4146 140 1.1448 0.6277 1.1448 1.0699
No log 3.4634 142 0.9396 0.6389 0.9396 0.9693
No log 3.5122 144 0.7980 0.7284 0.7980 0.8933
No log 3.5610 146 0.8870 0.7143 0.8870 0.9418
No log 3.6098 148 0.8696 0.7160 0.8696 0.9325
No log 3.6585 150 0.7690 0.7237 0.7690 0.8769
No log 3.7073 152 0.7556 0.7273 0.7556 0.8693
No log 3.7561 154 0.7727 0.7234 0.7727 0.8790
No log 3.8049 156 0.7921 0.7234 0.7921 0.8900
No log 3.8537 158 0.8132 0.7465 0.8132 0.9018
No log 3.9024 160 0.8579 0.7042 0.8579 0.9262
No log 3.9512 162 0.9063 0.7162 0.9063 0.9520
No log 4.0 164 1.0319 0.5974 1.0319 1.0158
No log 4.0488 166 1.0739 0.6061 1.0739 1.0363
No log 4.0976 168 0.9823 0.6506 0.9823 0.9911
No log 4.1463 170 0.8549 0.7075 0.8549 0.9246
No log 4.1951 172 0.8616 0.7194 0.8616 0.9282
No log 4.2439 174 0.8720 0.6716 0.8720 0.9338
No log 4.2927 176 0.8211 0.7206 0.8211 0.9061
No log 4.3415 178 0.8376 0.6471 0.8376 0.9152
No log 4.3902 180 1.0000 0.6294 1.0000 1.0000
No log 4.4390 182 0.9812 0.5882 0.9812 0.9905
No log 4.4878 184 0.9461 0.6222 0.9461 0.9727
No log 4.5366 186 0.9467 0.6475 0.9467 0.9730
No log 4.5854 188 0.8412 0.6277 0.8412 0.9172
No log 4.6341 190 0.7698 0.7206 0.7698 0.8774
No log 4.6829 192 0.7814 0.7338 0.7814 0.8840
No log 4.7317 194 0.7842 0.7534 0.7842 0.8855
No log 4.7805 196 0.8370 0.6986 0.8370 0.9149
No log 4.8293 198 0.9025 0.6803 0.9025 0.9500
No log 4.8780 200 0.8863 0.6803 0.8863 0.9414
No log 4.9268 202 0.8422 0.7075 0.8422 0.9177
No log 4.9756 204 0.8353 0.7297 0.8353 0.9139
No log 5.0244 206 0.8147 0.7034 0.8147 0.9026
No log 5.0732 208 0.8522 0.6797 0.8522 0.9231
No log 5.1220 210 0.9987 0.6303 0.9987 0.9994
No log 5.1707 212 1.0364 0.6220 1.0364 1.0180
No log 5.2195 214 0.9783 0.6135 0.9783 0.9891
No log 5.2683 216 1.0398 0.6154 1.0398 1.0197
No log 5.3171 218 1.1362 0.6180 1.1362 1.0659
No log 5.3659 220 1.0393 0.6024 1.0393 1.0195
No log 5.4146 222 0.8816 0.6710 0.8816 0.9389
No log 5.4634 224 0.8063 0.7075 0.8063 0.8980
No log 5.5122 226 0.8180 0.6906 0.8180 0.9044
No log 5.5610 228 0.8258 0.7101 0.8258 0.9087
No log 5.6098 230 0.8320 0.6806 0.8320 0.9121
No log 5.6585 232 0.8623 0.6531 0.8623 0.9286
No log 5.7073 234 0.9317 0.7059 0.9317 0.9652
No log 5.7561 236 1.0878 0.6258 1.0878 1.0430
No log 5.8049 238 1.3610 0.5402 1.3610 1.1666
No log 5.8537 240 1.5535 0.5444 1.5535 1.2464
No log 5.9024 242 1.2576 0.5389 1.2576 1.1214
No log 5.9512 244 0.9675 0.6575 0.9675 0.9836
No log 6.0 246 0.8455 0.6241 0.8455 0.9195
No log 6.0488 248 0.8058 0.6806 0.8058 0.8977
No log 6.0976 250 0.8137 0.6806 0.8137 0.9021
No log 6.1463 252 0.8732 0.6479 0.8732 0.9345
No log 6.1951 254 0.8998 0.6338 0.8998 0.9486
No log 6.2439 256 0.8498 0.6479 0.8498 0.9219
No log 6.2927 258 0.7866 0.6806 0.7866 0.8869
No log 6.3415 260 0.7857 0.6809 0.7857 0.8864
No log 6.3902 262 0.7992 0.6803 0.7992 0.8940
No log 6.4390 264 0.8791 0.7239 0.8791 0.9376
No log 6.4878 266 0.8221 0.7317 0.8221 0.9067
No log 6.5366 268 0.8202 0.7262 0.8202 0.9057
No log 6.5854 270 0.7571 0.7531 0.7571 0.8701
No log 6.6341 272 0.7182 0.7564 0.7182 0.8475
No log 6.6829 274 0.7571 0.7552 0.7571 0.8701
No log 6.7317 276 0.7853 0.7429 0.7853 0.8862
No log 6.7805 278 0.7639 0.75 0.7639 0.8740
No log 6.8293 280 0.7476 0.7468 0.7476 0.8647
No log 6.8780 282 0.8084 0.7024 0.8084 0.8991
No log 6.9268 284 0.7673 0.7024 0.7673 0.8760
No log 6.9756 286 0.7119 0.7368 0.7119 0.8438
No log 7.0244 288 0.7408 0.7432 0.7408 0.8607
No log 7.0732 290 0.7795 0.7050 0.7795 0.8829
No log 7.1220 292 0.8369 0.6715 0.8369 0.9148
No log 7.1707 294 0.9227 0.6377 0.9227 0.9606
No log 7.2195 296 0.9539 0.6573 0.9539 0.9767
No log 7.2683 298 0.9678 0.625 0.9678 0.9838
No log 7.3171 300 0.9137 0.6581 0.9137 0.9559
No log 7.3659 302 0.9415 0.6824 0.9415 0.9703
No log 7.4146 304 0.9946 0.6780 0.9946 0.9973
No log 7.4634 306 0.9096 0.6512 0.9096 0.9537
No log 7.5122 308 0.7577 0.7516 0.7577 0.8705
No log 7.5610 310 0.7918 0.7246 0.7918 0.8898
No log 7.6098 312 0.8661 0.6618 0.8661 0.9306
No log 7.6585 314 0.8746 0.6815 0.8746 0.9352
No log 7.7073 316 0.8590 0.6715 0.8590 0.9268
No log 7.7561 318 0.8765 0.7123 0.8765 0.9362
No log 7.8049 320 0.8843 0.7273 0.8843 0.9404
No log 7.8537 322 0.8418 0.7067 0.8418 0.9175
No log 7.9024 324 0.8247 0.7234 0.8247 0.9081
No log 7.9512 326 0.8080 0.7222 0.8080 0.8989
No log 8.0 328 0.8022 0.7297 0.8022 0.8956
No log 8.0488 330 0.8166 0.7114 0.8166 0.9037
No log 8.0976 332 0.8241 0.7114 0.8241 0.9078
No log 8.1463 334 0.8625 0.7114 0.8625 0.9287
No log 8.1951 336 0.8945 0.6797 0.8945 0.9458
No log 8.2439 338 0.8561 0.7226 0.8561 0.9252
No log 8.2927 340 0.8091 0.72 0.8091 0.8995
No log 8.3415 342 0.8080 0.6968 0.8080 0.8989
No log 8.3902 344 0.7874 0.7134 0.7874 0.8874
No log 8.4390 346 0.8008 0.7051 0.8008 0.8949
No log 8.4878 348 0.8251 0.7037 0.8251 0.9083
No log 8.5366 350 0.8308 0.7037 0.8308 0.9115
No log 8.5854 352 0.8213 0.725 0.8213 0.9062
No log 8.6341 354 0.7742 0.7451 0.7742 0.8799
No log 8.6829 356 0.7922 0.7333 0.7922 0.8901
No log 8.7317 358 0.7896 0.7152 0.7896 0.8886
No log 8.7805 360 0.7856 0.7516 0.7856 0.8864
No log 8.8293 362 0.8092 0.7674 0.8092 0.8995
No log 8.8780 364 0.8118 0.7665 0.8118 0.9010
No log 8.9268 366 0.8123 0.7613 0.8123 0.9013
No log 8.9756 368 0.8061 0.75 0.8061 0.8978
No log 9.0244 370 0.7980 0.7297 0.7980 0.8933
No log 9.0732 372 0.7742 0.7586 0.7741 0.8799
No log 9.1220 374 0.7519 0.7448 0.7519 0.8671
No log 9.1707 376 0.7291 0.7432 0.7291 0.8539
No log 9.2195 378 0.7291 0.7347 0.7291 0.8539
No log 9.2683 380 0.7958 0.7081 0.7958 0.8921
No log 9.3171 382 0.8851 0.6627 0.8851 0.9408
No log 9.3659 384 0.8534 0.6829 0.8534 0.9238
No log 9.4146 386 0.8632 0.6829 0.8632 0.9291
No log 9.4634 388 0.8403 0.6623 0.8403 0.9167
No log 9.5122 390 0.8390 0.6849 0.8390 0.9160
No log 9.5610 392 0.8271 0.7059 0.8271 0.9094
No log 9.6098 394 0.8322 0.6667 0.8322 0.9122
No log 9.6585 396 0.8872 0.6309 0.8872 0.9419
No log 9.7073 398 1.0011 0.6405 1.0011 1.0005
No log 9.7561 400 0.9544 0.6709 0.9544 0.9769
No log 9.8049 402 0.8421 0.6795 0.8421 0.9177
No log 9.8537 404 0.8158 0.7226 0.8158 0.9032
No log 9.9024 406 0.7928 0.7550 0.7928 0.8904
No log 9.9512 408 0.7989 0.7465 0.7989 0.8938
No log 10.0 410 0.8078 0.7133 0.8078 0.8988
No log 10.0488 412 0.7908 0.7273 0.7908 0.8893
No log 10.0976 414 0.8262 0.7485 0.8262 0.9089
No log 10.1463 416 0.8704 0.7176 0.8704 0.9329
No log 10.1951 418 0.8834 0.7066 0.8834 0.9399
No log 10.2439 420 0.9344 0.6867 0.9344 0.9667
No log 10.2927 422 0.9643 0.6545 0.9643 0.9820
No log 10.3415 424 0.8957 0.6951 0.8957 0.9464
No log 10.3902 426 0.9063 0.7337 0.9063 0.9520
No log 10.4390 428 0.9848 0.6743 0.9848 0.9924
No log 10.4878 430 1.0234 0.6474 1.0234 1.0117
No log 10.5366 432 1.0768 0.6328 1.0768 1.0377
No log 10.5854 434 0.9553 0.6936 0.9553 0.9774
No log 10.6341 436 0.8350 0.7030 0.8350 0.9138
No log 10.6829 438 0.8187 0.6918 0.8187 0.9048
No log 10.7317 440 0.8500 0.6709 0.8500 0.9220
No log 10.7805 442 0.8808 0.6867 0.8808 0.9385
No log 10.8293 444 0.9581 0.6588 0.9581 0.9788
No log 10.8780 446 0.8463 0.6867 0.8463 0.9200
No log 10.9268 448 0.7930 0.7219 0.7930 0.8905
No log 10.9756 450 0.7816 0.7205 0.7816 0.8841
No log 11.0244 452 0.8183 0.6709 0.8183 0.9046
No log 11.0732 454 0.9737 0.6375 0.9737 0.9867
No log 11.1220 456 1.0771 0.6061 1.0771 1.0378
No log 11.1707 458 0.9321 0.6282 0.9321 0.9655
No log 11.2195 460 0.7614 0.7368 0.7614 0.8726
No log 11.2683 462 0.7569 0.7172 0.7569 0.8700
No log 11.3171 464 0.7628 0.7211 0.7628 0.8734
No log 11.3659 466 0.8095 0.7368 0.8095 0.8997
No log 11.4146 468 0.8570 0.72 0.8570 0.9257
No log 11.4634 470 0.8357 0.6986 0.8357 0.9142
No log 11.5122 472 0.8079 0.7222 0.8079 0.8988
No log 11.5610 474 0.8201 0.7123 0.8201 0.9056
No log 11.6098 476 0.8117 0.7333 0.8117 0.9010
No log 11.6585 478 0.7996 0.7261 0.7996 0.8942
No log 11.7073 480 0.9063 0.7018 0.9063 0.9520
No log 11.7561 482 1.0283 0.6952 1.0283 1.0141
No log 11.8049 484 1.0268 0.7120 1.0268 1.0133
No log 11.8537 486 0.9084 0.7128 0.9084 0.9531
No log 11.9024 488 0.7986 0.7168 0.7986 0.8937
No log 11.9512 490 0.8144 0.6988 0.8144 0.9025
No log 12.0 492 0.9106 0.675 0.9106 0.9542
No log 12.0488 494 1.0304 0.6026 1.0304 1.0151
No log 12.0976 496 1.0150 0.5921 1.0150 1.0075
No log 12.1463 498 0.8861 0.6839 0.8861 0.9413
0.3765 12.1951 500 0.7760 0.7125 0.7760 0.8809
0.3765 12.2439 502 0.7554 0.7425 0.7554 0.8691
0.3765 12.2927 504 0.7878 0.7 0.7878 0.8876
0.3765 12.3415 506 0.8592 0.6994 0.8592 0.9269
0.3765 12.3902 508 0.9522 0.6545 0.9522 0.9758
0.3765 12.4390 510 0.9093 0.6410 0.9093 0.9535
0.3765 12.4878 512 0.8399 0.7152 0.8399 0.9164
0.3765 12.5366 514 0.8575 0.7183 0.8575 0.9260
0.3765 12.5854 516 0.8673 0.7111 0.8673 0.9313
0.3765 12.6341 518 0.8888 0.7007 0.8888 0.9427
0.3765 12.6829 520 0.9220 0.6383 0.9220 0.9602
0.3765 12.7317 522 0.8705 0.7183 0.8705 0.9330
0.3765 12.7805 524 0.8225 0.7194 0.8225 0.9069
0.3765 12.8293 526 0.8263 0.6618 0.8263 0.9090
0.3765 12.8780 528 0.7958 0.7324 0.7958 0.8921
0.3765 12.9268 530 0.7677 0.7347 0.7677 0.8762
0.3765 12.9756 532 0.7640 0.7467 0.7640 0.8741
0.3765 13.0244 534 0.7564 0.7632 0.7564 0.8697
0.3765 13.0732 536 0.7661 0.75 0.7661 0.8753
0.3765 13.1220 538 0.7844 0.7619 0.7844 0.8856
0.3765 13.1707 540 0.7779 0.7534 0.7779 0.8820
0.3765 13.2195 542 0.7703 0.7467 0.7703 0.8776
0.3765 13.2683 544 0.7701 0.7467 0.7701 0.8776
0.3765 13.3171 546 0.7695 0.7320 0.7695 0.8772
0.3765 13.3659 548 0.7636 0.7320 0.7636 0.8738
0.3765 13.4146 550 0.7664 0.7421 0.7664 0.8755
0.3765 13.4634 552 0.7864 0.7284 0.7864 0.8868
0.3765 13.5122 554 0.7988 0.7421 0.7988 0.8938
0.3765 13.5610 556 0.8396 0.7006 0.8396 0.9163
0.3765 13.6098 558 0.8983 0.6375 0.8983 0.9478
0.3765 13.6585 560 0.8502 0.7044 0.8502 0.9221
0.3765 13.7073 562 0.7844 0.7205 0.7844 0.8857
0.3765 13.7561 564 0.7781 0.7186 0.7781 0.8821
0.3765 13.8049 566 0.7982 0.7368 0.7982 0.8934
0.3765 13.8537 568 0.7894 0.7081 0.7894 0.8885
0.3765 13.9024 570 0.7675 0.7383 0.7675 0.8761
0.3765 13.9512 572 0.7484 0.7383 0.7484 0.8651
0.3765 14.0 574 0.7388 0.7534 0.7388 0.8595
0.3765 14.0488 576 0.7364 0.7448 0.7364 0.8581
0.3765 14.0976 578 0.7541 0.7237 0.7541 0.8684
0.3765 14.1463 580 0.7759 0.7342 0.7759 0.8809
0.3765 14.1951 582 0.7251 0.7162 0.7251 0.8515
0.3765 14.2439 584 0.6756 0.7586 0.6756 0.8219
0.3765 14.2927 586 0.6765 0.7586 0.6765 0.8225
0.3765 14.3415 588 0.7176 0.7310 0.7176 0.8471
0.3765 14.3902 590 0.7786 0.7368 0.7786 0.8824
0.3765 14.4390 592 0.7559 0.7432 0.7559 0.8694
0.3765 14.4878 594 0.7194 0.75 0.7194 0.8482
0.3765 14.5366 596 0.7091 0.7586 0.7091 0.8421
0.3765 14.5854 598 0.7128 0.7671 0.7128 0.8443
0.3765 14.6341 600 0.7295 0.7703 0.7295 0.8541
0.3765 14.6829 602 0.7607 0.7619 0.7607 0.8722
0.3765 14.7317 604 0.7993 0.7383 0.7993 0.8940
0.3765 14.7805 606 0.8438 0.6667 0.8438 0.9186
0.3765 14.8293 608 0.9049 0.6194 0.9049 0.9512
0.3765 14.8780 610 0.8762 0.6497 0.8762 0.9361
0.3765 14.9268 612 0.7830 0.7170 0.7830 0.8849
0.3765 14.9756 614 0.7036 0.7607 0.7036 0.8388
0.3765 15.0244 616 0.6831 0.775 0.6831 0.8265
0.3765 15.0732 618 0.6864 0.7771 0.6864 0.8285
0.3765 15.1220 620 0.7023 0.7703 0.7023 0.8380
0.3765 15.1707 622 0.7360 0.7733 0.7360 0.8579
0.3765 15.2195 624 0.8252 0.7114 0.8252 0.9084
0.3765 15.2683 626 0.9022 0.6486 0.9022 0.9498
0.3765 15.3171 628 0.8756 0.6573 0.8756 0.9357
0.3765 15.3659 630 0.8279 0.6475 0.8279 0.9099
0.3765 15.4146 632 0.7814 0.6957 0.7814 0.8840
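The header metrics match the final row of this log, not the best one: the peak validation Qwk is 0.7778 at epoch 2.4390 (step 100). A small sketch for picking the best checkpoint from logged results, using a hypothetical subset of the rows above:

```python
# (epoch, validation Qwk) pairs copied from a few rows of the log above.
log = [
    (2.4390, 0.7778),
    (4.6829, 0.7338),
    (8.8293, 0.7674),
    (14.6341, 0.7703),
    (15.0732, 0.7771),
    (15.4146, 0.6957),  # final row; the score reported at the top of the card
]

best_epoch, best_qwk = max(log, key=lambda row: row[1])
print(f"best Qwk {best_qwk:.4f} at epoch {best_epoch}")
```

In the transformers Trainer, setting load_best_model_at_end=True with metric_for_best_model would select such a checkpoint automatically.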

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k9_task1_organization

Finetuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base model).