ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k7_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded in the training run). It achieves the following results on the evaluation set:

  • Loss: 0.7151
  • Qwk: 0.7613
  • Mse: 0.7151
  • Rmse: 0.8456
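
Here Qwk is the quadratic weighted kappa, and Rmse is the square root of the reported Mse. A minimal sketch of how these metrics can be recomputed from labels and predictions (the use of scikit-learn and the rounding of continuous predictions are assumptions, not details taken from this repo):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Recompute Qwk / Mse / Rmse as reported above."""
    # Quadratic weighted kappa requires discrete labels, so continuous
    # model outputs are rounded to the nearest score (assumption).
    qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```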

Model description

More information needed

Intended uses & limitations

More information needed
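
Pending those details, a minimal inference sketch (it assumes the checkpoint carries a single-logit regression head, which the MSE/RMSE-based evaluation suggests; the repo id is taken from the title above):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k7_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```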

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
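
For reference, a sketch of the equivalent transformers TrainingArguments (argument names match transformers 4.44; output_dir is a placeholder, and the Adam settings listed above are also the library defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,     # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```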

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
| No log | 0.0377 | 2 | 6.8262 | 0.0242 | 6.8262 | 2.6127 |
| No log | 0.0755 | 4 | 4.8308 | 0.0682 | 4.8308 | 2.1979 |
| No log | 0.1132 | 6 | 3.0614 | 0.0988 | 3.0614 | 1.7497 |
| No log | 0.1509 | 8 | 2.8613 | 0.0649 | 2.8613 | 1.6915 |
| No log | 0.1887 | 10 | 2.1607 | 0.2158 | 2.1607 | 1.4699 |
| No log | 0.2264 | 12 | 1.9319 | 0.1754 | 1.9319 | 1.3899 |
| No log | 0.2642 | 14 | 1.7550 | 0.1698 | 1.7550 | 1.3248 |
| No log | 0.3019 | 16 | 1.8252 | 0.1481 | 1.8252 | 1.3510 |
| No log | 0.3396 | 18 | 1.8744 | 0.2105 | 1.8744 | 1.3691 |
| No log | 0.3774 | 20 | 1.7739 | 0.2105 | 1.7739 | 1.3319 |
| No log | 0.4151 | 22 | 1.6218 | 0.2435 | 1.6218 | 1.2735 |
| No log | 0.4528 | 24 | 1.6428 | 0.2623 | 1.6428 | 1.2817 |
| No log | 0.4906 | 26 | 1.5004 | 0.3065 | 1.5004 | 1.2249 |
| No log | 0.5283 | 28 | 1.3021 | 0.4463 | 1.3021 | 1.1411 |
| No log | 0.5660 | 30 | 1.1930 | 0.4576 | 1.1930 | 1.0922 |
| No log | 0.6038 | 32 | 1.0764 | 0.5556 | 1.0764 | 1.0375 |
| No log | 0.6415 | 34 | 1.1098 | 0.5426 | 1.1098 | 1.0535 |
| No log | 0.6792 | 36 | 1.2277 | 0.4960 | 1.2277 | 1.1080 |
| No log | 0.7170 | 38 | 1.1333 | 0.5469 | 1.1333 | 1.0645 |
| No log | 0.7547 | 40 | 1.1476 | 0.5041 | 1.1476 | 1.0713 |
| No log | 0.7925 | 42 | 1.0093 | 0.6260 | 1.0093 | 1.0046 |
| No log | 0.8302 | 44 | 0.8314 | 0.6241 | 0.8314 | 0.9118 |
| No log | 0.8679 | 46 | 0.8287 | 0.6713 | 0.8287 | 0.9103 |
| No log | 0.9057 | 48 | 0.7809 | 0.7027 | 0.7809 | 0.8837 |
| No log | 0.9434 | 50 | 0.7436 | 0.7595 | 0.7436 | 0.8623 |
| No log | 0.9811 | 52 | 0.7729 | 0.7089 | 0.7729 | 0.8791 |
| No log | 1.0189 | 54 | 0.8549 | 0.7470 | 0.8549 | 0.9246 |
| No log | 1.0566 | 56 | 0.8494 | 0.7250 | 0.8494 | 0.9216 |
| No log | 1.0943 | 58 | 0.7721 | 0.7051 | 0.7721 | 0.8787 |
| No log | 1.1321 | 60 | 0.8758 | 0.7101 | 0.8758 | 0.9358 |
| No log | 1.1698 | 62 | 1.0534 | 0.6380 | 1.0534 | 1.0264 |
| No log | 1.2075 | 64 | 0.8575 | 0.6667 | 0.8575 | 0.9260 |
| No log | 1.2453 | 66 | 0.7509 | 0.7403 | 0.7509 | 0.8666 |
| No log | 1.2830 | 68 | 0.7639 | 0.7097 | 0.7639 | 0.8740 |
| No log | 1.3208 | 70 | 0.9315 | 0.6582 | 0.9315 | 0.9651 |
| No log | 1.3585 | 72 | 1.1151 | 0.5860 | 1.1151 | 1.0560 |
| No log | 1.3962 | 74 | 1.0909 | 0.5571 | 1.0909 | 1.0445 |
| No log | 1.4340 | 76 | 0.7967 | 0.7273 | 0.7967 | 0.8926 |
| No log | 1.4717 | 78 | 0.7792 | 0.7248 | 0.7792 | 0.8827 |
| No log | 1.5094 | 80 | 0.7680 | 0.6993 | 0.7680 | 0.8764 |
| No log | 1.5472 | 82 | 0.8610 | 0.6897 | 0.8610 | 0.9279 |
| No log | 1.5849 | 84 | 0.7985 | 0.6957 | 0.7985 | 0.8936 |
| No log | 1.6226 | 86 | 0.7460 | 0.7153 | 0.7460 | 0.8637 |
| No log | 1.6604 | 88 | 0.7710 | 0.7183 | 0.7710 | 0.8781 |
| No log | 1.6981 | 90 | 0.8698 | 0.6531 | 0.8698 | 0.9326 |
| No log | 1.7358 | 92 | 0.8158 | 0.6939 | 0.8158 | 0.9032 |
| No log | 1.7736 | 94 | 0.6270 | 0.7712 | 0.6270 | 0.7918 |
| No log | 1.8113 | 96 | 0.6452 | 0.7484 | 0.6452 | 0.8033 |
| No log | 1.8491 | 98 | 0.5969 | 0.8025 | 0.5969 | 0.7726 |
| No log | 1.8868 | 100 | 0.6945 | 0.7647 | 0.6945 | 0.8334 |
| No log | 1.9245 | 102 | 0.6515 | 0.7595 | 0.6515 | 0.8072 |
| No log | 1.9623 | 104 | 0.6060 | 0.7871 | 0.6060 | 0.7784 |
| No log | 2.0 | 106 | 0.9172 | 0.7273 | 0.9172 | 0.9577 |
| No log | 2.0377 | 108 | 1.0752 | 0.7066 | 1.0752 | 1.0369 |
| No log | 2.0755 | 110 | 0.7817 | 0.7308 | 0.7817 | 0.8841 |
| No log | 2.1132 | 112 | 0.6444 | 0.7651 | 0.6444 | 0.8027 |
| No log | 2.1509 | 114 | 0.9995 | 0.6345 | 0.9995 | 0.9998 |
| No log | 2.1887 | 116 | 1.0943 | 0.5972 | 1.0943 | 1.0461 |
| No log | 2.2264 | 118 | 0.9504 | 0.6709 | 0.9504 | 0.9749 |
| No log | 2.2642 | 120 | 0.7897 | 0.6667 | 0.7897 | 0.8887 |
| No log | 2.3019 | 122 | 0.7043 | 0.7413 | 0.7043 | 0.8393 |
| No log | 2.3396 | 124 | 0.7166 | 0.7246 | 0.7166 | 0.8465 |
| No log | 2.3774 | 126 | 0.7077 | 0.7246 | 0.7077 | 0.8413 |
| No log | 2.4151 | 128 | 0.7104 | 0.7059 | 0.7104 | 0.8429 |
| No log | 2.4528 | 130 | 0.7867 | 0.6853 | 0.7867 | 0.8870 |
| No log | 2.4906 | 132 | 0.7691 | 0.6800 | 0.7691 | 0.8770 |
| No log | 2.5283 | 134 | 0.5608 | 0.8000 | 0.5608 | 0.7489 |
| No log | 2.5660 | 136 | 0.5852 | 0.7792 | 0.5852 | 0.7650 |
| No log | 2.6038 | 138 | 0.5719 | 0.8000 | 0.5719 | 0.7562 |
| No log | 2.6415 | 140 | 0.5707 | 0.8025 | 0.5707 | 0.7555 |
| No log | 2.6792 | 142 | 0.5968 | 0.8050 | 0.5968 | 0.7725 |
| No log | 2.7170 | 144 | 0.6391 | 0.7595 | 0.6391 | 0.7994 |
| No log | 2.7547 | 146 | 0.7989 | 0.6974 | 0.7989 | 0.8938 |
| No log | 2.7925 | 148 | 0.9241 | 0.6627 | 0.9241 | 0.9613 |
| No log | 2.8302 | 150 | 0.9096 | 0.6788 | 0.9096 | 0.9537 |
| No log | 2.8679 | 152 | 0.7703 | 0.7027 | 0.7703 | 0.8777 |
| No log | 2.9057 | 154 | 0.7392 | 0.7133 | 0.7392 | 0.8598 |
| No log | 2.9434 | 156 | 0.7029 | 0.7234 | 0.7029 | 0.8384 |
| No log | 2.9811 | 158 | 0.6582 | 0.7222 | 0.6582 | 0.8113 |
| No log | 3.0189 | 160 | 0.7918 | 0.6944 | 0.7918 | 0.8898 |
| No log | 3.0566 | 162 | 0.7963 | 0.7000 | 0.7963 | 0.8924 |
| No log | 3.0943 | 164 | 0.6798 | 0.7299 | 0.6798 | 0.8245 |
| No log | 3.1321 | 166 | 0.6436 | 0.7338 | 0.6436 | 0.8023 |
| No log | 3.1698 | 168 | 0.6247 | 0.7518 | 0.6247 | 0.7904 |
| No log | 3.2075 | 170 | 0.6711 | 0.7746 | 0.6711 | 0.8192 |
| No log | 3.2453 | 172 | 0.8387 | 0.6533 | 0.8387 | 0.9158 |
| No log | 3.2830 | 174 | 0.9398 | 0.6585 | 0.9398 | 0.9695 |
| No log | 3.3208 | 176 | 0.7882 | 0.6577 | 0.7882 | 0.8878 |
| No log | 3.3585 | 178 | 0.6204 | 0.7724 | 0.6204 | 0.7877 |
| No log | 3.3962 | 180 | 0.6275 | 0.7639 | 0.6275 | 0.7921 |
| No log | 3.4340 | 182 | 0.6621 | 0.7445 | 0.6621 | 0.8137 |
| No log | 3.4717 | 184 | 0.6579 | 0.7445 | 0.6579 | 0.8111 |
| No log | 3.5094 | 186 | 0.6379 | 0.7733 | 0.6379 | 0.7987 |
| No log | 3.5472 | 188 | 0.5942 | 0.7843 | 0.5942 | 0.7709 |
| No log | 3.5849 | 190 | 0.5832 | 0.7973 | 0.5832 | 0.7637 |
| No log | 3.6226 | 192 | 0.5870 | 0.8105 | 0.5870 | 0.7662 |
| No log | 3.6604 | 194 | 0.7061 | 0.7329 | 0.7061 | 0.8403 |
| No log | 3.6981 | 196 | 0.7999 | 0.6786 | 0.7999 | 0.8943 |
| No log | 3.7358 | 198 | 0.7105 | 0.7152 | 0.7105 | 0.8429 |
| No log | 3.7736 | 200 | 0.5984 | 0.8000 | 0.5984 | 0.7735 |
| No log | 3.8113 | 202 | 0.6666 | 0.7722 | 0.6666 | 0.8164 |
| No log | 3.8491 | 204 | 0.6681 | 0.7792 | 0.6681 | 0.8174 |
| No log | 3.8868 | 206 | 0.6541 | 0.7838 | 0.6541 | 0.8088 |
| No log | 3.9245 | 208 | 0.8257 | 0.6923 | 0.8257 | 0.9087 |
| No log | 3.9623 | 210 | 0.9257 | 0.6893 | 0.9257 | 0.9621 |
| No log | 4.0 | 212 | 0.8795 | 0.7151 | 0.8795 | 0.9378 |
| No log | 4.0377 | 214 | 0.6821 | 0.7226 | 0.6821 | 0.8259 |
| No log | 4.0755 | 216 | 0.6091 | 0.8105 | 0.6091 | 0.7805 |
| No log | 4.1132 | 218 | 0.6575 | 0.7639 | 0.6575 | 0.8109 |
| No log | 4.1509 | 220 | 0.7120 | 0.6818 | 0.7120 | 0.8438 |
| No log | 4.1887 | 222 | 0.7765 | 0.6923 | 0.7765 | 0.8812 |
| No log | 4.2264 | 224 | 0.7780 | 0.6667 | 0.7780 | 0.8821 |
| No log | 4.2642 | 226 | 0.7206 | 0.7101 | 0.7206 | 0.8489 |
| No log | 4.3019 | 228 | 0.7255 | 0.7105 | 0.7255 | 0.8517 |
| No log | 4.3396 | 230 | 0.7013 | 0.7907 | 0.7013 | 0.8375 |
| No log | 4.3774 | 232 | 0.6804 | 0.7869 | 0.6804 | 0.8249 |
| No log | 4.4151 | 234 | 0.5699 | 0.8065 | 0.5699 | 0.7549 |
| No log | 4.4528 | 236 | 0.6014 | 0.7381 | 0.6014 | 0.7755 |
| No log | 4.4906 | 238 | 0.6415 | 0.7439 | 0.6415 | 0.8009 |
| No log | 4.5283 | 240 | 0.5879 | 0.7532 | 0.5879 | 0.7668 |
| No log | 4.5660 | 242 | 0.6578 | 0.7483 | 0.6578 | 0.8110 |
| No log | 4.6038 | 244 | 0.7409 | 0.7515 | 0.7409 | 0.8608 |
| No log | 4.6415 | 246 | 0.6730 | 0.7600 | 0.6730 | 0.8203 |
| No log | 4.6792 | 248 | 0.6088 | 0.7568 | 0.6088 | 0.7803 |
| No log | 4.7170 | 250 | 0.6204 | 0.7792 | 0.6204 | 0.7877 |
| No log | 4.7547 | 252 | 0.5783 | 0.7792 | 0.5783 | 0.7605 |
| No log | 4.7925 | 254 | 0.7217 | 0.7451 | 0.7217 | 0.8496 |
| No log | 4.8302 | 256 | 0.9062 | 0.6628 | 0.9062 | 0.9520 |
| No log | 4.8679 | 258 | 0.8806 | 0.6818 | 0.8806 | 0.9384 |
| No log | 4.9057 | 260 | 0.7436 | 0.7294 | 0.7436 | 0.8623 |
| No log | 4.9434 | 262 | 0.6967 | 0.7285 | 0.6967 | 0.8347 |
| No log | 4.9811 | 264 | 0.6419 | 0.7724 | 0.6419 | 0.8012 |
| No log | 5.0189 | 266 | 0.6238 | 0.7785 | 0.6238 | 0.7898 |
| No log | 5.0566 | 268 | 0.6699 | 0.7613 | 0.6699 | 0.8185 |
| No log | 5.0943 | 270 | 0.6418 | 0.7550 | 0.6418 | 0.8011 |
| No log | 5.1321 | 272 | 0.5983 | 0.7867 | 0.5983 | 0.7735 |
| No log | 5.1698 | 274 | 0.6278 | 0.7815 | 0.6278 | 0.7923 |
| No log | 5.2075 | 276 | 0.6952 | 0.7261 | 0.6952 | 0.8338 |
| No log | 5.2453 | 278 | 0.7769 | 0.7067 | 0.7769 | 0.8814 |
| No log | 5.2830 | 280 | 0.8004 | 0.6857 | 0.8004 | 0.8947 |
| No log | 5.3208 | 282 | 0.7530 | 0.7015 | 0.7530 | 0.8678 |
| No log | 5.3585 | 284 | 0.7279 | 0.6917 | 0.7279 | 0.8532 |
| No log | 5.3962 | 286 | 0.7071 | 0.6917 | 0.7071 | 0.8409 |
| No log | 5.4340 | 288 | 0.7121 | 0.7324 | 0.7121 | 0.8439 |
| No log | 5.4717 | 290 | 0.8297 | 0.7089 | 0.8297 | 0.9109 |
| No log | 5.5094 | 292 | 0.9944 | 0.6405 | 0.9944 | 0.9972 |
| No log | 5.5472 | 294 | 0.9793 | 0.6259 | 0.9793 | 0.9896 |
| No log | 5.5849 | 296 | 0.9503 | 0.6383 | 0.9503 | 0.9749 |
| No log | 5.6226 | 298 | 0.8719 | 0.6522 | 0.8719 | 0.9337 |
| No log | 5.6604 | 300 | 0.8390 | 0.6917 | 0.8390 | 0.9159 |
| No log | 5.6981 | 302 | 0.7926 | 0.7361 | 0.7926 | 0.8903 |
| No log | 5.7358 | 304 | 0.7297 | 0.7722 | 0.7297 | 0.8542 |
| No log | 5.7736 | 306 | 0.6802 | 0.7758 | 0.6802 | 0.8247 |
| No log | 5.8113 | 308 | 0.6894 | 0.7978 | 0.6894 | 0.8303 |
| No log | 5.8491 | 310 | 0.7726 | 0.7889 | 0.7726 | 0.8790 |
| No log | 5.8868 | 312 | 0.7786 | 0.7746 | 0.7786 | 0.8824 |
| No log | 5.9245 | 314 | 0.7105 | 0.7927 | 0.7105 | 0.8429 |
| No log | 5.9623 | 316 | 0.6612 | 0.7600 | 0.6612 | 0.8132 |
| No log | 6.0 | 318 | 0.6755 | 0.7376 | 0.6755 | 0.8219 |
| No log | 6.0377 | 320 | 0.6876 | 0.6906 | 0.6876 | 0.8292 |
| No log | 6.0755 | 322 | 0.6152 | 0.7465 | 0.6152 | 0.7843 |
| No log | 6.1132 | 324 | 0.6024 | 0.7975 | 0.6024 | 0.7762 |
| No log | 6.1509 | 326 | 0.6455 | 0.7831 | 0.6455 | 0.8034 |
| No log | 6.1887 | 328 | 0.6362 | 0.7907 | 0.6362 | 0.7976 |
| No log | 6.2264 | 330 | 0.5955 | 0.8047 | 0.5955 | 0.7717 |
| No log | 6.2642 | 332 | 0.6371 | 0.7763 | 0.6371 | 0.7982 |
| No log | 6.3019 | 334 | 0.7399 | 0.7586 | 0.7399 | 0.8602 |
| No log | 6.3396 | 336 | 0.8054 | 0.7114 | 0.8054 | 0.8975 |
| No log | 6.3774 | 338 | 0.7722 | 0.7059 | 0.7722 | 0.8788 |
| No log | 6.4151 | 340 | 0.8104 | 0.7125 | 0.8104 | 0.9002 |
| No log | 6.4528 | 342 | 0.7967 | 0.6933 | 0.7967 | 0.8926 |
| No log | 6.4906 | 344 | 0.7561 | 0.7465 | 0.7561 | 0.8695 |
| No log | 6.5283 | 346 | 0.7058 | 0.7391 | 0.7058 | 0.8401 |
| No log | 6.5660 | 348 | 0.7191 | 0.7153 | 0.7191 | 0.8480 |
| No log | 6.6038 | 350 | 0.7523 | 0.7353 | 0.7523 | 0.8673 |
| No log | 6.6415 | 352 | 0.8052 | 0.6897 | 0.8052 | 0.8974 |
| No log | 6.6792 | 354 | 0.9330 | 0.6788 | 0.9330 | 0.9659 |
| No log | 6.7170 | 356 | 1.0126 | 0.6629 | 1.0126 | 1.0063 |
| No log | 6.7547 | 358 | 0.8892 | 0.6709 | 0.8892 | 0.9430 |
| No log | 6.7925 | 360 | 0.7269 | 0.7692 | 0.7269 | 0.8526 |
| No log | 6.8302 | 362 | 0.6644 | 0.7746 | 0.6644 | 0.8151 |
| No log | 6.8679 | 364 | 0.6219 | 0.7763 | 0.6219 | 0.7886 |
| No log | 6.9057 | 366 | 0.6123 | 0.7843 | 0.6123 | 0.7825 |
| No log | 6.9434 | 368 | 0.6330 | 0.7632 | 0.6330 | 0.7956 |
| No log | 6.9811 | 370 | 0.6687 | 0.7625 | 0.6687 | 0.8178 |
| No log | 7.0189 | 372 | 0.6808 | 0.7857 | 0.6808 | 0.8251 |
| No log | 7.0566 | 374 | 0.6747 | 0.7857 | 0.6747 | 0.8214 |
| No log | 7.0943 | 376 | 0.6345 | 0.7673 | 0.6345 | 0.7965 |
| No log | 7.1321 | 378 | 0.6190 | 0.7862 | 0.6190 | 0.7867 |
| No log | 7.1698 | 380 | 0.6240 | 0.7832 | 0.6240 | 0.7899 |
| No log | 7.2075 | 382 | 0.6225 | 0.7571 | 0.6225 | 0.7890 |
| No log | 7.2453 | 384 | 0.6385 | 0.7571 | 0.6385 | 0.7990 |
| No log | 7.2830 | 386 | 0.6111 | 0.7571 | 0.6111 | 0.7817 |
| No log | 7.3208 | 388 | 0.6250 | 0.7606 | 0.6250 | 0.7906 |
| No log | 7.3585 | 390 | 0.7070 | 0.7483 | 0.7070 | 0.8409 |
| No log | 7.3962 | 392 | 0.9060 | 0.6585 | 0.9060 | 0.9519 |
| No log | 7.4340 | 394 | 0.9248 | 0.6667 | 0.9248 | 0.9617 |
| No log | 7.4717 | 396 | 0.7951 | 0.6933 | 0.7951 | 0.8917 |
| No log | 7.5094 | 398 | 0.7031 | 0.7391 | 0.7031 | 0.8385 |
| No log | 7.5472 | 400 | 0.7253 | 0.7023 | 0.7253 | 0.8516 |
| No log | 7.5849 | 402 | 0.7104 | 0.6970 | 0.7104 | 0.8428 |
| No log | 7.6226 | 404 | 0.6572 | 0.7556 | 0.6572 | 0.8107 |
| No log | 7.6604 | 406 | 0.6469 | 0.7613 | 0.6469 | 0.8043 |
| No log | 7.6981 | 408 | 0.7281 | 0.7515 | 0.7281 | 0.8533 |
| No log | 7.7358 | 410 | 0.7884 | 0.7160 | 0.7884 | 0.8879 |
| No log | 7.7736 | 412 | 0.7729 | 0.7059 | 0.7729 | 0.8791 |
| No log | 7.8113 | 414 | 0.7556 | 0.6667 | 0.7556 | 0.8692 |
| No log | 7.8491 | 416 | 0.7659 | 0.7059 | 0.7659 | 0.8751 |
| No log | 7.8868 | 418 | 0.7830 | 0.6963 | 0.7830 | 0.8849 |
| No log | 7.9245 | 420 | 0.7496 | 0.7500 | 0.7496 | 0.8658 |
| No log | 7.9623 | 422 | 0.7137 | 0.7500 | 0.7137 | 0.8448 |
| No log | 8.0 | 424 | 0.6996 | 0.7714 | 0.6996 | 0.8364 |
| No log | 8.0377 | 426 | 0.7178 | 0.7632 | 0.7178 | 0.8472 |
| No log | 8.0755 | 428 | 0.7567 | 0.6951 | 0.7567 | 0.8699 |
| No log | 8.1132 | 430 | 0.7383 | 0.7284 | 0.7383 | 0.8592 |
| No log | 8.1509 | 432 | 0.6719 | 0.7613 | 0.6719 | 0.8197 |
| No log | 8.1887 | 434 | 0.6344 | 0.8054 | 0.6344 | 0.7965 |
| No log | 8.2264 | 436 | 0.6399 | 0.7755 | 0.6399 | 0.7999 |
| No log | 8.2642 | 438 | 0.6532 | 0.7639 | 0.6532 | 0.8082 |
| No log | 8.3019 | 440 | 0.6556 | 0.7639 | 0.6556 | 0.8097 |
| No log | 8.3396 | 442 | 0.6326 | 0.7832 | 0.6326 | 0.7953 |
| No log | 8.3774 | 444 | 0.6533 | 0.7755 | 0.6533 | 0.8083 |
| No log | 8.4151 | 446 | 0.7156 | 0.7448 | 0.7156 | 0.8459 |
| No log | 8.4528 | 448 | 0.8097 | 0.7320 | 0.8097 | 0.8998 |
| No log | 8.4906 | 450 | 0.8389 | 0.7308 | 0.8389 | 0.9159 |
| No log | 8.5283 | 452 | 0.8064 | 0.7484 | 0.8064 | 0.8980 |
| No log | 8.5660 | 454 | 0.7304 | 0.7429 | 0.7304 | 0.8546 |
| No log | 8.6038 | 456 | 0.7246 | 0.7206 | 0.7246 | 0.8512 |
| No log | 8.6415 | 458 | 0.7371 | 0.7429 | 0.7371 | 0.8586 |
| No log | 8.6792 | 460 | 0.7527 | 0.7432 | 0.7527 | 0.8676 |
| No log | 8.7170 | 462 | 0.8106 | 0.7250 | 0.8106 | 0.9003 |
| No log | 8.7547 | 464 | 0.7520 | 0.7215 | 0.7520 | 0.8672 |
| No log | 8.7925 | 466 | 0.6855 | 0.7639 | 0.6855 | 0.8280 |
| No log | 8.8302 | 468 | 0.7241 | 0.7097 | 0.7241 | 0.8509 |
| No log | 8.8679 | 470 | 0.7272 | 0.7260 | 0.7272 | 0.8528 |
| No log | 8.9057 | 472 | 0.7065 | 0.7448 | 0.7065 | 0.8405 |
| No log | 8.9434 | 474 | 0.7080 | 0.7448 | 0.7080 | 0.8414 |
| No log | 8.9811 | 476 | 0.7112 | 0.7429 | 0.7112 | 0.8434 |
| No log | 9.0189 | 478 | 0.7570 | 0.7042 | 0.7570 | 0.8701 |
| No log | 9.0566 | 480 | 0.8690 | 0.6849 | 0.8690 | 0.9322 |
| No log | 9.0943 | 482 | 0.8912 | 0.6803 | 0.8912 | 0.9440 |
| No log | 9.1321 | 484 | 0.8435 | 0.6714 | 0.8435 | 0.9184 |
| No log | 9.1698 | 486 | 0.7816 | 0.6765 | 0.7816 | 0.8841 |
| No log | 9.2075 | 488 | 0.7343 | 0.7101 | 0.7343 | 0.8569 |
| No log | 9.2453 | 490 | 0.7365 | 0.7143 | 0.7365 | 0.8582 |
| No log | 9.2830 | 492 | 0.7659 | 0.7042 | 0.7659 | 0.8752 |
| No log | 9.3208 | 494 | 0.7356 | 0.7448 | 0.7356 | 0.8577 |
| No log | 9.3585 | 496 | 0.6687 | 0.8028 | 0.6687 | 0.8178 |
| No log | 9.3962 | 498 | 0.6501 | 0.7917 | 0.6501 | 0.8063 |
| 0.3893 | 9.4340 | 500 | 0.6518 | 0.7821 | 0.6518 | 0.8074 |
| 0.3893 | 9.4717 | 502 | 0.6293 | 0.7927 | 0.6293 | 0.7933 |
| 0.3893 | 9.5094 | 504 | 0.5916 | 0.7811 | 0.5916 | 0.7691 |
| 0.3893 | 9.5472 | 506 | 0.6117 | 0.7907 | 0.6117 | 0.7821 |
| 0.3893 | 9.5849 | 508 | 0.6639 | 0.8187 | 0.6639 | 0.8148 |
| 0.3893 | 9.6226 | 510 | 0.6461 | 0.7904 | 0.6461 | 0.8038 |
| 0.3893 | 9.6604 | 512 | 0.6202 | 0.7867 | 0.6202 | 0.7875 |
| 0.3893 | 9.6981 | 514 | 0.6384 | 0.7571 | 0.6384 | 0.7990 |
| 0.3893 | 9.7358 | 516 | 0.6479 | 0.7445 | 0.6479 | 0.8049 |
| 0.3893 | 9.7736 | 518 | 0.6227 | 0.8026 | 0.6227 | 0.7891 |
| 0.3893 | 9.8113 | 520 | 0.6878 | 0.7590 | 0.6878 | 0.8293 |
| 0.3893 | 9.8491 | 522 | 0.7106 | 0.7590 | 0.7106 | 0.8430 |
| 0.3893 | 9.8868 | 524 | 0.6808 | 0.7683 | 0.6808 | 0.8251 |
| 0.3893 | 9.9245 | 526 | 0.6743 | 0.7763 | 0.6743 | 0.8211 |
| 0.3893 | 9.9623 | 528 | 0.7151 | 0.7613 | 0.7151 | 0.8456 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
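
To match this environment, the versions above can be pinned directly; the cu118 wheel index line is an assumption about how the original PyTorch build was installed:

pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
pip install torch==2.4.0+cu118 --index-url https://download.pytorch.org/whl/cu118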