ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k7_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are conventionally computed follows the list):

  • Loss: 0.8165
  • Qwk (quadratic weighted kappa): 0.6916
  • Mse (mean squared error): 0.8165
  • Rmse (root mean squared error): 0.9036
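Loss and Mse coincide here, which is consistent with a mean-squared-error training objective on numeric scores. As a point of reference, the sketch below shows how these three metrics are conventionally computed with scikit-learn; the labels are invented for illustration, and scikit-learn itself is an assumption rather than something the card documents.

```python
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Invented gold scores and rounded predictions, purely for illustration.
y_true = [3, 4, 2, 5, 3, 1]
y_pred = [3, 4, 3, 4, 3, 2]

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = mse ** 0.5                                             # Rmse
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```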

Model description

More information needed

Intended uses & limitations

More information needed
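Although the intended uses are not yet documented, the model name and the regression-style metrics above suggest an ordinal scoring task ("organization"). A minimal, hypothetical loading sketch, assuming the checkpoint exposes a single-logit sequence-classification head (this head shape is an assumption, not stated in the card):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id as listed on the Hub for this checkpoint.
repo_id = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k7_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("نص تجريبي لتقييم التنظيم.", return_tensors="pt")  # illustrative Arabic input
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # predicted organization score (scale assumed)
```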

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments setup follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
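A minimal sketch of how these settings map onto the Hugging Face TrainingArguments API; the output directory and all model/data wiring are assumed, as the card does not state them.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above. Adam's betas and epsilon match the
# library defaults but are spelled out for clarity.
training_args = TrainingArguments(
    output_dir="arabert-task1-organization",  # assumed name, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```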

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0488 | 2 | 5.2689 | -0.0111 | 5.2689 | 2.2954 |
| No log | 0.0976 | 4 | 3.4688 | 0.0650 | 3.4688 | 1.8625 |
| No log | 0.1463 | 6 | 1.9122 | 0.1064 | 1.9122 | 1.3828 |
| No log | 0.1951 | 8 | 1.4467 | 0.0372 | 1.4467 | 1.2028 |
| No log | 0.2439 | 10 | 1.2194 | 0.2995 | 1.2194 | 1.1042 |
| No log | 0.2927 | 12 | 1.1404 | 0.1843 | 1.1404 | 1.0679 |
| No log | 0.3415 | 14 | 1.2488 | 0.1301 | 1.2488 | 1.1175 |
| No log | 0.3902 | 16 | 1.3808 | 0.1630 | 1.3808 | 1.1751 |
| No log | 0.4390 | 18 | 1.3399 | 0.1011 | 1.3399 | 1.1575 |
| No log | 0.4878 | 20 | 1.2208 | 0.1151 | 1.2208 | 1.1049 |
| No log | 0.5366 | 22 | 1.0343 | 0.3352 | 1.0343 | 1.0170 |
| No log | 0.5854 | 24 | 1.1087 | 0.3605 | 1.1087 | 1.0529 |
| No log | 0.6341 | 26 | 1.1581 | 0.3434 | 1.1581 | 1.0761 |
| No log | 0.6829 | 28 | 1.2700 | 0.3287 | 1.2700 | 1.1269 |
| No log | 0.7317 | 30 | 1.3789 | 0.2880 | 1.3789 | 1.1743 |
| No log | 0.7805 | 32 | 1.1101 | 0.3220 | 1.1101 | 1.0536 |
| No log | 0.8293 | 34 | 1.0227 | 0.3604 | 1.0227 | 1.0113 |
| No log | 0.8780 | 36 | 1.0866 | 0.4004 | 1.0866 | 1.0424 |
| No log | 0.9268 | 38 | 1.3588 | 0.3736 | 1.3588 | 1.1657 |
| No log | 0.9756 | 40 | 1.6318 | 0.2767 | 1.6318 | 1.2774 |
| No log | 1.0244 | 42 | 1.7374 | 0.2462 | 1.7374 | 1.3181 |
| No log | 1.0732 | 44 | 1.4466 | 0.3704 | 1.4466 | 1.2028 |
| No log | 1.1220 | 46 | 1.0518 | 0.4284 | 1.0518 | 1.0256 |
| No log | 1.1707 | 48 | 0.9663 | 0.5020 | 0.9663 | 0.9830 |
| No log | 1.2195 | 50 | 1.0697 | 0.5067 | 1.0697 | 1.0343 |
| No log | 1.2683 | 52 | 1.0895 | 0.4605 | 1.0895 | 1.0438 |
| No log | 1.3171 | 54 | 1.0265 | 0.4767 | 1.0265 | 1.0132 |
| No log | 1.3659 | 56 | 0.9403 | 0.4910 | 0.9403 | 0.9697 |
| No log | 1.4146 | 58 | 0.8857 | 0.5068 | 0.8857 | 0.9411 |
| No log | 1.4634 | 60 | 0.8874 | 0.4456 | 0.8874 | 0.9420 |
| No log | 1.5122 | 62 | 0.9031 | 0.4631 | 0.9031 | 0.9503 |
| No log | 1.5610 | 64 | 0.9416 | 0.4286 | 0.9416 | 0.9704 |
| No log | 1.6098 | 66 | 0.8986 | 0.4082 | 0.8986 | 0.9480 |
| No log | 1.6585 | 68 | 0.8276 | 0.5288 | 0.8276 | 0.9097 |
| No log | 1.7073 | 70 | 0.7677 | 0.5383 | 0.7677 | 0.8762 |
| No log | 1.7561 | 72 | 0.7503 | 0.5599 | 0.7503 | 0.8662 |
| No log | 1.8049 | 74 | 0.7853 | 0.5775 | 0.7853 | 0.8862 |
| No log | 1.8537 | 76 | 0.8229 | 0.5606 | 0.8229 | 0.9071 |
| No log | 1.9024 | 78 | 0.7728 | 0.6042 | 0.7728 | 0.8791 |
| No log | 1.9512 | 80 | 0.7428 | 0.5450 | 0.7428 | 0.8619 |
| No log | 2.0 | 82 | 0.8367 | 0.5341 | 0.8367 | 0.9147 |
| No log | 2.0488 | 84 | 0.9765 | 0.5663 | 0.9765 | 0.9882 |
| No log | 2.0976 | 86 | 0.8918 | 0.5416 | 0.8918 | 0.9443 |
| No log | 2.1463 | 88 | 0.7874 | 0.5999 | 0.7874 | 0.8874 |
| No log | 2.1951 | 90 | 0.7900 | 0.6305 | 0.7900 | 0.8888 |
| No log | 2.2439 | 92 | 0.8327 | 0.6578 | 0.8327 | 0.9125 |
| No log | 2.2927 | 94 | 0.8768 | 0.6378 | 0.8768 | 0.9364 |
| No log | 2.3415 | 96 | 0.9133 | 0.6067 | 0.9133 | 0.9557 |
| No log | 2.3902 | 98 | 0.8941 | 0.5989 | 0.8941 | 0.9456 |
| No log | 2.4390 | 100 | 0.8773 | 0.5981 | 0.8773 | 0.9367 |
| No log | 2.4878 | 102 | 0.8359 | 0.5827 | 0.8359 | 0.9143 |
| No log | 2.5366 | 104 | 0.8135 | 0.6754 | 0.8135 | 0.9019 |
| No log | 2.5854 | 106 | 0.7938 | 0.6939 | 0.7938 | 0.8910 |
| No log | 2.6341 | 108 | 0.7818 | 0.6491 | 0.7818 | 0.8842 |
| No log | 2.6829 | 110 | 0.7847 | 0.6148 | 0.7847 | 0.8858 |
| No log | 2.7317 | 112 | 0.7851 | 0.6078 | 0.7851 | 0.8861 |
| No log | 2.7805 | 114 | 0.8142 | 0.6589 | 0.8142 | 0.9023 |
| No log | 2.8293 | 116 | 0.8946 | 0.6513 | 0.8946 | 0.9458 |
| No log | 2.8780 | 118 | 1.0341 | 0.5905 | 1.0341 | 1.0169 |
| No log | 2.9268 | 120 | 1.0830 | 0.5956 | 1.0830 | 1.0406 |
| No log | 2.9756 | 122 | 1.0630 | 0.5823 | 1.0630 | 1.0310 |
| No log | 3.0244 | 124 | 1.0235 | 0.6014 | 1.0235 | 1.0117 |
| No log | 3.0732 | 126 | 0.9529 | 0.6145 | 0.9529 | 0.9762 |
| No log | 3.1220 | 128 | 0.8563 | 0.6691 | 0.8563 | 0.9254 |
| No log | 3.1707 | 130 | 0.8212 | 0.6688 | 0.8212 | 0.9062 |
| No log | 3.2195 | 132 | 0.8461 | 0.6314 | 0.8461 | 0.9198 |
| No log | 3.2683 | 134 | 0.8668 | 0.6441 | 0.8668 | 0.9310 |
| No log | 3.3171 | 136 | 0.8970 | 0.6490 | 0.8970 | 0.9471 |
| No log | 3.3659 | 138 | 1.0107 | 0.6045 | 1.0107 | 1.0054 |
| No log | 3.4146 | 140 | 1.1508 | 0.5821 | 1.1508 | 1.0728 |
| No log | 3.4634 | 142 | 1.0994 | 0.6141 | 1.0994 | 1.0485 |
| No log | 3.5122 | 144 | 0.9523 | 0.6342 | 0.9523 | 0.9759 |
| No log | 3.5610 | 146 | 0.8493 | 0.6406 | 0.8493 | 0.9216 |
| No log | 3.6098 | 148 | 0.8251 | 0.6440 | 0.8251 | 0.9084 |
| No log | 3.6585 | 150 | 0.8264 | 0.6235 | 0.8264 | 0.9091 |
| No log | 3.7073 | 152 | 0.8307 | 0.6402 | 0.8307 | 0.9114 |
| No log | 3.7561 | 154 | 0.8452 | 0.6372 | 0.8452 | 0.9193 |
| No log | 3.8049 | 156 | 0.8494 | 0.6357 | 0.8494 | 0.9217 |
| No log | 3.8537 | 158 | 0.8517 | 0.6280 | 0.8517 | 0.9229 |
| No log | 3.9024 | 160 | 0.8739 | 0.6096 | 0.8739 | 0.9348 |
| No log | 3.9512 | 162 | 0.9044 | 0.6454 | 0.9044 | 0.9510 |
| No log | 4.0 | 164 | 0.9189 | 0.6223 | 0.9189 | 0.9586 |
| No log | 4.0488 | 166 | 0.8692 | 0.6187 | 0.8692 | 0.9323 |
| No log | 4.0976 | 168 | 0.7935 | 0.6327 | 0.7935 | 0.8908 |
| No log | 4.1463 | 170 | 0.7530 | 0.6366 | 0.7530 | 0.8678 |
| No log | 4.1951 | 172 | 0.7121 | 0.6682 | 0.7121 | 0.8438 |
| No log | 4.2439 | 174 | 0.7004 | 0.6771 | 0.7004 | 0.8369 |
| No log | 4.2927 | 176 | 0.7117 | 0.6940 | 0.7117 | 0.8436 |
| No log | 4.3415 | 178 | 0.7332 | 0.6782 | 0.7332 | 0.8562 |
| No log | 4.3902 | 180 | 0.7680 | 0.6737 | 0.7680 | 0.8763 |
| No log | 4.4390 | 182 | 0.7984 | 0.6744 | 0.7984 | 0.8936 |
| No log | 4.4878 | 184 | 0.8233 | 0.6952 | 0.8233 | 0.9074 |
| No log | 4.5366 | 186 | 0.8819 | 0.6799 | 0.8819 | 0.9391 |
| No log | 4.5854 | 188 | 0.9476 | 0.6632 | 0.9476 | 0.9734 |
| No log | 4.6341 | 190 | 0.9814 | 0.6610 | 0.9814 | 0.9907 |
| No log | 4.6829 | 192 | 0.9662 | 0.6473 | 0.9662 | 0.9830 |
| No log | 4.7317 | 194 | 0.9137 | 0.6653 | 0.9137 | 0.9559 |
| No log | 4.7805 | 196 | 0.8456 | 0.6683 | 0.8456 | 0.9196 |
| No log | 4.8293 | 198 | 0.8173 | 0.6699 | 0.8173 | 0.9040 |
| No log | 4.8780 | 200 | 0.8084 | 0.6734 | 0.8084 | 0.8991 |
| No log | 4.9268 | 202 | 0.8196 | 0.6643 | 0.8196 | 0.9053 |
| No log | 4.9756 | 204 | 0.8472 | 0.6662 | 0.8472 | 0.9204 |
| No log | 5.0244 | 206 | 0.8963 | 0.6740 | 0.8963 | 0.9467 |
| No log | 5.0732 | 208 | 0.9275 | 0.6642 | 0.9275 | 0.9631 |
| No log | 5.1220 | 210 | 0.9337 | 0.6463 | 0.9337 | 0.9663 |
| No log | 5.1707 | 212 | 0.8706 | 0.6757 | 0.8706 | 0.9331 |
| No log | 5.2195 | 214 | 0.8232 | 0.6795 | 0.8232 | 0.9073 |
| No log | 5.2683 | 216 | 0.8360 | 0.6725 | 0.8360 | 0.9143 |
| No log | 5.3171 | 218 | 0.8097 | 0.6784 | 0.8097 | 0.8998 |
| No log | 5.3659 | 220 | 0.7946 | 0.6968 | 0.7946 | 0.8914 |
| No log | 5.4146 | 222 | 0.8190 | 0.6780 | 0.8190 | 0.9050 |
| No log | 5.4634 | 224 | 0.9034 | 0.6725 | 0.9034 | 0.9505 |
| No log | 5.5122 | 226 | 0.9556 | 0.6560 | 0.9556 | 0.9775 |
| No log | 5.5610 | 228 | 0.9333 | 0.6629 | 0.9333 | 0.9661 |
| No log | 5.6098 | 230 | 0.8735 | 0.6733 | 0.8735 | 0.9346 |
| No log | 5.6585 | 232 | 0.8110 | 0.6477 | 0.8110 | 0.9005 |
| No log | 5.7073 | 234 | 0.7411 | 0.7031 | 0.7411 | 0.8609 |
| No log | 5.7561 | 236 | 0.7148 | 0.7125 | 0.7148 | 0.8455 |
| No log | 5.8049 | 238 | 0.7280 | 0.7038 | 0.7280 | 0.8532 |
| No log | 5.8537 | 240 | 0.7557 | 0.7048 | 0.7557 | 0.8693 |
| No log | 5.9024 | 242 | 0.7714 | 0.6777 | 0.7714 | 0.8783 |
| No log | 5.9512 | 244 | 0.8231 | 0.6685 | 0.8231 | 0.9072 |
| No log | 6.0 | 246 | 0.8672 | 0.6759 | 0.8672 | 0.9312 |
| No log | 6.0488 | 248 | 0.9178 | 0.6687 | 0.9178 | 0.9580 |
| No log | 6.0976 | 250 | 0.9009 | 0.6742 | 0.9009 | 0.9492 |
| No log | 6.1463 | 252 | 0.8525 | 0.6863 | 0.8525 | 0.9233 |
| No log | 6.1951 | 254 | 0.7946 | 0.6702 | 0.7946 | 0.8914 |
| No log | 6.2439 | 256 | 0.7702 | 0.6974 | 0.7702 | 0.8776 |
| No log | 6.2927 | 258 | 0.7582 | 0.7018 | 0.7582 | 0.8708 |
| No log | 6.3415 | 260 | 0.7641 | 0.7010 | 0.7641 | 0.8742 |
| No log | 6.3902 | 262 | 0.7791 | 0.6861 | 0.7791 | 0.8827 |
| No log | 6.4390 | 264 | 0.8163 | 0.6822 | 0.8163 | 0.9035 |
| No log | 6.4878 | 266 | 0.8212 | 0.6713 | 0.8212 | 0.9062 |
| No log | 6.5366 | 268 | 0.7888 | 0.6713 | 0.7888 | 0.8881 |
| No log | 6.5854 | 270 | 0.7714 | 0.6720 | 0.7714 | 0.8783 |
| No log | 6.6341 | 272 | 0.7737 | 0.6755 | 0.7737 | 0.8796 |
| No log | 6.6829 | 274 | 0.7747 | 0.6766 | 0.7747 | 0.8802 |
| No log | 6.7317 | 276 | 0.7672 | 0.6722 | 0.7672 | 0.8759 |
| No log | 6.7805 | 278 | 0.7329 | 0.6923 | 0.7329 | 0.8561 |
| No log | 6.8293 | 280 | 0.6954 | 0.7144 | 0.6954 | 0.8339 |
| No log | 6.8780 | 282 | 0.6854 | 0.7207 | 0.6854 | 0.8279 |
| No log | 6.9268 | 284 | 0.7099 | 0.7344 | 0.7099 | 0.8425 |
| No log | 6.9756 | 286 | 0.7593 | 0.7023 | 0.7593 | 0.8714 |
| No log | 7.0244 | 288 | 0.8064 | 0.6933 | 0.8064 | 0.8980 |
| No log | 7.0732 | 290 | 0.8586 | 0.6759 | 0.8586 | 0.9266 |
| No log | 7.1220 | 292 | 0.9092 | 0.6686 | 0.9092 | 0.9535 |
| No log | 7.1707 | 294 | 0.9375 | 0.6618 | 0.9375 | 0.9683 |
| No log | 7.2195 | 296 | 0.9294 | 0.6615 | 0.9294 | 0.9641 |
| No log | 7.2683 | 298 | 0.9117 | 0.6584 | 0.9117 | 0.9548 |
| No log | 7.3171 | 300 | 0.8916 | 0.6716 | 0.8916 | 0.9443 |
| No log | 7.3659 | 302 | 0.8466 | 0.6823 | 0.8466 | 0.9201 |
| No log | 7.4146 | 304 | 0.8129 | 0.6857 | 0.8129 | 0.9016 |
| No log | 7.4634 | 306 | 0.7708 | 0.6945 | 0.7708 | 0.8779 |
| No log | 7.5122 | 308 | 0.7666 | 0.6945 | 0.7666 | 0.8755 |
| No log | 7.5610 | 310 | 0.7832 | 0.6874 | 0.7832 | 0.8850 |
| No log | 7.6098 | 312 | 0.8010 | 0.6886 | 0.8010 | 0.8950 |
| No log | 7.6585 | 314 | 0.8213 | 0.6886 | 0.8213 | 0.9062 |
| No log | 7.7073 | 316 | 0.8199 | 0.6857 | 0.8199 | 0.9055 |
| No log | 7.7561 | 318 | 0.7975 | 0.6945 | 0.7975 | 0.8931 |
| No log | 7.8049 | 320 | 0.7965 | 0.7080 | 0.7965 | 0.8925 |
| No log | 7.8537 | 322 | 0.7905 | 0.7080 | 0.7905 | 0.8891 |
| No log | 7.9024 | 324 | 0.7954 | 0.7080 | 0.7954 | 0.8919 |
| No log | 7.9512 | 326 | 0.8248 | 0.7072 | 0.8248 | 0.9082 |
| No log | 8.0 | 328 | 0.8837 | 0.6876 | 0.8837 | 0.9401 |
| No log | 8.0488 | 330 | 0.9309 | 0.6716 | 0.9309 | 0.9648 |
| No log | 8.0976 | 332 | 0.9517 | 0.6648 | 0.9517 | 0.9756 |
| No log | 8.1463 | 334 | 0.9250 | 0.6749 | 0.9250 | 0.9617 |
| No log | 8.1951 | 336 | 0.8816 | 0.6830 | 0.8816 | 0.9389 |
| No log | 8.2439 | 338 | 0.8347 | 0.7046 | 0.8347 | 0.9136 |
| No log | 8.2927 | 340 | 0.8134 | 0.7064 | 0.8134 | 0.9019 |
| No log | 8.3415 | 342 | 0.7953 | 0.6982 | 0.7953 | 0.8918 |
| No log | 8.3902 | 344 | 0.7890 | 0.6982 | 0.7890 | 0.8883 |
| No log | 8.4390 | 346 | 0.8014 | 0.6982 | 0.8014 | 0.8952 |
| No log | 8.4878 | 348 | 0.8158 | 0.7064 | 0.8158 | 0.9032 |
| No log | 8.5366 | 350 | 0.8330 | 0.6969 | 0.8330 | 0.9127 |
| No log | 8.5854 | 352 | 0.8293 | 0.6957 | 0.8293 | 0.9107 |
| No log | 8.6341 | 354 | 0.8263 | 0.6957 | 0.8263 | 0.9090 |
| No log | 8.6829 | 356 | 0.8218 | 0.6957 | 0.8218 | 0.9065 |
| No log | 8.7317 | 358 | 0.8232 | 0.7021 | 0.8232 | 0.9073 |
| No log | 8.7805 | 360 | 0.8324 | 0.6957 | 0.8324 | 0.9123 |
| No log | 8.8293 | 362 | 0.8375 | 0.6781 | 0.8375 | 0.9151 |
| No log | 8.8780 | 364 | 0.8404 | 0.6781 | 0.8404 | 0.9167 |
| No log | 8.9268 | 366 | 0.8369 | 0.6898 | 0.8369 | 0.9148 |
| No log | 8.9756 | 368 | 0.8347 | 0.6898 | 0.8347 | 0.9136 |
| No log | 9.0244 | 370 | 0.8375 | 0.6898 | 0.8375 | 0.9152 |
| No log | 9.0732 | 372 | 0.8305 | 0.6781 | 0.8305 | 0.9113 |
| No log | 9.1220 | 374 | 0.8254 | 0.6798 | 0.8254 | 0.9085 |
| No log | 9.1707 | 376 | 0.8136 | 0.6798 | 0.8136 | 0.9020 |
| No log | 9.2195 | 378 | 0.8032 | 0.6798 | 0.8032 | 0.8962 |
| No log | 9.2683 | 380 | 0.8005 | 0.6798 | 0.8005 | 0.8947 |
| No log | 9.3171 | 382 | 0.7999 | 0.6916 | 0.7999 | 0.8944 |
| No log | 9.3659 | 384 | 0.7975 | 0.6916 | 0.7975 | 0.8930 |
| No log | 9.4146 | 386 | 0.7923 | 0.6916 | 0.7923 | 0.8901 |
| No log | 9.4634 | 388 | 0.7890 | 0.6916 | 0.7890 | 0.8883 |
| No log | 9.5122 | 390 | 0.7913 | 0.6916 | 0.7913 | 0.8895 |
| No log | 9.5610 | 392 | 0.7962 | 0.6916 | 0.7962 | 0.8923 |
| No log | 9.6098 | 394 | 0.8002 | 0.6916 | 0.8002 | 0.8945 |
| No log | 9.6585 | 396 | 0.8013 | 0.6916 | 0.8013 | 0.8951 |
| No log | 9.7073 | 398 | 0.8048 | 0.6916 | 0.8048 | 0.8971 |
| No log | 9.7561 | 400 | 0.8078 | 0.6916 | 0.8078 | 0.8988 |
| No log | 9.8049 | 402 | 0.8100 | 0.6916 | 0.8100 | 0.9000 |
| No log | 9.8537 | 404 | 0.8126 | 0.6916 | 0.8126 | 0.9015 |
| No log | 9.9024 | 406 | 0.8141 | 0.6916 | 0.8141 | 0.9023 |
| No log | 9.9512 | 408 | 0.8158 | 0.6916 | 0.8158 | 0.9032 |
| No log | 10.0 | 410 | 0.8165 | 0.6916 | 0.8165 | 0.9036 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1