ArabicNewSplits7_FineTuningAraBERT_noAug_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

  • Loss: 0.8148
  • Qwk (quadratic weighted kappa): 0.5039
  • Mse: 0.8148
  • Rmse: 0.9027
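
As a quick start, the checkpoint can be loaded with the standard Transformers auto classes. This is a minimal sketch, assuming the model carries a single regression-style score head (consistent with the Mse/Rmse metrics above); the input string is a hypothetical placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: single-output regression head, as the Mse/Rmse metrics suggest.
model_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_noAug_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # hypothetical input: an Arabic passage to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.3f}")
```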

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reconstructing them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
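
Under the assumption that training used the Hugging Face Trainer (the reported Adam settings and linear scheduler match its defaults), the hyperparameters above translate into TrainingArguments roughly as follows. eval_steps=2 is inferred from the results table below, which records an evaluation every 2 steps, and the output directory name is hypothetical:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the reported hyperparameters; the original
# training script is not included in the card.
training_args = TrainingArguments(
    output_dir="ArabicNewSplits7_FineTuningAraBERT_noAug_task2_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # inferred: the results table evaluates every 2 steps
    eval_steps=2,
)
```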

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.6667 2 4.3186 0.0289 4.3186 2.0781
No log 1.3333 4 3.5551 0.0057 3.5551 1.8855
No log 2.0 6 1.8715 0.1273 1.8715 1.3680
No log 2.6667 8 1.2768 0.1584 1.2768 1.1300
No log 3.3333 10 1.2187 0.1043 1.2187 1.1039
No log 4.0 12 1.1801 0.0802 1.1801 1.0863
No log 4.6667 14 1.2251 0.1582 1.2251 1.1068
No log 5.3333 16 1.1674 0.2255 1.1674 1.0805
No log 6.0 18 1.1509 0.2499 1.1509 1.0728
No log 6.6667 20 1.2198 0.4300 1.2198 1.1045
No log 7.3333 22 1.3491 0.3803 1.3491 1.1615
No log 8.0 24 1.0753 0.4596 1.0753 1.0369
No log 8.6667 26 0.9661 0.4521 0.9661 0.9829
No log 9.3333 28 1.0907 0.3935 1.0907 1.0444
No log 10.0 30 0.9408 0.4343 0.9408 0.9699
No log 10.6667 32 0.9663 0.4707 0.9663 0.9830
No log 11.3333 34 1.0112 0.4635 1.0112 1.0056
No log 12.0 36 1.0787 0.3988 1.0787 1.0386
No log 12.6667 38 1.0302 0.4099 1.0302 1.0150
No log 13.3333 40 0.9790 0.4314 0.9790 0.9894
No log 14.0 42 0.9711 0.4948 0.9711 0.9854
No log 14.6667 44 0.9074 0.5161 0.9074 0.9526
No log 15.3333 46 0.8885 0.5232 0.8885 0.9426
No log 16.0 48 0.9216 0.5392 0.9216 0.9600
No log 16.6667 50 0.8836 0.5232 0.8836 0.9400
No log 17.3333 52 0.8689 0.4802 0.8689 0.9322
No log 18.0 54 0.8673 0.5253 0.8673 0.9313
No log 18.6667 56 0.8704 0.5578 0.8704 0.9330
No log 19.3333 58 0.8229 0.6026 0.8229 0.9071
No log 20.0 60 0.8022 0.5750 0.8022 0.8956
No log 20.6667 62 0.8539 0.5893 0.8539 0.9241
No log 21.3333 64 0.8078 0.5911 0.8078 0.8988
No log 22.0 66 0.8441 0.5892 0.8441 0.9188
No log 22.6667 68 0.8423 0.5920 0.8423 0.9178
No log 23.3333 70 0.7835 0.5713 0.7835 0.8851
No log 24.0 72 0.8156 0.6142 0.8156 0.9031
No log 24.6667 74 0.9345 0.4760 0.9345 0.9667
No log 25.3333 76 0.9119 0.5232 0.9119 0.9550
No log 26.0 78 0.8000 0.6001 0.8000 0.8944
No log 26.6667 80 0.7924 0.6176 0.7924 0.8902
No log 27.3333 82 0.8462 0.4998 0.8462 0.9199
No log 28.0 84 0.9201 0.5210 0.9201 0.9592
No log 28.6667 86 0.9072 0.4852 0.9072 0.9525
No log 29.3333 88 0.8669 0.4802 0.8669 0.9311
No log 30.0 90 0.8311 0.5969 0.8311 0.9117
No log 30.6667 92 0.8143 0.5969 0.8143 0.9024
No log 31.3333 94 0.8059 0.5530 0.8059 0.8977
No log 32.0 96 0.8161 0.5253 0.8161 0.9034
No log 32.6667 98 0.9138 0.5224 0.9138 0.9559
No log 33.3333 100 0.9289 0.5027 0.9289 0.9638
No log 34.0 102 0.8859 0.5637 0.8859 0.9412
No log 34.6667 104 0.8473 0.5659 0.8473 0.9205
No log 35.3333 106 0.8790 0.5714 0.8790 0.9375
No log 36.0 108 0.8788 0.4976 0.8788 0.9375
No log 36.6667 110 0.8245 0.5498 0.8245 0.9080
No log 37.3333 112 0.8194 0.5601 0.8194 0.9052
No log 38.0 114 0.8635 0.5306 0.8635 0.9293
No log 38.6667 116 0.8913 0.5532 0.8913 0.9441
No log 39.3333 118 0.8816 0.5141 0.8816 0.9390
No log 40.0 120 0.8785 0.5028 0.8785 0.9373
No log 40.6667 122 0.8887 0.5495 0.8887 0.9427
No log 41.3333 124 0.8889 0.5659 0.8889 0.9428
No log 42.0 126 0.9451 0.5623 0.9451 0.9721
No log 42.6667 128 0.9728 0.5236 0.9728 0.9863
No log 43.3333 130 0.9759 0.5175 0.9759 0.9879
No log 44.0 132 0.9280 0.5182 0.9280 0.9633
No log 44.6667 134 0.9013 0.5223 0.9013 0.9494
No log 45.3333 136 0.8603 0.5440 0.8603 0.9275
No log 46.0 138 0.8540 0.5272 0.8540 0.9241
No log 46.6667 140 0.8759 0.5028 0.8759 0.9359
No log 47.3333 142 0.9440 0.4615 0.9440 0.9716
No log 48.0 144 1.0623 0.4384 1.0623 1.0307
No log 48.6667 146 1.0527 0.4384 1.0527 1.0260
No log 49.3333 148 0.9734 0.4395 0.9734 0.9866
No log 50.0 150 0.8808 0.5041 0.8808 0.9385
No log 50.6667 152 0.8440 0.5298 0.8440 0.9187
No log 51.3333 154 0.8417 0.5298 0.8417 0.9175
No log 52.0 156 0.8425 0.5185 0.8425 0.9179
No log 52.6667 158 0.8137 0.4794 0.8137 0.9020
No log 53.3333 160 0.7995 0.4521 0.7995 0.8942
No log 54.0 162 0.8230 0.5202 0.8230 0.9072
No log 54.6667 164 0.8890 0.4902 0.8890 0.9428
No log 55.3333 166 0.9673 0.4138 0.9673 0.9835
No log 56.0 168 0.9871 0.4215 0.9871 0.9935
No log 56.6667 170 1.0499 0.4368 1.0499 1.0246
No log 57.3333 172 1.0152 0.4368 1.0152 1.0076
No log 58.0 174 0.9268 0.4327 0.9268 0.9627
No log 58.6667 176 0.8770 0.5068 0.8770 0.9365
No log 59.3333 178 0.8835 0.5094 0.8835 0.9399
No log 60.0 180 0.8853 0.5249 0.8853 0.9409
No log 60.6667 182 0.8935 0.5063 0.8935 0.9452
No log 61.3333 184 0.9249 0.4658 0.9249 0.9617
No log 62.0 186 0.9280 0.4852 0.9280 0.9633
No log 62.6667 188 0.9216 0.4658 0.9216 0.9600
No log 63.3333 190 0.8901 0.4857 0.8901 0.9434
No log 64.0 192 0.8718 0.4883 0.8718 0.9337
No log 64.6667 194 0.8694 0.5109 0.8694 0.9324
No log 65.3333 196 0.9065 0.4960 0.9065 0.9521
No log 66.0 198 0.9170 0.4551 0.9170 0.9576
No log 66.6667 200 0.9124 0.4551 0.9124 0.9552
No log 67.3333 202 0.9320 0.4199 0.9320 0.9654
No log 68.0 204 0.9537 0.4271 0.9537 0.9766
No log 68.6667 206 0.9358 0.4534 0.9358 0.9673
No log 69.3333 208 0.8876 0.4788 0.8876 0.9421
No log 70.0 210 0.8288 0.5539 0.8288 0.9104
No log 70.6667 212 0.8034 0.5211 0.8034 0.8963
No log 71.3333 214 0.7787 0.5437 0.7787 0.8824
No log 72.0 216 0.7729 0.6088 0.7729 0.8791
No log 72.6667 218 0.7823 0.5836 0.7823 0.8845
No log 73.3333 220 0.7830 0.5836 0.7830 0.8849
No log 74.0 222 0.7833 0.5624 0.7833 0.8850
No log 74.6667 224 0.7724 0.5864 0.7724 0.8789
No log 75.3333 226 0.7738 0.5864 0.7738 0.8797
No log 76.0 228 0.7821 0.5624 0.7821 0.8844
No log 76.6667 230 0.8024 0.5227 0.8024 0.8958
No log 77.3333 232 0.8258 0.5557 0.8258 0.9087
No log 78.0 234 0.8462 0.5370 0.8462 0.9199
No log 78.6667 236 0.8389 0.5451 0.8389 0.9159
No log 79.3333 238 0.8142 0.5539 0.8142 0.9023
No log 80.0 240 0.8043 0.5756 0.8043 0.8968
No log 80.6667 242 0.7965 0.5920 0.7965 0.8925
No log 81.3333 244 0.7940 0.5332 0.7940 0.8911
No log 82.0 246 0.7946 0.5726 0.7946 0.8914
No log 82.6667 248 0.7879 0.5726 0.7879 0.8877
No log 83.3333 250 0.7780 0.5827 0.7780 0.8821
No log 84.0 252 0.7687 0.6358 0.7687 0.8768
No log 84.6667 254 0.7729 0.5886 0.7729 0.8792
No log 85.3333 256 0.7873 0.5564 0.7873 0.8873
No log 86.0 258 0.7928 0.5351 0.7928 0.8904
No log 86.6667 260 0.8028 0.5053 0.8028 0.8960
No log 87.3333 262 0.8146 0.5394 0.8146 0.9025
No log 88.0 264 0.8174 0.5394 0.8174 0.9041
No log 88.6667 266 0.8062 0.5682 0.8062 0.8979
No log 89.3333 268 0.8001 0.5351 0.8001 0.8945
No log 90.0 270 0.7951 0.5370 0.7951 0.8917
No log 90.6667 272 0.7934 0.5370 0.7934 0.8907
No log 91.3333 274 0.7932 0.5370 0.7932 0.8906
No log 92.0 276 0.7995 0.5370 0.7995 0.8942
No log 92.6667 278 0.8080 0.5039 0.8080 0.8989
No log 93.3333 280 0.8145 0.5039 0.8145 0.9025
No log 94.0 282 0.8162 0.5039 0.8162 0.9035
No log 94.6667 284 0.8131 0.5039 0.8131 0.9017
No log 95.3333 286 0.8082 0.5039 0.8082 0.8990
No log 96.0 288 0.8050 0.5039 0.8050 0.8972
No log 96.6667 290 0.8048 0.5039 0.8048 0.8971
No log 97.3333 292 0.8077 0.5039 0.8077 0.8987
No log 98.0 294 0.8110 0.5039 0.8110 0.9005
No log 98.6667 296 0.8139 0.5039 0.8139 0.9022
No log 99.3333 298 0.8148 0.5039 0.8148 0.9027
No log 100.0 300 0.8148 0.5039 0.8148 0.9027
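
In the table, the Loss and Mse columns are identical, which is consistent with an MSE objective on a single-output head, and Rmse is simply the square root of Mse. Qwk is presumably quadratic weighted kappa computed on rounded scores; below is a minimal compute_metrics sketch that would produce these columns (an assumption, not the authors' published code):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    # eval_pred is the (predictions, labels) pair the Trainer passes in.
    preds, labels = eval_pred
    preds = np.asarray(preds).squeeze()
    labels = np.asarray(labels).squeeze()
    mse = mean_squared_error(labels, preds)
    # Kappa requires discrete categories, so round the continuous scores first.
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```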

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
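
To reproduce the results against the same library versions, a quick environment check can help (a convenience sketch, not part of the original card):

```python
import datasets
import tokenizers
import torch
import transformers

# Compare installed versions against those reported above.
expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu118",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    status = "OK" if installed[name] == want else "MISMATCH"
    print(f"{name}: installed {installed[name]}, card reports {want} [{status}]")
```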