ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k6_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card does not name it). It achieves the following results on the evaluation set:

  • Loss: 0.6213
  • QWK (quadratic weighted kappa): 0.7259
  • MSE: 0.6213
  • RMSE: 0.7883
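These results can be recomputed from raw predictions. Below is a minimal sketch using scikit-learn; it is not the authors' evaluation code, and rounding continuous outputs to integer labels before computing QWK is an assumption (common practice for ordinal scoring tasks):

```python
# Sketch: recompute MSE (= the reported Loss), RMSE, and QWK from model outputs.
# Assumption: gold labels are integers on an ordinal scale, and continuous
# predictions are rounded before the kappa computation.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])            # hypothetical gold scores
y_pred = np.array([2.8, 2.1, 3.6, 1.4, 3.2])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
qwk = cohen_kappa_score(y_true.astype(int), np.rint(y_pred).astype(int),
                        weights="quadratic")
print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```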

Model description

More information needed

Intended uses & limitations

More information needed
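Until the author fills this in, here is a minimal inference sketch. It assumes the checkpoint exposes a single-score regression head (num_labels=1), which is consistent with the MSE objective reported below but is not confirmed by the card:

```python
# Sketch: load the checkpoint and score one Arabic text.
# Assumption: a one-logit regression head; adjust if the head is actually
# an ordinal or multi-class classification head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k6_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # an Arabic essay/response to be scored
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # raw regression score
print(score)
```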

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
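For reference, a sketch of how these settings map onto transformers TrainingArguments (the output directory is hypothetical, and the steps-based evaluation cadence is inferred from the log below, which evaluates every 2 steps):

```python
# Sketch: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,            # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",     # inferred: the log below evaluates every 2 steps
    eval_steps=2,
)
```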

Training results

Validation metrics were logged every 2 steps. "No log" in the training-loss column means no running training loss was recorded, likely because the 330-step run ended before the Trainer's logging interval was reached.

| Training Loss | Epoch  | Step | Validation Loss | QWK     | MSE    | RMSE   |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log        | 0.0606 | 2    | 5.1716          | -0.0336 | 5.1716 | 2.2741 |
| No log        | 0.1212 | 4    | 3.1515          | 0.0840  | 3.1515 | 1.7752 |
| No log        | 0.1818 | 6    | 2.6529          | -0.1073 | 2.6529 | 1.6288 |
| No log        | 0.2424 | 8    | 1.9169          | -0.0161 | 1.9169 | 1.3845 |
| No log        | 0.3030 | 10   | 1.3331          | 0.1716  | 1.3331 | 1.1546 |
| No log        | 0.3636 | 12   | 1.2217          | 0.2738  | 1.2217 | 1.1053 |
| No log        | 0.4242 | 14   | 1.6960          | 0.0471  | 1.6960 | 1.3023 |
| No log        | 0.4848 | 16   | 2.1428          | 0.1294  | 2.1428 | 1.4638 |
| No log        | 0.5455 | 18   | 1.7629          | 0.0846  | 1.7629 | 1.3277 |
| No log        | 0.6061 | 20   | 1.2428          | 0.2622  | 1.2428 | 1.1148 |
| No log        | 0.6667 | 22   | 1.1485          | 0.2137  | 1.1485 | 1.0717 |
| No log        | 0.7273 | 24   | 1.2069          | 0.1486  | 1.2069 | 1.0986 |
| No log        | 0.7879 | 26   | 1.2510          | 0.1620  | 1.2510 | 1.1185 |
| No log        | 0.8485 | 28   | 1.2093          | 0.1918  | 1.2093 | 1.0997 |
| No log        | 0.9091 | 30   | 1.1191          | 0.2560  | 1.1191 | 1.0579 |
| No log        | 0.9697 | 32   | 1.1024          | 0.3052  | 1.1024 | 1.0500 |
| No log        | 1.0303 | 34   | 1.3082          | 0.1584  | 1.3082 | 1.1438 |
| No log        | 1.0909 | 36   | 1.4531          | 0.0542  | 1.4531 | 1.2054 |
| No log        | 1.1515 | 38   | 1.4133          | 0.0542  | 1.4133 | 1.1888 |
| No log        | 1.2121 | 40   | 1.1808          | 0.3219  | 1.1808 | 1.0866 |
| No log        | 1.2727 | 42   | 1.1072          | 0.3461  | 1.1072 | 1.0523 |
| No log        | 1.3333 | 44   | 0.9813          | 0.4136  | 0.9813 | 0.9906 |
| No log        | 1.3939 | 46   | 0.8714          | 0.4188  | 0.8714 | 0.9335 |
| No log        | 1.4545 | 48   | 0.8582          | 0.4200  | 0.8582 | 0.9264 |
| No log        | 1.5152 | 50   | 1.0155          | 0.4058  | 1.0155 | 1.0077 |
| No log        | 1.5758 | 52   | 0.9988          | 0.4281  | 0.9988 | 0.9994 |
| No log        | 1.6364 | 54   | 1.0183          | 0.4375  | 1.0183 | 1.0091 |
| No log        | 1.6970 | 56   | 1.1114          | 0.4361  | 1.1114 | 1.0543 |
| No log        | 1.7576 | 58   | 0.9248          | 0.5137  | 0.9248 | 0.9617 |
| No log        | 1.8182 | 60   | 0.8142          | 0.5409  | 0.8142 | 0.9023 |
| No log        | 1.8788 | 62   | 0.8669          | 0.5468  | 0.8669 | 0.9311 |
| No log        | 1.9394 | 64   | 1.1888          | 0.4665  | 1.1888 | 1.0903 |
| No log        | 2.0    | 66   | 1.5774          | 0.3833  | 1.5774 | 1.2560 |
| No log        | 2.0606 | 68   | 1.6236          | 0.3514  | 1.6236 | 1.2742 |
| No log        | 2.1212 | 70   | 1.3887          | 0.4201  | 1.3887 | 1.1785 |
| No log        | 2.1818 | 72   | 0.9900          | 0.5099  | 0.9900 | 0.9950 |
| No log        | 2.2424 | 74   | 0.9784          | 0.4088  | 0.9784 | 0.9891 |
| No log        | 2.3030 | 76   | 1.1334          | 0.3278  | 1.1334 | 1.0646 |
| No log        | 2.3636 | 78   | 0.9367          | 0.4638  | 0.9367 | 0.9678 |
| No log        | 2.4242 | 80   | 0.7326          | 0.6233  | 0.7326 | 0.8559 |
| No log        | 2.4848 | 82   | 0.9364          | 0.5079  | 0.9364 | 0.9677 |
| No log        | 2.5455 | 84   | 1.2815          | 0.4286  | 1.2815 | 1.1320 |
| No log        | 2.6061 | 86   | 1.4325          | 0.3701  | 1.4325 | 1.1969 |
| No log        | 2.6667 | 88   | 1.4568          | 0.3837  | 1.4568 | 1.2070 |
| No log        | 2.7273 | 90   | 1.3241          | 0.4205  | 1.3241 | 1.1507 |
| No log        | 2.7879 | 92   | 1.1296          | 0.4900  | 1.1296 | 1.0628 |
| No log        | 2.8485 | 94   | 0.7812          | 0.6672  | 0.7812 | 0.8839 |
| No log        | 2.9091 | 96   | 0.5977          | 0.6724  | 0.5977 | 0.7731 |
| No log        | 2.9697 | 98   | 0.5827          | 0.7399  | 0.5827 | 0.7633 |
| No log        | 3.0303 | 100  | 0.5914          | 0.7021  | 0.5914 | 0.7690 |
| No log        | 3.0909 | 102  | 0.6722          | 0.6912  | 0.6722 | 0.8199 |
| No log        | 3.1515 | 104  | 0.9546          | 0.5917  | 0.9546 | 0.9770 |
| No log        | 3.2121 | 106  | 1.1594          | 0.5116  | 1.1594 | 1.0768 |
| No log        | 3.2727 | 108  | 1.1932          | 0.4966  | 1.1931 | 1.0923 |
| No log        | 3.3333 | 110  | 1.1889          | 0.4966  | 1.1889 | 1.0904 |
| No log        | 3.3939 | 112  | 0.9951          | 0.5725  | 0.9951 | 0.9975 |
| No log        | 3.4545 | 114  | 0.7583          | 0.6219  | 0.7583 | 0.8708 |
| No log        | 3.5152 | 116  | 0.6564          | 0.6506  | 0.6564 | 0.8102 |
| No log        | 3.5758 | 118  | 0.6203          | 0.6938  | 0.6203 | 0.7876 |
| No log        | 3.6364 | 120  | 0.5830          | 0.7290  | 0.5830 | 0.7635 |
| No log        | 3.6970 | 122  | 0.6031          | 0.7003  | 0.6031 | 0.7766 |
| No log        | 3.7576 | 124  | 0.6617          | 0.7107  | 0.6617 | 0.8135 |
| No log        | 3.8182 | 126  | 0.7321          | 0.6700  | 0.7321 | 0.8556 |
| No log        | 3.8788 | 128  | 0.6531          | 0.7005  | 0.6531 | 0.8082 |
| No log        | 3.9394 | 130  | 0.5573          | 0.7042  | 0.5573 | 0.7465 |
| No log        | 4.0    | 132  | 0.5524          | 0.7219  | 0.5524 | 0.7432 |
| No log        | 4.0606 | 134  | 0.5679          | 0.7154  | 0.5679 | 0.7536 |
| No log        | 4.1212 | 136  | 0.6148          | 0.7286  | 0.6148 | 0.7841 |
| No log        | 4.1818 | 138  | 0.6153          | 0.7286  | 0.6153 | 0.7844 |
| No log        | 4.2424 | 140  | 0.5826          | 0.7217  | 0.5826 | 0.7633 |
| No log        | 4.3030 | 142  | 0.5736          | 0.7185  | 0.5736 | 0.7574 |
| No log        | 4.3636 | 144  | 0.5824          | 0.7432  | 0.5824 | 0.7631 |
| No log        | 4.4242 | 146  | 0.5887          | 0.7420  | 0.5887 | 0.7672 |
| No log        | 4.4848 | 148  | 0.5630          | 0.7451  | 0.5630 | 0.7503 |
| No log        | 4.5455 | 150  | 0.5792          | 0.7608  | 0.5792 | 0.7611 |
| No log        | 4.6061 | 152  | 0.6118          | 0.7690  | 0.6118 | 0.7822 |
| No log        | 4.6667 | 154  | 0.6388          | 0.7591  | 0.6388 | 0.7993 |
| No log        | 4.7273 | 156  | 0.6286          | 0.7608  | 0.6286 | 0.7928 |
| No log        | 4.7879 | 158  | 0.6051          | 0.7728  | 0.6051 | 0.7779 |
| No log        | 4.8485 | 160  | 0.5955          | 0.7745  | 0.5955 | 0.7717 |
| No log        | 4.9091 | 162  | 0.5881          | 0.7535  | 0.5881 | 0.7669 |
| No log        | 4.9697 | 164  | 0.5954          | 0.7535  | 0.5954 | 0.7716 |
| No log        | 5.0303 | 166  | 0.5945          | 0.7338  | 0.5945 | 0.7710 |
| No log        | 5.0909 | 168  | 0.6129          | 0.7086  | 0.6129 | 0.7829 |
| No log        | 5.1515 | 170  | 0.6463          | 0.7114  | 0.6463 | 0.8039 |
| No log        | 5.2121 | 172  | 0.6282          | 0.7013  | 0.6282 | 0.7926 |
| No log        | 5.2727 | 174  | 0.5952          | 0.7291  | 0.5952 | 0.7715 |
| No log        | 5.3333 | 176  | 0.6244          | 0.7607  | 0.6244 | 0.7902 |
| No log        | 5.3939 | 178  | 0.6649          | 0.7277  | 0.6649 | 0.8154 |
| No log        | 5.4545 | 180  | 0.6719          | 0.7349  | 0.6719 | 0.8197 |
| No log        | 5.5152 | 182  | 0.6060          | 0.7352  | 0.6060 | 0.7785 |
| No log        | 5.5758 | 184  | 0.5767          | 0.7351  | 0.5767 | 0.7594 |
| No log        | 5.6364 | 186  | 0.6306          | 0.7197  | 0.6306 | 0.7941 |
| No log        | 5.6970 | 188  | 0.7099          | 0.6682  | 0.7099 | 0.8425 |
| No log        | 5.7576 | 190  | 0.6853          | 0.6990  | 0.6853 | 0.8278 |
| No log        | 5.8182 | 192  | 0.6075          | 0.7366  | 0.6075 | 0.7794 |
| No log        | 5.8788 | 194  | 0.5832          | 0.7109  | 0.5832 | 0.7637 |
| No log        | 5.9394 | 196  | 0.5971          | 0.7151  | 0.5971 | 0.7727 |
| No log        | 6.0    | 198  | 0.6568          | 0.7034  | 0.6568 | 0.8104 |
| No log        | 6.0606 | 200  | 0.6925          | 0.6949  | 0.6925 | 0.8322 |
| No log        | 6.1212 | 202  | 0.6507          | 0.6929  | 0.6507 | 0.8067 |
| No log        | 6.1818 | 204  | 0.6176          | 0.7251  | 0.6176 | 0.7858 |
| No log        | 6.2424 | 206  | 0.6202          | 0.7370  | 0.6202 | 0.7875 |
| No log        | 6.3030 | 208  | 0.6192          | 0.7438  | 0.6192 | 0.7869 |
| No log        | 6.3636 | 210  | 0.6051          | 0.7334  | 0.6051 | 0.7779 |
| No log        | 6.4242 | 212  | 0.6010          | 0.7237  | 0.6010 | 0.7752 |
| No log        | 6.4848 | 214  | 0.6012          | 0.7137  | 0.6012 | 0.7753 |
| No log        | 6.5455 | 216  | 0.6188          | 0.6910  | 0.6188 | 0.7867 |
| No log        | 6.6061 | 218  | 0.6342          | 0.6860  | 0.6342 | 0.7964 |
| No log        | 6.6667 | 220  | 0.6100          | 0.7150  | 0.6100 | 0.7810 |
| No log        | 6.7273 | 222  | 0.5944          | 0.7060  | 0.5944 | 0.7710 |
| No log        | 6.7879 | 224  | 0.6019          | 0.7344  | 0.6019 | 0.7758 |
| No log        | 6.8485 | 226  | 0.6061          | 0.7283  | 0.6061 | 0.7785 |
| No log        | 6.9091 | 228  | 0.5982          | 0.7181  | 0.5982 | 0.7735 |
| No log        | 6.9697 | 230  | 0.5960          | 0.7189  | 0.5960 | 0.7720 |
| No log        | 7.0303 | 232  | 0.6095          | 0.7472  | 0.6095 | 0.7807 |
| No log        | 7.0909 | 234  | 0.6349          | 0.7421  | 0.6349 | 0.7968 |
| No log        | 7.1515 | 236  | 0.6289          | 0.7464  | 0.6289 | 0.7930 |
| No log        | 7.2121 | 238  | 0.6036          | 0.7175  | 0.6036 | 0.7769 |
| No log        | 7.2727 | 240  | 0.6177          | 0.7054  | 0.6177 | 0.7859 |
| No log        | 7.3333 | 242  | 0.6614          | 0.7015  | 0.6614 | 0.8132 |
| No log        | 7.3939 | 244  | 0.6612          | 0.7152  | 0.6612 | 0.8131 |
| No log        | 7.4545 | 246  | 0.6321          | 0.7011  | 0.6321 | 0.7951 |
| No log        | 7.5152 | 248  | 0.6276          | 0.6823  | 0.6276 | 0.7922 |
| No log        | 7.5758 | 250  | 0.6554          | 0.7522  | 0.6554 | 0.8096 |
| No log        | 7.6364 | 252  | 0.7092          | 0.7207  | 0.7092 | 0.8421 |
| No log        | 7.6970 | 254  | 0.7307          | 0.7138  | 0.7307 | 0.8548 |
| No log        | 7.7576 | 256  | 0.7030          | 0.7381  | 0.7030 | 0.8384 |
| No log        | 7.8182 | 258  | 0.6593          | 0.7463  | 0.6593 | 0.8119 |
| No log        | 7.8788 | 260  | 0.6394          | 0.7052  | 0.6394 | 0.7996 |
| No log        | 7.9394 | 262  | 0.6372          | 0.7076  | 0.6372 | 0.7983 |
| No log        | 8.0    | 264  | 0.6604          | 0.6705  | 0.6604 | 0.8126 |
| No log        | 8.0606 | 266  | 0.6878          | 0.6530  | 0.6878 | 0.8293 |
| No log        | 8.1212 | 268  | 0.6917          | 0.6530  | 0.6917 | 0.8317 |
| No log        | 8.1818 | 270  | 0.6671          | 0.6436  | 0.6671 | 0.8168 |
| No log        | 8.2424 | 272  | 0.6353          | 0.6625  | 0.6353 | 0.7970 |
| No log        | 8.3030 | 274  | 0.6205          | 0.7178  | 0.6205 | 0.7877 |
| No log        | 8.3636 | 276  | 0.6225          | 0.7217  | 0.6225 | 0.7890 |
| No log        | 8.4242 | 278  | 0.6331          | 0.7407  | 0.6331 | 0.7957 |
| No log        | 8.4848 | 280  | 0.6403          | 0.7466  | 0.6403 | 0.8002 |
| No log        | 8.5455 | 282  | 0.6476          | 0.7466  | 0.6476 | 0.8047 |
| No log        | 8.6061 | 284  | 0.6447          | 0.7603  | 0.6447 | 0.8029 |
| No log        | 8.6667 | 286  | 0.6461          | 0.7640  | 0.6461 | 0.8038 |
| No log        | 8.7273 | 288  | 0.6480          | 0.7659  | 0.6480 | 0.8050 |
| No log        | 8.7879 | 290  | 0.6489          | 0.7616  | 0.6489 | 0.8055 |
| No log        | 8.8485 | 292  | 0.6464          | 0.7603  | 0.6464 | 0.8040 |
| No log        | 8.9091 | 294  | 0.6403          | 0.7301  | 0.6403 | 0.8002 |
| No log        | 8.9697 | 296  | 0.6318          | 0.7151  | 0.6318 | 0.7948 |
| No log        | 9.0303 | 298  | 0.6290          | 0.7185  | 0.6290 | 0.7931 |
| No log        | 9.0909 | 300  | 0.6285          | 0.7065  | 0.6285 | 0.7928 |
| No log        | 9.1515 | 302  | 0.6263          | 0.7068  | 0.6263 | 0.7914 |
| No log        | 9.2121 | 304  | 0.6239          | 0.7028  | 0.6239 | 0.7898 |
| No log        | 9.2727 | 306  | 0.6210          | 0.7088  | 0.6210 | 0.7880 |
| No log        | 9.3333 | 308  | 0.6198          | 0.7118  | 0.6198 | 0.7873 |
| No log        | 9.3939 | 310  | 0.6187          | 0.7118  | 0.6187 | 0.7866 |
| No log        | 9.4545 | 312  | 0.6182          | 0.7118  | 0.6182 | 0.7862 |
| No log        | 9.5152 | 314  | 0.6195          | 0.7193  | 0.6195 | 0.7871 |
| No log        | 9.5758 | 316  | 0.6202          | 0.7151  | 0.6202 | 0.7875 |
| No log        | 9.6364 | 318  | 0.6204          | 0.7151  | 0.6204 | 0.7877 |
| No log        | 9.6970 | 320  | 0.6212          | 0.7201  | 0.6212 | 0.7882 |
| No log        | 9.7576 | 322  | 0.6220          | 0.7201  | 0.6220 | 0.7887 |
| No log        | 9.8182 | 324  | 0.6221          | 0.7259  | 0.6221 | 0.7887 |
| No log        | 9.8788 | 326  | 0.6219          | 0.7259  | 0.6219 | 0.7886 |
| No log        | 9.9394 | 328  | 0.6215          | 0.7259  | 0.6215 | 0.7884 |
| No log        | 10.0   | 330  | 0.6213          | 0.7259  | 0.6213 | 0.7883 |
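Note that the final checkpoint (QWK 0.7259) is not the best in this log: validation QWK peaks at 0.7745 at epoch 4.85 (step 160). A sketch of how the Trainer could retain that checkpoint automatically, assuming compute_metrics returns the score under the key "qwk" (the key name is an assumption, not confirmed by this card):

```python
# Sketch: keep the checkpoint with the best validation QWK rather than the last one.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization",  # hypothetical path
    eval_strategy="steps",
    eval_steps=2,
    save_strategy="steps",        # must match eval_strategy for best-model loading
    save_steps=2,
    load_best_model_at_end=True,
    metric_for_best_model="qwk",  # assumed metric key from compute_metrics
    greater_is_better=True,       # higher QWK is better
)
```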

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1