ArabicNewSplits5_FineTuningAraBERT_run2_AugV5_k2_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7619
  • Qwk: 0.3080
  • Mse: 0.7619
  • Rmse: 0.8729
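
As a minimal usage sketch, the checkpoint can be loaded with the standard transformers auto classes. This assumes (the card does not confirm it) that the model carries a sequence-classification head emitting a single organization score; the MSE/RMSE metrics above suggest a regression-style objective. The model ID is taken from this card's repository name.

```python
# Minimal inference sketch (assumptions noted in the text above).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits5_FineTuningAraBERT_run2_AugV5_k2_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # an Arabic essay or paragraph to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits
print(score)
```

Note that AraBERT checkpoints are usually paired with the arabert package's ArabertPreprocessor at inference time; whether that preprocessing was applied during fine-tuning is not stated on this card.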

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
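
As a sketch, these settings map directly onto the Hugging Face TrainingArguments API. The eval_strategy/eval_steps values below are inferred from the results table (which evaluates every 2 steps) rather than listed above, and output_dir is hypothetical:

```python
# Hedged reconstruction of the training configuration.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task3_organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",  # inferred from the results table
    eval_steps=2,           # inferred from the results table
    # The Adam betas and epsilon below are the Trainer defaults and
    # match the values listed above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```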

Training results

The training-loss column in the table below reads "No log", most likely because the run's 150 optimization steps never reached the Trainer's default logging interval, so no training loss was recorded.

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 0.1333 | 2 | 3.4300 | -0.0160 | 3.4300 | 1.8520 |
| No log | 0.2667 | 4 | 1.8515 | -0.0130 | 1.8515 | 1.3607 |
| No log | 0.4 | 6 | 1.2353 | 0.0588 | 1.2353 | 1.1115 |
| No log | 0.5333 | 8 | 0.9459 | 0.0335 | 0.9459 | 0.9726 |
| No log | 0.6667 | 10 | 1.1278 | 0.0255 | 1.1278 | 1.0620 |
| No log | 0.8 | 12 | 1.9985 | 0.0788 | 1.9985 | 1.4137 |
| No log | 0.9333 | 14 | 1.4061 | 0.0255 | 1.4061 | 1.1858 |
| No log | 1.0667 | 16 | 1.2688 | 0.0 | 1.2688 | 1.1264 |
| No log | 1.2 | 18 | 1.0546 | 0.0 | 1.0546 | 1.0269 |
| No log | 1.3333 | 20 | 0.8166 | 0.0698 | 0.8166 | 0.9037 |
| No log | 1.4667 | 22 | 0.7717 | 0.1486 | 0.7717 | 0.8785 |
| No log | 1.6 | 24 | 0.6851 | 0.3086 | 0.6851 | 0.8277 |
| No log | 1.7333 | 26 | 0.6032 | 0.0909 | 0.6032 | 0.7767 |
| No log | 1.8667 | 28 | 0.6480 | 0.2941 | 0.6480 | 0.8050 |
| No log | 2.0 | 30 | 0.7680 | 0.1861 | 0.7680 | 0.8763 |
| No log | 2.1333 | 32 | 0.7549 | 0.1861 | 0.7549 | 0.8688 |
| No log | 2.2667 | 34 | 0.8014 | 0.1934 | 0.8014 | 0.8952 |
| No log | 2.4 | 36 | 0.7583 | 0.1861 | 0.7583 | 0.8708 |
| No log | 2.5333 | 38 | 0.6885 | 0.2688 | 0.6885 | 0.8298 |
| No log | 2.6667 | 40 | 0.6992 | 0.2593 | 0.6992 | 0.8362 |
| No log | 2.8 | 42 | 0.6328 | 0.1282 | 0.6328 | 0.7955 |
| No log | 2.9333 | 44 | 0.6889 | 0.0545 | 0.6889 | 0.8300 |
| No log | 3.0667 | 46 | 0.6816 | 0.0545 | 0.6816 | 0.8256 |
| No log | 3.2 | 48 | 0.6693 | -0.0719 | 0.6693 | 0.8181 |
| No log | 3.3333 | 50 | 0.6656 | 0.0145 | 0.6656 | 0.8159 |
| No log | 3.4667 | 52 | 0.7642 | 0.0455 | 0.7642 | 0.8742 |
| No log | 3.6 | 54 | 0.7436 | -0.0299 | 0.7436 | 0.8623 |
| No log | 3.7333 | 56 | 0.7400 | 0.0388 | 0.7400 | 0.8602 |
| No log | 3.8667 | 58 | 0.7491 | -0.0303 | 0.7491 | 0.8655 |
| No log | 4.0 | 60 | 0.8199 | 0.0439 | 0.8199 | 0.9055 |
| No log | 4.1333 | 62 | 0.9522 | 0.0909 | 0.9522 | 0.9758 |
| No log | 4.2667 | 64 | 0.7046 | 0.0256 | 0.7046 | 0.8394 |
| No log | 4.4 | 66 | 0.7353 | 0.0388 | 0.7353 | 0.8575 |
| No log | 4.5333 | 68 | 0.7482 | 0.1655 | 0.7482 | 0.8650 |
| No log | 4.6667 | 70 | 0.7252 | 0.2000 | 0.7252 | 0.8516 |
| No log | 4.8 | 72 | 1.0154 | 0.2441 | 1.0154 | 1.0077 |
| No log | 4.9333 | 74 | 1.0923 | 0.2548 | 1.0923 | 1.0451 |
| No log | 5.0667 | 76 | 1.0033 | 0.2191 | 1.0033 | 1.0016 |
| No log | 5.2 | 78 | 0.6948 | 0.2233 | 0.6948 | 0.8336 |
| No log | 5.3333 | 80 | 0.7000 | 0.2661 | 0.7000 | 0.8367 |
| No log | 5.4667 | 82 | 0.8677 | 0.0894 | 0.8677 | 0.9315 |
| No log | 5.6 | 84 | 0.9197 | 0.0722 | 0.9197 | 0.9590 |
| No log | 5.7333 | 86 | 0.7566 | 0.2432 | 0.7566 | 0.8698 |
| No log | 5.8667 | 88 | 0.6486 | 0.3153 | 0.6486 | 0.8053 |
| No log | 6.0 | 90 | 0.7266 | 0.3188 | 0.7266 | 0.8524 |
| No log | 6.1333 | 92 | 0.6351 | 0.3448 | 0.6351 | 0.7969 |
| No log | 6.2667 | 94 | 0.6751 | 0.3846 | 0.6751 | 0.8216 |
| No log | 6.4 | 96 | 0.6798 | 0.3171 | 0.6798 | 0.8245 |
| No log | 6.5333 | 98 | 0.6090 | 0.3684 | 0.6090 | 0.7804 |
| No log | 6.6667 | 100 | 0.6550 | 0.4087 | 0.6550 | 0.8093 |
| No log | 6.8 | 102 | 0.6531 | 0.3991 | 0.6531 | 0.8081 |
| No log | 6.9333 | 104 | 0.5942 | 0.3645 | 0.5942 | 0.7708 |
| No log | 7.0667 | 106 | 0.6400 | 0.3744 | 0.6400 | 0.8000 |
| No log | 7.2 | 108 | 0.6507 | 0.2676 | 0.6507 | 0.8067 |
| No log | 7.3333 | 110 | 0.6583 | 0.2676 | 0.6583 | 0.8113 |
| No log | 7.4667 | 112 | 0.6763 | 0.2676 | 0.6763 | 0.8224 |
| No log | 7.6 | 114 | 0.6567 | 0.3422 | 0.6567 | 0.8103 |
| No log | 7.7333 | 116 | 0.6407 | 0.3939 | 0.6407 | 0.8004 |
| No log | 7.8667 | 118 | 0.6331 | 0.3480 | 0.6331 | 0.7956 |
| No log | 8.0 | 120 | 0.6429 | 0.3504 | 0.6429 | 0.8018 |
| No log | 8.1333 | 122 | 0.6617 | 0.3939 | 0.6617 | 0.8134 |
| No log | 8.2667 | 124 | 0.6955 | 0.2963 | 0.6955 | 0.8340 |
| No log | 8.4 | 126 | 0.6852 | 0.3739 | 0.6852 | 0.8278 |
| No log | 8.5333 | 128 | 0.6644 | 0.3418 | 0.6644 | 0.8151 |
| No log | 8.6667 | 130 | 0.6714 | 0.3333 | 0.6714 | 0.8194 |
| No log | 8.8 | 132 | 0.6998 | 0.3220 | 0.6998 | 0.8366 |
| No log | 8.9333 | 134 | 0.7623 | 0.2920 | 0.7623 | 0.8731 |
| No log | 9.0667 | 136 | 0.8286 | 0.2711 | 0.8286 | 0.9103 |
| No log | 9.2 | 138 | 0.8591 | 0.2432 | 0.8591 | 0.9269 |
| No log | 9.3333 | 140 | 0.8670 | 0.2432 | 0.8670 | 0.9311 |
| No log | 9.4667 | 142 | 0.8509 | 0.2432 | 0.8509 | 0.9225 |
| No log | 9.6 | 144 | 0.8185 | 0.2711 | 0.8185 | 0.9047 |
| No log | 9.7333 | 146 | 0.7897 | 0.2554 | 0.7897 | 0.8886 |
| No log | 9.8667 | 148 | 0.7689 | 0.3080 | 0.7689 | 0.8769 |
| No log | 10.0 | 150 | 0.7619 | 0.3080 | 0.7619 | 0.8729 |
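
For reference, the Qwk, Mse, and Rmse columns can be reproduced with standard scikit-learn metrics. This is a sketch under the assumption that Qwk denotes quadratic weighted kappa computed on predictions rounded to integer labels, the usual convention in essay-scoring work; the card itself does not define the metric.

```python
# Hedged sketch of the evaluation metrics; assumes integer gold labels
# and continuous model predictions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),   # round scores to the nearest label
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```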

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1