ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k1_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5536
  • Qwk: 0.4021
  • Mse: 0.5536
  • Rmse: 0.7440
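Qwk here is the quadratic weighted kappa, which scores agreement between predicted and true ordinal labels (1 = perfect, 0 = chance); the loss equaling the MSE suggests a regression-style head trained with mean squared error, though the card does not confirm this. A minimal pure-Python sketch of all three metrics, using toy labels rather than the model's actual outputs:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights: 1.0 means perfect agreement,
    0.0 means chance-level agreement."""
    # Observed confusion matrix O[i][j]: true class i, predicted class j
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in O]                  # row marginals
    hist_pred = [sum(O[i][j] for i in range(n_classes))  # column marginals
                 for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2      # quadratic penalty
            num += w * O[i][j]                           # observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n   # expected by chance
    return 1.0 - num / den

# Toy example with 4 ordinal classes (not the model's actual predictions)
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 2, 2, 2, 0]

mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```

Note that RMSE is simply the square root of MSE, which is why the two columns track each other exactly in the results below.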

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
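The hyperparameters above can be reconstructed as a `transformers.TrainingArguments` sketch; the `output_dir` and any evaluation/save cadence are assumptions, not stated in the card:

```python
from transformers import TrainingArguments

# Sketch only: values taken from the hyperparameter list above;
# output_dir is a hypothetical placeholder.
args = TrainingArguments(
    output_dir="arabert-task2-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```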

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk     | Mse    | Rmse   |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log        | 0.2857 | 2    | 4.0639          | 0.0162  | 4.0639 | 2.0159 |
| No log        | 0.5714 | 4    | 2.3235          | 0.0699  | 2.3235 | 1.5243 |
| No log        | 0.8571 | 6    | 1.1560          | 0.0951  | 1.1560 | 1.0752 |
| No log        | 1.1429 | 8    | 0.9274          | -0.0673 | 0.9274 | 0.9630 |
| No log        | 1.4286 | 10   | 0.8136          | 0.0597  | 0.8136 | 0.9020 |
| No log        | 1.7143 | 12   | 0.8452          | 0.1299  | 0.8452 | 0.9193 |
| No log        | 2.0    | 14   | 0.8833          | 0.1299  | 0.8833 | 0.9398 |
| No log        | 2.2857 | 16   | 0.8371          | 0.1819  | 0.8371 | 0.9149 |
| No log        | 2.5714 | 18   | 0.8352          | 0.1819  | 0.8352 | 0.9139 |
| No log        | 2.8571 | 20   | 0.8214          | 0.1677  | 0.8214 | 0.9063 |
| No log        | 3.1429 | 22   | 0.8515          | 0.0772  | 0.8515 | 0.9228 |
| No log        | 3.4286 | 24   | 0.8276          | 0.0772  | 0.8276 | 0.9097 |
| No log        | 3.7143 | 26   | 0.7337          | 0.0918  | 0.7337 | 0.8565 |
| No log        | 4.0    | 28   | 0.6956          | 0.1549  | 0.6956 | 0.8340 |
| No log        | 4.2857 | 30   | 0.6612          | 0.2495  | 0.6612 | 0.8132 |
| No log        | 4.5714 | 32   | 0.6311          | 0.2577  | 0.6311 | 0.7944 |
| No log        | 4.8571 | 34   | 0.6153          | 0.3260  | 0.6153 | 0.7844 |
| No log        | 5.1429 | 36   | 0.6187          | 0.2968  | 0.6187 | 0.7866 |
| No log        | 5.4286 | 38   | 0.6281          | 0.2928  | 0.6281 | 0.7925 |
| No log        | 5.7143 | 40   | 0.6362          | 0.2928  | 0.6362 | 0.7976 |
| No log        | 6.0    | 42   | 0.6361          | 0.3274  | 0.6361 | 0.7975 |
| No log        | 6.2857 | 44   | 0.6663          | 0.3582  | 0.6663 | 0.8163 |
| No log        | 6.5714 | 46   | 0.6735          | 0.3594  | 0.6735 | 0.8207 |
| No log        | 6.8571 | 48   | 0.6512          | 0.3695  | 0.6512 | 0.8070 |
| No log        | 7.1429 | 50   | 0.6191          | 0.3685  | 0.6191 | 0.7868 |
| No log        | 7.4286 | 52   | 0.5997          | 0.3868  | 0.5997 | 0.7744 |
| No log        | 7.7143 | 54   | 0.5944          | 0.3868  | 0.5944 | 0.7710 |
| No log        | 8.0    | 56   | 0.5776          | 0.4131  | 0.5776 | 0.7600 |
| No log        | 8.2857 | 58   | 0.5729          | 0.4245  | 0.5729 | 0.7569 |
| No log        | 8.5714 | 60   | 0.5646          | 0.4039  | 0.5646 | 0.7514 |
| No log        | 8.8571 | 62   | 0.5593          | 0.4158  | 0.5593 | 0.7479 |
| No log        | 9.1429 | 64   | 0.5566          | 0.4074  | 0.5566 | 0.7461 |
| No log        | 9.4286 | 66   | 0.5547          | 0.4190  | 0.5547 | 0.7448 |
| No log        | 9.7143 | 68   | 0.5537          | 0.4021  | 0.5537 | 0.7441 |
| No log        | 10.0   | 70   | 0.5536          | 0.4021  | 0.5536 | 0.7440 |
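The table covers 70 optimizer steps over 10 epochs (7 steps per epoch), with evaluation every 2 steps. Under the stated linear scheduler, and assuming no warmup (the card lists none), the learning rate decays linearly from 2e-5 at step 0 to 0 at step 70. A sketch of that schedule:

```python
def linear_lr(step, base_lr=2e-5, total_steps=70, warmup_steps=0):
    """Linear-decay learning-rate schedule with optional warmup,
    mirroring transformers' get_linear_schedule_with_warmup.
    warmup_steps=0 is an assumption; the card does not state warmup."""
    if step < warmup_steps:
        # Linear ramp from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr to 0 over the remaining steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, the learning rate would be halved (1e-5) at step 35, the midpoint of training.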

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1