metadata
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
  - generated_from_trainer
model-index:
  - name: >-
      ArabicNewSplits4_withSameOriginalTrainFileOfSplit3_FineTuningAraBERT_noAug_task5_organization
    results: []

ArabicNewSplits4_withSameOriginalTrainFileOfSplit3_FineTuningAraBERT_noAug_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2766
  • Qwk (quadratic weighted kappa): 0.6185
  • Mse (mean squared error): 1.2766
  • Rmse (root mean squared error): 1.1299
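Qwk (quadratic weighted kappa) is the agreement metric typically used for ordinal scoring tasks like this one. As a reference, here is a minimal pure-Python sketch of how it is computed; the label values and the number of classes below are illustrative, not taken from this model's actual evaluation data:

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..n_classes-1."""
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    # Marginal histograms used to build the expected (chance) matrix
    hist_t = Counter(y_true)
    hist_p = Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            expected = hist_t[i] * hist_p[j] / n      # expected count under independence
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

# Illustrative ordinal scores (not this model's predictions)
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 1, 3, 2, 2]
print(round(quadratic_weighted_kappa(y_true, y_pred, 4), 4))  # → 0.8182
```

A Qwk of 1.0 means perfect agreement and 0.0 means chance-level agreement; the quadratic weights penalize predictions that are further from the true ordinal score more heavily.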

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
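With lr_scheduler_type: linear and no warmup listed, the learning rate presumably decays linearly from 2e-05 at step 0 to 0 at the final step. A minimal sketch of that schedule, assuming the 40 total optimizer steps shown in the results table (this mirrors, but is not, the transformers scheduler implementation):

```python
def linear_lr(step, total_steps=40, base_lr=2e-05, warmup_steps=0):
    """Linear warmup (optional) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))   # 2e-05 at the start of training
print(linear_lr(20))  # 1e-05 halfway through
print(linear_lr(40))  # 0.0 at the final step
```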

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk    | Mse    | Rmse   |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|
| No log        | 0.5   | 2    | 2.1918          | 0.0658 | 2.1918 | 1.4805 |
| No log        | 1.0   | 4    | 1.4893          | 0.2213 | 1.4893 | 1.2204 |
| No log        | 1.5   | 6    | 1.2602          | 0.2203 | 1.2602 | 1.1226 |
| No log        | 2.0   | 8    | 1.3721          | 0.2921 | 1.3721 | 1.1714 |
| No log        | 2.5   | 10   | 1.5477          | 0.4210 | 1.5477 | 1.2441 |
| No log        | 3.0   | 12   | 1.6581          | 0.3796 | 1.6581 | 1.2877 |
| No log        | 3.5   | 14   | 1.7377          | 0.3384 | 1.7377 | 1.3182 |
| No log        | 4.0   | 16   | 1.5913          | 0.3929 | 1.5913 | 1.2615 |
| No log        | 4.5   | 18   | 1.3286          | 0.4654 | 1.3286 | 1.1527 |
| No log        | 5.0   | 20   | 1.2595          | 0.5472 | 1.2595 | 1.1223 |
| No log        | 5.5   | 22   | 1.3522          | 0.5645 | 1.3522 | 1.1628 |
| No log        | 6.0   | 24   | 1.2114          | 0.6261 | 1.2114 | 1.1007 |
| No log        | 6.5   | 26   | 1.0620          | 0.6374 | 1.0620 | 1.0305 |
| No log        | 7.0   | 28   | 0.9699          | 0.6781 | 0.9699 | 0.9849 |
| No log        | 7.5   | 30   | 1.0178          | 0.6579 | 1.0178 | 1.0088 |
| No log        | 8.0   | 32   | 1.0751          | 0.6562 | 1.0751 | 1.0369 |
| No log        | 8.5   | 34   | 1.1597          | 0.6414 | 1.1597 | 1.0769 |
| No log        | 9.0   | 36   | 1.2237          | 0.6270 | 1.2237 | 1.1062 |
| No log        | 9.5   | 38   | 1.2766          | 0.6282 | 1.2766 | 1.1299 |
| No log        | 10.0  | 40   | 1.2766          | 0.6185 | 1.2766 | 1.1299 |
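Throughout the table, Rmse is simply the square root of Mse, which gives a quick consistency check on the reported numbers, e.g. for the final evaluation row:

```python
import math

mse = 1.2766                      # final validation Mse from the table
print(round(math.sqrt(mse), 4))   # → 1.1299, matching the reported Rmse
```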

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1