# ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k1_task3_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.9605
- Qwk: 0.1148
- Mse: 0.9605
- Rmse: 0.9801
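The metrics above (quadratic weighted kappa, MSE, and RMSE) can be reproduced with a small sketch like the following; the helper names are illustrative and not part of the training code. Note that MSE and RMSE satisfy RMSE = sqrt(MSE), as the reported values confirm (0.9801 ≈ sqrt(0.9605)).

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    # Observed confusion matrix
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Quadratic disagreement weights, normalized to [0, 1]
    weights = np.array([[(i - j) ** 2 for j in range(n_classes)]
                        for i in range(n_classes)], dtype=float)
    weights /= (n_classes - 1) ** 2
    # Expected matrix under chance agreement (outer product of marginals)
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

def rmse(y_true, y_pred):
    """Root mean squared error between label vectors."""
    diff = np.asarray(y_true, float) - np.asarray(y_pred, float)
    return float(np.sqrt(np.mean(diff ** 2)))
```

Perfect agreement yields a kappa of 1.0, chance-level agreement yields 0, and the modest final Qwk of 0.1148 indicates predictions only slightly better than chance on this ordinal task.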
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
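As a rough sketch (not the original training script), the linear scheduler decays the learning rate from its initial value to zero over the total number of optimizer steps; the results table below shows 7 steps per epoch, so 10 epochs give 70 total steps. This assumes no warmup steps, which the hyperparameters above do not mention.

```python
def linear_lr(step, total_steps=70, base_lr=2e-5):
    """Linearly decay the learning rate to zero over training.

    Mirrors a linear lr_scheduler with zero warmup: the rate at a
    given optimizer step is the base rate scaled by the fraction of
    training that remains.
    """
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining
```

Halfway through training (step 35) the rate is half the initial value, and it reaches exactly zero at the final step.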
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.2857 | 2 | 3.1351 | -0.0138 | 3.1351 | 1.7706 |
| No log | 0.5714 | 4 | 1.6487 | -0.0070 | 1.6487 | 1.2840 |
| No log | 0.8571 | 6 | 0.9428 | 0.0418 | 0.9428 | 0.9710 |
| No log | 1.1429 | 8 | 1.0210 | 0.0847 | 1.0210 | 1.0104 |
| No log | 1.4286 | 10 | 0.6218 | 0.0476 | 0.6218 | 0.7886 |
| No log | 1.7143 | 12 | 0.7490 | 0.1250 | 0.7490 | 0.8655 |
| No log | 2.0 | 14 | 0.7612 | 0.0000 | 0.7612 | 0.8725 |
| No log | 2.2857 | 16 | 0.6580 | 0.0000 | 0.6580 | 0.8112 |
| No log | 2.5714 | 18 | 0.5776 | 0.0569 | 0.5776 | 0.7600 |
| No log | 2.8571 | 20 | 0.6125 | -0.0370 | 0.6125 | 0.7826 |
| No log | 3.1429 | 22 | 0.6701 | 0.0189 | 0.6701 | 0.8186 |
| No log | 3.4286 | 24 | 0.6421 | -0.0133 | 0.6421 | 0.8013 |
| No log | 3.7143 | 26 | 0.6429 | -0.0963 | 0.6429 | 0.8018 |
| No log | 4.0 | 28 | 0.6664 | 0.1176 | 0.6664 | 0.8164 |
| No log | 4.2857 | 30 | 0.7573 | 0.2318 | 0.7573 | 0.8702 |
| No log | 4.5714 | 32 | 0.7705 | 0.1795 | 0.7705 | 0.8778 |
| No log | 4.8571 | 34 | 0.7629 | 0.0233 | 0.7629 | 0.8735 |
| No log | 5.1429 | 36 | 0.8292 | 0.0455 | 0.8292 | 0.9106 |
| No log | 5.4286 | 38 | 0.8067 | 0.0497 | 0.8067 | 0.8982 |
| No log | 5.7143 | 40 | 0.8222 | 0.1304 | 0.8222 | 0.9068 |
| No log | 6.0 | 42 | 0.8578 | 0.1131 | 0.8578 | 0.9262 |
| No log | 6.2857 | 44 | 0.8520 | 0.0980 | 0.8520 | 0.9230 |
| No log | 6.5714 | 46 | 0.8675 | 0.0857 | 0.8675 | 0.9314 |
| No log | 6.8571 | 48 | 0.8601 | 0.0741 | 0.8601 | 0.9274 |
| No log | 7.1429 | 50 | 0.8865 | 0.1131 | 0.8865 | 0.9416 |
| No log | 7.4286 | 52 | 0.9160 | 0.1203 | 0.9160 | 0.9571 |
| No log | 7.7143 | 54 | 0.9153 | 0.1186 | 0.9153 | 0.9567 |
| No log | 8.0 | 56 | 0.8859 | 0.1660 | 0.8859 | 0.9412 |
| No log | 8.2857 | 58 | 0.8600 | 0.2212 | 0.8600 | 0.9274 |
| No log | 8.5714 | 60 | 0.8610 | 0.2070 | 0.8610 | 0.9279 |
| No log | 8.8571 | 62 | 0.8818 | 0.1864 | 0.8818 | 0.9390 |
| No log | 9.1429 | 64 | 0.9196 | 0.1074 | 0.9196 | 0.9589 |
| No log | 9.4286 | 66 | 0.9461 | 0.1148 | 0.9461 | 0.9727 |
| No log | 9.7143 | 68 | 0.9583 | 0.1148 | 0.9583 | 0.9789 |
| No log | 10.0 | 70 | 0.9605 | 0.1148 | 0.9605 | 0.9801 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1