# NADI-Country-DID-MARBERT
This model is a fine-tuned version of [UBC-NLP/MARBERT](https://huggingface.co/UBC-NLP/MARBERT) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 6.6012
- Accuracy: 0.2874
- F1: 0.2717
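
For reference, a minimal inference sketch, assuming the checkpoint is published under the repository id shown in this card; the label set is not documented here (presumably NADI-style country codes), so the example simply prints whatever `id2label` mapping was saved with the fine-tuned head:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumption: the checkpoint is available under this repo id.
model_id = "ashabrawy/NADI-Country-DID-MARBERT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Example Arabic input for dialect identification.
text = "شلونك؟ شخبارك اليوم؟"

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = logits.argmax(dim=-1).item()
# id2label is whatever was saved with the classifier head; this card does not
# document the label names.
print(model.config.id2label.get(pred_id, pred_id))
```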
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- num_epochs: 20
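
These hyperparameters map directly onto `TrainingArguments`. Below is a hedged reproduction sketch using the Hugging Face `Trainer`; since the training data is undocumented, the dataset, `num_labels`, and the F1 averaging mode are placeholders and assumptions, not the author's actual setup:

```python
import numpy as np
from datasets import Dataset
from sklearn.metrics import accuracy_score, f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "UBC-NLP/MARBERT"
num_labels = 21  # placeholder: the actual label set is not documented in this card

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model, num_labels=num_labels
)

# Dummy stand-in data; replace with the real (undocumented) dataset splits.
raw = Dataset.from_dict({
    "text": ["مرحبا، كيف الحال؟", "شلونك اليوم؟"],
    "label": [0, 1],
})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

ds = raw.map(tokenize, batched=True)
train_ds = eval_ds = ds  # placeholders for the real train/validation splits

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Macro averaging is assumed; the card does not state how F1 was computed.
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="macro"),
    }

args = TrainingArguments(
    output_dir="NADI-Country-DID-MARBERT",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch_fused",  # betas=(0.9, 0.999), epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="epoch",
    logging_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    processing_class=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
```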
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| 2.3836 | 1.0 | 883 | 2.4948 | 0.2970 | 0.2699 |
| 1.1646 | 2.0 | 1766 | 2.8158 | 0.2919 | 0.2773 |
| 0.5628 | 3.0 | 2649 | 3.2616 | 0.2949 | 0.2811 |
| 0.2662 | 4.0 | 3532 | 3.8731 | 0.2737 | 0.2639 |
| 0.1422 | 5.0 | 4415 | 4.6939 | 0.3086 | 0.2866 |
| 0.0654 | 6.0 | 5298 | 5.3872 | 0.3025 | 0.2825 |
| 0.0356 | 7.0 | 6181 | 5.7370 | 0.3136 | 0.2921 |
| 0.0291 | 8.0 | 7064 | 6.1477 | 0.2763 | 0.2684 |
| 0.0134 | 9.0 | 7947 | 6.4828 | 0.2924 | 0.2748 |
| 0.0153 | 10.0 | 8830 | 6.6012 | 0.2874 | 0.2717 |

Note that although `num_epochs` was set to 20, results are logged only through epoch 10, and the headline figures above correspond to the final (epoch 10) checkpoint rather than the best one. Validation loss rises steadily after epoch 1 while training loss approaches zero, indicating overfitting; the best validation F1 (0.2921) occurs at epoch 7.
### Framework versions
- Transformers 4.55.2
- Pytorch 2.8.0+cu128
- Datasets 2.16.0
- Tokenizers 0.21.4