Rooberta_combo_v2
This model is a fine-tuned version of FacebookAI/xlm-roberta-base (the training dataset is not specified). It achieves the following results on the evaluation set:
- Loss: 0.1852
- Accuracy: 0.931
- AUC: 0.981
- Precision: 0.959
- Recall: 0.927
- F1: 0.943
- F1-macro: 0.927
- F1-micro: 0.931
- F1-weighted: 0.931
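The reported metrics are internally consistent: F1 is the harmonic mean of precision and recall, and for single-label classification micro-F1 equals accuracy (both 0.931 above). A quick check of the reported numbers:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported evaluation precision and recall from this card.
f1 = f1_score(0.959, 0.927)
print(round(f1, 3))  # 0.943, matching the reported F1
```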
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
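The total_train_batch_size of 32 follows from gradient accumulation: gradients from 4 micro-batches of 8 examples are accumulated before each optimizer step. A minimal sketch of that arithmetic:

```python
train_batch_size = 8              # per-device micro-batch size
gradient_accumulation_steps = 4   # micro-batches accumulated per optimizer step

# Effective (total) train batch size seen by each optimizer update.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32
```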
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | AUC | Precision | Recall | F1 | F1-macro | F1-micro | F1-weighted |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4768 | 0.2661 | 500 | 0.2931 | 0.881 | 0.944 | 0.901 | 0.907 | 0.904 | 0.874 | 0.881 | 0.881 |
| 0.3246 | 0.5323 | 1000 | 0.2694 | 0.893 | 0.957 | 0.897 | 0.934 | 0.915 | 0.885 | 0.893 | 0.892 |
| 0.2781 | 0.7984 | 1500 | 0.2299 | 0.909 | 0.968 | 0.927 | 0.926 | 0.926 | 0.904 | 0.909 | 0.909 |
| 0.2580 | 1.0644 | 2000 | 0.2533 | 0.902 | 0.969 | 0.961 | 0.877 | 0.917 | 0.899 | 0.902 | 0.903 |
| 0.2354 | 1.3305 | 2500 | 0.2171 | 0.917 | 0.972 | 0.937 | 0.928 | 0.933 | 0.912 | 0.917 | 0.917 |
| 0.2243 | 1.5967 | 3000 | 0.2068 | 0.920 | 0.975 | 0.945 | 0.926 | 0.935 | 0.916 | 0.920 | 0.921 |
| 0.2219 | 1.8628 | 3500 | 0.2015 | 0.921 | 0.975 | 0.940 | 0.932 | 0.936 | 0.917 | 0.921 | 0.921 |
| 0.2114 | 2.1288 | 4000 | 0.2002 | 0.923 | 0.977 | 0.950 | 0.924 | 0.937 | 0.919 | 0.923 | 0.923 |
| 0.2051 | 2.3949 | 4500 | 0.2040 | 0.923 | 0.977 | 0.957 | 0.917 | 0.937 | 0.920 | 0.923 | 0.924 |
| 0.1952 | 2.6611 | 5000 | 0.1837 | 0.929 | 0.980 | 0.951 | 0.933 | 0.942 | 0.925 | 0.929 | 0.929 |
| 0.1933 | 2.9272 | 5500 | 0.1818 | 0.929 | 0.980 | 0.953 | 0.931 | 0.942 | 0.925 | 0.929 | 0.929 |
| 0.1918 | 3.1932 | 6000 | 0.2074 | 0.922 | 0.979 | 0.966 | 0.905 | 0.935 | 0.918 | 0.922 | 0.922 |
| 0.1820 | 3.4593 | 6500 | 0.1949 | 0.930 | 0.980 | 0.960 | 0.927 | 0.943 | 0.927 | 0.930 | 0.931 |
| 0.1832 | 3.7255 | 7000 | 0.1826 | 0.930 | 0.981 | 0.954 | 0.931 | 0.942 | 0.926 | 0.930 | 0.930 |
| 0.1801 | 3.9916 | 7500 | 0.2028 | 0.924 | 0.981 | 0.967 | 0.908 | 0.937 | 0.921 | 0.924 | 0.925 |
| 0.1805 | 4.2576 | 8000 | 0.1861 | 0.931 | 0.981 | 0.958 | 0.929 | 0.943 | 0.927 | 0.931 | 0.931 |
| 0.1749 | 4.5238 | 8500 | 0.1789 | 0.932 | 0.982 | 0.960 | 0.929 | 0.944 | 0.928 | 0.932 | 0.932 |
| 0.1764 | 4.7899 | 9000 | 0.1852 | 0.931 | 0.981 | 0.959 | 0.927 | 0.943 | 0.927 | 0.931 | 0.931 |
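Assuming the Epoch column tracks fractional epochs and the Step column counts optimizer steps, the log implies roughly 1,879 optimizer steps per epoch, which at the effective batch size of 32 suggests a training set of around 60k examples. This is an inference from the log, not a figure documented in the card:

```python
# First logged row of the table: optimizer step 500 at epoch 0.2661.
steps_per_epoch = 500 / 0.2661          # ~1879 optimizer steps per epoch
effective_batch_size = 32               # from the hyperparameters above

# Rough training-set size implied by the log (an estimate, not a documented value).
approx_train_examples = steps_per_epoch * effective_batch_size  # ~60k examples
print(round(steps_per_epoch), round(approx_train_examples))
```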
Framework versions
- Transformers 4.53.2
- Pytorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.2
Model tree for adity12345/Rooberta_combo_v2
- Base model: FacebookAI/xlm-roberta-base