# Roberta_combo_v1
This model is a fine-tuned version of FacebookAI/xlm-roberta-base on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

- Loss: 0.5245
- Accuracy: 0.787
- AUC: 0.888
- Precision: 0.775
- Recall: 0.868
- F1: 0.819
- F1-macro: 0.781
- F1-micro: 0.787
- F1-weighted: 0.785
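
The card ships without a usage snippet, so here is a minimal inference sketch. It assumes the hub id adity12345/Roberta_combo_v1 and a single-label text-classification head, binary judging by the precision/recall/AUC metrics above; the label names are undocumented, so the pipeline falls back to generic `LABEL_0`/`LABEL_1` ids unless the model config defines `id2label`.

```python
from transformers import pipeline

# Minimal inference sketch. The task labels are undocumented on this card,
# so outputs use the generic LABEL_0 / LABEL_1 ids unless the model config
# defines id2label.
classifier = pipeline("text-classification", model="adity12345/Roberta_combo_v1")

print(classifier("An example sentence to classify."))
# e.g. [{'label': 'LABEL_1', 'score': 0.93}]  -- illustrative output only
```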

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
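
For reference, the values above map onto `TrainingArguments` roughly as in the sketch below. This is not the published training script: the output directory and the 500-step evaluation cadence (inferred from the results table) are assumptions.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameter list above.
# output_dir and the eval cadence are assumptions; everything else is
# taken directly from the list.
training_args = TrainingArguments(
    output_dir="Roberta_combo_v1",   # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,   # total train batch size: 8 * 4 = 32
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    optim="adamw_torch",             # AdamW, betas=(0.9, 0.999), eps=1e-8
    fp16=True,                       # native AMP mixed precision
    eval_strategy="steps",
    eval_steps=500,                  # matches the 500-step rows in the results table
)
```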

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | AUC | Precision | Recall | F1 | F1-macro | F1-micro | F1-weighted |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.5526 | 0.1028 | 500 | 0.4682 | 0.749 | 0.847 | 0.732 | 0.863 | 0.792 | 0.737 | 0.749 | 0.744 |
| 0.462 | 0.2057 | 1000 | 0.4818 | 0.759 | 0.857 | 0.738 | 0.879 | 0.802 | 0.747 | 0.759 | 0.753 |
| 0.4466 | 0.3085 | 1500 | 0.4817 | 0.747 | 0.852 | 0.751 | 0.815 | 0.782 | 0.741 | 0.747 | 0.745 |
| 0.43 | 0.4114 | 2000 | 0.4300 | 0.763 | 0.87 | 0.737 | 0.891 | 0.807 | 0.75 | 0.763 | 0.756 |
| 0.4215 | 0.5142 | 2500 | 0.4412 | 0.763 | 0.869 | 0.725 | 0.924 | 0.812 | 0.746 | 0.763 | 0.753 |
| 0.4148 | 0.6170 | 3000 | 0.4229 | 0.774 | 0.879 | 0.746 | 0.899 | 0.815 | 0.762 | 0.774 | 0.768 |
| 0.4056 | 0.7199 | 3500 | 0.4351 | 0.772 | 0.88 | 0.747 | 0.89 | 0.813 | 0.761 | 0.772 | 0.767 |
| 0.4094 | 0.8227 | 4000 | 0.4184 | 0.777 | 0.883 | 0.743 | 0.913 | 0.819 | 0.764 | 0.777 | 0.77 |
| 0.403 | 0.9255 | 4500 | 0.4122 | 0.778 | 0.884 | 0.747 | 0.906 | 0.819 | 0.766 | 0.778 | 0.772 |
| 0.3905 | 1.0284 | 5000 | 0.4450 | 0.776 | 0.884 | 0.738 | 0.924 | 0.821 | 0.761 | 0.776 | 0.768 |
| 0.3826 | 1.1312 | 5500 | 0.4171 | 0.78 | 0.885 | 0.754 | 0.895 | 0.819 | 0.769 | 0.78 | 0.775 |
| 0.3743 | 1.2341 | 6000 | 0.4093 | 0.775 | 0.883 | 0.741 | 0.915 | 0.819 | 0.762 | 0.775 | 0.768 |
| 0.3716 | 1.3369 | 6500 | 0.4804 | 0.77 | 0.876 | 0.739 | 0.904 | 0.813 | 0.756 | 0.77 | 0.763 |
| 0.3748 | 1.4397 | 7000 | 0.4300 | 0.779 | 0.883 | 0.752 | 0.898 | 0.818 | 0.768 | 0.779 | 0.774 |
| 0.3741 | 1.5426 | 7500 | 0.4144 | 0.779 | 0.886 | 0.774 | 0.85 | 0.81 | 0.773 | 0.779 | 0.777 |
| 0.3732 | 1.6454 | 8000 | 0.4029 | 0.782 | 0.887 | 0.76 | 0.888 | 0.819 | 0.773 | 0.782 | 0.778 |
| 0.3622 | 1.7483 | 8500 | 0.4722 | 0.78 | 0.885 | 0.764 | 0.874 | 0.815 | 0.772 | 0.78 | 0.777 |
| 0.3738 | 1.8511 | 9000 | 0.4206 | 0.78 | 0.886 | 0.757 | 0.89 | 0.818 | 0.77 | 0.78 | 0.775 |
| 0.3669 | 1.9539 | 9500 | 0.4150 | 0.784 | 0.891 | 0.77 | 0.872 | 0.818 | 0.777 | 0.784 | 0.781 |
| 0.3562 | 2.0568 | 10000 | 0.4142 | 0.786 | 0.89 | 0.761 | 0.897 | 0.823 | 0.776 | 0.786 | 0.782 |
| 0.3333 | 2.1596 | 10500 | 0.4845 | 0.782 | 0.887 | 0.751 | 0.908 | 0.822 | 0.77 | 0.782 | 0.776 |
| 0.3396 | 2.2624 | 11000 | 0.4334 | 0.783 | 0.888 | 0.758 | 0.896 | 0.821 | 0.773 | 0.783 | 0.779 |
| 0.339 | 2.3653 | 11500 | 0.4221 | 0.786 | 0.889 | 0.776 | 0.863 | 0.818 | 0.78 | 0.786 | 0.784 |
| 0.33 | 2.4681 | 12000 | 0.4157 | 0.785 | 0.889 | 0.792 | 0.83 | 0.811 | 0.781 | 0.785 | 0.784 |
| 0.3416 | 2.5710 | 12500 | 0.4277 | 0.785 | 0.889 | 0.766 | 0.882 | 0.82 | 0.776 | 0.785 | 0.781 |
| 0.3439 | 2.6738 | 13000 | 0.4299 | 0.784 | 0.888 | 0.784 | 0.844 | 0.813 | 0.779 | 0.784 | 0.783 |
| 0.3468 | 2.7766 | 13500 | 0.4138 | 0.785 | 0.888 | 0.77 | 0.872 | 0.818 | 0.777 | 0.785 | 0.782 |
| 0.3413 | 2.8795 | 14000 | 0.4512 | 0.786 | 0.89 | 0.784 | 0.85 | 0.815 | 0.781 | 0.786 | 0.785 |
| 0.3334 | 2.9823 | 14500 | 0.4380 | 0.783 | 0.888 | 0.765 | 0.879 | 0.818 | 0.775 | 0.783 | 0.779 |
| 0.3094 | 3.0852 | 15000 | 0.4798 | 0.785 | 0.889 | 0.77 | 0.873 | 0.818 | 0.777 | 0.785 | 0.782 |
| 0.3156 | 3.1880 | 15500 | 0.4743 | 0.784 | 0.887 | 0.793 | 0.826 | 0.809 | 0.78 | 0.784 | 0.783 |
| 0.3125 | 3.2908 | 16000 | 0.4991 | 0.784 | 0.888 | 0.764 | 0.884 | 0.819 | 0.775 | 0.784 | 0.78 |
| 0.3097 | 3.3937 | 16500 | 0.4734 | 0.783 | 0.887 | 0.763 | 0.881 | 0.818 | 0.774 | 0.783 | 0.779 |
| 0.3082 | 3.4965 | 17000 | 0.4715 | 0.784 | 0.887 | 0.783 | 0.846 | 0.813 | 0.779 | 0.784 | 0.783 |
| 0.3024 | 3.5993 | 17500 | 0.4857 | 0.784 | 0.889 | 0.763 | 0.885 | 0.82 | 0.775 | 0.784 | 0.78 |
| 0.3087 | 3.7022 | 18000 | 0.5177 | 0.781 | 0.887 | 0.756 | 0.894 | 0.819 | 0.771 | 0.781 | 0.776 |
| 0.3064 | 3.8050 | 18500 | 0.4792 | 0.785 | 0.887 | 0.77 | 0.873 | 0.818 | 0.777 | 0.785 | 0.782 |
| 0.3031 | 3.9079 | 19000 | 0.4738 | 0.786 | 0.889 | 0.791 | 0.835 | 0.813 | 0.782 | 0.786 | 0.785 |
| 0.305 | 4.0107 | 19500 | 0.4908 | 0.786 | 0.889 | 0.784 | 0.848 | 0.814 | 0.78 | 0.786 | 0.784 |
| 0.2806 | 4.1135 | 20000 | 0.5368 | 0.785 | 0.888 | 0.768 | 0.879 | 0.819 | 0.777 | 0.785 | 0.782 |
| 0.2795 | 4.2164 | 20500 | 0.5228 | 0.786 | 0.889 | 0.789 | 0.838 | 0.813 | 0.781 | 0.786 | 0.785 |
| 0.2835 | 4.3192 | 21000 | 0.5038 | 0.785 | 0.888 | 0.773 | 0.867 | 0.817 | 0.778 | 0.785 | 0.782 |
| 0.2776 | 4.4220 | 21500 | 0.5283 | 0.784 | 0.887 | 0.791 | 0.831 | 0.81 | 0.78 | 0.784 | 0.783 |
| 0.2912 | 4.5249 | 22000 | 0.5161 | 0.786 | 0.888 | 0.773 | 0.869 | 0.818 | 0.779 | 0.786 | 0.783 |
| 0.2766 | 4.6277 | 22500 | 0.5228 | 0.785 | 0.888 | 0.784 | 0.845 | 0.813 | 0.78 | 0.785 | 0.783 |
| 0.2778 | 4.7306 | 23000 | 0.5270 | 0.787 | 0.888 | 0.772 | 0.874 | 0.82 | 0.779 | 0.787 | 0.784 |
| 0.271 | 4.8334 | 23500 | 0.5362 | 0.786 | 0.888 | 0.777 | 0.862 | 0.817 | 0.78 | 0.786 | 0.784 |
| 0.2835 | 4.9362 | 24000 | 0.5245 | 0.787 | 0.888 | 0.775 | 0.868 | 0.819 | 0.781 | 0.787 | 0.785 |
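
The metric columns can be reproduced with scikit-learn. Below is a sketch of a `compute_metrics` function for the `Trainer`, assuming binary classification (not stated explicitly on this card) with AUC computed from the positive-class probability.

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    f1_score,
    precision_score,
    recall_score,
    roc_auc_score,
)

def compute_metrics(eval_pred):
    """Sketch reproducing the metric columns above (binary task assumed)."""
    logits, labels = eval_pred
    # Numerically stable softmax to get class probabilities for AUC.
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "auc": roc_auc_score(labels, probs[:, 1]),
        "precision": precision_score(labels, preds),
        "recall": recall_score(labels, preds),
        "f1": f1_score(labels, preds),
        "f1-macro": f1_score(labels, preds, average="macro"),
        "f1-micro": f1_score(labels, preds, average="micro"),
        "f1-weighted": f1_score(labels, preds, average="weighted"),
    }
```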

### Framework versions

- Transformers 4.53.1
- PyTorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.2