# Roberta_combo_1.6lakh

This model is a fine-tuned version of FacebookAI/xlm-roberta-base on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.4440
- Accuracy: 0.62
- Auc: 0.608
- Precision: 0.581
- Recall: 0.814
- F1: 0.678
- F1-macro: 0.608
- F1-micro: 0.62
- F1-weighted: 0.607
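As a quick consistency check, the reported F1 can be reproduced from the reported precision and recall above (F1 is their harmonic mean):

```python
# Reported evaluation precision and recall (from the metrics above).
precision, recall = 0.581, 0.814

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))  # 0.678, matching the reported F1
```

Note also that F1-micro (0.62) equals accuracy (0.62), as expected for single-label classification.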
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP
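The hyperparameters above map onto a transformers `TrainingArguments` configuration. This is a hedged sketch, not the author's actual training script (which is not published); the `output_dir` name is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Roberta_combo_1.6lakh",  # placeholder output directory
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 8 * 4 = 32
    lr_scheduler_type="linear",
    num_train_epochs=2,
    fp16=True,                       # native AMP mixed-precision training
    optim="adamw_torch",             # AdamW, betas=(0.9, 0.999), eps=1e-08
)
```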
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Auc | Precision | Recall | F1 | F1-macro | F1-micro | F1-weighted |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.5271 | 0.1935 | 1000 | 1.0367 | 0.595 | 0.57 | 0.567 | 0.751 | 0.646 | 0.587 | 0.595 | 0.586 |
| 0.439 | 0.3870 | 2000 | 1.2336 | 0.594 | 0.576 | 0.558 | 0.832 | 0.668 | 0.573 | 0.594 | 0.571 |
| 0.4166 | 0.5805 | 3000 | 1.3620 | 0.606 | 0.57 | 0.569 | 0.814 | 0.67 | 0.591 | 0.606 | 0.59 |
| 0.3997 | 0.7740 | 4000 | 1.3161 | 0.61 | 0.59 | 0.575 | 0.787 | 0.664 | 0.599 | 0.61 | 0.598 |
| 0.391 | 0.9675 | 5000 | 1.3359 | 0.613 | 0.596 | 0.573 | 0.833 | 0.679 | 0.596 | 0.613 | 0.595 |
| 0.3618 | 1.1610 | 6000 | 1.4006 | 0.617 | 0.603 | 0.578 | 0.817 | 0.677 | 0.603 | 0.617 | 0.602 |
| 0.3564 | 1.3545 | 7000 | 1.3645 | 0.619 | 0.612 | 0.584 | 0.782 | 0.669 | 0.611 | 0.619 | 0.61 |
| 0.352 | 1.5480 | 8000 | 1.4672 | 0.617 | 0.601 | 0.582 | 0.785 | 0.668 | 0.608 | 0.617 | 0.606 |
| 0.3465 | 1.7415 | 9000 | 1.4327 | 0.62 | 0.609 | 0.586 | 0.771 | 0.666 | 0.613 | 0.62 | 0.612 |
| 0.3413 | 1.9350 | 10000 | 1.4440 | 0.62 | 0.608 | 0.581 | 0.814 | 0.678 | 0.608 | 0.62 | 0.607 |
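The step/epoch ratio in the table, combined with the effective train batch size of 32, implies roughly 1.65 lakh (~165k) training examples, consistent with the "1.6lakh" in the model name. A quick back-of-the-envelope check:

```python
# From the last row of the table: step 10000 corresponds to epoch 1.9350.
step, epoch = 10000, 1.9350
total_train_batch_size = 32  # from the training hyperparameters

steps_per_epoch = step / epoch
examples = steps_per_epoch * total_train_batch_size
print(round(examples))  # ~165375 training examples
```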
### Framework versions
- Transformers 4.55.2
- Pytorch 2.6.0+cu124
- Datasets 4.0.0
- Tokenizers 0.21.4
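Assuming a pip-based setup, the versions above can be pinned when reproducing the environment; the CUDA build of PyTorch typically comes from the PyTorch wheel index rather than PyPI:

```shell
pip install transformers==4.55.2 datasets==4.0.0 tokenizers==0.21.4
# PyTorch 2.6.0 with CUDA 12.4; see pytorch.org for the install command matching your system
pip install torch==2.6.0 --index-url https://download.pytorch.org/whl/cu124
```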
Model repository: adity12345/Roberta_combo_1.6lakh

Base model: FacebookAI/xlm-roberta-base