# distrilbert_full_parameter_finetune_crisismmd
This model is a full-parameter fine-tuned version of distilbert/distilbert-base-uncased; the repository name points to the CrisisMMD dataset, though the dataset is not documented in this card. It achieves the following results on the evaluation set:
- Loss: 0.3472
- Accuracy: 84.87%
- Precision: 84.73%
- Recall: 84.87%
- F1: 84.42%
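These appear to be support-weighted ("weighted") averages, which also explains why Recall matches Accuracy exactly at every row: weighting each class's recall by its share of the evaluation set algebraically reduces to overall accuracy. A minimal pure-Python sketch of that averaging (scikit-learn's `average="weighted"` behaves the same way; `weighted_metrics` is an illustrative helper, not from any library):

```python
from collections import Counter

def weighted_metrics(y_true, y_pred):
    """Support-weighted precision/recall/F1 plus accuracy (the averaging
    scheme these eval numbers appear to use)."""
    n = len(y_true)
    support = Counter(y_true)
    precision = recall = f1 = 0.0
    for c in support:                        # iterate over the true classes
        tp = sum(t == p == c for t, p in zip(y_true, y_pred))
        predicted = sum(p == c for p in y_pred)
        p_c = tp / predicted if predicted else 0.0
        r_c = tp / support[c]                # per-class recall
        f_c = 2 * p_c * r_c / (p_c + r_c) if p_c + r_c else 0.0
        w = support[c] / n                   # class share of the eval set
        precision += w * p_c
        recall += w * r_c
        f1 += w * f_c
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / n
    return accuracy, precision, recall, f1
```

Note that `recall` and `accuracy` coincide by construction: the support weights cancel the per-class denominators.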
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1
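With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from 5e-05 to 0 over the run, as in `get_linear_schedule_with_warmup` from Transformers. A stand-alone sketch of that schedule (`linear_lr` is a hypothetical helper written for illustration, assuming zero warmup steps by default):

```python
def linear_lr(step, total_steps, base_lr=5e-5, warmup_steps=0):
    """Linear schedule with optional warmup: LR rises to base_lr over
    warmup_steps, then decays linearly to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))
```

With `num_epochs: 1`, the decay runs over a single pass of the data, so the LR at the midpoint of training is half the initial value.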
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|---|---|
| 0.5759 | 0.0482 | 50 | 0.4439 | 80.61% | 80.68% | 80.61% | 80.64% |
| 0.4797 | 0.0964 | 100 | 0.4242 | 81.50% | 81.28% | 81.50% | 80.67% |
| 0.4362 | 0.1446 | 150 | 0.4136 | 82.77% | 82.97% | 82.77% | 82.86% |
| 0.4631 | 0.1929 | 200 | 0.4900 | 81.25% | 82.43% | 81.25% | 79.54% |
| 0.4326 | 0.2411 | 250 | 0.4333 | 83.28% | 83.86% | 83.28% | 82.20% |
| 0.3911 | 0.2893 | 300 | 0.4526 | 82.45% | 83.21% | 82.45% | 81.17% |
| 0.4426 | 0.3375 | 350 | 0.4395 | 83.22% | 84.14% | 83.22% | 83.48% |
| 0.4022 | 0.3857 | 400 | 0.3950 | 83.47% | 83.21% | 83.47% | 83.25% |
| 0.3937 | 0.4339 | 450 | 0.3667 | 84.55% | 84.36% | 84.55% | 84.42% |
| 0.3773 | 0.4822 | 500 | 0.3843 | 84.74% | 84.53% | 84.74% | 84.38% |
| 0.451 | 0.5304 | 550 | 0.3858 | 82.52% | 83.49% | 82.52% | 81.15% |
| 0.3488 | 0.5786 | 600 | 0.3840 | 84.68% | 84.54% | 84.68% | 84.21% |
| 0.3773 | 0.6268 | 650 | 0.3851 | 84.23% | 84.27% | 84.23% | 83.57% |
| 0.3442 | 0.6750 | 700 | 0.3492 | 85.31% | 85.10% | 85.31% | 85.08% |
| 0.3348 | 0.7232 | 750 | 0.4122 | 84.11% | 84.64% | 84.11% | 83.17% |
| 0.4304 | 0.7715 | 800 | 0.3499 | 84.81% | 84.80% | 84.81% | 84.24% |
| 0.3724 | 0.8197 | 850 | 0.3569 | 84.87% | 84.81% | 84.87% | 84.34% |
| 0.3212 | 0.8679 | 900 | 0.3706 | 84.62% | 84.86% | 84.62% | 83.88% |
| 0.4108 | 0.9161 | 950 | 0.3835 | 84.23% | 84.65% | 84.23% | 83.36% |
| 0.3334 | 0.9643 | 1000 | 0.3472 | 84.87% | 84.73% | 84.87% | 84.42% |
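A back-of-envelope read of the log: the last evaluation (step 1000) lands at epoch 0.9643, which pins down the approximate number of optimizer steps per epoch and, with `train_batch_size: 8`, the rough size of the training set:

```python
# Step 1000 corresponds to epoch 0.9643 in the table above.
steps_per_epoch = round(1000 / 0.9643)  # ~1037 optimizer steps per epoch
train_examples = steps_per_epoch * 8    # batch size 8 -> ~8296 training examples
print(steps_per_epoch, train_examples)
```

The final checkpoint (step 1000, validation loss 0.3472) is also the best on validation loss, so the headline metrics above come from the end of training rather than an earlier checkpoint.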
### Framework versions
- Transformers 4.50.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
## Model tree for Donny-Guo/distrilbert_full_parameter_finetune_crisismmd

Base model: distilbert/distilbert-base-uncased