# Finetuned_Final_LM_200k_v3
This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set (see the metric-computation sketch after this list):
- Loss: 2.4006
- Accuracy: 0.8429
- F1: 0.8410
- Precision: 0.8602
- Recall: 0.8429
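
Accuracy and Recall are identical above, which is consistent with support-weighted averaging, where average recall reduces to overall accuracy. The actual metric callback is not documented in this card; the following is a minimal sketch of a `compute_metrics` function, assuming a scikit-learn-based setup, that would report metrics in this format:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # Assumed callback: weighted averaging makes Recall equal Accuracy,
    # matching the numbers reported above.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```
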
## Model description
More information needed
Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 2
- mixed_precision_training: Native AMP
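
As a reproduction aid, here is a minimal sketch of how the values above map onto `transformers.TrainingArguments`; `output_dir` is a hypothetical placeholder, and the model and dataset wiring is omitted since neither is documented in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Finetuned_Final_LM_200k_v3",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=2,  # total train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=2,
    fp16=True,  # "Native AMP" mixed precision
    # Adam betas (0.9, 0.999) and epsilon 1e-08 are the Trainer defaults,
    # so no explicit optimizer arguments are needed.
)
```

These arguments would then be passed to a `Trainer` together with the model, the train/eval datasets, and a metric callback such as the one sketched above.
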
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.038 | 0.08 | 500 | 1.6709 | 0.8505 | 0.8483 | 0.8713 | 0.8505 |
| 0.3847 | 0.16 | 1000 | 2.0643 | 0.8403 | 0.8385 | 0.8556 | 0.8403 |
| 0.4135 | 0.24 | 1500 | 1.7622 | 0.8429 | 0.8399 | 0.8706 | 0.8429 |
| 0.3496 | 0.32 | 2000 | 2.2005 | 0.8482 | 0.8459 | 0.8701 | 0.8482 |
| 0.2201 | 0.4 | 2500 | 2.2234 | 0.8467 | 0.8445 | 0.8673 | 0.8467 |
| 0.306 | 0.48 | 3000 | 2.0981 | 0.8395 | 0.8375 | 0.8567 | 0.8395 |
| 0.2741 | 0.56 | 3500 | 2.4972 | 0.8421 | 0.8401 | 0.8607 | 0.8421 |
| 0.2189 | 0.64 | 4000 | 2.1950 | 0.8459 | 0.8440 | 0.8641 | 0.8459 |
| 0.1442 | 0.72 | 4500 | 2.3179 | 0.8418 | 0.8399 | 0.8589 | 0.8418 |
| 0.3744 | 0.8 | 5000 | 2.3932 | 0.8395 | 0.8376 | 0.8565 | 0.8395 |
| 0.2153 | 0.88 | 5500 | 2.3213 | 0.8406 | 0.8384 | 0.8601 | 0.8406 |
| 0.3273 | 0.96 | 6000 | 2.2511 | 0.8467 | 0.8444 | 0.8682 | 0.8467 |
| 0.0371 | 1.04 | 6500 | 2.4062 | 0.8406 | 0.8388 | 0.8573 | 0.8406 |
| 0.1964 | 1.12 | 7000 | 2.3856 | 0.8421 | 0.8401 | 0.8604 | 0.8421 |
| 0.3516 | 1.2 | 7500 | 2.3023 | 0.8444 | 0.8424 | 0.8631 | 0.8444 |
| 0.1166 | 1.28 | 8000 | 2.3752 | 0.8414 | 0.8394 | 0.8594 | 0.8414 |
| 0.2922 | 1.36 | 8500 | 2.4364 | 0.8425 | 0.8405 | 0.8612 | 0.8425 |
| 0.2452 | 1.44 | 9000 | 2.3738 | 0.8421 | 0.8402 | 0.8599 | 0.8421 |
| 0.1247 | 1.52 | 9500 | 2.4287 | 0.8421 | 0.8401 | 0.8604 | 0.8421 |
| 0.0475 | 1.6 | 10000 | 2.4438 | 0.8433 | 0.8413 | 0.8610 | 0.8433 |
| 0.2052 | 1.68 | 10500 | 2.4006 | 0.8429 | 0.8410 | 0.8602 | 0.8429 |
### Framework versions
- Transformers 4.37.0
- PyTorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1