# train_mrpc_1754652143
This model is a fine-tuned version of meta-llama/Meta-Llama-3-8B-Instruct on the mrpc dataset. It achieves the following results on the evaluation set:
- Loss: 0.1832
- Num Input Tokens Seen: 3388032
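The adapter can be loaded on top of the base model for inference. The snippet below is a minimal sketch, assuming this repository hosts a PEFT adapter for the base causal LM; the MRPC-style paraphrase prompt is illustrative only and may not match the exact template used during training.

```python
# Minimal inference sketch (assumes this repo is a PEFT adapter for the base model;
# the prompt format is an illustrative guess, not the confirmed training template).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Meta-Llama-3-8B-Instruct"
adapter_id = "rbelanec/train_mrpc_1754652143"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

prompt = (
    "Do the following two sentences mean the same thing? Answer yes or no.\n"
    "Sentence 1: The company posted record profits this quarter.\n"
    "Sentence 2: Quarterly profits at the company reached an all-time high.\n"
    "Answer:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=5)
# Decode only the newly generated tokens after the prompt.
print(tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```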
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
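For reference, these values map onto a standard Hugging Face `TrainingArguments` configuration roughly as sketched below. This is an assumption about the training script, which is not included here; dataset preprocessing and the PEFT/LoRA settings are omitted.

```python
# Hedged sketch: how the listed hyperparameters would look as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="train_mrpc_1754652143",  # assumed output directory name
    learning_rate=5e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=123,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```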
### Training results
| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
| 5.8127 | 0.5 | 413 | 5.5490 | 170336 |
| 0.5016 | 1.0 | 826 | 0.4616 | 339568 |
| 0.2334 | 1.5 | 1239 | 0.2614 | 509456 |
| 0.2324 | 2.0 | 1652 | 0.2292 | 678688 |
| 0.2087 | 2.5 | 2065 | 0.2304 | 847104 |
| 0.2261 | 3.0 | 2478 | 0.2181 | 1017368 |
| 0.2267 | 3.5 | 2891 | 0.2109 | 1187704 |
| 0.2226 | 4.0 | 3304 | 0.1965 | 1356744 |
| 0.164 | 4.5 | 3717 | 0.1991 | 1526088 |
| 0.1817 | 5.0 | 4130 | 0.2025 | 1694912 |
| 0.1843 | 5.5 | 4543 | 0.1930 | 1864000 |
| 0.1636 | 6.0 | 4956 | 0.1893 | 2033992 |
| 0.1749 | 6.5 | 5369 | 0.1857 | 2203208 |
| 0.1559 | 7.0 | 5782 | 0.1867 | 2372464 |
| 0.1646 | 7.5 | 6195 | 0.1860 | 2540880 |
| 0.1817 | 8.0 | 6608 | 0.1832 | 2710624 |
| 0.1882 | 8.5 | 7021 | 0.1861 | 2880128 |
| 0.1904 | 9.0 | 7434 | 0.1856 | 3049392 |
| 0.1687 | 9.5 | 7847 | 0.1861 | 3218352 |
| 0.2114 | 10.0 | 8260 | 0.1846 | 3388032 |
### Framework versions
- PEFT 0.15.2
- Transformers 4.51.3
- Pytorch 2.8.0+cu128
- Datasets 3.6.0
- Tokenizers 0.21.1