---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: GPT2_more_function_67
  results: []
---
# GPT2_more_function_67
This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 4.3501
- Accuracy: 0.2680
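For context, if the reported evaluation loss is the mean per-token cross-entropy in nats (the usual setup for a causal language model trained with the `Trainer`), it corresponds to a perplexity of roughly exp(4.3501) ≈ 77.5. A minimal sketch of the conversion, under that assumption:

```python
import math

# Assumption: the evaluation loss above is the mean per-token
# cross-entropy in nats, so perplexity is its exponential.
eval_loss = 4.3501
perplexity = math.exp(eval_loss)
print(f"perplexity ≈ {perplexity:.1f}")  # ≈ 77.5
```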
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 67
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- num_epochs: 10
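As referenced above, this is a minimal sketch of how these hyperparameters could be expressed as `TrainingArguments` for the Hugging Face `Trainer`. The output directory, evaluation/logging strategies, and the interpretation of `train_batch_size` as a per-device value are assumptions, not taken from this card.

```python
from transformers import TrainingArguments

# Hypothetical output directory; only the hyperparameter values below
# are taken from this card.
training_args = TrainingArguments(
    output_dir="GPT2_more_function_67",
    learning_rate=1e-4,
    per_device_train_batch_size=128,   # assumption: card's train_batch_size is per device
    per_device_eval_batch_size=128,
    seed=67,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10_000,
    num_train_epochs=10,
    evaluation_strategy="epoch",       # assumption: results table shows one eval per epoch
    logging_strategy="epoch",
)
```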
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 5.634 | 1.0 | 6417 | 5.4033 | 0.1902 |
| 4.9862 | 2.0 | 12834 | 4.8415 | 0.2243 |
| 4.7151 | 3.0 | 19251 | 4.6391 | 0.2410 |
| 4.5832 | 4.0 | 25668 | 4.5397 | 0.2486 |
| 4.4844 | 5.0 | 32085 | 4.4759 | 0.2552 |
| 4.4117 | 6.0 | 38502 | 4.4317 | 0.2589 |
| 4.3505 | 7.0 | 44919 | 4.4006 | 0.2628 |
| 4.2941 | 8.0 | 51336 | 4.3753 | 0.2652 |
| 4.2479 | 9.0 | 57753 | 4.3582 | 0.2669 |
| 4.2036 | 10.0 | 64170 | 4.3501 | 0.2680 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.9.1+cu128
- Datasets 4.1.1
- Tokenizers 0.13.3