---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: GPT2_bigram_function_53
  results: []
---
# GPT2_bigram_function_53

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 3.6936
- Accuracy: 0.3579
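
As a usage note, the sketch below shows one way to load the checkpoint for inference with Transformers; the identifier `GPT2_bigram_function_53` is an assumption here (substitute the actual Hub repo id or local output directory).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical identifier: replace with the actual Hub repo id or local checkpoint path.
checkpoint = "GPT2_bigram_function_53"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# Greedy generation from a short prompt.
inputs = tokenizer("Hello", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```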
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 53
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- num_epochs: 10
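
The list above maps directly onto the standard `Trainer` API; a minimal sketch, assuming a single device (so the reported batch size equals the per-device value) and a placeholder `output_dir`:

```python
from transformers import TrainingArguments

# Sketch reconstructed from the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="GPT2_bigram_function_53",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=53,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10_000,
    num_train_epochs=10,
)
```

Note that at 6420 steps per epoch (per the table below), the 10,000 warmup steps span roughly the first 1.5 epochs of training.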

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.8992 | 1.0 | 6420 | 4.6850 | 0.2564 |
| 4.291 | 2.0 | 12840 | 4.1666 | 0.2988 |
| 4.0377 | 3.0 | 19260 | 3.9777 | 0.3204 |
| 3.9117 | 4.0 | 25680 | 3.8764 | 0.3325 |
| 3.813 | 5.0 | 32100 | 3.8146 | 0.3408 |
| 3.7448 | 6.0 | 38520 | 3.7702 | 0.3463 |
| 3.6872 | 7.0 | 44940 | 3.7413 | 0.3504 |
| 3.634 | 8.0 | 51360 | 3.7162 | 0.3537 |
| 3.5866 | 9.0 | 57780 | 3.7015 | 0.3563 |
| 3.5458 | 10.0 | 64200 | 3.6936 | 0.3579 |
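
For causal language modeling, the validation loss reported by `Trainer` is the mean per-token cross-entropy in nats, so the final loss corresponds to a perplexity of about exp(3.6936) ≈ 40.2:

```python
import math

# Perplexity = exp(mean cross-entropy); 3.6936 is the final validation loss above.
print(math.exp(3.6936))  # ≈ 40.19
```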

### Framework versions

- Transformers 4.30.2
- Pytorch 2.9.1+cu128
- Datasets 4.1.1
- Tokenizers 0.13.3