---
license: mit
tags:
- generated_from_trainer
datasets:
- generator
model-index:
- name: gpt2-concat-second
  results: []
---

# gpt2-concat-second

This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 4.4031

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 6.7063        | 0.29  | 500   | 5.6161          |
| 5.3409        | 0.58  | 1000  | 5.1879          |
| 4.9975        | 0.87  | 1500  | 4.9292          |
| 4.7248        | 1.16  | 2000  | 4.7819          |
| 4.5625        | 1.45  | 2500  | 4.6577          |
| 4.4518        | 1.74  | 3000  | 4.5536          |
| 4.3506        | 2.02  | 3500  | 4.4718          |
| 4.1444        | 2.31  | 4000  | 4.4324          |
| 4.1299        | 2.6   | 4500  | 4.3859          |
| 4.097         | 2.89  | 5000  | 4.3383          |
| 3.9322        | 3.18  | 5500  | 4.3372          |
| 3.8738        | 3.47  | 6000  | 4.3092          |
| 3.8743        | 3.76  | 6500  | 4.2795          |
| 3.8147        | 4.05  | 7000  | 4.2758          |
| 3.6152        | 4.34  | 7500  | 4.2857          |
| 3.6479        | 4.63  | 8000  | 4.2632          |
| 3.654         | 4.92  | 8500  | 4.2380          |
| 3.4411        | 5.21  | 9000  | 4.2846          |
| 3.398         | 5.49  | 9500  | 4.2785          |
| 3.4249        | 5.78  | 10000 | 4.2628          |
| 3.3498        | 6.07  | 10500 | 4.2910          |
| 3.1525        | 6.36  | 11000 | 4.3119          |
| 3.1727        | 6.65  | 11500 | 4.3057          |
| 3.1862        | 6.94  | 12000 | 4.2985          |
| 2.9723        | 7.23  | 12500 | 4.3475          |
| 2.9448        | 7.52  | 13000 | 4.3551          |
| 2.9617        | 7.81  | 13500 | 4.3526          |
| 2.8946        | 8.1   | 14000 | 4.3748          |
| 2.7783        | 8.39  | 14500 | 4.3866          |
| 2.7819        | 8.68  | 15000 | 4.3904          |
| 2.7913        | 8.96  | 15500 | 4.3905          |
| 2.7052        | 9.25  | 16000 | 4.4009          |
| 2.6969        | 9.54  | 16500 | 4.4029          |
| 2.7           | 9.83  | 17000 | 4.4031          |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.11.0+cu113
- Datasets 2.13.0
- Tokenizers 0.13.3
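
The hyperparameters listed above are the Trainer's auto-logged values; a minimal sketch of how they might map onto a `TrainingArguments` configuration (Transformers 4.26.x) is below. The `output_dir` and the 500-step evaluation/logging cadence are assumptions inferred from the results table, since the original training script is not part of this card.

```python
from transformers import TrainingArguments

# Minimal sketch mapping the logged hyperparameters onto TrainingArguments.
# output_dir and the 500-step eval/logging cadence are assumptions, not taken
# from the card itself.
training_args = TrainingArguments(
    output_dir="gpt2-concat-second",   # hypothetical output path
    learning_rate=5e-4,                # learning_rate: 0.0005
    per_device_train_batch_size=64,    # train_batch_size: 64
    per_device_eval_batch_size=64,     # eval_batch_size: 64
    seed=42,
    adam_beta1=0.9,                    # optimizer: Adam, betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                 # epsilon: 1e-08
    lr_scheduler_type="cosine",
    warmup_steps=1000,                 # lr_scheduler_warmup_steps: 1000
    num_train_epochs=10,
    fp16=True,                         # mixed_precision_training: Native AMP
    evaluation_strategy="steps",
    eval_steps=500,                    # validation is logged every 500 steps above
    logging_steps=500,
)
```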
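
Assuming the evaluation loss is the Trainer's default mean cross-entropy in nats for causal language modeling, the final loss of 4.4031 corresponds to a perplexity of roughly exp(4.4031) ≈ 81.7:

```python
import math

eval_loss = 4.4031                # final validation loss from the table above
perplexity = math.exp(eval_loss)  # valid only if the loss is mean cross-entropy in nats
print(f"perplexity = {perplexity:.1f}")  # ~81.7
```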
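
Since the intended-use section is still a placeholder, the following is only a generic usage sketch for a fine-tuned GPT-2 causal LM checkpoint; the model id `gpt2-concat-second` is assumed to point at this checkpoint and may need to be replaced with the actual repository id or local path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical checkpoint location; replace with the actual repo id or directory.
checkpoint = "gpt2-concat-second"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# Sample a short continuation from the fine-tuned model.
inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```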