---
base_model: distilgpt2
library_name: transformers
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: CFDistilGPT
  results: []
---

# CFDistilGPT

This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.9080 (≈ 18.3 perplexity; see the conversion sketched at the end of this card)

## Model description

More information needed

## Intended uses & limitations

More information needed. A minimal usage sketch appears at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` appears at the end of this card):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 140  | 3.2381          |
| No log        | 2.0   | 280  | 3.1353          |
| No log        | 3.0   | 420  | 3.0809          |
| 3.2346        | 4.0   | 560  | 3.0432          |
| 3.2346        | 5.0   | 700  | 3.0202          |
| 3.2346        | 6.0   | 840  | 2.9960          |
| 3.2346        | 7.0   | 980  | 2.9784          |
| 2.9216        | 8.0   | 1120 | 2.9666          |
| 2.9216        | 9.0   | 1260 | 2.9535          |
| 2.9216        | 10.0  | 1400 | 2.9435          |
| 2.7856        | 11.0  | 1540 | 2.9330          |
| 2.7856        | 12.0  | 1680 | 2.9253          |
| 2.7856        | 13.0  | 1820 | 2.9221          |
| 2.7856        | 14.0  | 1960 | 2.9184          |
| 2.6979        | 15.0  | 2100 | 2.9159          |
| 2.6979        | 16.0  | 2240 | 2.9133          |
| 2.6979        | 17.0  | 2380 | 2.9096          |
| 2.6479        | 18.0  | 2520 | 2.9102          |
| 2.6479        | 19.0  | 2660 | 2.9080          |
| 2.6479        | 20.0  | 2800 | 2.9080          |

### Framework versions

- Transformers 4.46.1
- Pytorch 2.4.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1
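## Illustrative sketches

The snippets below are editorial sketches, not part of the original training setup.

### Example usage

A minimal inference example with the `transformers` text-generation pipeline; the model id `"CFDistilGPT"` is a placeholder for the actual Hub repository path or a local checkpoint directory.

```python
from transformers import pipeline

# "CFDistilGPT" is a placeholder model id; point this at the real
# Hub repo (e.g. "your-username/CFDistilGPT") or a local checkpoint.
generator = pipeline("text-generation", model="CFDistilGPT")

# Sample a short continuation from a prompt.
out = generator("Once upon a time", max_new_tokens=50, do_sample=True)
print(out[0]["generated_text"])
```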
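### Reconstructing the training configuration

The hyperparameters above map onto `TrainingArguments` roughly as follows. The dataset, tokenization, and `Trainer` wiring are unknown; `output_dir` and per-epoch evaluation (inferred from the per-epoch validation losses in the results table) are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters only. output_dir and
# eval_strategy are assumptions; batch sizes are read as per-device,
# since the card reports no separate total batch size.
training_args = TrainingArguments(
    output_dir="CFDistilGPT",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="epoch",
)
```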
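### Loss as perplexity

Assuming the standard causal language-modeling objective, the reported validation loss is a mean cross-entropy in nats, so exponentiating it gives the model's perplexity on the evaluation set:

```python
import math

# Perplexity is the exponential of the mean cross-entropy loss.
eval_loss = 2.9080
print(f"perplexity ~= {math.exp(eval_loss):.2f}")  # 18.32
```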