---
library_name: transformers
base_model: /fs-computility/plm/linzhouhan/daibeiya/models/gpt2-large
tags:
- generated_from_trainer
datasets:
- openwebtext
model-index:
- name: gpt2_large_contextlm_l0236_add_lnnorm_lr_bf16_lr6e-4
results: []
---
# gpt2_large_contextlm_l0236_add_lnnorm_lr_bf16_lr6e-4
This model is a fine-tuned version of GPT-2 Large (from the local checkpoint `/fs-computility/plm/linzhouhan/daibeiya/models/gpt2-large`) on the openwebtext dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7357
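For reference, a cross-entropy loss of 2.7357 corresponds to an evaluation perplexity of exp(2.7357) ≈ 15.4.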
## Model description
This checkpoint is a GPT-2 Large causal language model trained for one epoch on the OpenWebText corpus in bf16 mixed precision. The run name suggests an architectural variant (`contextlm_l0236_add_lnnorm`), but this card does not document the modification; the training details below are the only recorded specifics.
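A minimal usage sketch, assuming the checkpoint loads through the standard `transformers` causal-LM API (if the `contextlm` variant relies on custom modeling code, `trust_remote_code=True` may be required). The checkpoint path is a placeholder:

```python
# Minimal sketch: load the checkpoint and generate text.
# "path/to/checkpoint" is a placeholder; substitute the actual repo id
# or local directory of this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "path/to/checkpoint"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype=torch.bfloat16)

inputs = tokenizer("The quick brown fox", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```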
## Intended uses & limitations
This checkpoint is intended for text generation and language-modeling research in the style of GPT-2. Like other models trained on web-scraped text such as OpenWebText, it can produce biased, offensive, or factually incorrect output; no downstream or safety evaluations are reported in this card.
## Training and evaluation data
The model was trained for a single epoch on the openwebtext dataset, an open reproduction of the WebText corpus used to train GPT-2. The evaluation split is not documented here; the validation losses below were computed every 1000 steps during training.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 0.0006
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 64
- gradient_accumulation_steps: 2
- total_train_batch_size: 512
- total_eval_batch_size: 512
- optimizer: adamw_torch with betas=(0.9, 0.95) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 1.0
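The snippet below is a hedged reconstruction of these settings with the `TrainingArguments` API of Transformers 4.51; the output path and evaluation cadence are placeholders inferred from this card, and the 64-GPU layout comes from the launcher (e.g. `torchrun`), not from these arguments.

```python
# Hedged reconstruction of the listed hyperparameters (Transformers 4.51).
# Effective batch sizes match the totals above:
#   train: 4 per device x 64 GPUs x 2 grad-accum steps = 512
#   eval:  8 per device x 64 GPUs                       = 512
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="gpt2_large_contextlm_l0236_add_lnnorm_lr_bf16_lr6e-4",  # placeholder
    learning_rate=6e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,
    num_train_epochs=1.0,
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-8,
    bf16=True,               # the run name indicates bf16 mixed precision
    seed=42,
    eval_strategy="steps",   # evaluation every 1000 steps, per the table below
    eval_steps=1000,
)
```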
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 3.8376 | 0.0580 | 1000 | 3.7978 |
| 3.3595 | 0.1160 | 2000 | 3.3179 |
| 3.2070        | 0.1741 | 3000  | 3.1580          |
| 3.0999 | 0.2321 | 4000 | 3.0631 |
| 3.0333 | 0.2901 | 5000 | 2.9981 |
| 2.9958 | 0.3481 | 6000 | 2.9496 |
| 2.9443 | 0.4062 | 7000 | 2.9102 |
| 2.9097 | 0.4642 | 8000 | 2.8760 |
| 2.8790        | 0.5222 | 9000  | 2.8451          |
| 2.8506 | 0.5802 | 10000 | 2.8198 |
| 2.831 | 0.6383 | 11000 | 2.7969 |
| 2.8156 | 0.6963 | 12000 | 2.7781 |
| 2.799 | 0.7543 | 13000 | 2.7616 |
| 2.7802 | 0.8123 | 14000 | 2.7494 |
| 2.7785 | 0.8703 | 15000 | 2.7414 |
| 2.7706 | 0.9284 | 16000 | 2.7370 |
| 2.7665 | 0.9864 | 17000 | 2.7357 |
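
Validation loss fell monotonically from 3.7978 at step 1000 to 2.7357 at step 17000 over the single epoch, with training loss tracking it closely throughout.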
### Framework versions
- Transformers 4.51.3
- Pytorch 2.3.0+cu121
- Datasets 4.0.0
- Tokenizers 0.21.4
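
To approximate this environment, pin the versions above, e.g. `pip install transformers==4.51.3 datasets==4.0.0 tokenizers==0.21.4` together with a CUDA 12.1 build of PyTorch 2.3.0.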