# c3334c534df77db69020adcdb8e8dee6
This model is a fine-tuned version of openai-community/gpt2 on the dim/tldr_news dataset. It achieves the following results on the evaluation set:
- Loss: 1.0717
- Data Size: 1.0
- Epoch Runtime: 14.3781
- Accuracy: 0.7720
- F1 Macro: 0.7934
- Rouge1: 0.7727
- Rouge2: 0.0
- Rougel: 0.7720
- Rougelsum: 0.7727
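The Rouge2 score of 0.0 alongside near-identical Accuracy, Rouge1, and RougeL values suggests the evaluation targets are single tokens, which contain no bigrams at all. A minimal sketch of bigram-overlap ROUGE-2 recall (a simplified re-implementation for illustration, not the evaluation code used for this card) shows why a single-word reference always scores zero:

```python
def bigrams(tokens):
    """Return the list of adjacent token pairs."""
    return [tuple(tokens[i:i + 2]) for i in range(len(tokens) - 1)]

def rouge2_recall(reference, prediction):
    """Simplified ROUGE-2 recall: fraction of reference bigrams
    that also appear in the prediction."""
    ref_bigrams = bigrams(reference.split())
    if not ref_bigrams:  # a single-word reference has no bigrams
        return 0.0
    pred_bigrams = set(bigrams(prediction.split()))
    hits = sum(1 for bg in ref_bigrams if bg in pred_bigrams)
    return hits / len(ref_bigrams)

print(rouge2_recall("science", "science"))          # -> 0.0 (no bigrams in a single word)
print(rouge2_recall("big tech news", "big tech news"))  # -> 1.0
```

Under this reading, Rouge1 effectively reduces to per-example exact match, which is consistent with it tracking Accuracy so closely.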
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
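The total batch sizes follow directly from the per-device values and the device count. A quick sketch of that arithmetic (assuming no gradient accumulation, which the listed hyperparameters do not mention):

```python
train_batch_size = 8  # per-device train batch size
eval_batch_size = 8   # per-device eval batch size
num_devices = 4       # multi-GPU distributed setup

# With no gradient accumulation, the effective batch size is
# per-device batch size times the number of devices.
total_train_batch_size = train_batch_size * num_devices
total_eval_batch_size = eval_batch_size * num_devices

print(total_train_batch_size)  # 32, matching total_train_batch_size above
print(total_eval_batch_size)   # 32, matching total_eval_batch_size above
```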
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Accuracy | F1 Macro | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 8.7551 | 0 | 1.7902 | 0.2188 | 0.0749 | 0.2180 | 0.0 | 0.2188 | 0.2188 |
| No log | 1 | 178 | 3.1126 | 0.0078 | 2.8160 | 0.2820 | 0.1785 | 0.2827 | 0.0 | 0.2820 | 0.2812 |
| No log | 2 | 356 | 2.1564 | 0.0156 | 2.2757 | 0.3366 | 0.2038 | 0.3374 | 0.0 | 0.3374 | 0.3359 |
| No log | 3 | 534 | 1.3506 | 0.0312 | 2.7817 | 0.5398 | 0.4185 | 0.5405 | 0.0 | 0.5398 | 0.5398 |
| No log | 4 | 712 | 0.9207 | 0.0625 | 3.2658 | 0.6989 | 0.5513 | 0.6996 | 0.0 | 0.7003 | 0.6982 |
| No log | 5 | 890 | 0.8099 | 0.125 | 4.1053 | 0.7223 | 0.5618 | 0.7230 | 0.0 | 0.7237 | 0.7230 |
| 0.0838 | 6 | 1068 | 0.6785 | 0.25 | 5.3897 | 0.7372 | 0.5899 | 0.7379 | 0.0 | 0.7372 | 0.7372 |
| 0.6366 | 7 | 1246 | 0.6333 | 0.5 | 8.6612 | 0.7599 | 0.5997 | 0.7607 | 0.0 | 0.7607 | 0.7599 |
| 0.5248 | 8 | 1424 | 0.6111 | 1.0 | 14.9487 | 0.7678 | 0.7573 | 0.7685 | 0.0 | 0.7678 | 0.7678 |
| 0.4105 | 9 | 1602 | 0.6250 | 1.0 | 15.4161 | 0.7614 | 0.7699 | 0.7621 | 0.0 | 0.7621 | 0.7614 |
| 0.2965 | 10 | 1780 | 0.6644 | 1.0 | 17.6034 | 0.7592 | 0.7712 | 0.7592 | 0.0 | 0.7599 | 0.7592 |
| 0.2309 | 11 | 1958 | 0.7837 | 1.0 | 16.6927 | 0.7784 | 0.8046 | 0.7791 | 0.0 | 0.7784 | 0.7784 |
| 0.1439 | 12 | 2136 | 1.0717 | 1.0 | 14.3781 | 0.7720 | 0.7934 | 0.7727 | 0.0 | 0.7720 | 0.7727 |
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.3.0
- Tokenizers 0.22.1
## Model tree
- Model: contemmcm/c3334c534df77db69020adcdb8e8dee6
- Base model: openai-community/gpt2