How to use edithram23/t5-small-finetuned-xsum with Transformers:

```python
# Load the tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("edithram23/t5-small-finetuned-xsum")
model = AutoModelForSeq2SeqLM.from_pretrained("edithram23/t5-small-finetuned-xsum")
```

This model is a fine-tuned version of t5-small on the xsum dataset. It achieves the following results on the evaluation set:
More information needed
Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 3.8067 | 1.0 | 12753 | 3.5393 |
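The step count in the table is consistent with a single pass over XSum's training split (204,045 examples) at a per-device batch size of 16. The batch size is an assumption inferred from the arithmetic, not a value stated in the card:

```python
import math

xsum_train_examples = 204_045  # size of the XSum training split
batch_size = 16                # assumption: not stated in the model card
steps_per_epoch = math.ceil(xsum_train_examples / batch_size)
print(steps_per_epoch)  # 12753, matching the Step column above
```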
Base model: google-t5/t5-small
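The loading snippet above can be extended into a full summarization call. This is a sketch: the example article and the generation settings (`num_beams`, `max_length`) are illustrative choices, not values from the model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "edithram23/t5-small-finetuned-xsum"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Illustrative input text, not taken from the model card.
article = (
    "The local council approved plans for a new cycle path on Tuesday, "
    "saying the route would link the town centre with the railway station "
    "and was expected to open next spring."
)
# T5 checkpoints are commonly prompted with the "summarize: " task prefix.
inputs = tokenizer("summarize: " + article, return_tensors="pt",
                   truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=64, num_beams=4,
                             early_stopping=True)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

Because this is a t5-small checkpoint, inference runs comfortably on CPU; beam search is used here only as a reasonable default for abstractive summarization.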