How to use mindchain/t5-small-finetuned-xsum with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("mindchain/t5-small-finetuned-xsum")
model = AutoModelForSeq2SeqLM.from_pretrained("mindchain/t5-small-finetuned-xsum")
```

This model is a fine-tuned version of t5-small on the xsum dataset.
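Once the model and tokenizer are loaded, they can be used for summarization. The sketch below is one way to run inference; the `"summarize: "` task prefix follows the T5 convention, and the generation settings (`max_new_tokens`, `num_beams`) are illustrative choices, not values documented for this checkpoint.

```python
# Sketch: summarizing a passage with the fine-tuned checkpoint.
# Assumes the transformers and torch packages are installed.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("mindchain/t5-small-finetuned-xsum")
model = AutoModelForSeq2SeqLM.from_pretrained("mindchain/t5-small-finetuned-xsum")

# T5-style models are trained with a task prefix; "summarize: " is the
# conventional prefix for summarization tasks.
article = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and is the tallest structure in Paris. Its base is square, "
    "measuring 125 metres on each side."
)
inputs = tokenizer("summarize: " + article, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=60, num_beams=4)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

Because this is an abstractive model fine-tuned on XSum, the output is typically a single short sentence rather than an extract of the input.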
More information needed
The following hyperparameters were used during training: