Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
Paper: arXiv:1910.10683
How to use tomhodemon/t5-small-wikitext with Transformers:
```python
# Load the tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("tomhodemon/t5-small-wikitext")
model = AutoModelForSeq2SeqLM.from_pretrained("tomhodemon/t5-small-wikitext")
```
t5-small trained on wikitext/wikitext-103-raw-v1 for 50k steps (around 2 hours of training), following the pretraining procedure described in the T5 paper.
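The T5 pretraining procedure corrupts contiguous spans of input tokens, replacing each span with a sentinel token (`<extra_id_0>`, `<extra_id_1>`, …); the target sequence consists of the dropped spans delimited by those sentinels. A minimal sketch of this idea, with a hypothetical `corrupt_spans` helper working on whitespace tokens and fixed span positions (the real procedure samples spans randomly and operates on subword tokens):

```python
# Simplified sketch of T5-style span corruption (hypothetical helper,
# not the actual preprocessing code): each masked span in the input is
# replaced by a sentinel token, and the target lists the dropped spans
# delimited by the same sentinels.
def corrupt_spans(tokens, span_starts, span_length=2):
    inputs, targets = [], []
    sentinel = 0
    i = 0
    while i < len(tokens):
        if i in span_starts:
            marker = f"<extra_id_{sentinel}>"
            inputs.append(marker)            # span replaced by sentinel
            targets.append(marker)           # target: sentinel then span
            targets.extend(tokens[i:i + span_length])
            sentinel += 1
            i += span_length
        else:
            inputs.append(tokens[i])
            i += 1
    targets.append(f"<extra_id_{sentinel}>")  # closing sentinel
    return inputs, targets

tokens = "Thank you for inviting me to your party last week".split()
inp, tgt = corrupt_spans(tokens, span_starts={2, 7})
print(" ".join(inp))  # Thank you <extra_id_0> me to your <extra_id_1> week
print(" ".join(tgt))  # <extra_id_0> for inviting <extra_id_1> party last <extra_id_2>
```

The model is then trained to generate the target sequence from the corrupted input, which is what the `<extra_id_*>` sentinel tokens in the T5 vocabulary are for.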