How to use research-backup/t5-base-analogy with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("research-backup/t5-base-analogy")
model = AutoModelForSeq2SeqLM.from_pretrained("research-backup/t5-base-analogy")
```
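Once loaded, the model can be used for text-to-text generation via `model.generate`. A minimal inference sketch follows; note that the prompt string below is a hypothetical example, as the exact input format expected by this checkpoint depends on how it was fine-tuned:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "research-backup/t5-base-analogy"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical prompt -- check the model card for the training input format.
prompt = "generate analogy: mammal is to whale"

# Tokenize, generate, and decode the model's output.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```

`generate` returns token IDs, so the decode step with `skip_special_tokens=True` is needed to recover plain text.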