How to use uclanlp/plbart-multi_task-interpreted with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("uclanlp/plbart-multi_task-interpreted")
model = AutoModelForSeq2SeqLM.from_pretrained("uclanlp/plbart-multi_task-interpreted")
```
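Once loaded, the checkpoint can be used like any other seq2seq model via `generate()`. A minimal sketch follows; the repository ships no model card, so the input snippet, `max_length`, and `num_beams` values below are illustrative assumptions rather than documented settings for this checkpoint.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("uclanlp/plbart-multi_task-interpreted")
model = AutoModelForSeq2SeqLM.from_pretrained("uclanlp/plbart-multi_task-interpreted")

# Illustrative input: a small Python function; the prompt format is an
# assumption, since the checkpoint documents no expected input convention.
inputs = tokenizer("def add(a, b): return a + b", return_tensors="pt")

# Beam search with a modest length cap; tune these for your task.
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```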