How to use natope/question-context-random-to10-p with Transformers:
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("natope/question-context-random-to10-p")
model = AutoModelForSeq2SeqLM.from_pretrained("natope/question-context-random-to10-p")
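Once loaded, the model can be used for generation. Below is a minimal inference sketch, assuming the checkpoint is a standard encoder-decoder (seq2seq) question-answering model; the `question: ... context: ...` prompt format and the example strings are assumptions, not documented behavior of this checkpoint:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "natope/question-context-random-to10-p"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder inputs; the prompt format below is an assumption.
question = "What is the capital of France?"
context = "Paris is the capital and largest city of France."
inputs = tokenizer(f"question: {question} context: {context}", return_tensors="pt")

# Generate an answer with the encoder-decoder model and decode it to text
output_ids = model.generate(**inputs, max_new_tokens=32)
answer = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(answer)
```

The same pattern works for any `AutoModelForSeq2SeqLM` checkpoint; only the prompt format may differ depending on how the model was fine-tuned.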