How to use versae/t5-8m with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("versae/t5-8m")
model = AutoModelForSeq2SeqLM.from_pretrained("versae/t5-8m")
```
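Once loaded, the model can be used for seq2seq inference via `generate`. The sketch below assumes the checkpoint responds to T5-style task prefixes (e.g. `"translate English to German: …"`), which may not hold for this small model; the prompt and decoding parameters are illustrative only.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("versae/t5-8m")
model = AutoModelForSeq2SeqLM.from_pretrained("versae/t5-8m")

# Tokenize a T5-style prefixed prompt (prefix choice is an assumption).
inputs = tokenizer("translate English to German: Hello, world!", return_tensors="pt")

# Greedy decoding, capped at 32 new tokens.
outputs = model.generate(**inputs, max_new_tokens=32)

# Decode the generated ids back to text.
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```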