How to use devagonal/mt5-semantic with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("devagonal/mt5-semantic")
model = AutoModelForSeq2SeqLM.from_pretrained("devagonal/mt5-semantic")
```
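Once loaded, the model can be used like any other mT5-based seq2seq checkpoint: tokenize the input, call `generate`, and decode the result. The sketch below assumes a plain text-to-text task; the example input string and generation parameters are illustrative, not part of the model card. (Running it downloads the full checkpoint, which is large.)

```python
# Minimal inference sketch for a seq2seq checkpoint (assumes the model
# is already loaded as in the snippet above).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("devagonal/mt5-semantic")
model = AutoModelForSeq2SeqLM.from_pretrained("devagonal/mt5-semantic")

# Hypothetical example input; replace with your own text.
inputs = tokenizer("example input text", return_tensors="pt")

# Generate output token IDs, then decode them back to a string.
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```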