How to use GroNLP/T0pp-sharded with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("GroNLP/T0pp-sharded")
model = AutoModelForSeq2SeqLM.from_pretrained("GroNLP/T0pp-sharded")
```
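Once loaded, the model can be prompted like any seq2seq model via `generate`. A minimal sketch is below; the prompt text is purely illustrative, and note that T0pp is an 11B-parameter model, so loading the full checkpoint requires substantial memory (a GPU with `device_map="auto"` or similar is typically needed in practice):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the sharded checkpoint (large download)
tokenizer = AutoTokenizer.from_pretrained("GroNLP/T0pp-sharded")
model = AutoModelForSeq2SeqLM.from_pretrained("GroNLP/T0pp-sharded")

# Illustrative zero-shot prompt; T0pp accepts natural-language task descriptions
prompt = (
    "Is this review positive or negative? "
    "Review: this is the best cast iron skillet you will ever buy"
)
inputs = tokenizer(prompt, return_tensors="pt")

# Generate an answer; no gradients needed for inference
with torch.no_grad():
    outputs = model.generate(**inputs)

answer = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(answer)
```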