Instructions for using google/ul2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use google/ul2 with Transformers:

```python
# Load the model directly with the Transformers auto classes
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/ul2")
model = AutoModelForSeq2SeqLM.from_pretrained("google/ul2")
```

- Notebooks
- Google Colab
- Kaggle
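UL2 is a seq2seq checkpoint, and the UL2 paper describes mode tokens ([NLG], [NLU], [S2S]) that select the pretraining objective at inference time. The sketch below shows one way to prefix an input with a mode token before generation; the helper names (`build_prompt`, `generate_with_mode`) and the example prompt are illustrative, not part of the Transformers API.

```python
# Sketch of prompting google/ul2 with a mode token, assuming the
# [NLG]/[NLU]/[S2S] prefixes described in the UL2 paper.
# build_prompt and generate_with_mode are illustrative helper names,
# not part of the Transformers API.

MODE_TOKENS = ("[NLG]", "[NLU]", "[S2S]")

def build_prompt(mode: str, text: str) -> str:
    """Prepend a UL2 mode token to the input text."""
    if mode not in MODE_TOKENS:
        raise ValueError(f"mode must be one of {MODE_TOKENS}")
    return f"{mode} {text}"

def generate_with_mode(mode: str, text: str, max_new_tokens: int = 32) -> str:
    """Load the checkpoint and generate; the 20B weights make this heavy."""
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("google/ul2")
    model = AutoModelForSeq2SeqLM.from_pretrained("google/ul2")
    inputs = tokenizer(build_prompt(mode, text), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example call (not run here, as it downloads tens of GB of weights):
# generate_with_mode("[S2S]", "Translate English to German: How old are you?")
```

The generation step is kept in a function rather than run at import time, since loading the 20B checkpoint requires substantial memory and download time.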
Smaller UL2 models
#8
by leshanbog - opened
Hi!
Thanks for sharing this 20B model with the community! From your paper it seems that smaller models also benefit from this kind of pretraining. Do you have any plans to release a UL2-base or something of that scale?