Instructions for using google/flan-ul2 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers

How to use google/flan-ul2 with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-ul2")
```

- Notebooks
  - Google Colab
  - Kaggle
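Once the tokenizer and model are loaded, text is generated by tokenizing a prompt, calling `generate`, and decoding the result. The sketch below assumes the loading code above has run; the prompt and generation parameters are illustrative examples, not values prescribed by the model card, and running it requires downloading the full (large) flan-ul2 checkpoint.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-ul2")

# Example prompt (hypothetical); flan-ul2 is instruction-tuned, so plain
# natural-language instructions work as input.
inputs = tokenizer(
    "Translate English to German: How old are you?",
    return_tensors="pt",
)

# Generate up to 40 new tokens and decode them back to a string.
outputs = model.generate(**inputs, max_new_tokens=40)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

For a model of this size, loading with `device_map="auto"` and a reduced-precision `torch_dtype` is a common choice to fit it on available hardware.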
Commit History
- Update README.md 5dfdc8c
- Update README.md 701c4a4
- Update README.md 57f845f
- Update README.md ac13457
- Update README.md 50f8d3a
- Update README.md 75a0ff3
- Upload tokenizer 2153c0b
- Upload T5ForConditionalGeneration c2e07e1
- initial commit f6076c3 (committed by Younes Belkada)