Instructions for using google/flan-ul2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
How to use google/flan-ul2 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-ul2")
```
- Notebooks
- Google Colab
- Kaggle
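The snippet above only loads the checkpoint. A minimal generation sketch building on it is shown below; the prompt and `max_new_tokens` value are illustrative, and note that flan-ul2 is a ~20B-parameter model, so loading it requires substantial memory.

```python
# Minimal inference sketch for google/flan-ul2 (a T5-style seq2seq model).
# Note: the checkpoint is ~20B parameters; loading needs tens of GB of memory.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-ul2")

# Illustrative prompt; FLAN models are instruction-tuned, so plain
# natural-language task descriptions work as input.
inputs = tokenizer(
    "Translate English to German: How old are you?",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=32)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```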
Use cases for conversion to HF?
#13
by Canoot - opened
What are good example use cases that would require converting this model from T5X to the Hugging Face format?
Probably the only use case is wanting to use Hugging Face libraries, instead of the original T5X scripts, for model pre-training/fine-tuning.
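As a concrete illustration of that use case, here is a hedged fine-tuning sketch using the Transformers `Seq2SeqTrainer`; the dataset, hyperparameters, and output path are placeholders, not from the discussion.

```python
# Hedged sketch: fine-tuning the converted checkpoint with Hugging Face
# libraries instead of the original T5X scripts. All data and
# hyperparameters below are illustrative placeholders.
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "google/flan-ul2"  # ~20B params; needs multi-GPU or offloading
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy input/target pairs standing in for a real dataset.
pairs = [
    ("Translate English to German: Hello.", "Hallo."),
    ("Translate English to German: Thank you.", "Danke."),
]
train_data = [
    {
        "input_ids": tokenizer(src).input_ids,
        "labels": tokenizer(tgt).input_ids,
    }
    for src, tgt in pairs
]

args = Seq2SeqTrainingArguments(
    output_dir="flan-ul2-finetuned",  # placeholder path
    per_device_train_batch_size=1,
    num_train_epochs=1,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_data,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```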
Canoot changed discussion status to closed