Instructions for using google/flan-t5-small with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use google/flan-t5-small with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")
```
- Notebooks
- Google Colab
- Kaggle
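Once the tokenizer and model above are loaded, a full generation round-trip might look like the following sketch. The prompt is the widget example from this model card; `max_new_tokens=32` is an illustrative choice, not a value from the card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")

# Tokenize the widget prompt and run encoder-decoder generation
inputs = tokenizer("Translate to German: My name is Arthur", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)  # illustrative length cap

translation = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(translation)
```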
Try with text generation #5
opened by ybelkada

README.md CHANGED
```diff
@@ -63,6 +63,7 @@ language:
 
 tags:
 - text2text-generation
+- text-generation
 
 widget:
 - text: "Translate to German: My name is Arthur"
```
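The diff above adds the `text-generation` tag alongside the existing `text2text-generation` tag, so the model also appears under the text-generation filter on the Hub. Since flan-t5-small is an encoder-decoder model, inference still goes through the `text2text-generation` pipeline task already listed in the card's tags. A minimal sketch, using the card's widget prompt:

```python
from transformers import pipeline

# flan-t5-small is an encoder-decoder (seq2seq) model, so it is served
# by the text2text-generation pipeline task
generator = pipeline("text2text-generation", model="google/flan-t5-small")

# max_new_tokens=32 is an illustrative choice, not a value from the card
result = generator("Translate to German: My name is Arthur", max_new_tokens=32)
print(result[0]["generated_text"])
```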