Instructions to use dropout05/t5-tiny with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use dropout05/t5-tiny with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("dropout05/t5-tiny")
model = AutoModelForSeq2SeqLM.from_pretrained("dropout05/t5-tiny")
```
- Notebooks
  - Google Colab
  - Kaggle
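
Once the tokenizer and model are loaded as above, they can be used for generation. A minimal sketch (the input prompt and `max_new_tokens` value are illustrative choices, not from the model card; this tiny checkpoint is meant for experimentation, so output quality will be limited):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tiny T5 checkpoint from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("dropout05/t5-tiny")
model = AutoModelForSeq2SeqLM.from_pretrained("dropout05/t5-tiny")

# Encode an example prompt and run seq2seq generation.
inputs = tokenizer("translate English to German: Hello, world!", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)

# Decode the generated token ids back to text.
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```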
fixed config
config.json (+1 -1)

```diff
@@ -53,5 +53,5 @@
   "torch_dtype": "float32",
   "transformers_version": "4.16.2",
   "use_cache": true,
-  "vocab_size":
+  "vocab_size": 32128
 }
```
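
The commit above sets `vocab_size` to 32128 in `config.json`. A quick sketch to confirm the published config now carries that value, assuming network access to the Hub (`AutoConfig` fetches only the config file, not the weights):

```python
from transformers import AutoConfig

# Fetch just the config.json for the checkpoint.
config = AutoConfig.from_pretrained("dropout05/t5-tiny")

# The fixed config should report the vocab size from the commit.
print(config.vocab_size)
```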