Instructions to use trl-internal-testing/tiny-T5ForConditionalGeneration with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use trl-internal-testing/tiny-T5ForConditionalGeneration with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("trl-internal-testing/tiny-T5ForConditionalGeneration")
model = AutoModelForSeq2SeqLM.from_pretrained("trl-internal-testing/tiny-T5ForConditionalGeneration")
```
- Notebooks
- Google Colab
- Kaggle
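
As a quick smoke test, the loaded tokenizer/model pair can run a full generate round trip. Note this is a tiny, internal test checkpoint, so the generated text is not meaningful; the prompt below is purely illustrative (a sketch assuming network access to the Hub):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "trl-internal-testing/tiny-T5ForConditionalGeneration"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Encode a prompt, generate with the seq2seq model, and decode the result.
inputs = tokenizer("translate English to German: Hello", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=10)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)  # output is arbitrary: the tiny model's weights are not trained
```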
Upload tokenizer
tokenizer_config.json (+1, -0)

```diff
@@ -1,4 +1,5 @@
 {
+  "add_prefix_space": null,
   "added_tokens_decoder": {
     "0": {
       "content": "<pad>",
```
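
The effect of the commit can be verified by parsing the updated config and checking for the new key. The JSON excerpt below is an assumed minimal reconstruction containing only the fields visible in the diff, not the full file:

```python
import json

# Minimal excerpt of tokenizer_config.json after this commit (assumed;
# only the fields visible in the diff are included).
config_text = """
{
  "add_prefix_space": null,
  "added_tokens_decoder": {
    "0": {
      "content": "<pad>"
    }
  }
}
"""

config = json.loads(config_text)
# The commit adds "add_prefix_space" with a JSON null value, which
# json.loads maps to Python's None.
print(config["add_prefix_space"])
print(config["added_tokens_decoder"]["0"]["content"])
```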