Tags: Translation · Transformers · PyTorch · TensorFlow · JAX · Rust · Safetensors · t5 · text2text-generation · summarization · text-generation-inference
Instructions for using google-t5/t5-base with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use google-t5/t5-base with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="google-t5/t5-base")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-base")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
Update config.json
#13
by Jesuscarr - opened
- config.json +7 -1
config.json
CHANGED
```diff
@@ -45,7 +45,13 @@
       "max_length": 300,
       "num_beams": 4,
       "prefix": "translate English to Romanian: "
+    },
+    "translation_en_to_es": {
+      "early_stopping": true,
+      "max_length": 300,
+      "num_beams": 4,
+      "prefix": "translate English to Spanish: "
     }
   },
   "vocab_size": 32128
-}
+}
```
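The `task_specific_params` entries added in this PR are consumed by prepending the task's `prefix` string to the input text before tokenization; that is how T5 is told which task to perform. A minimal plain-Python sketch of that lookup (the `build_model_input` helper is hypothetical, for illustration only, not part of Transformers):

```python
# task_specific_params as they appear in config.json after this PR
# (other tasks such as summarization omitted for brevity).
TASK_SPECIFIC_PARAMS = {
    "translation_en_to_ro": {
        "early_stopping": True,
        "max_length": 300,
        "num_beams": 4,
        "prefix": "translate English to Romanian: ",
    },
    "translation_en_to_es": {
        "early_stopping": True,
        "max_length": 300,
        "num_beams": 4,
        "prefix": "translate English to Spanish: ",
    },
}

def build_model_input(task: str, text: str) -> str:
    """Prepend the task's prefix so T5 knows which task to perform."""
    return TASK_SPECIFIC_PARAMS[task]["prefix"] + text

print(build_model_input("translation_en_to_es", "Hello, world!"))
# prints: translate English to Spanish: Hello, world!
```

The remaining keys (`num_beams`, `max_length`, `early_stopping`) are generation defaults that the library passes to `generate()` for that task; only `prefix` touches the input text itself.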