Tags: Translation · Transformers · PyTorch · TensorFlow · Safetensors · t5 · text-generation · summarization · text-generation-inference
Instructions to use google-t5/t5-3b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use google-t5/t5-3b with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
# pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="google-t5/t5-3b")
```

```python
# Load model directly
# (AutoModelWithLMHead is deprecated; AutoModelForSeq2SeqLM is the
# current class for encoder-decoder models like T5.)
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-3b")
model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-3b")
```

- Notebooks
- Google Colab
- Kaggle
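To actually run a translation with the directly loaded model, you tokenize a prefixed prompt and call `generate`. The sketch below assumes transformers v4.x; the helper names `build_prompt` and `translate` are illustrative, not part of the library. T5 is a text-to-text model, so the task is selected entirely by the text prefix.

```python
# Sketch: translating with a directly loaded T5 model (assumes transformers v4.x).
# build_prompt and translate are illustrative helper names, not library APIs.

def build_prompt(target_lang: str, text: str) -> str:
    # T5 selects its task via a text prefix,
    # e.g. "translate English to German: The house is wonderful."
    return f"translate English to {target_lang}: {text}"

def translate(text: str, target_lang: str = "French") -> str:
    # Heavy: downloads roughly 11 GB of weights on first call.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-3b")
    model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-3b")
    inputs = tokenizer(build_prompt(target_lang, text), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=40)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Calling `translate("The house is wonderful.")` returns the decoded model output for the French task prefix.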
Translation to German doesn't work in 3B model
#8
by sszymczyk - opened
While working on T5 support in llama.cpp I noticed a strange thing: translation to German doesn't work in the T5-3B model. The model card says it supports German, and config.json contains the "translate English to German: " prefix for the "translation_en_to_de" task, but running the model with the prompt "translate English to German: The house is wonderful." produces the following output: "The house is beautiful and the staff is very friendly and helpful."

Why is that? I checked that it's not a problem with my implementation, as the output is the same when using the transformers library. Translation to French and Romanian works without problems.
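The comparison described above can be reproduced by sending the same sentence with each of the task prefixes listed in the model's config.json and printing the outputs side by side. This is a minimal sketch assuming transformers v4.x; the helper names `make_prompts` and `compare_translations` are illustrative.

```python
# Sketch: reproduce the report above by running the same sentence through the
# German, French, and Romanian task prefixes (assumes transformers v4.x).
# make_prompts and compare_translations are illustrative names, not library APIs.

PREFIXES = {
    "de": "translate English to German: ",
    "fr": "translate English to French: ",
    "ro": "translate English to Romanian: ",
}

def make_prompts(sentence: str) -> dict:
    # One prefixed prompt per target language.
    return {lang: prefix + sentence for lang, prefix in PREFIXES.items()}

def compare_translations(sentence: str = "The house is wonderful.") -> None:
    # Heavy: downloads the 3B checkpoint on first call.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-3b")
    model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-3b")
    for lang, prompt in make_prompts(sentence).items():
        ids = tokenizer(prompt, return_tensors="pt")
        out = model.generate(**ids, max_new_tokens=40)
        print(lang, tokenizer.decode(out[0], skip_special_tokens=True))
```

Per the report, the "fr" and "ro" prompts produce translations while the "de" prompt yields an unrelated English continuation.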