Tags: Translation · Transformers · PyTorch · TensorFlow · JAX · Rust · ONNX · Safetensors · t5 · text2text-generation · summarization · text-generation-inference
Instructions for using google-t5/t5-small with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use google-t5/t5-small with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="google-t5/t5-small")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-small")
```
- Inference
- Notebooks
- Google Colab
- Kaggle
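The snippets above only load the model. A minimal end-to-end sketch of running a translation with the directly loaded model follows; it assumes `transformers` (v4.x, per the pipeline warning above), `torch`, and `sentencepiece` are installed. T5 is a text-to-text model, so the task is selected with a text prefix such as "translate English to German:".

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-small")

# The task prefix tells T5 which text-to-text task to perform.
inputs = tokenizer(
    "translate English to German: Hello, how are you?",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Other supported prefixes include "summarize:" and "translate English to French:"; see the model card for the full list.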
Community discussions for google-t5/t5-small:

Model prints wrong response on Java (ai.onnxruntime)
#33 opened 8 months ago by kgrabko

Adding ONNX file of this model
#31 opened 11 months ago by SdCode

I need help.
#29 opened over 1 year ago by thebryanalvarado

Update README.md
#28 opened over 1 year ago by Eron011

Update README.md
#27 opened almost 2 years ago by thecherepaha

Update README.md
#26 opened about 2 years ago by KrakenJame69

Datasets used to train the NMT supervised task?
#25 opened about 2 years ago by OrianeN

[AUTOMATED] Model Memory Requirements
#24 opened over 2 years ago by model-sizer-bot

Update README.md
#23 opened over 2 years ago by Ggcall

Add evaluation results on the plain_text config and test split of imdb
#22 opened over 2 years ago by autoevaluator

Add evaluation results on the plain_text config and test split of imdb
#21 opened over 2 years ago by autoevaluator

Add evaluation results on the default config and test split of xsum
#20 opened over 2 years ago by autoevaluator

Add evaluation results on the default config and test split of xsum
#19 opened over 2 years ago by autoevaluator

Add evaluation results on the default config and test split of xsum
#18 opened over 2 years ago by autoevaluator

Can I get the script to run an inference through T5 ONNX using `onnxruntime`?
#17 opened over 2 years ago by AayushShah

Add Core ML conversion
#16 opened over 2 years ago by KamKamHo

How to use the trained model for inference?
#10 opened about 3 years ago by LycheeX

It seems `T5WithLMHead` is outdated
#5 opened over 3 years ago by Narsil

Scores for model.generate()
#4 opened over 3 years ago by rishihazra

`model_max_length` set to your preferred value.
#3 opened over 3 years ago by NikGC

Add evaluation results on the 3.0.0 config and test split of cnn_dailymail
#2 opened over 3 years ago by autoevaluator