Use with the Transformers library:
# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("Jingya/tiny-random-t5-neuronx", dtype="auto")
This model is intended only for internal fast tests on inf2 instances; do not use it in other scenarios.

To build the model:

optimum-cli export neuron --model hf-internal-testing/tiny-random-t5 --task text2text-generation --batch_size 1 --sequence_length 18 --num_beams 4 tiny_random_t5_neuronx/
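
Once exported, the compiled artifacts are normally loaded through optimum-neuron rather than plain transformers. The following is a minimal, untested sketch that assumes the repository can be loaded with optimum.neuron's NeuronModelForSeq2SeqLM, that the tokenizer was exported alongside the model, and that inputs respect the static shapes used at compile time (batch_size 1, sequence_length 18, num_beams 4).

# Minimal sketch for running the exported model on inf2.
# Assumes optimum-neuron is installed and this repository loads with
# NeuronModelForSeq2SeqLM (an assumption, not confirmed by the model card).
from transformers import AutoTokenizer
from optimum.neuron import NeuronModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Jingya/tiny-random-t5-neuronx")
model = NeuronModelForSeq2SeqLM.from_pretrained("Jingya/tiny-random-t5-neuronx")

# Pad to the static sequence length used during export (18).
inputs = tokenizer(
    "translate English to German: Hello",
    return_tensors="pt",
    padding="max_length",
    max_length=18,
)

# num_beams must match the value used during export (4).
outputs = model.generate(**inputs, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Because the model weights are random and tiny, the generated text is meaningless; the snippet only exercises the compiled encoder/decoder path for testing.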