Instructions for using google/long-t5-local-large with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use google/long-t5-local-large with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/long-t5-local-large")
model = AutoModelForSeq2SeqLM.from_pretrained("google/long-t5-local-large")
```
- Notebooks
- Google Colab
- Kaggle
Note: calling the model's forward pass with only encoder inputs (e.g. `model(input_ids)`) raises a `ValueError`, because a seq2seq model also needs decoder inputs:
```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/alvyn/.local/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/alvyn/.local/lib/python3.11/site-packages/transformers/models/longt5/modeling_longt5.py", line 2046, in forward
    decoder_outputs = self.decoder(
                      ^^^^^^^^^^^^^
  File "/home/alvyn/.local/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/alvyn/.local/lib/python3.11/site-packages/transformers/models/longt5/modeling_longt5.py", line 1442, in forward
    raise ValueError(f"You have to specify either {err_msg_prefix}input_ids or {err_msg_prefix}inputs_embeds")
ValueError: You have to specify either decoder_input_ids or decoder_inputs_embeds
```
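The traceback above is triggered because an encoder-decoder model like LongT5 cannot run its decoder without decoder-side inputs. A minimal sketch of the two usual ways around it, assuming a summarization-style use of this checkpoint (the input text, prompt, and target sentence here are placeholder examples, and this checkpoint is pretrained rather than task-fine-tuned, so generated text may be rough):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/long-t5-local-large")
model = AutoModelForSeq2SeqLM.from_pretrained("google/long-t5-local-large")

# Placeholder input text; any string works for demonstrating the API.
inputs = tokenizer(
    "summarize: The quick brown fox jumps over the lazy dog.",
    return_tensors="pt",
)

# Option 1 (inference): generate() builds decoder_input_ids internally,
# starting from the decoder start token, so no ValueError is raised.
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

# Option 2 (training): passing labels lets the model derive
# decoder_input_ids by shifting the labels to the right.
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids
outputs = model(**inputs, labels=labels)
print(outputs.loss)
```

Either path supplies the decoder inputs the forward pass requires; calling `model(**inputs)` alone does not.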