How to use allenai/t5-small-squad11 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("allenai/t5-small-squad11")
model = AutoModelForSeq2SeqLM.from_pretrained("allenai/t5-small-squad11")
```
A SQuAD 1.1 question-answering model based on T5-small. Example use:
```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "allenai/t5-small-squad11"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

def run_model(input_string, **generator_args):
    # Encode the "question \n context" string, generate, and decode the answer.
    input_ids = tokenizer.encode(input_string, return_tensors="pt")
    res = model.generate(input_ids, **generator_args)
    output = tokenizer.batch_decode(res, skip_special_tokens=True)
    print(output)
    return output

run_model("Who is the winner of 2009 olympics? \n Jack and Jill participated, but James won the games.")
```
which should result in the following:
['James']
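The example feeds the model a single string of the form `question \n context`. To answer several questions, a small helper (hypothetical, not part of this card) can build those prompt strings before passing each one to `run_model`:

```python
def build_inputs(questions, contexts):
    """Join each (question, context) pair into the "question \\n context"
    prompt format used in the example above.
    Hypothetical helper, not part of the model card."""
    if len(questions) != len(contexts):
        raise ValueError("questions and contexts must have the same length")
    return [f"{q} \n {c}" for q, c in zip(questions, contexts)]

# Each resulting string can then be passed to run_model() as shown above.
inputs = build_inputs(
    ["Who is the winner of 2009 olympics?"],
    ["Jack and Jill participated, but James won the games."],
)
print(inputs[0])
```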