Instructions to use iarfmoose/t5-base-question-generator with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use iarfmoose/t5-base-question-generator with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("iarfmoose/t5-base-question-generator")
model = AutoModelForSeq2SeqLM.from_pretrained("iarfmoose/t5-base-question-generator")
```
- Notebooks
- Google Colab
- Kaggle
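Once the tokenizer and model are loaded, a question can be generated with the standard seq2seq `generate` API. The sketch below assumes the model's `<answer> ... <context> ...` input format described on its model card; the `build_input` helper and the example answer/context pair are illustrative, not part of the repository, so verify the expected format against the model card before relying on it.

```python
def build_input(answer: str, context: str) -> str:
    # Assumed input format from the model card: the target answer and its
    # supporting context, each introduced by a tag token.
    return f"<answer> {answer} <context> {context}"

def generate_question(answer: str, context: str) -> str:
    # Heavy dependencies are imported lazily so the helper above stays cheap.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("iarfmoose/t5-base-question-generator")
    model = AutoModelForSeq2SeqLM.from_pretrained("iarfmoose/t5-base-question-generator")

    inputs = tokenizer(build_input(answer, context),
                       return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_length=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example usage (downloads the model on first run):
    print(generate_question("Paris", "Paris is the capital of France."))
```

The first call downloads the model weights from the Hub; in an application you would load the tokenizer and model once and reuse them across calls rather than reloading per question.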
Update README.md
#2
by cherry0328 - opened
Hi there! I just wanted to say that your model is fantastic. We would like to contribute an update to the README that adds the base_model information, to help fill in the missing details in the model card.