How to use castorini/monot5-base-msmarco with Transformers:
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("castorini/monot5-base-msmarco")
model = AutoModelForSeq2SeqLM.from_pretrained("castorini/monot5-base-msmarco")
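The snippet above only loads the checkpoint. Below is a minimal sketch of how a monoT5 reranker is typically applied, following the pointwise input format from the paper ("Query: ... Document: ... Relevant:"), where the relevance score is taken from the probability of generating "true" versus "false". The query and passages are illustrative placeholders, and looking up the "true"/"false" token ids via the tokenizer is an assumption here rather than code from this repository.

# Sketch: pointwise reranking with monoT5 (format from the paper)
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("castorini/monot5-base-msmarco")
model = AutoModelForSeq2SeqLM.from_pretrained("castorini/monot5-base-msmarco")
model.eval()

query = "how far is the moon from earth"  # example inputs, not from the repo
passages = [
    "The Moon orbits Earth at an average distance of 384,400 km.",
    "The Great Wall of China is visible from low Earth orbit.",
]

# Token ids for the "true"/"false" target words (assumed single tokens).
true_id = tokenizer.encode("true", add_special_tokens=False)[0]
false_id = tokenizer.encode("false", add_special_tokens=False)[0]

scores = []
for passage in passages:
    text = f"Query: {query} Document: {passage} Relevant:"
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        # Single decoding step starting from the decoder start token;
        # read the logits at the first generated position.
        out = model(
            **inputs,
            decoder_input_ids=torch.full((1, 1), model.config.decoder_start_token_id),
        )
    logits = out.logits[0, -1, [false_id, true_id]]
    # Relevance score = probability of "true" over {"false", "true"}.
    scores.append(torch.softmax(logits, dim=0)[1].item())

# Higher score = more relevant; sort passages by score to rerank.
for score, passage in sorted(zip(scores, passages), reverse=True):
    print(f"{score:.4f}  {passage}")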
This model is a T5-base reranker fine-tuned on the MS MARCO passage dataset for 100k steps (or 10 epochs).
For better zero-shot performance (i.e., inference on other datasets), we recommend using castorini/monot5-base-msmarco-10k.
For more details on how to use it, check the following links:
Paper describing the model: Document Ranking with a Pretrained Sequence-to-Sequence Model