How to use from the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="OpenMatch/t5-ance")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("OpenMatch/t5-ance")
model = AutoModel.from_pretrained("OpenMatch/t5-ance")
```
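`AutoModel` returns token-level hidden states rather than one vector per text, so the token vectors must be pooled into a single embedding. How T5-ANCE derives its final embedding is not documented here; below is a minimal masked mean-pooling sketch over an encoder's `last_hidden_state` (the pooling choice is an assumption, and the tensors are toy stand-ins for real model outputs):

```python
import torch

def masked_mean_pool(hidden: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    # hidden: (batch, seq_len, dim) token vectors, e.g. outputs.last_hidden_state
    # mask:   (batch, seq_len) attention mask, 1 for real tokens, 0 for padding
    mask = mask.unsqueeze(-1).float()            # (batch, seq_len, 1)
    summed = (hidden * mask).sum(dim=1)          # sum only real-token vectors
    counts = mask.sum(dim=1).clamp(min=1e-9)     # number of real tokens per text
    return summed / counts                       # (batch, dim) mean embedding

# Toy tensors standing in for model(...).last_hidden_state and the attention mask
hidden = torch.randn(2, 4, 8)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])
emb = masked_mean_pool(hidden, mask)
print(emb.shape)  # torch.Size([2, 8])
```

Padding positions are zeroed out before the sum, so shorter texts are not diluted by pad-token vectors.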
T5-ANCE

T5-ANCE generally follows the training procedure described on this page, but uses a much larger batch size.

Dataset used for training:

  • MS MARCO Passage

Evaluation result:

| Dataset | Metric | Result |
| --- | --- | --- |
| MS MARCO Passage (dev) | MRR@10 | 0.3570 |
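MRR@10 averages, over all queries, the reciprocal rank of the first relevant passage within the top 10 results (contributing 0 if none appears). A small self-contained sketch of the metric (doc ids are hypothetical):

```python
def mrr_at_k(ranked_ids, relevant_ids, k=10):
    """Mean Reciprocal Rank at cutoff k.

    ranked_ids:   one ranked list of doc ids per query
    relevant_ids: one set of relevant doc ids per query
    """
    total = 0.0
    for ranked, relevant in zip(ranked_ids, relevant_ids):
        for rank, doc_id in enumerate(ranked[:k], start=1):
            if doc_id in relevant:
                total += 1.0 / rank  # reciprocal rank of first relevant hit
                break                # only the first relevant result counts
    return total / len(ranked_ids)

# Query 1: first relevant doc at rank 2 -> 1/2; query 2: no relevant doc -> 0
print(mrr_at_k([["d3", "d1", "d7"], ["d2", "d5"]], [{"d1"}, {"d9"}]))  # 0.25
```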

Important hyper-parameters:

| Name | Value |
| --- | --- |
| Global batch size | 256 |
| Learning rate | 5e-6 |
| Maximum length of query | 32 |
| Maximum length of document | 128 |
| Template for query | `<text>` |
| Template for document | `Title: <title> Text: <text>` |
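The templates above are plain-text patterns filled in before tokenization, after which the tokenizer truncates queries to 32 tokens and documents to 128. A minimal sketch of applying them (the helper names are hypothetical):

```python
QUERY_TEMPLATE = "<text>"
DOC_TEMPLATE = "Title: <title> Text: <text>"

def render_query(text: str) -> str:
    # Queries are passed through as-is per the query template
    return QUERY_TEMPLATE.replace("<text>", text)

def render_document(title: str, text: str) -> str:
    # Documents are prefixed with their title per the document template
    return DOC_TEMPLATE.replace("<title>", title).replace("<text>", text)

print(render_document("ANCE", "A dense retrieval training method."))
# Title: ANCE Text: A dense retrieval training method.
```

The rendered strings would then be tokenized with `truncation=True` and `max_length=32` (queries) or `max_length=128` (documents), matching the table above.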

Paper

-

Downloads last month: 104

Model size: 0.2B params (F32, Safetensors)