```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("OpenMatch/co-condenser-large")
model = AutoModelForMaskedLM.from_pretrained("OpenMatch/co-condenser-large")
```
This model was pretrained on MS MARCO following the approach described in the paper *Unsupervised Corpus Aware Language Model Pre-training for Dense Passage Retrieval*. It can be used to reproduce the experimental results in the COCO-DR GitHub repository: https://github.com/OpenMatch/COCO-DR.

The model uses BERT-large as its backbone and has 335M parameters.
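Since coCondenser is pretrained for dense passage retrieval, a natural use is encoding passages into single dense vectors. The sketch below assumes CLS-token pooling and dot-product similarity, which are common choices for Condenser-style retrievers; the exact pooling and scoring used in the paper's experiments may differ, so check the COCO-DR repository before relying on it.

```python
# Minimal sketch: encode passages into dense vectors with CLS pooling.
# NOTE: CLS pooling and dot-product scoring are assumptions here; see the
# COCO-DR repository for the exact setup used in the paper.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("OpenMatch/co-condenser-large")
encoder = AutoModel.from_pretrained("OpenMatch/co-condenser-large")
encoder.eval()

passages = [
    "The Eiffel Tower is located in Paris.",
    "MS MARCO is a large-scale passage ranking dataset.",
]

with torch.no_grad():
    batch = tokenizer(passages, padding=True, truncation=True, return_tensors="pt")
    outputs = encoder(**batch)
    # Take the [CLS] token's final hidden state as the passage embedding.
    embeddings = outputs.last_hidden_state[:, 0]  # shape: (batch, hidden_size)

# Dot-product similarity between the two passage vectors.
score = embeddings[0] @ embeddings[1]
print(score.item())
```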
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="OpenMatch/co-condenser-large")
```
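For example, the fill-mask pipeline can be queried with a `[MASK]` token to inspect the pretrained MLM head (the sentence below is purely illustrative):

```python
# Fill in a masked token; the pipeline returns the top candidate completions.
results = pipe("Paris is the capital of [MASK].")
for r in results:
    print(r["token_str"], r["score"])
```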