```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("OpenMatch/cocodr-large")
model = AutoModelForMaskedLM.from_pretrained("OpenMatch/cocodr-large")
```
This model was pretrained on the BEIR corpus without relevance-level supervision, following the approach described in the paper COCO-DR: Combating Distribution Shifts in Zero-Shot Dense Retrieval with Contrastive and Distributionally Robust Learning. The associated GitHub repository is available at https://github.com/OpenMatch/COCO-DR.

This model uses BERT-large as its backbone and has 335M parameters.
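For retrieval, the model is typically used as a dense encoder rather than through the masked-LM head. The sketch below scores a query against passages using [CLS] embeddings and dot-product similarity; the pooling and scoring choices are assumptions based on common COCO-DR usage and should be checked against the repository, and the query/passage strings are purely illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumption: the [CLS] hidden state serves as the dense representation,
# and relevance is scored by dot product between query and passage vectors.
tokenizer = AutoTokenizer.from_pretrained("OpenMatch/cocodr-large")
encoder = AutoModel.from_pretrained("OpenMatch/cocodr-large")
encoder.eval()

def encode(texts):
    # Tokenize a batch of texts and return their [CLS] embeddings.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        outputs = encoder(**batch)
    return outputs.last_hidden_state[:, 0]  # [CLS] token embedding per text

# Hypothetical query and passages, for illustration only.
query_emb = encode(["what causes distribution shift in dense retrieval?"])
passage_embs = encode([
    "Dense retrievers often degrade when the target corpus differs from the training data.",
    "BERT-large is a 24-layer transformer encoder.",
])

scores = query_emb @ passage_embs.T  # dot-product relevance scores
print(scores)
```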
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="OpenMatch/cocodr-large")
```