Instructions for using OpenMatch/cocodr-large-msmarco-idro-only with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use OpenMatch/cocodr-large-msmarco-idro-only with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="OpenMatch/cocodr-large-msmarco-idro-only")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("OpenMatch/cocodr-large-msmarco-idro-only")
model = AutoModelForMaskedLM.from_pretrained("OpenMatch/cocodr-large-msmarco-idro-only")
```

- Notebooks
- Google Colab
- Kaggle
This model was pretrained on the MS MARCO corpus and then finetuned on MS MARCO training data with implicit distributionally robust optimization (iDRO), following the approach described in the paper COCO-DR: Combating Distribution Shifts in Zero-Shot Dense Retrieval with Contrastive and Distributionally Robust Learning. The associated GitHub repository is available at https://github.com/OpenMatch/COCO-DR.
This model uses BERT-large as its backbone and has 335M parameters.