---
license: mit
---
This model has been pretrained on BookCorpus and English Wikipedia following the approach described in the paper **Condenser: a Pre-training Architecture for Dense Retrieval**. The model can be used to reproduce the experimental results in the GitHub repository https://github.com/OpenMatch/COCO-DR.
This model uses BERT-large as its backbone, with 335M parameters.
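As a minimal sketch of how a dense retriever built on this encoder scores passages: the [CLS] vector of the last hidden layer serves as the text embedding, and query–passage relevance is a dot product between embeddings. The tensors below are random stand-ins for encoder output (loading the actual checkpoint via `transformers` is omitted); the 1024 hidden size matches BERT-large.

```python
import torch

def cls_embed(hidden_states: torch.Tensor) -> torch.Tensor:
    # Condenser-style dense retrievers use the [CLS] (first-token)
    # vector of the last hidden layer as the text embedding.
    return hidden_states[:, 0]

def retrieval_scores(query_emb: torch.Tensor, passage_emb: torch.Tensor) -> torch.Tensor:
    # Relevance score = dot product between query and passage embeddings.
    return query_emb @ passage_emb.T

# Dummy encoder outputs: (batch, seq_len, hidden), hidden = 1024 for BERT-large.
query_hidden = torch.randn(1, 16, 1024)
passage_hidden = torch.randn(4, 128, 1024)

scores = retrieval_scores(cls_embed(query_hidden), cls_embed(passage_hidden))
print(scores.shape)  # one query scored against four passages
```

In practice the hidden states would come from running the pretrained checkpoint (e.g. via `AutoModel` from `transformers`) over tokenized text; see the COCO-DR repository linked above for the full retrieval pipeline.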