---
license: mit
---
This model was first pretrained on MS MARCO passages and then fine-tuned on the MS MARCO training set, following the approach described in the paper **Unsupervised Corpus Aware Language Model Pre-training for Dense Passage Retrieval**. The model can be used to reproduce the paper's experimental results; the associated GitHub repository is available at https://github.com/OpenMatch/COCO-DR.
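A minimal retrieval sketch is shown below, assuming the model loads with the standard `transformers` `AutoModel`/`AutoTokenizer` classes and follows the common dense-retriever convention of using the `[CLS]` embedding with an inner-product relevance score; the model id in the commented usage is a placeholder, not a confirmed identifier.

```python
import torch
from transformers import AutoModel, AutoTokenizer


def encode(model, tokenizer, texts):
    """Encode a list of texts into dense vectors.

    Assumption: the [CLS] token embedding is the passage/query
    representation, as is typical for MS MARCO dense retrievers.
    """
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs)
    return out.last_hidden_state[:, 0]  # [CLS] embedding, shape (batch, hidden)


def score(query_emb, passage_emb):
    """Relevance score as the inner product between query and passage vectors."""
    return query_emb @ passage_emb.T  # shape (num_queries, num_passages)


# Example usage (requires downloading the model weights; "<this-model-id>"
# is a placeholder -- substitute this repository's Hugging Face id):
#
#   tokenizer = AutoTokenizer.from_pretrained("<this-model-id>")
#   model = AutoModel.from_pretrained("<this-model-id>").eval()
#   q = encode(model, tokenizer, ["what is the capital of france"])
#   p = encode(model, tokenizer, ["Paris is the capital of France.",
#                                 "MS MARCO is a passage ranking dataset."])
#   print(score(q, p))  # higher score = more relevant passage
```

Ranking candidate passages then reduces to sorting them by `score` for a given query embedding.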

This model uses BERT-large as its backbone, which has 335M parameters.