This model has been pretrained on MS MARCO passages first, then fine-tuned on the MS MARCO training set following the approach described in the paper Unsupervised Corpus Aware Language Model Pre-training for Dense Passage Retrieval. The model can be used to reproduce the experimental results of the paper; the associated GitHub repository is available at https://github.com/OpenMatch/COCO-DR.

This model is trained with BERT-large as the backbone and has 335M parameters.
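
As a minimal sketch of how a dense retriever like this one is typically used (not an official example), the snippet below loads the checkpoint with the Hugging Face `transformers` library, encodes a query and a passage, and scores them by the dot product of their [CLS] embeddings. The checkpoint name is a hypothetical placeholder for this model's Hub ID, and using the [CLS] vector as the dense representation is an assumption based on common practice for coCondenser/DPR-style retrievers.

```python
# Minimal usage sketch. Assumptions:
# - "OpenMatch/<this-checkpoint>" is a placeholder; replace it with this model's Hub ID.
# - The [CLS] hidden state is taken as the dense embedding, as is common for
#   coCondenser/DPR-style dense retrievers.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "OpenMatch/<this-checkpoint>"  # placeholder Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

def encode(texts):
    # Tokenize and return the [CLS] hidden state for each input text.
    inputs = tokenizer(
        texts, padding=True, truncation=True, max_length=128, return_tensors="pt"
    )
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state[:, 0]  # [CLS] embeddings

query_emb = encode(["what is dense passage retrieval?"])
passage_emb = encode([
    "Dense passage retrieval encodes queries and passages into a shared "
    "vector space and ranks passages by similarity to the query."
])

# Relevance is scored by the dot product between query and passage embeddings.
score = (query_emb @ passage_emb.T).item()
print(f"query-passage score: {score:.4f}")
```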
