
License: MIT

This model has been pretrained on the BEIR corpus and then fine-tuned on MS MARCO with BM25 warmup only, following the approach described in the paper COCO-DR: Combating Distribution Shifts in Zero-Shot Dense Retrieval with Contrastive and Distributionally Robust Learning. The associated GitHub repository is available at https://github.com/OpenMatch/COCO-DR.

This model is trained with BERT-base as the backbone and has 110M parameters.
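As a minimal sketch of how the model might be used to score a query–passage pair with the Hugging Face `transformers` library: the model id below is a placeholder, and the use of [CLS] pooling with a dot-product relevance score is an assumption based on common dense-retrieval practice rather than something stated in this card.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Placeholder model id -- replace with this model's actual Hugging Face repo id.
MODEL_ID = "OpenMatch/cocodr-base-msmarco-warmup"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

query = "what is dense retrieval?"
passage = (
    "Dense retrieval encodes queries and documents into vectors "
    "and matches them by embedding similarity."
)

with torch.no_grad():
    q_inputs = tokenizer(query, return_tensors="pt", truncation=True, max_length=64)
    p_inputs = tokenizer(passage, return_tensors="pt", truncation=True, max_length=256)
    # Take the [CLS] token embedding as the sequence representation (assumed pooling).
    q_emb = model(**q_inputs).last_hidden_state[:, 0]
    p_emb = model(**p_inputs).last_hidden_state[:, 0]

# Relevance score as the dot product between query and passage embeddings.
score = (q_emb * p_emb).sum(dim=-1)
print(score.item())
```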
