
SPLADE CoCondenser SelfDistil

SPLADE model for passage retrieval. For additional details, please visit the SPLADE GitHub repository (https://github.com/naver/splade).

| Model | MRR@10 (MS MARCO dev) | R@1000 (MS MARCO dev) |
|---|---|---|
| splade-cocondenser-selfdistil | 37.6 | 98.4 |
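
The sketch below shows one way to encode a passage into a SPLADE sparse vector with Hugging Face transformers. It assumes the checkpoint loads as a masked-LM head and applies the log-saturated ReLU with max pooling described in the SPLADE papers; the official repository remains the reference implementation.

```python
# Minimal sketch: encoding a passage into a SPLADE sparse vector.
# Assumes the checkpoint can be loaded via AutoModelForMaskedLM;
# see the official SPLADE repository for the reference implementation.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "naver/splade-cocondenser-selfdistil"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
model.eval()

passage = "SPLADE is a sparse neural retrieval model."
inputs = tokenizer(passage, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# SPLADE representation: log(1 + ReLU(logits)), max-pooled over the sequence,
# with padding positions masked out.
weights = torch.log1p(torch.relu(logits))
mask = inputs["attention_mask"].unsqueeze(-1)
sparse_vec = torch.max(weights * mask, dim=1).values.squeeze(0)  # (vocab_size,)

# Inspect the highest-weighted vocabulary terms (the expanded sparse "bag of words").
top = torch.topk(sparse_vec, k=10)
for idx, w in zip(top.indices.tolist(), top.values.tolist()):
    print(tokenizer.convert_ids_to_tokens(idx), round(w, 2))
```

Query and passage vectors produced this way live in the vocabulary space, so relevance can be scored with a simple dot product or served from an inverted index.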

Citation

If you use our checkpoint, please cite our work:

@misc{https://doi.org/10.48550/arxiv.2205.04733,
  doi = {10.48550/ARXIV.2205.04733},
  url = {https://arxiv.org/abs/2205.04733},
  author = {Formal, Thibault and Lassance, Carlos and Piwowarski, Benjamin and Clinchant, Stéphane},
  keywords = {Information Retrieval (cs.IR), Computation and Language (cs.CL), FOS: Computer and information sciences},
  title = {From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution Non Commercial Share Alike 4.0 International}
}