arxiv:2301.02998

InPars-Light: Cost-Effective Unsupervised Training of Efficient Rankers

Published on Jan 8, 2023

Abstract

We carried out a reproducibility study of the InPars recipe for unsupervised training of neural rankers. As a by-product of this study, we developed a simple-yet-effective modification of InPars, which we called InPars-light. Unlike InPars, InPars-light uses only the freely available language model BLOOM and 7x-100x smaller ranking models. On all five English retrieval collections (used in the original InPars study) we obtained substantial (7-30%) and statistically significant improvements over BM25 in nDCG or MRR using only a 30M parameter six-layer MiniLM ranker. In contrast, in the InPars study only a 100x larger MonoT5-3B model consistently outperformed BM25, whereas their smaller MonoT5-220M model (which is still 7x larger than our MiniLM ranker) outperformed BM25 only on MS MARCO and TREC DL 2020. In a purely unsupervised setting, our 435M parameter DeBERTa v3 ranker was roughly on par with the 7x larger MonoT5-3B: in fact, on three out of five datasets, it slightly outperformed MonoT5-3B. Finally, these good results were achieved by re-ranking only 100 candidate documents, compared to the 1000 used in InPars. We believe that InPars-light is the first truly cost-effective prompt-based unsupervised recipe to train and deploy neural ranking models that outperform BM25.
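The abstract describes a two-stage pipeline: a first-stage retriever (BM25) produces candidates, and a small six-layer MiniLM cross-encoder re-ranks only the top 100 of them. The sketch below illustrates that re-ranking stage under stated assumptions; the checkpoint `cross-encoder/ms-marco-MiniLM-L-6-v2` and the `rerank` helper are illustrative stand-ins, not the paper's own unsupervised-trained InPars-light ranker.

```python
# Minimal sketch of the re-ranking stage from the abstract:
# BM25 retrieves candidates, a small 6-layer MiniLM cross-encoder
# scores only the top 100 (query, document) pairs.
from sentence_transformers import CrossEncoder

# Stand-in checkpoint: a public MS MARCO MiniLM cross-encoder,
# NOT the unsupervised InPars-light ranker released by the authors.
ranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

def rerank(query, bm25_candidates, top_k=100):
    """Re-rank the top_k BM25 candidate texts with the cross-encoder."""
    candidates = bm25_candidates[:top_k]
    scores = ranker.predict([(query, doc) for doc in candidates])
    order = sorted(range(len(candidates)), key=lambda i: -scores[i])
    return [(candidates[i], float(scores[i])) for i in order]

# Example usage with hypothetical BM25 output:
# ranked = rerank("what is bm25", ["BM25 is a ranking function ...", "..."])
```

Re-ranking only 100 candidates (rather than 1000) is one of the cost savings the paper highlights: cross-encoder inference cost grows linearly with the number of scored pairs.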

Community

Hi there,

Thanks for your work on "InPars-Light"! It would be fantastic if you could share the pre-trained models, especially those fine-tuned on MS MARCO, on Hugging Face. We're eager to use them for studying model interpretability in our educational projects.

Thanks a lot!

Best,

Paper author

Hi @AntoninJarolim, thank you for the kind words! Is it okay that these models are not fully HF-compatible?


Thanks for the quick reply! It's okay if the model isn't fully HF-compatible. I think I can manage with it as is.

Thanks again!

Paper author

Hi @AntoninJarolim, I have some follow-up questions: could you send me an e-mail at leo a boytsov.info?

Thank you!

