mt5-base-multi-msmarco Reranker finetuned on Multi MS MARCO

Introduction

mt5-base-multi-msmarco is an mT5-based model fine-tuned on a multilingual translated version of the MS MARCO passage dataset. This dataset, named Multi MS MARCO (mMARCO), comprises the complete MS MARCO passage collection translated into 12 different languages. Further information about the dataset and the translation method can be found in our paper mMARCO: A Multilingual Version of MS MARCO Passage Ranking Dataset and in the mMARCO repository.

Usage


from transformers import T5Tokenizer, MT5ForConditionalGeneration

model_name = 'unicamp-dl/mt5-base-multi-msmarco'
tokenizer  = T5Tokenizer.from_pretrained(model_name)
model      = MT5ForConditionalGeneration.from_pretrained(model_name)

Citation

If you use mt5-base-multi-msmarco, please cite:

@misc{bonifacio2021mmarco,
  title={mMARCO: A Multilingual Version of MS MARCO Passage Ranking Dataset}, 
  author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
  year={2021},
  eprint={2108.13897},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
