lhbonifacio committed
Commit 188ffba
1 Parent(s): 416a681

Update README.md

Files changed (1)
  1. README.md +9 -8
README.md CHANGED
@@ -17,8 +17,7 @@ inference: false
  # PTT5-base Reranker finetuned on both English and Portuguese MS MARCO
  ## Introduction
  ptt5-base-msmarco-en-pt-10k is a T5-based model pretrained on the BrWaC corpus and finetuned on both the English and the Portuguese translated versions of the MS MARCO passage dataset. This model was finetuned for 10k steps.
- Further information about the dataset or the translation method can be found on our [Cross-Lingual repository](https://github.com/unicamp-dl/cross-lingual-analysis).
-
+ Further information about the dataset or the translation method can be found in our paper [**mMARCO: A Multilingual Version of MS MARCO Passage Ranking Dataset**](https://arxiv.org/abs/2108.13897) and in the [mMARCO](https://github.com/unicamp-dl/mMARCO) repository.
  ## Usage
  ```python
@@ -32,9 +31,11 @@ model = T5ForConditionalGeneration.from_pretrained(model_name)
  # Citation
  If you use ptt5-base-msmarco-en-pt-10k, please cite:

- @article{rosa2021cost,
- title={A cost-benefit analysis of cross-lingual transfer methods},
- author={Rosa, Guilherme Moraes and Bonifacio, Luiz Henrique and de Souza, Leandro Rodrigues and Lotufo, Roberto and Nogueira, Rodrigo},
- journal={arXiv preprint arXiv:2105.06813},
- year={2021}
- }
+ @misc{bonifacio2021mmarco,
+ title={mMARCO: A Multilingual Version of MS MARCO Passage Ranking Dataset},
+ author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
+ year={2021},
+ eprint={2108.13897},
+ archivePrefix={arXiv},
+ primaryClass={cs.CL}
+ }
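
The README's Usage section is truncated in this diff (only the opening of the code block appears as context). As a rough illustration only, here is a minimal monoT5-style scoring sketch for a T5 reranker like this one; the repository id, the input template, and the use of the `true`/`false` tokens for scoring are assumptions here, not taken from this README.

```python
# Minimal sketch: scoring a query-passage pair with a T5 reranker, monoT5 style.
# The repo id and the prompt template below are assumptions, not from the README.
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = "unicamp-dl/ptt5-base-msmarco-en-pt-10k-v1"  # assumed repo id
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
model.eval()

query = "qual a capital do Brasil?"
passage = "Brasília é a capital federal do Brasil desde a sua inauguração em 1960."

# monoT5-style input; the model is expected to generate "true" or "false".
inputs = tokenizer(
    f"Query: {query} Document: {passage} Relevant:",
    return_tensors="pt",
    truncation=True,
    max_length=512,
)

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=1,
        output_scores=True,
        return_dict_in_generate=True,
    )
    logits = outputs.scores[0][0]            # vocab logits for the first generated token
    true_id = tokenizer.encode("true")[0]     # assumed scoring tokens; check the vocab
    false_id = tokenizer.encode("false")[0]
    # Relevance score = softmax probability of "true" over {"true", "false"}.
    score = torch.softmax(logits[[true_id, false_id]], dim=0)[0].item()

print(f"relevance score: {score:.4f}")
```

Under these assumptions, candidate passages retrieved for a query (e.g. by BM25) would be reranked by sorting them by this probability of the `true` token.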