---
license: mit
---
# MS MARCO Distillation Scores for Translate-Distill
This repository contains [MS MARCO](https://microsoft.github.io/msmarco/) training
query-passage scores produced by the MonoT5 rerankers
[`unicamp-dl/mt5-13b-mmarco-100k`](https://huggingface.co/unicamp-dl/mt5-13b-mmarco-100k) and
[`castorini/monot5-3b-msmarco-10k`](https://huggingface.co/castorini/monot5-3b-msmarco-10k).
Each training query is associated with the top-50 passages retrieved by the [ColBERTv2](https://arxiv.org/abs/2112.01488) model.
Files are gzip-compressed and follow the naming scheme
`{teacher}-monot5-{msmarco, mmarco}-{qlang}{plang}.jsonl.gz`,
which identifies the teacher reranker that scored the `qlang` queries against the `plang` passages from MS MARCO.
For languages other than English (eng), we use the translated text provided by mMARCO and [neuMARCO](https://ir-datasets.com/neumarco.html).
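Each file can be streamed with the Python standard library. Below is a minimal sketch; the file name is illustrative, and the per-line field names `query_id` and `scores` are assumptions, since the exact JSONL schema is not documented here:

```python
import gzip
import json

# Illustrative file name following the scheme above; substitute a real one.
path = "{teacher}-monot5-msmarco-engeng.jsonl.gz"

with gzip.open(path, "rt", encoding="utf-8") as fh:
    for line in fh:
        record = json.loads(line)
        # Assumed fields: a query id plus teacher scores for the top-50
        # ColBERTv2-retrieved passages of that query.
        query_id = record["query_id"]
        scores = record["scores"]  # e.g., {passage_id: score, ...}
```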
## Usage
We recommend downloading the files and using them with the training script in the [PLAID-X](https://github.com/hltcoe/ColBERT-X/tree/plaid-x) codebase.
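Individual score files can also be fetched programmatically with `huggingface_hub`; in the sketch below, the `repo_id` and `filename` values are placeholders to replace with this dataset's id and one of the files described above:

```python
from huggingface_hub import hf_hub_download

# Placeholder repo_id/filename -- substitute this dataset's actual id
# and one of the gzip-compressed score files described above.
local_path = hf_hub_download(
    repo_id="<org>/<this-dataset>",
    filename="<teacher>-monot5-msmarco-engeng.jsonl.gz",
    repo_type="dataset",
)
print(local_path)  # local cache path of the downloaded scores file
```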
## Citation and Bibtex Info
Please cite the following paper if you use the scores.
```bibtex
@inproceedings{translate-distill,
  author = {Eugene Yang and Dawn Lawrie and James Mayfield and Douglas W. Oard and Scott Miller},
  title = {Translate-Distill: Learning Cross-Language Dense Retrieval by Translation and Distillation},
  booktitle = {Proceedings of the 46th European Conference on Information Retrieval (ECIR)},
  year = {2024},
  url = {tba}
}
```