
Fine-tuned mT5 Base model used as the multilingual answer generator (mGEN) in CORA, the cross-lingual retrieval-augmented QA pipeline described in the paper One Question Answering Model for Many Languages with Cross-lingual Dense Passage Retrieval (NeurIPS 2021).

The checkpoint was downloaded following the instructions in the GitHub README and then uploaded to the Hugging Face Hub. Please contact the original paper authors for any problems you encounter with this model.
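Below is a minimal usage sketch with the Transformers library. The way the question and retrieved passage are concatenated into a single input string is an illustrative assumption, not the verified mGEN template; see the CORA repository for the exact input format used at training time.

```python
# A minimal sketch: loading the checkpoint with Hugging Face Transformers.
# NOTE: the question/passage concatenation below is an illustrative
# assumption; check the CORA repository for the exact mGEN input template.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gsarti/cora_mgen")
model = AutoModelForSeq2SeqLM.from_pretrained("gsarti/cora_mgen")

question = "Quando è nata la Repubblica Italiana?"
passage = "La Repubblica Italiana nacque ufficialmente il 2 giugno 1946."
inputs = tokenizer(f"{question} [SEP] {passage}", return_tensors="pt")

# Generate a short free-form answer conditioned on the retrieved evidence.
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```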

If you use this model, cite it as follows:

```bibtex
@inproceedings{asai-etal-2021-one,
 author = {Asai, Akari and Yu, Xinyan and Kasai, Jungo and Hajishirzi, Hanna},
 booktitle = {Advances in Neural Information Processing Systems},
 editor = {M. Ranzato and A. Beygelzimer and Y. Dauphin and P.S. Liang and J. Wortman Vaughan},
 pages = {7547--7560},
 publisher = {Curran Associates, Inc.},
 title = {One Question Answering Model for Many Languages with Cross-lingual Dense Passage Retrieval},
 url = {https://proceedings.neurips.cc/paper_files/paper/2021/file/3df07fdae1ab273a967aaa1d355b8bb6-Paper.pdf},
 volume = {34},
 year = {2021}
}
```