gsarti committed on
Commit
2220b74
1 Parent(s): 22a0edf

Update README.md

Files changed (1): README.md (+6 −5)
README.md CHANGED
@@ -44,7 +44,7 @@ thumbnail: https://gsarti.com/publication/it5/featured.png
 ---
 # mT5 Base for Question Answering ⁉️ 🇮🇹
 
-This repository contains the checkpoint for the [mT5 Base](https://huggingface.co/google/mt5-base) model fine-tuned on extractive question answering on the [SQuAD-IT corpus](https://huggingface.co/datasets/squad_it) as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
+This repository contains the checkpoint for the [mT5 Base](https://huggingface.co/google/mt5-base) model fine-tuned on extractive question answering on the [SQuAD-IT corpus](https://huggingface.co/datasets/squad_it) as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
 
 A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
 
@@ -73,10 +73,11 @@ If you use this model in your research, please cite our work as:
 
 ```bibtex
 @article{sarti-nissim-2022-it5,
-title={IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
+title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
 author={Sarti, Gabriele and Nissim, Malvina},
-journal={ArXiv preprint TBD},
-url={TBD},
-year={2022}
+journal={ArXiv preprint 2203.03759},
+url={https://arxiv.org/abs/2203.03759},
+year={2022},
+month={mar}
 }
 ```
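The README this commit updates describes a text-to-text checkpoint for extractive question answering. As a minimal sketch of how such a model is typically queried — the `"question: ... context: ..."` prompt template and the pipeline task name are assumptions based on common T5-style QA setups, not details confirmed by this diff — the input formatting might look like:

```python
# Sketch of preparing input for a T5-style extractive QA checkpoint.
# Assumption: the model was fine-tuned on prompts of the form
# "question: <q> context: <c>" (a common convention, not confirmed here).

def build_qa_input(question: str, context: str) -> str:
    """Join a question and its context into one text-to-text prompt string."""
    return f"question: {question} context: {context}"


prompt = build_qa_input(
    "Chi ha scritto il paper IT5?",
    "Il paper IT5 è stato scritto da Gabriele Sarti e Malvina Nissim.",
)
print(prompt)

# With the checkpoint downloaded, inference would then look roughly like:
#   from transformers import pipeline
#   qa = pipeline("text2text-generation", model="<this repository's model id>")
#   answer = qa(prompt)[0]["generated_text"]
```

The actual repository id and generation settings are left out deliberately, since they are not stated in this commit; the point is only the single-string prompt format that text-to-text QA fine-tuning conventionally uses.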