gsarti committed on
Commit 56d5cc7
1 Parent(s): c5cb2a5

Update README.md

Files changed (1)
  1. README.md +6 -5
README.md CHANGED
@@ -56,7 +56,7 @@ thumbnail: https://gsarti.com/publication/it5/featured.png
  ---
  # IT5 Small for Question Generation 💭 🇮🇹
 
- This repository contains the checkpoint for the [IT5 Small](https://huggingface.co/gsarti/it5-small) model fine-tuned on question generation on the [SQuAD-IT corpus](https://huggingface.co/datasets/squad_it) as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
+ This repository contains the checkpoint for the [IT5 Small](https://huggingface.co/gsarti/it5-small) model fine-tuned on question generation on the [SQuAD-IT corpus](https://huggingface.co/datasets/squad_it) as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
 
  A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
 
@@ -85,10 +85,11 @@ If you use this model in your research, please cite our work as:
 
  ```bibtex
  @article{sarti-nissim-2022-it5,
- title={IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
+ title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
  author={Sarti, Gabriele and Nissim, Malvina},
- journal={ArXiv preprint TBD},
- url={TBD},
- year={2022}
+ journal={ArXiv preprint 2203.03759},
+ url={https://arxiv.org/abs/2203.03759},
+ year={2022},
+ month={mar}
  }
  ```
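The README describes a seq2seq checkpoint hosted on the Hugging Face Hub, so it can be loaded with the standard 🤗 Transformers `pipeline` API. The sketch below is illustrative only: the repository id `gsarti/it5-small-question-generation` is an assumption based on the base-model naming pattern, and the exact input format the checkpoint expects is not stated in this diff — check the model card on the Hub before relying on either.

```python
# Minimal usage sketch for a Hub-hosted IT5 question-generation checkpoint.
# ASSUMPTION: the repository id below is inferred from the base-model name
# ("gsarti/it5-small"); verify the actual id on the Hugging Face Hub.
MODEL_ID = "gsarti/it5-small-question-generation"


def build_generator():
    # Import deferred so this module can be inspected without transformers
    # installed; the model weights are downloaded on first call.
    from transformers import pipeline

    # IT5 is a T5 variant, so the "text2text-generation" task applies.
    return pipeline("text2text-generation", model=MODEL_ID)


if __name__ == "__main__":
    qg = build_generator()
    # SQuAD-IT-style Italian passage; the checkpoint's expected prompt
    # format is not documented here, so treat this input as illustrative.
    passage = "Le Alpi sono la catena montuosa più alta d'Europa."
    print(qg(passage)[0]["generated_text"])
```

Running the script downloads the checkpoint and prints a generated Italian question; the `build_generator()` wrapper keeps the expensive model load out of module import.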