gsarti committed on
Commit b619de1
1 Parent: 7fc7df5

Update README.md

Files changed (1): README.md (+6 −5)
README.md CHANGED
@@ -74,7 +74,7 @@ thumbnail: https://gsarti.com/publication/it5/featured.png
 ---
 # IT5 Small for News Summarization ✂️🗞️ 🇮🇹
 
-This repository contains the checkpoint for the [IT5 Small](https://huggingface.co/gsarti/it5-small) model fine-tuned on news summarization on the [Fanpage](https://huggingface.co/datasets/ARTeLab/fanpage) and [Il Post](https://huggingface.co/datasets/ARTeLab/ilpost) corpora as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
+This repository contains the checkpoint for the [IT5 Small](https://huggingface.co/gsarti/it5-small) model fine-tuned on news summarization on the [Fanpage](https://huggingface.co/datasets/ARTeLab/fanpage) and [Il Post](https://huggingface.co/datasets/ARTeLab/ilpost) corpora as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
 
 A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
 
@@ -103,10 +103,11 @@ If you use this model in your research, please cite our work as:
 
 ```bibtex
 @article{sarti-nissim-2022-it5,
-title={IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
+title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
 author={Sarti, Gabriele and Nissim, Malvina},
-journal={ArXiv preprint TBD},
-url={TBD},
-year={2022}
+journal={ArXiv preprint 2203.03759},
+url={https://arxiv.org/abs/2203.03759},
+year={2022},
+month={mar}
 }
 ```