---
pipeline_tag: summarization
license: apache-2.0
language:
  - da
---

# mT5-small fine-tuned for news article summarisation ✏️🧾

Google's mT5 fine-tuned for the downstream task of abstractive summarisation.

## Model summary

This repository contains a model for Danish abstractive summarisation of news articles. The summariser is based on a language-specific mT5-small.

The model is fine-tuned on an abstractive subset of the DaNewsroom dataset (Varab & Schluter, 2020), selected according to the binned density categories employed in Newsroom (Grusky et al., 2018).
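
The checkpoint can be loaded with the 🤗 Transformers library like any other seq2seq model. The sketch below is illustrative only: the repository id placeholder and the generation parameters (beam size, length limits) are assumptions, not part of this model card, and should be adjusted to the actual Hub id and your use case.

```python
# Minimal usage sketch with 🤗 Transformers. The model id and generation
# settings are illustrative assumptions -- replace them as needed.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "<namespace>/DanSumT5-small"  # replace with the actual Hub repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "Indsæt her brødteksten fra en dansk nyhedsartikel ..."

# Tokenise the article and generate an abstractive summary.
inputs = tokenizer(article, max_length=1024, truncation=True, return_tensors="pt")
summary_ids = model.generate(
    **inputs,
    max_length=128,
    num_beams=4,
    no_repeat_ngram_size=3,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```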

## References

Grusky, M., Naaman, M., & Artzi, Y. (2018). Newsroom: A Dataset of 1.3 Million Summaries with Diverse Extractive Strategies. arXiv:1804.11283 [cs]. http://arxiv.org/abs/1804.11283

Varab, D., & Schluter, N. (2020). DaNewsroom: A Large-scale Danish Summarisation Dataset. Proceedings of the 12th Language Resources and Evaluation Conference, 6731–6739. https://aclanthology.org/2020.lrec-1.831