---
dataset_info:
  features:
  - name: url
    dtype: string
  - name: title
    dtype: string
  - name: summary
    dtype: string
  - name: article
    dtype: string
  - name: step_headers
    dtype: string
  splits:
  - name: train
    num_bytes: 315275236
    num_examples: 35775
  - name: test
    num_bytes: 17584216
    num_examples: 2000
  - name: validation
    num_bytes: 17880851
    num_examples: 2000
  download_size: 194202865
  dataset_size: 350740303
license:
- unknown
task_categories:
- summarization
language:
- en
multilinguality:
- monolingual
tags:
- abstractive-summarization
- wiki
- abstractive
pretty_name: 'WikiSum: Coherent Summarization Dataset for Efficient Human-Evaluation'
size_categories:
- 10K<n<100K
source_datasets:
- original
paperswithcode_id: wikisum
---
# wikisum
## Dataset Description
- **Homepage:** https://registry.opendata.aws/wikisum/
- **Repository:** [More Information Needed]
- **Paper:** [WikiSum: Coherent Summarization Dataset for Efficient Human-Evaluation](https://aclanthology.org/2021.acl-short.28/)
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [nachshon](mailto:nachshon@amazon.com)
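## Usage
A minimal loading sketch with the 🤗 `datasets` library is below. The Hub id `d0rj/wikisum` is an assumption inferred from the commit author; adjust it if the dataset lives under a different namespace. The split names and string fields come from the card header above.

```python
from datasets import load_dataset

# Assumed Hub id (not stated in the card itself); change if needed.
ds = load_dataset("d0rj/wikisum")

print(ds)  # DatasetDict with 'train', 'test', and 'validation' splits

# Each example exposes the five string fields declared in the card header.
example = ds["train"][0]
for field in ("url", "title", "summary", "article", "step_headers"):
    print(field, "->", example[field][:80])  # truncate long text for display
```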