Update README.md
README.md CHANGED

@@ -23,7 +23,31 @@ dataset_info:
     num_examples: 2000
   download_size: 194202865
   dataset_size: 350740303
+license:
+- unknown
+task_categories:
+- summarization
+language:
+- en
+multilinguality:
+- monolingual
+tags:
+- abstractive-summarization
+- wiki
+- abstractive
+pretty_name: 'WikiSum: Coherent Summarization Dataset for Efficient Human-Evaluation'
+size_categories:
+- 10K<n<100K
+source_datasets:
+- original
+paperswithcode_id: wikisum
 ---
-#
+# wikisum
 
-
+## Dataset Description
+
+- **Homepage:** https://registry.opendata.aws/wikisum/
+- **Repository:** https://github.com/tensorflow/tensor2tensor/tree/master/tensor2tensor/data_generators/wikisum
+- **Paper:** [Generating Wikipedia by Summarizing Long Sequences](https://arxiv.org/abs/1801.10198)
+- **Leaderboard:** [More Information Needed]
+- **Point of Contact:** [nachshon](mailto:nachshon@amazon.com)
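Since the card is stored as parquet splits on the Hub, it can be loaded with the `datasets` library. A minimal sketch, assuming the repository id resolves to `wikisum` (the owning namespace is not shown in this diff, so substitute the actual `<namespace>/<name>`):

```python
from datasets import load_dataset

# Hypothetical Hub id -- replace with the actual "<namespace>/<name>" of this repository.
ds = load_dataset("wikisum")

# List the available splits and their sizes (one split in the diff above
# reports num_examples: 2000), then peek at a single record.
for split_name, split in ds.items():
    print(split_name, split.num_rows)

first_split = list(ds)[0]
print(ds[first_split][0])
```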