d0rj committed
Commit f7379d8 (1 parent: 29e8a78)

Update README.md

Files changed (1): README.md (+26 −2)
README.md CHANGED
@@ -23,7 +23,31 @@ dataset_info:
   num_examples: 2000
   download_size: 194202865
   dataset_size: 350740303
+license:
+- unknown
+task_categories:
+- summarization
+language:
+- en
+multilinguality:
+- monolingual
+tags:
+- abstractive-summarization
+- wiki
+- abstractive
+pretty_name: 'WikiSum: Coherent Summarization Dataset for Efficient Human-Evaluation'
+size_categories:
+- 10K<n<100K
+source_datasets:
+- original
+paperswithcode_id: wikisum
 ---
-# Dataset Card for "wikisum"
+# wikisum
 
-[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
+## Dataset Description
+
+- **Homepage:** https://registry.opendata.aws/wikisum/
+- **Repository:** https://github.com/tensorflow/tensor2tensor/tree/master/tensor2tensor/data_generators/wikisum
+- **Paper:** [Generating Wikipedia by Summarizing Long Sequences](https://arxiv.org/abs/1801.10198)
+- **Leaderboard:** [More Information Needed]
+- **Point of Contact:** [nachshon](mailto:nachshon@amazon.com)
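The YAML block this commit adds is the dataset card's front matter: a config fragment delimited by `---` fences that the Hub reads to populate the card's metadata (license, task category, language, size). A minimal sketch of how such a block can be split from the README body and parsed — `parse_front_matter` is a hypothetical helper, not a Hub API, and it handles only the flat `key:` / `- item` subset used in this commit:

```python
# Sketch: split a README into YAML front matter and body, then parse the
# flat "key:" / "- item" subset this commit uses. parse_front_matter is a
# hypothetical helper for illustration, not part of any Hub library.

def parse_front_matter(readme: str):
    lines = readme.splitlines()
    assert lines[0] == "---", "front matter must open with ---"
    end = lines.index("---", 1)              # closing fence of the front matter
    meta, key = {}, None
    for line in lines[1:end]:
        if line.startswith("- ") and key is not None:
            meta[key].append(line[2:])       # list item under the current key
        elif ":" in line:
            key, _, value = line.partition(":")
            value = value.strip()
            meta[key] = value if value else []  # bare "key:" -> list follows
    body = "\n".join(lines[end + 1:])
    return meta, body

readme = """---
license:
- unknown
task_categories:
- summarization
language:
- en
size_categories:
- 10K<n<100K
paperswithcode_id: wikisum
---
# wikisum
"""

meta, body = parse_front_matter(readme)
print(meta["task_categories"])    # ['summarization']
print(meta["paperswithcode_id"])  # wikisum
```

Note that a value containing a colon (such as the quoted `pretty_name` in the diff) would need a real YAML parser; the sketch sticks to the simple keys for clarity.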