---
language_creators:
- found
languages:
- en
licenses:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Scifi_TV_Shows
size_categories:
- unknown
source_datasets:
- original
task_categories:
- other
task_ids:
- other-other-story-generation
tags:
  - Story Generation
paperswithcode_id: scifi-tv-plots
---

# Dataset Card for Science Fiction TV Show Plots Corpus

## Table of Contents
- [Dataset Description](#dataset-description)
  - [Format](#format)
- [Raw Dataset Structure](#raw-dataset-structure)
  - [_all-sci-fi-data.txt_](#all-sci-fi-datatxt)
  - [Files in _Test-Train-Val_ Directory](#files-in-test-train-val-directory)
  - [Files in _Input_OutputFiles_ Directory](#files-in-input_outputfiles-directory)
  - [Files in _OriginalStoriesSeparated_ Directory](#files-in-originalstoriesseparated-directory)
- [Additional Information](#additional-information)
  - [Citation](#citation)
  - [Licensing](#licensing)
  
## Dataset Description
A collection of plot synopses from long-running (80+ episode) science fiction TV shows, scraped from Fandom.com wikis in November 2017. Each episode is treated as a single "story".

Contains plot summaries from: 
- [Babylon 5](https://babylon5.fandom.com/wiki/Main_Page) - 84 stories
- [Doctor Who](https://tardis.fandom.com/wiki/Doctor_Who_Wiki) - 311 stories
- Doctor Who spin-offs - 95 stories
- [Farscape](https://farscape.fandom.com/wiki/Farscape_Encyclopedia_Project:Main_Page) - 90 stories
- [Fringe](https://fringe.fandom.com/wiki/FringeWiki) - 87 stories
- [Futurama](https://futurama.fandom.com/wiki/Futurama_Wiki) - 87 stories
- [Stargate](https://stargate.fandom.com/wiki/Stargate_Wiki) - 351 stories
- [Star Trek](https://memory-alpha.fandom.com/wiki/Star_Trek) - 701 stories
- [Star Wars books](https://starwars.fandom.com/wiki/Main_Page) - 205 stories (each book is one story)
- Star Wars Rebels - 65 stories
- [X-Files](https://x-files.fandom.com/wiki/Main_Page) - 200 stories

Total: 2276 stories

The dataset is "eventified" and generalized (see LJ Martin, P Ammanabrolu, X Wang, W Hancock, S Singh, B Harrison, and MO Riedl, "Event Representations for Automated Story Generation with Deep Neural Nets," Thirty-Second AAAI Conference on Artificial Intelligence (AAAI), 2018, for details on these processes) and split into train/test/validation sets for the task of converting events into full sentences. The split is made by story, so that complete stories stay within a single split.

### Format
| Dataset Split | Number of Stories in Split | Number of Lines in Split |
| ------------- |--------------------------- |------------------------- |
| Train         | 1738                       | 257,184                  |
| Validation    | 194                        | 32,855                   |
| Test          | 450                        | 30,938                   |

---
## Raw Dataset Structure
### _all-sci-fi-data.txt_
* Each line of a story contains four fields separated by `|||`: a list of 5-tuple events (subject, verb, direct object, modifier noun, preposition), the corresponding list of generalized 5-tuple events, the original sentence, and the generalized sentence
```
[[u'Voyager', u'run', 'EmptyParameter', u'deuterium', u'out'], [u'Voyager', u'force', u'go', 'EmptyParameter', 'EmptyParameter'], [u'Voyager', u'go', 'EmptyParameter', u'mode', u'into']]|||[['<VESSEL>0', 'function-105.2.1', 'EmptyParameter', "Synset('atom.n.01')", u'out'], ['<VESSEL>0', 'urge-58.1-1', u'escape-51.1-1', 'EmptyParameter', 'EmptyParameter'], ['<VESSEL>0', u'escape-51.1-1', 'EmptyParameter', "Synset('statistic.n.01')", u'into']]|||The USS Voyager is running out of deuterium as a fuel and is forced to go into Gray mode.|||the <VESSEL>0 is running out of Synset('atom.n.01') as a Synset('matter.n.03') and is forced to go into Synset('horse.n.01') Synset('statistic.n.01').
```
* Each story ends with an &lt;EOS> (end-of-story) tag on its own line
* The line after &lt;EOS> holds a defaultdict of the entities found in the story, keyed by tag and listed in order of appearance (e.g., the second entity in the "&lt;ORGANIZATION>" list in the dictionary would be &lt;ORGANIZATION>1 in the story above; indexing starts at 0). These lines start with "%%%%%%%%%%%%%%%%%".
```
%%%%%%%%%%%%%%%%%defaultdict(<type 'list'>, {'<ORGANIZATION>': ['seven of nine', 'silver blood'], '<LOCATION>': ['sickbay', 'astrometrics', 'paris', 'cavern', 'vorik', 'caves'], '<DATE>': ['an hour ago', 'now'], '<MISC>': ['selected works', 'demon class', 'electromagnetic', 'parises', 'mimetic'], '<DURATION>': ['less than a week', 'the past four years', 'thirty seconds', 'an hour', 'two hours'], '<NUMBER>': ['two', 'dozen', '14', '15'], '<ORDINAL>': ['first'], '<PERSON>': ['tom paris', 'harry kim', 'captain kathryn janeway', 'tuvok', 'chakotay', 'jirex', 'neelix', 'the doctor', 'seven', 'ensign kashimuro nozawa', 'green', 'lt jg elanna torres', 'ensign vorik'], '<VESSEL>': ['uss voyager', 'starfleet']})
```
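Because the first two fields are Python-style list literals and the entity lines wrap a plain dict literal, these lines can be parsed with the standard library alone. A minimal sketch (the function names are illustrative, not part of the dataset):

```python
import ast

def parse_event_line(line):
    """Split one story line into its four |||-delimited fields.

    Returns (events, generalized_events, sentence, generalized_sentence).
    The first two fields are Python-list literals (u'...' prefixes are
    still valid syntax in Python 3), so ast.literal_eval can parse them.
    """
    events, gen_events, sentence, gen_sentence = line.rstrip("\n").split("|||")
    return (ast.literal_eval(events), ast.literal_eval(gen_events),
            sentence, gen_sentence)

def parse_entity_line(line):
    """Turn a '%%%...defaultdict(...)' entity line into a plain dict."""
    body = line.lstrip("%")
    prefix = "defaultdict(<type 'list'>, "
    if body.startswith(prefix):
        body = body[len(prefix):-1]  # drop the wrapper and trailing ')'
    return ast.literal_eval(body)
```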

### Files in _Test-Train-Val_ Directory
* File names: all-sci-fi-val.txt, all-sci-fi-test.txt, & all-sci-fi-train.txt
* Each line of a story contains the same four `|||`-separated fields: 5-tuple events in a list ||| generalized 5-tuple events in a list ||| original sentence ||| generalized sentence
```
[[u'Voyager', u'run', 'EmptyParameter', u'deuterium', u'out'], [u'Voyager', u'force', u'go', 'EmptyParameter', 'EmptyParameter'], [u'Voyager', u'go', 'EmptyParameter', u'mode', u'into']]|||[['<VESSEL>0', 'function-105.2.1', 'EmptyParameter', "Synset('atom.n.01')", u'out'], ['<VESSEL>0', 'urge-58.1-1', u'escape-51.1-1', 'EmptyParameter', 'EmptyParameter'], ['<VESSEL>0', u'escape-51.1-1', 'EmptyParameter', "Synset('statistic.n.01')", u'into']]|||The USS Voyager is running out of deuterium as a fuel and is forced to go into Gray mode.|||the <VESSEL>0 is running out of Synset('atom.n.01') as a Synset('matter.n.03') and is forced to go into Synset('horse.n.01') Synset('statistic.n.01').
```
* No &lt;EOS> tags or dictionary.
* Split 80-10-10 into train/test/validation sets, by story rather than by individual lines.
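A story-level 80-10-10 split of this kind could be sketched as follows (the function and seed are illustrative; this is not the exact procedure used to produce the released splits):

```python
import random

def split_by_story(stories, seed=0):
    """Cut whole stories 80-10-10 into train/test/validation sets.

    `stories` is a list of stories, each story a list of its lines.
    Shuffling and slicing at the story level keeps every story intact
    within a single split, instead of scattering its lines.
    """
    rng = random.Random(seed)
    stories = list(stories)
    rng.shuffle(stories)
    n_train = int(0.8 * len(stories))
    n_test = int(0.1 * len(stories))
    return (stories[:n_train],
            stories[n_train:n_train + n_test],
            stories[n_train + n_test:])
```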


### Files in _Input_OutputFiles_ Directory
**Files ending with _*\_input.txt_**
* One generalized 5-tuple event per line, formatted as a space-separated string rather than a list
```
<VESSEL>0 function-105.2.1 EmptyParameter Synset('atom.n.01') out
```

**Files ending with _*\_output.txt_**
* Corresponding generalized sentence for events in the matching _input.txt file
```
the <VESSEL>0 is running out of Synset('atom.n.01') as a Synset('matter.n.03') and is forced to go into Synset('horse.n.01') Synset('statistic.n.01').
```
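Since the input and output files are line-aligned, event-to-sentence training pairs can be assembled with a sketch like this (`load_pairs` is an illustrative helper, not part of the dataset):

```python
def load_pairs(input_path, output_path):
    """Pair each generalized event string with its target sentence.

    Assumes the *_input.txt and *_output.txt files are line-aligned,
    as described above.
    """
    with open(input_path, encoding="utf-8") as fi, \
         open(output_path, encoding="utf-8") as fo:
        return [(event.rstrip("\n"), sent.rstrip("\n"))
                for event, sent in zip(fi, fo)]
```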


### Files in _OriginalStoriesSeparated_ Directory
* Contains unedited, unparsed original stories scraped from the respective Fandom wikis.
* Each line is one story, with sentences separated by spaces. After each story, an &lt;EOS> tag appears on its own line.
* There is one file for each of the 11 domains listed above.
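Given that layout, the raw stories can be read back by skipping the separator lines; a minimal sketch (the function name is illustrative):

```python
def read_original_stories(path):
    """Yield one raw story per line, skipping the <EOS> separator lines."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if line and line != "<EOS>":
                yield line
```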
---
## Additional Information
### Citation
```
@inproceedings{Ammanabrolu2020AAAI,
  title={Story Realization: Expanding Plot Events into Sentences},
  author={Prithviraj Ammanabrolu and Ethan Tien and Wesley Cheung and Zhaochen Luo and William Ma and Lara J. Martin and Mark O. Riedl},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence (AAAI)},
  year={2020},
  volume={34},
  number={05},
  url={https://ojs.aaai.org//index.php/AAAI/article/view/6232}
}
```
---
### Licensing
This dataset is released under the Creative Commons Attribution 4.0 International License: https://creativecommons.org/licenses/by/4.0/