---
dataset_info:
  features:
  - name: input
    dtype: string
  - name: output
    dtype: string
  - name: type
    dtype: string
  splits:
  - name: train
    num_bytes: 194097976.97925937
    num_examples: 10659
  - name: test
    num_bytes: 25683201.043425813
    num_examples: 1570
  - name: validation
    num_bytes: 35799607.99283796
    num_examples: 1824
  download_size: 92249754
  dataset_size: 255580786.01552314
---

# Dataset Card for "booksum-summary-analysis-8192"

Subset of [emozilla/booksum-summary-analysis](https://huggingface.co/datasets/emozilla/booksum-summary-analysis) containing only the entries that are shorter than 8,192 tokens under the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
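A token-length filter like the one described above could be reproduced roughly as follows. This is a sketch, not the script that actually produced this subset: the exact fields counted against the 8,192-token budget are an assumption, and the `toy_tokenizer` below is a stand-in so the function can run without downloading the real GPT-NeoX-20B tokenizer (the commented lines show how the `transformers`/`datasets` version might look).

```python
# Illustrative re-creation of the 8,192-token cutoff.
# With the real libraries it might look like (untested sketch):
#
#   from datasets import load_dataset
#   from transformers import AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
#   ds = load_dataset("emozilla/booksum-summary-analysis")
#   ds = ds.filter(lambda ex: fits_context(ex, lambda t: tok(t)["input_ids"]))

def fits_context(example, tokenize, max_tokens=8192):
    """Return True if the example's text fits under max_tokens.

    Which fields are counted (here: input + output) is an assumption,
    not documented by the original dataset.
    """
    text = example["input"] + example["output"]
    return len(tokenize(text)) < max_tokens

# Hypothetical whitespace tokenizer standing in for the real one.
def toy_tokenizer(text):
    return text.split()

example = {"input": "Summarize this chapter. ", "output": "A short summary.", "type": "summary"}
print(fits_context(example, toy_tokenizer))  # True: well under the budget
```

The tokenizer is passed in as a callable so the same filter works with any tokenizer, including the Hugging Face one shown in the comment.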