pszemraj committed
Commit
4d9acab
1 Parent(s): 1cb2bed

Update README.md

Files changed (1): README.md (+9 -1)
README.md CHANGED
@@ -344,11 +344,19 @@ model-index:
 
 # Longformer Encoder-Decoder (LED) for Narrative-Esque Long Text Summarization
 
+
+[![colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/gist/pszemraj/36950064ca76161d9d258e5cdbfa6833/led-base-demo-token-batching.ipynb)
+
 - **What:** This is the (current) result of the quest for a summarization model that condenses technical/long information down well _in general, academic, and narrative usage_.
 - **Use cases:** long narrative summarization (think stories, as the dataset intended), article/paper/textbook/other summarization, technical-to-simple summarization.
   - Models trained on this dataset tend to also _explain_ what they are summarizing, which IMO is awesome.
-- works well on lots of text, and can hand 16384 tokens/batch.
+- Works well on lots of text and can handle 16,384 tokens/batch.
+- See examples in the Colab demo linked above, or try the [demo on Spaces](https://huggingface.co/spaces/pszemraj/summarize-long-text).
+
+> Note: the API is set to generate a max of 64 tokens for runtime reasons, so the summaries may be truncated (depending on the length of the input text). For best results, use Python as below.
 
 ## About
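The 16,384 tokens/batch figure above implies that inputs longer than the model's window must be split into token batches before summarization. A minimal stdlib-only sketch of that chunking step follows; the `chunk_tokens` helper and the 64-token overlap are illustrative assumptions, not part of the model card or the linked notebook:

```python
def chunk_tokens(tokens, max_len=16384, overlap=64):
    """Split a token sequence into batches of at most max_len tokens.

    A small overlap (assumed value, tune as needed) repeats the tail of
    one chunk at the head of the next, so context is not cut abruptly
    at chunk boundaries.
    """
    if max_len <= overlap:
        raise ValueError("max_len must be larger than overlap")
    chunks = []
    start = 0
    while start < len(tokens):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # this chunk reaches the end of the input
        start += max_len - overlap  # step forward, keeping the overlap
    return chunks
```

In practice you would produce `tokens` with the model's tokenizer, summarize each chunk, and concatenate the partial summaries (optionally summarizing the result again).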