pszemraj committed
Commit b7c31bc
1 Parent(s): 38be53c

fix colab link

Files changed (1): README.md (+13 -14)
README.md CHANGED

@@ -282,10 +282,11 @@ model-index:
 
 # Longformer Encoder-Decoder (LED) for Narrative-Esque Long Text Summarization
 
-[![colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/gist/pszemraj/3eba944ddc9fc9a4a1bfb21e83b57620/summarization-token-batching.ipynb)
-
-A fine-tuned version of [allenai/led-large-16384](https://huggingface.co/allenai/led-large-16384) on the BookSum dataset.
+<a href="https://colab.research.google.com/gist/pszemraj/3eba944ddc9fc9a4a1bfb21e83b57620/summarization-token-batching.ipynb">
+    <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
+</a>
 
+A fine-tuned version of [allenai/led-large-16384](https://huggingface.co/allenai/led-large-16384) on the `BookSum` dataset.
 
 Goal: a model that can generalize well and is useful in summarizing long text in academic and daily usage. The result works well on lots of text and can handle 16384 tokens/batch (_if you have the GPU memory to handle that_)
 
@@ -322,17 +323,15 @@ summarizer = pipeline(
 wall_of_text = "your words here"
 
 result = summarizer(
-    wall_of_text,
-    min_length=16,
-    max_length=256,
-    no_repeat_ngram_size=3,
-    encoder_no_repeat_ngram_size =3,
-    repetition_penalty=3.5,
-    num_beams=4,
-    early_stopping=True,
-)
-
-
+    wall_of_text,
+    min_length=16,
+    max_length=256,
+    no_repeat_ngram_size=3,
+    encoder_no_repeat_ngram_size=3,
+    repetition_penalty=3.5,
+    num_beams=4,
+    early_stopping=True,
+)
 ```
 
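The README notes the model can handle up to 16384 tokens per batch, and the linked `summarization-token-batching` notebook processes longer inputs in batches. As a rough illustration of that idea only, here is a hypothetical helper (not code from the repo or notebook) that splits a long document into chunks; it counts whitespace-separated words as a crude stand-in for the model tokenizer's token count:

```python
# Hypothetical sketch: split a long document into chunks so each stays under
# a per-batch size limit. A real implementation would count tokens with the
# model's tokenizer; whitespace words are only a rough proxy here.

def batch_words(text: str, max_words: int = 16384) -> list[str]:
    """Split `text` into chunks of at most `max_words` whitespace-separated words."""
    words = text.split()
    return [
        " ".join(words[i : i + max_words])
        for i in range(0, len(words), max_words)
    ]
```

Each chunk could then be passed to the `summarizer(...)` call shown in the diff above, with the per-chunk summaries concatenated afterwards.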