pszemraj committed on
Commit 115e7fc
1 Parent(s): 9e14a17

Update README.md

Files changed (1): README.md +9 -9
README.md CHANGED
@@ -433,12 +433,12 @@ The Longformer Encoder-Decoder (LED) for Narrative-Esque Long Text Summarization

  ## Key Features and Use Cases

- - Ideal for summarizing long narratives, articles, papers, textbooks, and other technical documents.
- - Trained to also explain the summarized content, offering insightful output.
+ - Ideal for summarizing long narratives, articles, papers, textbooks, and other documents.
+ - The SparkNotes-esque style leads to 'explanations' in the summarized content, offering insightful output.
  - High capacity: Handles up to 16,384 tokens per batch.
- - Live demo available: [Colab demo](https://colab.research.google.com/gist/pszemraj/36950064ca76161d9d258e5cdbfa6833/led-base-demo-token-batching.ipynb) and [demo on Spaces](https://huggingface.co/spaces/pszemraj/summarize-long-text).
+ - Demos: try it out in the notebook linked above or in the [demo on Spaces](https://huggingface.co/spaces/pszemraj/summarize-long-text).

- > **Note:** The API is configured to generate a maximum of 64 tokens due to runtime constraints. For optimal results, use the Python approach detailed below.
+ > **Note:** The API is configured to generate a maximum of ~96 tokens due to inference timeout constraints. For better results, use the Python approach detailed below.

  ## Training Details
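
The note in the hunk above points to "the Python approach detailed below," a part of the README this diff does not show. As a minimal sketch of that kind of usage with the `transformers` summarization pipeline (the generation parameter values here are illustrative assumptions, not the model card's exact settings):

```python
import torch
from transformers import pipeline

# Load the fine-tuned LED checkpoint via the standard summarization pipeline.
summarizer = pipeline(
    "summarization",
    model="pszemraj/led-base-16384-finetuned-booksum",
    device=0 if torch.cuda.is_available() else -1,
)

long_text = "..."  # the document to summarize; LED accepts up to 16,384 input tokens

result = summarizer(
    long_text,
    min_length=8,
    max_length=256,          # well above the hosted API's ~96-token cap
    no_repeat_ngram_size=3,  # assumed/illustrative decoding settings
    num_beams=4,
    early_stopping=True,
)
print(result[0]["summary_text"])
```

Running the model locally this way sidesteps the hosted API's token cap entirely.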
 
@@ -446,16 +446,14 @@ The model was trained on the BookSum dataset released by SalesForce, which leads

  Model checkpoint: [`pszemraj/led-base-16384-finetuned-booksum`](https://huggingface.co/pszemraj/led-base-16384-finetuned-booksum).

- For comparison, all generation parameters for the API have been kept consistent across versions.
-
  ## Other Related Checkpoints

  Apart from the LED-based model, I have also fine-tuned other models on `kmfoda/booksum`:

- - [Long-T5-Global-Base](https://huggingface.co/pszemraj/long-t5-tglobal-base-16384-book-summary)
+ - [Long-T5-tglobal-base](https://huggingface.co/pszemraj/long-t5-tglobal-base-16384-book-summary)
  - [BigBird-Pegasus-Large-K](https://huggingface.co/pszemraj/bigbird-pegasus-large-K-booksum)
  - [Pegasus-X-Large](https://huggingface.co/pszemraj/pegasus-x-large-book-summary)
- - [Long-T5-Global-XL](https://huggingface.co/pszemraj/long-t5-tglobal-xl-16384-book-summary)
+ - [Long-T5-tglobal-XL](https://huggingface.co/pszemraj/long-t5-tglobal-xl-16384-book-summary)

  There are also other variants fine-tuned on other datasets on my HF profile; feel free to try them out :)
@@ -524,6 +522,8 @@ out_str = summarizer.summarize_string(long_string)
  print(f"summary: {out_str}")
  ```

- Currently implemented interfaces include a Python API, a Command-Line Interface (CLI), and a shareable demo application. For detailed explanations and documentation, check the README or the wiki.
+ Currently implemented interfaces include a Python API, a Command-Line Interface (CLI), and a shareable demo/web UI.
+
+ For detailed explanations and documentation, check the [README](https://github.com/pszemraj/textsum) or the [wiki](https://github.com/pszemraj/textsum/wiki).

  ---
 
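
The `summarize_string` call in the hunk above belongs to the `textsum` package's Python API. A minimal sketch of the surrounding usage, assuming the constructor takes a `model_name_or_path` argument (the exact signature is documented in the linked README):

```python
# pip install textsum
from textsum.summarize import Summarizer

# model_name_or_path and its value are assumptions based on this model card;
# check the textsum README for the exact constructor signature.
summarizer = Summarizer(model_name_or_path="pszemraj/led-base-16384-finetuned-booksum")

long_string = "..."  # the long document to summarize

out_str = summarizer.summarize_string(long_string)
print(f"summary: {out_str}")
```

The CLI and demo/web UI mentioned in the diff are covered in the linked README and wiki.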