Shobhank-iiitdwd committed
Commit 9c3f3fb
Parent: e35516c

Update README.md

Files changed (1): README.md (+2, -5)
README.md CHANGED
@@ -470,8 +470,8 @@ model-index:
 - [Inference over long documents in batches](#how-to-run-inference-over-a-very-long-30k-tokens-document-in-batches)
 [How to fine-tune further](#how-to-fine-tune-further)
 - [Training procedure](#training-procedure)
-- [Training hyperparameters](#training-hyperparameters)
-- [Framework versions](#framework-versions)
+- [Training hyperparameters](#training-hyperparameters)
+- [Framework versions](#framework-versions)
 
 
 <!-- /TOC -->
@@ -521,9 +521,6 @@ Pass [other parameters related to beam search textgen](https://huggingface.co/bl
 `kmfoda/booksum` dataset on HuggingFace - read [the original paper here](https://arxiv.org/abs/2105.08209). Summaries longer than 1024 LongT5 tokens were filtered out to prevent the model from learning to generate "partial" summaries.
 
 
-* * *
-
-## FAQ
 
 ### How to run inference over a very long (30k+ tokens) document in batches?
 
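A context line in the diff above notes that BookSum summaries longer than 1024 LongT5 tokens were filtered out of the training data. Below is a minimal sketch of such a length filter, not the authors' actual preprocessing code: `tokenize` stands in for any callable that maps a string to a list of token ids (e.g. a LongT5 tokenizer's encode step), and the `summary` field name is an assumption about the dataset schema.

```python
# Hedged sketch of the summary-length filter described in the diff:
# drop rows whose reference summary exceeds a token budget, so the
# model never trains on truncated "partial" targets.

def keep_example(summary: str, tokenize, max_tokens: int = 1024) -> bool:
    """Return True if the tokenized summary fits within max_tokens."""
    return len(tokenize(summary)) <= max_tokens

def filter_rows(rows, tokenize, max_tokens: int = 1024):
    """Keep only rows (dicts with a 'summary' field) whose summary passes
    the length check."""
    return [r for r in rows if keep_example(r["summary"], tokenize, max_tokens)]
```

With the real dataset, the same predicate could be applied via `datasets.load_dataset("kmfoda/booksum")` followed by `.filter(...)`, passing the LongT5 tokenizer as `tokenize`; those wiring details are assumptions, not taken from this commit.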