abertsch committed on
Commit
782e7f0
1 Parent(s): f2a7c2d

Update README.md

Files changed (1)
  1. README.md +3 -0
README.md CHANGED
@@ -2,7 +2,10 @@
 datasets:
 - abertsch/booksum-fullbooks
 pipeline_tag: text2text-generation
+inference: false
 ---
 Model from the preprint [Unlimiformer: Long-Range Transformers with Unlimited Length Input](https://arxiv.org/abs/2305.01625).

 This model was finetuned from a BART-base model using the alternating-training strategy described in section 3.2 of the paper. It was finetuned on the dataset BookSum (full-book setting).
+
+*The inference demo is disabled because you must add the Unlimiformer files to your repo before this model can handle unlimited length input!* See the [Unlimiformer GitHub](https://github.com/abertsch72/unlimiformer) for setup instructions.
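As a rough illustration of that note: without the Unlimiformer files, the checkpoint still loads as a standard BART-base summarizer through Transformers, just without unlimited-length input. The sketch below uses a hypothetical placeholder repo ID (substitute this model's actual Hub ID); full-book inputs additionally require the setup from the Unlimiformer GitHub.

```python
# Minimal sketch: loading the checkpoint as a plain BART-base summarizer.
# NOTE: "abertsch/<this-model>" is a hypothetical placeholder for this repo's ID.
# Without the Unlimiformer wrapper code, inputs are truncated to BART's usual limit.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "abertsch/<this-model>"  # placeholder; use the actual Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "First chapter of a very long book ..."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=256)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

To summarize entire books as described in the paper, follow the Unlimiformer GitHub instructions and wrap the loaded model with the Unlimiformer code before calling generation.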