---
datasets:
  - yuvalkirstain/summ_screen_fd_t5_lm
  - urialon/summ_screen_validation
  - urialon/summ_screen_test
pipeline_tag: text2text-generation
inference: false
---

This model is from the preprint [Unlimiformer: Long-Range Transformers with Unlimited Length Input](https://arxiv.org/abs/2305.01625).

This model was finetuned from a BART-base model using the retrieval-augmented training strategy described in Section 3.2 of the paper. It was finetuned on the SummScreen dataset using the data preprocessing pipeline from SLED; to load the validation or test set for use with this model, please use the datasets urialon/summ_screen_validation and urialon/summ_screen_test, as in the example below.
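
For example, with the Hugging Face `datasets` library installed, the evaluation sets can be loaded as follows. This is a minimal sketch: the dataset repo IDs come from this card, but relying on the default configuration is an assumption, so check the dataset pages on the Hub for the exact splits available.

```python
from datasets import load_dataset

# Load the SummScreen evaluation sets preprocessed with the SLED pipeline.
# Repo IDs are from this model card; the default configuration is an
# assumption -- see the dataset pages on the Hub for available splits.
validation = load_dataset("urialon/summ_screen_validation")
test = load_dataset("urialon/summ_screen_test")

print(validation)
```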

This is the strongest of the Unlimiformer models for SummScreen.

The inference demo is disabled because you must add the Unlimiformer files to your repository before this model can handle unlimited-length input. See the Unlimiformer GitHub repository for setup instructions.
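
Without the Unlimiformer code, the checkpoint still loads as a standard BART model through `transformers`, with inputs limited to BART's usual 1,024-token context. A minimal sketch, assuming a hypothetical repo ID (substitute the actual model ID from this page):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repo ID -- substitute the actual model ID shown on this page.
model_id = "abertsch/unlimiformer-bart-summscreen-retrieval"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Without the Unlimiformer wrapper, inputs are truncated to BART's standard
# 1024-token limit; add the code from the Unlimiformer GitHub repository to
# handle unlimited-length input.
inputs = tokenizer(
    "An example transcript ...",
    return_tensors="pt",
    truncation=True,
    max_length=1024,
)
summary_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```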