---
datasets:
  - yuvalkirstain/summ_screen_fd_t5_lm
  - urialon/summ_screen_validation
  - urialon/summ_screen_test
---

Model from the preprint *Unlimiformer: Long-Range Transformers with Unlimited Length Input*.

This model was finetuned from a BART-base model using Unlimiformer-aware early stopping, described in section 3.1 of the paper. It was finetuned on the SummScreen dataset using the data preprocessing pipeline from SLED; to load the validation or test set for use with this model, please use the datasets urialon/summ_screen_validation and urialon/summ_screen_test.
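A minimal sketch of loading those evaluation splits with the `datasets` library (assumes `datasets` is installed and the Hub is reachable; the repository names are taken directly from this card):

```python
"""Sketch: fetching the preprocessed SummScreen splits named in this card."""

# Hub dataset repositories referenced above.
VALIDATION_REPO = "urialon/summ_screen_validation"
TEST_REPO = "urialon/summ_screen_test"


def load_summ_screen_split(repo_id: str):
    """Download and return one preprocessed SummScreen split from the Hub.

    The import is kept local so this module can be inspected without the
    `datasets` package installed; calling the function requires it (and
    network access for the first download).
    """
    from datasets import load_dataset

    return load_dataset(repo_id)


if __name__ == "__main__":
    validation = load_summ_screen_split(VALIDATION_REPO)
    print(validation)
```

The same `load_summ_screen_split` call with `TEST_REPO` retrieves the test set.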

This is generally a weaker model than the retrieval-trained model and a stronger model than the baseline.