Baseline model for the preprint *Unlimiformer: Long-Range Transformers with Unlimited Length Input*.

This model was finetuned from BART-base as a baseline for that work, using the BookSum dataset (full-book setting).
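A minimal usage sketch, assuming the standard `transformers` API (`AutoTokenizer` / `AutoModelForSeq2SeqLM`); loading the checkpoint requires network access to the Hugging Face Hub. Note that as a vanilla BART-base baseline, inputs longer than the model's maximum length are simply truncated, unlike with Unlimiformer.

```python
# Sketch of loading the baseline checkpoint with the transformers library.
# The generation settings here are illustrative, not from the original paper.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "abertsch/bart-base-booksum"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "The old house stood at the end of the lane, its windows dark for many years."
# truncation=True clips inputs to the model's max length (1024 tokens for BART-base)
inputs = tokenizer(text, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=64)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```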


Dataset used to train abertsch/bart-base-booksum: BookSum (full-book setting).