Baseline model for the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input.

This model is a baseline finetuned from BART-base on the BookSum dataset (full-book setting).
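Below is a minimal usage sketch, not taken from the model card itself, assuming the standard Hugging Face transformers seq2seq API; the input text and generation settings are illustrative placeholders.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "abertsch/bart-base-booksum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Placeholder input; as a plain BART-base baseline, the model truncates
# inputs beyond its 1024-token encoder context.
book_text = "..."

inputs = tokenizer(book_text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_length=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```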

