Baseline for the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input.

This model (`abertsch/bart-base-booksum`) was finetuned from BART-base on the BookSum dataset (full-book setting) to serve as a baseline.
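A minimal sketch of loading this checkpoint for summarization with the Hugging Face `transformers` library; the input text and generation settings below are illustrative, not the ones used in the paper.

```python
# Sketch: load the baseline checkpoint and summarize a passage.
# Generation settings are illustrative defaults, not the paper's.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "abertsch/bart-base-booksum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "Chapter 1. It was a bright cold day in April..."
# As a plain BART-base model, input is truncated to the 1024-token limit;
# the Unlimiformer method (not this checkpoint alone) removes that limit.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```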
