Model from the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input
This is a BART-base model finetuned using Unlimiformer-aware early stopping, as described in Section 3.1 of the paper. The model was finetuned on GovReport using the data processing pipeline from SLED; to load the validation or test set for use with this model, please use the datasets urialon/gov_report_validation and urialon/gov_report_test.
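A minimal sketch of loading the checkpoint and the SLED-processed splits with the transformers and datasets libraries. The model identifier below is a placeholder for this repository's Hub name; note also that Unlimiformer's long-range attention at inference time requires the wrappers from the authors' codebase rather than plain transformers generation.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from datasets import load_dataset

# Placeholder: substitute the Hub identifier of this model repository.
model_name = "path/to/this-model"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# GovReport splits processed with the SLED pipeline, as referenced above.
validation = load_dataset("urialon/gov_report_validation")
test = load_dataset("urialon/gov_report_test")
```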
This model is generally weaker than the alternating-training model and stronger than the baseline.