Model from the preprint [Unlimiformer: Long-Range Transformers with Unlimited Length Input](https://arxiv.org/abs/2305.01625)
This is a BART-base model finetuned using Unlimiformer-aware early stopping, as described in section 3.1 of the paper. The model was finetuned on GovReport using the data processing pipeline from SLED; to load the validation or test set for use with this model, please use the datasets [urialon/gov_report_validation](https://huggingface.co/datasets/urialon/gov_report_validation) and [urialon/gov_report_test](https://huggingface.co/datasets/urialon/gov_report_test).
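As a minimal sketch, the splits and checkpoint could be loaded with `datasets` and `transformers` as below. The model id used here is an assumption (this card does not state its own repository name), and note that unlimited-length Unlimiformer inference additionally requires the Unlimiformer codebase; plain `transformers` loads the checkpoint as a standard BART-base model.

```python
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# GovReport splits preprocessed with the SLED pipeline, as linked above.
validation = load_dataset("urialon/gov_report_validation")
test = load_dataset("urialon/gov_report_test")

# Hypothetical model id -- replace with this repository's actual name.
model_id = "abertsch/unlimiformer-earlyk-bart-govreport"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
```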
This is generally a weaker model than the [alternating-training model](https://huggingface.co/abertsch/unlimiformer-bart-govreport-alternating) and a stronger model than the [baseline](https://huggingface.co/abertsch/bart-base-govreport).