COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
This model card describes the COCO-LM model (large++ version) proposed in the NeurIPS 2021 paper cited below. The official GitHub repository is https://github.com/microsoft/COCO-LM.
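A minimal loading sketch is given below. It assumes the `cocolm` package from the official GitHub repository is installed, that it exposes `COCOLMModel` under `cocolm.modeling_cocolm`, and that the Hub checkpoint ID for this large++ release is `microsoft/cocolm-large`; all of these are assumptions rather than statements made in this card.

```python
# Minimal loading sketch (assumptions, not documented facts from this card):
# - the `cocolm` package from the official GitHub repository is installed,
# - it exposes COCOLMModel under cocolm.modeling_cocolm,
# - the Hub checkpoint ID for this large++ release is "microsoft/cocolm-large".
import torch
from cocolm.modeling_cocolm import COCOLMModel  # assumed module path

model = COCOLMModel.from_pretrained("microsoft/cocolm-large")  # assumed checkpoint ID
model.eval()

# Forward pass on a toy batch of already-tokenized input IDs; real inputs should
# be produced by the tokenizer shipped with the official repository.
input_ids = torch.tensor([[0, 2156, 908, 2]])  # placeholder IDs for illustration only
with torch.no_grad():
    outputs = model(input_ids)
```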
Citation
If you find this model useful for your research, please cite the following paper:
@inproceedings{meng2021coco,
  title={{COCO-LM}: Correcting and contrasting text sequences for language model pretraining},
  author={Meng, Yu and Xiong, Chenyan and Bajaj, Payal and Tiwary, Saurabh and Bennett, Paul and Han, Jiawei and Song, Xia},
  booktitle={NeurIPS},
  year={2021}
}