Model:
microsoft/cocolm-large
Preprint:
arxiv:2102.08473

This model card contains the COCO-LM model (large++ version) proposed in this paper. The official GitHub repository can be found here.
If you find this model useful for your research, please cite the following paper:
@inproceedings{meng2021coco,
  title={{COCO-LM}: Correcting and contrasting text sequences for language model pretraining},
  author={Meng, Yu and Xiong, Chenyan and Bajaj, Payal and Tiwary, Saurabh and Bennett, Paul and Han, Jiawei and Song, Xia},
  booktitle={NeurIPS},
  year={2021}
}