
# bert-chinese-homie-large

This is a Chinese BERT model pre-trained on a large-scale corpus. It is suitable for fine-tuning on specific downstream tasks, or as parameter initialization for further pre-training, which can improve performance. Because of the extensive training tweaks applied (what Chinese ML slang calls "alchemy"), it is not suitable for direct fill-mask use unless you first perform a small amount of additional pre-training.
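As a rough sketch of the intended usage with the transformers library: the Hub repo id below is a placeholder assumption, since the card does not state it; replace it with the actual id. The first load attaches a randomly initialized classification head for downstream fine-tuning; the second loads the masked-language-modeling head for the small amount of continued pre-training recommended before fill-mask use.

```python
from transformers import (
    AutoModelForMaskedLM,
    AutoModelForSequenceClassification,
    AutoTokenizer,
)

model_id = "bert-chinese-homie-large"  # hypothetical Hub id; substitute the real one

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Fine-tuning: attach a randomly initialized classification head to the encoder.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("这是一个中文句子。", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2]); not meaningful until fine-tuned

# Continued pre-training: load the MLM head instead, and run a small amount
# of masked-language-modeling training before attempting fill-mask.
mlm_model = AutoModelForMaskedLM.from_pretrained(model_id)
```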

I don't know exactly what "homie" means, but someone calls me that, and I can feel the meaning. I find this an interesting natural language processing phenomenon.

## Bibtex entry and citation info

Please cite the following paper if you find this model helpful.

```bibtex
@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}
```

License: other
