---
language:
  - en
  - zh
pipeline_tag: text-generation
license: other
---

# hongyin/chat-informer-70b-80k

I am pleased to introduce an English-Chinese conversation assistant designed to reduce the cost of inference. It is trained on top of Llama2-chat-70B (70 billion parameters) and uses a custom vocabulary.
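Below is a minimal usage sketch with the Hugging Face `transformers` library. The repository id is taken from this card; the dtype, device mapping, and the Human/Assistant prompt format are assumptions and may need to be adapted for this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hongyin/chat-informer-70b-80k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumed dtype; pick one your hardware supports
    device_map="auto",          # spread the 70B weights across available GPUs
)

# Assumed prompt format; check the tokenizer's chat template if one is provided.
prompt = "Human: Please introduce yourself in English and Chinese.\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Print only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```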

Losing fat is the only way to solve all problems.


## BibTeX entry and citation info

Please cite this work if you find it helpful.

```bibtex
@misc{zhu2023metaaid,
      title={MetaAID 2.5: A Secure Framework for Developing Metaverse Applications via Large Language Models},
      author={Hongyin Zhu},
      year={2023},
      eprint={2312.14480},
      archivePrefix={arXiv},
      primaryClass={cs.CR}
}
```
