hongyin/awareness-en-zh-bilingual-2.5b

This is an English-Chinese bilingual autoregressive language model based on OPT, with 2.5B parameters. The model is trained with the next-token prediction (NTP) objective on large-scale unstructured text. Note that the model is intended for further training, i.e. as initialization parameters: it cannot be used directly as a chatbot unless it is fine-tuned on a multi-turn dialogue corpus. This is a base model, meant as raw material for further "alchemy".
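As a minimal sketch of the intended usage, the checkpoint can be loaded as an initialization for further training. This assumes the checkpoint is compatible with the standard Hugging Face transformers `AutoModelForCausalLM` / `AutoTokenizer` APIs; the helper function name is illustrative, not part of this repository.

```python
MODEL_ID = "hongyin/awareness-en-zh-bilingual-2.5b"

def load_for_training(model_id: str = MODEL_ID):
    """Load the checkpoint as initialization parameters for further training.

    Hypothetical helper: the card notes this model is not a chatbot out of
    the box and should first be fine-tuned, e.g. on a multi-turn dialogue
    corpus.
    """
    # Imported lazily so the sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    model.train()  # training mode for continued pre-training / fine-tuning
    return tokenizer, model
```

After fine-tuning, the model can be saved with `model.save_pretrained(...)` and reloaded the same way.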

Bibtex entry and citation info

Please cite the following paper if you find this model helpful.

@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}

license: other
