
hongyin/informer-1b

I am pleased to introduce an English-Chinese bilingual autoregressive language model. The model is trained from scratch, uses its own vocabulary, and has 100 million parameters based on the LLaMA 2 model architecture. Our goal is to provide a solution that is computationally cheap and easy to run inference with. Note that this is a base model: it is not intended to be used as a chatbot, but rather as a starting point for further fine-tuning ("alchemy"). We look forward to providing you with a practical model product.
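For reference, below is a minimal sketch of loading and sampling from the model, assuming it is published on the Hugging Face Hub under the id hongyin/informer-1b and is compatible with the standard transformers AutoTokenizer/AutoModelForCausalLM interfaces (the card itself does not specify the loading code, so treat this as an illustration rather than official usage).

```python
# Minimal inference sketch (assumes standard transformers compatibility).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "hongyin/informer-1b"  # model id as given on the card

# Load the tokenizer and the base causal language model.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Autoregressive generation from a short bilingual prompt.
prompt = "The weather today is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base model rather than a chat model, prompts should be plain text continuations instead of instruction-style dialogue.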

High-sounding words aside, each model's name, including the previous model's, carries rich connotations and personal experience, and is worth recalling again and again.


Bibtex entry and citation info

Please cite if you find it helpful.

@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}

license: other
