---
language:
- en
- zh
license: other
pipeline_tag: other
---

## hongyin/informer-0.3b-80k

I am pleased to introduce an English-Chinese bilingual autoregressive language model. It is trained from scratch with its own vocabulary and has 30 million parameters, following the LLaMA 2 architecture. Our goal is to provide a model that is computationally cheap and easy to run inference on. Note that this is a base model: it is not intended to be used as a chatbot, but rather as a base for further training ("alchemy"). We look forward to providing you with a practical model product.

High-sounding words aside, the name of each model, including previous ones, carries rich connotations and personal experience, and is worth reminding ourselves of repeatedly.

A minimal usage sketch (this assumes the checkpoint follows the standard `transformers` causal-LM interface; the prompt is purely illustrative):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hongyin/informer-0.3b-80k"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
model.eval()

# Greedy generation from a short prompt.
prompt = "Beijing is the capital of"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## Bibtex entry and citation info

Please cite this work if you find it helpful.

```
@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}
```