---
license: mit
---

# ChatGLM-6B Mirror

ChatGLM-6B is an open-source, bilingual conversational language model with 6.2 billion parameters, based on the General Language Model (GLM) architecture. Combined with model quantization techniques, it can be deployed locally on consumer-grade graphics cards (as little as 6 GB of video memory at the INT4 quantization level). ChatGLM-6B uses technology similar to ChatGPT and is optimized for Chinese Q&A and dialogue. Trained on approximately 1T tokens of Chinese and English text, and further refined with supervised fine-tuning, feedback bootstrapping, and reinforcement learning from human feedback, ChatGLM-6B is able to generate responses that are largely consistent with human preferences.
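The 6 GB INT4 figure can be sanity-checked with back-of-the-envelope arithmetic over the weights alone (a rough sketch; real inference adds activation and KV-cache overhead on top of these numbers):

```python
# Approximate weight-storage footprint of a 6.2B-parameter model
# at different quantization widths (weights only, no runtime overhead).

PARAMS = 6.2e9  # ChatGLM-6B parameter count


def weight_gb(bits_per_param: float) -> float:
    """Weight storage in GB for a given number of bits per parameter."""
    return PARAMS * bits_per_param / 8 / 1e9


fp16 = weight_gb(16)  # ~12.4 GB: exceeds most consumer GPUs
int8 = weight_gb(8)   # ~6.2 GB: borderline on a 6 GB card
int4 = weight_gb(4)   # ~3.1 GB: fits a 6 GB card with headroom

print(f"FP16: {fp16:.1f} GB, INT8: {int8:.1f} GB, INT4: {int4:.1f} GB")
```

This is why the README quotes INT4 as the level at which a 6 GB consumer card suffices: the quantized weights take roughly 3.1 GB, leaving room for activations and the KV cache.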

## Usage

```python
from modelscope import snapshot_download

model_dir = snapshot_download('Genius-Society/chatglm_6b')
```

## Maintenance

```shell
git clone git@hf.co:Genius-Society/chatglm_6b
cd chatglm_6b
```

## Reference

[1] ChatGLM-6B