FrankC0st1e committed
Commit: f62f088
Parent(s): eb30cff
update readme
README.md
CHANGED
@@ -22,6 +22,7 @@ print(responds)
 # Note
 1. You can also run inference with [vLLM](https://github.com/vllm-project/vllm), which is compatible with this repo and has a much higher inference throughput.
 2. The precision of the model weights in this repo is bfloat16. Manual conversion is needed for other dtypes.
+3. For more details, please refer to our [github repo](https://github.com/OpenBMB/MiniCPM).

 # Statement
 1. As a language model, MiniCPM-MoE-8x2B generates content by learning from a vast amount of text.
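Note item 1 above recommends vLLM for higher-throughput inference. As a rough illustration, here is a minimal sketch of such a setup; the Hugging Face model ID `openbmb/MiniCPM-MoE-8x2B`, the sampling settings, and the prompt are assumptions, not part of this commit.

```python
# Minimal vLLM inference sketch (model ID and sampling settings are assumptions).
from vllm import LLM, SamplingParams

# MiniCPM repos ship custom modeling code, so trust_remote_code is enabled;
# dtype="bfloat16" matches the precision noted in the README.
llm = LLM(model="openbmb/MiniCPM-MoE-8x2B", trust_remote_code=True, dtype="bfloat16")

sampling_params = SamplingParams(temperature=0.8, top_p=0.8, max_tokens=256)

# vLLM batches prompts internally, which is where the throughput gain comes from.
outputs = llm.generate(["Tell me about the MiniCPM model family."], sampling_params)
print(outputs[0].outputs[0].text)
```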
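Note item 2 states that the weights are stored in bfloat16 and that other dtypes require manual conversion. Below is a hedged sketch of that conversion using Hugging Face `transformers`; the model ID is again an assumption.

```python
# Sketch: load the bfloat16 checkpoint and manually cast it to another dtype.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM-MoE-8x2B"  # assumed repo name, not from this commit

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the stored precision
    trust_remote_code=True,
)

# Manual conversion, e.g. to float16 for hardware without bfloat16 support.
model = model.to(torch.float16)
```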