Commit bd24730 by Ningyu (1 parent: e9610ec)

Update README.md

Files changed (1): README.md +1 -1
README.md CHANGED
@@ -379,7 +379,7 @@ pip install -r requirements.txt
 
 <h3 id="2-2">2.2 Pretraining model weight acquisition and restoration</h3>
 
-❗❗❗ Note that in terms of hardware, performing step `2.2`, which involves merging LLaMA-13B with ZhiXI-13B-Diff, requires approximately **100GB** of RAM, with no demand for VRAM (this is due to the memory overhead caused by our merging strategy. To facilitate usage, we will improve our merging approach in future updates, and we are currently developing a 7B model as well, so stay tuned). For step `2.4`, which involves inference using `ZhiXi`, a minimum of **26GB** of VRAM is required.
+❗❗❗ Note that in terms of hardware, performing step `2.2`, which involves merging LLaMA-13B with ZhiXI-13B-Diff, requires approximately **100GB** of RAM, with no demand for VRAM (this is due to the memory overhead caused by our merging strategy. For your convenience, we have provided the fp16 weights at this link: https://huggingface.co/zjunlp/zhixi-13b-diff-fp16. **fp16 weights require less memory but may slightly impact performance**. We will improve our merging approach in future updates, and we are currently developing a 7B model as well, so stay tuned). For step `2.4`, which involves inference using `ZhiXi`, a minimum of **26GB** of VRAM is required.
 
 **1. Download LLaMA 13B and ZhiXi-13B-Diff**
 
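The edited note's two hardware claims can be sanity-checked with a small sketch. This is not the repository's actual merge script — the function names and the two-resident-copies assumption are mine — but it illustrates both the restoration step (merged = base + diff, parameter by parameter) and why fp16 weights roughly halve the memory bill:

```python
def apply_diff(base: dict, diff: dict) -> dict:
    """Toy sketch of weight restoration: merged = base + diff,
    computed parameter by parameter (plain floats stand in for tensors)."""
    return {name: base[name] + diff[name] for name in base}

def merge_peak_ram_gb(params_billion: float, bytes_per_param: int,
                      resident_copies: int = 2) -> float:
    """Rough peak RAM when `resident_copies` full weight sets
    (e.g. base + diff) are held in memory at once during the merge."""
    return params_billion * 1e9 * bytes_per_param * resident_copies / 1e9

# Under this assumed accounting, two resident fp32 copies of a 13B model
# come to ~104 GB -- on the order of the ~100 GB stated above -- while
# fp16 weights (2 bytes/param) would bring that down to ~52 GB.
fp32_gb = merge_peak_ram_gb(13, bytes_per_param=4)  # ~104 GB
fp16_gb = merge_peak_ram_gb(13, bytes_per_param=2)  # ~52 GB
```

The same arithmetic also motivates the fp16 caveat in the commit: halving `bytes_per_param` halves every resident copy, at the cost of reduced numeric precision in the restored weights.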