---
license: apache-2.0
language:
- zh
tags:
- Chinese
---

# Open-Chinese-LLaMA-7B-Patch

This model is a **Chinese large language model base** generated from the [LLaMA](https://github.com/facebookresearch/llama)-7B model after **secondary pre-training** on Chinese datasets.

This model is a **patch** model and must be used in conjunction with the official weights. For installation of the patch and related tutorials, please refer to [OpenLMLab/llama](https://github.com/OpenLMLab/llama).

## Usage

Since the official weights for [LLaMA](https://github.com/facebookresearch/llama)-7B have not been open-sourced, the model released here is a **patch**, which must be combined with the original official weights before use.

You can install the **patch** with `tools/patch_model.py`, for example:

```bash
python tools/patch_model.py --base_model <path_or_name_to_original_model> \
    --patch_model openlmlab/open-chinese-llama-7b-patch \
    --base_model_format <hf_or_raw>
```
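
The patch format itself is not documented here; one common scheme for weight patches of this kind is to ship per-tensor deltas, so that installation reduces to an element-wise addition onto the base weights. The sketch below illustrates that idea only; it is hypothetical and not the actual `patch_model.py` implementation:

```python
# Hypothetical illustration of an additive weight patch;
# the real patch_model.py may use a different encoding.
def apply_patch(base_weights, patch_weights):
    """Return base weights with per-tensor patch deltas added element-wise."""
    patched = {}
    for name, base in base_weights.items():
        delta = patch_weights.get(name)
        if delta is None:
            patched[name] = list(base)  # tensor not covered by the patch
        else:
            patched[name] = [b + d for b, d in zip(base, delta)]
    return patched

# Toy tensors standing in for real model weights.
base = {"embed": [1.0, 2.0], "norm": [0.5]}
patch = {"embed": [0.1, -0.2]}
print(apply_patch(base, patch))
```

Whatever the actual on-disk format, the key property stated above holds: the patch alone is not a usable model, but patch plus base weights yields one.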

The **patch** is installed in place: the result of installation is a complete set of weights in `hf` format, which you can load directly with `transformers`.

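
Since the installed result is an ordinary `hf`-format directory, a quick sanity check after patching is to confirm the usual `transformers` checkpoint files are present. A small helper along these lines (the file list is a typical minimum for such checkpoints, not something `patch_model.py` guarantees):

```python
import os

# Files an hf-format checkpoint directory typically contains (assumed minimum).
EXPECTED_FILES = ("config.json", "tokenizer_config.json")

def looks_like_hf_checkpoint(model_dir):
    """Heuristically check that model_dir resembles an hf-format checkpoint."""
    if not os.path.isdir(model_dir):
        return False
    present = set(os.listdir(model_dir))
    return all(name in present for name in EXPECTED_FILES)
```
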
## Quick Experience via Command Line

The **patched** model can be loaded directly with `transformers`. For a quick experience, we provide a console demo:

```bash
python cli_demo.py --model openlmlab/open-chinese-llama-7b-patch \
    --devices 0 \
    --max_length 1024 \
    --do_sample true \
    --top_k 40 \
    --top_p 0.8 \
    --temperature 0.7 \
    --penalty 1.02
```