shiyemin2 committed on
Commit d6a0c30
1 Parent(s): 8f5f62f

Update README.md

Files changed (1)
  1. README.md +18 -0
README.md CHANGED
@@ -11,6 +11,24 @@
  - [Demo address / HuggingFace Spaces](#)
  - [Colab one-click launch](#)

+ ## Quick Test
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
+
+ model_path = "LinkSoul/Chinese-Llama-2-7b"
+
+ tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
+ model = AutoModelForCausalLM.from_pretrained(model_path).half().cuda()
+ streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
+
+ instruction = """[INST] <<SYS>>\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.
+
+ If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n<</SYS>>\n\n{} [/INST]"""
+
+ prompt = instruction.format("用英文回答,什么是夫妻肺片?")
+ generate_ids = model.generate(tokenizer(prompt, return_tensors='pt').input_ids.cuda(), max_new_tokens=4096, streamer=streamer)
+ ```
+
  ## Resource Downloads

  - Model download: [Chinese Llama2 Chat Model](#)
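Note on the added snippet: `TextStreamer` prints tokens to stdout as they are generated, while `generate_ids` still holds the full token sequence (prompt plus reply). The sketch below, which is not part of this commit and assumes the variables from the snippet above are in scope, shows one way to recover the reply as a string afterwards.

```python
# Hypothetical follow-up (not in the commit): decode only the newly generated
# tokens by slicing off the prompt portion of generate_ids.
prompt_len = tokenizer(prompt, return_tensors='pt').input_ids.shape[-1]
response = tokenizer.decode(generate_ids[0, prompt_len:], skip_special_tokens=True)
print(response)
```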