s-JoL committed
Commit 98373c8
1 Parent(s): 4662a47

Update README.md

Files changed (1): README.md +24 -18
README.md CHANGED
@@ -30,28 +30,34 @@ Baichuan-13B is an open-source, commercially available large-scale language mode
  
  ## How to Get Started with the Model
  
- The following is a 1-shot inference task using Baichuan-13B: given a work, the model outputs its author. The correct output is "夜雨寄北->李商隐".
+ The following is an example conversation with Baichuan-13B-Chat. The correct output is "乔戈里峰。世界第二高峰———乔戈里峰西方登山者称其为k2峰,海拔高度是8611米,位于喀喇昆仑山脉的中巴边境上".
  ```python
- from transformers import AutoModelForCausalLM, AutoTokenizer
-
- tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-13B", trust_remote_code=True)
- model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B", device_map="auto", trust_remote_code=True)
- inputs = tokenizer('登鹳雀楼->王之涣\n夜雨寄北->', return_tensors='pt')
- inputs = inputs.to('cuda:0')
- pred = model.generate(**inputs, max_new_tokens=64, repetition_penalty=1.1)
- print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
+ import torch
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+ from transformers.generation.utils import GenerationConfig
+ tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-13B-Chat", use_fast=False, trust_remote_code=True)
+ model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B-Chat", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
+ model.generation_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan-13B-Chat")
+ messages = []
+ messages.append({"role": "user", "content": "世界上第二高的山峰是哪座"})
+ response = model.chat(tokenizer, messages)
+ print(response)
  ```
  
- The following is a 1-shot inference task using Baichuan-13B: given a work, the model outputs its author. The correct output is "One Hundred Years of Solitude->Gabriel Garcia Marquez".
+ Here is an example of a conversation using Baichuan-13B-Chat; the correct output is "K2. The world's second highest peak - K2, also known as Mount Godwin-Austen or Chhogori, with an altitude of 8611 meters, is located on the China-Pakistan border in the Karakoram Range."
  ```python
- from transformers import AutoModelForCausalLM, AutoTokenizer
-
- tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-13B", trust_remote_code=True)
- model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B", device_map="auto", trust_remote_code=True)
- inputs = tokenizer('Hamlet->Shakespeare\nOne Hundred Years of Solitude->', return_tensors='pt')
- inputs = inputs.to('cuda:0')
- pred = model.generate(**inputs, max_new_tokens=64, repetition_penalty=1.1)
- print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
+ import torch
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+ from transformers.generation.utils import GenerationConfig
+ tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-13B-Chat", use_fast=False, trust_remote_code=True)
+ model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B-Chat", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
+ model.generation_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan-13B-Chat")
+ messages = []
+ messages.append({"role": "user", "content": "Which is the second highest mountain in the world?"})
+ response = model.chat(tokenizer, messages)
+ print(response)
  ```
  
  ## Model Details
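
The removed 1-shot examples build their prompt as newline-separated `work->author` pairs, with the final author slot left open for the model to complete. A minimal, model-free sketch of that prompt format (the helper `build_few_shot_prompt` is hypothetical and not part of the Baichuan code):

```python
def build_few_shot_prompt(examples, query, sep="->"):
    """Assemble a few-shot completion prompt: each example is a
    (work, author) pair; the query work is left open for the model."""
    lines = [f"{work}{sep}{author}" for work, author in examples]
    lines.append(f"{query}{sep}")
    return "\n".join(lines)

# One demonstration pair, as in the removed README example:
prompt = build_few_shot_prompt([("Hamlet", "Shakespeare")],
                               "One Hundred Years of Solitude")
print(prompt)
# Hamlet->Shakespeare
# One Hundred Years of Solitude->
```

The resulting string is what the removed examples passed to `tokenizer(...)` before calling `model.generate`.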
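
The new chat examples pass a running `messages` list to `model.chat`. A minimal sketch of how a multi-turn history would be maintained around that call, with the model stubbed out since the 13B checkpoint is impractical to load here (`fake_chat` is a placeholder, not the real API, and the append-the-assistant-turn pattern is an assumption based on common chat interfaces, not something this README states):

```python
# Stand-in for model.chat(tokenizer, messages); the real call runs the LLM.
def fake_chat(tokenizer, messages):
    return "K2 (乔戈里峰)"

messages = []
messages.append({"role": "user", "content": "Which is the second highest mountain in the world?"})
response = fake_chat(None, messages)

# To continue the conversation, append the assistant turn to the history
# before the next user turn, then pass the whole list to chat again.
messages.append({"role": "assistant", "content": response})
messages.append({"role": "user", "content": "How high is it?"})
print([m["role"] for m in messages])
# ['user', 'assistant', 'user']
```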