GuoPD committed on
Commit 1a66eec
1 Parent(s): 38edfd8

Update README.md

Files changed (1)
  1. README.md +10 -3
README.md CHANGED
@@ -34,7 +34,7 @@ tasks:
 [2023.12.29] 🎉🎉🎉 We have released **[Baichuan2-13B-Chat](https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat) v2**. In this version:
 - The model's overall capabilities are substantially improved, especially mathematical and logical reasoning and complex instruction following.
 - The context length is increased from 4096 to 8192.
-- For usage, see [Quick Start](#Start).
+- You must specify revision="v2.0" when loading the model; see [Quick Start](#Start) for details.
 
 # <span id="Introduction">模型介绍/Introduction</span>
 
@@ -64,8 +64,15 @@ In the Baichuan 2 series models, we have utilized the new feature `F.scaled_dot_
 import torch
 from transformers import AutoModelForCausalLM, AutoTokenizer
 from transformers.generation.utils import GenerationConfig
-tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan2-13B-Chat", revision="v2.0", use_fast=False, trust_remote_code=True)
-model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan2-13B-Chat", revision="v2.0", device_map="auto", torch_dtype=torch.bfloat16, trust_remote_code=True)
+tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan2-13B-Chat",
+                                          revision="v2.0",
+                                          use_fast=False,
+                                          trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan2-13B-Chat",
+                                             revision="v2.0",
+                                             device_map="auto",
+                                             torch_dtype=torch.bfloat16,
+                                             trust_remote_code=True)
 model.generation_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan2-13B-Chat", revision="v2.0")
 messages = []
 messages.append({"role": "user", "content": "解释一下“温故而知新”"})
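
For reference, the changed snippet assembles into the quick-start flow sketched below, with revision="v2.0" pinned as the updated README requires. The final `model.chat(tokenizer, messages)` call is taken from the README's quick-start section rather than this diff, so treat it as an assumption about the repository's remote-code API.

```python
# Minimal sketch of the updated quick start with the v2.0 revision pinned.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

model_id = "baichuan-inc/Baichuan2-13B-Chat"
revision = "v2.0"  # pin the v2 weights, as the README change requires

tokenizer = AutoTokenizer.from_pretrained(
    model_id, revision=revision, use_fast=False, trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    revision=revision,
    device_map="auto",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
model.generation_config = GenerationConfig.from_pretrained(model_id, revision=revision)

messages = [{"role": "user", "content": "解释一下“温故而知新”"}]
# chat() is defined by the model's remote code (assumed from the README's
# quick-start section; it is not shown in this diff).
response = model.chat(tokenizer, messages)
print(response)
```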