ClueAI committed
Commit 55a253b
1 parent: f5a85a6

Update README.md

Files changed (1)
  1. README.md +8 -1
README.md CHANGED
@@ -68,11 +68,14 @@ Based on the original functions of ChatYuan-large-v1, we optimized the model as
 ## Expected model usage and scope of application
 
 ### Running a conversation
+
+
 ```python
 from transformers import AutoTokenizer, AutoModel
 import os
 model_dir='ClueAI/ChatYuan-large-v2'
 tokenizer = AutoTokenizer.from_pretrained(model_dir)
+# Speed is affected by network conditions
 model = AutoModel.from_pretrained(model_dir, trust_remote_code=True)
 history = []
 print("starting")
@@ -88,13 +91,17 @@ while True:
     print(f"小元:{response}")
 ```
 
-#### Code example
+#### Code example with advanced parameter configuration
+
+
 
 Load the model:
 
 ```python
 # Load the model
 from transformers import T5Tokenizer, T5ForConditionalGeneration
+
+# After the first automatic download, the model runs locally and is not affected by the network
 tokenizer = T5Tokenizer.from_pretrained("ClueAI/ChatYuan-large-v2")
 model = T5ForConditionalGeneration.from_pretrained("ClueAI/ChatYuan-large-v2")
 # With this loading method, a max length of 512 needs roughly 6 GB+ of GPU memory
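
Both hunks in this commit stop at the model-loading step, so the generation call itself is not visible in the diff. As context, here is a minimal sketch, not part of this commit, of how the T5 checkpoint loaded above could produce a single reply. The prompt follows the 用户:/小元: style seen in the chat loop; the generation parameters (`max_new_tokens`, `do_sample`, `top_p`, `temperature`) are illustrative assumptions, not the README's actual defaults.

```python
# Minimal sketch (not part of this commit): single-turn generation with the
# T5 checkpoint loaded above. All generation parameters are illustrative assumptions.
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("ClueAI/ChatYuan-large-v2")
model = T5ForConditionalGeneration.from_pretrained("ClueAI/ChatYuan-large-v2")
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

query = "写一首关于春天的诗"  # example user input ("write a poem about spring")
prompt = f"用户:{query}\n小元:"  # conversational prompt style from the chat loop
inputs = tokenizer(prompt, return_tensors="pt").to(device)

# Assumed sampling settings; the README's own defaults may differ.
output_ids = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,
    top_p=0.9,
    temperature=0.7,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```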