Added correct model to load

#5
Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -38,8 +38,8 @@ Here give some examples of how to use our model.
  #### Chat Model Inference
  ```python
  from transformers import AutoTokenizer, AutoModelForCausalLM
- tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-6.7b-instruct", trust_remote_code=True)
- model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-6.7b-instruct", trust_remote_code=True, torch_dtype=torch.bfloat16).cuda()
+ tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-1.3b-instruct", trust_remote_code=True)
+ model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-1.3b-instruct", trust_remote_code=True, torch_dtype=torch.bfloat16).cuda()
  messages=[
  { 'role': 'user', 'content': "write a quick sort algorithm in python."}
  ]
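
For reference, the updated snippet still relies on `torch` for `torch.bfloat16`, but the hunk does not show an `import torch` line, and the generation step falls outside the diffed range. Below is a minimal, self-contained sketch of how the updated example can be run end to end; the `apply_chat_template`/`generate` call and its parameters (`max_new_tokens`, greedy decoding) are illustrative assumptions, not text taken from the README or this PR.

```python
# Minimal sketch of the updated README snippet, extended so it runs end to end.
# The generation step and its parameters are assumptions, not part of this PR.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained(
    "deepseek-ai/deepseek-coder-1.3b-instruct", trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    "deepseek-ai/deepseek-coder-1.3b-instruct",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
).cuda()

messages = [
    {"role": "user", "content": "write a quick sort algorithm in python."}
]

# Build the prompt with the model's chat template and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(
    inputs,
    max_new_tokens=512,
    do_sample=False,
    eos_token_id=tokenizer.eos_token_id,
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True))
```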