rongaoli committed
Commit 01f64df
1 Parent(s): bfed216

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -38,8 +38,8 @@ You can use this model directly with a pipeline for text generation. This exampl
 
  ```py
  from transformers import AutoModelForCausalLM, AutoTokenizer, FlaxAutoModelForCausalLM
- model = AutoModelForCausalLM.from_pretrained("FlagOpen/CodeLlama-7b-Python-taco")
- tokenizer = AutoTokenizer.from_pretrained("FlagOpen/CodeLlama-7b-Python-taco")
+ model = AutoModelForCausalLM.from_pretrained("flagopen/CodeLlama-7b-Python-taco")
+ tokenizer = AutoTokenizer.from_pretrained("flagopen/CodeLlama-7b-Python-taco")
  prompt = """
  A function to greet user. Given a user name it should say hello
  def greet(name):
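
For reference, a minimal sketch of the full snippet after this change, assuming the lower-cased repo id `flagopen/CodeLlama-7b-Python-taco` resolves on the Hub (the unused `FlaxAutoModelForCausalLM` import from the README is dropped here, and the prompt string is closed):

```py
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id as updated by this commit; assumed to resolve on the Hub.
model = AutoModelForCausalLM.from_pretrained("flagopen/CodeLlama-7b-Python-taco")
tokenizer = AutoTokenizer.from_pretrained("flagopen/CodeLlama-7b-Python-taco")

prompt = """
A function to greet user. Given a user name it should say hello
def greet(name):
"""

# Tokenize the prompt and generate a short completion of the function body.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```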