suriyagunasekar committed
Commit 34046b0
1 Parent(s): 24ad69c

Update README.md

Files changed (1)
  1. README.md +28 -2
README.md CHANGED
@@ -35,18 +35,24 @@ where the model generates the text after "Bob:".

#### Code format:
```python
def print_prime(n):
   """
   Print all primes between 1 and n
   """
   primes = []
   for num in range(2, n+1):
-      for i in range(2, num):
           if num % i == 0:
               break
-      else:
           primes.append(num)
   print(primes)
```
where the model generates the text after the comments.
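For readers skimming the removed lines above: the original snippet relies on Python's for/else construct, where the else branch runs only when the loop finishes without hitting break. A minimal illustration (an editorial sketch, not part of the model card):

```python
# Loop-else in a nutshell: the else branch runs only if the loop was never broken,
# which is why the original code appended num exactly when no divisor was found.
for i in range(2, 7):
    if 7 % i == 0:
        break
else:
    print("7 is prime")  # printed, since no i in range(2, 7) divides 7
```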
 
@@ -81,6 +87,26 @@ where the model generates the text after the comments.
### License
The model is licensed under the [Research License](https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx).

### Citation
```bib
@article{textbooks2,
 

#### Code format:
```python
+ \`\`\`python
def print_prime(n):
   """
   Print all primes between 1 and n
   """
   primes = []
   for num in range(2, n+1):
+      is_prime = True
+      for i in range(2, int(num**0.5)+1):
           if num % i == 0:
+              is_prime = False
               break
+      if is_prime:
           primes.append(num)
   print(primes)
+
+ print_prime(20)
+ \`\`\`
```
where the model generates the text after the comments.
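Since the hunk above swaps the for/else trial division for an is_prime flag with a sqrt(num) divisor bound, a quick sanity check (an editorial sketch, not part of the model card; the helper names are made up) confirms the two versions select the same primes:

```python
def primes_via_else(n):
    """Original approach: full trial division with a for/else."""
    primes = []
    for num in range(2, n + 1):
        for i in range(2, num):
            if num % i == 0:
                break
        else:
            primes.append(num)
    return primes

def primes_via_flag(n):
    """Updated approach: is_prime flag, divisors checked only up to sqrt(num)."""
    primes = []
    for num in range(2, n + 1):
        is_prime = True
        for i in range(2, int(num ** 0.5) + 1):
            if num % i == 0:
                is_prime = False
                break
        if is_prime:
            primes.append(num)
    return primes

assert primes_via_else(200) == primes_via_flag(200)
print(primes_via_flag(20))  # [2, 3, 5, 7, 11, 13, 17, 19]
```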
 
 
### License
The model is licensed under the [Research License](https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx).

+ ### Sample Code
+ ```python
+ import torch
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ torch.set_default_device('cuda')
+ model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", trust_remote_code=True, torch_dtype="auto")
+ tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5", trust_remote_code=True, torch_dtype="auto")
+ inputs = tokenizer('''```python
+ def print_prime(n):
+    """
+    Print all primes between 1 and n
+    """''', return_tensors="pt", return_attention_mask=False)
+
+ eos_token_id = tokenizer.encode("```")[0]
+ outputs = model.generate(**inputs, max_length=500, eos_token_id=eos_token_id)
+ text = tokenizer.batch_decode(outputs)[0]
+ print(text)
+ ```
+
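The snippet above prompts the model with an opening triple-backtick python fence, so the decoded output typically contains the completed function followed by a closing fence. A hypothetical post-processing helper (an editorial sketch, not part of the model card; extract_code is a made-up name) could trim the text down to just the generated code:

```python
fence = "```"  # the triple-backtick marker used in the prompt above

def extract_code(decoded: str) -> str:
    """Return only the code between the opening python fence and the next closing fence."""
    body = decoded.split(fence + "python", 1)[-1]  # text after the opening fence
    return body.split(fence, 1)[0].strip()         # text before the closing fence

# Tiny usage example on a hand-written string shaped like a typical completion:
sample = fence + "python\ndef print_prime(n):\n   ...\n" + fence + "\nExercise: write a doubly linked list."
print(extract_code(sample))  # -> def print_prime(n): ...
```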
### Citation
```bib
@article{textbooks2,