juewang committed on
Commit 0f35095
1 Parent(s): 092efe0

Update README.md

Files changed (1):
  1. README.md +9 -2
README.md CHANGED

````diff
@@ -12,8 +12,6 @@ inference:
 language:
 - en
 pipeline_tag: text-generation
-tags:
-- gpt
 widget:
 -
   example_title: "ADE Corpus V2"
@@ -205,6 +203,15 @@ from transformers import pipeline
 pipe = pipeline(model='togethercomputer/GPT-JT-6B-v1')
 pipe('''Please answer the following question:\n\nQuestion: Where is Zurich?\nAnswer:''')
 ```
+
+or
+
+```python
+from transformers import AutoTokenizer, AutoModelForCausalLM
+tokenizer = AutoTokenizer.from_pretrained("togethercomputer/GPT-JT-6B-v1")
+model = AutoModelForCausalLM.from_pretrained("togethercomputer/GPT-JT-6B-v1")
+```
+
 # Training Data
 We fine-tune [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6B) on NI, P3, COT, the pile data.
 - [Natural-Instructions](https://github.com/allenai/natural-instructions)
````
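The diff above adds a second loading path via `AutoTokenizer`/`AutoModelForCausalLM` alongside the existing `pipeline` example. As a rough sketch of how that second path would be used end to end (the `build_prompt` and `generate_answer` helpers are illustrative, not part of the model card; note that actually calling `generate_answer` downloads the ~6B-parameter checkpoint):

```python
def build_prompt(question: str) -> str:
    # Mirror the prompt format of the pipeline example in the README.
    return f"Please answer the following question:\n\nQuestion: {question}\nAnswer:"

def generate_answer(question: str, model_name: str = "togethercomputer/GPT-JT-6B-v1") -> str:
    # Imported here so that merely defining these helpers does not
    # require transformers to be installed.
    from transformers import AutoTokenizer, AutoModelForCausalLM

    # Note: this downloads and loads the full checkpoint on first use.
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    inputs = tokenizer(build_prompt(question), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate_answer("Where is Zurich?"))
```

The manual route trades the convenience of `pipeline` for direct control over tokenization and the `generate` call (e.g. `max_new_tokens`, sampling settings).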