Tags: Text Generation · Transformers · PyTorch · English · gpt_neox · causal-lm · Inference Endpoints · text-generation-inference
hardmaru committed
Commit a444b43 (1 parent: 3123a2b)

Update README.md

Files changed (1): README.md (+3, −0)
README.md CHANGED

@@ -43,6 +43,9 @@ inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
 tokens = model.generate(
     **inputs,
     max_new_tokens=64,
+    temperature=0.7,
+    do_sample=True,
+)
 print(tokenizer.decode(tokens[0], skip_special_tokens=True))
 ```
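The added arguments switch `generate()` from greedy decoding to temperature sampling: in Transformers, `temperature` only takes effect when `do_sample=True`. A minimal sketch of what the temperature does to the next-token distribution, using plain Python and hypothetical logit values (no model required):

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    # Scale logits by 1/temperature, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token logits for a 3-token vocabulary.
logits = [2.0, 1.0, 0.1]

# temperature < 1 sharpens the distribution (favors the top token more);
# temperature > 1 flattens it (spreads probability toward other tokens).
p_sharp = softmax_with_temperature(logits, temperature=0.7)
p_flat = softmax_with_temperature(logits, temperature=1.5)
```

With `temperature=0.7` as in the commit, sampling stays close to greedy decoding while still allowing some variation in the output.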