Text Generation · Transformers · PyTorch · English · gpt_neox · causal-lm · Inference Endpoints · text-generation-inference
reshinthadith committed
Commit 56f0d9b
1 Parent(s): f7257a9

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -26,8 +26,8 @@ Get started chatting with `StableLM-Tuned-Alpha` by using the following code sni
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-tokenizer = AutoTokenizer.from_pretrained("StabilityAI/stablelm-tuned-alpha-7b")
-model = AutoModelForCausalLM.from_pretrained("StabilityAI/stablelm-tuned-alpha-7b")
+tokenizer = AutoTokenizer.from_pretrained("StabilityAI/stablelm-tuned-alpha-3b")
+model = AutoModelForCausalLM.from_pretrained("StabilityAI/stablelm-tuned-alpha-3b")
 model.half().cuda()
 
 inputs = tokenizer("What's your mood today?", return_tensors="pt").to("cuda")
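
For reference, a minimal sketch of how the updated snippet could be continued to actually generate a reply. The generation arguments (`max_new_tokens`, `temperature`, `do_sample`) are illustrative assumptions and are not part of this commit:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the 3B checkpoint that the updated README now points to.
tokenizer = AutoTokenizer.from_pretrained("StabilityAI/stablelm-tuned-alpha-3b")
model = AutoModelForCausalLM.from_pretrained("StabilityAI/stablelm-tuned-alpha-3b")
model.half().cuda()

inputs = tokenizer("What's your mood today?", return_tensors="pt").to("cuda")

# Illustrative sampling settings; the README's actual generation parameters are not shown in this diff.
tokens = model.generate(
    **inputs,
    max_new_tokens=64,
    temperature=0.7,
    do_sample=True,
)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```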