Beyondo committed
Commit 7e37697
1 Parent(s): d1c03d2

Usage: 7b -> 3b

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -27,8 +27,8 @@ Get started chatting with `StableLM-Tuned-Alpha` by using the following code sni
  ```python
  from transformers import AutoModelForCausalLM, AutoTokenizer, StoppingCriteria, StoppingCriteriaList
 
- tokenizer = AutoTokenizer.from_pretrained("StabilityAI/stablelm-tuned-alpha-7b")
- model = AutoModelForCausalLM.from_pretrained("StabilityAI/stablelm-tuned-alpha-7b")
+ tokenizer = AutoTokenizer.from_pretrained("StabilityAI/stablelm-tuned-alpha-3b")
+ model = AutoModelForCausalLM.from_pretrained("StabilityAI/stablelm-tuned-alpha-3b")
  model.half().cuda()
 
  class StopOnTokens(StoppingCriteria):
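
For context, below is a minimal sketch of how the updated 3b snippet might be exercised end to end. The stop-token IDs, prompt format, and sampling parameters are illustrative assumptions and are not part of this commit; the README defines the actual values used by `StableLM-Tuned-Alpha`.

```python
# Sketch only: assumes a CUDA GPU (as in the README snippet) and uses
# placeholder stop-token IDs and prompt format.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, StoppingCriteria, StoppingCriteriaList

tokenizer = AutoTokenizer.from_pretrained("StabilityAI/stablelm-tuned-alpha-3b")
model = AutoModelForCausalLM.from_pretrained("StabilityAI/stablelm-tuned-alpha-3b")
model.half().cuda()

class StopOnTokens(StoppingCriteria):
    """Stop generation once the last generated token is one of the stop IDs."""
    def __init__(self, stop_ids):
        self.stop_ids = stop_ids

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs) -> bool:
        return input_ids[0][-1].item() in self.stop_ids

# Hypothetical end-of-turn token IDs; consult the model card for the real ones.
stop_ids = [50278, 50279, 50277, 1, 0]

# Hypothetical chat-style prompt; the README documents the full system prompt.
prompt = "<|USER|>What's your mood today?<|ASSISTANT|>"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

tokens = model.generate(
    **inputs,
    max_new_tokens=64,
    temperature=0.7,
    do_sample=True,
    stopping_criteria=StoppingCriteriaList([StopOnTokens(stop_ids)]),
)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```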