hassanzay committed on
Commit
c934d66
1 Parent(s): 00e549c

Update README.md

Files changed (1)
  1. README.md +5 -2
README.md CHANGED

@@ -24,6 +24,8 @@ datasets:
 
 `Stable LM 2 12B` is a 12.1 billion parameter decoder-only language model pre-trained on 2 trillion tokens of diverse multilingual and code datasets for two epochs.
 
+Please note: For commercial use, please refer to https://stability.ai/membership.
+
 ## Usage
 
 Get started generating text with `Stable LM 2 12B` by using the following code snippet:
@@ -83,7 +85,8 @@ print(tokenizer.decode(tokens[0], skip_special_tokens=True))
 * **Language(s)**: English
 * **Paper**: [Stable LM 2 Technical Report](https://arxiv.org/abs/2402.17834)
 * **Library**: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
-* **License**: [Stability AI Non-Commercial Research Community License](https://huggingface.co/stabilityai/stablelm-2-12b/blob/main/LICENSE). If you'd like to use this model for commercial products or purposes, please contact us [here](https://stability.ai/membership) to learn more.
+* **License**: [Stability AI Non-Commercial Research Community License](https://huggingface.co/stabilityai/stablelm-2-12b/blob/main/LICENSE).
+* **Commercial License**: to use this model commercially, please refer to https://stability.ai/membership
 * **Contact**: For questions and comments about the model, please email `lm@stability.ai`
 
 ### Model Architecture
@@ -122,7 +125,7 @@ The model is pre-trained on the aforementioned datasets in `bfloat16` precision,
 
 ### Intended Use
 
-The model is intended to be used as a foundational base model for application-specific fine-tuning. Developers must evaluate and fine-tune the model for safe performance in downstream applications.
+The model is intended to be used as a foundational base model for application-specific fine-tuning. Developers must evaluate and fine-tune the model for safe performance in downstream applications. For commercial use, please refer to https://stability.ai/membership.
 
 ### Limitations and Bias
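
For context, the README's usage snippet is elided by the diff (only its final `print(tokenizer.decode(...))` line is visible in the second hunk header). Below is a minimal sketch of such a snippet, assuming the standard Hugging Face `transformers` text-generation API and the `stabilityai/stablelm-2-12b` repository named in the LICENSE link; the prompt and generation settings are illustrative and the exact snippet in the README may differ.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("stabilityai/stablelm-2-12b")
model = AutoModelForCausalLM.from_pretrained(
    "stabilityai/stablelm-2-12b",
    torch_dtype="auto",  # pick the checkpoint's native precision (bfloat16)
)
model.cuda()  # move to GPU; omit to run on CPU

# Tokenize an example prompt and place it on the same device as the model.
inputs = tokenizer("The weather is always wonderful", return_tensors="pt").to(model.device)

# Sample a continuation.
tokens = model.generate(
    **inputs,
    max_new_tokens=64,
    temperature=0.70,
    top_p=0.95,
    do_sample=True,
)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```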