JoeyOhman committed
Commit 3b86e17
1 Parent(s): 9704f32

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -13,6 +13,8 @@ GPT-SW3 is an autoregressive large language model that is capable of generating
Like other large language models, for which the diversity (or lack thereof) of the training data has a downstream impact on the quality of our model, GPT-SW3 has limitations in terms of, for example, bias and safety. GPT-SW3 can also have quality issues in terms of generation diversity and hallucination. By releasing with the modified RAIL license, we also hope to increase communication, transparency, and the study of large language models. The model may: overrepresent some viewpoints and underrepresent others, contain stereotypes, and generate hateful, abusive, violent, discriminatory or prejudicial language. The model may make errors, including producing incorrect information as if it were factual; it may generate irrelevant or repetitive outputs, and content that may not be appropriate for all settings, including sexual content.

# How to use
+ Since this is a private repository, you have to log in with your access token to be able to access the model from Python. This can be done with `huggingface-cli login`; see the [HuggingFace Quick Start Guide](https://huggingface.co/docs/huggingface_hub/quick-start#login) for more information.
+
The following code snippet loads our tokenizer & model, and uses the GPU if available.

```python
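# NOTE: the original snippet is truncated in this diff; the lines below are a
# minimal sketch of loading a tokenizer & model with Hugging Face transformers
# and moving the model to the GPU when one is available. The repository id
# "AI-Sweden/gpt-sw3-126m" is a placeholder assumption, not the actual name.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "AI-Sweden/gpt-sw3-126m"  # placeholder repository id (assumption)
device = "cuda" if torch.cuda.is_available() else "cpu"

# The repository is private, so from_pretrained needs the token stored by
# `huggingface-cli login` (use_auth_token=True; newer transformers versions
# accept token=True instead).
tokenizer = AutoTokenizer.from_pretrained(model_name, use_auth_token=True)
model = AutoModelForCausalLM.from_pretrained(model_name, use_auth_token=True)
model.to(device)
model.eval()

# Quick sanity check: generate a short continuation of an example prompt.
prompt = "Träd är fina för att"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)
output_ids = model.generate(input_ids, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```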