mkshing committed on
Commit d8abd08
1 Parent(s): fe16172

Update README.md


fix the link to config

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -58,7 +58,7 @@ print(tokenizer.decode(tokens[0], skip_special_tokens=True))
 
 ### Training Procedure
 
-Models are pre-trained on the aforementioned dataset in mixed-precision (FP16), optimized with Adam, and trained using the NeoX tokenizer with a vocabulary size of 50,257. We outline the complete hyperparameters choices in the project's [GitHub repository](https://github.com/Stability-AI/StableLM-staging/blob/main/configs/stablelm-base-alpha-7b.yaml).
+Models are pre-trained on the aforementioned dataset in mixed-precision (FP16), optimized with Adam, and trained using the NeoX tokenizer with a vocabulary size of 50,257. We outline the complete hyperparameters choices in the project's [GitHub repository](https://github.com/Stability-AI/StableLM/blob/main/configs/stablelm-base-alpha-3b.yaml).
 
 ## Use and Limitations
 
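
The changed line points the README at a training-config YAML in the StableLM repository and mentions the NeoX tokenizer with a 50,257-token vocabulary. As a minimal sketch (not part of this commit), assuming the Stability-AI/StableLM repository has been cloned locally and that this model card corresponds to the `stabilityai/stablelm-base-alpha-3b` checkpoint on the Hugging Face Hub, one might inspect both like this:

```python
# Minimal sketch, not part of this commit: inspect the linked training config
# and the NeoX tokenizer. The local path and model id below are assumptions.
import yaml
from transformers import AutoTokenizer

# Hyperparameter config referenced by the updated link
# (assumes the Stability-AI/StableLM repo is cloned into the working directory).
with open("StableLM/configs/stablelm-base-alpha-3b.yaml") as f:
    config = yaml.safe_load(f)
print(config.get("optimizer"))  # Adam settings, if the YAML exposes them under this key

# NeoX tokenizer mentioned in the README; the reported vocabulary size is 50,257.
tokenizer = AutoTokenizer.from_pretrained("stabilityai/stablelm-base-alpha-3b")
print(tokenizer.vocab_size)
```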