abhi-mosaic dblalock committed on
Commit
716e2c1
1 Parent(s): 25cfaef

minor prose tweaks (#1)


- minor prose tweaks (555974c96b0a25d3e60593f274d61c415b56ef9c)


Co-authored-by: d blalock <dblalock@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -91,7 +91,7 @@ model = transformers.AutoModelForCausalLM.from_pretrained(
 )
 ```
 
-The model was trained first on 2048, and then an additional pre-training phase was included for sequence length adaptation to 8192. However, ALiBi further enables users to increase the maximum sequence length during finetuning and/or inference. For example:
+The model was trained initially on a sequence length of 2048. An additional pre-training phase was included for sequence length adaptation to 8192. However, ALiBi further enables users to increase the maximum sequence length during finetuning and/or inference. For example:
 
 ```python
 import transformers
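
The `For example:` snippet is truncated in the diff context above. For reference, a minimal sketch of the kind of maximum-sequence-length override the changed sentence describes might look like the following; the checkpoint name, the `max_seq_len` config field, and the value 16384 are assumptions for illustration, not taken from this diff:

```python
import transformers

# Hypothetical checkpoint name -- substitute the repo this README documents.
name = 'mosaicml/mpt-7b-8k'

# Assumes an MPT-style config exposing a `max_seq_len` field (an assumption;
# not shown in this diff). Raising it lets ALiBi extrapolate past the 8192
# tokens used during the sequence-length-adaptation phase.
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 16384

model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    trust_remote_code=True,
)
```

Because ALiBi biases attention scores by relative distance rather than relying on a learned positional-embedding table, the sequence-length limit can be raised at load time without any architectural changes.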