minor prose tweaks

#1
by dblalock - opened
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -91,7 +91,7 @@ model = transformers.AutoModelForCausalLM.from_pretrained(
 )
 ```
 
-The model was trained first on 2048, and then an additional pre-training phase was included for sequence length adaptation to 8192. However, ALiBi further enables users to increase the maximum sequence length during finetuning and/or inference. For example:
+The model was trained initially on a sequence length of 2048. An additional pre-training phase was included for sequence length adaptation to 8192. However, ALiBi further enables users to increase the maximum sequence length during finetuning and/or inference. For example:
 
 ```python
 import transformers
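
For reference, the README passage being edited continues with a loading snippet along the lines below. This is a minimal sketch, not the verbatim README code: the checkpoint name `mosaicml/mpt-7b-8k` and the exact `max_seq_len` value are assumptions for illustration, following the MPT-style `trust_remote_code` config convention.

```python
import transformers

# Checkpoint name assumed for illustration; substitute the actual repo id.
name = 'mosaicml/mpt-7b-8k'

# MPT-style configs expose a `max_seq_len` field. Because the model uses
# ALiBi rather than learned positional embeddings, this limit can be raised
# past the training length at finetuning/inference time.
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 16384  # (input + output) tokens can now total 16384

model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    trust_remote_code=True,
)
```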