Tags: Text Generation · Transformers · PyTorch · mpt · Composer · MosaicML · llm-foundry · custom_code · text-generation-inference
dblalock committed
Commit 555974c
1 Parent(s): 25cfaef

minor prose tweaks

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -91,7 +91,7 @@ model = transformers.AutoModelForCausalLM.from_pretrained(
 )
 ```
 
-The model was trained first on 2048, and then an additional pre-training phase was included for sequence length adaptation to 8192. However, ALiBi further enables users to increase the maximum sequence length during finetuning and/or inference. For example:
+The model was trained initially on a sequence length of 2048. An additional pre-training phase was included for sequence length adaptation to 8192. However, ALiBi further enables users to increase the maximum sequence length during finetuning and/or inference. For example:
 
 ```python
 import transformers
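
The hunk ends mid-example, right after `import transformers`. For context, here is a minimal sketch of the sequence-length override the edited sentence refers to, assuming the standard `transformers` AutoConfig pattern and MPT's `max_seq_len` config field; the checkpoint name and the new length of 16384 are illustrative, not taken from the README:

```python
import transformers

# Illustrative checkpoint name; substitute the MPT repo you are using.
name = 'mosaicml/mpt-7b-8k'

# Load the model's config (trust_remote_code pulls in the custom MPT code),
# then raise max_seq_len past the 8192 used during sequence-length adaptation.
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 16384  # input + output tokens can now total up to 16384

model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    trust_remote_code=True,
)
```

Because ALiBi encodes position with fixed attention biases rather than learned position embeddings, raising `max_seq_len` requires no weight resizing; the same override works when loading the model for finetuning.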