Text Generation · Transformers · PyTorch · mpt · Composer · MosaicML · llm-foundry · conversational · custom_code · text-generation-inference
kartikmosaicml committed on
Commit 946e808
1 Parent(s): ebbcb23

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -110,7 +110,7 @@ model = transformers.AutoModelForCausalLM.from_pretrained(
 )
 ```
 
- The model was trained initially with a sequence length of 4096 with an additional pretraining stage for sequence length adaptation up to 8192. However, ALiBi enables users to increase the maximum sequence length even further during finetuning and/or inference. For example:
+ The model was trained initially with a sequence length of 2048 with an additional pretraining stage for sequence length adaptation up to 8192. However, ALiBi enables users to increase the maximum sequence length even further during finetuning and/or inference. For example:
 
 ```python
 import transformers
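
The hunk's trailing context cuts off at `import transformers`, so the ALiBi example the changed paragraph points to is not visible in this diff. Below is a minimal sketch of that pattern, assuming the config-override flow used by MPT model cards; the `mosaicml/mpt-7b-8k` checkpoint name and the 16384-token target are illustrative assumptions, not taken from this commit:

```python
import transformers

# Hypothetical checkpoint name used for illustration; substitute the
# repo this commit belongs to.
name = 'mosaicml/mpt-7b-8k'

# Load the model config and raise max_seq_len beyond the 8192 tokens
# covered by the sequence-length-adaptation pretraining stage. ALiBi's
# linear attention biases extrapolate to unseen lengths, so no new
# position embeddings are required.
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 16384  # (input + output) tokens can now total up to 16384

model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    trust_remote_code=True,
)
```

Because ALiBi replaces learned position embeddings with a linear attention bias, raising `max_seq_len` adds no new parameters; the bias simply extrapolates, though quality at lengths well past the 8192-token adaptation stage is not guaranteed.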