kartikmosaicml committed
Commit 0c8ae92
Parent: 727e05d

Update README.md

Files changed (1):
README.md +1 -1
README.md CHANGED
@@ -99,7 +99,7 @@ model = transformers.AutoModelForCausalLM.from_pretrained(
 )
 ```
 
- The model was trained initially with a sequence length of 4096 with an additional pretraining stage for sequence length adaptation up to 8192. However, ALiBi enables users to increase the maximum sequence length even further during finetuning and/or inference. For example:
+ The model was trained initially with a sequence length of 2048 with an additional pretraining stage for sequence length adaptation up to 8192. However, ALiBi enables users to increase the maximum sequence length even further during finetuning and/or inference. For example:
 
 ```python
 import transformers
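
The hunk ends at the first line of the README's example, so the rest of it is not visible here. For orientation, a minimal sketch of the usual ALiBi length-extension pattern referenced by the edited sentence; the model name `mosaicml/mpt-7b-8k` and the target length of 16384 are assumptions for illustration and do not appear in this diff:

```python
import transformers

# Hypothetical model name; substitute the checkpoint this README describes.
name = "mosaicml/mpt-7b-8k"

# Load the model config and raise max_seq_len beyond the 8192 used in the
# sequence-length-adaptation pretraining stage (16384 is an assumed example).
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 16384

# Instantiate the model with the overridden config.
model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    trust_remote_code=True,
)
```

Because ALiBi uses fixed linear attention biases rather than learned position embeddings, raising `max_seq_len` at load time is enough; no changes to the weights are required.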