andreaskoepf committed
Commit
8808e0f
1 Parent(s): 5a7e696

Update README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED
@@ -10,8 +10,8 @@ license: other
 
 ## Long context via RoPE scaling
 
-This model was fine-tuned with a context size of 8192 tokens using linear scaling of RoPE embeddings. In order to load and use this model
-a version >=4.31.0 of HF Transformers needs to be installed.
+This model was fine-tuned with a context size of 8192 tokens using linear scaling of RoPE embeddings. In order to load this model
+HF Transformers >=4.31.0 is required.
 
 
 ## Model Configuration
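
For reference, loading a model fine-tuned this way typically only needs a recent Transformers release, since the linear RoPE scaling parameters are stored in the model's config and picked up automatically. A minimal sketch, assuming transformers >= 4.31.0 and using a placeholder repo id (`your-org/your-model` is not the actual model name):

```python
# Minimal loading sketch; requires transformers >= 4.31.0, which added
# rope_scaling support for Llama-family models.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-model"  # placeholder: substitute the actual Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The linear RoPE scaling stored in the model config is applied automatically,
# so prompts up to the fine-tuned 8192-token context can be used.
inputs = tokenizer("Long-context prompt goes here...", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```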