andreaskoepf committed
Commit 8808e0f
Parent(s): 5a7e696
Update README.md

README.md CHANGED
@@ -10,8 +10,8 @@ license: other
 
 ## Long context via RoPE scaling
 
-This model was fine-tuned with a context size of 8192 tokens using linear scaling of RoPE embeddings. In order to load
-
+This model was fine-tuned with a context size of 8192 tokens using linear scaling of RoPE embeddings. In order to load this model
+HF Transformers >=4.31.0 is required.
 
 
 ## Model Configuration
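The linear RoPE scaling mentioned in the diff can be sketched in a few lines: position indices are divided by a scale factor before the rotary angles are computed, so a model trained for 4096 tokens can address 8192 positions. This is a minimal illustration of the mechanism, not code from the model itself; the function name and parameters are hypothetical.

```python
import math

def rope_angles(position, dim, base=10000.0, scale=1.0):
    # Linear RoPE scaling divides the position index by `scale`,
    # stretching the original context window by that factor.
    pos = position / scale
    return [pos / (base ** (2 * i / dim)) for i in range(dim // 2)]

# With scale=2.0, position 8192 yields the same rotary angles that
# position 4096 did in the unscaled model, so the pretrained 4096-token
# window effectively covers 8192 tokens.
print(rope_angles(8192, 64, scale=2.0) == rope_angles(4096, 64, scale=1.0))
```

In HF Transformers >=4.31.0 this behavior is exposed via the `rope_scaling` model-config option (e.g. `{"type": "linear", "factor": 2.0}`), which is why that release is the minimum version required to load the model.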