Sweaterdog committed
Commit b74bc5c
1 Parent(s): 4011d6d

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -99,7 +99,7 @@ goto loop
 
 #
 
- For Anybody who is wondering what the context length is, the Qwen version, has a length of 64000 tokens, for the Llama version, it has a 128000 token context window. *(***NOTE***Any model can support a longer context, but these are the supported values in training)*
+ For Anybody who is wondering what the context length is, the Qwen version, has a length of 64000 tokens, for the Llama version, it has a 128000 token context window. *(***NOTE*** Any model can support a longer context, but these are the supported values in training)*
 
 #
 