Tags: Transformers · mpt · Composer · MosaicML · llm-foundry · text-generation-inference
TheBloke committed
Commit f123950
1 Parent(s): 4bb6ada

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -63,7 +63,7 @@ prompt
 The base model has an 8K context length. It is not yet confirmed if the 8K context of this model works with the quantised files.
 
 If it does, [KoboldCpp](https://github.com/LostRuins/koboldcpp) supports 8K context if you manually it to 8K by adjusting the text box above the slider:
-![.](https://i.imgur.com/tEbpeJq.png)
+![.](https://i.imgur.com/tEbpeJqm.png)
 
 It is currently unknown as to whether it is compatible with other clients.