TheBloke committed
Commit ce2ad61
1 Parent(s): 45fb9c5

Update README.md

Files changed (1)
  1. README.md +6 -2
README.md CHANGED
@@ -59,9 +59,13 @@ This should be fixed in the next release of KoboldCpp, so if you are running a v
 
 ## Prompt template
 
-Just type the prompt!
+Based on the code for the MPT 30B Chat Space, I believe this is the correct prompt template:
+
 ```
-prompt
+<|im_start|>system
+A conversation between a user and an LLM-based AI assistant. The assistant gives helpful and honest answers.<|im_end|>
+<|im_start|>user
+prompt goes here<|im_end|>
 ```
 
 ## A note regarding context length: 8K
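
For readers applying the updated template, here is a minimal Python sketch of how the prompt above might be assembled. The `build_prompt` helper name and the trailing `<|im_start|>assistant` cue are assumptions for illustration; the README itself only specifies the system and user turns shown in the diff.

```python
# Minimal sketch (not taken from the diff): assembling the ChatML-style prompt
# described in the updated README section.

SYSTEM_MESSAGE = (
    "A conversation between a user and an LLM-based AI assistant. "
    "The assistant gives helpful and honest answers."
)


def build_prompt(user_message: str, system_message: str = SYSTEM_MESSAGE) -> str:
    """Format a single-turn prompt using the template from the README."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        # ChatML-style chat models are usually cued with an opening assistant turn;
        # this final line is an assumption, since the README template stops at the user turn.
        "<|im_start|>assistant\n"
    )


if __name__ == "__main__":
    print(build_prompt("prompt goes here"))
```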