TheBloke and nacs committed on
Commit: 8efe657
Parent: 890b9dc

Fix prompt format in llama.cpp command (#2)


- Fix prompt format in llama.cpp command (e8393d61ec9236075184a66f0e297081b7953561)


Co-authored-by: nacs <nacs@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -125,7 +125,7 @@ Make sure you are using `llama.cpp` from commit [6381d4e110bd0ec02843a60bbeb8b6f
 For compatibility with older versions of llama.cpp, or for use with third-party clients and libaries, please use GGML files instead.
 
 ```
-./main -t 10 -ngl 32 -m airoboros-c34b-2.1.q4_K_M.gguf --color -c 4096 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "### Instruction: Write a story about llamas\n### Response:"
+./main -t 10 -ngl 32 -m airoboros-c34b-2.1.q4_K_M.gguf --color -c 4096 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "A chat\nUSER: Write a story about llamas\nASSISTANT:\n"
 ```
 Change `-t 10` to the number of physical CPU cores you have. For example if your system has 8 cores/16 threads, use `-t 8`.
 
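
The corrected command follows the Airoboros 2.1 prompt template, `A chat\nUSER: {instruction}\nASSISTANT:\n`. As a minimal sketch, the hypothetical wrapper below builds that prompt from a command-line argument and passes it to `./main` with the same flags and model file as the README command; the script name and structure are illustrative and not part of this repository.

```bash
#!/usr/bin/env bash
# Hypothetical wrapper (e.g. ask.sh), not part of this repo: it reuses the exact
# flags and model filename from the README command above and only substitutes
# the user's instruction into the Airoboros 2.1 template.
set -euo pipefail

INSTRUCTION="${1:?usage: $0 INSTRUCTION}"

# Template from the fixed command: A chat\nUSER: <instruction>\nASSISTANT:\n
# The "\n" sequences are passed through literally, exactly as in the README.
PROMPT="A chat\nUSER: ${INSTRUCTION}\nASSISTANT:\n"

./main -t 10 -ngl 32 -m airoboros-c34b-2.1.q4_K_M.gguf --color -c 4096 \
  --temp 0.7 --repeat_penalty 1.1 -n -1 -p "${PROMPT}"
```

Running `./ask.sh "Write a story about llamas"` reproduces the fixed command shown in the diff.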