Sagicc committed
Commit a172c3a
1 Parent(s): 0d24090

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -102,17 +102,17 @@ Invoke the llama.cpp server or the CLI.
 CLI:
 
 ```bash
-llama-cli --hf-repo YorkieOH10/granite-8b-code-instruct-Q4_K_M-GGUF --model granite-8b-code-instruct.Q4_K_M.gguf -p "The meaning to life and the universe is"
+llama-cli --hf-repo Sagicc/granite-8b-code-instruct-Q5_K_M-GGUF --model granite-8b-code-instruct.Q5_K_M.gguf -p "You are an AI assistant"
 ```
 
 Server:
 
 ```bash
-llama-server --hf-repo YorkieOH10/granite-8b-code-instruct-Q4_K_M-GGUF --model granite-8b-code-instruct.Q4_K_M.gguf -c 2048
+llama-server --hf-repo Sagicc/granite-8b-code-instruct-Q5_K_M-GGUF --model granite-8b-code-instruct.Q5_K_M.gguf -c 2048
 ```
 
 Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo as well.
 
 ```
-git clone https://github.com/ggerganov/llama.cpp && cd llama.cpp && make && ./main -m granite-8b-code-instruct.Q4_K_M.gguf -n 128
+git clone https://github.com/ggerganov/llama.cpp && cd llama.cpp && make && ./main -m granite-8b-code-instruct.Q5_K_M.gguf -n 128
 ```
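For reference, once the updated `llama-server` command above is running, the model can be queried over HTTP. Below is a minimal sketch, assuming llama.cpp's default bind address of 127.0.0.1:8080 and its `/completion` endpoint; both depend on your llama.cpp version and flags, so check the llama.cpp server documentation for your build.

```bash
# Start the server with the updated Q5_K_M checkpoint
# (the GGUF file is fetched from the Hugging Face repo on first run)
llama-server --hf-repo Sagicc/granite-8b-code-instruct-Q5_K_M-GGUF \
  --model granite-8b-code-instruct.Q5_K_M.gguf -c 2048 &

# Request a short completion; 8080 is llama.cpp's default port and may differ on your setup
curl -s http://127.0.0.1:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "You are an AI assistant", "n_predict": 128}'
```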