Tested llama-cli and fixed the command for a better conversation
README.md
CHANGED
@@ -32,5 +32,5 @@ rm -r RWKV-6-World-1.6B-GGUF
 
 * Now to run the model, you can use the following command:
 ```
-
+./llama-cli -m ./model/RWKV-6-World-1.6B-GGUF-Q4_K_M.gguf --in-suffix "Assistant:" --interactive-first -c 1024 -t 0.7 --top-k 50 --top-p 0.95 -n 128 -p "Assistant: Hello, what can i help you with today?\nUser:" -r "User:"
 ```
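For readers of the diff, here is a minimal annotated sketch of the same invocation, with flag meanings taken from llama.cpp's CLI. One assumption is called out: in llama.cpp, `-t` sets the CPU thread count and `--temp` sets the sampling temperature, so the `-t 0.7` in the committed command is likely intended as `--temp 0.7`, and the sketch below uses `--temp`.

```shell
# Annotated sketch of the committed llama-cli command (not the commit itself).
# Assumption: "-t 0.7" was meant as "--temp 0.7"; in llama.cpp, -t is threads.
#
#   -m                        path to the quantized GGUF model file
#   --interactive-first       enter interactive mode before generating anything
#   --in-suffix "Assistant:"  text appended after each user input
#   -c 1024                   context window size in tokens
#   --temp 0.7                sampling temperature (see assumption above)
#   --top-k 50 --top-p 0.95   sampling truncation
#   -n 128                    max tokens generated per reply
#   -p "..."                  initial prompt seeding the conversation
#   -r "User:"                reverse prompt: control returns to you at "User:"
./llama-cli -m ./model/RWKV-6-World-1.6B-GGUF-Q4_K_M.gguf \
  --interactive-first --in-suffix "Assistant:" \
  -c 1024 --temp 0.7 --top-k 50 --top-p 0.95 -n 128 \
  -p "Assistant: Hello, what can i help you with today?\nUser:" \
  -r "User:"
```

The reverse prompt `-r "User:"` is what makes this a turn-based chat: generation stops whenever the model emits that string, and the `--in-suffix` reinserts `Assistant:` after each of your turns.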