chenghenry committed
Commit 3db0199 (parent: 8bb1eef)

Update README.md

Files changed (1): README.md (+36 -1)
README.md:

---
license: gemma
library_name: transformers
base_model: google/gemma-1.1-7b-it
---

## Usage (llama-cli with GPU):
```
llama-cli -m ./gemma-1.1-7b-it-Q6_K.gguf -ngl 100 --temp 0 --repeat-penalty 1.0 --color -p "Why is the sky blue?"
```

## Usage (llama-cli with CPU):
```
llama-cli -m ./gemma-1.1-7b-it-Q6_K.gguf --temp 0 --repeat-penalty 1.0 --color -p "Why is the sky blue?"
```
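
Both llama-cli examples above assume `gemma-1.1-7b-it-Q6_K.gguf` is already present in the working directory. One way to fetch it is with `huggingface_hub`; the sketch below assumes the `huggingface_hub` package is installed and uses the same repo and filename as the llama-cpp-python example further down.
```
# Sketch: download the quantized GGUF file from the Hub before running llama-cli.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="chenghenry/gemma-1.1-7b-it-GGUF",
    filename="gemma-1.1-7b-it-Q6_K.gguf",
)
print(model_path)  # pass this path to llama-cli via -m
```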

## Usage (llama-cpp-python via Hugging Face Hub):
```
from llama_cpp import Llama

# Downloads the GGUF file from the Hub (if not already cached) and loads it.
llm = Llama.from_pretrained(
    repo_id="chenghenry/gemma-1.1-7b-it-GGUF",
    filename="gemma-1.1-7b-it-Q6_K.gguf",
    n_ctx=8192,           # context window size
    n_batch=2048,         # prompt-processing batch size
    n_gpu_layers=100,     # offload all layers to the GPU; set to 0 for CPU-only
    verbose=False,
    chat_format="gemma",  # use the Gemma chat template
)

prompt = "Why is the sky blue?"

messages = [{"role": "user", "content": prompt}]
response = llm.create_chat_completion(
    messages=messages,
    repeat_penalty=1.0,
    temperature=0,
)

print(response["choices"][0]["message"]["content"])
```
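
For token-by-token output, `create_chat_completion` also accepts `stream=True`. A minimal sketch, reusing the `llm` and `messages` objects from the example above:
```
# Sketch: stream the response as it is generated (assumes `llm` and `messages`
# from the previous example).
stream = llm.create_chat_completion(
    messages=messages,
    repeat_penalty=1.0,
    temperature=0,
    stream=True,
)

for chunk in stream:
    delta = chunk["choices"][0]["delta"]
    # The first chunk carries the role; later chunks carry pieces of content.
    print(delta.get("content", ""), end="", flush=True)
print()
```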