apepkuss79 committed
Commit f59cda6
1 Parent(s): 27e95b4

Update README.md

Files changed (1)
  1. README.md +30 -10
README.md CHANGED
@@ -24,25 +24,45 @@ quantized_by: Second State Inc.
 
 - LlamaEdge version: coming soon
 
-<!-- - LlamaEdge version: [v0.14.3](https://github.com/LlamaEdge/LlamaEdge/releases/tag/0.14.3)
+<!-- - LlamaEdge version: [v0.14.3](https://github.com/LlamaEdge/LlamaEdge/releases/tag/0.14.3) -->
 
 - Prompt template
 
-  - Prompt type: `chatml`
+  - Prompt type: `functionary-32`
 
   - Prompt string
 
   ```text
-  <|im_start|>system
-  {system_message}<|im_end|>
-  <|im_start|>user
-  {prompt}<|im_end|>
-  <|im_start|>assistant
-  ``` -->
+  <|start_header_id|>system<|end_header_id|>
+
+  You are capable of executing available function(s) if required.
+  Only execute function(s) when absolutely necessary.
+  Ask for the required input to:recipient==all
+  Use JSON for function arguments.
+  Respond in this format:
+  >>>${recipient}
+  ${content}
+  Available functions:
+  // Supported function definitions that should be called when necessary.
+  namespace functions {
+
+  // Get the current weather
+  type get_current_weather = (_: {
+
+  // The city and state, e.g. San Francisco, CA
+  location: string,
+
+  }) => any;
+
+
+  } // namespace functions<|eot_id|><|start_header_id|>user<|end_header_id|>
+
+  What is the weather like in Beijing today?<|eot_id|><|start_header_id|>assistant<|end_header_id|>
+  ```
 
 - Context size: `128000`
 
-<!-- - Run as LlamaEdge service
+- Run as LlamaEdge service
 
   ```bash
   wasmedge --dir .:. --nn-preload default:GGML:AUTO:functionary-small-v3.2-Q5_K_M.gguf \
@@ -59,7 +79,7 @@ quantized_by: Second State Inc.
   llama-chat.wasm \
   --prompt-template functionary-32 \
   --ctx-size 128000
-  ``` -->
+  ```
 
 ## Quantized GGUF Models
 
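For context on the change above: the `namespace functions { ... }` block in the new `functionary-32` prompt string is the rendered form of a tool definition, so a client normally sends the tool as JSON rather than writing the prompt by hand. The sketch below is a minimal, assumed usage example and is not part of this README: it presumes the elided middle of the diff starts LlamaEdge's OpenAI-compatible API server (`llama-api-server.wasm`) on its default port 8080, and the model name in the request is a placeholder.

```bash
# Hypothetical request against the assumed OpenAI-compatible endpoint of the
# LlamaEdge service; port, path, and model name are assumptions, not taken
# from this README. The "tools" entry mirrors the get_current_weather
# definition shown in the functionary-32 prompt string above.
curl -X POST http://localhost:8080/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "functionary-small-v3.2",
    "messages": [
      {"role": "user", "content": "What is the weather like in Beijing today?"}
    ],
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "get_current_weather",
          "description": "Get the current weather",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA"
              }
            },
            "required": ["location"]
          }
        }
      }
    ]
  }'
```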