Triangle104 committed 2f1a0ae
Parent(s): 7b474ed

Update README.md
README.md CHANGED
@@ -3,6 +3,7 @@ base_model: FuseAI/FuseChat-Llama-3.2-1B-Instruct
 tags:
 - llama-cpp
 - gguf-my-repo
+license: llama3.2
 ---
 
 # Triangle104/FuseChat-Llama-3.2-1B-Instruct-Q5_K_M-GGUF
@@ -184,4 +185,4 @@ Step 3: Run inference through the main binary.
 or
 ```
 ./llama-server --hf-repo Triangle104/FuseChat-Llama-3.2-1B-Instruct-Q5_K_M-GGUF --hf-file fusechat-llama-3.2-1b-instruct-q5_k_m.gguf -c 2048
-```
+```
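
The hunk above ends the README's "Step 3: Run inference through the main binary" section with the llama-server invocation. As a rough usage sketch (not part of this commit): once llama-server is running it exposes llama.cpp's OpenAI-compatible HTTP API, by default on port 8080, so it can be queried with curl from another shell. The prompt text and temperature below are illustrative placeholders.
```
# Start the server; --hf-repo/--hf-file fetch the GGUF from the Hugging Face repo if it is not cached
./llama-server --hf-repo Triangle104/FuseChat-Llama-3.2-1B-Instruct-Q5_K_M-GGUF --hf-file fusechat-llama-3.2-1b-instruct-q5_k_m.gguf -c 2048

# In another shell, send a chat request to the OpenAI-compatible endpoint (default port 8080)
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}], "temperature": 0.7}'
```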