apepkuss79 committed
Commit: b8ee068
Parent: 5d9ccb3
Update README.md

Add q6 and q8 models to table
README.md
CHANGED
@@ -66,4 +66,5 @@ tags:
 | [openchat-3.5-0106.Q5_0.gguf](https://huggingface.co/second-state/OpenChat-3.5-0106-GGUF/blob/main/openchat-3.5-0106-Q5_0.gguf) | Q5_0 | 5 | 5.00 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
 | [openchat-3.5-0106.Q5_K_M.gguf](https://huggingface.co/second-state/OpenChat-3.5-0106-GGUF/blob/main/openchat-3.5-0106-Q5_K_M.gguf) | Q5_K_M | 5 | 5.13 GB| large, very low quality loss - recommended |
 | [openchat-3.5-0106.Q5_K_S.gguf](https://huggingface.co/second-state/OpenChat-3.5-0106-GGUF/blob/main/openchat-3.5-0106-Q5_K_S.gguf) | Q5_K_S | 5 | 5.00 GB| large, low quality loss - recommended |
-
+| [openchat-3.5-0106.Q6_K.gguf](https://huggingface.co/second-state/OpenChat-3.5-0106-GGUF/blob/main/openchat-3.5-0106-Q6_K.gguf) | Q6_K | 6 | 5.94 GB| very large, extremely low quality loss |
+| [openchat-3.5-0106.Q8_0.gguf](https://huggingface.co/second-state/OpenChat-3.5-0106-GGUF/blob/main/openchat-3.5-0106-Q8_0.gguf) | Q8_0 | 8 | 7.70 GB| very large, extremely low quality loss - not recommended |
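The file sizes in the added rows can be roughly sanity-checked against the quantization level. This is a sketch, not part of the commit: it assumes OpenChat-3.5-0106 is a Mistral-7B-based model with about 7.24e9 parameters and that the table's "GB" is decimal gigabytes.

```python
# Rough sanity check of the quantized file sizes added in this commit.
# Assumption (not from the diff): ~7.24e9 parameters (Mistral-7B base),
# and "GB" in the table means decimal gigabytes (1e9 bytes).

PARAMS = 7.24e9  # assumed parameter count

def bits_per_weight(file_size_gb: float, n_params: float = PARAMS) -> float:
    """Approximate average bits stored per model weight."""
    return file_size_gb * 1e9 * 8 / n_params

# Sizes taken from the two table rows added in this commit.
for name, size_gb in [("Q6_K", 5.94), ("Q8_0", 7.70)]:
    print(f"{name}: ~{bits_per_weight(size_gb):.2f} bits/weight")
```

The results land near the nominal widths of these formats (Q6_K around 6.5 bits per weight, Q8_0 around 8.5, the extra fraction covering scales and metadata), which is consistent with the sizes listed in the table.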