Xin Liu committed
Commit 4447448
1 Parent(s): be6ec58

Signed-off-by: Xin Liu <sam@secondstate.io>

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -65,7 +65,7 @@ tags:
  | [Qwen1.5-1.8B-Chat-Q2_K.gguf](https://huggingface.co/second-state/Qwen1.5-1.8B-Chat-GGUF/blob/main/Qwen1.5-1.8B-Chat-Q2_K.gguf) | Q2_K | 2 | 863 MB| smallest, significant quality loss - not recommended for most purposes |
  | [Qwen1.5-1.8B-Chat-Q3_K_L.gguf](https://huggingface.co/second-state/Qwen1.5-1.8B-Chat-GGUF/blob/main/Qwen1.5-1.8B-Chat-Q3_K_L.gguf) | Q3_K_L | 3 | 1.06 GB| small, substantial quality loss |
  | [Qwen1.5-1.8B-Chat-Q3_K_M.gguf](https://huggingface.co/second-state/Qwen1.5-1.8B-Chat-GGUF/blob/main/Qwen1.5-1.8B-Chat-Q3_K_M.gguf) | Q3_K_M | 3 | 1.02 GB| very small, high quality loss |
- | [Qwen1.5-1.8B-Chat-Q3_K_S.gguf](hhttps://huggingface.co/second-state/Qwen1.5-1.8B-Chat-GGUF/blob/main/Qwen1.5-1.8B-Chat-Q3_K_S.gguf) | Q3_K_S | 3 | 970 MB| very small, high quality loss |
+ | [Qwen1.5-1.8B-Chat-Q3_K_S.gguf](https://huggingface.co/second-state/Qwen1.5-1.8B-Chat-GGUF/blob/main/Qwen1.5-1.8B-Chat-Q3_K_S.gguf) | Q3_K_S | 3 | 970 MB| very small, high quality loss |
  | [Qwen1.5-1.8B-Chat-Q4_0.gguf](https://huggingface.co/second-state/Qwen1.5-1.8B-Chat-GGUF/blob/main/Qwen1.5-1.8B-Chat-Q4_0.gguf) | Q4_0 | 4 | 1.12 GB| legacy; small, very high quality loss - prefer using Q3_K_M |
  | [Qwen1.5-1.8B-Chat-Q4_K_M.gguf](https://huggingface.co/second-state/Qwen1.5-1.8B-Chat-GGUF/blob/main/Qwen1.5-1.8B-Chat-Q4_K_M.gguf) | Q4_K_M | 4 | 1.22 GB| medium, balanced quality - recommended |
  | [Qwen1.5-1.8B-Chat-Q4_K_S.gguf](https://huggingface.co/second-state/Qwen1.5-1.8B-Chat-GGUF/blob/main/Qwen1.5-1.8B-Chat-Q4_K_S.gguf) | Q4_K_S | 4 | 1.16 GB| small, greater quality loss |
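
For reference, the files listed in the table can be fetched programmatically. The snippet below is a minimal sketch using the `huggingface_hub` Python client to download the recommended Q4_K_M file; the repo id and filename come from the table above, and the variable name `model_path` is just illustrative.

```python
# Minimal sketch: download one of the GGUF files listed above via huggingface_hub.
# The repo id and filename are taken from the table; everything else uses library defaults.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="second-state/Qwen1.5-1.8B-Chat-GGUF",
    filename="Qwen1.5-1.8B-Chat-Q4_K_M.gguf",  # medium, balanced quality - recommended
)
print(model_path)  # local path to the downloaded .gguf file
```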