apepkuss79 committed on
Commit 8787161
1 parent: 064dc6d

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +4 -0

README.md CHANGED
@@ -84,6 +84,10 @@ tags:
 | [Qwen1.5-110B-Chat-Q6_K-00001-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q6_K-00001-of-00003.gguf) | Q6_K | 6 | 31.9 GB | very large, extremely low quality loss |
 | [Qwen1.5-110B-Chat-Q6_K-00002-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q6_K-00002-of-00003.gguf) | Q6_K | 6 | 32 GB | very large, extremely low quality loss |
 | [Qwen1.5-110B-Chat-Q6_K-00003-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q6_K-00003-of-00003.gguf) | Q6_K | 6 | 27.3 GB | very large, extremely low quality loss |
+| [Qwen1.5-110B-Chat-Q8_0-00001-of-00004.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q8_0-00001-of-00004.gguf) | Q8_0 | 8 | 32.1 GB | very large, extremely low quality loss - not recommended |
+| [Qwen1.5-110B-Chat-Q8_0-00002-of-00004.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q8_0-00002-of-00004.gguf) | Q8_0 | 8 | 31.9 GB | very large, extremely low quality loss - not recommended |
+| [Qwen1.5-110B-Chat-Q8_0-00003-of-00004.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q8_0-00003-of-00004.gguf) | Q8_0 | 8 | 32.2 GB | very large, extremely low quality loss - not recommended |
+| [Qwen1.5-110B-Chat-Q8_0-00004-of-00004.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q8_0-00004-of-00004.gguf) | Q8_0 | 8 | 22 GB | very large, extremely low quality loss - not recommended |
 <!-- | [Qwen1.5-7B-Chat-Q3_K_L.gguf](https://huggingface.co/second-state/Qwen1.5-7B-Chat-GGUF/blob/main/Qwen1.5-7B-Chat-Q3_K_L.gguf) | Q3_K_L | 3 | 4.22 GB | small, substantial quality loss |
 | [Qwen1.5-7B-Chat-Q3_K_M.gguf](https://huggingface.co/second-state/Qwen1.5-7B-Chat-GGUF/blob/main/Qwen1.5-7B-Chat-Q3_K_M.gguf) | Q3_K_M | 3 | 3.92 GB | very small, high quality loss |
 | [Qwen1.5-7B-Chat-Q3_K_S.gguf](https://huggingface.co/second-state/Qwen1.5-7B-Chat-GGUF/blob/main/Qwen1.5-7B-Chat-Q3_K_S.gguf) | Q3_K_S | 3 | 3.57 GB | very small, high quality loss |