cjpais committed
Commit 3966604
1 Parent(s): 82415a5

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -13,7 +13,7 @@ Notes: Was prepared with a unofficial script, and is likely missing some data an
 | ---- | ---- | ---- | ---- | ----- |
 | [llava-v1.6-34b.Q3_K_XS.gguf](https://huggingface.co/cjpais/llava-v1.6-34B-gguf/blob/main/llava-1.6-34b.Q3_K_XS.gguf) | Q3_K_XS | 3 | 14.2 GB| very small, high quality loss |
 | [llava-v1.6-34b.Q3_K_M.gguf](https://huggingface.co/cjpais/llava-v1.6-34B-gguf/blob/main/llava-1.6-34b.Q3_K.gguf) | Q3_K_M | 3 | 16.7 GB| very small, high quality loss |
-| [llava-v1.6-34b.Q4_K_M.gguf](https://huggingface.co/cjpais/llava-v1.6-34B-gguf/blob/main/llava-1.6-34b.Q4_K_M.gguf) | Q4_K_M | 4 | 20.66 GB| medium, balanced quality - recommended |
+| [llava-v1.6-34b.Q4_K_M.gguf](https://huggingface.co/cjpais/llava-v1.6-34B-gguf/blob/main/llava-v1.6-34b.Q4_K_M.gguf) | Q4_K_M | 4 | 20.66 GB| medium, balanced quality - recommended |
 | [llava-v1.6-34b.Q5_K_S.gguf](https://huggingface.co/cjpais/llava-v1.6-34B-gguf/blob/main/llava-1.6-34b.Q5_K_S.gguf) | Q5_K_S | 5 | 23.7 GB| large, low quality loss - recommended |
 | [llava-v1.6-34b.Q5_K_M.gguf](https://huggingface.co/cjpais/llava-v1.6-34B-gguf/blob/main/ggml-model-Q5_K.gguf) | Q5_K_M | 5 | 24.3 GB| large, very low quality loss - recommended |
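For context, a minimal sketch of downloading the corrected Q4_K_M file with the `huggingface_hub` Python client. The repo id and filename are taken from the fixed link above; everything else (caching location, token handling) is the library's default behavior, not something specified by this commit.

```python
# Minimal sketch, assuming `pip install huggingface_hub`.
# Downloads the Q4_K_M quant using the filename as corrected in this commit.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="cjpais/llava-v1.6-34B-gguf",
    filename="llava-v1.6-34b.Q4_K_M.gguf",  # new name; was llava-1.6-34b.Q4_K_M.gguf in the old link
)
print(gguf_path)  # local cache path to the ~20.66 GB GGUF file
```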