legraphista committed
Commit e6504e9
1 parent: 7e100ec

Upload README.md with huggingface_hub

Files changed (1): README.md (+2 -2)
README.md CHANGED
@@ -74,8 +74,8 @@ IMatrix dataset: [here](https://gist.githubusercontent.com/bartowski1182/eb213dc
  | [glm-4-9b-chat.Q5_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-GGUF/blob/main/glm-4-9b-chat.Q5_K_S.gguf) | Q5_K_S | 6.69GB | ✅ Available | ⚪ Static | 📦 No
  | [glm-4-9b-chat.Q4_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-GGUF/blob/main/glm-4-9b-chat.Q4_K.gguf) | Q4_K | 6.25GB | ✅ Available | ⚪ Static | 📦 No
  | [glm-4-9b-chat.Q4_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-GGUF/blob/main/glm-4-9b-chat.Q4_K_S.gguf) | Q4_K_S | 5.75GB | ✅ Available | ⚪ Static | 📦 No
- | glm-4-9b-chat.IQ4_NL | IQ4_NL | - | Processing | ⚪ Static | -
- | glm-4-9b-chat.IQ4_XS | IQ4_XS | - | Processing | ⚪ Static | -
+ | [glm-4-9b-chat.IQ4_NL.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-GGUF/blob/main/glm-4-9b-chat.IQ4_NL.gguf) | IQ4_NL | 5.51GB | ✅ Available | ⚪ Static | 📦 No
+ | [glm-4-9b-chat.IQ4_XS.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-GGUF/blob/main/glm-4-9b-chat.IQ4_XS.gguf) | IQ4_XS | 5.30GB | ✅ Available | ⚪ Static | 📦 No
  | [glm-4-9b-chat.Q3_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-GGUF/blob/main/glm-4-9b-chat.Q3_K.gguf) | Q3_K | 5.06GB | ✅ Available | ⚪ Static | 📦 No
  | [glm-4-9b-chat.Q3_K_L.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-GGUF/blob/main/glm-4-9b-chat.Q3_K_L.gguf) | Q3_K_L | 5.28GB | ✅ Available | ⚪ Static | 📦 No
  | [glm-4-9b-chat.Q3_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-GGUF/blob/main/glm-4-9b-chat.Q3_K_S.gguf) | Q3_K_S | 4.59GB | ✅ Available | ⚪ Static | 📦 No
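The rows in the table link to files hosted on the Hugging Face Hub. As a minimal sketch (the helper name and the `huggingface_hub` usage are illustrative additions, not part of the original README), the direct-download URL for any listed quant can be built from the repo id and filename:

```python
REPO_ID = "legraphista/glm-4-9b-chat-GGUF"
FILENAME = "glm-4-9b-chat.IQ4_NL.gguf"  # the 5.51GB IQ4_NL quant added in this commit


def resolve_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Build the Hub's raw-file URL. The /blob/ links in the table render an
    HTML page; /resolve/ serves the actual file bytes."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"


print(resolve_url(REPO_ID, FILENAME))
# To download into the local HF cache instead (needs `pip install huggingface_hub`
# and ~5.5GB of free disk):
#   from huggingface_hub import hf_hub_download
#   path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)
```

`curl -L` against the same `/resolve/` URL also works, since the Hub redirects to the underlying storage.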