legraphista committed on
Commit a5a7083
1 Parent(s): 07d10b7

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -90,7 +90,7 @@ Link: [here](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/mai
  | glm-4-9b-chat.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | -
  | glm-4-9b-chat.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | -
  | [glm-4-9b-chat.Q2_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q2_K.gguf) | Q2_K | 3.99GB | ✅ Available | 🟢 IMatrix | 📦 No
- | glm-4-9b-chat.Q2_K_S | Q2_K_S | - | ⏳ Processing | 🟢 IMatrix | -
+ | [glm-4-9b-chat.Q2_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q2_K_S.gguf) | Q2_K_S | 3.96GB | ✅ Available | 🟢 IMatrix | 📦 No
  | glm-4-9b-chat.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | -
  | glm-4-9b-chat.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | -
  | glm-4-9b-chat.IQ2_XS | IQ2_XS | - | ⏳ Processing | 🟢 IMatrix | -
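
For reference, a minimal sketch of fetching the newly available Q2_K_S quant with `huggingface_hub` (the same library named in the commit message). Only the `repo_id` and `filename` come from the table above; the rest is illustrative:

```python
from huggingface_hub import hf_hub_download

# Download the Q2_K_S GGUF added in this commit.
# repo_id and filename are taken from the README table; local caching is handled by huggingface_hub.
path = hf_hub_download(
    repo_id="legraphista/glm-4-9b-chat-IMat-GGUF",
    filename="glm-4-9b-chat.Q2_K_S.gguf",
)
print(f"GGUF saved to: {path}")
```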