legraphista committed
Commit 3181299
1 Parent(s): 55e3fde

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -63,7 +63,7 @@ Link: [here](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/mai
  | Filename | Quant type | File Size | Status | Uses IMatrix | Is Split |
  | -------- | ---------- | --------- | ------ | ------------ | -------- |
  | glm-4-9b-chat.Q8_0 | Q8_0 | - | ⏳ Processing | ⚪ Static | -
- | glm-4-9b-chat.Q6_K | Q6_K | - | ⏳ Processing | ⚪ Static | -
+ | [glm-4-9b-chat.Q6_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q6_K.gguf) | Q6_K | 8.26GB | Available | ⚪ Static | 📦 No
  | glm-4-9b-chat.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | -
  | glm-4-9b-chat.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | -
  | glm-4-9b-chat.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | -
@@ -75,7 +75,7 @@ Link: [here](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/mai
  | glm-4-9b-chat.BF16 | BF16 | - | ⏳ Processing | ⚪ Static | -
  | glm-4-9b-chat.FP16 | F16 | - | ⏳ Processing | ⚪ Static | -
  | glm-4-9b-chat.Q8_0 | Q8_0 | - | ⏳ Processing | ⚪ Static | -
- | glm-4-9b-chat.Q6_K | Q6_K | - | ⏳ Processing | ⚪ Static | -
+ | [glm-4-9b-chat.Q6_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q6_K.gguf) | Q6_K | 8.26GB | Available | ⚪ Static | 📦 No
  | glm-4-9b-chat.Q5_K | Q5_K | - | ⏳ Processing | ⚪ Static | -
  | glm-4-9b-chat.Q5_K_S | Q5_K_S | - | ⏳ Processing | ⚪ Static | -
  | glm-4-9b-chat.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | -
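For reference, the file this commit marks as Available can be fetched directly. A minimal sketch (not part of the commit itself) of constructing the raw download URL, assuming the usual Hugging Face convention that files browsable under `/blob/<revision>/` are served raw under `/resolve/<revision>/`:

```python
# Hedged sketch: build the direct download URL for the newly available
# Q6_K GGUF. Repo ID and filename are taken from the diff above; the
# /resolve/ path pattern is an assumption based on Hugging Face Hub's
# standard URL layout.
REPO_ID = "legraphista/glm-4-9b-chat-IMat-GGUF"
FILENAME = "glm-4-9b-chat.Q6_K.gguf"


def download_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Return the raw-file URL for `filename` at `revision` in `repo_id`."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"


print(download_url(REPO_ID, FILENAME))
```

In practice one would typically pass the repo ID and filename to a downloader (e.g. `huggingface_hub.hf_hub_download`) rather than build the URL by hand; the sketch just makes the mapping from the blob link in the table explicit.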