legraphista committed on
Commit
b6fbddb
•
1 Parent(s): f34812f

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +2 -2
README.md CHANGED
@@ -64,7 +64,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IM
 | Filename | Quant type | File Size | Status | Uses IMatrix | Is Split |
 | -------- | ---------- | --------- | ------ | ------------ | -------- |
 | Llama-3-8B-Instruct-MopeyMule.Q8_0 | Q8_0 | - | ⏳ Processing | ⚪ Static | -
-| Llama-3-8B-Instruct-MopeyMule.Q6_K | Q6_K | - | ⏳ Processing | ⚪ Static | -
+| [Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No
 | Llama-3-8B-Instruct-MopeyMule.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | -
@@ -76,7 +76,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IM
 | Llama-3-8B-Instruct-MopeyMule.BF16 | BF16 | - | ⏳ Processing | ⚪ Static | -
 | Llama-3-8B-Instruct-MopeyMule.FP16 | F16 | - | ⏳ Processing | ⚪ Static | -
 | Llama-3-8B-Instruct-MopeyMule.Q8_0 | Q8_0 | - | ⏳ Processing | ⚪ Static | -
-| Llama-3-8B-Instruct-MopeyMule.Q6_K | Q6_K | - | ⏳ Processing | ⚪ Static | -
+| [Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No
 | Llama-3-8B-Instruct-MopeyMule.Q5_K | Q5_K | - | ⏳ Processing | ⚪ Static | -
 | Llama-3-8B-Instruct-MopeyMule.Q5_K_S | Q5_K_S | - | ⏳ Processing | ⚪ Static | -
 | Llama-3-8B-Instruct-MopeyMule.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | -