legraphista committed on
Commit
9f10159
•
1 Parent(s): faf61e5

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -63,7 +63,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IM
 ### Common Quants
 | Filename | Quant type | File Size | Status | Uses IMatrix | Is Split |
 | -------- | ---------- | --------- | ------ | ------------ | -------- |
-| Llama-3-8B-Instruct-MopeyMule.Q8_0 | Q8_0 | - | ⏳ Processing | ⚪ Static | -
+| [Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No
 | [Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No
 | Llama-3-8B-Instruct-MopeyMule.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | -
@@ -75,7 +75,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IM
 | -------- | ---------- | --------- | ------ | ------------ | -------- |
 | Llama-3-8B-Instruct-MopeyMule.BF16 | BF16 | - | ⏳ Processing | ⚪ Static | -
 | Llama-3-8B-Instruct-MopeyMule.FP16 | F16 | - | ⏳ Processing | ⚪ Static | -
-| Llama-3-8B-Instruct-MopeyMule.Q8_0 | Q8_0 | - | ⏳ Processing | ⚪ Static | -
+| [Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No
 | [Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No
 | Llama-3-8B-Instruct-MopeyMule.Q5_K | Q5_K | - | ⏳ Processing | ⚪ Static | -
 | Llama-3-8B-Instruct-MopeyMule.Q5_K_S | Q5_K_S | - | ⏳ Processing | ⚪ Static | -
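The practical effect of this change is that the Q8_0 quant is now listed as downloadable, unsplit, from the same repo as the other files. As a minimal sketch (not part of the original README), assuming the `huggingface_hub` Python package named in the commit message is installed, the newly available file could be fetched like this; the repo id and filename come from the links in the table above, and the returned path is wherever the hub cache places the file.

```python
# Minimal sketch: download the Q8_0 GGUF listed as "✅ Available" above.
# Assumes `pip install huggingface_hub`; repo id and filename are taken
# verbatim from the table links in this diff.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF",
    filename="Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf",
)
print(path)  # local cache path to the ~8.54GB GGUF file
```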