Text Generation · GGUF · quantized · imatrix · quantization · imat · static
legraphista committed on
Commit 76f7c0a • 1 Parent(s): 7fc23f0

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -71,7 +71,7 @@ Link: [here](https://huggingface.co/legraphista/dolphin-2.9.1-llama-3-8b-IMat-GG
  | Filename | Quant type | File Size | Status | Uses IMatrix | Is Split |
  | -------- | ---------- | --------- | ------ | ------------ | -------- |
  | [dolphin-2.9.1-llama-3-8b.FP16.gguf](https://huggingface.co/legraphista/dolphin-2.9.1-llama-3-8b-IMat-GGUF/blob/main/dolphin-2.9.1-llama-3-8b.FP16.gguf) | F16 | 16.07GB | ✅ Available | ⚪ Static | 📦 No
- | dolphin-2.9.1-llama-3-8b.BF16 | BF16 | - | ⏳ Processing | ⚪ Static | -
+ | [dolphin-2.9.1-llama-3-8b.BF16.gguf](https://huggingface.co/legraphista/dolphin-2.9.1-llama-3-8b-IMat-GGUF/blob/main/dolphin-2.9.1-llama-3-8b.BF16.gguf) | BF16 | 16.07GB | ✅ Available | ⚪ Static | 📦 No
  | [dolphin-2.9.1-llama-3-8b.Q5_K.gguf](https://huggingface.co/legraphista/dolphin-2.9.1-llama-3-8b-IMat-GGUF/blob/main/dolphin-2.9.1-llama-3-8b.Q5_K.gguf) | Q5_K | 5.73GB | ✅ Available | ⚪ Static | 📦 No
  | [dolphin-2.9.1-llama-3-8b.Q5_K_S.gguf](https://huggingface.co/legraphista/dolphin-2.9.1-llama-3-8b-IMat-GGUF/blob/main/dolphin-2.9.1-llama-3-8b.Q5_K_S.gguf) | Q5_K_S | 5.60GB | ✅ Available | ⚪ Static | 📦 No
  | dolphin-2.9.1-llama-3-8b.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | -
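
Any file marked ✅ Available in the table above can be fetched directly with huggingface_hub, the same library named in the commit message. The snippet below is a minimal sketch: the repo id and the Q5_K_S filename are taken from the table, and any other listed ✅ Available filename can be substituted.

```python
# Minimal sketch: download one of the GGUF files listed in the table above.
# The repo id and filename come from the table; swap in any other available file.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="legraphista/dolphin-2.9.1-llama-3-8b-IMat-GGUF",
    filename="dolphin-2.9.1-llama-3-8b.Q5_K_S.gguf",
)
print(gguf_path)  # local path to the cached .gguf file
```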