dranger003 / c4ai-command-r-plus-iMat.GGUF

Text Generation · GGUF · License: cc-by-nc-4.0

2 contributors · 169 commits
Latest commit: 598d831 (verified), 7 months ago — dranger003, "Upload folder using huggingface_hub"
| File | Size | LFS | Last commit message | Last updated |
| --- | --- | --- | --- | --- |
| .gitattributes | 6.63 kB | | Upload folder using huggingface_hub | 7 months ago |
| README.md | 5.3 kB | | Update README.md | 7 months ago |
| ggml-c4ai-command-r-plus-104b-ppl.png | 434 kB | | Upload ggml-c4ai-command-r-plus-104b-ppl.png | 7 months ago |
| ggml-c4ai-command-r-plus-f16-00001-of-00005.gguf | 49.5 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-f16-00002-of-00005.gguf | 49.7 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-f16-00003-of-00005.gguf | 49.5 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-f16-00004-of-00005.gguf | 49.5 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-f16-00005-of-00005.gguf | 9.44 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-f16-imatrix.dat | 27.5 MB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q4_k_m-00001-of-00002.gguf | 49.7 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q4_k_m-00002-of-00002.gguf | 13 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q4_k_s-00001-of-00002.gguf | 49.7 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q4_k_s-00002-of-00002.gguf | 9.97 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q5_k_m-00001-of-00002.gguf | 49.7 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q5_k_m-00002-of-00002.gguf | 24 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q5_k_s-00001-of-00002.gguf | 49.6 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q5_k_s-00002-of-00002.gguf | 22.2 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q6_k-00001-of-00002.gguf | 49.7 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q6_k-00002-of-00002.gguf | 35.4 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q8_0-00001-of-00003.gguf | 49.8 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q8_0-00002-of-00003.gguf | 49.7 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| ggml-c4ai-command-r-plus-q8_0-00003-of-00003.gguf | 10.8 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
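The quantizations above are split into numbered shards (`-00001-of-00002`, etc.) because each GGUF file is capped below the Hub's per-file size limits. A minimal stdlib-only sketch of how one might enumerate the shard filenames for a chosen quantization and build their download URLs follows; the `shard_names` and `resolve_urls` helpers are hypothetical names introduced here, and the `resolve/main` URL pattern is the standard Hugging Face file-resolution path (verify against the Hub docs before relying on it):

```python
# Sketch: enumerate the split-GGUF shard filenames listed in this repo
# and build plain HTTPS download URLs for them. Stdlib only; actually
# fetching ~60 GB of shards (e.g. with huggingface_hub or curl) is
# left to the reader.

REPO_ID = "dranger003/c4ai-command-r-plus-iMat.GGUF"

def shard_names(quant: str, n_shards: int) -> list[str]:
    """Filenames follow the split convention visible in the listing:
    ggml-c4ai-command-r-plus-<quant>-XXXXX-of-YYYYY.gguf (1-based,
    zero-padded to five digits)."""
    return [
        f"ggml-c4ai-command-r-plus-{quant}-{i:05d}-of-{n_shards:05d}.gguf"
        for i in range(1, n_shards + 1)
    ]

def resolve_urls(files: list[str]) -> list[str]:
    # The Hub serves repo files at /<repo>/resolve/<revision>/<path>.
    return [f"https://huggingface.co/{REPO_ID}/resolve/main/{f}" for f in files]

if __name__ == "__main__":
    # e.g. the two q4_k_m shards from the table above
    for url in resolve_urls(shard_names("q4_k_m", 2)):
        print(url)
```

Once all shards of one quantization sit in the same directory, recent llama.cpp builds can load the model by pointing `-m` at the first shard (the `-00001-of-…` file); the remaining shards are picked up automatically.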