dranger003/c4ai-command-r-plus-iMat.GGUF (138 likes)
Text Generation · GGUF · Inference Endpoints
License: cc-by-nc-4.0
2 contributors · History: 105 commits
Latest commit f60ae34 (verified) by dranger003, 9 months ago: "Upload ggml-c4ai-command-r-plus-104b-f16-00001-of-00005.gguf with huggingface_hub"
All files below were last updated 9 months ago. Each file's last commit message is "Upload <file name> with huggingface_hub", except README.md ("Update README.md") and .gitattributes (committed alongside the f16 shard upload). Files marked LFS are stored via Git LFS.

| File | Size | LFS |
|------|------|-----|
| .gitattributes | 4.77 kB | |
| README.md | 3.38 kB | |
| ggml-c4ai-command-r-plus-104b-f16-00001-of-00005.gguf | 48.4 GB | LFS |
| ggml-c4ai-command-r-plus-104b-f16-imatrix.dat | 27.5 MB | LFS |
| ggml-c4ai-command-r-plus-104b-iq1_m.gguf | 25.2 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq1_s.gguf | 23.2 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq2_m.gguf | 36 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq2_s.gguf | 33.3 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq2_xs.gguf | 31.6 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq2_xxs.gguf | 28.6 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq3_m.gguf | 47.7 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq3_s.gguf | 46 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq3_xs.gguf | 43.6 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq3_xxs.gguf | 40.7 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq4_xs-00001-of-00002.gguf | 48.7 GB | LFS |
| ggml-c4ai-command-r-plus-104b-iq4_xs-00002-of-00002.gguf | 7.55 GB | LFS |
| ggml-c4ai-command-r-plus-104b-q5_k_s-00001-of-00002.gguf | 48.3 GB | LFS |
| ggml-c4ai-command-r-plus-104b-q5_k_s-00002-of-00002.gguf | 23.5 GB | LFS |
| ggml-c4ai-command-r-plus-104b-q6_k-00001-of-00002.gguf | 49 GB | LFS |
| ggml-c4ai-command-r-plus-104b-q6_k-00002-of-00002.gguf | 36.1 GB | LFS |
| ggml-c4ai-command-r-plus-104b-q8_0-00001-of-00003.gguf | 48.9 GB | LFS |
| ggml-c4ai-command-r-plus-104b-q8_0-00002-of-00003.gguf | 46.3 GB | LFS |
| ggml-c4ai-command-r-plus-104b-q8_0-00003-of-00003.gguf | 15.1 GB | LFS |
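The quantization suffixes in the file names (iq1_s through q8_0) map directly to on-disk size. As a quick sanity check, a small script can convert each listed size (shards summed) into an approximate bits-per-weight figure. This is a sketch, not anything from the repository itself: it assumes the "104b" in the file names means roughly 104 billion parameters, treats the displayed GB as 10^9 bytes, and ignores metadata and mixed-precision tensor overhead, so the numbers are only approximate.

```python
# Approximate bits-per-weight for each quantization, using the file sizes
# from the repository listing. Assumption: "104b" in the names ~= 104e9 params.
PARAMS = 104e9

sizes_gb = {
    "iq1_s": 23.2,
    "iq1_m": 25.2,
    "iq2_xxs": 28.6,
    "iq2_xs": 31.6,
    "iq2_s": 33.3,
    "iq2_m": 36.0,
    "iq3_xxs": 40.7,
    "iq3_xs": 43.6,
    "iq3_s": 46.0,
    "iq3_m": 47.7,
    "iq4_xs": 48.7 + 7.55,        # two shards summed
    "q5_k_s": 48.3 + 23.5,        # two shards summed
    "q6_k": 49.0 + 36.1,          # two shards summed
    "q8_0": 48.9 + 46.3 + 15.1,   # three shards summed
}

def bits_per_weight(gb: float) -> float:
    # gb * 1e9 bytes * 8 bits, spread over the assumed parameter count
    return gb * 1e9 * 8 / PARAMS

for name, gb in sizes_gb.items():
    print(f"{name:8s} {gb:6.1f} GB  ~{bits_per_weight(gb):.2f} bpw")
```

By this rough measure, iq1_s lands near 1.8 bits per weight and q8_0 near 8.5, which lines up with what the quantization names suggest.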