bartowski / granite-3.1-3b-a800m-instruct-GGUF
Text Generation · GGUF · language · granite-3.1 · conversational · License: apache-2.0
granite-3.1-3b-a800m-instruct-GGUF at 480be36 · 1 contributor · History: 35 commits

Latest commit: bartowski · Upload granite-3.1-3b-a800m-instruct-IQ4_NL.gguf with huggingface_hub · 480be36 (verified) · about 9 hours ago
| File | Size | LFS | Last commit message | Updated |
|------|------|-----|---------------------|---------|
| .gitattributes | 3.15 kB | | Upload granite-3.1-3b-a800m-instruct.imatrix with huggingface_hub | 5 days ago |
| README.md | 13.5 kB | | Update metadata with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-IQ3_M.gguf | 1.49 GB | LFS | Upload granite-3.1-3b-a800m-instruct-IQ3_M.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-IQ3_XS.gguf | 1.38 GB | LFS | Upload granite-3.1-3b-a800m-instruct-IQ3_XS.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-IQ4_NL.gguf | 1.88 GB | LFS | Upload granite-3.1-3b-a800m-instruct-IQ4_NL.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-IQ4_XS.gguf | 1.78 GB | LFS | Upload granite-3.1-3b-a800m-instruct-IQ4_XS.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-Q2_K.gguf | 1.24 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q2_K.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-Q2_K_L.gguf | 1.26 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q2_K_L.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-Q3_K_L.gguf | 1.74 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q3_K_L.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-Q3_K_M.gguf | 1.61 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q3_K_M.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-Q3_K_S.gguf | 1.46 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q3_K_S.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-Q3_K_XL.gguf | 1.76 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q3_K_XL.gguf with huggingface_hub | 5 days ago |
| granite-3.1-3b-a800m-instruct-Q4_0.gguf | 1.89 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q4_0.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q4_K_L.gguf | 2.04 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q4_K_L.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q4_K_M.gguf | 2.02 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q4_K_M.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q4_K_S.gguf | 1.9 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q4_K_S.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q5_K_L.gguf | 2.37 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q5_K_L.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q5_K_M.gguf | 2.36 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q5_K_M.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q5_K_S.gguf | 2.29 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q5_K_S.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q6_K.gguf | 2.71 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q6_K.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q6_K_L.gguf | 2.73 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q6_K_L.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct-Q8_0.gguf | 3.51 GB | LFS | Upload granite-3.1-3b-a800m-instruct-Q8_0.gguf with huggingface_hub | about 9 hours ago |
| granite-3.1-3b-a800m-instruct.imatrix | 18.2 MB | LFS | Upload granite-3.1-3b-a800m-instruct.imatrix with huggingface_hub | 5 days ago |
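
The commit messages show that every file in this repository was uploaded with the huggingface_hub library; the same library can fetch a single quantization for local use instead of cloning the whole repo. Below is a minimal sketch, assuming the `huggingface_hub` Python package is installed and that the Q4_K_M file listed above is the quantization you want; swap in any other filename from the table.

```python
# Minimal sketch: download one GGUF quantization from this repo.
# Assumes `pip install huggingface_hub` has been run.
from huggingface_hub import hf_hub_download

# Repo id and filename are taken verbatim from the file listing above.
model_path = hf_hub_download(
    repo_id="bartowski/granite-3.1-3b-a800m-instruct-GGUF",
    filename="granite-3.1-3b-a800m-instruct-Q4_K_M.gguf",
)

# hf_hub_download returns the local cache path of the downloaded file.
print(f"GGUF file available at: {model_path}")
```

The downloaded .gguf file can then be loaded by any GGUF-compatible runtime, for example llama.cpp; the exact launch command depends on the runtime you use.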