qwp4w3hyb/codegeex4-all-9b-iMat-GGUF

Branch: main · 1 contributor · History: 2 commits
Latest commit: a9b2d22 (verified) by qwp4w3hyb, "Upload folder using huggingface_hub", 6 months ago

File                                  Size     LFS   Last commit                          Updated
.gitattributes                        2.2 kB         Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-bf16.gguf            18.8 GB  LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-imat-IQ1_S.gguf      3.1 GB   LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-imat-IQ2_XXS.gguf    3.43 GB  LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-imat-IQ3_XXS.gguf    4.26 GB  LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-imat-IQ4_XS.gguf     5.25 GB  LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-imat-Q4_K_L.gguf     7.88 GB  LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-imat-Q5_K_L.gguf     8.69 GB  LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-imat-Q6_K_L.gguf     9.73 GB  LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b-imat-Q8_0_L.gguf     11.2 GB  LFS   Upload folder using huggingface_hub  6 months ago
codegeex4-all-9b.imatrix              4.16 MB  LFS   Upload folder using huggingface_hub  6 months ago
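
Each quantized file above can be fetched individually rather than cloning the whole repository. The snippet below is a minimal sketch using the huggingface_hub Python library (the same tool named in the commit message): the repo ID and filename are copied from the listing, the choice of the IQ4_XS quant is only an example, and the local cache location depends on your environment.

    # Minimal sketch: download a single GGUF quant from this repository.
    # Assumes `pip install huggingface_hub`; the filename is copied from the
    # file listing above, and IQ4_XS is just one of the available quants.
    from huggingface_hub import hf_hub_download

    model_path = hf_hub_download(
        repo_id="qwp4w3hyb/codegeex4-all-9b-iMat-GGUF",
        filename="codegeex4-all-9b-imat-IQ4_XS.gguf",
    )
    print(model_path)  # local path to the cached .gguf file

The returned path can then be passed to any GGUF-capable runtime, for example llama.cpp (roughly `llama-cli -m <path> -p "..."`; the exact binary name depends on the llama.cpp version you have built).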