PrunaAI/codegemma-7b-it-GGUF-smashed
Maintained by Pruna AI · Tags: GGUF, pruna-ai, Inference Endpoints, conversational
Files and versions
2 contributors · 22 commits
Latest commit bdb4b5a (verified) by johnrachwanpruna, 7 months ago: b8131f46b6b7cef192285a1053b37b2d7386dccbd808315213444fc61a90ffad
| File | Size | LFS | Last commit | Last modified |
|---|---|---|---|---|
| .gitattributes | 2.72 kB | | 5aebf8f3576c817bff9859d84501310f7f4672185d92d72a0d3b4fceb07e33b5 | 7 months ago |
| README.md | 12 kB | | b8131f46b6b7cef192285a1053b37b2d7386dccbd808315213444fc61a90ffad | 7 months ago |
| codegemma-7b-it.IQ3_M.gguf | 4.11 GB | LFS | a5d390d582e923f05c778144388a6198cd69891913919808074fac00a0ece4bc | 7 months ago |
| codegemma-7b-it.IQ3_S.gguf | 3.98 GB | LFS | 3cc51962400408800176777391eee02e1c5b459bcf2fc09dfb027a0310a7cded | 7 months ago |
| codegemma-7b-it.IQ3_XS.gguf | 3.8 GB | LFS | 9d50fa302643860d4f44ca47dc2c3a5ac6319718d6c32e4a581ace39bddbcbb3 | 7 months ago |
| codegemma-7b-it.IQ4_NL.gguf | 5.04 GB | LFS | 26fc87742f9a1e918b7c54fe16f82e46f6e53af74ad6262c6dd8377e55f9f860 | 7 months ago |
| codegemma-7b-it.IQ4_XS.gguf | 4.81 GB | LFS | 8987ee3c7468e2b35dece1713f6c33423b3e2b8650360497c4425b5639f6ce55 | 7 months ago |
| codegemma-7b-it.Q2_K.gguf | 3.48 GB | LFS | 88546dbcdf8b8f3a31d74439f071cdeb2f0b601d05d7767de5b5a9c5e34f0f3d | 7 months ago |
| codegemma-7b-it.Q3_K_L.gguf | 4.71 GB | LFS | c4b9ad8b5e42cd16d8ca7d6a059182990f1aa009ecb3f55208999f6bd4885ecb | 7 months ago |
| codegemma-7b-it.Q3_K_M.gguf | 4.37 GB | LFS | d02990d3062db47049fbc5cf819c8490bf664c73de5ede253b73362d28e0629b | 7 months ago |
| codegemma-7b-it.Q3_K_S.gguf | 3.98 GB | LFS | 4f61afceede5934ab1e016f9b81021420a04ba368ef16956919d5eba7c890074 | 7 months ago |
| codegemma-7b-it.Q4_0.gguf | 5.01 GB | LFS | 2df8d0977d0b84728e01cb2988fe3ccc9f879e2047162d4c67e77ea9b6860884 | 7 months ago |
| codegemma-7b-it.Q4_1.gguf | 5.5 GB | LFS | 0637a81262a049c65a7364a426e3582254bb29a0df4fc36ca10b10952dc3ed98 | 7 months ago |
| codegemma-7b-it.Q4_K_M.gguf | 5.33 GB | LFS | 1add0f934a48675a50adf7736a3b1a9f6a94ec98249d2f166c60da76bc87e8ef | 7 months ago |
| codegemma-7b-it.Q4_K_S.gguf | 5.05 GB | LFS | 6d65ed26944913f48f531f823c4c52b25b1aed95a0c84f37016162f17a1e20b9 | 7 months ago |
| codegemma-7b-it.Q5_0.gguf | 5.98 GB | LFS | 631ea59f420af8858574f93bef876c0223a1ea88de253f3aa800259119edfd39 | 7 months ago |
| codegemma-7b-it.Q5_1.gguf | 6.47 GB | LFS | 6649323fff487c1de1d28a5d3776c482f5cf27bc5816eebd9e0736aaea66752a | 7 months ago |
| codegemma-7b-it.Q5_K_M.gguf | 6.14 GB | LFS | 526bd197df694ec976c30fe3486a5ada6af124de577c7b8ea65bedc432819d76 | 7 months ago |
| codegemma-7b-it.Q5_K_S.gguf | 5.98 GB | LFS | 91e157029022dea05921fcaf670d46333a29f4ba8634ec5fcb3849a996a5dba0 | 7 months ago |
| codegemma-7b-it.Q6_K.gguf | 7.01 GB | LFS | 7e0a82d1aad1f82d3e61f3164307a25168762256f46d9d9716d9184feff9de21 | 7 months ago |
| codegemma-7b-it.Q8_0.gguf | 9.08 GB | LFS | 5aebf8f3576c817bff9859d84501310f7f4672185d92d72a0d3b4fceb07e33b5 | 7 months ago |
| codegemma-7b-it.fp16.bin | 17.1 GB | LFS | 9c65826f9af9839532bd3e527295c0e912bbab40121272b1a3c7eeb2788f19e8 | 7 months ago |
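
The listing above is a set of GGUF quantizations of codegemma-7b-it at different size/quality trade-offs. As a minimal sketch of how one of these files could be used locally (this is not taken from the model card): the example below downloads a single variant with `huggingface_hub` and runs it through `llama-cpp-python`. The choice of the Q4_K_M file, the context size, and the prompt are illustrative assumptions; any other `.gguf` file in the table can be substituted.

```python
# Sketch: fetch one quantized GGUF file from this repo and run a chat turn locally.
# Assumes `huggingface_hub` and `llama-cpp-python` are installed; the file name,
# n_ctx, and prompt below are illustrative, not prescribed by the repository.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download a single GGUF variant (cached locally by the hub client).
model_path = hf_hub_download(
    repo_id="PrunaAI/codegemma-7b-it-GGUF-smashed",
    filename="codegemma-7b-it.Q4_K_M.gguf",
)

# Load the quantized model; n_ctx is an assumed, adjustable context window.
llm = Llama(model_path=model_path, n_ctx=4096)

# The repo is tagged "conversational", so use the chat-completion interface.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}]
)
print(out["choices"][0]["message"]["content"])
```

Smaller variants (e.g. Q2_K at 3.48 GB) trade output quality for memory and speed, while larger ones (Q6_K, Q8_0, or the 17.1 GB fp16 binary) preserve more of the original model at a higher resource cost; the same snippet applies to any of them by changing `filename`.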