ukung/komodo-7b-base-GGUF (Hugging Face model repository)
Text Generation · GGUF · finetuned · License: apache-2.0
Files and versions
1 contributor · History: 11 commits
Latest commit 1ad79e6 (verified, 8 months ago) by ukung: "Upload komodo-7b-base-q5_1.gguf with huggingface_hub"
| File | Size | LFS | Last commit message | Updated |
|---|---|---|---|---|
| .gitattributes | 2.14 kB | | Upload komodo-7b-base-q5_1.gguf with huggingface_hub | 8 months ago |
| README.md | 31 Bytes | | initial commit | 8 months ago |
| komodo-7b-base-q2_k.gguf | 2.55 GB | LFS | Upload komodo-7b-base-q2_k.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q3_k_l.gguf | 3.61 GB | LFS | Upload komodo-7b-base-q3_k_l.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q3_k_m.gguf | 3.31 GB | LFS | Upload komodo-7b-base-q3_k_m.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q3_k_s.gguf | 2.96 GB | LFS | Upload komodo-7b-base-q3_k_s.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q4_0.gguf | 3.84 GB | LFS | Upload komodo-7b-base-q4_0.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q4_1.gguf | 4.26 GB | LFS | Upload komodo-7b-base-q4_1.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q4_k_m.gguf | 4.1 GB | LFS | Upload komodo-7b-base-q4_k_m.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q4_k_s.gguf | 3.87 GB | LFS | Upload komodo-7b-base-q4_k_s.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q5_0.gguf | 4.67 GB | LFS | Upload komodo-7b-base-q5_0.gguf with huggingface_hub | 8 months ago |
| komodo-7b-base-q5_1.gguf | 5.08 GB | LFS | Upload komodo-7b-base-q5_1.gguf with huggingface_hub | 8 months ago |
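The files above all follow the naming pattern `komodo-7b-base-<quant>.gguf`, so fetching any one of them programmatically is straightforward. A minimal sketch, assuming the `huggingface_hub` Python package is installed (it is not part of this repository page itself):

```python
# Repo ID as listed on this page.
REPO_ID = "ukung/komodo-7b-base-GGUF"


def gguf_filename(quant: str) -> str:
    """Build the file name for one of the quantizations listed above,
    e.g. "q4_k_m" -> "komodo-7b-base-q4_k_m.gguf"."""
    return f"komodo-7b-base-{quant}.gguf"


# To actually download (requires network access and several GB of disk):
# from huggingface_hub import hf_hub_download
# path = hf_hub_download(repo_id=REPO_ID, filename=gguf_filename("q4_k_m"))
```

The q4_k_m file (4.1 GB) is a common middle ground between the smallest quant here (q2_k, 2.55 GB) and the largest (q5_1, 5.08 GB); the downloaded GGUF file can then be loaded by GGUF-aware runtimes such as llama.cpp.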