PrunaAI / cognitivecomputations-dolphin-2.9.2-qwen2-72b-GGUF-smashed
Tags: GGUF, pruna-ai, Inference Endpoints, conversational
Files and versions
2 contributors, 22 commits
Latest commit 5232823 (verified) by johnrachwanpruna, 7 months ago: "Upload dolphin-2.9.2-qwen2-72b.Q6_K.gguf-00007-of-00008.gguf with huggingface_hub"
All files were last modified 7 months ago. Each file's last commit message is "Upload <filename> with huggingface_hub"; for .gitattributes, the last commit was the Q6_K shard upload quoted above.

.gitattributes (3.27 kB)
README.md (12.3 kB)
dolphin-2.9.2-qwen2-72b.Q3_K_L.gguf (39.5 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q4_1.gguf (45.7 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q4_K_M.gguf (47.4 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_1.gguf-00001-of-00008.gguf (7.81 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_1.gguf-00002-of-00008.gguf (7.32 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_1.gguf-00005-of-00008.gguf (7.14 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_1.gguf-00006-of-00008.gguf (7.05 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_1.gguf-00007-of-00008.gguf (7.14 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_1.gguf-00008-of-00008.gguf (4.42 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_K_M.gguf-00001-of-00008.gguf (8.08 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_K_M.gguf-00003-of-00008.gguf (6.69 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_K_M.gguf-00006-of-00008.gguf (6.93 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q5_K_M.gguf-00007-of-00008.gguf (7.25 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q6_K.gguf-00007-of-00008.gguf (8.46 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q8_0.gguf-00003-of-00008.gguf (9.74 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q8_0.gguf-00005-of-00008.gguf (10.1 GB, LFS)
dolphin-2.9.2-qwen2-72b.Q8_0.gguf-00008-of-00008.gguf (6.14 GB, LFS)
dolphin-2.9.2-qwen2-72b.fp16.bin-00004-of-00008.gguf (18.3 GB, LFS)
dolphin-2.9.2-qwen2-72b.fp16.bin-00006-of-00008.gguf (18.8 GB, LFS)
dolphin-2.9.2-qwen2-72b.fp16.bin-00007-of-00008.gguf (19 GB, LFS)
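
The commit messages show that every file was pushed with huggingface_hub, and the listing above only surfaces some of the shards for each split quantization (for example, only part 00007 of the eight Q6_K parts). A minimal download sketch using the same library, filtering by quantization suffix, might look like the following. The repository id is taken from the page header and the filename pattern from the listing; the quantization choice and local directory name are arbitrary examples, not values prescribed by this repository.

```python
# Minimal sketch, assuming the huggingface_hub Python package (the library named
# in the commit messages above). REPO_ID comes from the page header; QUANT is any
# of the quantization suffixes visible in the file listing. The local_dir name is
# an arbitrary choice for this example.
from huggingface_hub import snapshot_download

REPO_ID = "PrunaAI/cognitivecomputations-dolphin-2.9.2-qwen2-72b-GGUF-smashed"
QUANT = "Q4_K_M"  # single file; a split level such as "Q5_K_M" would pull all matching shards

path = snapshot_download(
    repo_id=REPO_ID,
    allow_patterns=[f"*{QUANT}*"],  # fetch only the files for the chosen quantization
    local_dir="dolphin-2.9.2-qwen2-72b-GGUF",
)
print("Downloaded to:", path)
```

The Q3_K_L, Q4_1 and Q4_K_M quantizations are single files; the others are split into eight shards named *.gguf-0000N-of-00008.gguf, which looks like llama.cpp's gguf-split naming. Loaders built on llama.cpp can typically be pointed at the -00001-of-00008 shard directly, or the shards can be merged with the gguf-split tool first; the repository's README.md should state the intended procedure.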