# dranger003/bagel-dpo-34b-v0.5-iMat.GGUF
Text Generation · GGUF · Inference Endpoints · License: apache-2.0
2 contributors · History: 10 commits. Latest commit 28cc820 (verified, 7 months ago) by dranger003: "Upload ggml-bagel-dpo-34b-v0.5-iq3_xs.gguf with huggingface_hub".
| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 2.02 kB | Upload ggml-bagel-dpo-34b-v0.5-iq3_xs.gguf with huggingface_hub | 7 months ago |
| README.md | 843 Bytes | Update README.md | 7 months ago |
| ggml-bagel-dpo-34b-v0.5-f16-imatrix.dat | 15.3 MB (LFS) | Upload ggml-bagel-dpo-34b-v0.5-f16-imatrix.dat with huggingface_hub | 7 months ago |
| ggml-bagel-dpo-34b-v0.5-iq2_xs.gguf | 10.3 GB (LFS) | Upload ggml-bagel-dpo-34b-v0.5-iq2_xs.gguf with huggingface_hub | 7 months ago |
| ggml-bagel-dpo-34b-v0.5-iq3_xs.gguf | 14.2 GB (LFS) | Upload ggml-bagel-dpo-34b-v0.5-iq3_xs.gguf with huggingface_hub | 7 months ago |
| ggml-bagel-dpo-34b-v0.5-iq4_xs.gguf | 18.5 GB (LFS) | Upload ggml-bagel-dpo-34b-v0.5-iq4_xs.gguf with huggingface_hub | 7 months ago |
| ggml-bagel-dpo-34b-v0.5-q5_k.gguf | 24.3 GB (LFS) | Upload ggml-bagel-dpo-34b-v0.5-q5_k.gguf with huggingface_hub | 7 months ago |
| ggml-bagel-dpo-34b-v0.5-q6_k.gguf | 28.2 GB (LFS) | Upload ggml-bagel-dpo-34b-v0.5-q6_k.gguf with huggingface_hub | 7 months ago |
| ggml-bagel-dpo-34b-v0.5-q8_0.gguf | 36.5 GB (LFS) | Upload ggml-bagel-dpo-34b-v0.5-q8_0.gguf with huggingface_hub | 7 months ago |
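The repository holds several GGUF quantizations of the model (IQ2_XS through Q8_0) plus the f16 importance-matrix (.dat) file, all uploaded with huggingface_hub. A minimal sketch of fetching one of the listed quants programmatically, assuming the `huggingface_hub` Python package is installed and that the IQ4_XS file is the one you want (any filename from the table above works):

```python
from huggingface_hub import hf_hub_download

# Download one GGUF quant from this repository into the local HF cache
# and get back its local path. The repo_id and filename below come
# straight from the file list above; swap the filename for another
# quant (e.g. the q8_0 file) if you have the VRAM/RAM for it.
model_path = hf_hub_download(
    repo_id="dranger003/bagel-dpo-34b-v0.5-iMat.GGUF",
    filename="ggml-bagel-dpo-34b-v0.5-iq4_xs.gguf",
)

# Pass this path to any GGUF-compatible runtime, e.g. llama.cpp (-m <path>).
print(model_path)
```

Note that the imatrix .dat file is only needed if you want to re-quantize the original f16 weights yourself; for inference, downloading a single GGUF file is sufficient.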