failspy/mixtral-8x22b-v0.1-instruct-oh-GGUF

Tags: GGUF, conversational, English, Inference Endpoints
Dataset: teknium/OpenHermes-2.5
License: apache-2.0
1 contributor, 3 commits
Latest commit d582b89 (verified) by failspy, 10 months ago: "Upload Quants: Q6_K_M, Q8_0"
All files are marked "Safe"; the .gguf shards are stored via Git LFS. Every file was last updated 10 months ago.

File                                                        Size     Commit message
.gitattributes                                              2.64 kB  Upload Quants: Q6_K_M, Q8_0
mixtral-8x22b-v0.1-instruct-oh-Q4_K_M-00001-of-00002.gguf   45.1 GB  Upload quants: Q5_K_M, Q4_K_M
mixtral-8x22b-v0.1-instruct-oh-Q4_K_M-00002-of-00002.gguf   40.5 GB  Upload quants: Q5_K_M, Q4_K_M
mixtral-8x22b-v0.1-instruct-oh-Q5_K_M-00001-of-00003.gguf   44.6 GB  Upload quants: Q5_K_M, Q4_K_M
mixtral-8x22b-v0.1-instruct-oh-Q5_K_M-00002-of-00003.gguf   44.8 GB  Upload quants: Q5_K_M, Q4_K_M
mixtral-8x22b-v0.1-instruct-oh-Q5_K_M-00003-of-00003.gguf   10.6 GB  Upload quants: Q5_K_M, Q4_K_M
mixtral-8x22b-v0.1-instruct-oh-Q6_K_M-00001-of-00003.gguf   44.8 GB  Upload Quants: Q6_K_M, Q8_0
mixtral-8x22b-v0.1-instruct-oh-Q6_K_M-00002-of-00003.gguf   44.9 GB  Upload Quants: Q6_K_M, Q8_0
mixtral-8x22b-v0.1-instruct-oh-Q6_K_M-00003-of-00003.gguf   25.8 GB  Upload Quants: Q6_K_M, Q8_0
mixtral-8x22b-v0.1-instruct-oh-Q8_0-00001-of-00004.gguf     45 GB    Upload Quants: Q6_K_M, Q8_0
mixtral-8x22b-v0.1-instruct-oh-Q8_0-00002-of-00004.gguf     44.5 GB  Upload Quants: Q6_K_M, Q8_0
mixtral-8x22b-v0.1-instruct-oh-Q8_0-00003-of-00004.gguf     44.5 GB  Upload Quants: Q6_K_M, Q8_0
mixtral-8x22b-v0.1-instruct-oh-Q8_0-00004-of-00004.gguf     15.4 GB  Upload Quants: Q6_K_M, Q8_0
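Each quant level above is split across multiple .gguf shards following the `-XXXXX-of-YYYYY` naming scheme. A minimal sketch of fetching just one quant level with `huggingface_hub` (assuming `pip install huggingface_hub`; the `shard_names` helper is hypothetical, derived only from the filename pattern in the listing above):

```python
def shard_names(quant: str, shards: int) -> list[str]:
    """Build the split-GGUF filenames for one quant level, per the repo's naming scheme."""
    base = "mixtral-8x22b-v0.1-instruct-oh"
    return [f"{base}-{quant}-{i:05d}-of-{shards:05d}.gguf" for i in range(1, shards + 1)]


if __name__ == "__main__":
    # Requires huggingface_hub; downloads ~85 GB of shards for Q4_K_M.
    from huggingface_hub import snapshot_download

    snapshot_download(
        repo_id="failspy/mixtral-8x22b-v0.1-instruct-oh-GGUF",
        allow_patterns=["*-Q4_K_M-*.gguf"],  # only the Q4_K_M shards
    )
```

Recent llama.cpp builds detect split GGUF files automatically, so pointing the loader at the first shard (`...-00001-of-00002.gguf`) should be sufficient once all shards sit in the same directory.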