hermes42 / Mixtral-8x22B-v0.1-GGUF
11 likes
Tags: GGUF · 5 languages · Mixture of Experts · Inference Endpoints
License: apache-2.0
Mixtral-8x22B-v0.1-GGUF at commit 56b3075
1 contributor · History: 22 commits
Latest commit: hermes42 · "Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-e with huggingface_hub" · 56b3075 (verified) · 7 months ago
File                                       Size      LFS   Updated        Last commit
.gitattributes                             3 kB            7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-e with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-a      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-a with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-b      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-b with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-c      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-c with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-d      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-d with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-e      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-e with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-a      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-a with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-b      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-b with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-c      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-c with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-d      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-d with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-e      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-e with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-f      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-f with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-g      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-g with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-h      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-h with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-i      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-i with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-j      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-j with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-k      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-k with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-l      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-l with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-m      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-m with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-n      5.37 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-n with huggingface_hub
Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-o      5.32 GB   LFS   7 months ago   Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-o with huggingface_hub
README.md                                  3.62 kB         7 months ago   Update README.md
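
Each quantization is uploaded as multiple GGUF parts (Q4_K_M in five pieces of about 5.37 GB, Q4_K_S in fifteen), presumably to stay within per-file upload limits, so the parts need to be downloaded and reassembled before use. Below is a minimal sketch, assuming the parts are plain byte-level splits that concatenate in alphabetical order (if they were instead produced with llama.cpp's gguf-split tool, its merge mode should be used rather than raw concatenation); it fetches only the Q4_K_M parts with huggingface_hub and stitches them into one file:

# Sketch: download the Q4_K_M parts and reassemble them into a single GGUF.
# Assumption: parts are simple byte-level splits that concatenate in
# alphabetical order (part-a, part-b, ...).
from pathlib import Path
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="hermes42/Mixtral-8x22B-v0.1-GGUF",
    allow_patterns=["Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-*"],
)

parts = sorted(Path(local_dir).glob("Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-*"))
out_path = Path("Mixtral-8x22B-v0.1.Q4_K_M.gguf")

with out_path.open("wb") as out:
    for part in parts:
        with part.open("rb") as f:
            # Copy in chunks to avoid loading ~5 GB parts into memory at once.
            while chunk := f.read(64 * 1024 * 1024):
                out.write(chunk)

print(f"Wrote {out_path} ({out_path.stat().st_size / 1e9:.2f} GB)")

The reassembled Mixtral-8x22B-v0.1.Q4_K_M.gguf (the five listed parts total roughly 27 GB) can then be loaded by any llama.cpp-compatible runtime.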