DFofanov78/Vikhr-7B-instruct-GGUF
Text Generation · GGUF · Dataset: zjkarina/Vikhr_instruct · Languages: Russian, English · License: apache-2.0
Branch: main · 1 contributor · History: 2 commits
Latest commit: DFofanov78 · "Upload folder using huggingface_hub" · 2e81ed9 (verified) · 7 months ago
| File | Size | Git LFS |
|---|---|---|
| .gitattributes | 2.5 kB | no |
| README.md | 1.75 kB | no |
| Vikhr-7B-instruct-Q2_K.gguf | 2.76 GB | yes |
| Vikhr-7B-instruct-Q3_K_L.gguf | 3.86 GB | yes |
| Vikhr-7B-instruct-Q3_K_M.gguf | 3.56 GB | yes |
| Vikhr-7B-instruct-Q3_K_S.gguf | 3.21 GB | yes |
| Vikhr-7B-instruct-Q3_K_XS.gguf | 3.03 GB | yes |
| Vikhr-7B-instruct-Q4_0.gguf | 4.15 GB | yes |
| Vikhr-7B-instruct-Q4_1.gguf | 4.6 GB | yes |
| Vikhr-7B-instruct-Q4_K_M.gguf | 4.41 GB | yes |
| Vikhr-7B-instruct-Q4_K_S.gguf | 4.19 GB | yes |
| Vikhr-7B-instruct-Q5_0.gguf | 5.05 GB | yes |
| Vikhr-7B-instruct-Q5_1.gguf | 5.49 GB | yes |
| Vikhr-7B-instruct-Q5_K_M.gguf | 5.18 GB | yes |
| Vikhr-7B-instruct-Q5_K_S.gguf | 5.05 GB | yes |
| Vikhr-7B-instruct-Q6_K.gguf | 6 GB | yes |
| Vikhr-7B-instruct-Q8_0.gguf | 7.77 GB | yes |

All files were added in the same commit ("Upload folder using huggingface_hub", 7 months ago).
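As a minimal sketch of how one of these quantized files could be fetched and run locally, the snippet below uses huggingface_hub to download a single GGUF file and llama-cpp-python to load it. The choice of the Q4_K_M file, the context size, and the prompt text are illustrative assumptions, not values taken from this repository; consult README.md in the repo for the intended instruction template.

```python
# Sketch: download one quantization from this repo and run it with llama-cpp-python.
# Assumptions: `huggingface_hub` and `llama-cpp-python` are installed
# (pip install huggingface_hub llama-cpp-python); Q4_K_M is picked arbitrarily
# as a mid-size option from the table above.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download only the chosen .gguf file (~4.41 GB) into the local HF cache.
model_path = hf_hub_download(
    repo_id="DFofanov78/Vikhr-7B-instruct-GGUF",
    filename="Vikhr-7B-instruct-Q4_K_M.gguf",
)

# Load the GGUF model; n_ctx=2048 is a placeholder context length.
llm = Llama(model_path=model_path, n_ctx=2048)

# The plain completion prompt below is only an example; the actual
# instruction format should come from the model card.
output = llm("Question: What is the GGUF format?\nAnswer:", max_tokens=128)
print(output["choices"][0]["text"])
```

Smaller quantizations (Q2_K, Q3_K_*) trade answer quality for lower memory use, while Q6_K and Q8_0 are closer to the original weights at the cost of disk space and RAM; swapping the `filename` argument is enough to switch between them.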