Dracones/perky-70b-v0.1-GGUF
Tags: GGUF · English · Not-For-All-Audiences · Inference Endpoints
Paper: arxiv:2203.05482
License: llama2
Branch: main · 1 contributor · History: 7 commits

Latest commit: Dracones — "Upload perky-70b-v0.1-Q8_0.gguf-part-b with huggingface_hub" (f166973, verified, 9 months ago)
All files are marked "Safe" by Hugging Face's file scan; files flagged LFS are stored via Git LFS.

| File | Size | LFS | Last commit message | Last updated |
|------|------|-----|---------------------|--------------|
| .gitattributes | 2.59 kB | | Upload perky-70b-v0.1-Q8_0.gguf-part-b with huggingface_hub | 9 months ago |
| Perky.card.png | 1.72 MB | LFS | Upload folder using huggingface_hub | 9 months ago |
| README.md | 6.68 kB | | Upload README.md with huggingface_hub | 9 months ago |
| perky-70b-v0.1-IQ3_XXS.gguf | 28.2 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q2_K.gguf | 25.5 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q3_K_L.gguf | 36.1 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q3_K_M.gguf | 33.3 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q3_K_S.gguf | 29.9 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q3_K_XS.gguf | 28.3 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q4_0.gguf | 38.9 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q4_K_M.gguf | 41.4 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q4_K_S.gguf | 39.2 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q5_0.gguf | 47.5 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q5_K_M.gguf | 48.8 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q5_K_S.gguf | 47.5 GB | LFS | Upload folder using huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q6_K.gguf-part-a | 32.2 GB | LFS | Upload perky-70b-v0.1-Q6_K.gguf-part-a with huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q6_K.gguf-part-b | 24.4 GB | LFS | Upload perky-70b-v0.1-Q6_K.gguf-part-b with huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q8_0.gguf-part-a | 42.9 GB | LFS | Upload perky-70b-v0.1-Q8_0.gguf-part-a with huggingface_hub | 9 months ago |
| perky-70b-v0.1-Q8_0.gguf-part-b | 30.3 GB | LFS | Upload perky-70b-v0.1-Q8_0.gguf-part-b with huggingface_hub | 9 months ago |
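The Q6_K and Q8_0 quants are uploaded as `-part-a`/`-part-b` files, presumably to stay under Hugging Face's per-file size limit, so they need to be rejoined before loading. A minimal sketch, assuming the parts are plain byte-splits (the usual convention for files named `.gguf-part-*`, as opposed to llama.cpp's `gguf-split` shards, which load directly):

```shell
# Reassemble the split Q6_K quant into a single GGUF file.
# Assumption: the -part-a/-part-b files are plain byte-splits
# (e.g. produced with `split`), so concatenating them in part
# order restores the original file.
base=perky-70b-v0.1-Q6_K.gguf
if [ -f "${base}-part-a" ] && [ -f "${base}-part-b" ]; then
    cat "${base}-part-a" "${base}-part-b" > "$base"
fi
```

After joining, point your GGUF loader at the reassembled `perky-70b-v0.1-Q6_K.gguf`; the same pattern applies to the two Q8_0 parts.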