Felladrin/gguf-pythia-1.4b-sft-full
Tags: GGUF · Inference Endpoints · conversational
License: apache-2.0

gguf-pythia-1.4b-sft-full · 1 contributor · History: 7 commits
Latest commit: 6fe9eab (verified, 6 months ago) by Felladrin: "Delete Pythia-31M-Chat-v1.Q4_0.gguf"
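
The upload commit ("Upload folder using huggingface_hub") indicates the files were pushed with the huggingface_hub library. As a minimal sketch, assuming the huggingface_hub Python package is installed and picking one quant from the table below as an example, a single file can be fetched into the local cache like this:

```python
from huggingface_hub import hf_hub_download

# Download one quantized file from this repository into the local HF cache.
# repo_id and filename are taken from the file listing below.
model_path = hf_hub_download(
    repo_id="Felladrin/gguf-pythia-1.4b-sft-full",
    filename="pythia-1.4b-sft-full.Q4_K_M.gguf",
)
print(model_path)  # local path to the downloaded .gguf file
```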

All files are marked Safe and were added by the commit "Upload folder using huggingface_hub", 6 months ago. Every file except .gitattributes is stored via LFS.

| File | Size | LFS |
| --- | --- | --- |
| .gitattributes | 2.99 kB | |
| Pythia-31M-Chat-v1.Q4_K_M.gguf | 22.6 MB | LFS |
| Pythia-31M-Chat-v1.Q4_K_S.gguf | 22.3 MB | LFS |
| Pythia-31M-Chat-v1.Q5_K_M.gguf | 24.7 MB | LFS |
| Pythia-31M-Chat-v1.Q5_K_S.gguf | 24.5 MB | LFS |
| Pythia-31M-Chat-v1.Q6_K.gguf | 26.9 MB | LFS |
| Pythia-31M-Chat-v1.Q8_0.gguf | 34.2 MB | LFS |
| pythia-1.4b-sft-full.F16.gguf | 2.83 GB | LFS |
| pythia-1.4b-sft-full.Q2_K.gguf | 570 MB | LFS |
| pythia-1.4b-sft-full.Q3_K_M.gguf | 761 MB | LFS |
| pythia-1.4b-sft-full.Q3_K_S.gguf | 652 MB | LFS |
| pythia-1.4b-sft-full.Q4_0.gguf | 826 MB | LFS |
| pythia-1.4b-sft-full.Q4_K_M.gguf | 916 MB | LFS |
| pythia-1.4b-sft-full.Q4_K_S.gguf | 833 MB | LFS |
| pythia-1.4b-sft-full.Q5_K_M.gguf | 1.06 GB | LFS |
| pythia-1.4b-sft-full.Q5_K_S.gguf | 990 MB | LFS |
| pythia-1.4b-sft-full.Q6_K.gguf | 1.16 GB | LFS |
| pythia-1.4b-sft-full.Q8_0.gguf | 1.51 GB | LFS |
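
Since the repository is tagged GGUF and conversational, one common way to run these quants locally is through a llama.cpp binding. The sketch below is an assumption rather than anything documented in this repository: it uses the llama-cpp-python package and assumes the GGUF metadata carries a chat template (if it does not, a chat_format argument would need to be passed to Llama explicitly).

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one of the quants listed above, then load it with llama.cpp.
model_path = hf_hub_download(
    repo_id="Felladrin/gguf-pythia-1.4b-sft-full",
    filename="pythia-1.4b-sft-full.Q4_K_M.gguf",
)
llm = Llama(model_path=model_path, n_ctx=2048)

# OpenAI-style chat call; relies on a chat template in the GGUF metadata.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello! What can you do?"}],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```

Larger quants (Q6_K, Q8_0, F16) trade more memory for less quantization loss, while the smaller K-quants are typically the better fit for CPU-only or low-RAM setups.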