Hugging Face
QuantFactory/pythia-12b-GGUF
Tags: Text Generation · GGUF · PyTorch · English · causal-lm · pythia · Inference Endpoints
Dataset: EleutherAI/pile
arXiv: 2304.01373, 2101.00027, 2201.07311
License: apache-2.0
pythia-12b-GGUF · 3 contributors · History: 15 commits
Latest commit: f4b2778 (verified), aashish1904, "Upload pythia-12b.Q4_K_S.gguf with huggingface_hub", 4 months ago
All files below were uploaded via huggingface_hub 4 months ago and are marked Safe; the .gguf files are stored with Git LFS.

File                        Size
.gitattributes              2.33 kB
pythia-12b.Q2_K.gguf        4.5 GB
pythia-12b.Q3_K_S.gguf      5.2 GB
pythia-12b.Q3_K_M.gguf      6.23 GB
pythia-12b.Q3_K_L.gguf      6.79 GB
pythia-12b.Q4_0.gguf        6.74 GB
pythia-12b.Q4_1.gguf        7.46 GB
pythia-12b.Q4_K_S.gguf      6.79 GB
pythia-12b.Q4_K_M.gguf      7.58 GB
pythia-12b.Q5_0.gguf        8.19 GB
pythia-12b.Q5_1.gguf        8.91 GB
pythia-12b.Q5_K_S.gguf      8.19 GB
pythia-12b.Q5_K_M.gguf      8.82 GB
pythia-12b.Q6_K.gguf        9.73 GB
pythia-12b.Q8_0.gguf        12.6 GB
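The repository offers fourteen quantization levels, so a practical first step is choosing the largest file that fits your memory budget and fetching it with huggingface_hub (the tool named in the commit messages). A minimal sketch, assuming only the file names and sizes from the listing above; `pick_quant` is a hypothetical helper, not part of any library:

```python
# Sizes in GB, copied from the repository file listing above.
QUANT_SIZES_GB = {
    "pythia-12b.Q2_K.gguf": 4.5,
    "pythia-12b.Q3_K_S.gguf": 5.2,
    "pythia-12b.Q3_K_M.gguf": 6.23,
    "pythia-12b.Q3_K_L.gguf": 6.79,
    "pythia-12b.Q4_0.gguf": 6.74,
    "pythia-12b.Q4_1.gguf": 7.46,
    "pythia-12b.Q4_K_S.gguf": 6.79,
    "pythia-12b.Q4_K_M.gguf": 7.58,
    "pythia-12b.Q5_0.gguf": 8.19,
    "pythia-12b.Q5_1.gguf": 8.91,
    "pythia-12b.Q5_K_S.gguf": 8.19,
    "pythia-12b.Q5_K_M.gguf": 8.82,
    "pythia-12b.Q6_K.gguf": 9.73,
    "pythia-12b.Q8_0.gguf": 12.6,
}

def pick_quant(budget_gb: float):
    """Return the largest quant file that fits in budget_gb, or None."""
    fitting = [(size, name) for name, size in QUANT_SIZES_GB.items()
               if size <= budget_gb]
    return max(fitting)[1] if fitting else None

filename = pick_quant(8.0)  # largest file at or under 8 GB

# Download (requires `pip install huggingface_hub` and network access):
# from huggingface_hub import hf_hub_download
# path = hf_hub_download(repo_id="QuantFactory/pythia-12b-GGUF",
#                        filename=filename)
```

Note that a larger quant generally preserves more of the original model's quality, so selecting the biggest file that fits is a reasonable default; the actual quality/size trade-off between the `Q*_0`/`Q*_1` and `Q*_K_*` variants depends on the llama.cpp quantization scheme used.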