PrunaAI / phi-2-GGUF-smashed (branch: refs/pr/2)
Tags: GGUF · pruna-ai
2 contributors · History: 23 commits

Latest commit 73837bc (verified) by johnrachwanpruna, 7 months ago:
80492429c386502271fb584481a488f5bd54b6bcc5b5cc6f9c0d622387da66f7
| File | Size | LFS | Last commit message | Last updated |
|---|---|---|---|---|
| .gitattributes | 2.53 kB | | 5913a50abeac41303ed55cf9337ddbea8759d5d8c1d370c6a5e52b0b7a622660 | 7 months ago |
| README.md | 4.4 kB | | Create README.md | 7 months ago |
| phi-2.IQ3_M.gguf | 1.32 GB | LFS | 830cf9cc3c01b3770f9f7dbe5060290777594eec0970e74c86c005d785c34ca3 | 7 months ago |
| phi-2.IQ3_S.gguf | 1.25 GB | LFS | 3bdd7d6d96e1ca6b0eb7942309c13be28e3fdd94d62d9a606912857cdc9f8699 | 7 months ago |
| phi-2.IQ3_XS.gguf | 1.2 GB | LFS | 4ce169ef1e8816bc28a654dbae6bbf06ff85a5ba96167d7aae7d02760ea78654 | 7 months ago |
| phi-2.IQ4_NL.gguf | 1.62 GB | LFS | 847daaf4a07d1e65d04fcd2cf83167c5e107383b2d5aab815eb068984a711fc6 | 7 months ago |
| phi-2.IQ4_XS.gguf | 1.54 GB | LFS | 5913a50abeac41303ed55cf9337ddbea8759d5d8c1d370c6a5e52b0b7a622660 | 7 months ago |
| phi-2.Q2_K.gguf | 1.11 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q3_K_L.gguf | 1.58 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q3_K_M.gguf | 1.43 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q3_K_S.gguf | 1.25 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q4_0.gguf | 1.6 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q4_1.gguf | 1.77 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q4_K_M.gguf | 1.74 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q4_K_S.gguf | 1.62 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q5_0.gguf | 1.93 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q5_1.gguf | 2.1 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q5_K_M.gguf | 2 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q5_K_S.gguf | 1.93 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q6_K.gguf | 2.29 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.Q8_0.gguf | 2.96 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
| phi-2.fp16.bin | 5.56 GB | LFS | Upload folder using huggingface_hub (#1) | 7 months ago |
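The listing above offers the same model at many quantization levels, trading file size (and memory footprint) for quality. As a minimal sketch of how one might choose among them: the snippet below hard-codes a few of the sizes from this listing, picks the largest file that fits a given disk/RAM budget, and shows (commented out) how that file could be fetched with `huggingface_hub.hf_hub_download`. The size table and the budget-picking helper are illustrative assumptions, not part of the repository itself.

```python
from typing import Optional

# Approximate on-disk sizes in GB, copied from the file listing above.
# (Subset only; the repo ships more quantization variants.)
QUANT_SIZES_GB = {
    "phi-2.Q2_K.gguf": 1.11,
    "phi-2.Q3_K_M.gguf": 1.43,
    "phi-2.Q4_K_M.gguf": 1.74,
    "phi-2.Q5_K_M.gguf": 2.0,
    "phi-2.Q6_K.gguf": 2.29,
    "phi-2.Q8_0.gguf": 2.96,
}

def pick_quant(budget_gb: float) -> Optional[str]:
    """Return the largest (roughly highest-quality) file within budget,
    or None if even the smallest quantization does not fit."""
    fitting = {name: gb for name, gb in QUANT_SIZES_GB.items() if gb <= budget_gb}
    if not fitting:
        return None
    return max(fitting, key=fitting.get)

if __name__ == "__main__":
    filename = pick_quant(budget_gb=2.0)
    print(filename)
    # To actually download (requires `pip install huggingface_hub` and network):
    # from huggingface_hub import hf_hub_download
    # path = hf_hub_download(repo_id="PrunaAI/phi-2-GGUF-smashed", filename=filename)
```

The resulting `.gguf` file can then be loaded by any GGUF-capable runtime such as llama.cpp.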