GGUF conversion of https://huggingface.co/Maykeye/TinyLLama-v0
Download
pip install huggingface-hub
From CLI:
huggingface-cli download \
  aladar/TinyLLama-v0-GGUF \
  TinyLLama-v0.Q8_0.gguf \
  --local-dir . \
  --local-dir-use-symlinks False
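The same file can also be fetched programmatically. A minimal sketch using `hf_hub_download` from the `huggingface_hub` package installed above; the repo ID and filename mirror the CLI command:

```python
from huggingface_hub import hf_hub_download

# Download the Q8_0 GGUF file into the current directory and
# return the local path to it.
model_path = hf_hub_download(
    repo_id="aladar/TinyLLama-v0-GGUF",
    filename="TinyLLama-v0.Q8_0.gguf",
    local_dir=".",
)
print(model_path)
```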