# phi-3-mini-llamafile-nonAVX
llamafile lets you distribute and run LLMs with a single file. See the llamafile announcement blog post for background.
## Downloads
This repository was created using llamafile-builder. A usage sketch follows below.
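As a minimal sketch of typical use: after downloading a .llamafile from this repository and making it executable, running it starts a local llama.cpp-based server that exposes an OpenAI-compatible chat completions endpoint. The Python example below assumes the server is already running on its default address of http://localhost:8080; the model name and prompt are illustrative placeholders.

```python
import json
import urllib.request

# Assumed default llamafile server address; adjust if the server
# listens on a different host or port.
API_URL = "http://localhost:8080/v1/chat/completions"

payload = {
    # Placeholder name; the llamafile serves the model embedded in it.
    "model": "phi-3-mini",
    "messages": [
        {"role": "user", "content": "Summarize what a llamafile is in one sentence."}
    ],
    "temperature": 0.7,
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    body = json.load(response)

# The OpenAI-compatible response puts the generated text under choices[0].
print(body["choices"][0]["message"]["content"])
```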
## Base model

QuantFactory/Phi-3-mini-4k-instruct-GGUF