phi-3-mini-llamafile-nonAVX
llamafile lets you distribute and run LLMs with a single file. See the announcement blog post for details.
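Since a llamafile is a self-contained executable, running it typically takes only a couple of commands. The sketch below assumes the file has been downloaded from this repository and is named `phi-3-mini.llamafile` (the exact filename may differ):

```shell
# Make the downloaded llamafile executable (Linux/macOS).
chmod +x phi-3-mini.llamafile

# Run it; by default a llamafile starts a local chat server,
# and --help lists the available llama.cpp-style options.
./phi-3-mini.llamafile --help
```

On Windows the file can be renamed with a `.exe` extension and run directly. This build targets CPUs without AVX support, per the repository name.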
Downloads
This repository was created using the llamafile-builder.
- Downloads last month: 3
Inference Providers
This model is not currently available via any of the supported Inference Providers, and it cannot be deployed to the HF Inference API because it has no library tag.
Model tree for blueprintninja/phi-3-mini-llamafile-nonAVX
- Base model: QuantFactory/Phi-3-mini-4k-instruct-GGUF