Breeze-7B-Instruct-v1_0-GGUF
Original Model
MediaTek-Research/Breeze-7B-Instruct-v1_0
Run with GaiaNet
Prompt template
prompt template: mediatek-breeze
Context size
chat_ctx_size: 8000
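The two values above are the node settings this model expects. As a minimal sketch (assuming a GaiaNet node is already installed, that the flag names match the current `gaianet` CLI, and with an illustrative GGUF file name), they could be applied like this:

```bash
# Point the node's chat model at this repo and apply the settings above.
# The flag names and the exact .gguf file name are assumptions; check the
# customize guide linked below for the current CLI options.
gaianet config \
  --chat-url https://huggingface.co/gaianet/Breeze-7B-Instruct-v1_0-GGUF/resolve/main/Breeze-7B-Instruct-v1_0-Q5_K_M.gguf \
  --prompt-template mediatek-breeze \
  --chat-ctx-size 8000
```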
Run with GaiaNet
Quick start: https://docs.gaianet.ai/node-guide/quick-start
Customize your node: https://docs.gaianet.ai/node-guide/customize
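As a rough end-to-end flow, the quick-start guide boils down to installing the node software, initializing it with the model configuration, and starting it. The install URL and commands below are a sketch based on that guide; defer to the linked docs if they differ:

```bash
# Install the GaiaNet node software (see the quick-start guide for the
# current install script), then initialize and start the node.
curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash
gaianet init    # downloads the configured model files
gaianet start   # serves the chat model locally
```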
Quantized with llama.cpp b3613
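For reference, a GGUF like this is typically produced by converting the original Hugging Face checkpoint to GGUF and then quantizing it with the llama.cpp tools. The sketch below uses the tool names shipped around release b3613; the paths and the Q5_K_M quantization type are illustrative, not a record of how these files were made:

```bash
# Convert the original checkpoint to a 16-bit GGUF, then quantize it.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp && git checkout b3613 && make -j
python convert_hf_to_gguf.py /path/to/Breeze-7B-Instruct-v1_0 \
  --outfile Breeze-7B-Instruct-v1_0-f16.gguf --outtype f16
./llama-quantize Breeze-7B-Instruct-v1_0-f16.gguf \
  Breeze-7B-Instruct-v1_0-Q5_K_M.gguf Q5_K_M
```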