Add missing quant_config.json for compatibility with vLLM backends out of the box.
aa2a3bf verified
{
"zero_point": true,
"q_group_size": 128,
"w_bit": 4,
"version": "GEMM"
}
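The fields above follow the usual AWQ convention: `zero_point: true` selects asymmetric quantization (a zero-point offset is stored alongside each scale), `q_group_size: 128` means one scale/zero-point pair is shared per group of 128 weights, `w_bit: 4` stores weights in 4 bits, and `version: "GEMM"` picks the GEMM kernel variant at inference time. A minimal sketch of reading and sanity-checking this config (the interpretation comments reflect standard AWQ semantics, not anything stated in the file itself):

```python
import json

# The quant_config.json contents, verbatim from this file.
QUANT_CONFIG = """
{
  "zero_point": true,
  "q_group_size": 128,
  "w_bit": 4,
  "version": "GEMM"
}
"""

config = json.loads(QUANT_CONFIG)

# zero_point=True  -> asymmetric quantization (per-group zero-point offset)
# q_group_size=128 -> one scale/zero-point pair per 128 weights
# w_bit=4          -> weights packed into 4 bits each
# version="GEMM"   -> AWQ GEMM kernel variant for inference
assert config["zero_point"] is True
assert config["q_group_size"] == 128
assert config["w_bit"] == 4
print(config["version"])  # GEMM
```

With this file present in the model repo, vLLM can detect the AWQ quantization settings without the user passing them manually.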