
A request for quantization.

#1
by Kotokin - opened

Hi, could you please quantize the following model to IQ2_XS? Miquliz 120B v2.0

Sorry, I'd rather not touch anything with miqu in it.

Artefact2 changed discussion status to closed
