Quantized version?

#2
by farrael004 - opened

Do you plan on quantizing this model yourself? I'd be very curious to test it once the oobabooga web UI implements the MPT model loader in its GPTQ_loader module.

Nothing supports quantizing MPT models yet, mate. But I can give it a try once things get implemented.

I think this can be done now?
