
Are GPTQ versions of the MPT models possible?

#1
by latent-variable - opened

Just out of curiosity, is there any limitation on having GPTQ versions of the MPT models? I could not find a GPTQ version of MPT-7B, so I am concerned there are limitations. Or are they on the way?

Not at the moment. At least I don't know how.

There was a PR put in to AutoGPTQ to implement it, but it hit a technical issue with the base MPT repo and, to my knowledge, it has not yet been resolved.

There was one MPT GPTQ put out by Occam of the Kobold team, but I don't know how he did it.

I will investigate further.
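For readers unfamiliar with what a GPTQ conversion produces, here is a minimal NumPy sketch of the storage format: weights packed into low-bit integers with per-group scales and zero-points. This is only the round-to-nearest part; GPTQ proper also applies a column-by-column error correction using second-order information, and the function name and group size here are illustrative, not from any real repo.

```python
import numpy as np

def quantize_groupwise(w, bits=4, group_size=8):
    """Round-to-nearest low-bit quantization with per-group scale/zero.

    Sketch of the tensor layout a GPTQ checkpoint stores; GPTQ itself
    additionally corrects rounding error using Hessian information.
    """
    qmax = 2 ** bits - 1                       # 15 for 4-bit
    groups = w.reshape(-1, group_size)
    scale = (groups.max(axis=1) - groups.min(axis=1)) / qmax
    scale = np.where(scale == 0, 1.0, scale)   # avoid division by zero
    zero = groups.min(axis=1)
    q = np.clip(np.round((groups - zero[:, None]) / scale[:, None]), 0, qmax)
    dequant = q * scale[:, None] + zero[:, None]
    return q.astype(np.uint8), scale, zero, dequant.reshape(-1)

rng = np.random.default_rng(0)
w = rng.normal(size=16).astype(np.float32)
q, scale, zero, w_hat = quantize_groupwise(w)
# Per-element reconstruction error is bounded by half a quantization step.
print(np.abs(w - w_hat).max())
```

The per-group scale is why group size (commonly 128 in released GPTQ models) trades accuracy against storage overhead.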
