Model does not work on ROCm

#1
by Nicopara - opened

ERROR:The model could not be loaded because its type could not be inferred from its name.
ERROR:Please specify the type manually using the --model_type argument.

Hi @Nicopara, this model is quantized with AutoGPTQ, and for now the only way to load it is through that library. I will add a README with loading instructions in the next couple of days.
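In the meantime, here is a minimal sketch of loading a GPTQ checkpoint through AutoGPTQ. The repo id `<quantized-model-repo>` is a placeholder (the thread does not name the model's Hugging Face path), and flags such as `use_safetensors` and `trust_remote_code` are assumptions that depend on how the checkpoint was exported; on a ROCm build of PyTorch, the `"cuda:0"` device string maps to the HIP backend.

```python
# Minimal AutoGPTQ loading sketch (repo id below is a placeholder, not the real path).
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_id = "<quantized-model-repo>"  # placeholder: replace with this model's HF repo

tokenizer = AutoTokenizer.from_pretrained(model_id)

# from_quantized reads the quantization config shipped with the checkpoint.
# use_safetensors must match how the weights were saved; trust_remote_code
# may be needed if the model ships custom modeling code (both assumptions).
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",        # on a ROCm PyTorch build this maps to HIP
    use_safetensors=True,
    trust_remote_code=True,
)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```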

Thank you. I tried loading this model and the other one from the author mayank31398 in ooba, and both attempts failed. Using the provided GPTQ-for-SantaCoder ends in:
Loading checkpoint shards: 14%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‹ | 1/7 [00:05<00:32, 5.45s/it]Killed
So no luck with that either.
