Error loading model with HQQModelForCausalLM: Model architecture Phi3ForCausalLM not supported yet.

#3
by dkjsnnr1

Hey, the sample code does not work properly. I tried to figure out the issue:

```python
from hqq.engine.hf import HQQModelForCausalLM
from hqq.models.hf.base import AutoHQQHFModel

try:
    model = HQQModelForCausalLM.from_quantized("PrunaAI/microsoft-Phi-3-mini-128k-instruct-HQQ-1bit-smashed")
except Exception as e:
    print("Error loading model with HQQModelForCausalLM:", e)

try:
    model = AutoHQQHFModel.from_quantized("PrunaAI/microsoft-Phi-3-mini-128k-instruct-HQQ-1bit-smashed")
except Exception as e:
    print("Error loading model with AutoHQQHFModel:", e)
```

Output:

```
Error loading model with HQQModelForCausalLM: Model architecture Phi3ForCausalLM not supported yet.
Error loading model with AutoHQQHFModel: BaseHQQHFModel.create_model() missing 1 required positional argument: 'kwargs'
```

Both errors persist even after installing transformers-4.41.0.dev0.
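One thing that might be worth trying is registering the Phi-3 architecture with the engine's handler registry before loading. This is only a sketch, not a documented API: `_HQQ_REGISTRY` and the use of `AutoHQQHFModel` as a generic fallback handler are assumptions about hqq internals and may not hold for every hqq version (and the `create_model()` error above suggests a possible hqq version mismatch that this would not fix).

```python
# Hedged sketch: assumes hqq's HF engine keeps a class-level registry
# mapping architecture names to HQQ handler classes, and that the generic
# AutoHQQHFModel can stand in for an architecture-specific handler.
# Both are assumptions about hqq internals, not a documented API.
from hqq.engine.hf import HQQModelForCausalLM
from hqq.models.hf.base import AutoHQQHFModel

arch = "Phi3ForCausalLM"
registry = getattr(HQQModelForCausalLM, "_HQQ_REGISTRY", None)
if registry is not None and arch not in registry:
    # Treat Phi-3 like any other Hugging Face causal LM.
    registry[arch] = AutoHQQHFModel

model = HQQModelForCausalLM.from_quantized(
    "PrunaAI/microsoft-Phi-3-mini-128k-instruct-HQQ-1bit-smashed"
)
```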
