Getting InvalidHeaderDeserialization trying to load this model

#1
by neilb - opened

I'm running the sample code from the model card with the latest version of AutoAWQ, and it crashes at the load-model step:

# Load model
model = AutoAWQForCausalLM.from_quantized(model_name_or_path, fuse_layers=True,
                                          trust_remote_code=False, safetensors=True)

safetensors_rust.SafetensorError: Error while deserializing header: InvalidHeaderDeserialization
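For context, a safetensors file begins with an 8-byte little-endian integer giving the length of a JSON header; `InvalidHeaderDeserialization` means that header could not be parsed, which is exactly what happens with an empty or truncated file. A minimal sketch of a diagnostic check (the function name `check_safetensors_header` is my own, not part of the safetensors library):

```python
import json
import struct

def check_safetensors_header(path):
    """Read the 8-byte little-endian header length and try to parse the
    JSON header that follows, roughly mimicking what safetensors checks
    on load. Returns a human-readable status string."""
    with open(path, "rb") as f:
        prefix = f.read(8)
        if len(prefix) < 8:
            # An empty file hits this branch immediately.
            return f"{path}: file too short to contain a header (likely empty/corrupt)"
        (header_len,) = struct.unpack("<Q", prefix)
        header = f.read(header_len)
        try:
            json.loads(header)
        except ValueError:
            return f"{path}: header is not valid JSON"
        return f"{path}: header OK ({header_len} bytes)"
```

Running this over each `*.safetensors` file in the download directory will point out which file is the broken one.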

@TheBloke Could you verify this error?

I'll wait for the GPTQ version instead.

Sorry about that. It was caused by a bug that left an empty model.safetensors file alongside the model-x-of-y.safetensors shards.

I've removed the bad file. If you also delete model.safetensors from your local download directory, or delete the whole folder and re-download, it will work fine now.
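The local cleanup described above can be sketched like this (the helper name `remove_empty_safetensors` is my own; pass it your local download directory):

```python
import os

def remove_empty_safetensors(model_dir):
    """Delete a zero-byte model.safetensors left over from the bad upload,
    keeping the valid model-x-of-y.safetensors shards untouched."""
    bad = os.path.join(model_dir, "model.safetensors")
    if os.path.isfile(bad) and os.path.getsize(bad) == 0:
        os.remove(bad)
        return True  # removed the stray empty file
    return False  # nothing to clean up
```

Only a zero-byte file is removed, so a legitimate single-file model.safetensors would be left alone.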
