Error while loading the model using safetensors

#22 opened by tasheer10

I am getting the following error while executing this code:
pipe = pipeline("text-generation", model="HuggingFaceH4/starchat-beta", torch_dtype=torch.bfloat16, device_map="auto")
Error:
result[k] = f.get_tensor(k)
RuntimeError: Viewing a tensor as a new dtype with a different number of bytes per element is not supported.
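For completeness, here is a minimal self-contained sketch of what I am running. The imports, the placeholder prompt, and the max_new_tokens value are additions for reproduction and were not in the snippet above; the error appears to be raised inside the pipeline(...) call itself, while the safetensors shards are being read (f.get_tensor is the safetensors read path), so execution never reaches the generation step.

```python
# Minimal repro sketch (prompt and generation arguments are placeholders).
import torch
from transformers import pipeline

# The RuntimeError is raised here, while the checkpoint is being loaded.
pipe = pipeline(
    "text-generation",
    model="HuggingFaceH4/starchat-beta",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Never reached because loading fails first.
outputs = pipe("def fibonacci(n):", max_new_tokens=64)
print(outputs[0]["generated_text"])
```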

