How to properly load the model?

#2 opened by chaltik

Following the "Use in Transformers" example, I am making the following call:

from transformers import AutoModel
model = AutoModel.from_pretrained("TheBloke/llama-2-7B-Guanaco-QLoRA-GGML")

resulting in the following error:

OSError: TheBloke/llama-2-7B-Guanaco-QLoRA-GGML does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
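
For context, my understanding is that GGML files target llama.cpp-style runtimes rather than the transformers library, which is why AutoModel cannot find any PyTorch/TF/Flax weights in the repo. A minimal sketch of what I think a working load might look like, assuming the ctransformers package and its AutoModelForCausalLM loader (the model_type value is my guess):

# minimal sketch, assuming `pip install ctransformers`
from ctransformers import AutoModelForCausalLM

# GGML files need the architecture named explicitly; "llama" is assumed here
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/llama-2-7B-Guanaco-QLoRA-GGML",
    model_type="llama",
)
print(llm("Hello, my name is"))

Is something along these lines the intended way to use this repo, or is there a supported path through transformers itself?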
