*** OSError: meta-llama/Meta-Llama-3-8B-Instruct does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

#131 · opened by akjagadish

I'm trying to use the Meta-Llama-3-8B-Instruct model. Loading it with `self.model = AutoModel.from_pretrained(engine, device_map="auto", torch_dtype=torch.bfloat16, use_auth_token=hf_key)` fails with:

OSError: meta-llama/Meta-Llama-3-8B-Instruct does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

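For context, here is a minimal, runnable sketch of the call in question. `engine` and `hf_key` are placeholders taken from the snippet above: the gated model id and a Hugging Face access token.

```python
import torch
from transformers import AutoModel

engine = "meta-llama/Meta-Llama-3-8B-Instruct"
hf_key = "hf_..."  # access token with permission for the gated repo

# This is the call that raises the OSError in the environment described above.
model = AutoModel.from_pretrained(
    engine,
    device_map="auto",          # requires the `accelerate` package
    torch_dtype=torch.bfloat16,
    use_auth_token=hf_key,      # deprecated in newer transformers; `token=` is the replacement
)
```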

I had the same error message with Meta-Llama-3-8B-Instruct. After I downgraded to torch==2.1.0 and transformers==4.40.0, it worked for me.
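For reference, a sketch of that working setup. The version pins are taken from the comment above; the repo is gated, so an access token is still needed, and `AutoModelForCausalLM` (the usual class for generation with Llama models) is used here in place of `AutoModel`.

```python
# Pins reported to work in the comment above:
#   pip install torch==2.1.0 transformers==4.40.0 accelerate
# (accelerate is needed for device_map="auto")
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
hf_key = "hf_..."  # access token with permission for the gated repo

tokenizer = AutoTokenizer.from_pretrained(model_id, token=hf_key)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.bfloat16,
    token=hf_key,
)
```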
