Failed to load model

#4
by Fazzie

I get the following error:

OSError: Unable to load weights from pytorch checkpoint file for '/mnt/bd/fazzie-data/models/CodeLlama-7b-hf/pytorch_model-00001-of-00002.bin' at '/mnt/bd/fazzie-data/models/CodeLlama-7b-hf/pytorch_model-00001-of-00002.bin'. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.

Here is the code I'm running:

from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch

# Path to the locally downloaded checkpoint
model_path = "./models/CodeLlama-7b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_path)

model = AutoModelForCausalLM.from_pretrained(model_path)

I have installed the latest transformers with:

pip install git+https://github.com/huggingface/transformers.git@refs/pull/25740/head accelerate

Code Llama org

You might have downloaded the repo when it was in an incomplete state, as we are still making changes to it. Does AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-hf") work for you now? It should download the safetensors files we recently uploaded.
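For anyone hitting the same error, here is a minimal sketch of loading directly from the Hub (assuming the default Hugging Face cache location; the repo ID is the one mentioned above):

from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "codellama/CodeLlama-7b-hf"

# Downloads the config, tokenizer, and safetensors weights from the Hub
# and caches them locally (by default under ~/.cache/huggingface).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

If you still need a copy under ./models, re-fetching the repo (for example with huggingface_hub.snapshot_download) should replace any incomplete pytorch_model-*.bin shards.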

osanseviero changed discussion status to closed
