Meta-Llama-3-8B-Instruct does not appear to have a file named config.json

#82
by RK-RK - opened

Downloaded the model using

huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct --include "original/*" --local-dir Meta-Llama-3-8B-Instruct

and am getting the below error:

Meta-Llama-3-8B-Instruct does not appear to have a file named config.json

What am I missing?

Same problem.
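If I'm reading the repo layout right, the --include "original/*" filter in that command only fetches Meta's raw checkpoint (consolidated.00.pth, params.json, tokenizer.model) into an original/ subfolder, while the config.json and *.safetensors files that transformers needs sit at the repo root and get skipped. A quick sketch to check what actually landed on disk (the local_dir name just mirrors the command above):

import os

local_dir = "Meta-Llama-3-8B-Instruct"  # same --local-dir as in the download command

# List everything that was downloaded and check for the file transformers wants.
for root, _, files in os.walk(local_dir):
    for name in files:
        print(os.path.join(root, name))

print("config.json present:", os.path.isfile(os.path.join(local_dir, "config.json")))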

Hi! I faced the same issue: after downloading the model I got the same error. I suggest looking carefully at the file name. For some reason the file appeared on my machine as config(1).json, and that "(1)" caused the problem. I deleted the "(1)" and that solved it (pay attention not to leave any space between "config" and the "."). Let me know...
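For anyone hitting the same duplicated-download naming, a minimal rename sketch (the directory and file names here are just examples, adjust them to your machine):

import os

model_dir = "Meta-Llama-3-8B-Instruct"                 # example path
bad_name = os.path.join(model_dir, "config(1).json")   # name the browser gave the duplicate
good_name = os.path.join(model_dir, "config.json")     # name transformers looks for

if os.path.exists(bad_name) and not os.path.exists(good_name):
    os.rename(bad_name, good_name)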


import transformers
import torch

model_path = "./" # replace with the actual path to the model directory
model_id = "Meta-Llama-3-8B-Instruct-Q4_K_M"

# Load the model from the local path
model = transformers.AutoModelForCausalLM.from_pretrained(model_path)

# Create the pipeline
pipeline = transformers.pipeline(
    "text-generation", model=model, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto"
)

# Test the pipeline
output = pipeline("hi")
print(output)

It gives this error after loading the config:

OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory ./.

I downloaded this model:

https://huggingface.co/lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/resolve/main/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf
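A standalone .gguf file has no config.json or pytorch_model.bin next to it, which is exactly what that OSError is complaining about. If I remember correctly, newer transformers releases (roughly 4.41+, with the gguf package installed) can read a GGUF directly through a gguf_file argument; a rough sketch, not tested here:

import transformers

gguf_repo = "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF"
gguf_file = "Meta-Llama-3-8B-Instruct-Q4_K_M.gguf"

# Both calls read the single GGUF file and rebuild a regular transformers
# tokenizer and model from it (weights are dequantized in memory).
tokenizer = transformers.AutoTokenizer.from_pretrained(gguf_repo, gguf_file=gguf_file)
model = transformers.AutoModelForCausalLM.from_pretrained(gguf_repo, gguf_file=gguf_file)

pipe = transformers.pipeline("text-generation", model=model, tokenizer=tokenizer)
print(pipe("hi", max_new_tokens=20))

Otherwise, llama.cpp based tools (llama-cpp-python, LM Studio, Ollama) are the usual way to run a GGUF.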

import transformers
import torch

model_id = "meta-llama/Meta-Llama-3-8B"

pipeline = transformers.pipeline(
    "text-generation", model=model_id, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto"
)

pipeline("hi")

Why does it crash without giving a response?
I'm running it on Colab.
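An 8B model in bfloat16 needs roughly 16 GB for the weights alone, so on a free Colab T4 the runtime most likely crashes from running out of memory rather than from a code error. A sketch of loading it 4-bit quantized instead (assumes bitsandbytes and accelerate are installed and that you have access to the gated repo):

import transformers
import torch

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"

# Quantize the weights to 4-bit on load so the model fits in roughly 5-6 GB of VRAM.
bnb_config = transformers.BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

pipe = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"quantization_config": bnb_config},
    device_map="auto",
)
print(pipe("hi", max_new_tokens=50))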

https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/tree/main is the right link; go there and download all the files one by one.
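Instead of clicking through the files one by one, the same thing can be scripted with huggingface_hub (assumes you are logged in and have accepted the Llama 3 license):

from huggingface_hub import snapshot_download

# Downloads the whole repo: config.json, tokenizer files and the *.safetensors shards.
snapshot_download(
    repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
    local_dir="Meta-Llama-3-8B-Instruct",
)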

Hi! I faced the same issue: after downloading the model I got the same error. I suggest looking carefully at the file name. For some reason the file appeared on my machine as config(1).json, and that "(1)" caused the problem. I deleted the "(1)" and that solved it (pay attention not to leave any space between "config" and the "."). Let me know...

Hi, I verified this is not the issue for me. Thanks.
