Getting model not found error

#39 · opened by drathi

Hey, I am getting this error: "OSError: anon8231489123/vicuna-13b-GPTQ-4bit-128g does not appear to have a file named pytorch_model-00001-of-00003.bin. Checkout 'https://huggingface.co/anon8231489123/vicuna-13b-GPTQ-4bit-128g/main' for available files." Any pointers?

I've got the same error

It's working for me, so it must be something with your setup/folders/links. It would be better if you paste everything here.

from transformers import AutoModelForCausalLM, LlamaTokenizerFast

model_id = "anon8231489123/vicuna-13b-GPTQ-4bit-128g"
tokenizer = LlamaTokenizerFast.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

I receive the error once it hits AutoModelForCausalLM.from_pretrained(model_id): "OSError: anon8231489123/vicuna-13b-GPTQ-4bit-128g does not appear to have a file named pytorch_model-00001-of-00003.bin"

Ohh sorry, I'm using it with oobabooga and it's working. I can't help with Python :(
Maybe you can try it with oobabooga, and if it works there, that means the problem is in your code, and maybe you can then reverse engineer the oobabooga commands?
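The error most likely shows up because this repo ships GPTQ-quantized weights (a single quantized checkpoint file) rather than the standard pytorch_model-*.bin shards that a plain AutoModelForCausalLM.from_pretrained() call looks for. oobabooga loads such checkpoints through a GPTQ loader, and the auto-gptq library exposes the same kind of loading from Python. The snippet below is only a minimal sketch of that approach, not a verified recipe for this exact repo: the model_basename value is an assumption, so check the repo's "Files and versions" tab and use the actual quantized file name (without its extension).

# Minimal sketch, assuming auto-gptq is installed (pip install auto-gptq)
from auto_gptq import AutoGPTQForCausalLM
from transformers import LlamaTokenizerFast

model_id = "anon8231489123/vicuna-13b-GPTQ-4bit-128g"

tokenizer = LlamaTokenizerFast.from_pretrained(model_id)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    model_basename="vicuna-13b-4bit-128g",  # hypothetical basename; verify against the repo's file list
    use_safetensors=True,                    # assumes the repo provides a .safetensors checkpoint
    device="cuda:0",
)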
