Error while loading with ExLlama and AutoGPTQ

#2
by anon7463435254 - opened

Hi @TheBloke ,

I am getting the following error when trying to load the model with either ExLlama or AutoGPTQ. There is indeed no tokenizer.model file in the repo, but how can this be solved?

OSError: Not found: "models/TheBloke_deepseek-coder-6.7B-instruct-GPTQ/tokenizer.model": No such file or directory Error #2.

Thank you.

Same here, with FastChat & ExLlamaV2.

ExLlama (v1) doesn't support models without a tokenizer.model file, but ExLlamaV2 added a patch with support for it last night. AutoGPTQ will work, since it can fall back to the fast tokenizer (tokenizer.json).
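For anyone hitting this, a quick way to see which loaders can handle a local model directory is to check which tokenizer files it contains. This is just an illustrative helper (the function name and logic are mine, not part of any library): ExLlama v1 needs the SentencePiece tokenizer.model, while loaders built on HF fast tokenizers can work from tokenizer.json alone.

```python
from pathlib import Path

def available_tokenizer_file(model_dir: str) -> str:
    """Return the tokenizer file a loader could use from this model directory.

    tokenizer.model  -> SentencePiece file; required by ExLlama v1.
    tokenizer.json   -> HF fast-tokenizer file; enough for AutoGPTQ / ExLlamaV2.
    """
    d = Path(model_dir)
    if (d / "tokenizer.model").exists():
        return "tokenizer.model"
    if (d / "tokenizer.json").exists():
        return "tokenizer.json"
    raise FileNotFoundError(f"No tokenizer file found in {model_dir}")

# Example: a repo like this one ships tokenizer.json but no tokenizer.model,
# so only the fast-tokenizer-capable loaders will succeed.
```

So if `available_tokenizer_file(...)` returns only `tokenizer.json`, ExLlama v1 will fail with the OSError above, while AutoGPTQ and patched ExLlamaV2 should load fine.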
