Error no file named...

#8
by solotrek - opened

Hi, fresh Windows install of oobabooga today (Nvidia option). Downloading TheBloke/Wizard-Vicuna-13B-Uncensored-GPTQ seemed to go fine. I didn't get to the point where I define the wbits and model_type; I just got the errors below. I tried terminating the cmd window and opening a new one, but when using start_windows.bat to get things going, loading doesn't work.
It looks like a config file is missing? But it's not the config yml from ooba.
Any ideas?

"INFO:Loading TheBloke_Wizard-Vicuna-13B-Uncensored-GPTQ...
Traceback (most recent call last):
File "C:\Users\user\Dev\AI\oobabooga_windows\text-generation-webui\server.py", line 1102, in
shared.model, shared.tokenizer = load_model(shared.model_name)
File "C:\Users\user\Dev\AI\oobabooga_windows\text-generation-webui\modules\models.py", line 97, in load_model
output = load_func(model_name)
File "C:\Users\user\Dev\AI\oobabooga_windows\text-generation-webui\modules\models.py", line 155, in huggingface_loader
model = LoaderClass.from_pretrained(Path(f"{shared.args.model_dir}/{model_name}"), low_cpu_mem_usage=True, torch_dtype=torch.bfloat16 if shared.args.bf16 else torch.float16, trust_remote_code=shared.args.trust_remote_code)
File "C:\Users\user\Dev\AI\oobabooga_windows\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 472, in from_pretrained
return model_class.from_pretrained(
File "C:\Users\user\Dev\AI\oobabooga_windows\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 2406, in from_pretrained
raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models\TheBloke_Wizard-Vicuna-13B-Uncensored-GPTQ."
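
For context on the traceback: the model is being sent through the generic Hugging Face loader (huggingface_loader), and transformers only looks for standard checkpoint files such as pytorch_model.bin in that path, while a GPTQ repo ships quantized weights under a different filename, so the call fails before any GPTQ settings are applied. Below is a minimal sketch (not from the thread) of that same failing call outside the webui, assuming the model folder only contains the quantized weights and config.json; the exact filenames transformers accepts depend on its version.

```python
# Hypothetical repro of the from_pretrained call in modules/models.py,
# assuming models/TheBloke_Wizard-Vicuna-13B-Uncensored-GPTQ holds only the
# quantized GPTQ .safetensors file plus config.json (no pytorch_model.bin).
from pathlib import Path

import torch
from transformers import AutoModelForCausalLM

model_dir = Path("models/TheBloke_Wizard-Vicuna-13B-Uncensored-GPTQ")

# transformers searches for pytorch_model.bin / tf_model.h5 / model.ckpt.index /
# flax_model.msgpack here and raises the same OSError when none are present.
model = AutoModelForCausalLM.from_pretrained(
    model_dir,
    low_cpu_mem_usage=True,
    torch_dtype=torch.float16,
)
```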

Please see the README for instructions on setting and saving GPTQ parameters for this model
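
For reference, the README's suggested GPTQ parameters for this model are wbits 4, groupsize 128, and model_type llama, set either in the webui's Model tab (then saved for the model) or when launching the server. The command below is a sketch of the launch-flag route; flag names can differ between text-generation-webui versions, so check your local --help.

```
python server.py --model TheBloke_Wizard-Vicuna-13B-Uncensored-GPTQ --wbits 4 --groupsize 128 --model_type llama
```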
