OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack

#6 by Appolonius - opened

The full error after downloading this model with the Text Generation Web UI (automatic Windows install).
I'm currently downloading these models, found elsewhere, and am going to place them in the models folder. Is this normal?

Traceback (most recent call last):
File "C:\Users\Appolonius\Desktop\oobabooga_windows\text-generation-webui\server.py", line 1097, in
shared.model, shared.tokenizer = load_model(shared.model_name)
File "C:\Users\Appolonius\Desktop\oobabooga_windows\text-generation-webui\modules\models.py", line 97, in load_model
output = load_func(model_name)
File "C:\Users\Appolonius\Desktop\oobabooga_windows\text-generation-webui\modules\models.py", line 155, in huggingface_loader
model = LoaderClass.from_pretrained(Path(f"{shared.args.model_dir}/{model_name}"), low_cpu_mem_usage=True, torch_dtype=torch.bfloat16 if shared.args.bf16 else torch.float16, trust_remote_code=shared.args.trust_remote_code)
File "C:\Users\Appolonius\Desktop\oobabooga_windows\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 472, in from_pretrained
return model_class.from_pretrained(
File "C:\Users\Appolonius\Desktop\oobabooga_windows\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 2406, in from_pretrained
raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models\TheBloke_Wizard-Vicuna-30B-Uncensored-GPTQ.
Press any key to continue . . .

I'm also getting this error when trying to build a template for running this model on banana.dev

I get the same error on my device. I have some of the files from another model, but not all of them. Does anyone know where I can get them?

Are you using the latest version of webui?

I believe so. I tried downloading it from scratch again, and also running the update.

This error means GPTQ parameters are not set. For those using text-generation-webui, please see the instructions in the README.

@pelatho to use this with Python code, please check out AutoGPTQ. It can load GPTQ quantised models like this one. You can't load GPTQ models with transformers on its own; you need AutoGPTQ.
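Roughly, that looks like the sketch below (assuming AutoGPTQ is installed; the model_basename and prompt format are illustrative guesses, so check the actual .safetensors filename and prompt template in the repo):

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_name_or_path = "TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ"
# model_basename is an assumption for illustration: use the actual .safetensors
# filename (without the extension) found in the downloaded model folder.
model_basename = "Wizard-Vicuna-30B-Uncensored-GPTQ-4bit.act.order"

tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)

# Load the quantised checkpoint directly; plain transformers from_pretrained()
# fails here because the repo has no pytorch_model.bin, only GPTQ weights.
model = AutoGPTQForCausalLM.from_quantized(
    model_name_or_path,
    model_basename=model_basename,
    use_safetensors=True,
    device="cuda:0",
    use_triton=False,
    quantize_config=None,
)

# Prompt format is a guess; check the model card for the exact template.
prompt = "USER: Write a haiku about debugging. ASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0]))
```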

@TheBloke

Directly from the readme:
" * Note that you do not need to set GPTQ parameters any more. These should all be set to default values, as they are now set automatically from the file quantize_config.json."

I've got that file in the folder, so I don't think that's the issue.
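For reference, here's a minimal sanity check to confirm the file is actually there and readable (the folder path is assumed from the traceback above; adjust it to your own install):

```python
import json
from pathlib import Path

# Path assumed from the traceback above; point this at your own models folder.
model_dir = Path("models/TheBloke_Wizard-Vicuna-30B-Uncensored-GPTQ")

cfg_path = model_dir / "quantize_config.json"
print("quantize_config.json exists:", cfg_path.is_file())

if cfg_path.is_file():
    cfg = json.loads(cfg_path.read_text())
    # Typical GPTQ fields; the exact values depend on how the model was quantised.
    print("bits:", cfg.get("bits"))
    print("group_size:", cfg.get("group_size"))
    print("desc_act:", cfg.get("desc_act"))
```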

Also getting the same error. I reinstalled using the directions in the readme... still the same issue.

@DissentingPotato show me a screenshot of your models folder, and the contents of the TheBloke_Wizard-Vicuna-30B-Uncensored-GPTQ folder under that.
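(If pasting text is easier than a screenshot, a short listing sketch like the one below will dump the folder contents; the path is again assumed from the traceback. A working GPTQ download should contain a .safetensors weights file, config.json, quantize_config.json, and the tokenizer files, rather than pytorch_model.bin.)

```python
from pathlib import Path

# Assumed location based on the traceback; change to your actual install path.
model_dir = Path("models/TheBloke_Wizard-Vicuna-30B-Uncensored-GPTQ")

# Print every file with its size so the folder contents can be shared as text.
for p in sorted(model_dir.iterdir()):
    print(f"{p.name:50s} {p.stat().st_size:>12,} bytes")
```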
