Error on start: OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack

#10
by ilnurshams - opened

Help needed!

When I start via oobabooga_windows start.bat, it gives me the following error:

bin C:\Users\Admin\Desktop\oobabooga_windows\installer_files\env\lib\site-packages\bitsandbytes\libbitsandbytes_cuda117.dll
INFO:Loading TheBloke_WizardLM-30B-Uncensored-GPTQ...
Traceback (most recent call last):
File "C:\Users\Admin\Desktop\oobabooga_windows\text-generation-webui\server.py", line 1081, in <module>
shared.model, shared.tokenizer = load_model(shared.model_name)
File "C:\Users\Admin\Desktop\oobabooga_windows\text-generation-webui\modules\models.py", line 95, in load_model
output = load_func(model_name)
File "C:\Users\Admin\Desktop\oobabooga_windows\text-generation-webui\modules\models.py", line 153, in huggingface_loader
model = LoaderClass.from_pretrained(Path(f"{shared.args.model_dir}/{model_name}"), low_cpu_mem_usage=True, torch_dtype=torch.bfloat16 if shared.args.bf16 else torch.float16, trust_remote_code=shared.args.trust_remote_code)
File "C:\Users\Admin\Desktop\oobabooga_windows\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 472, in from_pretrained
return model_class.from_pretrained(
File "C:\Users\Admin\Desktop\oobabooga_windows\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 2406, in from_pretrained
raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models\TheBloke_WizardLM-30B-Uncensored-GPTQ.

Done!
Press any key to continue . . .

It's complaining that it can't find a proper model in the folder.
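To see why the loader complains, it can help to list what the model folder actually contains. Transformers' standard loader looks for `pytorch_model.bin`, `tf_model.h5`, `model.ckpt.index` or `flax_model.msgpack`, while GPTQ repos ship a quantized `.safetensors` (or `.pt`) file instead. Below is a minimal diagnostic sketch; `classify_model_dir` is a hypothetical helper written for illustration, not part of the webui.

```python
# Diagnostic sketch (not part of text-generation-webui): report which kind of
# weight files a local model folder contains, to explain the OSError above.
from pathlib import Path

# File names the standard Transformers loader searches for.
STANDARD_WEIGHTS = {"pytorch_model.bin", "tf_model.h5",
                    "model.ckpt.index", "flax_model.msgpack"}

def classify_model_dir(model_dir):
    """Classify a model directory by the weight files it contains."""
    files = {p.name for p in Path(model_dir).iterdir() if p.is_file()}
    if files & STANDARD_WEIGHTS or any(f.startswith("pytorch_model-") for f in files):
        return "standard"  # the plain from_pretrained() path would work
    if any(f.endswith((".safetensors", ".pt")) for f in files):
        # GPTQ repos ship quantized weights the standard loader ignores,
        # which produces exactly the "no file named pytorch_model.bin..." error.
        return "quantized (needs a GPTQ-aware loader)"
    return "no weights found"
```

Running this against `models/TheBloke_WizardLM-30B-Uncensored-GPTQ` should report quantized weights, confirming the folder is fine and the problem is how it is being loaded.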

Switch to a model that you know works, then use the built-in model downloader to download the model you want.

I downloaded the model using the built-in model downloader. I'm still getting the same error.

This error happens when you've not set GPTQ parameters for the model. Please see the README.

Update: I was trying to deploy this model on Runpod's oobabooga UI image. That image does not have the latest version of text-gen-ui, which is the reason for this error, at least on my side. Not sure about the OP.

Ah yeah, in fact the Runpod text-gen-ui template doesn't support GPTQ at all I think.

I have a Runpod template which supports GPTQ and GGML with GPU acceleration, and which always uses the latest version of text-generation-webui: https://runpod.io/gsc?template=qk29nkmbfr&ref=eexqfacd

Dude, that's sick! Thanks!

@TheBloke I tried with your template on Runpod and it still shows a similar error in the oobabooga UI:

File "/root/text-generation-webui/server.py", line 71, in load_model_wrapper
shared.model, shared.tokenizer = load_model(shared.model_name)
File "/root/text-generation-webui/modules/models.py", line 95, in load_model
output = load_func(model_name)
File "/root/text-generation-webui/modules/models.py", line 153, in huggingface_loader
model = LoaderClass.from_pretrained(Path(f"{shared.args.model_dir}/{model_name}"), low_cpu_mem_usage=True, torch_dtype=torch.bfloat16 if shared.args.bf16 else torch.float16, trust_remote_code=shared.args.trust_remote_code)
File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py", line 471, in from_pretrained
return model_class.from_pretrained(
File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 2405, in from_pretrained
raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models/TheBloke_WizardLM-30B-Uncensored-GPTQ.

This happens because you didn't set the GPTQ parameters. Please follow the instructions in the README - they're in both the README for this model, and the README for the Runpod template.

Make sure to set the GPTQ params, then click "Save settings for this model" and "Reload this model".

Fairly soon I will update the template to use AutoGPTQ and then you won't need to set GPTQ parameters manually. But for now you still do.

@TheBloke Thank you! Is there a way to create an API endpoint and make HTTP requests instead of using the UI?

Yeah I made a template for that too!

https://runpod.io/gsc?template=f1pf20op0z&ref=eexqfacd


It's using the same container as "One-Click UI", so it should work exactly the same. The difference is that it opens the API port so you can access it remotely, and I included some instructions on using it in the README shown above.
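As a rough sketch of calling the webui over HTTP instead of the UI: the blocking API that text-generation-webui exposed around this time served `POST /api/v1/generate` on port 5000 with a JSON body, but the endpoint path, port, and payload fields may differ between versions, so check the template's README. `POD_HOST` is a placeholder for your pod's API hostname, not a real address.

```python
# Hedged sketch of calling text-generation-webui's blocking API.
# Endpoint and payload fields assume the mid-2023 API (/api/v1/generate);
# verify against the template's README for your version.
import json
from urllib import request

POD_HOST = "your-pod-id-5000.proxy.runpod.net"  # placeholder hostname

def build_generate_request(prompt, max_new_tokens=200):
    """Build the URL and JSON body for a generation request."""
    url = f"https://{POD_HOST}/api/v1/generate"
    payload = {"prompt": prompt, "max_new_tokens": max_new_tokens}
    return url, json.dumps(payload).encode("utf-8")

def generate(prompt):
    """Send the request and return the generated text.

    Assumes the response shape {"results": [{"text": "..."}]}.
    """
    url, body = build_generate_request(prompt)
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["results"][0]["text"]

# Usage (against a running pod): generate("Write a haiku about GPUs.")
```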

The GOAT! TYSM

ilnurshams changed discussion status to closed
