Doesn't load in oobabooga webui (GTX 1660 Ti, Intel Core i5)

#2
by Vlagamer - opened

I know this isn't GitHub, but here is what I get:

Starting the web UI...

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: CUDA runtime path found: D:\Games\oobabooga-windows\installer_files\env\bin\cudart64_110.dll
CUDA SETUP: Highest compute capability among GPUs detected: 7.5
CUDA SETUP: Detected CUDA version 117
CUDA SETUP: Loading binary D:\Games\oobabooga-windows\installer_files\env\lib\site-packages\bitsandbytes\libbitsandbytes_cuda117.dll...
Loading mayaeary_pygmalion-6b_dev-4bit-128g...
Found the following quantized model: models\mayaeary_pygmalion-6b_dev-4bit-128g\pygmalion-6b_dev-4bit-128g.safetensors
Loading model ...
D:\Games\oobabooga-windows\installer_files\env\lib\site-packages\safetensors\torch.py:99: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  with safe_open(filename, framework="pt", device=device) as f:
D:\Games\oobabooga-windows\installer_files\env\lib\site-packages\torch\_utils.py:776: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  return self.fget.__get__(instance, owner)()
D:\Games\oobabooga-windows\installer_files\env\lib\site-packages\torch\storage.py:899: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  storage = cls(wrap_storage=untyped_storage)

I got this error; tell me what to do. For reference, I use these launch flags: --wbits 4 --groupsize 128 --extensions api

Those aren't errors, just warnings. You should be able to use the model without problems.
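If the TypedStorage warnings clutter the console, they can be silenced with Python's standard warnings module before the model loads. This is a minimal sketch, assuming you add it to your own startup code; it is not something the webui does for you:

import warnings

# Hide only the TypedStorage deprecation warnings seen in the log above;
# every other warning is still printed.
warnings.filterwarnings(
    "ignore",
    message="TypedStorage is deprecated",
    category=UserWarning,
)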

I get this error when loading the model:

Traceback (most recent call last):
  File "C:\AIweb\oobabooga-windows\text-generation-webui\server.py", line 85, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "C:\AIweb\oobabooga-windows\text-generation-webui\modules\models.py", line 188, in load_model
    tokenizer = AutoTokenizer.from_pretrained(Path(f"{shared.args.model_dir}/{shared.model_name}/"))
  File "C:\AIweb\oobabooga-windows\installer_files\env\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 702, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "C:\AIweb\oobabooga-windows\installer_files\env\lib\site-packages\transformers\tokenization_utils_base.py", line 1811, in from_pretrained
    return cls._from_pretrained(
  File "C:\AIweb\oobabooga-windows\installer_files\env\lib\site-packages\transformers\tokenization_utils_base.py", line 1841, in _from_pretrained
    slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
  File "C:\AIweb\oobabooga-windows\installer_files\env\lib\site-packages\transformers\tokenization_utils_base.py", line 1965, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "C:\AIweb\oobabooga-windows\installer_files\env\lib\site-packages\transformers\models\gpt2\tokenization_gpt2.py", line 188, in __init__
    with open(vocab_file, encoding="utf-8") as vocab_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType

That means the model wasn't downloaded properly; you have to clone/download the whole folder so that it matches this repo exactly.
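The TypeError at the bottom of the traceback means the tokenizer loader received None for vocab_file, i.e. the tokenizer files never made it into the model folder. Here is a minimal sketch to check what is actually on disk; the path and the file list are assumptions based on a typical GPT-2-style tokenizer layout, so adjust them to your own setup:

from pathlib import Path

# Hypothetical location - point this at your own model folder.
model_dir = Path(r"C:\AIweb\oobabooga-windows\text-generation-webui\models\your-model-folder")

# Files a GPT-2-style tokenizer and its config usually need (assumed, not exhaustive).
expected = [
    "config.json",
    "vocab.json",
    "merges.txt",
    "tokenizer_config.json",
    "special_tokens_map.json",
]

for name in expected:
    status = "OK" if (model_dir / name).is_file() else "MISSING"
    print(f"{status:8} {name}")

If vocab.json or merges.txt comes up MISSING, re-clone or re-download the whole model folder (for example with git lfs, or with the webui's download-model.py script) so it matches the repo, then try loading again.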
