Didn't work on text-generation-webui

#2
by Hawk9970 - opened

File "C:\Users***\Downloads\Compressed\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\configuration_utils.py", line 722, in _get_config_dict

config_dict = cls._dict_from_json_file(resolved_config_file)

          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "C:\Users***\Downloads\Compressed\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\configuration_utils.py", line 825, in _dict_from_json_file

text = reader.read()

   ^^^^^^^^^^^^^

File "", line 322, in decode

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa0 in position 578: invalid start byte

During handling of the above exception, another exception occurred:

Traceback (most recent call last):

File "C:\Users***\Downloads\Compressed\text-generation-webui-main\text-generation-webui-main\modules\ui_model_menu.py", line 248, in load_model_wrapper

shared.model, shared.tokenizer = load_model(selected_model, loader)

                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "C:\Users***\Downloads\Compressed\text-generation-webui-main\text-generation-webui-main\modules\models.py", line 94, in load_model

output = load_func_maploader

     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "C:\Users***\Downloads\Compressed\text-generation-webui-main\text-generation-webui-main\modules\models.py", line 152, in huggingface_loader

config = AutoConfig.from_pretrained(path_to_model, trust_remote_code=shared.args.trust_remote_code)

     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "C:\Users***\Downloads\Compressed\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 965, in from_pretrained

config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)

                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "C:\Users***\Downloads\Compressed\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\configuration_utils.py", line 632, in get_config_dict

config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)

                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "C:\Users***\Downloads\Compressed\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\configuration_utils.py", line 726, in _get_config_dict

raise EnvironmentError(
OSError: It looks like the config file at 'models\Mistral-Nemo-Instruct-2407.q5_k .gguf' is not a valid JSON file.

Please help, I really want to play with this model.

Owner

You have to use llama.cpp, koboldcpp, or another backend that supports GGUF. The Transformers loader expects a Hugging Face model folder with a config.json, so it tries to parse the binary .gguf file as JSON and fails.
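If you want to try the file outside the webui, a minimal sketch with the llama-cpp-python bindings could look like this (the model path is taken from the error above and the generation parameters are assumptions, so adjust them to your setup):

```python
# Minimal sketch, assuming llama-cpp-python is installed (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="models/Mistral-Nemo-Instruct-2407.q5_k .gguf",  # path from the error above; point at your actual file
    n_ctx=4096,        # context window; adjust to available RAM
    n_gpu_layers=-1,   # offload all layers if your llama.cpp build has GPU support
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello! Who are you?"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

Inside text-generation-webui itself, the equivalent fix should be to pick the llama.cpp loader in the Model tab instead of Transformers before loading the .gguf file.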

ZeroWw changed discussion status to closed
