Tags: Text Generation · Transformers · Safetensors · English · mistral · text-generation-inference · 4-bit precision · gptq

I am having an issue loading dolphin 2.6 mistral 7B GPTQ:main by TheBloke. Pasting the error in the description, please help.

#1
by KingSlayer49 - opened

Traceback (most recent call last):
  File "C:\Other_stuffs\AI\TextgenwebUI\text-generation-webui\modules\ui_model_menu.py", line 214, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Other_stuffs\AI\TextgenwebUI\text-generation-webui\modules\models.py", line 79, in load_model
    metadata = get_model_metadata(model_name)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Other_stuffs\AI\TextgenwebUI\text-generation-webui\modules\models_settings.py", line 115, in get_model_metadata
    metadata = json.loads(open(path, 'r').read())
    ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Other_stuffs\AI\TextgenwebUI\text-generation-webui\installer_files\env\Lib\encodings\cp1252.py", line 23, in decode
    return codecs.charmap_decode(input,self.errors,decoding_table)[0]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
UnicodeDecodeError: 'charmap' codec can't decode byte 0x9d in position 2149: character maps to <undefined>
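The traceback points at `open(path, 'r')` in models_settings.py: on Windows, `open()` without an `encoding` argument falls back to the locale codepage (here cp1252), which cannot decode byte 0x9d from a UTF-8 file. A minimal sketch of the workaround, assuming the metadata file is UTF-8 JSON (`load_metadata` is a hypothetical helper name, not the webui's actual function):

```python
import json

def load_metadata(path):
    # Passing encoding='utf-8' explicitly avoids the UnicodeDecodeError
    # that the cp1252 codec raises when it hits bytes like 0x9d, which
    # are valid UTF-8 continuation bytes (e.g. in curly quotes).
    with open(path, 'r', encoding='utf-8') as f:
        return json.loads(f.read())
```

Applying the same `encoding='utf-8'` change to the `open()` call at line 115 of modules/models_settings.py should let the file load regardless of the Windows locale.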

Exactly my case. I get the same error.

Hey TheBloke, can you help us, please?

Same error

KingSlayer49 changed discussion status to closed
KingSlayer49 changed discussion status to open
