update to latest transformers and exllama, still fails to load

#17
by yiouyou - opened

I updated transformers to 4.32.0.dev0 and exllama to 0.0.8+cu117, but I still get the following error when loading the model with text-generation-webui:

Traceback (most recent call last):
  File "/home/abc/text-generation-webui/server.py", line 68, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "/home/abc/text-generation-webui/modules/models.py", line 79, in load_model
    output = load_func_map[loader](model_name)
  File "/home/abc/text-generation-webui/modules/models.py", line 320, in ExLlama_loader
    model, tokenizer = ExllamaModel.from_pretrained(model_name)
  File "/home/abc/text-generation-webui/modules/exllama.py", line 49, in from_pretrained
    config = ExLlamaConfig(str(model_config_path))
  File "/home/abc/miniconda3/envs/textgen/lib/python3.10/site-packages/exllama/model.py", line 46, in __init__
    read_config = json.load(f)
  File "/home/abc/miniconda3/envs/textgen/lib/python3.10/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/home/abc/miniconda3/envs/textgen/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/home/abc/miniconda3/envs/textgen/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 25 column 5 (char 632)

I need some help to get it running. Thanks for the help!


Hmm, it seems like it's failing to read config.json.
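
If you want to verify that locally, a quick sanity check is to parse the file with Python's json module yourself. This is just a sketch - the path below is an example of where text-generation-webui typically keeps downloaded models, so adjust it to your setup:

```python
import json

# Example path only - point this at the config.json of the model you downloaded
config_path = "/home/abc/text-generation-webui/models/<model-folder>/config.json"

with open(config_path, "r", encoding="utf-8") as f:
    try:
        json.load(f)  # succeeds only if the whole file is valid JSON
        print("config.json parses fine")
    except json.JSONDecodeError as e:
        # "Extra data", as in your traceback, usually means the file is
        # truncated, duplicated, or otherwise corrupted on disk
        print(f"config.json is broken: {e}")
```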

The files are definitely fine, I tested this model with ExLlama earlier today actually.

Can you try triggering a download of the repo again, in case your files got corrupted or a file is missing, or something like that?
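
If you'd rather script it than re-download by hand, something along these lines with huggingface_hub should refresh just the small files. The repo id and target folder here are placeholders - substitute the actual model repo you downloaded from and your own models directory:

```python
from huggingface_hub import snapshot_download

# Placeholder repo id and directory - replace with the real ones
snapshot_download(
    repo_id="TheBloke/<model-repo>",
    local_dir="/home/abc/text-generation-webui/models/<model-folder>",
    allow_patterns=["*.json", "*.model", "*.txt"],  # config/tokenizer files only, skips the big weights
    force_download=True,  # overwrite any possibly-corrupted local copies
)
```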

Thanks for your help! After re-downloading all the small files, it loads now.
