Mistral isn't supported yet

#3
by abrehmaaan - opened

I am getting this error:
Traceback (most recent call last):
File "C:\Users\Administrator\Projects\localGPT\run_localGPT.py", line 267, in
main()
File "C:\Program Files\Python310\lib\site-packages\click\core.py", line 1157, in call
return self.main(*args, **kwargs)
File "C:\Program Files\Python310\lib\site-packages\click\core.py", line 1078, in main
rv = self.invoke(ctx)
File "C:\Program Files\Python310\lib\site-packages\click\core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "C:\Program Files\Python310\lib\site-packages\click\core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "C:\Users\Administrator\Projects\localGPT\run_localGPT.py", line 237, in main
qa = retrieval_qa_pipline(device_type, use_history, promptTemplate_type=model_type)
File "C:\Users\Administrator\Projects\localGPT\run_localGPT.py", line 132, in retrieval_qa_pipline
llm = load_model(device_type, model_id=MODEL_ID, model_basename=MODEL_BASENAME, LOGGING=logging)
File "C:\Users\Administrator\Projects\localGPT\run_localGPT.py", line 66, in load_model
model, tokenizer = load_quantized_model_qptq(model_id, model_basename, device_type, LOGGING)
File "C:\Users\Administrator\Projects\localGPT\load_models.py", line 95, in load_quantized_model_qptq
model = AutoGPTQForCausalLM.from_quantized(
File "C:\Program Files\Python310\lib\site-packages\auto_gptq\modeling\auto.py", line 79, in from_quantized
model_type = check_and_get_model_type(save_dir or model_name_or_path, trust_remote_code)
File "C:\Program Files\Python310\lib\site-packages\auto_gptq\modeling_utils.py", line 125, in check_and_get_model_type
raise TypeError(f"{config.model_type} isn't supported yet.")
TypeError: mistral isn't supported yet.

I am using the localGPT repository from PromptEngineer.

@TheBloke Please help

Yes, AutoGPTQ doesn't support Mistral GPTQ yet. You need to use Transformers' GPTQ support or ExLlama instead.
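For the Transformers route, a minimal sketch (assuming recent transformers, optimum and auto-gptq are installed, and using TheBloke/Mistral-7B-Instruct-v0.1-GPTQ purely as an example repo, not necessarily the model from the traceback) looks like this:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example Mistral GPTQ repo; substitute whichever GPTQ model you are actually using.
model_id = "TheBloke/Mistral-7B-Instruct-v0.1-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Transformers reads the quantization_config shipped in the repo and loads the
# GPTQ weights directly; device_map="auto" places the layers on available GPUs.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("What is GPTQ?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In localGPT terms that roughly means skipping load_quantized_model_qptq / AutoGPTQForCausalLM.from_quantized and letting AutoModelForCausalLM.from_pretrained pick up the GPTQ weights itself.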
