Can't determine model type from model name. Please specify it manually using --model_type argument
#12
opened by chouaibmeramria
So did you specify the model type in the GPTQ settings?
Try following my easy install instructions. I think it's step 8 you're not doing fully.
How to easily download and use this model in text-generation-webui
Open the text-generation-webui UI as normal.
1. Click the Model tab.
2. Under Download custom model or LoRA, enter `TheBloke/wizardLM-7B-GPTQ` (a command-line alternative to this download step is sketched just after this list).
3. Click Download.
4. Wait until it says it's finished downloading.
5. Click the Refresh icon next to Model in the top left.
6. In the Model drop-down: choose the model you just downloaded, `wizardLM-7B-GPTQ`.
7. If you see an error in the bottom right, ignore it - it's temporary.
8. Fill out the GPTQ parameters on the right: Bits = 4, Groupsize = 128, model_type = Llama (the equivalent command-line flags are sketched after this list).
9. Click Save settings for this model in the top right.
10. Click Reload the Model in the top right.
11. Once it says it's loaded, click the Text Generation tab and enter a prompt!
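A rough command-line alternative to the download in step 2, in case the in-UI download is slow or fails. This sketch assumes the `download-model.py` script that ships with text-generation-webui and that your install keeps models under `models/`; script behaviour and paths may differ in your version:

```
# Run from the text-generation-webui directory.
# This is a sketch, not the only route - a git-lfs clone of
# https://huggingface.co/TheBloke/wizardLM-7B-GPTQ into models/ also works.
python download-model.py TheBloke/wizardLM-7B-GPTQ
```

The files should land in a folder under `models/`, which then shows up in the Model drop-down after you click Refresh.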
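On the error in the thread title itself: it points at a second route, passing the model type as a launch argument instead of (or as well as) saving it in the UI in step 8. A minimal sketch, assuming the GPTQ-for-LLaMa flags that text-generation-webui exposes (`--wbits`, `--groupsize`, `--model_type`); flag names can change between releases, so check `python server.py --help` for your version:

```
# Example only - adjust --model to the folder name that actually sits in models/.
python server.py --model TheBloke_wizardLM-7B-GPTQ --wbits 4 --groupsize 128 --model_type llama
```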