TypeError: RefinedWebModel isn't supported yet.

#1
by thefaheem - opened

This is my code:

from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

# Optionally, download the model from HF and store it locally, then reference its location here:
# quantized_model_dir = model_path

tokenizer = AutoTokenizer.from_pretrained("TheBloke/WizardLM-Uncensored-Falcon-7B-GPTQ", use_fast=False)

model = AutoGPTQForCausalLM.from_quantized("TheBloke/WizardLM-Uncensored-Falcon-7B-GPTQ", device="cuda:1", use_triton=False, use_safetensors=True, trust_remote_code=True)

When I run this, I get the following:

A new version of the following files was downloaded:
- configuration_RW.py
Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
Traceback (most recent call last):
in <cell line: 11>:11

/usr/local/lib/python3.10/dist-packages/auto_gptq/modeling/auto.py:62 in from_quantized

   59         model_basename: Optional[str] = None,
   60         trust_remote_code: bool = False
   61     ) -> BaseGPTQForCausalLM:
❱  62         model_type = check_and_get_model_type(save_dir)
   63         return GPTQ_CAUSAL_LM_MODEL_MAP[model_type].from_quantized(
   64             save_dir=save_dir,
   65             device=device,

/usr/local/lib/python3.10/dist-packages/auto_gptq/modeling/_utils.py:124 in check_and_get_model_type

  121 def check_and_get_model_type(model_dir):
  122     config = AutoConfig.from_pretrained(model_dir, trust_remote_code=True)
  123     if config.model_type not in SUPPORTED_MODELS:
❱ 124         raise TypeError(f"{config.model_type} isn't supported yet.")
  125     model_type = config.model_type
  126     return model_type
  127
TypeError: RefinedWebModel isn't supported yet.

How do I get rid of this error?

Can somebody help?

@TheBloke, can you please chime in? Thanks!

TypeError: RefinedWebModel isn't supported yet.

Did you find any solution, my friend?

Don't worry @Dxtrmst, Tom is working on this.

You need to be using auto-gptq version 0.2.0. For some reason you guys don't have the right version.

Please try:

pip uninstall auto-gptq
pip install auto-gptq==0.2.0
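
If you want to confirm which version was actually installed, here is a quick check using only the Python standard library (nothing auto-gptq specific):

from importlib.metadata import version

# Prints the installed auto-gptq version; it should show 0.2.0 after the commands above
print(version("auto-gptq"))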

I don't know, somehow it says it doesn't have version 0.2.0.

thank you @TheBloke

This works for me in Google Colab:

!git clone https://github.com/PanQiWei/AutoGPTQ
%cd AutoGPTQ
!pip install .

and then download the model in Colab as per the sequence below.

  1. !huggingface-cli login --token "hf_xxxxxxxxx"
  2. from huggingface_hub import snapshot_download
     snapshot_download(repo_id="TheBloke/WizardLM-Uncensored-Falcon-7B-GPTQ")

This will download to the Colab instance root folder (the folder will be shown after executing step 2 above).
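
For reference, snapshot_download also returns the local path it downloaded into, so you can pass that straight to from_quantized instead of locating the folder by hand. A minimal sketch (device="cuda:0" assumes a single-GPU Colab runtime):

from huggingface_hub import snapshot_download
from auto_gptq import AutoGPTQForCausalLM

# snapshot_download returns the local directory that holds the downloaded repo files
local_dir = snapshot_download(repo_id="TheBloke/WizardLM-Uncensored-Falcon-7B-GPTQ")

# Load the quantized model from that local folder instead of the repo id
model = AutoGPTQForCausalLM.from_quantized(local_dir, device="cuda:0", use_triton=False, use_safetensors=True, trust_remote_code=True)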

The rest is the same. However, I received some warning notifications, but it still works.

Yeah, currently a lot of warnings are printed, stuff like:

WARNING:RWGPTQForCausalLM hasn't fused attention module yet, will skip inject fused attention.
WARNING:RWGPTQForCausalLM hasn't fused mlp module yet, will skip inject fused mlp.

This can be ignored. I will suggest to the AutoGPTQ author that these warnings should be INFO instead, or not printed at all.
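
In the meantime, if the noise bothers you, you can usually silence it by raising the log level for auto_gptq's loggers. A sketch, assuming auto-gptq emits these through the standard logging module under the auto_gptq namespace (the logger name may differ between versions):

import logging

# Show only ERROR and above from auto_gptq, hiding the fused attention/mlp warnings
logging.getLogger("auto_gptq").setLevel(logging.ERROR)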

The only working "old" version of auto-gptq on Colab at the moment is 0.2.2, installed with %pip install auto-gptq==0.2.2.
