TypeError: qwen isn't supported yet?

by Boffy

2023-11-22 00:07:35 INFO:Loading TheBloke_Qwen-14B-Chat-AWQ...
2023-11-22 00:07:35 ERROR:Failed to load the model.
Traceback (most recent call last):
  File "C:\Projects\AI\text-generation-webui\modules\ui_model_menu.py", line 209, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "C:\Projects\AI\text-generation-webui\modules\models.py", line 85, in load_model
    output = load_func_map[loader](model_name)
  File "C:\Projects\AI\text-generation-webui\modules\models.py", line 310, in AutoAWQ_loader
    model = AutoAWQForCausalLM.from_quantized(
  File "C:\Projects\AI\text-generation-webui\installer_files\env\lib\site-packages\awq\models\auto.py", line 50, in from_quantized
    model_type = check_and_get_model_type(quant_path, trust_remote_code)
  File "C:\Projects\AI\text-generation-webui\installer_files\env\lib\site-packages\awq\models\auto.py", line 25, in check_and_get_model_type
    raise TypeError(f"{config.model_type} isn't supported yet.")
TypeError: qwen isn't supported yet.
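
For context, the error comes from AutoAWQ's own model-type check, not from the model files. Roughly, check_and_get_model_type reads the model_type field from the repo's config.json and raises if it isn't in AutoAWQ's map of supported architectures. Here is a minimal sketch of that logic; the function name and the raise line come straight from the traceback above, but the contents of the supported set are an assumption for illustration:

from transformers import AutoConfig

# Illustrative only: the real list lives in AutoAWQ's model map and, in the
# released version at the time, did not include "qwen".
SUPPORTED_MODEL_TYPES = {"llama", "mistral", "opt", "falcon"}

def check_and_get_model_type(quant_path, trust_remote_code=True):
    config = AutoConfig.from_pretrained(quant_path, trust_remote_code=trust_remote_code)
    if config.model_type not in SUPPORTED_MODEL_TYPES:
        # This is the raise shown in the traceback
        raise TypeError(f"{config.model_type} isn't supported yet.")
    return config.model_type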

I got the same error, like some of the other replies here.

+1
Same issue here.

AutoAWQ has added Qwen support on its main branch, but it hasn't yet published a release that includes it. Casper normally releases quite quickly, but he must not have been able to yet.

In the meantime you can get Qwen support by installing AutoAWQ from source:

git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip3 install .
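
Once it's installed from source, a quick way to confirm the new build recognizes Qwen is to load the model directly in Python. This is a smoke test only; the repo id below is this model, and you can substitute a local path if you've already downloaded it:

from awq import AutoAWQForCausalLM

# from_quantized is the same entry point the webui calls in the traceback;
# trust_remote_code is needed because Qwen ships custom modeling code.
model = AutoAWQForCausalLM.from_quantized(
    "TheBloke/Qwen-14B-Chat-AWQ",  # or a local download path
    trust_remote_code=True,
)
print("loaded:", type(model).__name__)  # no TypeError means Qwen support is in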

Thanks, that works. A reminder: if you use Oobabooga (text-generation-webui-main), you need to run the appropriate cmd script (cmd_windows.bat, for example) so that the commands above run inside its virtual environment.
I also manually ran: pip install "transformers>=4.37.0" (quoted so the shell doesn't treat > as a redirect)
Don't forget the trailing dot in pip3 install . (the dot tells pip to install from the current directory).
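
If you're not sure the upgrades landed in the webui's environment rather than your global Python, here is a quick sanity check to run inside the venv opened by cmd_windows.bat; it uses only the standard library, and "autoawq" is the distribution name the source install registers:

from importlib.metadata import version

print(version("transformers"))  # should report >= 4.37.0 per the note above
print(version("autoawq"))       # should report the source-installed build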
