Unexpected keyword 'rope_scaling' while loading model

#122
by gandhipratik65j - opened

Please check the detailed logs below.

I want to load the model via the webUI and expose the OpenAI-compatible API, but I am facing the issue below.

File "E:\text-generation-webui-main\text-generation-webui-main\modules\ui_model_menu.py", line 213, in load_model_wrapper
shared.model, shared.tokenizer = load_model(selected_model, loader)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\text-generation-webui-main\text-generation-webui-main\modules\models.py", line 87, in load_model
output = load_func_map[loader](model_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\text-generation-webui-main\text-generation-webui-main\modules\models.py", line 235, in huggingface_loader
model = LoaderClass.from_pretrained(path_to_model, **params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\models\auto\auto_factory.py", line 566, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\modeling_utils.py", line 3596, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: MistralForCausalLM.__init__() got an unexpected keyword argument 'rope_scaling'

The transformers version seems wrong. Can you make sure you are using the latest release, or at least 4.34?
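
For anyone hitting the same error, here is a minimal sketch of a sanity check you can run inside the webUI's Python environment before loading the model. The 4.34 bound follows the suggestion above; the model path is a placeholder you would replace with your local model folder.

import transformers
from packaging import version

# The suggestion above is transformers >= 4.34; refuse to continue on older versions.
if version.parse(transformers.__version__) < version.parse("4.34.0"):
    raise RuntimeError(
        f"transformers {transformers.__version__} is too old for this model; "
        "upgrade with: pip install -U \"transformers>=4.34\""
    )

from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/your/local/model"  # placeholder; point this at the model folder the webUI uses
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

If the version check fails, upgrading transformers inside installer_files\env (for example with pip install -U "transformers>=4.34") and restarting the webUI should resolve the TypeError.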
