Facing issue while loading model - KeyError: 'mixtral'

#30 opened by swapnil3597

Code snippet:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)

model = AutoModelForCausalLM.from_pretrained(model_id)

text = "Hello my name is"
inputs = tokenizer(text, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Facing the following issue:
```bash
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
Cell In[22], line 6
      3 model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
      4 tokenizer = AutoTokenizer.from_pretrained(model_id)
----> 6 model = AutoModelForCausalLM.from_pretrained(model_id)
      8 text = "Hello my name is"
      9 inputs = tokenizer(text, return_tensors="pt")

File /data/environments/llm_env/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:526, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    523 if kwargs.get("quantization_config", None) is not None:
    524     _ = kwargs.pop("quantization_config")
--> 526 config, kwargs = AutoConfig.from_pretrained(
    527     pretrained_model_name_or_path,
    528     return_unused_kwargs=True,
    529     trust_remote_code=trust_remote_code,
    530     code_revision=code_revision,
    531     _commit_hash=commit_hash,
    532     **hub_kwargs,
    533     **kwargs,
    534 )
    536 # if torch_dtype=auto was passed here, ensure to pass it on
    537 if kwargs_orig.get("torch_dtype", None) == "auto":

File /data/environments/llm_env/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1064, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1062     return config_class.from_pretrained(pretrained_model_name_or_path, **kwargs)
   1063 elif "model_type" in config_dict:
-> 1064     config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1065     return config_class.from_dict(config_dict, **unused_kwargs)
   1066 else:
   1067     # Fallback: use pattern matching on the string.
   1068     # We go from longer names to shorter names to catch roberta before bert (for instance)

File /data/environments/llm_env/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:761, in _LazyConfigMapping.__getitem__(self, key)
    759     return self._extra_content[key]
    760 if key not in self._mapping:
--> 761     raise KeyError(key)
    762 value = self._mapping[key]
    763 module_name = model_type_to_module_name(key)

KeyError: 'mixtral'
```

The error is raised on the line `model = AutoModelForCausalLM.from_pretrained(model_id)`. Kindly help me resolve this issue.

I referred to the troubleshooting steps at https://huggingface.co/mistralai/Mistral-7B-v0.1, which suggest using a stable version of Transformers, 4.34.0 or newer.

I'm using `transformers==4.36.1` and am still facing this issue.

Hello, I have the same error here.

Upgrading transformers with `pip install --upgrade transformers` (to transformers 4.36.1) solved the issue for me.

https://pypi.org/project/transformers/

I am facing the same problem. I have installed transformers 4.36.1, but it still does not work.

Make sure you are actually using it: `import transformers; print(transformers.__version__)`
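
To expand on that check, here is a minimal sketch. Mixtral support landed in transformers 4.36.0, so anything older raises `KeyError: 'mixtral'`; printing `transformers.__file__` also helps spot a kernel that is importing a different environment than the one you upgraded:

```python
import transformers
from packaging import version  # install with `pip install packaging` if missing

# Show which version *and* which install this kernel is actually importing.
print(transformers.__version__)
print(transformers.__file__)

# Mixtral was added in transformers 4.36.0; older versions raise KeyError: 'mixtral'.
assert version.parse(transformers.__version__) >= version.parse("4.36.0"), \
    "transformers is too old for Mixtral -- upgrade and restart the kernel"
```

If the printed path points somewhere other than the environment you ran `pip install` in, that mismatch is the bug, not the model.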

Yeah, upgrading transformers actually works. Restarting the kernel after upgrading is what worked for me.

What is your hardware config, such that you can load the model directly (without quantization)?
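
For reference, the full Mixtral-8x7B checkpoint needs on the order of 90 GB in float16, so most single-machine setups load it quantized instead. A minimal sketch, assuming `bitsandbytes` and `accelerate` are installed (the 4-bit settings below are illustrative, not necessarily what the posters above used):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# 4-bit NF4 quantization shrinks the weights to roughly a quarter of fp16 size.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # lets accelerate spread layers across available GPUs/CPU
)

inputs = tokenizer("Hello my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```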
