key error

#2
by NS-Y - opened

KeyError Traceback (most recent call last)
in <cell line: 5>()
3 device = "cuda" # the device to load the model onto
4
----> 5 model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
6 tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
7

/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in __getitem__(self, key)
732 return self._extra_content[key]
733 if key not in self._mapping:
--> 734 raise KeyError(key)
735 value = self._mapping[key]
736 module_name = model_type_to_module_name(key)

KeyError: 'mistral'


KeyError Traceback (most recent call last)
File :5
1 from transformers import AutoModelForCausalLM, AutoTokenizer
3 device = "cuda" # the device to load the model onto
----> 5 model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
6 tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
8 text = "[INST] What is your favourite condiment? [/INST]"

File /databricks/python/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py:434, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
432 hub_kwargs = {name: kwargs.pop(name) for name in hub_kwargs_names if name in kwargs}
433 if not isinstance(config, PretrainedConfig):
--> 434 config, kwargs = AutoConfig.from_pretrained(
435 pretrained_model_name_or_path,
436 return_unused_kwargs=True,
437 trust_remote_code=trust_remote_code,
438 **hub_kwargs,
439 **kwargs,
440 )
441 if hasattr(config, "auto_map") and cls.__name__ in config.auto_map:
442 if not trust_remote_code:

File /databricks/python/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py:829, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
827 return config_class.from_pretrained(pretrained_model_name_or_path, **kwargs)
828 elif "model_type" in config_dict:
--> 829 config_class = CONFIG_MAPPING[config_dict["model_type"]]
830 return config_class.from_dict(config_dict, **unused_kwargs)
831 else:
832 # Fallback: use pattern matching on the string.
833 # We go from longer names to shorter names to catch roberta before bert (for instance)

File /databricks/python/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py:536, in _LazyConfigMapping.__getitem__(self, key)
534 return self._extra_content[key]
535 if key not in self._mapping:
--> 536 raise KeyError(key)
537 value = self._mapping[key]
538 module_name = model_type_to_module_name(key)

KeyError: 'mistral'
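
The KeyError: 'mistral' just means the installed transformers release has no "mistral" entry in its model-type mapping, i.e. it predates Mistral support. A minimal check, assuming nothing beyond transformers itself:

import transformers

# Releases up to and including v4.33.3 have no "mistral" entry in the
# model-type mapping, so the lookup above raises KeyError: 'mistral'.
print(transformers.__version__)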

rename "mistral" to "llama" into the config.json file.
work on some of my tools, crash on others, I let you try lmao.
try it just for the funnies
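
For reference, a minimal sketch of that workaround, assuming you have a local copy of the model (the path below is hypothetical):

import json

config_path = "Mistral-7B-Instruct-v0.1/config.json"  # hypothetical local download
with open(config_path) as f:
    config = json.load(f)
config["model_type"] = "llama"  # make older transformers treat it as a Llama model
with open(config_path, "w") as f:
    json.dump(config, f, indent=2)

This only papers over the missing registration, which is presumably why it works with some tools and crashes with others.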

Try installing the latest transformers from source. It will work.

Had this problem. Updated transformers and renamed "mistral" to "llama" in the config.json file. Fixed.

Installing the latest transformers from source solved it.

NS-Y changed discussion status to closed
NS-Y changed discussion status to open

Installing transformers from source helps -

pip install --upgrade git+https://github.com/huggingface/transformers
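
After installing from source, a quick sanity check against the mapping that raises the KeyError in the tracebacks above (a sketch; restart your kernel first so the new install is picked up):

from transformers.models.auto.configuration_auto import CONFIG_MAPPING

# True on a build with Mistral support; older releases have no such key
print("mistral" in CONFIG_MAPPING)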

Upgraded to the latest transformers install, still getting this error.

Installing transformers from source is working for me.

Mistral AI_ org

Yes, we are supported on main, so will be included in the next release, but are not included in the current release (v4.33.3).
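
So a stock install can be gated programmatically (a sketch; packaging is already a transformers dependency):

import transformers
from packaging import version

# Mistral support is on main and in releases after v4.33.3
if version.parse(transformers.__version__) <= version.parse("4.33.3"):
    raise RuntimeError("this transformers release predates Mistral support; upgrade or install from source")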

timlacroix changed discussion status to closed

If you installed transformers from source, don't forget to restart your kernel before you try again.

I'm using pipeline; where can I find config.json?
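
If you do want to inspect it: when loading through pipeline, config.json is pulled into the Hugging Face cache. A minimal sketch to locate it, using huggingface_hub (a transformers dependency):

from huggingface_hub import hf_hub_download

# Returns the cached path of config.json, downloading it first if needed
path = hf_hub_download("mistralai/Mistral-7B-Instruct-v0.1", "config.json")
print(path)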

No changes to config necessary. Updating transformers alone worked for me:
Downloading transformers-4.34.0-py3-none-any.whl.metadata (121 kB)
Downloading transformers-4.34.0-py3-none-any.whl (7.7 MB)
Downloading tokenizers-0.14.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.8 MB)
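
With that release installed, the snippet from the tracebacks at the top of the thread should load cleanly, for example:

from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")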
