Transformers does not recognize chameleon

#5
by BrokenSoul - opened

KeyError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
981 try:
--> 982 config_class = CONFIG_MAPPING[config_dict["model_type"]]
983 except KeyError:

KeyError: 'chameleon'

During handling of the above exception, another exception occurred:

ValueError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
982 config_class = CONFIG_MAPPING[config_dict["model_type"]]
983 except KeyError:
--> 984 raise ValueError(
985 f"The checkpoint you are trying to load has model type {config_dict['model_type']} "
986 "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type chameleon but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

@BrokenSoul as the error message says, it might be due to an outdated Transformers version. Can you try to pull the latest changes from main by running !pip install --upgrade git+https://github.com/huggingface/transformers.git? We released Chameleon just recently.

I'm encountering the same error, even though I've updated to the latest version of Hugging Face Transformers.

Traceback (most recent call last):
File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 982, in from_pretrained
class_ref, pretrained_model_name_or_path, code_revision=code_revision, **kwargs
File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 684, in __getitem__
self._extra_content = {}
KeyError: 'chameleon'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/opt/conda/lib/python3.10/site-packages/IPython/core/interactiveshell.py", line 3553, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "/tmp/ipykernel_209/2575232475.py", line 9, in <module>
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/chameleon-7b")
File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 524, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 984, in from_pretrained
if os.path.isdir(pretrained_model_name_or_path):
ValueError: The checkpoint you are trying to load has model type chameleon but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

Hey! Can you make sure that you are importing Transformers version v4.43.0 or later, by running:

import transformers
transformers.__version__

In case you are in Colab or Jupyter, you might need to restart the kernel, because sometimes newly installed packages are not importable until you do. I verified that Chameleon should work with the latest version.
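If you want to check the version gate programmatically, here is a minimal self-contained sketch (plain Python, no transformers import needed) that compares dotted version strings numerically rather than lexically; pass it the string from transformers.__version__. The 4.43.0 threshold is the release in which Chameleon support landed:

```python
def version_at_least(installed: str, required: str = "4.43.0") -> bool:
    """Return True if `installed` is at least `required` (numeric compare)."""
    def parts(v):
        # Keep only the leading numeric components (drops ".dev0" suffixes etc.)
        out = []
        for p in v.split("."):
            num = "".join(ch for ch in p if ch.isdigit())
            if not num:
                break
            out.append(int(num))
        return out
    return parts(installed) >= parts(required)

print(version_at_least("4.42.4"))  # False: too old for Chameleon
print(version_at_least("4.43.0"))  # True
```

A plain string comparison would get this wrong ("4.9" > "4.43" lexically), which is why the components are converted to integers first.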


The auto-generated snippet suggests the wrong class to use:

# Load model directly
from transformers import AutoProcessor, AutoModelForSeq2SeqLM

processor = AutoProcessor.from_pretrained("facebook/chameleon-7b")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/chameleon-7b")

Chameleon is not a seq2seq model, so AutoModelForSeq2SeqLM cannot load it. Use this instead:

from transformers import AutoProcessor, AutoModel

processor = AutoProcessor.from_pretrained("facebook/chameleon-7b")
model = AutoModel.from_pretrained("facebook/chameleon-7b")

Or, for generation, use the dedicated class ChameleonForConditionalGeneration.
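For completeness, a minimal sketch of loading with the dedicated class, assuming transformers >= 4.43.0 (ChameleonProcessor and ChameleonForConditionalGeneration ship with that release). The from_pretrained calls download the ~7B checkpoint on first use, so they are deferred behind a function:

```python
def load_chameleon(model_id: str = "facebook/chameleon-7b"):
    """Load the processor and generation-capable model for Chameleon.

    Requires transformers >= 4.43.0; from_pretrained downloads the
    checkpoint on first call, so nothing is fetched at import time.
    """
    from transformers import ChameleonProcessor, ChameleonForConditionalGeneration

    processor = ChameleonProcessor.from_pretrained(model_id)
    model = ChameleonForConditionalGeneration.from_pretrained(model_id)
    return processor, model


if __name__ == "__main__":
    processor, model = load_chameleon()
```

Unlike AutoModel, ChameleonForConditionalGeneration exposes generate(), which is what you want for producing text from image/text prompts.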
