ValueError: The checkpoint you are trying to load has model type `mllama` but Transformers does not recognize this architecture

#39 opened by KevalRx

I get an error when running this code sample from the Transformers library:

# Load model directly
from transformers import AutoProcessor, AutoModelForPreTraining

processor = AutoProcessor.from_pretrained("meta-llama/Llama-3.2-11B-Vision")
model = AutoModelForPreTraining.from_pretrained("meta-llama/Llama-3.2-11B-Vision")

Error:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    992             try:
--> 993                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
    994             except KeyError:

[... 3 intermediate frames hidden ...]
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in __getitem__(self, key)
    694         if key not in self._mapping:
--> 695             raise KeyError(key)
    696         value = self._mapping[key]

KeyError: 'mllama'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
<ipython-input-18-bc2ee643b38e> in <cell line: 2>()
      1 # Load model
----> 2 processor = AutoProcessor.from_pretrained("meta-llama/Llama-3.2-11B-Vision")
      3 model = AutoModelForPreTraining.from_pretrained("meta-llama/Llama-3.2-11B-Vision")

/usr/local/lib/python3.10/dist-packages/transformers/models/auto/processing_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    290             # Otherwise, load config, if it can be loaded.
    291             if not isinstance(config, PretrainedConfig):
--> 292                 config = AutoConfig.from_pretrained(
    293                     pretrained_model_name_or_path, trust_remote_code=trust_remote_code, **kwargs
    294                 )

/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    993                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
    994             except KeyError:
--> 995                 raise ValueError(
    996                     f"The checkpoint you are trying to load has model type `{config_dict['model_type']}` "
    997                     "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type `mllama` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

Were you able to figure it out?

Any updates on this?
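In case anyone else lands here: the error message itself points at the likely cause. The `mllama` architecture only ships in newer releases of Transformers (4.45.0 or later, if I remember correctly), so upgrading the library should resolve this. A minimal sketch of the fix, assuming a standard pip environment:

# Upgrade first, e.g. from a terminal or a notebook cell:
#   pip install --upgrade transformers

import transformers
print(transformers.__version__)  # should be 4.45.0 or newer for `mllama` support

from transformers import AutoProcessor, AutoModelForPreTraining

# The original snippet works unchanged once the architecture is recognized
processor = AutoProcessor.from_pretrained("meta-llama/Llama-3.2-11B-Vision")
model = AutoModelForPreTraining.from_pretrained("meta-llama/Llama-3.2-11B-Vision")

If you are in a notebook that was already running (the traceback looks like Colab), restart the runtime after upgrading so the new version actually gets imported.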
