Problem loading model

#7
by Moghrua - opened

Anyone else seeing this? I'm mystified!

```
A new version of the following files was downloaded from https://huggingface.co/visheratin/MC-LLaVA-3b:
- modeling_llava.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
No module named 'flash_attn'
No module named 'flash_attn'
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-3-3c73d908b946> in <cell line: 4>()
      2 import torch
      3 
----> 4 model = AutoModel.from_pretrained("visheratin/MC-LLaVA-3b", torch_dtype=torch.float16, trust_remote_code=True).to("cuda")

1 frames
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py in register(cls, config_class, model_class, exist_ok)
    581         """
    582         if hasattr(model_class, "config_class") and model_class.config_class != config_class:
--> 583             raise ValueError(
    584                 "The model class you are passing has a `config_class` attribute that is not consistent with the "
    585                 f"config class you passed (model has {model_class.config_class} and you passed {config_class}. Fix "

ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has <class 'transformers_modules.visheratin.MC-LLaVA-3b.06bc212f5ae47e362b98afe3abd929eb603ce9ba.modeling_llava.LlavaConfig'> and you passed <class 'transformers_modules.visheratin.MC-LLaVA-3b.06bc212f5ae47e362b98afe3abd929eb603ce9ba.modeling_llava.LlavaConfig'>. Fix one of those so they match!
```
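(Incidentally, the warning's suggestion to pin a revision would look like the sketch below, a minimal example using the commit hash already shown in the traceback. It keeps transformers from re-downloading new remote code files, though it does not by itself resolve the ValueError.)

```python
# Minimal sketch: pin the remote code to the exact commit from the traceback,
# so transformers stops fetching newer versions of modeling_llava.py.
# Note: this addresses the download warning only, not the ValueError itself.
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "visheratin/MC-LLaVA-3b",
    revision="06bc212f5ae47e362b98afe3abd929eb603ce9ba",  # pinned revision
    torch_dtype=torch.float16,
    trust_remote_code=True,
).to("cuda")
```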

Hi! From what I can tell so far, this is related to the latest release of the Transformers library. If you install one of the earlier versions (e.g., `pip install -q -U transformers==4.37.0`), everything works. I will look deeper into it later.
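A minimal sketch of that workaround, assuming nothing beyond the pinned version above and the loading call from the original post:

```python
# Downgrade transformers to an earlier release first, e.g.:
#   pip install -q -U transformers==4.37.0
import torch
from transformers import AutoModel

# Same loading call as in the original post; it succeeds once the
# older transformers release is installed.
model = AutoModel.from_pretrained(
    "visheratin/MC-LLaVA-3b",
    torch_dtype=torch.float16,
    trust_remote_code=True,
).to("cuda")
```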
