Fix AutoModel not loading model correctly due to config_class inconsistency

#11

This fixes an issue where, when instantiating the model through AutoModel, the config class paired with the model class comes from the transformers library rather than from the model's own module. The mismatch causes instantiation to fail with the error below. See this GitHub issue for more details.

```
Traceback (most recent call last):
    model = AutoModel.from_pretrained("zhihan1996/DNA_bert_6", trust_remote_code=True)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 560, in from_pretrained
    cls.register(config.__class__, model_class, exist_ok=True)
  File ".../lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 586, in register
    raise ValueError(
ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has <class 'transformers.models.bert.configuration_bert.BertConfig'> and you passed <class 'transformers_modules.zhihan1996.DNA_bert_6.55e0c0eb7b734c8b9b77bc083bf89eb6fbda1341.configuration_bert.BertConfig'>. Fix one of those so they match!
```
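The failure mode can be illustrated without the library: `AutoModel.register` checks that the model class's `config_class` attribute is the same class object as the config it resolved from the repo, and the usual fix is to make the repo's modeling code reference its own bundled config (e.g. a relative import) instead of the one shipped with transformers. The sketch below uses hypothetical stand-in classes and a simplified version of the check, not the actual transformers code:

```python
# Stand-ins for the two distinct BertConfig classes involved (hypothetical names).
class LibraryBertConfig:   # plays the role of transformers...configuration_bert.BertConfig
    pass

class RemoteBertConfig:    # plays the role of transformers_modules...configuration_bert.BertConfig
    pass

class RemoteBertModel:
    # The bug: the repo's modeling file imported BertConfig from transformers,
    # so config_class points at the library class, not the repo's own copy.
    config_class = LibraryBertConfig

def register(config_class, model_class):
    # Simplified version of the consistency check in auto_factory.register.
    if model_class.config_class is not config_class:
        raise ValueError("config_class mismatch")

# AutoModel resolves the repo's config class, so registration fails:
try:
    register(RemoteBertConfig, RemoteBertModel)
except ValueError as err:
    print(err)  # config_class mismatch

# The fix: point config_class at the repo-local config
# (in the real repo, via `from .configuration_bert import BertConfig`).
RemoteBertModel.config_class = RemoteBertConfig
register(RemoteBertConfig, RemoteBertModel)  # now consistent, no error
```

With the relative import in place, both sides of the check resolve to the same class from `transformers_modules`, and `AutoModel.from_pretrained(..., trust_remote_code=True)` succeeds.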
