Model not working with transformers main

#83
by ybelkada (HF staff)

Since the attention mask refactor in transformers main, some private methods such as _expand_mask have been removed:

https://github.com/huggingface/transformers/pull/27086
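For context, that refactor consolidated these helpers into transformers.modeling_attn_mask_utils. A minimal sketch of the new location (note that AttentionMaskConverter._expand_mask takes (mask, dtype, tgt_len) and returns an additive float mask, whereas the removed Bloom helper returned a boolean mask, so it is not a drop-in replacement):

    import torch
    from transformers.modeling_attn_mask_utils import AttentionMaskConverter

    # 2D padding mask: 1 = attend, 0 = padding
    attention_mask = torch.tensor([[1, 1, 1, 0]])

    # Expands [batch, src_len] -> [batch, 1, tgt_len, src_len] as an additive float
    # mask: 0.0 for kept positions, dtype-min for masked ones.
    expanded = AttentionMaskConverter._expand_mask(attention_mask, dtype=torch.float32, tgt_len=4)
    print(expanded.shape)  # torch.Size([1, 1, 4, 4])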

Loading the model fails with this error trace:

    from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' 

Simple script to repro:

    from accelerate import init_empty_weights
    from transformers import AutoModelForCausalLM, AutoConfig

    model_id = "mosaicml/mpt-7b"
    # Pin the config and remote modeling code to a fixed revision
    config = AutoConfig.from_pretrained(
        model_id, trust_remote_code=True, revision="72e5f594ce36f9cabfa2a9fd8f58b491eb467ee7"
    )
    # Instantiating on the meta device (no weights) is enough to trigger the import error
    with init_empty_weights():
        model = AutoModelForCausalLM.from_config(
            config, trust_remote_code=True, code_revision="72e5f594ce36f9cabfa2a9fd8f58b491eb467ee7"
        )

This means a patch will be needed here for future transformers versions.
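One possible shape for such a patch, sketched here only as an illustration (not necessarily the actual fix shipped in this repo), is to guard the import and fall back to an inlined copy of the removed Bloom helper on newer transformers versions:

    import torch

    try:
        # Older transformers releases still expose the private Bloom helper
        from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
    except ImportError:
        # Newer releases removed it after the attention mask refactor; fall back to
        # an inlined copy. It expands a [batch, src_len] padding mask to a boolean
        # [batch, 1, tgt_len, src_len] mask where True marks positions to ignore.
        def _expand_mask_bloom(mask: torch.Tensor, tgt_length: int) -> torch.BoolTensor:
            batch_size, src_length = mask.shape
            tgt_length = tgt_length if tgt_length is not None else src_length
            expanded_mask = ~(mask[:, None, None, :].to(torch.bool))
            return expanded_mask.expand(batch_size, 1, tgt_length, src_length)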

Mosaic ML, Inc. org

@ybelkada This should be fixed now. Could you please retest and confirm on your end? Thanks!

Works like a charm now, thanks!

ybelkada changed discussion status to closed

Doesn't work with the git version:
    ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/lre/home/jperez/anaconda3/envs/llms/lib/python3.9/site-packages/transformers/models/bloom/modeling_bloom.py)

    ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/usr/local/lib/python3.9/dist-packages/transformers/models/bloom/modeling_bloom.py)

I got this error with transformers 4.36.1. Any solution?

Mosaic ML, Inc. org

The latest model works. Are you pinned to a revision?
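If you are pinned to an older revision (as in the repro above), one quick check, assuming you just want the current remote code, is to drop the revision/code_revision pins so the updated modeling files are fetched from the Hub:

    from accelerate import init_empty_weights
    from transformers import AutoConfig, AutoModelForCausalLM

    model_id = "mosaicml/mpt-7b"
    # No revision / code_revision pin: the latest remote code is used
    config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
    with init_empty_weights():
        model = AutoModelForCausalLM.from_config(config, trust_remote_code=True)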
