Error when using mpt-7b

#90
by yuzhen17 - opened

After the recent update to the model repository, I encountered the following error while loading the model:

  File "main.py", line 129, in <module>
    main()
  File "main.py", line 103, in main
    model=AutoModelForCausalLM.from_pretrained(
  File "/home/data/yuzhenh17/miniconda3/envs/new-env/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/home/data/yuzhenh17/miniconda3/envs/new-env/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 1057, in from_pretrained
    config_class = get_class_from_dynamic_module(
  File "/home/data/yuzhenh17/miniconda3/envs/new-env/lib/python3.8/site-packages/transformers/dynamic_module_utils.py", line 499, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module.replace(".py", ""))
  File "/home/data/yuzhenh17/miniconda3/envs/new-env/lib/python3.8/site-packages/transformers/dynamic_module_utils.py", line 199, in get_class_in_module
    module = importlib.import_module(module_path)
  File "/home/data/yuzhenh17/miniconda3/envs/new-env/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/homes/.cache/huggingface/modules/transformers_modules/mosaicml/mpt-7b/67cf22a4e6809edb7308dd0a2ae2c1ffb86f4984/configuration_mpt.py", line 5, in <module>
    from .attention import check_alibi_support, is_flash_v1_installed, is_flash_v2_installed
  File "/homes/.cache/huggingface/modules/transformers_modules/mosaicml/mpt-7b/67cf22a4e6809edb7308dd0a2ae2c1ffb86f4984/attention.py", line 60, in <module>
    def scaled_multihead_dot_product_attention(query: torch.Tensor, key: torch.Tensor, value: torch.Tensor, n_heads: int, kv_n_heads: int, past_key_value: Optional[tuple[torch.Tensor, torch.Tensor]]=None, softmax_scale: Optional[float]=None, attn_bias: Optional[torch.Tensor]=None, key_padding_mask: Optional[torch.Tensor]=None, is_causal: bool=False, dropout_p: float=0.0, training: bool=False, needs_weights: bool=False) -> tuple[torch.Tensor, Optional[torch.Tensor], Optional[tuple[torch.Tensor, torch.Tensor]]]:
TypeError: 'type' object is not subscriptable

Hi, you'll need to upgrade to Python 3.9 or newer. The updated model code annotates functions with built-in generics such as `tuple[torch.Tensor, torch.Tensor]` (PEP 585), which are only subscriptable at runtime from Python 3.9 onward; on 3.8 they raise `TypeError: 'type' object is not subscriptable` as soon as the module is imported.
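A minimal sketch of the failure, independent of the model code: on Python 3.8, subscripting the built-in `tuple` inside a signature fails at function-definition time, while the `typing` equivalents still work.

```python
# Reproduces the root cause on Python 3.8.
# PEP 585 built-in generics (tuple[...], list[...], dict[...]) are only
# subscriptable at runtime on Python 3.9+.
from typing import Optional, Tuple

# Fails at definition time on 3.8 with:
#   TypeError: 'type' object is not subscriptable
# def f() -> tuple[int, int]:
#     ...

# Pre-3.9-compatible spelling of the same annotation:
def f() -> Tuple[int, Optional[int]]:
    return (1, None)
```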

Alternatively, keep using a previous commit of the model repository by pinning the `revision` argument, as sketched below.
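A minimal sketch of pinning an older revision; `<earlier-commit-sha>` is a placeholder, not a real hash, so substitute a commit from the mosaicml/mpt-7b history that predates the update.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Pin the repo to a specific commit so later updates to the remote code
# (which require Python 3.9+) are not pulled in.
revision = "<earlier-commit-sha>"  # placeholder: pick a hash from the repo's commit history

model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b",
    revision=revision,
    trust_remote_code=True,  # MPT ships custom modeling code
)
tokenizer = AutoTokenizer.from_pretrained("mosaicml/mpt-7b", revision=revision)
```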

yuzhen17 changed discussion status to closed
