Flash dependency (locks out non-NVIDIA GPUs)

#4
by Thalesian - opened

Title says it all. This should run on a Mac M1 architecture (some configurations have more than 98 GB of VRAM, so they can hold this model). However, flash_attn is imported repeatedly in the code and is hard to work around without it.

Disco Research org

The code is equivalent to the standard Mistral 7B code apart from the MoE integration, which does not touch the attention layers. Flash attention should only be used when the model is loaded with use_flash_attention_2=True; otherwise it should be fine. Have you tried it?

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("~/LLM/mixtral-8x7b-32kseqlen", low_cpu_mem_usage=True, device_map="auto", trust_remote_code=True)

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/miniconda3/envs/textgen/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 553, in from_pretrained
    model_class = get_class_from_dynamic_module(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/miniconda3/envs/textgen/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 487, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
                   ^^^^^^^^^^^^^^^^^^^^^^^
  File "/miniconda3/envs/textgen/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 314, in get_cached_module_file
    modules_needed = check_imports(resolved_module_file)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/miniconda3/envs/textgen/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 179, in check_imports
    raise ImportError(
ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn. Run pip install flash_attn

Not sure why it keeps trying to get flash attention - this hasn't been a problem with other models.
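(For context: the error is raised by transformers' check_imports step, visible in the traceback above, which scans every top-level import in the downloaded modeling file before executing it, so flash_attn is demanded even if flash attention is never enabled. One possible workaround, sketched below and untested on this repo, is to register a stub flash_attn module before calling from_pretrained; the stub is a hypothetical shim that only satisfies the import check and provides no real flash-attention kernels.)

import sys
import types
import importlib.machinery

# Hypothetical stub (assumption, not from this thread): check_imports() calls
# importlib.import_module("flash_attn"), which succeeds if a module with that
# name is already registered in sys.modules. The stub carries a valid __spec__
# but no real kernels; flash attention itself stays disabled on non-CUDA machines.
stub = types.ModuleType("flash_attn")
stub.__spec__ = importlib.machinery.ModuleSpec("flash_attn", loader=None)
sys.modules["flash_attn"] = stub

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "~/LLM/mixtral-8x7b-32kseqlen",  # same local path as in the snippet above
    low_cpu_mem_usage=True,
    device_map="auto",
    trust_remote_code=True,
)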

Title says it all. This should run on a Mac M1 architecture (some configurations have more than 98 GB of VRAM, so they can hold this model). However, flash_attn is imported repeatedly in the code and is hard to work around without it.

When you say some have > 98 GB, is that memory for the CPU or for the GPU?
