Import error - any ideas?

#1
by landmann - opened

I'm getting the following error when trying to load the model:

RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
/home/ubuntu/myenv/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops5zeros4callEN3c108ArrayRefINS2_6SymIntEEENS2_8optionalINS2_10ScalarTypeEEENS6_INS2_6LayoutEEENS6_INS2_6DeviceEEENS6_IbEE
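A quick way to check the likely cause, assuming the undefined ATen symbol means the installed flash-attn wheel was built against a different PyTorch version than the one in this environment (just a sketch):

```python
# Environment check (sketch): an undefined ATen symbol in flash_attn_2_cuda
# usually indicates the flash-attn wheel was compiled against a different
# PyTorch version than the one currently installed.
import importlib.metadata as md
import torch

print("torch:", torch.__version__)
try:
    print("flash-attn:", md.version("flash-attn"))
except md.PackageNotFoundError:
    print("flash-attn: not installed")
```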
qtnx (qresearch org)

The issue is most likely related to your use of flash attention; we have not tested with it.

We will try to fix this at some point.
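In the meantime, a possible workaround (a sketch we have not verified): either reinstall flash-attn against the PyTorch version in your environment (e.g. `pip uninstall flash-attn` followed by `pip install flash-attn --no-build-isolation`), or tell transformers not to use Flash Attention 2 at load time so the broken extension is never imported. The repo id below is a placeholder, not the actual model id.

```python
# Workaround sketch: load without Flash Attention 2 so flash_attn_2_cuda is
# never imported. "your-org/your-model" is a placeholder for the actual repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-model"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="eager",  # or "sdpa"; skips the flash_attn import path
)
```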

qtnx changed discussion status to closed
