Error due to torch upgrade
#2
opened by John6666
It seems to crash just from loading the model with trust_remote_code=True. Something related to flash_attn seems to have changed.
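For reference, the failure reproduces outside transformers entirely; a minimal sketch of my understanding (the assumption being that the prebuilt flash-attn wheel no longer matches the upgraded torch):

# flash-attn ships a compiled extension (flash_attn_2_cuda) built against
# one specific torch version; after the torch upgrade the bare import fails.
import flash_attn  # ModuleNotFoundError: No module named 'flash_attn_2_cuda'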
Edit:
Same on Large.
runtime error
Exit code: 1. Reason: nloaded from https://huggingface.co/gokaygokay/Florence-2-Flux-Large:
- configuration_florence2.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
A new version of the following files was downloaded from https://huggingface.co/gokaygokay/Florence-2-Flux-Large:
- modeling_florence2.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
Traceback (most recent call last):
  File "/home/user/app/app.py", line 28, in <module>
    from tagger.fl2flux import predict_tags_fl2_flux
  File "/home/user/app/tagger/fl2flux.py", line 11, in <module>
    fl_model = AutoModelForCausalLM.from_pretrained('gokaygokay/Florence-2-Flux-Large', trust_remote_code=True).to("cpu").eval()
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 553, in from_pretrained
    model_class = get_class_from_dynamic_module(
  File "/usr/local/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 552, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module, force_reload=force_download)
  File "/usr/local/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 249, in get_class_in_module
    module_spec.loader.exec_module(module)
  File "/home/user/.cache/huggingface/modules/transformers_modules/gokaygokay/Florence-2-Flux-Large/ed3af3df6d23d9f25d1dd4ce05ba95bb43c37209/modeling_florence2.py", line 63, in <module>
    from flash_attn.bert_padding import index_first_axis, pad_input, unpad_input  # noqa
  File "/usr/local/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/usr/local/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ModuleNotFoundError: No module named 'flash_attn_2_cuda'
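As the download warnings above suggest, you can at least pin the repo to a revision so newly pushed remote-code files are not fetched on restart. A minimal sketch, reusing the commit hash visible in the traceback path (note this only freezes the remote code; it does not repair the broken flash_attn install itself):

from transformers import AutoModelForCausalLM

# Pin the model repo to a known commit so new versions of
# configuration_florence2.py / modeling_florence2.py are not auto-downloaded.
fl_model = AutoModelForCausalLM.from_pretrained(
    "gokaygokay/Florence-2-Flux-Large",
    revision="ed3af3df6d23d9f25d1dd4ce05ba95bb43c37209",  # commit from the traceback above
    trust_remote_code=True,
).to("cpu").eval()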
The Space's requirements.txt:
torch
git+https://github.com/huggingface/diffusers.git
git+https://github.com/huggingface/transformers.git
git+https://github.com/huggingface/peft.git
git+https://github.com/huggingface/accelerate.git
sentencepiece
torchvision
huggingface_hub
timm
einops
controlnet_aux
kornia
numpy
opencv-python
deepspeed
mediapipe
openai==1.37.0
translatepy
unidecode
Edit: pinning torch==2.4.0 in requirements.txt works around the crash.
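That is, the unpinned torch line in the requirements above becomes a pinned one. torchvision likely needs the matching pin as well; 0.19.0 is the torchvision release paired with torch 2.4.0 (my assumption, not stated in this thread):

torch==2.4.0
torchvision==0.19.0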
Even Microsoft's original Florence-2 crashes. The latest torch release seems to break transformers' FSDP, Diffusers' custom pipelines, and more. I knew it was buggy, but I was caught off guard.
John6666 changed discussion title from "Error due to library spec change?" to "Error due to torch upgrade"
Thanks.
I'm going to close this one, because if Microsoft's own model is affected too, it's probably impossible to fix on the user side.
Probably wiser to wait for torch to stabilize.
John6666 changed discussion status to closed