Sentence transformers error: 'XLMRobertaFlashConfig' is not iterable
#54
opened by thereiter
Hello.
The latest commit (82b68d65447e856379361e8d4054b21f63c97dbc) breaks loading jinaai/jina-embeddings-v3 via sentence-transformers:
Traceback (most recent call last):
File "/home/work/venv/lib/python3.12/site-packages/sentence_transformers/SentenceTransformer.py", line 1728, in _load_sbert_model
module = module_class(model_name_or_path, cache_dir=cache_folder, backend=self.backend, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/.cache/huggingface/modules/transformers_modules/jinaai/jina-embeddings-v3/30996fea06f69ecd8382ee4f11e29acaf6b5405e/custom_st.py", line 82, in __init__
self.auto_model = AutoModel.from_pretrained(model_name_or_path, config=self.config, cache_dir=cache_dir, **model_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/venv/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 559, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/.cache/huggingface/modules/transformers_modules/jinaai/xlm-roberta-flash-implementation/82b68d65447e856379361e8d4054b21f63c97dbc/modeling_lora.py", line 340, in from_pretrained
if key in config:
^^^^^^^^^^^^^
TypeError: argument of type 'XLMRobertaFlashConfig' is not iterable
During handling of the above exception, another exception occurred:
...
...
File "/home/work/cache/huggingface/modules/transformers_modules/jinaai/jina-embeddings-v3/30996fea06f69ecd8382ee4f11e29acaf6b5405e/custom_st.py", line 220, in load
with open(sbert_config_path) as fIn:
^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'jinaai/jina-embeddings-v3/sentence_xlnet_config.json'
Rolling back to the previous commit or commenting out the changes in modeling_lora.py fixed this issue for me.
same issue here
if key in config: => if key in config.to_dict():
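For anyone hitting this on an older transformers release, here is a minimal sketch of the difference (the key name is arbitrary and only illustrative; the actual keys checked in modeling_lora.py may differ):

```python
# Minimal sketch, not the actual modeling_lora.py code: per the reply below,
# `key in config` only works on newer transformers releases (4.46+), while
# config.to_dict() returns a plain dict that supports membership tests on
# every version.
from transformers import PretrainedConfig

config = PretrainedConfig(hidden_size=768)  # arbitrary illustrative attribute
key = "hidden_size"

# Portable: membership test on a plain dict.
if key in config.to_dict():
    print(f"{key} is set in the config")

# On older transformers the check below raises
# TypeError: argument of type 'PretrainedConfig' is not iterable
# if key in config:
#     ...
```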
Thanks for reporting the issue. The `if key in config` check works fine with transformers 4.46+, and I assumed it would also work with previous versions. I merged the PR so it's fixed now. Closing the issue.
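A quick way to confirm the fix, assuming sentence-transformers pulls the latest remote code from the Hub (jina-embeddings-v3 needs trust_remote_code=True):

```python
from sentence_transformers import SentenceTransformer

# Loads the custom modules from the Hub; this is the call that previously
# failed with "'XLMRobertaFlashConfig' is not iterable".
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)

# Simple smoke test: encode a sentence and check the embedding shape.
embeddings = model.encode(["Hello world"])
print(embeddings.shape)
```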
jupyterjazz changed discussion status to closed