Hi @jzx03,
I am not aware of any of the related code being broken right now, but Transformers is a big and fast-growing library, and such things have happened before. It would be best if you posted a reproducible example as an issue, showing the difference in behavior between 4.49 and some earlier version. See how I did that in https://github.com/huggingface/transformers/issues/29525.
Also check the relevant tests, like https://github.com/huggingface/transformers/blob/be37d34f44ff1bc928e59ffb8a30adecab8835a8/tests/models/llama/test_modeling_llama.py#L811, to see whether they still pass, or possibly extend the tests to cover the failure that you discovered.
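For what it's worth, the kind of repro that is easiest to act on is a short, self-contained script along these lines. This is only a sketch: the model ID below is a placeholder, and the exact mask construction should of course match whatever triggers the regression for you. Run it under 4.49 and under the last version that worked, and include both outputs in the issue.

```python
# Sketch of a minimal repro: compare logits from an ordinary 2D attention mask
# with an equivalent hand-built 4D float mask. MODEL_ID is a placeholder --
# substitute whatever checkpoint shows the problem on your side.
import torch
import transformers
from transformers import AutoModelForCausalLM

MODEL_ID = "hf-internal-testing/tiny-random-LlamaForCausalLM"  # placeholder

model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.float32)
model.eval()

# A few arbitrary token ids; small values are valid for any vocabulary.
input_ids = torch.tensor([[1, 2, 3, 4, 5, 6, 7, 8]])
seq_len = input_ids.shape[1]

with torch.no_grad():
    # Baseline: ordinary 2D padding mask (all tokens attended).
    logits_2d = model(input_ids, attention_mask=torch.ones_like(input_ids)).logits

    # Equivalent 4D float mask of shape (batch, 1, q_len, kv_len):
    # 0.0 where attention is allowed, dtype-min where it is not.
    causal = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))
    mask_4d = torch.zeros(1, 1, seq_len, seq_len, dtype=model.dtype)
    mask_4d = mask_4d.masked_fill(~causal, torch.finfo(model.dtype).min)
    logits_4d = model(input_ids, attention_mask=mask_4d).logits

# On a version where 4D masks behave correctly, this difference should be ~0.
print("transformers", transformers.__version__)
print("max abs logits diff:", (logits_2d - logits_4d).abs().max().item())
```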
Ruslan S.
Commented on "4D masks support in Transformers" (6 days ago):
No, I don't have such a script readily available. Please follow the links in section 2 above, or search elsewhere.