Schema (36 fields)
url: string
repository_url: string
labels_url: string
comments_url: string
events_url: string
html_url: string
id: int64
node_id: string
number: int64
title: string
user: dict
labels: list
state: string
locked: bool
assignee: dict
assignees: list
milestone: null
comments: list
created_at: timestamp[ms]
updated_at: timestamp[ms]
closed_at: timestamp[ms]
author_association: string
type: dict
active_lock_reason: null
draft: bool
pull_request: dict
body: string
closed_by: dict
reactions: dict
timeline_url: string
performed_via_github_app: null
state_reason: string
sub_issues_summary: dict
issue_dependencies_summary: dict
is_pull_request: bool
is_closed: bool
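The records below can be checked against this schema with plain Python. A minimal stdlib sketch (not tied to any particular dataset loader) that validates a record dict against a subset of the fields listed above; the type mapping is an assumption, and fields typed `null` in the schema (every sampled value was null) are left unchecked:

```python
# Loose Python-type equivalents for a subset of the schema fields above.
# This is an illustrative validator, not part of any real loader's API.
SCHEMA = {
    "url": str, "id": int, "number": int, "title": str,
    "state": str, "locked": bool, "is_pull_request": bool, "is_closed": bool,
}

def validate(record: dict) -> list:
    """Return a list of (field, reason) problems; an empty list means the record passes."""
    problems = []
    for field, expected in SCHEMA.items():
        if field not in record:
            problems.append((field, "missing"))
        elif not isinstance(record[field], expected):
            problems.append((field, f"expected {expected.__name__}"))
    return problems
```

A record missing `id`, for instance, comes back flagged as `("id", "missing")`.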
Row 1 — PR #40038
url: https://api.github.com/repos/huggingface/transformers/issues/40038
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40038/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40038/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40038/events
html_url: https://github.com/huggingface/transformers/pull/40038
id: 3,304,431,456
node_id: PR_kwDOCUB6oc6iyXF0
number: 40,038
title: Fix error on importing unavailable torch.distributed
user: { "login": "m-gallus", "id": 141938080, "node_id": "U_kgDOCHXNoA", "avatar_url": "https://avatars.githubusercontent.com/u/141938080?v=4", "gravatar_id": "", "url": "https://api.github.com/users/m-gallus", "html_url": "https://github.com/m-gallus", "followers_url": "https://api.github.com/users/m-gallus/...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T15:30:07
updated_at: 2025-08-29T08:24:38
closed_at: 2025-08-12T14:30:51
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40038", "html_url": "https://github.com/huggingface/transformers/pull/40038", "diff_url": "https://github.com/huggingface/transformers/pull/40038.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40038.patch", "merged_at...
body: # What does this PR do? Currently PyTorch on Windows builds don't support distributed module and when users attempt to use transformers or a lib dependent on it, it fails with the following error: ``` File "C:\Users\Micha\AppData\Local\Programs\Python\Python312\Lib\site-packages\torch\distributed\tensor\__init__.py"...
closed_by: { "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40038/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40038/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
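The PR body above describes imports failing on Windows builds of PyTorch, where the distributed module is unavailable. A common defensive pattern is to probe availability before importing anything that depends on it; this is a sketch of that general pattern, not the actual diff from PR #40038:

```python
import importlib.util

# Probe for torch.distributed instead of letting the import crash at module
# load time (as described for Windows builds of PyTorch in the PR body).
def distributed_available() -> bool:
    # find_spec returns None when the package cannot be found at all
    if importlib.util.find_spec("torch") is None:
        return False
    try:
        import torch.distributed as dist
        return dist.is_available()
    except ImportError:
        # torch is installed, but this build ships no usable distributed module
        return False
```

Callers can then branch on `distributed_available()` rather than wrapping every use site in try/except.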
Row 2 — PR #40037
url: https://api.github.com/repos/huggingface/transformers/issues/40037
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40037/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40037/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40037/events
html_url: https://github.com/huggingface/transformers/pull/40037
id: 3,304,338,186
node_id: PR_kwDOCUB6oc6iyC9f
number: 40,037
title: fix `notification_service.py` about `time_spent`
user: { "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T14:54:28
updated_at: 2025-08-08T15:11:17
closed_at: 2025-08-08T15:11:16
author_association: COLLABORATOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40037", "html_url": "https://github.com/huggingface/transformers/pull/40037", "diff_url": "https://github.com/huggingface/transformers/pull/40037.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40037.patch", "merged_at...
body: # What does this PR do? We extract the information from test report file like in `1 passed in 7.89s`, but at some point, we do > ... ["time_spent"] += time_spent[1:-1] which remove the first digit. I am not guility!
closed_by: { "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40037/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40037/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
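The slicing bug described in the PR body above is easy to reproduce. The report fragment looks like `"7.89s"`; `[1:-1]` drops the leading digit along with the trailing `"s"`, whereas `[:-1]` removes only the unit suffix:

```python
# Reproduction of the off-by-one described in the PR body for
# notification_service.py (illustrative values, not the real report).
time_spent = "7.89s"

buggy = time_spent[1:-1]   # ".89"  -- the first digit is lost too
fixed = time_spent[:-1]    # "7.89" -- only the trailing "s" is stripped
```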
Row 3 — PR #40036
url: https://api.github.com/repos/huggingface/transformers/issues/40036
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40036/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40036/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40036/events
html_url: https://github.com/huggingface/transformers/pull/40036
id: 3,304,334,242
node_id: PR_kwDOCUB6oc6iyCFH
number: 40,036
title: fix `notification_service.py` about `time_spent`
user: { "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T14:53:15
updated_at: 2025-08-08T15:06:10
closed_at: 2025-08-08T14:53:49
author_association: COLLABORATOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40036", "html_url": "https://github.com/huggingface/transformers/pull/40036", "diff_url": "https://github.com/huggingface/transformers/pull/40036.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40036.patch", "merged_at...
body: null
closed_by: { "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40036/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40036/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
Row 4 — PR #40035
url: https://api.github.com/repos/huggingface/transformers/issues/40035
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40035/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40035/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40035/events
html_url: https://github.com/huggingface/transformers/pull/40035
id: 3,304,307,403
node_id: PR_kwDOCUB6oc6ix8WV
number: 40,035
title: Remove deprecated cache-related objects
user: { "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T14:43:52
updated_at: 2025-08-11T08:30:16
closed_at: 2025-08-11T08:30:14
author_association: MEMBER
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40035", "html_url": "https://github.com/huggingface/transformers/pull/40035", "diff_url": "https://github.com/huggingface/transformers/pull/40035.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40035.patch", "merged_at...
body: # What does this PR do? As per the title! Both the CacheConfigs and the KeyValuesWrapper were scheduled for deprecation next release
closed_by: { "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40035/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40035/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
Row 5 — Issue #40034
url: https://api.github.com/repos/huggingface/transformers/issues/40034
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40034/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40034/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40034/events
html_url: https://github.com/huggingface/transformers/issues/40034
id: 3,304,232,120
node_id: I_kwDOCUB6oc7E8pS4
number: 40,034
title: `plamo-2-1b` broken on latest main
user: { "login": "tdoublep", "id": 7945038, "node_id": "MDQ6VXNlcjc5NDUwMzg=", "avatar_url": "https://avatars.githubusercontent.com/u/7945038?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tdoublep", "html_url": "https://github.com/tdoublep", "followers_url": "https://api.github.com/users/tdoub...
labels: [ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T14:15:53
updated_at: 2025-09-02T14:24:07
closed_at: 2025-09-02T14:24:07
author_association: NONE
type: null
active_lock_reason: null
draft: null
pull_request: null
body: ### System Info ``` - `transformers` version: 4.56.0.dev0 - Platform: Linux-5.15.0-143-generic-x86_64-with-glibc2.35 - Python version: 3.11.10 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.4.5 - Accelerate version: 1.0.1 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version ...
closed_by: { "login": "tdoublep", "id": 7945038, "node_id": "MDQ6VXNlcjc5NDUwMzg=", "avatar_url": "https://avatars.githubusercontent.com/u/7945038?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tdoublep", "html_url": "https://github.com/tdoublep", "followers_url": "https://api.github.com/users/tdoub...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40034/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40034/timeline
performed_via_github_app: null
state_reason: completed
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
issue_dependencies_summary: { "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
is_pull_request: false
is_closed: true
Row 6 — PR #40033
url: https://api.github.com/repos/huggingface/transformers/issues/40033
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40033/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40033/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40033/events
html_url: https://github.com/huggingface/transformers/pull/40033
id: 3,304,133,087
node_id: PR_kwDOCUB6oc6ixWWn
number: 40,033
title: Add model card for MobileViT
user: { "login": "Shivamjan", "id": 177928568, "node_id": "U_kgDOCpr5eA", "avatar_url": "https://avatars.githubusercontent.com/u/177928568?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Shivamjan", "html_url": "https://github.com/Shivamjan", "followers_url": "https://api.github.com/users/Shivam...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T13:49:15
updated_at: 2025-08-13T07:54:05
closed_at: 2025-08-12T18:37:00
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40033", "html_url": "https://github.com/huggingface/transformers/pull/40033", "diff_url": "https://github.com/huggingface/transformers/pull/40033.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40033.patch", "merged_at...
body: # What does this PR do? This PR adds a detailed and beginner-friendly model card for MobileViT to the Hugging Face Transformers documentation. The previous model card was minimal and lacked clear explanations about the model architecture. This model retains several elements from the earlier version, as they remain a...
closed_by: { "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/ste...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40033/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40033/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
Row 7 — Issue #40032
url: https://api.github.com/repos/huggingface/transformers/issues/40032
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40032/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40032/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40032/events
html_url: https://github.com/huggingface/transformers/issues/40032
id: 3,304,119,793
node_id: I_kwDOCUB6oc7E8N3x
number: 40,032
title: Add Padding Strategy to DataCollatorForLanguageModeling
user: { "login": "rjgleaton", "id": 70818603, "node_id": "MDQ6VXNlcjcwODE4NjAz", "avatar_url": "https://avatars.githubusercontent.com/u/70818603?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rjgleaton", "html_url": "https://github.com/rjgleaton", "followers_url": "https://api.github.com/users/...
labels: [ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T13:46:00
updated_at: 2025-08-08T13:46:00
closed_at: null
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: null
pull_request: null
body: ### Feature request Add the ability to specify a padding strategy when using `DataCollatorForLanguageModeling` ### Motivation This is a minor QOL enhancement that makes the collator more consistent with others in the library. The main use case would probably be padding to max length to make memory usage more stable ...
closed_by: null
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40032/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40032/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
issue_dependencies_summary: { "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
is_pull_request: false
is_closed: false
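The feature request above asks for pad-to-max-length behavior so batch shapes, and therefore memory usage, stay constant. A stdlib-only sketch of the behavior being requested; this is illustrative and deliberately not the transformers collator API:

```python
# Pad every example in a batch to a fixed max_length so every batch has the
# same tensor shape (the "max_length" padding strategy the issue requests).
# pad_id=0 is a hypothetical pad token id for illustration.
def pad_batch(batch, max_length, pad_id=0):
    padded = []
    for ids in batch:
        ids = ids[:max_length]                            # truncate overlong examples
        padded.append(ids + [pad_id] * (max_length - len(ids)))
    return padded
```

With dynamic padding each batch is only as wide as its longest member; padding to a fixed length trades some wasted compute for stable memory use across batches.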
Row 8 — Issue #40031
url: https://api.github.com/repos/huggingface/transformers/issues/40031
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40031/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40031/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40031/events
html_url: https://github.com/huggingface/transformers/issues/40031
id: 3,304,096,734
node_id: I_kwDOCUB6oc7E8IPe
number: 40,031
title: [gpt-oss] MoE routing bug in the mxfp4 implementation (in distributed setting)
user: { "login": "kitft", "id": 58341426, "node_id": "MDQ6VXNlcjU4MzQxNDI2", "avatar_url": "https://avatars.githubusercontent.com/u/58341426?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kitft", "html_url": "https://github.com/kitft", "followers_url": "https://api.github.com/users/kitft/follow...
labels: [ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" }, { "id": 3817266200, ...
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T13:39:35
updated_at: 2025-08-19T14:35:15
closed_at: 2025-08-19T14:35:15
author_association: NONE
type: null
active_lock_reason: null
draft: null
pull_request: null
body: ### System Info ``` - `transformers` version: 4.55.0 - Platform: Linux-6.11.11+-x86_64-with-glibc2.35 - Python version: 3.11.13 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.6.1 - Accelerate version: 1.10.0 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerato...
closed_by: { "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40031/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40031/timeline
performed_via_github_app: null
state_reason: completed
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
issue_dependencies_summary: { "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
is_pull_request: false
is_closed: true
Row 9 — PR #40030
url: https://api.github.com/repos/huggingface/transformers/issues/40030
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40030/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40030/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40030/events
html_url: https://github.com/huggingface/transformers/pull/40030
id: 3,304,077,401
node_id: PR_kwDOCUB6oc6ixJ32
number: 40,030
title: Update boxes expectations for OWLViT test
user: { "login": "mihaidusmanu", "id": 7276224, "node_id": "MDQ6VXNlcjcyNzYyMjQ=", "avatar_url": "https://avatars.githubusercontent.com/u/7276224?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mihaidusmanu", "html_url": "https://github.com/mihaidusmanu", "followers_url": "https://api.github.com...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T13:33:15
updated_at: 2025-08-12T14:44:29
closed_at: 2025-08-12T14:03:38
author_association: NONE
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40030", "html_url": "https://github.com/huggingface/transformers/pull/40030", "diff_url": "https://github.com/huggingface/transformers/pull/40030.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40030.patch", "merged_at...
body: # What does this PR do? While working on a related PR #40023, I noticed some OWLViT tests were failing on main on my local machine. Seems to be some minor differences in the predicted boxes so I simply updated them to the latest values (the other outputs seem correct). Not sure if this is hardware-related or somet...
closed_by: { "login": "qubvel", "id": 31920396, "node_id": "MDQ6VXNlcjMxOTIwMzk2", "avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qubvel", "html_url": "https://github.com/qubvel", "followers_url": "https://api.github.com/users/qubvel/fo...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40030/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40030/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
Row 10 — PR #40029
url: https://api.github.com/repos/huggingface/transformers/issues/40029
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40029/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40029/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40029/events
html_url: https://github.com/huggingface/transformers/pull/40029
id: 3,303,842,319
node_id: PR_kwDOCUB6oc6iwYAK
number: 40,029
title: Revert FA2 kwargs construction
user: { "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T12:16:20
updated_at: 2025-08-12T08:48:35
closed_at: 2025-08-12T08:48:35
author_association: MEMBER
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40029", "html_url": "https://github.com/huggingface/transformers/pull/40029", "diff_url": "https://github.com/huggingface/transformers/pull/40029.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40029.patch", "merged_at...
body: # What does this PR do? As per title, discussed internally
closed_by: { "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40029/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40029/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
Row 11 — Issue #40028
url: https://api.github.com/repos/huggingface/transformers/issues/40028
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40028/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40028/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40028/events
html_url: https://github.com/huggingface/transformers/issues/40028
id: 3,303,756,086
node_id: I_kwDOCUB6oc7E61E2
number: 40,028
title: `TypeError: 'builtins.safe_open' object is not iterable` in `load_pytorch_state_dict_in_tf2_model `
user: { "login": "harupy", "id": 17039389, "node_id": "MDQ6VXNlcjE3MDM5Mzg5", "avatar_url": "https://avatars.githubusercontent.com/u/17039389?v=4", "gravatar_id": "", "url": "https://api.github.com/users/harupy", "html_url": "https://github.com/harupy", "followers_url": "https://api.github.com/users/harupy/fo...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T11:51:28
updated_at: 2025-08-13T13:00:58
closed_at: 2025-08-08T12:57:43
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: null
pull_request: null
body: ``` Traceback (most recent call last): File "/home/runner/work/dev/dev/tests/transformers/helper.py", line 321, in <module> prefetch_models() File "/home/runner/work/dev/dev/tests/transformers/helper.py", line 317, in prefetch_models func() File "/home/runner/work/dev/dev/tests/helper_functions.py", line ...
closed_by: { "login": "harupy", "id": 17039389, "node_id": "MDQ6VXNlcjE3MDM5Mzg5", "avatar_url": "https://avatars.githubusercontent.com/u/17039389?v=4", "gravatar_id": "", "url": "https://api.github.com/users/harupy", "html_url": "https://github.com/harupy", "followers_url": "https://api.github.com/users/harupy/fo...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40028/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40028/timeline
performed_via_github_app: null
state_reason: completed
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
issue_dependencies_summary: { "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
is_pull_request: false
is_closed: true
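The `TypeError` in the issue above comes from iterating a safetensors `safe_open` handle directly: the handle exposes `keys()` and `get_tensor()` but does not implement iteration. A small stdlib stand-in (not the real safetensors class) reproduces the failure mode and the fix:

```python
# Minimal stand-in for safetensors' safe_open handle: it offers keys() and
# get_tensor() but is deliberately not iterable, like the real object.
class KeysOnlyHandle:
    def __init__(self, data):
        self._data = data

    def keys(self):
        return list(self._data)

    def get_tensor(self, name):
        return self._data[name]

handle = KeysOnlyHandle({"weight": [1.0], "bias": [0.0]})
try:
    names = [k for k in handle]        # TypeError: object is not iterable
except TypeError:
    names = list(handle.keys())        # iterate keys() explicitly instead
```

The same shape applies to the real API: loop over `f.keys()` rather than over `f` itself.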
Row 12 — PR #40027
url: https://api.github.com/repos/huggingface/transformers/issues/40027
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40027/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40027/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40027/events
html_url: https://github.com/huggingface/transformers/pull/40027
id: 3,303,727,480
node_id: PR_kwDOCUB6oc6iv_7v
number: 40,027
title: Add amd runners to run-slow command
user: { "login": "ivarflakstad", "id": 69173633, "node_id": "MDQ6VXNlcjY5MTczNjMz", "avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ivarflakstad", "html_url": "https://github.com/ivarflakstad", "followers_url": "https://api.github.c...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T11:42:12
updated_at: 2025-10-16T22:44:45
closed_at: 2025-10-16T22:44:45
author_association: MEMBER
type: null
active_lock_reason: null
draft: true
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40027", "html_url": "https://github.com/huggingface/transformers/pull/40027", "diff_url": "https://github.com/huggingface/transformers/pull/40027.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40027.patch", "merged_at...
body: null
closed_by: { "login": "ivarflakstad", "id": 69173633, "node_id": "MDQ6VXNlcjY5MTczNjMz", "avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ivarflakstad", "html_url": "https://api.github.c...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40027/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40027/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
Row 13 — PR #40026
url: https://api.github.com/repos/huggingface/transformers/issues/40026
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40026/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40026/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40026/events
html_url: https://github.com/huggingface/transformers/pull/40026
id: 3,303,429,863
node_id: PR_kwDOCUB6oc6ivFff
number: 40,026
title: Bnb failling tests
user: { "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCybe...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T09:58:31
updated_at: 2025-08-08T14:28:02
closed_at: 2025-08-08T14:28:00
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40026", "html_url": "https://github.com/huggingface/transformers/pull/40026", "diff_url": "https://github.com/huggingface/transformers/pull/40026.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40026.patch", "merged_at...
body: # What does this PR do?
closed_by: { "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40026/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40026/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
Row 14 — PR #40025
url: https://api.github.com/repos/huggingface/transformers/issues/40025
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40025/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40025/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40025/events
html_url: https://github.com/huggingface/transformers/pull/40025
id: 3,303,411,344
node_id: PR_kwDOCUB6oc6ivB2z
number: 40,025
title: [GLM4V] fix vision placeholder mask
user: { "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
labels: [ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T09:51:44
updated_at: 2025-08-11T06:36:20
closed_at: 2025-08-11T06:36:20
author_association: MEMBER
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40025", "html_url": "https://github.com/huggingface/transformers/pull/40025", "diff_url": "https://github.com/huggingface/transformers/pull/40025.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40025.patch", "merged_at...
body: # What does this PR do? As per title, GLM uses only `image_token_id` to denote both inputs and doesn't do mixed input inference
closed_by: { "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40025/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40025/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
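The GLM4V fix above hinges on the model using a single `image_token_id` to mark where vision features go in the token sequence. The placeholder mask is then just "this position holds that id"; a stdlib sketch with a hypothetical token id (not GLM4V's real value):

```python
# Build a vision-placeholder mask over input ids: True wherever the sequence
# holds the image token id. IMAGE_TOKEN_ID and the ids below are illustrative.
IMAGE_TOKEN_ID = 9999

input_ids = [101, 9999, 9999, 2023, 102]
placeholder_mask = [tok == IMAGE_TOKEN_ID for tok in input_ids]
```

Since GLM denotes both image and video inputs with the same id, one equality check covers both; models with separate image and video ids would need a mask per modality.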
Row 15 — PR #40024
url: https://api.github.com/repos/huggingface/transformers/issues/40024
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/40024/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/40024/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/40024/events
html_url: https://github.com/huggingface/transformers/pull/40024
id: 3,303,332,372
node_id: PR_kwDOCUB6oc6iuykD
number: 40,024
title: Fix missing None default values for Gemma3n model in get_placeholder_mask (#39991)
user: { "login": "Znerual", "id": 22452386, "node_id": "MDQ6VXNlcjIyNDUyMzg2", "avatar_url": "https://avatars.githubusercontent.com/u/22452386?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Znerual", "html_url": "https://github.com/Znerual", "followers_url": "https://api.github.com/users/Znerua...
labels: [ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-08-08T09:28:44
updated_at: 2025-08-08T15:09:06
closed_at: 2025-08-08T10:43:42
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/40024", "html_url": "https://github.com/huggingface/transformers/pull/40024", "diff_url": "https://github.com/huggingface/transformers/pull/40024.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40024.patch", "merged_at...
body: # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a d...
closed_by: { "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/40024/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/40024/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
https://api.github.com/repos/huggingface/transformers/issues/40023
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40023/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40023/comments
https://api.github.com/repos/huggingface/transformers/issues/40023/events
https://github.com/huggingface/transformers/pull/40023
3,303,209,499
PR_kwDOCUB6oc6iuZm2
40,023
Add support for SDPA for OWLViT and OWLv2
{ "login": "mihaidusmanu", "id": 7276224, "node_id": "MDQ6VXNlcjcyNzYyMjQ=", "avatar_url": "https://avatars.githubusercontent.com/u/7276224?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mihaidusmanu", "html_url": "https://github.com/mihaidusmanu", "followers_url": "https://api.github.com...
[]
open
false
null
[]
null
[]
2025-08-08T08:51:31
2025-08-08T13:37:43
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40023", "html_url": "https://github.com/huggingface/transformers/pull/40023", "diff_url": "https://github.com/huggingface/transformers/pull/40023.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40023.patch", "merged_at...
# What does this PR do? Add support for SDPA (scaled_dot_product_attention) for efficient attention to OWLViT and OWLv2 models. The previous code is used in the eager attention implementation. I roughly followed the SigLIP code for inspiration. Note that we could do a larger refactor to use the is_causal flag, bu...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40023/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40023/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/40022
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40022/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40022/comments
https://api.github.com/repos/huggingface/transformers/issues/40022/events
https://github.com/huggingface/transformers/pull/40022
3,302,740,419
PR_kwDOCUB6oc6is6lS
40,022
fix: resolve dropout type error in DogeDecoder
{ "login": "wubingheng111", "id": 123940419, "node_id": "U_kgDOB2MuQw", "avatar_url": "https://avatars.githubusercontent.com/u/123940419?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wubingheng111", "html_url": "https://github.com/wubingheng111", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
null
[]
2025-08-08T05:58:41
2025-08-12T13:12:21
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40022", "html_url": "https://github.com/huggingface/transformers/pull/40022", "diff_url": "https://github.com/huggingface/transformers/pull/40022.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40022.patch", "merged_at...
Fix: #40079 Fixed a TypeError where dropout() received a tuple instead of a Tensor in DogeDecoderLayer when using a MoE configuration. The MLP forward method returns a tuple (hidden_states, router_logits) for MoE layers, but the subsequent dropout operation expected only a Tensor. - Extract hidden_states from the tuple before ...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40022/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/40022/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/40021
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40021/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40021/comments
https://api.github.com/repos/huggingface/transformers/issues/40021/events
https://github.com/huggingface/transformers/pull/40021
3,302,733,188
PR_kwDOCUB6oc6is5Ln
40,021
[fix] batch inference for llava_onevision
{ "login": "cyr0930", "id": 14088169, "node_id": "MDQ6VXNlcjE0MDg4MTY5", "avatar_url": "https://avatars.githubusercontent.com/u/14088169?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyr0930", "html_url": "https://github.com/cyr0930", "followers_url": "https://api.github.com/users/cyr093...
[]
closed
false
null
[]
null
[]
2025-08-08T05:54:35
2025-08-12T10:58:06
2025-08-12T09:01:01
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40021", "html_url": "https://github.com/huggingface/transformers/pull/40021", "diff_url": "https://github.com/huggingface/transformers/pull/40021.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40021.patch", "merged_at...
# What does this PR do? This PR restores the batch inference feature for llava_onevision and fixes some content in the documentation. Before this commit, putting single-image examples and multi-image examples in the same batch did not work correctly, as an iterable was not consumed properly. It also makes the test case cover ...
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40021/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40021/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40020
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40020/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40020/comments
https://api.github.com/repos/huggingface/transformers/issues/40020/events
https://github.com/huggingface/transformers/issues/40020
3,302,526,470
I_kwDOCUB6oc7E2I4G
40,020
accelerate==1.10.0 and safetensors==0.6.1 are incompatible with transformers==4.53.1
{ "login": "AniruddhaHumane", "id": 22525550, "node_id": "MDQ6VXNlcjIyNTI1NTUw", "avatar_url": "https://avatars.githubusercontent.com/u/22525550?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AniruddhaHumane", "html_url": "https://github.com/AniruddhaHumane", "followers_url": "https://api...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-08T03:53:57
2025-09-15T08:02:53
2025-09-15T08:02:53
NONE
null
null
null
null
### System Info ```Shell accelerate==1.10.0 safetensors==0.6.1 transformers==4.53.1 ``` ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own tas...
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url"...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40020/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40020/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/40019
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40019/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40019/comments
https://api.github.com/repos/huggingface/transformers/issues/40019/events
https://github.com/huggingface/transformers/pull/40019
3,302,471,087
PR_kwDOCUB6oc6isFFm
40,019
Feat/add gpt oss sequence classification
{ "login": "robin-ede", "id": 115729295, "node_id": "U_kgDOBuXjjw", "avatar_url": "https://avatars.githubusercontent.com/u/115729295?v=4", "gravatar_id": "", "url": "https://api.github.com/users/robin-ede", "html_url": "https://github.com/robin-ede", "followers_url": "https://api.github.com/users/robin-...
[]
closed
false
null
[]
null
[]
2025-08-08T03:11:50
2025-08-15T19:10:02
2025-08-15T19:10:02
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40019", "html_url": "https://github.com/huggingface/transformers/pull/40019", "diff_url": "https://github.com/huggingface/transformers/pull/40019.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40019.patch", "merged_at...
# What does this PR do? This PR implements `GptOssForSequenceClassification` for text classification tasks. **Key Changes:** - ✅ **New Model Class**: Added `GptOssForSequenceClassification` inheriting from `GenericForSequenceClassification` and `GptOssPreTrainedModel` - ✅ **Consistent Implementation**: Implemen...
{ "login": "robin-ede", "id": 115729295, "node_id": "U_kgDOBuXjjw", "avatar_url": "https://avatars.githubusercontent.com/u/115729295?v=4", "gravatar_id": "", "url": "https://api.github.com/users/robin-ede", "html_url": "https://github.com/robin-ede", "followers_url": "https://api.github.com/users/robin-...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40019/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40019/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40018
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40018/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40018/comments
https://api.github.com/repos/huggingface/transformers/issues/40018/events
https://github.com/huggingface/transformers/issues/40018
3,302,280,724
I_kwDOCUB6oc7E1M4U
40,018
need GptOssForSequenceClassification
{ "login": "cold-eye", "id": 48782821, "node_id": "MDQ6VXNlcjQ4NzgyODIx", "avatar_url": "https://avatars.githubusercontent.com/u/48782821?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cold-eye", "html_url": "https://github.com/cold-eye", "followers_url": "https://api.github.com/users/col...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
closed
false
null
[]
null
[]
2025-08-08T00:46:48
2025-08-19T11:54:39
2025-08-19T11:54:39
NONE
null
null
null
null
### Feature request need GptOssForSequenceClassification ### Motivation for text classification ### Your contribution nothing
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.githu...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40018/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40018/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/40017
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40017/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40017/comments
https://api.github.com/repos/huggingface/transformers/issues/40017/events
https://github.com/huggingface/transformers/issues/40017
3,302,045,171
I_kwDOCUB6oc7E0TXz
40,017
Major issues with transformers version causing rubbish generations with Gemma3 family using vllm
{ "login": "AbdelrahmanHagrass", "id": 48356468, "node_id": "MDQ6VXNlcjQ4MzU2NDY4", "avatar_url": "https://avatars.githubusercontent.com/u/48356468?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AbdelrahmanHagrass", "html_url": "https://github.com/AbdelrahmanHagrass", "followers_url": "ht...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-07T22:18:29
2025-08-08T12:22:56
2025-08-08T12:22:56
NONE
null
null
null
null
### System Info . ### Who can help? Major issues with the transformers version when used with Gemma3 family models on vllm. The output generations are incorrect and not usable; this appears to be due to an incompatibility or regression in transformers. Please advise on compatible versions or fixes. Example: Generations are rubb...
{ "login": "AbdelrahmanHagrass", "id": 48356468, "node_id": "MDQ6VXNlcjQ4MzU2NDY4", "avatar_url": "https://avatars.githubusercontent.com/u/48356468?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AbdelrahmanHagrass", "html_url": "https://github.com/AbdelrahmanHagrass", "followers_url": "ht...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40017/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40017/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/40016
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40016/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40016/comments
https://api.github.com/repos/huggingface/transformers/issues/40016/events
https://github.com/huggingface/transformers/pull/40016
3,301,927,221
PR_kwDOCUB6oc6iqX1G
40,016
[WIP] Fix naive for loops for MoE models resulting in sub 20% downstream MFU for training with trl, e.t.c (Qwen3, Deepseek V3, Ernie 4.5, GLM 4.5, Dots1)
{ "login": "perinmclaughlin", "id": 7523023, "node_id": "MDQ6VXNlcjc1MjMwMjM=", "avatar_url": "https://avatars.githubusercontent.com/u/7523023?v=4", "gravatar_id": "", "url": "https://api.github.com/users/perinmclaughlin", "html_url": "https://github.com/perinmclaughlin", "followers_url": "https://api.g...
[]
closed
false
null
[]
null
[]
2025-08-07T21:30:05
2025-09-04T03:27:04
2025-08-13T01:21:25
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40016", "html_url": "https://github.com/huggingface/transformers/pull/40016", "diff_url": "https://github.com/huggingface/transformers/pull/40016.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40016.patch", "merged_at...
# What does this PR do? Fixes the longstanding issues with MoE training being bottlenecked by naive for loops for models with > 8 experts. This can result in sub 20% MFU in downstream training frameworks such as unsloth and trl. (Qwen3 30B on H800) There have been several downstream issues already from training ...
{ "login": "perinmclaughlin", "id": 7523023, "node_id": "MDQ6VXNlcjc1MjMwMjM=", "avatar_url": "https://avatars.githubusercontent.com/u/7523023?v=4", "gravatar_id": "", "url": "https://api.github.com/users/perinmclaughlin", "html_url": "https://github.com/perinmclaughlin", "followers_url": "https://api.g...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40016/reactions", "total_count": 6, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 6, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40016/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40015
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40015/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40015/comments
https://api.github.com/repos/huggingface/transformers/issues/40015/events
https://github.com/huggingface/transformers/pull/40015
3,301,811,984
PR_kwDOCUB6oc6ip_2W
40,015
Update expected output values after #39885 (part 2)
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
[]
closed
false
null
[]
null
[]
2025-08-07T20:42:51
2025-08-07T20:56:28
2025-08-07T20:52:53
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40015", "html_url": "https://github.com/huggingface/transformers/pull/40015", "diff_url": "https://github.com/huggingface/transformers/pull/40015.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40015.patch", "merged_at...
# What does this PR do? The changes are expected. I also changed the atol and rtol to 1e-4
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40015/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40015/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40014
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40014/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40014/comments
https://api.github.com/repos/huggingface/transformers/issues/40014/events
https://github.com/huggingface/transformers/pull/40014
3,301,573,959
PR_kwDOCUB6oc6ipPWW
40,014
docs: fix duplication in 'en/optimizers.md'
{ "login": "luckyvickyricky", "id": 75977640, "node_id": "MDQ6VXNlcjc1OTc3NjQw", "avatar_url": "https://avatars.githubusercontent.com/u/75977640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/luckyvickyricky", "html_url": "https://github.com/luckyvickyricky", "followers_url": "https://api...
[]
closed
false
null
[]
null
[]
2025-08-07T19:01:56
2025-08-07T20:28:44
2025-08-07T20:28:43
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40014", "html_url": "https://github.com/huggingface/transformers/pull/40014", "diff_url": "https://github.com/huggingface/transformers/pull/40014.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40014.patch", "merged_at...
# What does this PR do? This PR fixes a minor duplication in the code: - "gradient_checkpointing=True" ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transf...
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/ste...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40014/reactions", "total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40014/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40013
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40013/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40013/comments
https://api.github.com/repos/huggingface/transformers/issues/40013/events
https://github.com/huggingface/transformers/pull/40013
3,301,568,753
PR_kwDOCUB6oc6ipOQj
40,013
pin torchcodec==0.5.0 for now with torch 2.7.1 on daily CI
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
[]
closed
false
null
[]
null
[]
2025-08-07T18:59:37
2025-08-07T21:05:41
2025-08-07T21:05:39
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40013", "html_url": "https://github.com/huggingface/transformers/pull/40013", "diff_url": "https://github.com/huggingface/transformers/pull/40013.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40013.patch", "merged_at...
# What does this PR do? Will unpin this weekend.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40013/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40013/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40012
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40012/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40012/comments
https://api.github.com/repos/huggingface/transformers/issues/40012/events
https://github.com/huggingface/transformers/pull/40012
3,301,539,912
PR_kwDOCUB6oc6ipIc7
40,012
unpin torch<2.8 on circleci
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
[]
closed
false
null
[]
null
[]
2025-08-07T18:47:12
2025-08-07T19:31:19
2025-08-07T19:31:17
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40012", "html_url": "https://github.com/huggingface/transformers/pull/40012", "diff_url": "https://github.com/huggingface/transformers/pull/40012.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40012.patch", "merged_at...
# What does this PR do? revert #39951, as torchcodec 2.6.0 is released. (without this unpin, we will get errors with torchcodec 2.6.0 + torch 2.7.1)
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40012/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40012/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40011
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40011/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40011/comments
https://api.github.com/repos/huggingface/transformers/issues/40011/events
https://github.com/huggingface/transformers/pull/40011
3,301,522,407
PR_kwDOCUB6oc6ipE6d
40,011
🌐 [i18n-KO] Translated `optimizers.md` to Korean
{ "login": "chelsseeey", "id": 152389483, "node_id": "U_kgDOCRVHaw", "avatar_url": "https://avatars.githubusercontent.com/u/152389483?v=4", "gravatar_id": "", "url": "https://api.github.com/users/chelsseeey", "html_url": "https://github.com/chelsseeey", "followers_url": "https://api.github.com/users/che...
[]
closed
false
null
[]
null
[]
2025-08-07T18:39:32
2025-08-13T17:00:47
2025-08-13T17:00:47
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40011", "html_url": "https://github.com/huggingface/transformers/pull/40011", "diff_url": "https://github.com/huggingface/transformers/pull/40011.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40011.patch", "merged_at...
# What does this PR do? Translated the `optimizers.md` file of the documentation to Korean. Thank you in advance for your review. Part of https://github.com/huggingface/transformers/issues/20179 ## Before reviewing - [X] Check for missing / redundant translations (번역 누락/중복 검사) - [X] Grammar Check (맞춤법 검사) ...
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/ste...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40011/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40011/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40010
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40010/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40010/comments
https://api.github.com/repos/huggingface/transformers/issues/40010/events
https://github.com/huggingface/transformers/issues/40010
3,301,318,909
I_kwDOCUB6oc7ExiD9
40,010
Customizable Logit Warping Strategies for Generation
{ "login": "PamelaBha", "id": 219210686, "node_id": "U_kgDODRDjvg", "avatar_url": "https://avatars.githubusercontent.com/u/219210686?v=4", "gravatar_id": "", "url": "https://api.github.com/users/PamelaBha", "html_url": "https://github.com/PamelaBha", "followers_url": "https://api.github.com/users/Pamela...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-08-07T17:24:05
2025-10-13T20:43:45
null
NONE
null
null
null
null
### Feature request Improve the generate() API by supporting custom, declarative logit warping strategies. Make it easier for users to plug in standard and custom LogitProcessors via configuration or arguments without needing to subclass or dive into internals. ### Motivation The generation module already support...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40010/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40010/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/40009
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40009/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40009/comments
https://api.github.com/repos/huggingface/transformers/issues/40009/events
https://github.com/huggingface/transformers/pull/40009
3,301,296,073
PR_kwDOCUB6oc6ioUkT
40,009
feat: extract rev in attn_implementation kernels via @
{ "login": "drbh", "id": 9896130, "node_id": "MDQ6VXNlcjk4OTYxMzA=", "avatar_url": "https://avatars.githubusercontent.com/u/9896130?v=4", "gravatar_id": "", "url": "https://api.github.com/users/drbh", "html_url": "https://github.com/drbh", "followers_url": "https://api.github.com/users/drbh/followers", ...
[]
closed
false
null
[]
null
[]
2025-08-07T17:16:09
2025-08-11T19:14:13
2025-08-11T19:14:13
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40009", "html_url": "https://github.com/huggingface/transformers/pull/40009", "diff_url": "https://github.com/huggingface/transformers/pull/40009.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40009.patch", "merged_at...
This PR adds the ability to specify kernel revisions via the `@` symbol in the `attn_implementation` in `AutoModelForCausalLM.from_pretrained` ### Example usage ```bash uv run repro.py ``` ```python # /// script # requires-python = ">=3.12" # dependencies = [ # "accelerate", # "torch==2.7....
{ "login": "drbh", "id": 9896130, "node_id": "MDQ6VXNlcjk4OTYxMzA=", "avatar_url": "https://avatars.githubusercontent.com/u/9896130?v=4", "gravatar_id": "", "url": "https://api.github.com/users/drbh", "html_url": "https://github.com/drbh", "followers_url": "https://api.github.com/users/drbh/followers", ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40009/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40009/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40008
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40008/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40008/comments
https://api.github.com/repos/huggingface/transformers/issues/40008/events
https://github.com/huggingface/transformers/pull/40008
3,301,222,762
PR_kwDOCUB6oc6ioFPH
40,008
Fixes for EncoderDecoderCache
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-o...
[]
closed
false
null
[]
null
[]
2025-08-07T16:50:59
2025-08-18T15:51:06
2025-08-18T15:51:06
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40008", "html_url": "https://github.com/huggingface/transformers/pull/40008", "diff_url": "https://github.com/huggingface/transformers/pull/40008.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40008.patch", "merged_at...
The `EncoderDecoderCache` object is not compatible with `nn.DataParallel` because it expects to be instantiated with 2 arguments. This probably was not an issue before because the legacy cache was a tuple of tuples (thus compatible with `nn.DataParallel.gather`), but it is one now. This PR proposes a fix by changing the...
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40008/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40008/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40007
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40007/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40007/comments
https://api.github.com/repos/huggingface/transformers/issues/40007/events
https://github.com/huggingface/transformers/pull/40007
3,301,163,932
PR_kwDOCUB6oc6in4v1
40,007
🚨 Use lru_cache for sine pos embeddings MaskFormer
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/use...
[ { "id": 6886428489, "node_id": "LA_kwDOCUB6oc8AAAABmnaPSQ", "url": "https://api.github.com/repos/huggingface/transformers/labels/run-slow", "name": "run-slow", "color": "E1D519", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-08-07T16:31:55
2025-08-13T17:05:23
2025-08-13T17:05:23
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40007", "html_url": "https://github.com/huggingface/transformers/pull/40007", "diff_url": "https://github.com/huggingface/transformers/pull/40007.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40007.patch", "merged_at...
# What does this PR do? Since sine pos embeddings only depend on a fixed feature shape (for these models at least), this seems like a free speedup. Changed Maskformer and related models only for now, as its sine pos embedding module is used in sam2, but happy to extend this to other models in the library! The speed gai...
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40007/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40007/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40006
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40006/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40006/comments
https://api.github.com/repos/huggingface/transformers/issues/40006/events
https://github.com/huggingface/transformers/pull/40006
3,301,127,047
PR_kwDOCUB6oc6inw89
40,006
Fix PerceptionLM image preprocessing for non-tiled image input.
{ "login": "shuminghu", "id": 2934295, "node_id": "MDQ6VXNlcjI5MzQyOTU=", "avatar_url": "https://avatars.githubusercontent.com/u/2934295?v=4", "gravatar_id": "", "url": "https://api.github.com/users/shuminghu", "html_url": "https://github.com/shuminghu", "followers_url": "https://api.github.com/users/sh...
[]
closed
false
null
[]
null
[]
2025-08-07T16:17:59
2025-08-12T08:41:01
2025-08-12T08:40:22
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40006", "html_url": "https://github.com/huggingface/transformers/pull/40006", "diff_url": "https://github.com/huggingface/transformers/pull/40006.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40006.patch", "merged_at...
Add support for vanilla images that only have C,H,W dims but no tiles dim. This is a non-default image shape in PLM, but it's useful in demos and on low-resource devices. @zucchini-nlp
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40006/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40006/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40005
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40005/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40005/comments
https://api.github.com/repos/huggingface/transformers/issues/40005/events
https://github.com/huggingface/transformers/pull/40005
3,300,952,108
PR_kwDOCUB6oc6inMLi
40,005
[fix] Pass video inputs to plm
{ "login": "4g", "id": 664530, "node_id": "MDQ6VXNlcjY2NDUzMA==", "avatar_url": "https://avatars.githubusercontent.com/u/664530?v=4", "gravatar_id": "", "url": "https://api.github.com/users/4g", "html_url": "https://github.com/4g", "followers_url": "https://api.github.com/users/4g/followers", "followi...
[]
closed
false
null
[]
null
[]
2025-08-07T15:24:09
2025-08-08T04:24:38
2025-08-07T15:25:07
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40005", "html_url": "https://github.com/huggingface/transformers/pull/40005", "diff_url": "https://github.com/huggingface/transformers/pull/40005.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40005.patch", "merged_at...
1. video_inputs were not being passed to plm, resulting in the same results for all videos. 2. This was breaking the official example. More @ https://github.com/huggingface/transformers/issues/40004 3. Tested locally with different videos # Pass video inputs to plm Fixes # [40004](https://github.com/huggingface/tra...
{ "login": "4g", "id": 664530, "node_id": "MDQ6VXNlcjY2NDUzMA==", "avatar_url": "https://avatars.githubusercontent.com/u/664530?v=4", "gravatar_id": "", "url": "https://api.github.com/users/4g", "html_url": "https://github.com/4g", "followers_url": "https://api.github.com/users/4g/followers", "followi...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40005/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40005/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40004
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40004/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40004/comments
https://api.github.com/repos/huggingface/transformers/issues/40004/events
https://github.com/huggingface/transformers/issues/40004
3,300,810,266
I_kwDOCUB6oc7Evl4a
40,004
video_inputs are not passed to perception_lm
{ "login": "4g", "id": 664530, "node_id": "MDQ6VXNlcjY2NDUzMA==", "avatar_url": "https://avatars.githubusercontent.com/u/664530?v=4", "gravatar_id": "", "url": "https://api.github.com/users/4g", "html_url": "https://github.com/4g", "followers_url": "https://api.github.com/users/4g/followers", "followi...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-07T14:44:49
2025-08-07T18:22:08
2025-08-07T18:22:08
NONE
null
null
null
null
### System Info - `transformers` version: 4.55.0 - Platform: Linux-6.14.0-27-generic-x86_64-with-glibc2.39 - Python version: 3.11.9 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.4.3 - Accelerate version: 1.6.0 - Accelerate config: not found - DeepSpeed version: 0.17.1 - PyTorch version (accelerator?): 2....
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40004/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40004/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/40003
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40003/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40003/comments
https://api.github.com/repos/huggingface/transformers/issues/40003/events
https://github.com/huggingface/transformers/pull/40003
3,300,791,406
PR_kwDOCUB6oc6imqgX
40,003
fix: remove CHAT_TEMPLATE import in tests for deepseek-vl
{ "login": "geetu040", "id": 90601662, "node_id": "MDQ6VXNlcjkwNjAxNjYy", "avatar_url": "https://avatars.githubusercontent.com/u/90601662?v=4", "gravatar_id": "", "url": "https://api.github.com/users/geetu040", "html_url": "https://github.com/geetu040", "followers_url": "https://api.github.com/users/gee...
[]
closed
false
null
[]
null
[]
2025-08-07T14:39:43
2025-08-07T16:20:05
2025-08-07T16:19:37
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40003", "html_url": "https://github.com/huggingface/transformers/pull/40003", "diff_url": "https://github.com/huggingface/transformers/pull/40003.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40003.patch", "merged_at...
# What does this PR do? Fixes #39966 This PR removes the `CHAT_TEMPLATE` imports from `test_processing_deepseek_vl.py` and `test_processing_deepseek_vl_hybrid.py`. These imports were referencing weight conversion scripts that are not included in the PyPI distribution, which causes the tests to fail in the v4.55.0...
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40003/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40003/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40002
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40002/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40002/comments
https://api.github.com/repos/huggingface/transformers/issues/40002/events
https://github.com/huggingface/transformers/pull/40002
3,300,782,135
PR_kwDOCUB6oc6imoho
40,002
[`Flash Attention`] Fix flash attention integration
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/follow...
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-08-07T14:37:09
2025-08-12T20:36:52
2025-08-12T15:24:10
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40002", "html_url": "https://github.com/huggingface/transformers/pull/40002", "diff_url": "https://github.com/huggingface/transformers/pull/40002.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40002.patch", "merged_at...
The current flash attention implementation has several issues: - `test_flash_attn_2_equivalence` was failing in all models (I think) - Fa kwargs no longer had `max_length_q/k` - Varlen was handled incorrectly, leading to errors by always opting into this path even when it did not apply (pos ids but no attention...
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/follow...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40002/reactions", "total_count": 8, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 6, "eyes": 2 }
https://api.github.com/repos/huggingface/transformers/issues/40002/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/40001
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40001/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40001/comments
https://api.github.com/repos/huggingface/transformers/issues/40001/events
https://github.com/huggingface/transformers/issues/40001
3,300,732,592
I_kwDOCUB6oc7EvS6w
40,001
Possible wrong init call
{ "login": "zhizhongli-sony", "id": 165778602, "node_id": "U_kgDOCeGUqg", "avatar_url": "https://avatars.githubusercontent.com/u/165778602?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhizhongli-sony", "html_url": "https://github.com/zhizhongli-sony", "followers_url": "https://api.githu...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-07T14:22:51
2025-10-05T08:03:00
2025-10-05T08:03:00
NONE
null
null
null
null
### System Info - `transformers` version: 4.52.4 - Platform: Linux-5.15.0-94-generic-x86_64-with-glibc2.35 - Python version: 3.11.11 - Huggingface_hub version: 0.30.1 - Safetensors version: 0.5.3 - Accelerate version: 1.4.0 - Accelerate config: not found - DeepSpeed version: 0.16.4 - PyTorch version (GPU?): 2.6.0+c...
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url"...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40001/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/40001/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/40000
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/40000/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/40000/comments
https://api.github.com/repos/huggingface/transformers/issues/40000/events
https://github.com/huggingface/transformers/pull/40000
3,300,715,180
PR_kwDOCUB6oc6imaWT
40,000
Fix an annoying flaky test
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
[]
2025-08-07T14:18:08
2025-08-12T13:19:51
2025-08-08T08:32:51
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/40000", "html_url": "https://github.com/huggingface/transformers/pull/40000", "diff_url": "https://github.com/huggingface/transformers/pull/40000.diff", "patch_url": "https://github.com/huggingface/transformers/pull/40000.patch", "merged_at...
# What does this PR do? As per title, trying to download the image from `picsum` times out if we run the test many times, so I moved the image to HF hub. AFAIK we can make many subsequent requests to hub downloads when testing
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/40000/reactions", "total_count": 6, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 3, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/40000/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39999
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39999/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39999/comments
https://api.github.com/repos/huggingface/transformers/issues/39999/events
https://github.com/huggingface/transformers/pull/39999
3,300,702,196
PR_kwDOCUB6oc6imXiP
39,999
allow TP to work in ND-parallel with fsdp cpu ram efficient loading
{ "login": "winglian", "id": 381258, "node_id": "MDQ6VXNlcjM4MTI1OA==", "avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4", "gravatar_id": "", "url": "https://api.github.com/users/winglian", "html_url": "https://github.com/winglian", "followers_url": "https://api.github.com/users/winglia...
[]
open
false
null
[]
null
[]
2025-08-07T14:14:48
2025-08-25T08:56:54
null
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39999", "html_url": "https://github.com/huggingface/transformers/pull/39999", "diff_url": "https://github.com/huggingface/transformers/pull/39999.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39999.patch", "merged_at...
# What does this PR do? For N-D parallelism, when using FSDP2+TP with cpu_ram_efficient_loading, we have to specify the device_map as "meta" for non-rank0 processes. Additionally, even though we already know what device it will ultimately end up on through the device_mesh, we don't want to change the device_map for ...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39999/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39999/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39998
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39998/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39998/comments
https://api.github.com/repos/huggingface/transformers/issues/39998/events
https://github.com/huggingface/transformers/pull/39998
3,300,690,487
PR_kwDOCUB6oc6imU-O
39,998
Raising error when quantizing a quantized model
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCybe...
[]
closed
false
null
[]
null
[]
2025-08-07T14:12:01
2025-08-07T20:37:27
2025-08-07T20:37:26
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39998", "html_url": "https://github.com/huggingface/transformers/pull/39998", "diff_url": "https://github.com/huggingface/transformers/pull/39998.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39998.patch", "merged_at...
# What does this PR do? This PR raises an error if we try to quantize an already-quantized model using a different quantization method
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMar...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39998/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39998/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39997
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39997/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39997/comments
https://api.github.com/repos/huggingface/transformers/issues/39997/events
https://github.com/huggingface/transformers/pull/39997
3,300,687,325
PR_kwDOCUB6oc6imUTb
39,997
make sure position_ids are passed in for causal mask creation for gpt-oss
{ "login": "winglian", "id": 381258, "node_id": "MDQ6VXNlcjM4MTI1OA==", "avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4", "gravatar_id": "", "url": "https://api.github.com/users/winglian", "html_url": "https://github.com/winglian", "followers_url": "https://api.github.com/users/winglia...
[]
open
false
null
[]
null
[]
2025-08-07T14:11:06
2025-08-12T14:34:29
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39997", "html_url": "https://github.com/huggingface/transformers/pull/39997", "diff_url": "https://github.com/huggingface/transformers/pull/39997.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39997.patch", "merged_at...
# What does this PR do? Packing won't work with gpt-oss since it doesn't respect the position ids. See https://github.com/huggingface/transformers/pull/39194 <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the t...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39997/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39997/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39996
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39996/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39996/comments
https://api.github.com/repos/huggingface/transformers/issues/39996/events
https://github.com/huggingface/transformers/pull/39996
3,300,663,541
PR_kwDOCUB6oc6imPPf
39,996
Tie weights recursively on all submodels
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
[]
2025-08-07T14:04:37
2025-08-08T14:03:19
2025-08-08T14:03:16
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39996", "html_url": "https://github.com/huggingface/transformers/pull/39996", "diff_url": "https://github.com/huggingface/transformers/pull/39996.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39996.patch", "merged_at...
# What does this PR do? Fixes https://github.com/huggingface/transformers/issues/39900. Current code calls custom `_tie_weights` recursively on all `modules`, but does not recursively tie the embeddings or the encoder/decoder parts
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39996/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39996/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39995
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39995/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39995/comments
https://api.github.com/repos/huggingface/transformers/issues/39995/events
https://github.com/huggingface/transformers/pull/39995
3,300,582,975
PR_kwDOCUB6oc6il-E9
39,995
Fix consistency
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
[]
2025-08-07T13:44:48
2025-08-07T13:57:52
2025-08-07T13:52:41
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39995", "html_url": "https://github.com/huggingface/transformers/pull/39995", "diff_url": "https://github.com/huggingface/transformers/pull/39995.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39995.patch", "merged_at...
# What does this PR do? cc @qubvel for viz after your PR!
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39995/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39995/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39994
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39994/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39994/comments
https://api.github.com/repos/huggingface/transformers/issues/39994/events
https://github.com/huggingface/transformers/pull/39994
3,300,575,657
PR_kwDOCUB6oc6il8gg
39,994
chore: Add type hints to import_utils.py module
{ "login": "wirthual", "id": 2640499, "node_id": "MDQ6VXNlcjI2NDA0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/2640499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wirthual", "html_url": "https://github.com/wirthual", "followers_url": "https://api.github.com/users/wirth...
[]
closed
false
null
[]
null
[]
2025-08-07T13:42:35
2025-08-21T11:20:21
2025-08-21T11:20:20
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39994", "html_url": "https://github.com/huggingface/transformers/pull/39994", "diff_url": "https://github.com/huggingface/transformers/pull/39994.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39994.patch", "merged_at...
# What does this PR do? Add type hints to `import_utils.py` Based on these [docs](https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-library-stubs-or-py-typed-marker), this change should avoid errors like: ``` infinity_emb/inference/loading_strategy.py:10: error: Skipping analyzing "transformers.ut...
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.githu...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39994/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39994/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39993
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39993/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39993/comments
https://api.github.com/repos/huggingface/transformers/issues/39993/events
https://github.com/huggingface/transformers/pull/39993
3,300,566,912
PR_kwDOCUB6oc6il6m3
39,993
Default to dequantize if cpu in device_map for mxfp4
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCybe...
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-08-07T13:40:10
2025-08-12T14:48:54
2025-08-12T14:48:52
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39993", "html_url": "https://github.com/huggingface/transformers/pull/39993", "diff_url": "https://github.com/huggingface/transformers/pull/39993.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39993.patch", "merged_at...
# What does this PR do? For the mxfp4 gpt-oss model, if no CUDA device is available and the model is prequantized, we default to dequantizing it after raising a warning so that it can run on CPU
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39993/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39993/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39992
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39992/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39992/comments
https://api.github.com/repos/huggingface/transformers/issues/39992/events
https://github.com/huggingface/transformers/issues/39992
3,300,511,232
I_kwDOCUB6oc7Euc4A
39,992
[gpt-oss] Transform checkpoint from safetensors to state dict
{ "login": "fingertap", "id": 7274689, "node_id": "MDQ6VXNlcjcyNzQ2ODk=", "avatar_url": "https://avatars.githubusercontent.com/u/7274689?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fingertap", "html_url": "https://github.com/fingertap", "followers_url": "https://api.github.com/users/fi...
[]
closed
false
null
[]
null
[]
2025-08-07T13:24:06
2025-09-15T08:02:55
2025-09-15T08:02:55
NONE
null
null
null
null
Yesterday I was working on gpt-oss. However, loading the weights gave me trouble. 
For models like Qwen, I did things like this: 1. Create the model on the meta device 2. Shard it with FSDP2 so it can fit in memory 3. On each GPU, read weights from safetensors in a generator style to save memory 4. Chunk the weights and cop...
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url"...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39992/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39992/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39991
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39991/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39991/comments
https://api.github.com/repos/huggingface/transformers/issues/39991/events
https://github.com/huggingface/transformers/issues/39991
3,300,416,534
I_kwDOCUB6oc7EuFwW
39,991
Gemma3n get_placeholder_mask issue
{ "login": "Znerual", "id": 22452386, "node_id": "MDQ6VXNlcjIyNDUyMzg2", "avatar_url": "https://avatars.githubusercontent.com/u/22452386?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Znerual", "html_url": "https://github.com/Znerual", "followers_url": "https://api.github.com/users/Znerua...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-07T12:58:50
2025-08-08T10:45:48
2025-08-08T10:45:48
CONTRIBUTOR
null
null
null
null
### System Info - `transformers` version: 4.55.0 - Platform: Linux-6.8.0-71-generic-x86_64-with-glibc2.39 - Python version: 3.12.2 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.6.1 - Accelerate version: 1.9.0 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator...
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39991/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39991/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39990
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39990/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39990/comments
https://api.github.com/repos/huggingface/transformers/issues/39990/events
https://github.com/huggingface/transformers/pull/39990
3,300,322,108
PR_kwDOCUB6oc6ilFd5
39,990
Update expected output values after #39885 (part 1)
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
[]
closed
false
null
[]
null
[]
2025-08-07T12:31:42
2025-08-07T14:00:30
2025-08-07T14:00:29
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39990", "html_url": "https://github.com/huggingface/transformers/pull/39990", "diff_url": "https://github.com/huggingface/transformers/pull/39990.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39990.patch", "merged_at...
# What does this PR do? The changes are expected.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39990/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39990/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39989
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39989/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39989/comments
https://api.github.com/repos/huggingface/transformers/issues/39989/events
https://github.com/huggingface/transformers/pull/39989
3,300,320,571
PR_kwDOCUB6oc6ilFIQ
39,989
Higgs modules_to_not_convert standardization
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCybe...
[]
closed
false
null
[]
null
[]
2025-08-07T12:31:10
2025-08-08T08:23:01
2025-08-08T08:22:59
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39989", "html_url": "https://github.com/huggingface/transformers/pull/39989", "diff_url": "https://github.com/huggingface/transformers/pull/39989.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39989.patch", "merged_at...
# What does this PR do? Standardizes the way higgs quantization uses the `module_to_not_convert` attribute
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCybe...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39989/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39989/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39988
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39988/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39988/comments
https://api.github.com/repos/huggingface/transformers/issues/39988/events
https://github.com/huggingface/transformers/pull/39988
3,300,266,198
PR_kwDOCUB6oc6ik5QI
39,988
Update Glm4V processor and add tests
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
[]
2025-08-07T12:14:49
2025-08-12T11:40:55
2025-08-12T11:40:55
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39988", "html_url": "https://github.com/huggingface/transformers/pull/39988", "diff_url": "https://github.com/huggingface/transformers/pull/39988.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39988.patch", "merged_at...
# What does this PR do? As per title, the processor has no tests and currently user-defined `size` isn't being used when processing images/videos
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39988/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39988/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39987
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39987/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39987/comments
https://api.github.com/repos/huggingface/transformers/issues/39987/events
https://github.com/huggingface/transformers/pull/39987
3,300,227,743
PR_kwDOCUB6oc6ikwvz
39,987
Add a VGGT (Visual Geometry Grounded Transformer) model compatible with huggingface transformers
{ "login": "panzhizhen111", "id": 176186831, "node_id": "U_kgDOCoBlzw", "avatar_url": "https://avatars.githubusercontent.com/u/176186831?v=4", "gravatar_id": "", "url": "https://api.github.com/users/panzhizhen111", "html_url": "https://github.com/panzhizhen111", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
null
[]
2025-08-07T12:03:24
2025-08-11T02:32:58
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39987", "html_url": "https://github.com/huggingface/transformers/pull/39987", "diff_url": "https://github.com/huggingface/transformers/pull/39987.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39987.patch", "merged_at...
This PR adds the VGGT (Visual Geometry Grounded Transformer) model to the Hugging Face Transformers library. It includes: - `VggtConfig`, `VggtModel` - Integration into the Auto classes (`AutoConfig`, `AutoModel`). - Basic unit tests for configuration, save/load, and forward pass. - [WIP]Model documentation in `do...
{ "login": "panzhizhen111", "id": 176186831, "node_id": "U_kgDOCoBlzw", "avatar_url": "https://avatars.githubusercontent.com/u/176186831?v=4", "gravatar_id": "", "url": "https://api.github.com/users/panzhizhen111", "html_url": "https://github.com/panzhizhen111", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39987/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39987/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39986
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39986/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39986/comments
https://api.github.com/repos/huggingface/transformers/issues/39986/events
https://github.com/huggingface/transformers/pull/39986
3,300,006,647
PR_kwDOCUB6oc6ij_ll
39,986
fix: resolve triton version check compatibility on windows
{ "login": "Tsumugii24", "id": 124921491, "node_id": "U_kgDOB3Imkw", "avatar_url": "https://avatars.githubusercontent.com/u/124921491?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Tsumugii24", "html_url": "https://github.com/Tsumugii24", "followers_url": "https://api.github.com/users/Tsu...
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-08-07T10:56:21
2025-08-11T06:53:47
2025-08-11T06:53:20
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39986", "html_url": "https://github.com/huggingface/transformers/pull/39986", "diff_url": "https://github.com/huggingface/transformers/pull/39986.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39986.patch", "merged_at...
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCybe...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39986/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39986/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39985
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39985/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39985/comments
https://api.github.com/repos/huggingface/transformers/issues/39985/events
https://github.com/huggingface/transformers/issues/39985
3,299,994,871
I_kwDOCUB6oc7Esez3
39,985
Triton version check compatibility on windows
{ "login": "Tsumugii24", "id": 124921491, "node_id": "U_kgDOB3Imkw", "avatar_url": "https://avatars.githubusercontent.com/u/124921491?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Tsumugii24", "html_url": "https://github.com/Tsumugii24", "followers_url": "https://api.github.com/users/Tsu...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-07T10:52:38
2025-08-11T06:53:21
2025-08-11T06:53:21
CONTRIBUTOR
null
null
null
null
### System Info - `transformers` version: 4.55.0 - Platform: Windows-11-10.0.26100-SP0 - Python version: 3.12.11 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.6.1 - Accelerate version: 1.9.0 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator?): 2.8.0+cu129 ...
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCybe...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39985/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39985/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39984
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39984/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39984/comments
https://api.github.com/repos/huggingface/transformers/issues/39984/events
https://github.com/huggingface/transformers/pull/39984
3,299,877,630
PR_kwDOCUB6oc6ijjbL
39,984
Fix setting attention for multimodal models
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
[]
2025-08-07T10:13:10
2025-08-19T09:35:12
2025-08-19T09:35:12
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39984", "html_url": "https://github.com/huggingface/transformers/pull/39984", "diff_url": "https://github.com/huggingface/transformers/pull/39984.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39984.patch", "merged_at...
# What does this PR do? Fixes setting the attention implementation for multimodal models as a dict. Currently it fails because `self._attn_implementation` is not defined at the point where we try to `get` it. We need to set the attention to `None` if the key is not found in the dict, which is the default attention
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39984/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39984/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39983
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39983/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39983/comments
https://api.github.com/repos/huggingface/transformers/issues/39983/events
https://github.com/huggingface/transformers/issues/39983
3,299,842,712
I_kwDOCUB6oc7Er5qY
39,983
CVE fix for v4.37.2 and v4.38.0
{ "login": "Aman-Surkar", "id": 99606590, "node_id": "U_kgDOBe_gPg", "avatar_url": "https://avatars.githubusercontent.com/u/99606590?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aman-Surkar", "html_url": "https://github.com/Aman-Surkar", "followers_url": "https://api.github.com/users/Am...
[]
closed
false
null
[]
null
[]
2025-08-07T10:02:44
2025-09-15T08:02:57
2025-09-15T08:02:57
NONE
null
null
null
null
Hi Team, I wanted to have the CVE (https://github.com/advisories/GHSA-jjph-296x-mrcr) fixed in v4.37.2 and v4.38.0 through backporting. I see that chat.py, which was recently added and is mentioned in the fix for the CVE, doesn't exist in v4.37.2 and v4.38.0. I wanted to know how I can fix it.
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url"...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39983/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39983/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39982
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39982/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39982/comments
https://api.github.com/repos/huggingface/transformers/issues/39982/events
https://github.com/huggingface/transformers/issues/39982
3,299,827,227
I_kwDOCUB6oc7Er14b
39,982
flash-attn cannot perform deterministic computation
{ "login": "Ju-si-yuan", "id": 59277332, "node_id": "MDQ6VXNlcjU5Mjc3MzMy", "avatar_url": "https://avatars.githubusercontent.com/u/59277332?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Ju-si-yuan", "html_url": "https://github.com/Ju-si-yuan", "followers_url": "https://api.github.com/use...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-07T09:58:06
2025-08-08T10:38:50
2025-08-08T10:38:49
NONE
null
null
null
null
### System Info - `transformers` version: 4.52.4 - Platform: Linux-5.10.134-010.ali5000.al8.x86_64-x86_64-with-glibc2.39 - Python version: 3.10.18 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.6.1 - Accelerate version: 1.7.0 - Accelerate config: not found - DeepSpeed version: 0.16.4 - PyTorch version (GP...
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/follow...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39982/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39982/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39981
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39981/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39981/comments
https://api.github.com/repos/huggingface/transformers/issues/39981/events
https://github.com/huggingface/transformers/pull/39981
3,299,613,845
PR_kwDOCUB6oc6iiqKC
39,981
[Idefics] fix device mismatch
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-08-07T08:50:23
2025-08-07T15:45:04
2025-08-07T09:12:04
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39981", "html_url": "https://github.com/huggingface/transformers/pull/39981", "diff_url": "https://github.com/huggingface/transformers/pull/39981.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39981.patch", "merged_at...
# What does this PR do? Fixes https://github.com/huggingface/transformers/issues/39947. We need to use consistent device when computing `position_ids`, and since we are updating `position_ids` we will use its device instead of `pixel_values`. Otherwise it raises errors because those are located in two different devi...
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39981/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39981/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39980
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39980/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39980/comments
https://api.github.com/repos/huggingface/transformers/issues/39980/events
https://github.com/huggingface/transformers/pull/39980
3,299,571,961
PR_kwDOCUB6oc6iig5V
39,980
[DRAFT] optimize gpt-oss decoding
{ "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
[]
2025-08-07T08:38:48
2025-08-07T08:43:10
2025-08-07T08:43:10
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39980", "html_url": "https://github.com/huggingface/transformers/pull/39980", "diff_url": "https://github.com/huggingface/transformers/pull/39980.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39980.patch", "merged_at...
Optimize gpt-oss decoding: the hidden states only need to be repeated the local number of experts times.
{ "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39980/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39980/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39979
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39979/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39979/comments
https://api.github.com/repos/huggingface/transformers/issues/39979/events
https://github.com/huggingface/transformers/pull/39979
3,299,564,676
PR_kwDOCUB6oc6iifTg
39,979
Fix cross-attention masking before residual connection
{ "login": "ArkVex", "id": 159469387, "node_id": "U_kgDOCYFPSw", "avatar_url": "https://avatars.githubusercontent.com/u/159469387?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArkVex", "html_url": "https://github.com/ArkVex", "followers_url": "https://api.github.com/users/ArkVex/follower...
[]
closed
false
null
[]
null
[]
2025-08-07T08:36:23
2025-08-22T11:44:40
2025-08-22T11:44:40
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39979", "html_url": "https://github.com/huggingface/transformers/pull/39979", "diff_url": "https://github.com/huggingface/transformers/pull/39979.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39979.patch", "merged_at...
This PR fixes an incorrect masking position in the MllamaCrossAttentionDecoderLayer. Previously, the full_text_row_masked_out_mask was applied after the cross-attention output was added to the residual connection. This allowed image tokens to leak into text tokens that should not have seen them. The fix moves the ma...
{ "login": "ArkVex", "id": 159469387, "node_id": "U_kgDOCYFPSw", "avatar_url": "https://avatars.githubusercontent.com/u/159469387?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArkVex", "html_url": "https://github.com/ArkVex", "followers_url": "https://api.github.com/users/ArkVex/follower...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39979/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39979/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39978
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39978/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39978/comments
https://api.github.com/repos/huggingface/transformers/issues/39978/events
https://github.com/huggingface/transformers/pull/39978
3,299,512,793
PR_kwDOCUB6oc6iiT6U
39,978
Various test fixes for AMD
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-o...
[]
closed
false
null
[]
null
[]
2025-08-07T08:20:41
2025-08-07T12:00:12
2025-08-07T08:57:04
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39978", "html_url": "https://github.com/huggingface/transformers/pull/39978", "diff_url": "https://github.com/huggingface/transformers/pull/39978.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39978.patch", "merged_at...
This PR introduces minor test fixes for 4 different models: - for `internvl` we add an Expectation for ROCm 9.4 and remove a tensor being created from a tensor; - for `llama` we add an Expectation for ROCm 9.4; - for `llava` we add the `@require_bitsandbytes` decorator for a test that requires bitsandbytes; - for `mistr...
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-o...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39978/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39978/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39977
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39977/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39977/comments
https://api.github.com/repos/huggingface/transformers/issues/39977/events
https://github.com/huggingface/transformers/issues/39977
3,299,372,993
I_kwDOCUB6oc7EqG_B
39,977
FSDP2 not compatible with transformers >= 4.54.0 GenericForTokenClassification
{ "login": "ETOgaosion", "id": 57280232, "node_id": "MDQ6VXNlcjU3MjgwMjMy", "avatar_url": "https://avatars.githubusercontent.com/u/57280232?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ETOgaosion", "html_url": "https://github.com/ETOgaosion", "followers_url": "https://api.github.com/use...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-07T07:34:42
2025-08-15T10:28:17
2025-08-15T10:28:17
NONE
null
null
null
null
### System Info - `transformers` version: 4.54.0 - Platform: Linux-5.10.135.bsk.6-amd64-x86_64-with-glibc2.35 - Python version: 3.10.12 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.5.3 - Accelerate version: 1.9.0 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accele...
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39977/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39977/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39976
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39976/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39976/comments
https://api.github.com/repos/huggingface/transformers/issues/39976/events
https://github.com/huggingface/transformers/pull/39976
3,299,366,244
PR_kwDOCUB6oc6ih1Bn
39,976
Fix Qwen3 MoE GGUF architecture mismatch
{ "login": "ctcanbol", "id": 103742287, "node_id": "U_kgDOBi77Tw", "avatar_url": "https://avatars.githubusercontent.com/u/103742287?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ctcanbol", "html_url": "https://github.com/ctcanbol", "followers_url": "https://api.github.com/users/ctcanbol/...
[]
closed
false
null
[]
null
[]
2025-08-07T07:32:56
2025-08-12T13:39:20
2025-08-12T13:38:48
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39976", "html_url": "https://github.com/huggingface/transformers/pull/39976", "diff_url": "https://github.com/huggingface/transformers/pull/39976.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39976.patch", "merged_at...
# What does this PR do? Currently, GGUF versions of Qwen3 MoE models raise "_ValueError: The checkpoint you are trying to load has model type qwen3moe but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date_...
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39976/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39976/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39975
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39975/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39975/comments
https://api.github.com/repos/huggingface/transformers/issues/39975/events
https://github.com/huggingface/transformers/pull/39975
3,299,077,941
PR_kwDOCUB6oc6ig2sY
39,975
[bugfix] Fix tensor device in Idefics2, Idefics3, and SmolVLM
{ "login": "qgallouedec", "id": 45557362, "node_id": "MDQ6VXNlcjQ1NTU3MzYy", "avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qgallouedec", "html_url": "https://github.com/qgallouedec", "followers_url": "https://api.github.com/...
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-08-07T05:48:01
2025-08-13T07:58:53
2025-08-13T07:58:51
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39975", "html_url": "https://github.com/huggingface/transformers/pull/39975", "diff_url": "https://github.com/huggingface/transformers/pull/39975.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39975.patch", "merged_at...
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39975/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39975/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39974
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39974/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39974/comments
https://api.github.com/repos/huggingface/transformers/issues/39974/events
https://github.com/huggingface/transformers/issues/39974
3,298,661,255
I_kwDOCUB6oc7EnZOH
39,974
bug in new transformers: 'Florence2ForConditionalGeneration' object has no attribute '_supports_sdpa'
{ "login": "pseudotensor", "id": 2249614, "node_id": "MDQ6VXNlcjIyNDk2MTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/2249614?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pseudotensor", "html_url": "https://github.com/pseudotensor", "followers_url": "https://api.github.com...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-07T02:27:18
2025-09-07T22:19:49
2025-09-01T11:40:40
NONE
null
null
null
null
### System Info transformers 4.55.0 python 3.10 ubuntu 22 ### Who can help? @amyeroberts, @qubvel ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [x] My own task or dataset (give d...
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39974/reactions", "total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/39974/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39973
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39973/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39973/comments
https://api.github.com/repos/huggingface/transformers/issues/39973/events
https://github.com/huggingface/transformers/pull/39973
3,298,487,106
PR_kwDOCUB6oc6ie3Zw
39,973
Causal loss for `ForConditionalGeneration`
{ "login": "qgallouedec", "id": 45557362, "node_id": "MDQ6VXNlcjQ1NTU3MzYy", "avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qgallouedec", "html_url": "https://github.com/qgallouedec", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
[]
2025-08-07T00:36:30
2025-08-12T12:03:11
2025-08-12T12:03:10
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39973", "html_url": "https://github.com/huggingface/transformers/pull/39973", "diff_url": "https://github.com/huggingface/transformers/pull/39973.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39973.patch", "merged_at...
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39973/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39973/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39972
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39972/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39972/comments
https://api.github.com/repos/huggingface/transformers/issues/39972/events
https://github.com/huggingface/transformers/issues/39972
3,298,407,074
I_kwDOCUB6oc7EmbKi
39,972
Gemma3 with fp16 in inference (I don't know if this change is working in fine-tune) #BUG FIX
{ "login": "DGTell", "id": 122606028, "node_id": "U_kgDOB07RzA", "avatar_url": "https://avatars.githubusercontent.com/u/122606028?v=4", "gravatar_id": "", "url": "https://api.github.com/users/DGTell", "html_url": "https://github.com/DGTell", "followers_url": "https://api.github.com/users/DGTell/follower...
[]
closed
false
null
[]
null
[]
2025-08-06T23:46:26
2025-09-08T11:13:38
2025-09-08T11:13:38
NONE
null
null
null
null
### First of all, I want to say that I’m not a programmer, and I don’t know much about GitHub. (This post was translated with ChatGPT because my English isn’t great) I don’t know if this issue is just with me or if it affects everyone, but **I managed to fix the problem** where, during inference with Gemma3-4B-it (I ...
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.githu...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39972/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39972/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39971
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39971/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39971/comments
https://api.github.com/repos/huggingface/transformers/issues/39971/events
https://github.com/huggingface/transformers/pull/39971
3,298,250,470
PR_kwDOCUB6oc6ieD86
39,971
Fix missing video inputs for PerceptionLM.
{ "login": "shuminghu", "id": 2934295, "node_id": "MDQ6VXNlcjI5MzQyOTU=", "avatar_url": "https://avatars.githubusercontent.com/u/2934295?v=4", "gravatar_id": "", "url": "https://api.github.com/users/shuminghu", "html_url": "https://github.com/shuminghu", "followers_url": "https://api.github.com/users/sh...
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-08-06T22:18:42
2025-08-07T16:18:47
2025-08-07T15:54:46
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39971", "html_url": "https://github.com/huggingface/transformers/pull/39971", "diff_url": "https://github.com/huggingface/transformers/pull/39971.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39971.patch", "merged_at...
Critical: Fixes missing video input for PerceptionLM (accidentally removed in [PR](https://github.com/huggingface/transformers/pull/39583)) Minor: Add support for vanilla images that only have C,H,W dims but no tiles dim. These are non-default image shapes for PLM, but they're useful in demos and on low-resource devices...
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39971/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39971/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39970
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39970/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39970/comments
https://api.github.com/repos/huggingface/transformers/issues/39970/events
https://github.com/huggingface/transformers/pull/39970
3,298,137,794
PR_kwDOCUB6oc6idrHh
39,970
Add Keypoint Matcher pipeline
{ "login": "sbucaille", "id": 24275548, "node_id": "MDQ6VXNlcjI0Mjc1NTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sbucaille", "html_url": "https://github.com/sbucaille", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
[]
2025-08-06T21:27:58
2025-08-26T14:36:28
2025-08-26T14:26:57
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39970", "html_url": "https://github.com/huggingface/transformers/pull/39970", "diff_url": "https://github.com/huggingface/transformers/pull/39970.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39970.patch", "merged_at...
# What does this PR do? Implements the `keypoint-matcher` pipeline. Quite basic for now, let me know if I should add things. I added tests for single and multiple pairs, as well as checking that it correctly fails when only one image is provided. Committed on top of #39968 but will be rebased on main once the f...
{ "login": "qubvel", "id": 31920396, "node_id": "MDQ6VXNlcjMxOTIwMzk2", "avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qubvel", "html_url": "https://github.com/qubvel", "followers_url": "https://api.github.com/users/qubvel/fo...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39970/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39970/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39969
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39969/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39969/comments
https://api.github.com/repos/huggingface/transformers/issues/39969/events
https://github.com/huggingface/transformers/issues/39969
3,298,128,111
I_kwDOCUB6oc7ElXDv
39,969
Finetune `gpt-oss-20b` with `mxfp4` quantization
{ "login": "eliotjones1", "id": 12123338, "node_id": "MDQ6VXNlcjEyMTIzMzM4", "avatar_url": "https://avatars.githubusercontent.com/u/12123338?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eliotjones1", "html_url": "https://github.com/eliotjones1", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
[]
2025-08-06T21:24:19
2025-08-14T12:41:15
2025-08-06T22:47:26
NONE
null
null
null
null
Apologies if this is the wrong issue format -- I am not confident enough to say that this is for sure a bug and not just user error. I am currently unable to finetune (using peft/trl) the new oss openai model with quantization. The relevant packages and their versions are: ``` transformers ...
{ "login": "eliotjones1", "id": 12123338, "node_id": "MDQ6VXNlcjEyMTIzMzM4", "avatar_url": "https://avatars.githubusercontent.com/u/12123338?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eliotjones1", "html_url": "https://github.com/eliotjones1", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39969/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39969/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39968
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39968/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39968/comments
https://api.github.com/repos/huggingface/transformers/issues/39968/events
https://github.com/huggingface/transformers/pull/39968
3,297,901,294
PR_kwDOCUB6oc6ic2ii
39,968
[superglue] Fixed the way batch mask was applied to the scores before match assignment computation
{ "login": "sbucaille", "id": 24275548, "node_id": "MDQ6VXNlcjI0Mjc1NTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sbucaille", "html_url": "https://github.com/sbucaille", "followers_url": "https://api.github.com/users/...
[ { "id": 5769473378, "node_id": "LA_kwDOCUB6oc8AAAABV-MtYg", "url": "https://api.github.com/repos/huggingface/transformers/labels/Vision", "name": "Vision", "color": "C079EF", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-08-06T20:04:14
2025-08-09T17:37:36
2025-08-07T08:49:39
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39968", "html_url": "https://github.com/huggingface/transformers/pull/39968", "diff_url": "https://github.com/huggingface/transformers/pull/39968.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39968.patch", "merged_at...
# What does this PR do? Fixes the way the mask is applied to the scores in SuperGlue. Realized in some cases not covered by the tests that I end up with the following error: ```python self = SuperGlueImageProcessor { "do_grayscale": true, "do_rescale": true, "do_resize": true, "image_processor_type":....
{ "login": "qubvel", "id": 31920396, "node_id": "MDQ6VXNlcjMxOTIwMzk2", "avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qubvel", "html_url": "https://github.com/qubvel", "followers_url": "https://api.github.com/users/qubvel/fo...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39968/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39968/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39967
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39967/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39967/comments
https://api.github.com/repos/huggingface/transformers/issues/39967/events
https://github.com/huggingface/transformers/pull/39967
3,297,884,118
PR_kwDOCUB6oc6icyv7
39,967
Bump transformers from 4.48.0 to 4.53.0 in /examples/tensorflow/language-modeling-tpu
{ "login": "dependabot[bot]", "id": 49699333, "node_id": "MDM6Qm90NDk2OTkzMzM=", "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dependabot%5Bbot%5D", "html_url": "https://github.com/apps/dependabot", "followers_url": "https://a...
[ { "id": 1905493434, "node_id": "MDU6TGFiZWwxOTA1NDkzNDM0", "url": "https://api.github.com/repos/huggingface/transformers/labels/dependencies", "name": "dependencies", "color": "0366d6", "default": false, "description": "Pull requests that update a dependency file" }, { "id": 6410...
closed
false
null
[]
null
[]
2025-08-06T19:57:21
2025-08-07T11:13:49
2025-08-07T11:13:48
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39967", "html_url": "https://github.com/huggingface/transformers/pull/39967", "diff_url": "https://github.com/huggingface/transformers/pull/39967.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39967.patch", "merged_at...
Bumps [transformers](https://github.com/huggingface/transformers) from 4.48.0 to 4.53.0. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/huggingface/transformers/releases">transformers's releases</a>.</em></p> <blockquote> <h2>Release v4.53.0</h2> <h3>Gemma3n</h3> <p>Gemma 3n ...
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.githu...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39967/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39967/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39966
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39966/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39966/comments
https://api.github.com/repos/huggingface/transformers/issues/39966/events
https://github.com/huggingface/transformers/issues/39966
3,297,734,272
I_kwDOCUB6oc7Ej26A
39,966
`convert_deepseek_vl_weights_to_hf.py` not included in v4.55.0 release.
{ "login": "rasmi", "id": 2267370, "node_id": "MDQ6VXNlcjIyNjczNzA=", "avatar_url": "https://avatars.githubusercontent.com/u/2267370?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rasmi", "html_url": "https://github.com/rasmi", "followers_url": "https://api.github.com/users/rasmi/follower...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-06T18:57:34
2025-08-07T16:19:38
2025-08-07T16:19:38
CONTRIBUTOR
null
null
null
null
`convert_deepseek_vl_weights_to_hf.py`, introduced in #36248, is in [main](https://github.com/huggingface/transformers/blob/main/src/transformers/models/deepseek_vl/convert_deepseek_vl_weights_to_hf.py) but not in the [v4.55.0 release](https://github.com/huggingface/transformers/blob/v4.55.0/src/transformers/models/de...
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39966/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/39966/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39965
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39965/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39965/comments
https://api.github.com/repos/huggingface/transformers/issues/39965/events
https://github.com/huggingface/transformers/pull/39965
3,297,440,210
PR_kwDOCUB6oc6ibR7E
39,965
Fix HGNetV2 Model Card and Image Classification Pipeline Usage Tips
{ "login": "ducviet00", "id": 24910916, "node_id": "MDQ6VXNlcjI0OTEwOTE2", "avatar_url": "https://avatars.githubusercontent.com/u/24910916?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ducviet00", "html_url": "https://github.com/ducviet00", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
[]
2025-08-06T17:12:10
2025-08-07T16:33:30
2025-08-07T16:33:30
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39965", "html_url": "https://github.com/huggingface/transformers/pull/39965", "diff_url": "https://github.com/huggingface/transformers/pull/39965.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39965.patch", "merged_at...
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/ste...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39965/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39965/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39964
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39964/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39964/comments
https://api.github.com/repos/huggingface/transformers/issues/39964/events
https://github.com/huggingface/transformers/pull/39964
3,297,193,476
PR_kwDOCUB6oc6iacDH
39,964
fix glm4v image process
{ "login": "KeyKy", "id": 2967075, "node_id": "MDQ6VXNlcjI5NjcwNzU=", "avatar_url": "https://avatars.githubusercontent.com/u/2967075?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KeyKy", "html_url": "https://github.com/KeyKy", "followers_url": "https://api.github.com/users/KeyKy/follower...
[]
closed
false
null
[]
null
[]
2025-08-06T15:41:17
2025-08-06T16:46:58
2025-08-06T16:46:58
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39964", "html_url": "https://github.com/huggingface/transformers/pull/39964", "diff_url": "https://github.com/huggingface/transformers/pull/39964.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39964.patch", "merged_at...
@amyeroberts , @qubvel Issue: shortest_edge and longest_edge in preprocess_config.json are being ignored during GLM-4V image preprocessing. Please investigate and fix.
{ "login": "qubvel", "id": 31920396, "node_id": "MDQ6VXNlcjMxOTIwMzk2", "avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qubvel", "html_url": "https://github.com/qubvel", "followers_url": "https://api.github.com/users/qubvel/fo...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39964/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39964/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39963
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39963/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39963/comments
https://api.github.com/repos/huggingface/transformers/issues/39963/events
https://github.com/huggingface/transformers/issues/39963
3,296,743,556
I_kwDOCUB6oc7EgFCE
39,963
change `dataloader_persistent_workers` default value to `True`
{ "login": "farbodbj", "id": 110523279, "node_id": "U_kgDOBpZzjw", "avatar_url": "https://avatars.githubusercontent.com/u/110523279?v=4", "gravatar_id": "", "url": "https://api.github.com/users/farbodbj", "html_url": "https://github.com/farbodbj", "followers_url": "https://api.github.com/users/farbodbj/...
[]
open
false
null
[]
null
[]
2025-08-06T13:34:04
2025-10-26T08:03:00
null
NONE
null
null
null
null
https://github.com/huggingface/transformers/blob/82eb67e62a0a66b46647ff4132c173d2f3b8b54f/src/transformers/training_args.py#L1339 As described in the documentation, setting this configuration to `True` speeds up training at the cost of higher RAM usage, and the default is set to `False`. I believe this configurat...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39963/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39963/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39962
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39962/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39962/comments
https://api.github.com/repos/huggingface/transformers/issues/39962/events
https://github.com/huggingface/transformers/pull/39962
3,296,704,354
PR_kwDOCUB6oc6iYv0f
39,962
Use torch._check instead of a test to make the model Gemma3 exportable
{ "login": "xadupre", "id": 22452781, "node_id": "MDQ6VXNlcjIyNDUyNzgx", "avatar_url": "https://avatars.githubusercontent.com/u/22452781?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xadupre", "html_url": "https://github.com/xadupre", "followers_url": "https://api.github.com/users/xadupr...
[]
open
false
null
[]
null
[]
2025-08-06T13:23:15
2025-08-06T15:08:10
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39962", "html_url": "https://github.com/huggingface/transformers/pull/39962", "diff_url": "https://github.com/huggingface/transformers/pull/39962.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39962.patch", "merged_at...
# What does this PR do? torch.export.export fails on a test that used to raise an exception if a condition was not true. The check is replaced by torch._check to avoid torch.export.export complaining about it. Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39962/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39962/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39961
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39961/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39961/comments
https://api.github.com/repos/huggingface/transformers/issues/39961/events
https://github.com/huggingface/transformers/issues/39961
3,296,600,078
I_kwDOCUB6oc7EfiAO
39,961
Calling `trainer.evaluate()` before `trainer.train()` with FSDP 2 raises `ValueError: When using FSDP2, a model and optimizer must be passed together to `Accelerator.prepare()...`
{ "login": "RonanFR", "id": 10586126, "node_id": "MDQ6VXNlcjEwNTg2MTI2", "avatar_url": "https://avatars.githubusercontent.com/u/10586126?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RonanFR", "html_url": "https://github.com/RonanFR", "followers_url": "https://api.github.com/users/RonanF...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-06T12:53:42
2025-10-06T23:56:24
2025-09-14T08:02:54
NONE
null
null
null
null
### System Info **Environement:** - `transformers` version: 4.55.0 - Platform: Linux-5.15.0-1087-aws-x86_64-with-glibc2.39 - Python version: 3.12.3 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.4.4 - Accelerate version: 1.9.0 - Accelerate config: - compute_environment: LOCAL_MACHINE - distributed_type:...
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url"...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39961/reactions", "total_count": 4, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/39961/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39960
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39960/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39960/comments
https://api.github.com/repos/huggingface/transformers/issues/39960/events
https://github.com/huggingface/transformers/pull/39960
3,296,464,611
PR_kwDOCUB6oc6iX65T
39,960
Gemma3 fixes
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-o...
[]
closed
false
null
[]
null
[]
2025-08-06T12:12:58
2025-08-07T07:57:21
2025-08-07T07:57:21
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39960", "html_url": "https://github.com/huggingface/transformers/pull/39960", "diff_url": "https://github.com/huggingface/transformers/pull/39960.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39960.patch", "merged_at...
This PR fixes several `gemma3` tests: - on any GPU, there is a possible multi-device error that can arise in the forward of `Gemma3Model`, which was fixed by specifying the device in a tensor creation - on AMD MI300, there are now new expectations for some generation tests cc. @zucchini-nlp maybe because this is a...
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-o...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39960/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39960/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39959
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39959/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39959/comments
https://api.github.com/repos/huggingface/transformers/issues/39959/events
https://github.com/huggingface/transformers/pull/39959
3,296,242,626
PR_kwDOCUB6oc6iXKZA
39,959
Fix grammatical error in MoE variable name: expert_hitted → expert_hit, hitted_experts → hit_experts
{ "login": "Mihonarium", "id": 24436954, "node_id": "MDQ6VXNlcjI0NDM2OTU0", "avatar_url": "https://avatars.githubusercontent.com/u/24436954?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Mihonarium", "html_url": "https://github.com/Mihonarium", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
[]
2025-08-06T11:00:57
2025-08-06T17:09:09
2025-08-06T15:45:20
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39959", "html_url": "https://github.com/huggingface/transformers/pull/39959", "diff_url": "https://github.com/huggingface/transformers/pull/39959.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39959.patch", "merged_at...
# What does this PR do? Fixes a grammatical error in variable naming across all Mixture of Experts (MoE) implementations. The variables `expert_hitted` and `hitted_experts` are grammatically incorrect: the past tense/past participle of "hit" is "hit", not "hitted". Fixes #39955. ## Before submitting - [x] Thi...
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.githu...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39959/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39959/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39958
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39958/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39958/comments
https://api.github.com/repos/huggingface/transformers/issues/39958/events
https://github.com/huggingface/transformers/issues/39958
3,296,240,560
I_kwDOCUB6oc7EeKOw
39,958
TypeError: Received a NoneType for argument video_processor, but a BaseVideoProcessor was expected.(this issue im getting when using doc-ocr)
{ "login": "2ayush2", "id": 80869490, "node_id": "MDQ6VXNlcjgwODY5NDkw", "avatar_url": "https://avatars.githubusercontent.com/u/80869490?v=4", "gravatar_id": "", "url": "https://api.github.com/users/2ayush2", "html_url": "https://github.com/2ayush2", "followers_url": "https://api.github.com/users/2ayush...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-08-06T11:00:13
2025-08-28T18:11:34
null
NONE
null
null
null
null
### Feature request (venv) PS C:\Treeleaf-project\smartid-processor\smartid\service\models\dots.ocr> python .\demo_hf.py Loading checkpoint shards: 100%|████████████████████████████████████████████████| 2/2 [00:15<00:00, 7.52s/it] The image processor of type `Qwen2VLImageProcessor` is now loaded as a fast processor b...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39958/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39958/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39956
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39956/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39956/comments
https://api.github.com/repos/huggingface/transformers/issues/39956/events
https://github.com/huggingface/transformers/pull/39956
3,296,181,195
PR_kwDOCUB6oc6iW9R8
39,956
Harmonize `past_key_value` to `past_key_valueS` everywhere
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
[]
2025-08-06T10:41:20
2025-08-08T09:53:00
2025-08-08T09:52:58
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39956", "html_url": "https://github.com/huggingface/transformers/pull/39956", "diff_url": "https://github.com/huggingface/transformers/pull/39956.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39956.patch", "merged_at...
As per the title. I'm getting annoyed seeing both; it was time to finally make it coherent everywhere. All the changes are made, so I'll only need to remove the decorators after the next release (it's only internal modules, so no need for a long deprecation cycle). Also reapplied modular to `examples/modular-transforme...
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39956/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39956/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39955
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39955/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39955/comments
https://api.github.com/repos/huggingface/transformers/issues/39955/events
https://github.com/huggingface/transformers/issues/39955
3,296,143,740
I_kwDOCUB6oc7Edyl8
39,955
Fix grammatically incorrect variable name "expert_hitted" → "expert_hit" in MoE implementation
{ "login": "Mihonarium", "id": 24436954, "node_id": "MDQ6VXNlcjI0NDM2OTU0", "avatar_url": "https://avatars.githubusercontent.com/u/24436954?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Mihonarium", "html_url": "https://github.com/Mihonarium", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
[]
2025-08-06T10:27:27
2025-08-06T15:45:21
2025-08-06T15:45:21
CONTRIBUTOR
null
null
null
null
## Description The variable name `expert_hitted` used in the Mixture of Experts (MoE) implementations is grammatically incorrect. The past tense/past participle of "hit" is "hit", not "hitted". ## Current behavior The codebase currently uses `expert_hitted` in: - `src/transformers/models/gpt_oss/modular_gpt_oss.py` - ...
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.githu...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39955/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39955/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39954
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39954/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39954/comments
https://api.github.com/repos/huggingface/transformers/issues/39954/events
https://github.com/huggingface/transformers/issues/39954
3,296,049,185
I_kwDOCUB6oc7Edbgh
39,954
[gpt‑oss] eager_attention_forward not using sliding-window attention for GPT‑OSS models
{ "login": "AlfredTino", "id": 41940791, "node_id": "MDQ6VXNlcjQxOTQwNzkx", "avatar_url": "https://avatars.githubusercontent.com/u/41940791?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AlfredTino", "html_url": "https://github.com/AlfredTino", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
[]
2025-08-06T09:55:07
2025-08-07T02:28:56
2025-08-07T02:28:56
NONE
null
null
null
null
In the latest transformers version v4.55.0, the GPT‑OSS model’s eager_attention_forward implementation does **not** use sliding‑window attention. This behavior diverges from the original GPT‑OSS specification, where alternating full‑context and sliding‑window attention (e.g. window size 128) is a key architectural feat...
{ "login": "AlfredTino", "id": 41940791, "node_id": "MDQ6VXNlcjQxOTQwNzkx", "avatar_url": "https://avatars.githubusercontent.com/u/41940791?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AlfredTino", "html_url": "https://github.com/AlfredTino", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39954/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39954/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39953
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39953/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39953/comments
https://api.github.com/repos/huggingface/transformers/issues/39953/events
https://github.com/huggingface/transformers/pull/39953
3,295,992,379
PR_kwDOCUB6oc6iWUTR
39,953
Fix MXFP4 quantizer validation to allow CPU inference with dequantize option
{ "login": "returnL", "id": 44701395, "node_id": "MDQ6VXNlcjQ0NzAxMzk1", "avatar_url": "https://avatars.githubusercontent.com/u/44701395?v=4", "gravatar_id": "", "url": "https://api.github.com/users/returnL", "html_url": "https://github.com/returnL", "followers_url": "https://api.github.com/users/return...
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-08-06T09:35:54
2025-08-06T17:52:53
2025-08-06T13:20:41
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39953", "html_url": "https://github.com/huggingface/transformers/pull/39953", "diff_url": "https://github.com/huggingface/transformers/pull/39953.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39953.patch", "merged_at...
# What does this PR do? This PR fixes a bug that prevented MXFP4 models from running on CPU when `quantization_config.dequantize=True` was set. ## Problem The validation logic in `Mxfp4HfQuantizer` checked CUDA availability before checking the `dequantize` flag, causing failures on CPU-only environments even whe...
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCybe...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39953/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39953/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39952
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39952/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39952/comments
https://api.github.com/repos/huggingface/transformers/issues/39952/events
https://github.com/huggingface/transformers/pull/39952
3,295,934,217
PR_kwDOCUB6oc6iWHoM
39,952
[DO NOT MERGE] Testing safetensors 0.6.1rc0
{ "login": "Narsil", "id": 204321, "node_id": "MDQ6VXNlcjIwNDMyMQ==", "avatar_url": "https://avatars.githubusercontent.com/u/204321?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Narsil", "html_url": "https://github.com/Narsil", "followers_url": "https://api.github.com/users/Narsil/follow...
[]
closed
false
null
[]
null
[]
2025-08-06T09:18:12
2025-08-06T12:23:15
2025-08-06T12:23:15
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39952", "html_url": "https://github.com/huggingface/transformers/pull/39952", "diff_url": "https://github.com/huggingface/transformers/pull/39952.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39952.patch", "merged_at...
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "login": "Narsil", "id": 204321, "node_id": "MDQ6VXNlcjIwNDMyMQ==", "avatar_url": "https://avatars.githubusercontent.com/u/204321?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Narsil", "html_url": "https://github.com/Narsil", "followers_url": "https://api.github.com/users/Narsil/follow...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39952/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39952/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39951
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39951/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39951/comments
https://api.github.com/repos/huggingface/transformers/issues/39951/events
https://github.com/huggingface/transformers/pull/39951
3,295,856,263
PR_kwDOCUB6oc6iV2xA
39,951
circleci: pin torch 2.7.1 until `torchcodec` is updated
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
[]
closed
false
null
[]
null
[]
2025-08-06T08:57:50
2025-08-06T09:18:02
2025-08-06T09:18:00
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39951", "html_url": "https://github.com/huggingface/transformers/pull/39951", "diff_url": "https://github.com/huggingface/transformers/pull/39951.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39951.patch", "merged_at...
# What does this PR do? to make CircleCI ✅
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39951/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39951/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39950
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39950/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39950/comments
https://api.github.com/repos/huggingface/transformers/issues/39950/events
https://github.com/huggingface/transformers/pull/39950
3,295,846,160
PR_kwDOCUB6oc6iV0lH
39,950
Add pytest marker: `torch_compile_test` and `torch_export_test`
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
[]
closed
false
null
[]
null
[]
2025-08-06T08:55:01
2025-08-13T21:47:18
2025-08-13T21:47:16
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39950", "html_url": "https://github.com/huggingface/transformers/pull/39950", "diff_url": "https://github.com/huggingface/transformers/pull/39950.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39950.patch", "merged_at...
# What does this PR do? The torch team is considering running some transformers tests on their side, especially the compile and export tests. They asked us to provide a way to easily and reliably run those tests. Therefore this PR adds 2 new pytest markers and applies them to the relevant tests. A run usin...
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39950/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39950/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39949
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39949/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39949/comments
https://api.github.com/repos/huggingface/transformers/issues/39949/events
https://github.com/huggingface/transformers/pull/39949
3,295,802,034
PR_kwDOCUB6oc6iVq3b
39,949
Add pytest marker: `torch_compile_test` and `torch_export_test`
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
[]
closed
false
null
[]
null
[]
2025-08-06T08:43:22
2025-08-06T08:54:26
2025-08-06T08:53:55
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39949", "html_url": "https://github.com/huggingface/transformers/pull/39949", "diff_url": "https://github.com/huggingface/transformers/pull/39949.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39949.patch", "merged_at...
# What does this PR do? The torch team is considering running some transformers tests on their side, especially the compile and export tests. They asked us to provide a way to easily and reliably run those tests. Therefore this PR adds 2 new pytest markers and applies them to the relevant tests. A run usin...
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39949/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39949/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39948
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39948/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39948/comments
https://api.github.com/repos/huggingface/transformers/issues/39948/events
https://github.com/huggingface/transformers/pull/39948
3,295,521,206
PR_kwDOCUB6oc6iUtIs
39,948
feat: Support tensor inputs in ImageClassificationPipeline
{ "login": "Hashbrownsss", "id": 142291877, "node_id": "U_kgDOCHszpQ", "avatar_url": "https://avatars.githubusercontent.com/u/142291877?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Hashbrownsss", "html_url": "https://github.com/Hashbrownsss", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
[]
2025-08-06T07:28:05
2025-08-07T19:06:09
2025-08-07T19:06:09
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39948", "html_url": "https://github.com/huggingface/transformers/pull/39948", "diff_url": "https://github.com/huggingface/transformers/pull/39948.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39948.patch", "merged_at...
# What does this PR do? Adds support for accepting NumPy arrays and PyTorch tensors as direct inputs to `ImageClassificationPipeline`. Currently the pipeline's `preprocess` function only accepts PIL images or file paths. This makes the pipeline flexible to use with existing datasets and data pipelines, as requested in the...
{ "login": "Hashbrownsss", "id": 142291877, "node_id": "U_kgDOCHszpQ", "avatar_url": "https://avatars.githubusercontent.com/u/142291877?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Hashbrownsss", "html_url": "https://github.com/Hashbrownsss", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39948/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39948/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39947
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39947/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39947/comments
https://api.github.com/repos/huggingface/transformers/issues/39947/events
https://github.com/huggingface/transformers/issues/39947
3,295,261,949
I_kwDOCUB6oc7EabT9
39,947
v4.55.0 Idefics3 RuntimeError Tensors on different devices
{ "login": "noahleegithub", "id": 42154767, "node_id": "MDQ6VXNlcjQyMTU0NzY3", "avatar_url": "https://avatars.githubusercontent.com/u/42154767?v=4", "gravatar_id": "", "url": "https://api.github.com/users/noahleegithub", "html_url": "https://github.com/noahleegithub", "followers_url": "https://api.githu...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-06T05:42:38
2025-08-07T09:12:05
2025-08-07T09:12:05
NONE
null
null
null
null
### System Info - `transformers` version: 4.55.0 - Python version: 3.10.14 - PyTorch version (accelerator?): 2.7.1+cu126 (CUDA) - Using GPU in script?: Yes - GPU type: Tesla V100-PCIE-32GB ### Who can help? @ArthurZucker @guangy10 ### Information - [ ] The official example scripts - [x] My own modified scripts #...
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39947/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39947/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
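The device-mismatch error reported in issue 39947 above typically arises when processor outputs stay on one device while the model (or part of it) lives on another. A minimal sketch of the usual fix, moving every tensor-like value in the batch to a single device — `FakeTensor` and `move_to_device` are illustrative stand-ins, not transformers or torch APIs; real code would call `v.to(model.device)` on actual `torch.Tensor` objects:

```python
# Hedged sketch: generic "move batch to one device" helper.
# FakeTensor stands in for torch.Tensor so the example is self-contained.

class FakeTensor:
    """Minimal stand-in exposing the .to()/.device interface of a tensor."""
    def __init__(self, device="cpu"):
        self.device = device

    def to(self, device):
        # Real tensors return a copy on the target device; mimic that here.
        return FakeTensor(device)


def move_to_device(batch, device):
    """Move every tensor-like value in a batch dict to the given device."""
    return {k: (v.to(device) if hasattr(v, "to") else v) for k, v in batch.items()}


batch = {"input_ids": FakeTensor("cpu"), "pixel_values": FakeTensor("cpu")}
moved = move_to_device(batch, "cuda:0")
print(sorted({v.device for v in moved.values()}))  # → ['cuda:0']
```

With real transformers code this corresponds to `inputs = {k: v.to(model.device) for k, v in inputs.items()}` before calling `model.generate`.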
https://api.github.com/repos/huggingface/transformers/issues/39946
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39946/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39946/comments
https://api.github.com/repos/huggingface/transformers/issues/39946/events
https://github.com/huggingface/transformers/issues/39946
3,295,159,627
I_kwDOCUB6oc7EaCVL
39,946
Retaining computational graph after using AutoImageProcessor
{ "login": "YinniKun", "id": 94827377, "node_id": "U_kgDOBabzcQ", "avatar_url": "https://avatars.githubusercontent.com/u/94827377?v=4", "gravatar_id": "", "url": "https://api.github.com/users/YinniKun", "html_url": "https://github.com/YinniKun", "followers_url": "https://api.github.com/users/YinniKun/fo...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-08-06T04:41:06
2025-08-07T22:07:09
null
NONE
null
null
null
null
### Feature request Right now `AutoImageProcessor` would detach the tensor from any gradient, which is totally fine if the input is obtained from a DataLoader and has no gradient anyway. But would it be possible to retain the computational graph after using the preprocessing of `AutoImageProcessor` (similar to `torchvi...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39946/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39946/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39945
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39945/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39945/comments
https://api.github.com/repos/huggingface/transformers/issues/39945/events
https://github.com/huggingface/transformers/issues/39945
3,295,151,100
I_kwDOCUB6oc7EaAP8
39,945
GPT-OSS mxfp4 with triton_kernel: make_default_matmul_mxfp4_w_layout not found
{ "login": "yilian49", "id": 43861414, "node_id": "MDQ6VXNlcjQzODYxNDE0", "avatar_url": "https://avatars.githubusercontent.com/u/43861414?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yilian49", "html_url": "https://github.com/yilian49", "followers_url": "https://api.github.com/users/yil...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-06T04:36:03
2025-09-18T08:02:16
2025-09-18T08:02:16
NONE
null
null
null
null
### System Info - `transformers` version: 4.55.0 - Platform: Linux-5.15.0-144-generic-x86_64-with-glibc2.35 - Python version: 3.10.12 - Huggingface_hub version: 0.34.3 - Safetensors version: 0.5.3 - Accelerate version: 1.9.0 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accele...
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url"...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39945/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39945/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39944
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39944/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39944/comments
https://api.github.com/repos/huggingface/transformers/issues/39944/events
https://github.com/huggingface/transformers/pull/39944
3,295,037,556
PR_kwDOCUB6oc6iTH-v
39,944
Add back `_tp_plan` attribute
{ "login": "rishub-tamirisa", "id": 87284850, "node_id": "MDQ6VXNlcjg3Mjg0ODUw", "avatar_url": "https://avatars.githubusercontent.com/u/87284850?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rishub-tamirisa", "html_url": "https://github.com/rishub-tamirisa", "followers_url": "https://api...
[ { "id": 1834056761, "node_id": "MDU6TGFiZWwxODM0MDU2NzYx", "url": "https://api.github.com/repos/huggingface/transformers/labels/Core:%20Modeling", "name": "Core: Modeling", "color": "FF8446", "default": false, "description": "Internals of the library; Models." }, { "id": 27608221...
closed
false
null
[]
null
[]
2025-08-06T03:17:24
2025-08-20T13:29:56
2025-08-20T13:29:56
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39944", "html_url": "https://github.com/huggingface/transformers/pull/39944", "diff_url": "https://github.com/huggingface/transformers/pull/39944.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39944.patch", "merged_at...
Fixes #39943 - improvements to the tensor parallel plan, validation, and extensibility: Added property-based getters and setters for `tp_plan` and `pp_plan` in the model class, including validation of parallel styles and layer pattern matching, with warnings for non-existent patterns. This ensures only supported paral...
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39944/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39944/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39943
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39943/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39943/comments
https://api.github.com/repos/huggingface/transformers/issues/39943/events
https://github.com/huggingface/transformers/issues/39943
3,295,029,310
I_kwDOCUB6oc7EZig-
39,943
Breaking change in unset `_tp_plan` attribute
{ "login": "rishub-tamirisa", "id": 87284850, "node_id": "MDQ6VXNlcjg3Mjg0ODUw", "avatar_url": "https://avatars.githubusercontent.com/u/87284850?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rishub-tamirisa", "html_url": "https://github.com/rishub-tamirisa", "followers_url": "https://api...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-06T03:11:25
2025-08-20T13:29:57
2025-08-20T13:29:57
CONTRIBUTOR
null
null
null
null
The vLLM transformers frontend [relies on the `_tp_plan` attribute being set in the model](https://github.com/vllm-project/vllm/blob/main/vllm/model_executor/models/transformers.py#L543). It was removed [here](https://github.com/huggingface/transformers/pull/39501/files#diff-6b72b98c4c2dcfc6cc606843917733f5d858374fbc22...
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39943/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39943/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
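Downstream consumers such as the vLLM frontend mentioned in issue 39943 above can guard against the `_tp_plan` attribute going away by falling back to a config-side field. A hedged sketch — the `Dummy*` classes are illustrative stand-ins, not real transformers classes, and `base_model_tp_plan` is assumed as the config-side attribute name:

```python
# Hedged sketch: read a tensor-parallel plan whether it lives on the model
# (`_tp_plan`, as in older releases) or on the config object.

class DummyConfig:
    """Illustrative stand-in for a model config carrying the TP plan."""
    base_model_tp_plan = {"layers.*.mlp.up_proj": "colwise"}


class DummyModel:
    """Illustrative stand-in; deliberately has no `_tp_plan` attribute,
    mimicking the regression described in the issue."""
    config = DummyConfig()


def get_tp_plan(model):
    """Prefer the private model attribute, fall back to the config field."""
    plan = getattr(model, "_tp_plan", None)
    if plan is None:
        plan = getattr(model.config, "base_model_tp_plan", None)
    return plan


print(get_tp_plan(DummyModel()))  # → {'layers.*.mlp.up_proj': 'colwise'}
```

The `getattr` chain keeps the caller working across library versions without pinning to one attribute location.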
https://api.github.com/repos/huggingface/transformers/issues/39942
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39942/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39942/comments
https://api.github.com/repos/huggingface/transformers/issues/39942/events
https://github.com/huggingface/transformers/pull/39942
3,294,885,425
PR_kwDOCUB6oc6iSpGr
39,942
fix llama issue
{ "login": "yao-matrix", "id": 7245027, "node_id": "MDQ6VXNlcjcyNDUwMjc=", "avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yao-matrix", "html_url": "https://github.com/yao-matrix", "followers_url": "https://api.github.com/users...
[]
closed
false
null
[]
null
[]
2025-08-06T01:33:26
2025-10-29T22:29:35
2025-10-23T21:31:51
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39942", "html_url": "https://github.com/huggingface/transformers/pull/39942", "diff_url": "https://github.com/huggingface/transformers/pull/39942.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39942.patch", "merged_at...
Similar to PR https://github.com/huggingface/transformers/pull/39646, this fixes the same issue found while enabling llama LoRA finetuning across multiple cards. @SunMarc, please help review, thanks very much.
{ "login": "yao-matrix", "id": 7245027, "node_id": "MDQ6VXNlcjcyNDUwMjc=", "avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yao-matrix", "html_url": "https://github.com/yao-matrix", "followers_url": "https://api.github.com/users...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39942/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39942/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39941
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39941/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39941/comments
https://api.github.com/repos/huggingface/transformers/issues/39941/events
https://github.com/huggingface/transformers/pull/39941
3,294,751,328
PR_kwDOCUB6oc6iSNSi
39,941
fixing image_utils.py todo
{ "login": "skochar1", "id": 60591774, "node_id": "MDQ6VXNlcjYwNTkxNzc0", "avatar_url": "https://avatars.githubusercontent.com/u/60591774?v=4", "gravatar_id": "", "url": "https://api.github.com/users/skochar1", "html_url": "https://github.com/skochar1", "followers_url": "https://api.github.com/users/sko...
[]
open
false
null
[]
null
[]
2025-08-06T00:05:43
2025-08-06T10:02:44
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39941", "html_url": "https://github.com/huggingface/transformers/pull/39941", "diff_url": "https://github.com/huggingface/transformers/pull/39941.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39941.patch", "merged_at...
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39941/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39941/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39940
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39940/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39940/comments
https://api.github.com/repos/huggingface/transformers/issues/39940/events
https://github.com/huggingface/transformers/pull/39940
3,294,737,530
PR_kwDOCUB6oc6iSKdS
39,940
Enable gpt-oss mxfp4 on older hardware (sm75+)
{ "login": "matthewdouglas", "id": 38992547, "node_id": "MDQ6VXNlcjM4OTkyNTQ3", "avatar_url": "https://avatars.githubusercontent.com/u/38992547?v=4", "gravatar_id": "", "url": "https://api.github.com/users/matthewdouglas", "html_url": "https://github.com/matthewdouglas", "followers_url": "https://api.gi...
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-08-05T23:55:18
2025-08-06T17:53:33
2025-08-06T13:39:21
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39940", "html_url": "https://github.com/huggingface/transformers/pull/39940", "diff_url": "https://github.com/huggingface/transformers/pull/39940.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39940.patch", "merged_at...
# What does this PR do? Currently the MXFP4 quantized version of gpt-oss models is restricted to newer GPUs (Hopper and Blackwell). This PR enables the MXFP4 version on Turing, Ampere, and Ada GPUs. Tested with the gpt-oss-20b on RTX 4090 and T4. If the user has the kernels installed but the hardware is too ol...
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMar...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39940/reactions", "total_count": 10, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 10, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39940/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39939
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39939/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39939/comments
https://api.github.com/repos/huggingface/transformers/issues/39939/events
https://github.com/huggingface/transformers/issues/39939
3,294,735,910
I_kwDOCUB6oc7EYa4m
39,939
AttributeError: 'BitsAndBytesConfig' object has no attribute 'get_loading_attributes' with transformers 4.55.0
{ "login": "yukiharada1228", "id": 117978472, "node_id": "U_kgDOBwg1aA", "avatar_url": "https://avatars.githubusercontent.com/u/117978472?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yukiharada1228", "html_url": "https://github.com/yukiharada1228", "followers_url": "https://api.github.c...
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-08-05T23:54:22
2025-08-08T11:29:11
2025-08-08T11:04:59
NONE
null
null
null
null
### System Info transformers version: 4.55.0 platform: Linux-5.15.0-139-generic-x86_64-with-glibc2.31 python version: 3.10.13 PyTorch version: 2.1.0a0+32f93b1 TensorFlow version: N/A Flax version: N/A JAX version: N/A JAXLib version: N/A Using GPU in script?: Yes Using distributed or parallel set-up in script?: No ##...
{ "login": "yukiharada1228", "id": 117978472, "node_id": "U_kgDOBwg1aA", "avatar_url": "https://avatars.githubusercontent.com/u/117978472?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yukiharada1228", "html_url": "https://github.com/yukiharada1228", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39939/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39939/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
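Errors like the `get_loading_attributes` one in issue 39939 above usually come from version skew between transformers and the quantization config object; a defensive `hasattr` check is a common workaround. A minimal sketch under that assumption — `OldQuantConfig` and `loading_attributes` are illustrative stand-ins, not the real `BitsAndBytesConfig` API:

```python
# Hedged sketch: call a method that may be missing on configs from a
# different library version, with a graceful attribute-based fallback.

class OldQuantConfig:
    """Illustrative stand-in: has the attribute but not the method."""
    load_in_4bit = True


def loading_attributes(config):
    """Use get_loading_attributes() when available, else read attributes directly."""
    if hasattr(config, "get_loading_attributes"):
        return config.get_loading_attributes()
    # Fallback: pull the one attribute this sketch cares about.
    return {"load_in_4bit": getattr(config, "load_in_4bit", False)}


print(loading_attributes(OldQuantConfig()))  # → {'load_in_4bit': True}
```

In practice the cleaner fix is aligning the transformers and bitsandbytes-integration versions, as the issue resolution suggests; the guard only softens the failure mode.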
https://api.github.com/repos/huggingface/transformers/issues/39938
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39938/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39938/comments
https://api.github.com/repos/huggingface/transformers/issues/39938/events
https://github.com/huggingface/transformers/pull/39938
3,294,632,177
PR_kwDOCUB6oc6iR0PJ
39,938
Fix whisper `return_language` with `return_timestamp=word`
{ "login": "Metric-Void", "id": 21335640, "node_id": "MDQ6VXNlcjIxMzM1NjQw", "avatar_url": "https://avatars.githubusercontent.com/u/21335640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Metric-Void", "html_url": "https://github.com/Metric-Void", "followers_url": "https://api.github.com/...
[ { "id": 6470596964, "node_id": "LA_kwDOCUB6oc8AAAABga15ZA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Audio", "name": "Audio", "color": "760453", "default": false, "description": "" }, { "id": 7377881103, "node_id": "LA_kwDOCUB6oc8AAAABt8GIDw", ...
open
false
null
[]
null
[]
2025-08-05T22:42:52
2025-10-06T17:08:10
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39938", "html_url": "https://github.com/huggingface/transformers/pull/39938", "diff_url": "https://github.com/huggingface/transformers/pull/39938.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39938.patch", "merged_at...
# What does this PR do? Fixes #39404. Adds a switch to Whisper.generate() that allows preserving some special tokens, which are then stripped in retrieve_segments to ensure timestamp alignment. Tested on short and long audios in English, French, and Cantonese. Prediction and timestamp results align, and language ...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39938/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39938/timeline
null
null
null
null
true
false