url string | repository_url string | labels_url string | comments_url string | events_url string | html_url string | id int64 | node_id string | number int64 | title string | user dict | labels list | state string | locked bool | assignee dict | assignees list | milestone null | comments list | created_at timestamp[ms] | updated_at timestamp[ms] | closed_at timestamp[ms] | author_association string | type dict | active_lock_reason null | draft bool | pull_request dict | body string | closed_by dict | reactions dict | timeline_url string | performed_via_github_app null | state_reason string | sub_issues_summary dict | issue_dependencies_summary dict | is_pull_request bool | is_closed bool |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/40940 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40940/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40940/comments | https://api.github.com/repos/huggingface/transformers/issues/40940/events | https://github.com/huggingface/transformers/pull/40940 | 3,426,395,700 | PR_kwDOCUB6oc6pFML6 | 40,940 | Remove nested import logic for torchvision | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | null | [] | 2025-09-17T13:47:14 | 2025-09-17T17:34:30 | 2025-09-17T17:34:30 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40940",
"html_url": "https://github.com/huggingface/transformers/pull/40940",
"diff_url": "https://github.com/huggingface/transformers/pull/40940.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40940.patch",
"merged_at... | As the title says, remove the nested logic import as they were causing some issues when used with modular. | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40940/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40940/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40939 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40939/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40939/comments | https://api.github.com/repos/huggingface/transformers/issues/40939/events | https://github.com/huggingface/transformers/pull/40939 | 3,426,263,085 | PR_kwDOCUB6oc6pEvbR | 40,939 | [t5gemma] fix `get_text_config` and related fixes | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | null | [] | 2025-09-17T13:12:22 | 2025-10-01T14:55:31 | 2025-10-01T14:55:27 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40939",
"html_url": "https://github.com/huggingface/transformers/pull/40939",
"diff_url": "https://github.com/huggingface/transformers/pull/40939.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40939.patch",
"merged_at... | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/40874, #41073, #41239
Follow-up to https://github.com/huggingface/transformers/pull/40454 and #40903
Fixes how we retrieve sub configs in `t5gemma`, and also subsequent bugs. More details in the diff :) | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40939/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40939/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40938 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40938/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40938/comments | https://api.github.com/repos/huggingface/transformers/issues/40938/events | https://github.com/huggingface/transformers/issues/40938 | 3,426,084,950 | I_kwDOCUB6oc7MNehW | 40,938 | RFC for `tokenization` in v5 | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [
{
"id": 9105758243,
"node_id": "LA_kwDOCUB6oc8AAAACHr7YIw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for_v5?",
"name": "for_v5?",
"color": "35BC94",
"default": false,
"description": ""
}
] | open | false | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [
{
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github... | null | [] | 2025-09-17T12:27:17 | 2025-10-13T02:00:49 | null | COLLABORATOR | null | null | null | null | Sharing here our plans for v5 !
Right now, the distinction between tokenizers (e.g. Bart, Albert) isn’t explicit. We don’t know the actual
algorithm (Unigram, WordPiece, etc). The current ConvertSlow mechanism hides this detail. Instead of relying on “convert slow,” we want to make tokenizer definitions explicit (us... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40938/reactions",
"total_count": 22,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 16,
"confused": 0,
"heart": 6,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40938/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40937 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40937/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40937/comments | https://api.github.com/repos/huggingface/transformers/issues/40937/events | https://github.com/huggingface/transformers/pull/40937 | 3,426,076,147 | PR_kwDOCUB6oc6pEHBd | 40,937 | Remove repeated import | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-17T12:24:40 | 2025-09-22T13:02:58 | 2025-09-22T12:57:13 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40937",
"html_url": "https://github.com/huggingface/transformers/pull/40937",
"diff_url": "https://github.com/huggingface/transformers/pull/40937.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40937.patch",
"merged_at... | # What does this PR do?
As the title says.. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40937/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40937/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40936 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40936/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40936/comments | https://api.github.com/repos/huggingface/transformers/issues/40936/events | https://github.com/huggingface/transformers/pull/40936 | 3,425,973,764 | PR_kwDOCUB6oc6pDwsA | 40,936 | rm slow tokenizers | {
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github.com/users/itazap/fo... | [] | open | false | null | [] | null | [] | 2025-09-17T11:56:18 | 2025-10-29T14:28:08 | null | COLLABORATOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40936",
"html_url": "https://github.com/huggingface/transformers/pull/40936",
"diff_url": "https://github.com/huggingface/transformers/pull/40936.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40936.patch",
"merged_at... | Llama POC for simplifying tokenizers (https://github.com/huggingface/transformers/issues/40938)
- no slow tokenizer
- TokenizerFast becomes just Tokenizer
- 2 options for loading a tokenizer: 1) tokenizer.json, 2) a trainable new "blank" tokenizer
TODO:
- for llama we have legacy behaviour in tokenizer.json... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40936/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40936/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40935 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40935/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40935/comments | https://api.github.com/repos/huggingface/transformers/issues/40935/events | https://github.com/huggingface/transformers/pull/40935 | 3,425,842,454 | PR_kwDOCUB6oc6pDUfQ | 40,935 | [i18n-bn] Add Bengali language README file | {
"login": "saidurpulok",
"id": 59414463,
"node_id": "MDQ6VXNlcjU5NDE0NDYz",
"avatar_url": "https://avatars.githubusercontent.com/u/59414463?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saidurpulok",
"html_url": "https://github.com/saidurpulok",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-17T11:16:41 | 2025-09-22T16:51:40 | 2025-09-22T16:51:39 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40935",
"html_url": "https://github.com/huggingface/transformers/pull/40935",
"diff_url": "https://github.com/huggingface/transformers/pull/40935.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40935.patch",
"merged_at... | Added Bengali (বাংলা) localization:
- New file README_bn.md (concise translation of main README structure)
- Added বাংলা link to root README language selector Scope: docs-only, no code or tests impacted, no dependencies.
Motivation: Improve accessibility for Bengali-speaking community.
Reviewer suggestion: @... | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/ste... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40935/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40935/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40934 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40934/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40934/comments | https://api.github.com/repos/huggingface/transformers/issues/40934/events | https://github.com/huggingface/transformers/pull/40934 | 3,425,803,510 | PR_kwDOCUB6oc6pDL8- | 40,934 | [models] remove unused `import torch.utils.checkpoint` | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | null | [] | 2025-09-17T11:04:40 | 2025-09-17T15:38:01 | 2025-09-17T15:37:56 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40934",
"html_url": "https://github.com/huggingface/transformers/pull/40934",
"diff_url": "https://github.com/huggingface/transformers/pull/40934.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40934.patch",
"merged_at... | # What does this PR do?
(See title) | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40934/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40934/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40933 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40933/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40933/comments | https://api.github.com/repos/huggingface/transformers/issues/40933/events | https://github.com/huggingface/transformers/issues/40933 | 3,425,776,920 | I_kwDOCUB6oc7MMTUY | 40,933 | `UserWarning: `seed_generator` is deprecated and will be removed in a future version.` | {
"login": "khteh",
"id": 3871483,
"node_id": "MDQ6VXNlcjM4NzE0ODM=",
"avatar_url": "https://avatars.githubusercontent.com/u/3871483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/khteh",
"html_url": "https://github.com/khteh",
"followers_url": "https://api.github.com/users/khteh/follower... | [] | closed | false | null | [] | null | [] | 2025-09-17T10:55:51 | 2025-09-17T12:28:51 | 2025-09-17T12:28:51 | NONE | null | null | null | null | `transformers==4.56.1`
```
/home/khteh/.local/share/virtualenvs/pAIthon-GaqEDHQT/lib/python3.13/site-packages/transformers/generation/tf_utils.py:465: UserWarning: `seed_generator` is deprecated and will be removed in a future version.
warnings.warn("`seed_generator` is deprecated and will be removed in a future vers... | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40933/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40933/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40932 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40932/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40932/comments | https://api.github.com/repos/huggingface/transformers/issues/40932/events | https://github.com/huggingface/transformers/issues/40932 | 3,425,733,859 | I_kwDOCUB6oc7MMIzj | 40,932 | Inconsistent handling of tokenizer bos_token | {
"login": "fxmarty-amd",
"id": 180171742,
"node_id": "U_kgDOCr0z3g",
"avatar_url": "https://avatars.githubusercontent.com/u/180171742?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fxmarty-amd",
"html_url": "https://github.com/fxmarty-amd",
"followers_url": "https://api.github.com/users/... | [
{
"id": 1834056635,
"node_id": "MDU6TGFiZWwxODM0MDU2NjM1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Core:%20Tokenization",
"name": "Core: Tokenization",
"color": "FF4446",
"default": false,
"description": "Internals of the library; Tokenization."
},
{
... | closed | false | null | [] | null | [] | 2025-09-17T10:43:21 | 2025-10-26T08:02:20 | 2025-10-26T08:02:20 | CONTRIBUTOR | null | null | null | null | ### System Info
```
- `transformers` version: 4.55.4
- Platform: Linux-6.8.0-78-generic-x86_64-with-glibc2.39
- Python version: 3.12.11
- Huggingface_hub version: 0.34.6
- Safetensors version: 0.6.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (acc... | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url"... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40932/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40932/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40931 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40931/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40931/comments | https://api.github.com/repos/huggingface/transformers/issues/40931/events | https://github.com/huggingface/transformers/pull/40931 | 3,425,639,382 | PR_kwDOCUB6oc6pCneC | 40,931 | 🚨 [unbloating] unify `TypedDict` usage in processing | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | null | [] | 2025-09-17T10:16:12 | 2025-10-03T12:17:59 | 2025-10-03T12:17:59 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40931",
"html_url": "https://github.com/huggingface/transformers/pull/40931",
"diff_url": "https://github.com/huggingface/transformers/pull/40931.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40931.patch",
"merged_at... | # What does this PR do?
This PR refactors how `TypedDicts` are used across processing classes to cut down duplication and avoid mismatches. Key updates:
* We previously had two separate “base TypedDicts” for images (one in `processing`, one in `fast image processing`). They were identical, both defining the same ... | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40931/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40931/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40930 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40930/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40930/comments | https://api.github.com/repos/huggingface/transformers/issues/40930/events | https://github.com/huggingface/transformers/pull/40930 | 3,425,418,045 | PR_kwDOCUB6oc6pB3JA | 40,930 | Fix `Glm4vMoeIntegrationTest` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | [] | closed | false | null | [] | null | [] | 2025-09-17T09:16:26 | 2025-09-17T16:21:20 | 2025-09-17T16:21:19 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40930",
"html_url": "https://github.com/huggingface/transformers/pull/40930",
"diff_url": "https://github.com/huggingface/transformers/pull/40930.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40930.patch",
"merged_at... | # What does this PR do?
This integration test class takes > 3 hours to finish.
https://github.com/huggingface/transformers/actions/runs/17784986682/job/50551078690
The model is very large (despite being MOE) and the tests loading the model by offloading to cpu/disk.
Even with `max_new_tokens=10`, one test a... | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40930/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40930/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40929 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40929/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40929/comments | https://api.github.com/repos/huggingface/transformers/issues/40929/events | https://github.com/huggingface/transformers/pull/40929 | 3,425,367,129 | PR_kwDOCUB6oc6pBsI5 | 40,929 | Minor fix for #40727 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | [] | closed | false | null | [] | null | [] | 2025-09-17T09:02:19 | 2025-09-17T09:42:16 | 2025-09-17T09:42:14 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40929",
"html_url": "https://github.com/huggingface/transformers/pull/40929",
"diff_url": "https://github.com/huggingface/transformers/pull/40929.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40929.patch",
"merged_at... | # What does this PR do?
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40929/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40929/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40928 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40928/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40928/comments | https://api.github.com/repos/huggingface/transformers/issues/40928/events | https://github.com/huggingface/transformers/pull/40928 | 3,424,844,576 | PR_kwDOCUB6oc6o_7kT | 40,928 | 🚨Refactor: Update text2text generation pipelines to use max_new_tokens… | {
"login": "lilin-1",
"id": 177207022,
"node_id": "U_kgDOCo_27g",
"avatar_url": "https://avatars.githubusercontent.com/u/177207022?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lilin-1",
"html_url": "https://github.com/lilin-1",
"followers_url": "https://api.github.com/users/lilin-1/foll... | [] | closed | false | null | [] | null | [] | 2025-09-17T06:18:18 | 2025-09-24T11:55:33 | 2025-09-24T11:54:55 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40928",
"html_url": "https://github.com/huggingface/transformers/pull/40928",
"diff_url": "https://github.com/huggingface/transformers/pull/40928.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40928.patch",
"merged_at... | ---
name: Pull Request
---
about: Create a pull request to contribute to 🤗 Transformers
---
title: "Refactor: Update text2text generation pipelines to use max_new_tokens and resolve max_length warning"
---
labels: bug, summarization
---
assignees: ''
---
## Related Issue
Closes #40768
## Summary
... | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40928/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40928/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40927 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40927/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40927/comments | https://api.github.com/repos/huggingface/transformers/issues/40927/events | https://github.com/huggingface/transformers/issues/40927 | 3,424,652,563 | I_kwDOCUB6oc7MIA0T | 40,927 | PreTrainedTokenizer requires self.get_vocab() which is no longer implemented | {
"login": "eugenekwaNeuromics",
"id": 163505331,
"node_id": "U_kgDOCb7ksw",
"avatar_url": "https://avatars.githubusercontent.com/u/163505331?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eugenekwaNeuromics",
"html_url": "https://github.com/eugenekwaNeuromics",
"followers_url": "https://... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-17T04:54:53 | 2025-10-01T13:08:57 | 2025-10-01T13:08:57 | NONE | null | null | null | null | ### System Info
Hi, there is an issue with the PreTrainedTokenizer class in the current transformers v4.56.1. Line 1516 of `tokenization_utils_base.py` curently returns a `NotImplementedError()` for the `PreTrainedTokenizerBase.get_vocab(self)` function. However, PreTrainedTokenizer still requires the `.get_vocab` fun... | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40927/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40927/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40926 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40926/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40926/comments | https://api.github.com/repos/huggingface/transformers/issues/40926/events | https://github.com/huggingface/transformers/issues/40926 | 3,424,562,789 | I_kwDOCUB6oc7MHq5l | 40,926 | Contrastive search doesn't work on Gemma3 | {
"login": "jood-canva",
"id": 206628664,
"node_id": "U_kgDODFDnOA",
"avatar_url": "https://avatars.githubusercontent.com/u/206628664?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jood-canva",
"html_url": "https://github.com/jood-canva",
"followers_url": "https://api.github.com/users/joo... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-17T03:59:59 | 2025-09-17T12:12:46 | 2025-09-17T12:12:11 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.53.3
- Platform: Linux-6.8.0-1036-aws-x86_64-with-glibc2.35
- Python version: 3.11.10
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.4.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?... | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40926/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40926/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40925 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40925/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40925/comments | https://api.github.com/repos/huggingface/transformers/issues/40925/events | https://github.com/huggingface/transformers/pull/40925 | 3,424,538,434 | PR_kwDOCUB6oc6o-6b8 | 40,925 | Fix outdated torch version check | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-17T03:47:43 | 2025-09-22T12:53:03 | 2025-09-22T12:38:08 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40925",
"html_url": "https://github.com/huggingface/transformers/pull/40925",
"diff_url": "https://github.com/huggingface/transformers/pull/40925.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40925.patch",
"merged_at... | # What does this PR do?
Fix outdated torch version check. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40925/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40925/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40924 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40924/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40924/comments | https://api.github.com/repos/huggingface/transformers/issues/40924/events | https://github.com/huggingface/transformers/pull/40924 | 3,424,276,079 | PR_kwDOCUB6oc6o-BKZ | 40,924 | Don't list dropout in eager_paged_attention_forward | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-17T02:01:41 | 2025-09-18T10:22:21 | 2025-09-18T09:05:50 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40924",
"html_url": "https://github.com/huggingface/transformers/pull/40924",
"diff_url": "https://github.com/huggingface/transformers/pull/40924.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40924.patch",
"merged_at... | # What does this PR do?
The `dropout` argument is not used in eager_paged_attention_forward.
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40924/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40924/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40923 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40923/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40923/comments | https://api.github.com/repos/huggingface/transformers/issues/40923/events | https://github.com/huggingface/transformers/pull/40923 | 3,424,176,382 | PR_kwDOCUB6oc6o9sru | 40,923 | Wait for main process in _save_checkpoint to ensure best checkpoint exists | {
"login": "ssharpe42",
"id": 8136905,
"node_id": "MDQ6VXNlcjgxMzY5MDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/8136905?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ssharpe42",
"html_url": "https://github.com/ssharpe42",
"followers_url": "https://api.github.com/users/ss... | [] | closed | false | null | [] | null | [] | 2025-09-17T00:56:37 | 2025-09-30T09:41:04 | 2025-09-30T09:41:03 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40923",
"html_url": "https://github.com/huggingface/transformers/pull/40923",
"diff_url": "https://github.com/huggingface/transformers/pull/40923.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40923.patch",
"merged_at... | # What does this PR do?
In 4.50.0, a refactor of the best checkpoint process introduced a bug: when trying to load the best model at the end of training, a non-main process can have a null `self.state.best_model_checkpoint`.
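The fix described here is to make every process wait until the main process has written the checkpoint before reading its path. A minimal single-process sketch of that save-then-barrier pattern follows; the function and state names are illustrative, not the actual Trainer API, and in real distributed training the `barrier` argument would be `torch.distributed.barrier`.

```python
import os
import tempfile

def save_and_sync(state, is_main_process, barrier):
    # Only the main process writes the "best" checkpoint and records its path.
    if is_main_process:
        path = os.path.join(state["output_dir"], "best")
        os.makedirs(path, exist_ok=True)
        state["best_model_checkpoint"] = path
    # All ranks block here until the main process has finished saving,
    # so no rank reads best_model_checkpoint before it exists on disk.
    barrier()
    return state.get("best_model_checkpoint")

# Single-process demo: the barrier is a no-op.
with tempfile.TemporaryDirectory() as tmp:
    ckpt = save_and_sync({"output_dir": tmp}, is_main_process=True, barrier=lambda: None)
    checkpoint_exists = ckpt is not None and os.path.isdir(ckpt)

print(checkpoint_exists)  # True
```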
Before we run `_load_best_model()` there is a barrier to make sure the main process saves the mode... | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMar... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40923/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40923/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40922 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40922/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40922/comments | https://api.github.com/repos/huggingface/transformers/issues/40922/events | https://github.com/huggingface/transformers/pull/40922 | 3,423,853,331 | PR_kwDOCUB6oc6o8ocn | 40,922 | [DOC] Add missing dates in model cards | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | null | [] | 2025-09-16T21:50:07 | 2025-09-17T15:17:06 | 2025-09-17T15:17:06 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40922",
"html_url": "https://github.com/huggingface/transformers/pull/40922",
"diff_url": "https://github.com/huggingface/transformers/pull/40922.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40922.patch",
"merged_at... | Cc @stevhliu ;)
I'm building a space to display a visual timeline of model releases in Transformers; happy to discuss this further and how we could integrate it into the docs once I have something working!
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40922/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40922/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40921 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40921/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40921/comments | https://api.github.com/repos/huggingface/transformers/issues/40921/events | https://github.com/huggingface/transformers/pull/40921 | 3,423,827,302 | PR_kwDOCUB6oc6o8iz0 | 40,921 | Add FlexOlmo model | {
"login": "2015aroras",
"id": 19700980,
"node_id": "MDQ6VXNlcjE5NzAwOTgw",
"avatar_url": "https://avatars.githubusercontent.com/u/19700980?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/2015aroras",
"html_url": "https://github.com/2015aroras",
"followers_url": "https://api.github.com/use... | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-09-16T21:38:13 | 2025-09-18T17:16:41 | 2025-09-18T09:04:06 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40921",
"html_url": "https://github.com/huggingface/transformers/pull/40921",
"diff_url": "https://github.com/huggingface/transformers/pull/40921.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40921.patch",
"merged_at... | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40921/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40921/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40920 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40920/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40920/comments | https://api.github.com/repos/huggingface/transformers/issues/40920/events | https://github.com/huggingface/transformers/pull/40920 | 3,423,773,212 | PR_kwDOCUB6oc6o8XWM | 40,920 | Fix AttributeError: add num_hidden_layers property to T5GemmaConfig | {
"login": "avchauzov",
"id": 21357563,
"node_id": "MDQ6VXNlcjIxMzU3NTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/21357563?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/avchauzov",
"html_url": "https://github.com/avchauzov",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | null | [] | 2025-09-16T21:14:28 | 2025-09-21T00:46:19 | 2025-09-21T00:46:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40920",
"html_url": "https://github.com/huggingface/transformers/pull/40920",
"diff_url": "https://github.com/huggingface/transformers/pull/40920.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40920.patch",
"merged_at... | # What does this PR do?
This PR fixes an `AttributeError` that occurs when using T5Gemma models with `Seq2SeqTrainer`. The issue arises because `DynamicCache` in `cache_utils.py` expects `config.num_hidden_layers` to exist, but `T5GemmaConfig` doesn't have this attribute.
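For a composite encoder-decoder config, the usual way to satisfy such an expectation is a read-only property that delegates to a sub-config. The class names below are illustrative stand-ins, not the actual T5Gemma classes:

```python
class DecoderConfig:
    def __init__(self, num_hidden_layers=12):
        self.num_hidden_layers = num_hidden_layers

class EncoderDecoderConfig:
    """Sketch of the fix: the composite config forwards num_hidden_layers
    to its decoder sub-config, so cache utilities that read
    config.num_hidden_layers keep working."""
    def __init__(self, decoder):
        self.decoder = decoder

    @property
    def num_hidden_layers(self):
        return self.decoder.num_hidden_layers

cfg = EncoderDecoderConfig(DecoderConfig(num_hidden_layers=8))
print(cfg.num_hidden_layers)  # 8
```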
**Changes:**
- Added `num_hidden_layers`... | {
"login": "avchauzov",
"id": 21357563,
"node_id": "MDQ6VXNlcjIxMzU3NTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/21357563?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/avchauzov",
"html_url": "https://github.com/avchauzov",
"followers_url": "https://api.github.com/users/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40920/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40920/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40919 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40919/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40919/comments | https://api.github.com/repos/huggingface/transformers/issues/40919/events | https://github.com/huggingface/transformers/pull/40919 | 3,423,749,968 | PR_kwDOCUB6oc6o8SOA | 40,919 | Standardize audio/vision embedding function name for multimodal models | {
"login": "jackzhxng",
"id": 32371937,
"node_id": "MDQ6VXNlcjMyMzcxOTM3",
"avatar_url": "https://avatars.githubusercontent.com/u/32371937?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jackzhxng",
"html_url": "https://github.com/jackzhxng",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | null | [] | 2025-09-16T21:06:20 | 2025-10-08T20:24:39 | 2025-09-18T08:45:04 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40919",
"html_url": "https://github.com/huggingface/transformers/pull/40919",
"diff_url": "https://github.com/huggingface/transformers/pull/40919.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40919.patch",
"merged_at... | # What does this PR do?
Make all multimodal models share the same function name for audio and vision. This function should encapsulate all of the encoder module code up to the fusion with the prompt embeddings when doing early fusion.
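The shape of that contract can be sketched as one per-modality method that runs the encoder plus projector and returns embeddings ready for fusion. The class and method names here are assumptions for the sketch, not necessarily the exact names chosen in this PR:

```python
class ToyMultimodalModel:
    """Illustrative only: one well-known method per modality that covers the
    encoder-module code up to (but not including) fusion with the prompt
    embeddings."""
    def __init__(self):
        self.scale = 2.0  # stands in for encoder + projector weights

    def get_image_features(self, pixel_values):
        # vision encoder + multimodal projector
        return [v * self.scale for v in pixel_values]

    def get_audio_features(self, input_features):
        # audio encoder + multimodal projector
        return [v * self.scale for v in input_features]

model = ToyMultimodalModel()
print(model.get_image_features([1.0, 2.0]))  # [2.0, 4.0]
```

Downstream code can then call these methods by name regardless of which multimodal model it is handed.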
This makes it so that we can rely on this function name downstream:
- Audio: htt... | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40919/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 3,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40919/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40918 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40918/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40918/comments | https://api.github.com/repos/huggingface/transformers/issues/40918/events | https://github.com/huggingface/transformers/pull/40918 | 3,423,196,190 | PR_kwDOCUB6oc6o6aKP | 40,918 | Add Model Card for `GptOss` | {
"login": "ParagEkbote",
"id": 69567729,
"node_id": "MDQ6VXNlcjY5NTY3NzI5",
"avatar_url": "https://avatars.githubusercontent.com/u/69567729?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParagEkbote",
"html_url": "https://github.com/ParagEkbote",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-16T17:46:51 | 2025-09-18T17:24:37 | 2025-09-18T17:24:37 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40918",
"html_url": "https://github.com/huggingface/transformers/pull/40918",
"diff_url": "https://github.com/huggingface/transformers/pull/40918.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40918.patch",
"merged_at... | # What does this PR do?
Following the structure described in #36979, I have updated the model card for `GptOss`. I haven't added a quantization example due to its complexity; feel free to suggest one. Could you please review?
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dism... | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/ste... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40918/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40918/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40917 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40917/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40917/comments | https://api.github.com/repos/huggingface/transformers/issues/40917/events | https://github.com/huggingface/transformers/pull/40917 | 3,423,134,075 | PR_kwDOCUB6oc6o6M6L | 40,917 | 🚨 [generate] update paligemma mask updates (and other assisted generation-related fixes) | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | null | [] | 2025-09-16T17:22:43 | 2025-10-03T09:33:59 | 2025-09-23T16:20:00 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40917",
"html_url": "https://github.com/huggingface/transformers/pull/40917",
"diff_url": "https://github.com/huggingface/transformers/pull/40917.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40917.patch",
"merged_at... | # What does this PR do?
🚨 BC-breaking: `paligemma` processor now returns `token_type_ids` by default. This is required to disambiguate forward passes, due to the bidirectional attention mask in the prompt. Advanced generation methods may run forward passes with prompt + generated tokens, so they will fail without `... | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40917/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40917/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40916 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40916/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40916/comments | https://api.github.com/repos/huggingface/transformers/issues/40916/events | https://github.com/huggingface/transformers/pull/40916 | 3,422,997,726 | PR_kwDOCUB6oc6o5vUQ | 40,916 | Remove unused arguments | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-16T16:34:21 | 2025-09-23T11:41:44 | 2025-09-23T11:40:51 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40916",
"html_url": "https://github.com/huggingface/transformers/pull/40916",
"diff_url": "https://github.com/huggingface/transformers/pull/40916.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40916.patch",
"merged_at... | # What does this PR do?
Remove unused arguments. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40916/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40916/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40915 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40915/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40915/comments | https://api.github.com/repos/huggingface/transformers/issues/40915/events | https://github.com/huggingface/transformers/issues/40915 | 3,422,963,061 | I_kwDOCUB6oc7MBkV1 | 40,915 | HfArgumentParser does not support peft.LoraConfig | {
"login": "romitjain",
"id": 11757603,
"node_id": "MDQ6VXNlcjExNzU3NjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/11757603?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/romitjain",
"html_url": "https://github.com/romitjain",
"followers_url": "https://api.github.com/users/... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-16T16:23:56 | 2025-09-23T05:16:14 | 2025-09-23T05:16:14 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.57.0.dev0
- Platform: Linux-5.14.0-284.73.1.el9_2.x86_64-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.5.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch ... | {
"login": "romitjain",
"id": 11757603,
"node_id": "MDQ6VXNlcjExNzU3NjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/11757603?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/romitjain",
"html_url": "https://github.com/romitjain",
"followers_url": "https://api.github.com/users/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40915/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40915/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40914 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40914/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40914/comments | https://api.github.com/repos/huggingface/transformers/issues/40914/events | https://github.com/huggingface/transformers/pull/40914 | 3,422,870,136 | PR_kwDOCUB6oc6o5TQ_ | 40,914 | Add support for Florence-2 training | {
"login": "ducviet00",
"id": 24910916,
"node_id": "MDQ6VXNlcjI0OTEwOTE2",
"avatar_url": "https://avatars.githubusercontent.com/u/24910916?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ducviet00",
"html_url": "https://github.com/ducviet00",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | null | [] | 2025-09-16T15:57:23 | 2025-09-17T11:49:57 | 2025-09-17T11:49:57 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40914",
"html_url": "https://github.com/huggingface/transformers/pull/40914",
"diff_url": "https://github.com/huggingface/transformers/pull/40914.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40914.patch",
"merged_at... | # What does this PR do?
This PR adds support for Florence-2 training.
Without shifting tokens to the right, the model cannot compute the forward loss correctly because the decoder input IDs are not generated from the labels.
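This is the standard encoder-decoder pattern, shown here on plain Python lists for clarity (the real implementation operates on tensors): decoder inputs are the labels shifted one position right, with the decoder start token prepended and ignored-label positions (`-100`) replaced by the pad token.

```python
def shift_tokens_right(labels, pad_token_id, decoder_start_token_id):
    # Shift each row right by one and prepend the decoder start token.
    shifted = [[decoder_start_token_id] + row[:-1] for row in labels]
    # Labels use -100 for ignored positions; decoder inputs must use the pad token.
    return [[pad_token_id if tok == -100 else tok for tok in row] for row in shifted]

print(shift_tokens_right([[5, 6, 7, -100]], pad_token_id=0, decoder_start_token_id=2))
# [[2, 5, 6, 7]]
```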
In the PR https://github.com/huggingface/transformers/pull/38188, I thought it was handled... | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40914/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40914/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40913 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40913/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40913/comments | https://api.github.com/repos/huggingface/transformers/issues/40913/events | https://github.com/huggingface/transformers/issues/40913 | 3,422,410,750 | I_kwDOCUB6oc7L_df- | 40,913 | Setting chat_template when creating a processor does not change the chat template | {
"login": "NohTow",
"id": 38869395,
"node_id": "MDQ6VXNlcjM4ODY5Mzk1",
"avatar_url": "https://avatars.githubusercontent.com/u/38869395?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NohTow",
"html_url": "https://github.com/NohTow",
"followers_url": "https://api.github.com/users/NohTow/fo... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-16T13:53:39 | 2025-10-26T08:02:22 | 2025-10-26T08:02:22 | CONTRIBUTOR | null | null | null | null | ### System Info
Hello,
With `transformers == 4.55.3` (and below, it seems), passing `chat_template` as an argument when creating the processor does not seem to change the chat template accordingly.
Few lines to reproduce:
```python
from transformers import AutoProcessor
processor = AutoProcessor.from_pretrained("Qwen/Qwe... | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url"... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40913/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/40913/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40912 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40912/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40912/comments | https://api.github.com/repos/huggingface/transformers/issues/40912/events | https://github.com/huggingface/transformers/pull/40912 | 3,422,322,633 | PR_kwDOCUB6oc6o3cfB | 40,912 | Fix dtype in Paligemma | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | null | [] | 2025-09-16T13:31:08 | 2025-09-16T20:24:03 | 2025-09-16T16:07:56 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40912",
"html_url": "https://github.com/huggingface/transformers/pull/40912",
"diff_url": "https://github.com/huggingface/transformers/pull/40912.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40912.patch",
"merged_at... | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/40875 and makes sure tensors have the same dtype before operations. The attention mask is used by the LM and has to match its dtype. We also need to cast the VLM outputs, because the VLM and the projection can have different dtypes in their configs.
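The dtype-alignment pattern being described can be sketched with a stand-in tensor class so the logic is visible without torch; this is illustrative only, not the Paligemma code:

```python
from dataclasses import dataclass

@dataclass
class FakeTensor:
    data: list
    dtype: str
    def to(self, dtype):
        # Mimics torch.Tensor.to(dtype): returns a copy in the target dtype.
        return FakeTensor(self.data, dtype)

def align_to_lm_dtype(image_features, inputs_embeds):
    """Cast the vision-tower output to the language model's dtype before
    merging, since the vision tower and projector may run in a different
    dtype (e.g. float32 vs bfloat16) than the LM."""
    if image_features.dtype != inputs_embeds.dtype:
        image_features = image_features.to(inputs_embeds.dtype)
    return image_features

out = align_to_lm_dtype(FakeTensor([1.0], "float32"), FakeTensor([2.0], "bfloat16"))
print(out.dtype)  # bfloat16
```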
I st... | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40912/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40912/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40911 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40911/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40911/comments | https://api.github.com/repos/huggingface/transformers/issues/40911/events | https://github.com/huggingface/transformers/pull/40911 | 3,422,052,093 | PR_kwDOCUB6oc6o2ht_ | 40,911 | ENH: Enable readline support for transformers chat | {
"login": "BenjaminBossan",
"id": 6229650,
"node_id": "MDQ6VXNlcjYyMjk2NTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6229650?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BenjaminBossan",
"html_url": "https://github.com/BenjaminBossan",
"followers_url": "https://api.gith... | [] | closed | false | null | [] | null | [] | 2025-09-16T12:20:41 | 2025-09-19T09:39:22 | 2025-09-19T09:39:22 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40911",
"html_url": "https://github.com/huggingface/transformers/pull/40911",
"diff_url": "https://github.com/huggingface/transformers/pull/40911.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40911.patch",
"merged_at... | # What does this PR do?
This small change enables GNU readline support for the `transformers chat` command. This includes:
- advanced navigation and editing: `ctrl + a` `ctrl + e` `alt + b` `alt + f` `ctrl + k` `alt + d` etc.
- navigate and search history: `↑` `↓` `ctrl + p` `ctrl + n` `ctrl + r`
- undo: `ctrl... | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40911/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40911/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40910 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40910/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40910/comments | https://api.github.com/repos/huggingface/transformers/issues/40910/events | https://github.com/huggingface/transformers/issues/40910 | 3,421,942,661 | I_kwDOCUB6oc7L9rOF | 40,910 | Gemma-3: prepare_inputs_for_generation should forward pixel_values based on image token presence, not cache_position==0 | {
"login": "Simone999",
"id": 29517129,
"node_id": "MDQ6VXNlcjI5NTE3MTI5",
"avatar_url": "https://avatars.githubusercontent.com/u/29517129?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Simone999",
"html_url": "https://github.com/Simone999",
"followers_url": "https://api.github.com/users/... | [
{
"id": 2796628563,
"node_id": "MDU6TGFiZWwyNzk2NjI4NTYz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/WIP",
"name": "WIP",
"color": "234C99",
"default": false,
"description": "Label your PR/Issue with WIP for some long outstanding Issues/PRs that are work in pro... | open | false | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https:... | null | [] | 2025-09-16T11:52:40 | 2025-10-17T08:03:41 | null | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.56.1
- Platform: Windows-11-10.0.26100-SP0
- Python version: 3.12.11
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.6.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu129... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40910/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40910/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40909 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40909/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40909/comments | https://api.github.com/repos/huggingface/transformers/issues/40909/events | https://github.com/huggingface/transformers/pull/40909 | 3,421,794,031 | PR_kwDOCUB6oc6o1rnB | 40,909 | disable `test_fast_is_faster_than_slow` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | [] | closed | false | null | [] | null | [] | 2025-09-16T11:16:23 | 2025-09-16T13:34:06 | 2025-09-16T13:34:05 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40909",
"html_url": "https://github.com/huggingface/transformers/pull/40909",
"diff_url": "https://github.com/huggingface/transformers/pull/40909.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40909.patch",
"merged_at... | # What does this PR do?
This test causes too much trouble and consumes too much energy. As discussed offline, skip it until someone can check whether this test makes sense on CPU and with a small batch size.
Last time seeing such failure is 1 day ago:
https://app.circleci.com/pipelines/github/huggingface/transformers/146100/workflows/... | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40909/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40909/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40908 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40908/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40908/comments | https://api.github.com/repos/huggingface/transformers/issues/40908/events | https://github.com/huggingface/transformers/pull/40908 | 3,421,627,219 | PR_kwDOCUB6oc6o1LVe | 40,908 | Fix `load_balancing_loss_func` incompatible with `past_key_values` | {
"login": "tkj666",
"id": 28040169,
"node_id": "MDQ6VXNlcjI4MDQwMTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/28040169?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tkj666",
"html_url": "https://github.com/tkj666",
"followers_url": "https://api.github.com/users/tkj666/fo... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-09-16T10:38:52 | 2025-10-16T13:58:33 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40908",
"html_url": "https://github.com/huggingface/transformers/pull/40908",
"diff_url": "https://github.com/huggingface/transformers/pull/40908.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40908.patch",
"merged_at... |
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40908/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40908/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40907 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40907/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40907/comments | https://api.github.com/repos/huggingface/transformers/issues/40907/events | https://github.com/huggingface/transformers/pull/40907 | 3,421,590,114 | PR_kwDOCUB6oc6o1ELt | 40,907 | [cache] Only use scalars in `get_mask_sizes` | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-16T10:30:21 | 2025-09-16T10:49:01 | 2025-09-16T10:48:59 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40907",
"html_url": "https://github.com/huggingface/transformers/pull/40907",
"diff_url": "https://github.com/huggingface/transformers/pull/40907.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40907.patch",
"merged_at... | # What does this PR do?
As per the title. We can rely on the scalar `self.cumulative_length` instead of tensor `cache_position[0]`, as was introduced in https://github.com/huggingface/transformers/pull/40893. It's much better for downstream masking and compilation support. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40907/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40907/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40906 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40906/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40906/comments | https://api.github.com/repos/huggingface/transformers/issues/40906/events | https://github.com/huggingface/transformers/pull/40906 | 3,421,545,685 | PR_kwDOCUB6oc6o07lb | 40,906 | [generate] misc fixes | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | null | [] | 2025-09-16T10:20:36 | 2025-09-16T14:18:09 | 2025-09-16T14:18:06 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40906",
"html_url": "https://github.com/huggingface/transformers/pull/40906",
"diff_url": "https://github.com/huggingface/transformers/pull/40906.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40906.patch",
"merged_at... | # What does this PR do?
Fixes/todos for minor issues I encountered while working on #40833 | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40906/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40906/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40905 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40905/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40905/comments | https://api.github.com/repos/huggingface/transformers/issues/40905/events | https://github.com/huggingface/transformers/pull/40905 | 3,421,232,753 | PR_kwDOCUB6oc6oz392 | 40,905 | Set seed for `Glm4vIntegrationTest` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | [] | closed | false | null | [] | null | [] | 2025-09-16T09:04:38 | 2025-09-16T11:01:53 | 2025-09-16T11:01:51 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40905",
"html_url": "https://github.com/huggingface/transformers/pull/40905",
"diff_url": "https://github.com/huggingface/transformers/pull/40905.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40905.patch",
"merged_at... | # What does this PR do?
This model has `"do_sample": true,` in its `generation_config.json`, see
https://huggingface.co/zai-org/GLM-4.1V-9B-Thinking/blob/main/generation_config.json
We need to set a seed, otherwise I will have nightmares of getting different outputs each day ... 😭
`Glm4vIntegrationTest` no... | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40905/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40905/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40904 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40904/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40904/comments | https://api.github.com/repos/huggingface/transformers/issues/40904/events | https://github.com/huggingface/transformers/issues/40904 | 3,421,162,476 | I_kwDOCUB6oc7L6svs | 40,904 | MXFP4 Tensor Core GEMM support in GPT-OSS for Blackwell GPUs | {
"login": "TheTinyTeddy",
"id": 171109504,
"node_id": "U_kgDOCjLsgA",
"avatar_url": "https://avatars.githubusercontent.com/u/171109504?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TheTinyTeddy",
"html_url": "https://github.com/TheTinyTeddy",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | null | [] | 2025-09-16T08:46:30 | 2025-10-26T08:02:24 | 2025-10-26T08:02:24 | NONE | null | null | null | null | Hi there,
I was looking at the code (in `triton_kernels/matmul_ogs_details/_matmul_ogs.py`) and found that even when using Blackwell GPU the Triton kernel that implements
`acc = tl.dot_scaled(x, x_scales, x_format, w, w_scales, w_format, acc=acc, fast_math=True)`
is actually doing BF16 GEMM rather than MXFP4 GEMM, a... | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url"... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40904/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40904/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40903 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40903/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40903/comments | https://api.github.com/repos/huggingface/transformers/issues/40903/events | https://github.com/huggingface/transformers/pull/40903 | 3,421,039,480 | PR_kwDOCUB6oc6ozNNG | 40,903 | Fix missing num_hidden_layers attribute in T5GemmaConfig | {
"login": "0xjeffro",
"id": 105006121,
"node_id": "U_kgDOBkJEKQ",
"avatar_url": "https://avatars.githubusercontent.com/u/105006121?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/0xjeffro",
"html_url": "https://github.com/0xjeffro",
"followers_url": "https://api.github.com/users/0xjeffro/... | [] | closed | false | null | [] | null | [] | 2025-09-16T08:17:57 | 2025-09-17T13:17:56 | 2025-09-17T13:17:55 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40903",
"html_url": "https://github.com/huggingface/transformers/pull/40903",
"diff_url": "https://github.com/huggingface/transformers/pull/40903.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40903.patch",
"merged_at... | This is a quick fix for issue #40874. The `T5GemmaConfig` class was missing the `num_hidden_layers` attribute that cache initialization expects. Added `num_hidden_layers` property to expose the decoder's `num_hidden_layers` value.
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not ... | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40903/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40903/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40902 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40902/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40902/comments | https://api.github.com/repos/huggingface/transformers/issues/40902/events | https://github.com/huggingface/transformers/pull/40902 | 3,420,865,719 | PR_kwDOCUB6oc6oyneX | 40,902 | Fix flaky `Gemma3nAudioFeatureExtractionTest::test_dither` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | [] | closed | false | null | [] | null | [] | 2025-09-16T07:31:22 | 2025-09-16T09:00:09 | 2025-09-16T09:00:07 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40902",
"html_url": "https://github.com/huggingface/transformers/pull/40902",
"diff_url": "https://github.com/huggingface/transformers/pull/40902.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40902.patch",
"merged_at... | # What does this PR do?
> tests/models/gemma3n/test_feature_extraction_gemma3n.py::Gemma3nAudioFeatureExtractionTest::test_dither
is flaky since it is added in #39059, the failing ratio is 0.72 % (running 10K times).
I run it 50K times to get the maximal value for the difference, which could go up to `0.5`.
... | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40902/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40902/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40901 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40901/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40901/comments | https://api.github.com/repos/huggingface/transformers/issues/40901/events | https://github.com/huggingface/transformers/issues/40901 | 3,420,767,024 | I_kwDOCUB6oc7L5MMw | 40,901 | Cannot fine-tune T5Gemma with Seq2SeqTrainer | {
"login": "Crissium",
"id": 91039086,
"node_id": "MDQ6VXNlcjkxMDM5MDg2",
"avatar_url": "https://avatars.githubusercontent.com/u/91039086?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Crissium",
"html_url": "https://github.com/Crissium",
"followers_url": "https://api.github.com/users/Cri... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-16T07:04:53 | 2025-10-20T12:39:10 | 2025-10-20T12:39:10 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.56.1
- Platform: Linux-6.8.0-60-generic-x86_64-with-glibc2.39
- Python version: 3.12.11
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.5.3
- Accelerate version: 1.8.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerato... | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMar... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40901/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40901/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40900 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40900/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40900/comments | https://api.github.com/repos/huggingface/transformers/issues/40900/events | https://github.com/huggingface/transformers/pull/40900 | 3,420,090,946 | PR_kwDOCUB6oc6ov9EB | 40,900 | Generation: meta-safe _prepare_special_tokens + regression tests | {
"login": "moonrunnerkc",
"id": 125813226,
"node_id": "U_kgDOB3_B6g",
"avatar_url": "https://avatars.githubusercontent.com/u/125813226?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/moonrunnerkc",
"html_url": "https://github.com/moonrunnerkc",
"followers_url": "https://api.github.com/use... | [
{
"id": 9258341780,
"node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop",
"name": "Code agent slop",
"color": "C59579",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-09-16T02:30:26 | 2025-09-16T12:26:48 | 2025-09-16T12:26:48 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40900",
"html_url": "https://github.com/huggingface/transformers/pull/40900",
"diff_url": "https://github.com/huggingface/transformers/pull/40900.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40900.patch",
"merged_at... | # Summary
## What / Why
This PR makes `generation/utils.py::_prepare_special_tokens` **meta-safe**.
In assisted decoding, special-token tensors could be created on the `meta` device and then accessed via `.item()` or `.cpu().numpy()`, which triggers:
RuntimeError: Tensor.item() cannot be called on meta tensor... | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40900/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40900/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40899 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40899/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40899/comments | https://api.github.com/repos/huggingface/transformers/issues/40899/events | https://github.com/huggingface/transformers/pull/40899 | 3,419,941,186 | PR_kwDOCUB6oc6oveup | 40,899 | Don't report `num_input_tokens_seen` when disabled | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-16T00:48:27 | 2025-09-18T05:04:20 | 2025-09-18T05:04:20 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40899",
"html_url": "https://github.com/huggingface/transformers/pull/40899",
"diff_url": "https://github.com/huggingface/transformers/pull/40899.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40899.patch",
"merged_at... | ```python
>>> bool("no")
True
``` | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40899/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40899/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40898 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40898/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40898/comments | https://api.github.com/repos/huggingface/transformers/issues/40898/events | https://github.com/huggingface/transformers/pull/40898 | 3,419,701,618 | PR_kwDOCUB6oc6ourMO | 40,898 | Adding [T5/MT5/UMT5]EncoderForSequenceClassification | {
"login": "cbhyphen",
"id": 12734117,
"node_id": "MDQ6VXNlcjEyNzM0MTE3",
"avatar_url": "https://avatars.githubusercontent.com/u/12734117?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cbhyphen",
"html_url": "https://github.com/cbhyphen",
"followers_url": "https://api.github.com/users/cbh... | [] | open | false | null | [] | null | [] | 2025-09-15T22:21:09 | 2025-10-15T04:25:17 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40898",
"html_url": "https://github.com/huggingface/transformers/pull/40898",
"diff_url": "https://github.com/huggingface/transformers/pull/40898.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40898.patch",
"merged_at... | # What does this PR do?
This PR adds an encoder-only sequence classifier for T5. Inspiration for this comes from the following paper: ["Sentence-T5: Scalable Sentence Encoders from Pre-trained Text-to-Text Models"](https://arxiv.org/abs/2108.08877). The mean of final hidden states is used as the sentence represent... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40898/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40898/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40897 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40897/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40897/comments | https://api.github.com/repos/huggingface/transformers/issues/40897/events | https://github.com/huggingface/transformers/pull/40897 | 3,418,910,345 | PR_kwDOCUB6oc6osBOM | 40,897 | docs: standardized GIT model card according to the issue #36979 | {
"login": "Big-Marvel",
"id": 145830550,
"node_id": "U_kgDOCLEylg",
"avatar_url": "https://avatars.githubusercontent.com/u/145830550?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Big-Marvel",
"html_url": "https://github.com/Big-Marvel",
"followers_url": "https://api.github.com/users/Big... | [] | closed | false | null | [] | null | [] | 2025-09-15T17:45:34 | 2025-09-18T17:25:07 | 2025-09-18T17:25:06 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40897",
"html_url": "https://github.com/huggingface/transformers/pull/40897",
"diff_url": "https://github.com/huggingface/transformers/pull/40897.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40897.patch",
"merged_at... | # What does this PR do?
This PR adds a standardized model card for the **Generative Image-to-Text Transformer (GIT)** following the ongoing documentation cleanup and model card standardization effort.
Specifically, it:
* Creates `git.md` with standardized structure (badges placeholder, model overview, usage ex... | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/ste... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40897/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40897/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40896 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40896/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40896/comments | https://api.github.com/repos/huggingface/transformers/issues/40896/events | https://github.com/huggingface/transformers/pull/40896 | 3,418,867,515 | PR_kwDOCUB6oc6or37h | 40,896 | Remove reference to subclasses in modernbert | {
"login": "lematt1991",
"id": 13142923,
"node_id": "MDQ6VXNlcjEzMTQyOTIz",
"avatar_url": "https://avatars.githubusercontent.com/u/13142923?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lematt1991",
"html_url": "https://github.com/lematt1991",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | null | [] | 2025-09-15T17:31:30 | 2025-09-17T15:36:55 | 2025-09-17T15:36:55 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40896",
"html_url": "https://github.com/huggingface/transformers/pull/40896",
"diff_url": "https://github.com/huggingface/transformers/pull/40896.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40896.patch",
"merged_at... | # What does this PR do?
`ModernBertPretrainedModel` currently [references](https://github.com/huggingface/transformers/blob/main/src/transformers/models/modernbert/modular_modernbert.py#L805-L815) its subclasses when initializing weights. This breaks things when you try to create a new model that inherits from th... | {
"login": "lematt1991",
"id": 13142923,
"node_id": "MDQ6VXNlcjEzMTQyOTIz",
"avatar_url": "https://avatars.githubusercontent.com/u/13142923?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lematt1991",
"html_url": "https://github.com/lematt1991",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40896/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40896/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40895 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40895/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40895/comments | https://api.github.com/repos/huggingface/transformers/issues/40895/events | https://github.com/huggingface/transformers/pull/40895 | 3,418,864,216 | PR_kwDOCUB6oc6or3Nd | 40,895 | [generate] remove docs of a feature that no longer exists | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | null | [] | 2025-09-15T17:30:16 | 2025-09-15T18:22:41 | 2025-09-15T18:22:32 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40895",
"html_url": "https://github.com/huggingface/transformers/pull/40895",
"diff_url": "https://github.com/huggingface/transformers/pull/40895.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40895.patch",
"merged_at... | # What does this PR do?
Addresses [this comment](https://github.com/huggingface/transformers/pull/36685#issuecomment-3289824309): end-to-end generation is no longer supported, so let's remove its docs.
(Thank you @vfdev-5) | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40895/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40895/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40894 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40894/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40894/comments | https://api.github.com/repos/huggingface/transformers/issues/40894/events | https://github.com/huggingface/transformers/pull/40894 | 3,418,840,200 | PR_kwDOCUB6oc6orx93 | 40,894 | Chat response parsing | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | null | [] | 2025-09-15T17:21:48 | 2025-10-28T13:57:00 | 2025-10-21T16:26:18 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40894",
"html_url": "https://github.com/huggingface/transformers/pull/40894",
"diff_url": "https://github.com/huggingface/transformers/pull/40894.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40894.patch",
"merged_at... | This PR is a replacement for #39609. The idea is that models can include a message schema, allowing model output to be parsed into a structured form. The original plan was to allow parsing of the entire chat history, essentially the inverse operation of `apply_chat_template`, but the schemas involved were too complex a... | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40894/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 3
} | https://api.github.com/repos/huggingface/transformers/issues/40894/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40893 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40893/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40893/comments | https://api.github.com/repos/huggingface/transformers/issues/40893/events | https://github.com/huggingface/transformers/pull/40893 | 3,418,830,856 | PR_kwDOCUB6oc6orv5o | 40,893 | [cache] Merge static sliding and static chunked layer | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-15T17:18:24 | 2025-09-16T10:11:28 | 2025-09-16T09:41:20 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40893",
"html_url": "https://github.com/huggingface/transformers/pull/40893",
"diff_url": "https://github.com/huggingface/transformers/pull/40893.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40893.patch",
"merged_at... | # What does this PR do?
As per the title. As discussed quite a few times, they are exactly the same, except the Chunked version is more general, as it can handle an arbitrary number of new tokens even after prefill (e.g. prefill caching, chat continuation, etc.).
This PR merges them both, to only keep the more gen... | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40893/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40893/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40892 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40892/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40892/comments | https://api.github.com/repos/huggingface/transformers/issues/40892/events | https://github.com/huggingface/transformers/pull/40892 | 3,418,663,099 | PR_kwDOCUB6oc6orK2U | 40,892 | Harmonize CacheLayer names | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-15T16:28:05 | 2025-09-16T10:14:39 | 2025-09-16T10:14:12 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40892",
"html_url": "https://github.com/huggingface/transformers/pull/40892",
"diff_url": "https://github.com/huggingface/transformers/pull/40892.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40892.patch",
"merged_at... | # What does this PR do?
As per the title. As it's only used internally in the Caches, it does not necessarily need a deprecation cycle where we keep the old names IMO. We can however do it, will let you judge @ArthurZucker
cc @gante and @manueldeprada as well, we talked about it before!
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40892/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40892/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40891 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40891/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40891/comments | https://api.github.com/repos/huggingface/transformers/issues/40891/events | https://github.com/huggingface/transformers/issues/40891 | 3,418,263,341 | I_kwDOCUB6oc7Lvo8t | 40,891 | Need help for Applying Visual Prompt Tuning with Qwen2.5-VL vision | {
"login": "davidan208",
"id": 37769067,
"node_id": "MDQ6VXNlcjM3NzY5MDY3",
"avatar_url": "https://avatars.githubusercontent.com/u/37769067?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/davidan208",
"html_url": "https://github.com/davidan208",
"followers_url": "https://api.github.com/use... | [
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-09-15T14:37:15 | 2025-10-25T08:02:15 | 2025-10-25T08:02:15 | NONE | null | null | null | null | Hi everyone,
I am trying to add a soft prompt for the Qwen2.5-VL vision model.
Here is my code for doing this:
```
from transformers import Qwen2_5_VLModel, Qwen2_5_VLForConditionalGeneration, AutoProcessor
from transformers.models.qwen2_5_vl.modeling_qwen2_5_vl import (
Qwen2_5_VisionTransformerPretrainedModel,
Qwen2_5_V... | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url"... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40891/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40891/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40890 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40890/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40890/comments | https://api.github.com/repos/huggingface/transformers/issues/40890/events | https://github.com/huggingface/transformers/pull/40890 | 3,418,154,358 | PR_kwDOCUB6oc6opdry | 40,890 | Adding activation kernels | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCybe... | [] | closed | false | null | [] | null | [] | 2025-09-15T14:11:50 | 2025-09-17T13:48:17 | 2025-09-17T09:36:09 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40890",
"html_url": "https://github.com/huggingface/transformers/pull/40890",
"diff_url": "https://github.com/huggingface/transformers/pull/40890.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40890.patch",
"merged_at... | # What does this PR do?
Adds GeLU activation kernels from https://huggingface.co/kernels-community/activation, to use them we simply need to pass `use_kernels=True`
Here are some benchmarks comparing the activation kernels' performance with a `torch.compile` implementation
<img width="959" height="378" alt="Screens... | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCybe... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40890/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40890/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40889 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40889/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40889/comments | https://api.github.com/repos/huggingface/transformers/issues/40889/events | https://github.com/huggingface/transformers/pull/40889 | 3,418,135,609 | PR_kwDOCUB6oc6opZn4 | 40,889 | Adapt and test huggingface_hub v1.0.0 | {
"login": "Wauplin",
"id": 11801849,
"node_id": "MDQ6VXNlcjExODAxODQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11801849?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Wauplin",
"html_url": "https://github.com/Wauplin",
"followers_url": "https://api.github.com/users/Waupli... | [] | closed | false | null | [] | null | [] | 2025-09-15T14:07:11 | 2025-09-25T11:13:51 | 2025-09-25T11:13:50 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40889",
"html_url": "https://github.com/huggingface/transformers/pull/40889",
"diff_url": "https://github.com/huggingface/transformers/pull/40889.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40889.patch",
"merged_at... | Test as part of https://github.com/huggingface/huggingface_hub/issues/3340
**The main changes in `huggingface_hub` impacting transformers are:**
- `Repository` removed
- `HfFolder` removed
- migrated to `httpx` instead of `requests`
---
**List of changes made in this PR:**
- removed all imports that were ... | {
"login": "Wauplin",
"id": 11801849,
"node_id": "MDQ6VXNlcjExODAxODQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11801849?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Wauplin",
"html_url": "https://github.com/Wauplin",
"followers_url": "https://api.github.com/users/Waupli... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40889/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40889/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40888 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40888/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40888/comments | https://api.github.com/repos/huggingface/transformers/issues/40888/events | https://github.com/huggingface/transformers/pull/40888 | 3,418,022,604 | PR_kwDOCUB6oc6opArC | 40,888 | DOC Fix help for chat and serve commands | {
"login": "BenjaminBossan",
"id": 6229650,
"node_id": "MDQ6VXNlcjYyMjk2NTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6229650?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BenjaminBossan",
"html_url": "https://github.com/BenjaminBossan",
"followers_url": "https://api.gith... | [] | open | false | null | [] | null | [] | 2025-09-15T13:39:15 | 2025-10-01T10:54:18 | null | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40888",
"html_url": "https://github.com/huggingface/transformers/pull/40888",
"diff_url": "https://github.com/huggingface/transformers/pull/40888.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40888.patch",
"merged_at... | # What does this PR do?
For `transformers chat` and `transformers serve`, the `load_in_8bit` and `load_in_4bit` arguments wrongly state that they require LoRA. This is only true for training; for inference, LoRA is not required.
Note: The failing test seems to be unrelated.
## Before submitting... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40888/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40888/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40887 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40887/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40887/comments | https://api.github.com/repos/huggingface/transformers/issues/40887/events | https://github.com/huggingface/transformers/pull/40887 | 3,417,784,938 | PR_kwDOCUB6oc6ooMbo | 40,887 | Refactor output handling in generate for cleaner decoding methods | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.... | [] | open | false | null | [] | null | [] | 2025-09-15T12:36:13 | 2025-10-30T06:04:51 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40887",
"html_url": "https://github.com/huggingface/transformers/pull/40887",
"diff_url": "https://github.com/huggingface/transformers/pull/40887.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40887.patch",
"merged_at... | Each decoding method has a common block of output handling boilerplate that worsens readability:
```
output_attentions = generation_config.output_attentions
output_hidden_states = generation_config.output_hidden_states
output_scores = generation_config.output_scores
output_logits = generation_config.output_logit... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40887/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40887/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40886 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40886/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40886/comments | https://api.github.com/repos/huggingface/transformers/issues/40886/events | https://github.com/huggingface/transformers/issues/40886 | 3,417,714,623 | I_kwDOCUB6oc7Lti-_ | 40,886 | not able to import Gemma3TextForSequenceClassification on transformers == '4.56.1' | {
"login": "rishavranaut",
"id": 141845222,
"node_id": "U_kgDOCHRi5g",
"avatar_url": "https://avatars.githubusercontent.com/u/141845222?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rishavranaut",
"html_url": "https://github.com/rishavranaut",
"followers_url": "https://api.github.com/use... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-15T12:18:14 | 2025-09-16T08:40:30 | 2025-09-16T08:40:30 | NONE | null | null | null | null | ### System Info
I can see `Gemma3TextForSequenceClassification` implemented in modeling_gemma3.py, but I get this error when trying to import it with `from transformers import Gemma3TextForSequenceClassification`:
ImportError: cannot import name 'Gemma3TextForSequenceClassification' from 'transformers' __init__.py
#... | {
"login": "rishavranaut",
"id": 141845222,
"node_id": "U_kgDOCHRi5g",
"avatar_url": "https://avatars.githubusercontent.com/u/141845222?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rishavranaut",
"html_url": "https://github.com/rishavranaut",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40886/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40886/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40885 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40885/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40885/comments | https://api.github.com/repos/huggingface/transformers/issues/40885/events | https://github.com/huggingface/transformers/pull/40885 | 3,417,591,821 | PR_kwDOCUB6oc6onhzK | 40,885 | [Docs] Adding documentation of MXFP4 Quantization | {
"login": "ariG23498",
"id": 36856589,
"node_id": "MDQ6VXNlcjM2ODU2NTg5",
"avatar_url": "https://avatars.githubusercontent.com/u/36856589?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ariG23498",
"html_url": "https://github.com/ariG23498",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | null | [] | 2025-09-15T11:44:40 | 2025-09-16T18:31:28 | 2025-09-16T18:31:28 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40885",
"html_url": "https://github.com/huggingface/transformers/pull/40885",
"diff_url": "https://github.com/huggingface/transformers/pull/40885.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40885.patch",
"merged_at... | The documentation is taken from hf.co/blog/faster-transformers.
@stevhliu it would be great if you could give me an initial review. I would love to make it more aligned with what we usually do for documentation like this. | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/ste... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40885/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40885/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40884 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40884/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40884/comments | https://api.github.com/repos/huggingface/transformers/issues/40884/events | https://github.com/huggingface/transformers/pull/40884 | 3,417,535,615 | PR_kwDOCUB6oc6onVac | 40,884 | Any to any pipeline and auto-mapping | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | null | [] | 2025-09-15T11:27:54 | 2025-10-16T17:38:12 | null | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40884",
"html_url": "https://github.com/huggingface/transformers/pull/40884",
"diff_url": "https://github.com/huggingface/transformers/pull/40884.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40884.patch",
"merged_at... | # What does this PR do?
Adds any-to-any as a pipeline and in auto classes so that we can have a single mapping for all multimodal models. The model mapping is almost the same as image-text-to-text, with the inclusion of audio-LLM and omni-LLM. I hope I added all audio models, but lmk if anything is missing from recent ones
... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40884/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40884/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40883 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40883/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40883/comments | https://api.github.com/repos/huggingface/transformers/issues/40883/events | https://github.com/huggingface/transformers/pull/40883 | 3,417,435,598 | PR_kwDOCUB6oc6om_LM | 40,883 | Fix modular consistency | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-15T10:58:39 | 2025-09-15T11:11:11 | 2025-09-15T11:07:08 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40883",
"html_url": "https://github.com/huggingface/transformers/pull/40883",
"diff_url": "https://github.com/huggingface/transformers/pull/40883.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40883.patch",
"merged_at... | # What does this PR do?
Reapply modular based on latest change in main (race condition when merging qwen3-vl PR) | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40883/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40883/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40882 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40882/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40882/comments | https://api.github.com/repos/huggingface/transformers/issues/40882/events | https://github.com/huggingface/transformers/pull/40882 | 3,417,408,162 | PR_kwDOCUB6oc6om5J9 | 40,882 | Remove dict branch of attention_mask in sdpa_attention_paged_forward | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-15T10:51:57 | 2025-09-16T16:24:30 | 2025-09-15T15:38:13 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40882",
"html_url": "https://github.com/huggingface/transformers/pull/40882",
"diff_url": "https://github.com/huggingface/transformers/pull/40882.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40882.patch",
"merged_at... | # What does this PR do?
attention_mask should be an optional tensor. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40882/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40882/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40881 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40881/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40881/comments | https://api.github.com/repos/huggingface/transformers/issues/40881/events | https://github.com/huggingface/transformers/pull/40881 | 3,417,234,200 | PR_kwDOCUB6oc6omTN1 | 40,881 | Update model tags and integration references in bug report | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | null | [] | 2025-09-15T10:04:08 | 2025-09-15T10:13:21 | 2025-09-15T10:08:29 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40881",
"html_url": "https://github.com/huggingface/transformers/pull/40881",
"diff_url": "https://github.com/huggingface/transformers/pull/40881.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40881.patch",
"merged_at... | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40881/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40881/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40880 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40880/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40880/comments | https://api.github.com/repos/huggingface/transformers/issues/40880/events | https://github.com/huggingface/transformers/pull/40880 | 3,417,172,630 | PR_kwDOCUB6oc6omF1e | 40,880 | Remove `runner_map` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | [] | closed | false | null | [] | null | [] | 2025-09-15T09:49:28 | 2025-09-16T13:18:10 | 2025-09-16T13:18:07 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40880",
"html_url": "https://github.com/huggingface/transformers/pull/40880",
"diff_url": "https://github.com/huggingface/transformers/pull/40880.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40880.patch",
"merged_at... | # What does this PR do?
This was added when we switched from T4 to A10. We tried to do it progressively but ended up doing it in one go (after a few days) because the results were confusing, in particular during the debug-and-fix phase.
That change also caused the `fsdp/trainer` job not being run d... | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40880/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40880/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40879 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40879/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40879/comments | https://api.github.com/repos/huggingface/transformers/issues/40879/events | https://github.com/huggingface/transformers/pull/40879 | 3,417,167,384 | PR_kwDOCUB6oc6omEq9 | 40,879 | [TimesFM] add TimesFM 2.5 | {
"login": "kashif",
"id": 8100,
"node_id": "MDQ6VXNlcjgxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kashif",
"html_url": "https://github.com/kashif",
"followers_url": "https://api.github.com/users/kashif/followers",
... | [] | closed | false | null | [] | null | [] | 2025-09-15T09:48:14 | 2025-09-24T10:24:25 | 2025-09-24T10:24:25 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40879",
"html_url": "https://github.com/huggingface/transformers/pull/40879",
"diff_url": "https://github.com/huggingface/transformers/pull/40879.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40879.patch",
"merged_at... | # What does this PR do?
Add TimesFM 2.5 model | {
"login": "kashif",
"id": 8100,
"node_id": "MDQ6VXNlcjgxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kashif",
"html_url": "https://github.com/kashif",
"followers_url": "https://api.github.com/users/kashif/followers",
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40879/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40879/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40878 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40878/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40878/comments | https://api.github.com/repos/huggingface/transformers/issues/40878/events | https://github.com/huggingface/transformers/pull/40878 | 3,417,102,746 | PR_kwDOCUB6oc6ol2no | 40,878 | Fix deta loading & dataclass | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-15T09:31:47 | 2025-09-15T15:23:14 | 2025-09-15T15:23:13 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40878",
"html_url": "https://github.com/huggingface/transformers/pull/40878",
"diff_url": "https://github.com/huggingface/transformers/pull/40878.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40878.patch",
"merged_at... | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/40853. It is always a bad idea to reassign the module's data; it should simply be manipulated in-place. We should not even have initialization schemes in `__init__`, but as the model is marked as deprecated, I only fixed it quickly instead of... | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40878/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40878/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40877 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40877/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40877/comments | https://api.github.com/repos/huggingface/transformers/issues/40877/events | https://github.com/huggingface/transformers/pull/40877 | 3,415,803,263 | PR_kwDOCUB6oc6ohfLm | 40,877 | Bug #40833: Fix for kv_offset calculation for mixed padding | {
"login": "preethamyerramsetty",
"id": 135053952,
"node_id": "U_kgDOCAzCgA",
"avatar_url": "https://avatars.githubusercontent.com/u/135053952?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/preethamyerramsetty",
"html_url": "https://github.com/preethamyerramsetty",
"followers_url": "https... | [] | open | false | null | [] | null | [] | 2025-09-15T00:18:46 | 2025-09-15T09:26:30 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40877",
"html_url": "https://github.com/huggingface/transformers/pull/40877",
"diff_url": "https://github.com/huggingface/transformers/pull/40877.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40877.patch",
"merged_at... | # What does this PR do?
This fixes the kv_offset calculation in `cache_utils.py` to handle left and mixed padding correctly. Previously, in the case of mixed left and right padding, the model could attend to padded tokens, which resulted in incorrect responses.
This PR ensures that the correct offset is used for left ... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40877/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40877/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40876 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40876/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40876/comments | https://api.github.com/repos/huggingface/transformers/issues/40876/events | https://github.com/huggingface/transformers/pull/40876 | 3,415,123,305 | PR_kwDOCUB6oc6ofTFH | 40,876 | Update shieldgemma2 model card | {
"login": "BryanBradfo",
"id": 101939095,
"node_id": "U_kgDOBhN3lw",
"avatar_url": "https://avatars.githubusercontent.com/u/101939095?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BryanBradfo",
"html_url": "https://github.com/BryanBradfo",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | null | [] | 2025-09-14T14:36:17 | 2025-09-18T17:19:26 | 2025-09-18T17:19:26 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40876",
"html_url": "https://github.com/huggingface/transformers/pull/40876",
"diff_url": "https://github.com/huggingface/transformers/pull/40876.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40876.patch",
"merged_at... | # What does this PR do?
This pull request updates the shieldgemma2.md model card to align with the new standardized format, as requested in issue https://github.com/huggingface/transformers/issues/36979.
The main changes include:
- Restructuring the document to follow the new standard layout.
- Adding a compreh... | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/ste... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40876/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40876/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40875 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40875/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40875/comments | https://api.github.com/repos/huggingface/transformers/issues/40875/events | https://github.com/huggingface/transformers/issues/40875 | 3,415,067,401 | I_kwDOCUB6oc7LjcsJ | 40,875 | ColPaliForRetrieval errors out when loaded in half precision dtypes | {
"login": "merveenoyan",
"id": 53175384,
"node_id": "MDQ6VXNlcjUzMTc1Mzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/53175384?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/merveenoyan",
"html_url": "https://github.com/merveenoyan",
"followers_url": "https://api.github.com/... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-14T13:52:53 | 2025-09-16T16:07:57 | 2025-09-16T16:07:57 | CONTRIBUTOR | null | null | null | null | ### System Info
transformers version: transformers==4.56.1
Here's the error; it can be fixed by setting dtype to float32. float16 and bfloat16 won't work.
```
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
[/tm... | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40875/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40875/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40874 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40874/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40874/comments | https://api.github.com/repos/huggingface/transformers/issues/40874/events | https://github.com/huggingface/transformers/issues/40874 | 3,414,901,848 | I_kwDOCUB6oc7Li0RY | 40,874 | Missing num_hidden_layers in T5GemmaConfig | {
"login": "kuihao",
"id": 56499195,
"node_id": "MDQ6VXNlcjU2NDk5MTk1",
"avatar_url": "https://avatars.githubusercontent.com/u/56499195?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kuihao",
"html_url": "https://github.com/kuihao",
"followers_url": "https://api.github.com/users/kuihao/fo... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-14T11:06:21 | 2025-10-01T14:55:56 | 2025-10-01T14:55:56 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.56.0
- Platform: Linux-5.15.0-151-generic-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.5.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: 0.17.5
- PyTorch version (accelerator?)... | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40874/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40874/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40873 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40873/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40873/comments | https://api.github.com/repos/huggingface/transformers/issues/40873/events | https://github.com/huggingface/transformers/pull/40873 | 3,414,710,903 | PR_kwDOCUB6oc6od9-8 | 40,873 | 🌐 [i18n-KO] Translated gemma3n.md to Korean | {
"login": "HyunZ118",
"id": 156191095,
"node_id": "U_kgDOCU9Jdw",
"avatar_url": "https://avatars.githubusercontent.com/u/156191095?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HyunZ118",
"html_url": "https://github.com/HyunZ118",
"followers_url": "https://api.github.com/users/HyunZ118/... | [] | closed | false | null | [] | null | [] | 2025-09-14T07:47:41 | 2025-10-17T16:57:05 | 2025-10-17T16:57:05 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40873",
"html_url": "https://github.com/huggingface/transformers/pull/40873",
"diff_url": "https://github.com/huggingface/transformers/pull/40873.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40873.patch",
"merged_at... | # What does this PR do?
Translated the gemma3n.md file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x]... | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/ste... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40873/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40873/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40872 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40872/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40872/comments | https://api.github.com/repos/huggingface/transformers/issues/40872/events | https://github.com/huggingface/transformers/pull/40872 | 3,414,530,812 | PR_kwDOCUB6oc6odY8S | 40,872 | Fixed a typo in "transformers/docs/source/en/perf_hardware.md" | {
"login": "j-harshana",
"id": 189495155,
"node_id": "U_kgDOC0t3cw",
"avatar_url": "https://avatars.githubusercontent.com/u/189495155?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/j-harshana",
"html_url": "https://github.com/j-harshana",
"followers_url": "https://api.github.com/users/j-h... | [] | closed | false | null | [] | null | [] | 2025-09-14T03:59:20 | 2025-09-15T11:58:28 | 2025-09-15T11:58:27 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40872",
"html_url": "https://github.com/huggingface/transformers/pull/40872",
"diff_url": "https://github.com/huggingface/transformers/pull/40872.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40872.patch",
"merged_at... | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40872/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40872/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40871 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40871/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40871/comments | https://api.github.com/repos/huggingface/transformers/issues/40871/events | https://github.com/huggingface/transformers/pull/40871 | 3,414,317,259 | PR_kwDOCUB6oc6ocrQ4 | 40,871 | Refactor benchmark utils: add type hints, GPU metrics helper, and con… | {
"login": "ProblemShooter",
"id": 171776292,
"node_id": "U_kgDOCj0ZJA",
"avatar_url": "https://avatars.githubusercontent.com/u/171776292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ProblemShooter",
"html_url": "https://github.com/ProblemShooter",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | null | [] | 2025-09-14T00:04:26 | 2025-09-23T11:48:16 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40871",
"html_url": "https://github.com/huggingface/transformers/pull/40871",
"diff_url": "https://github.com/huggingface/transformers/pull/40871.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40871.patch",
"merged_at... | Hi team 👋,
This PR refactors the benchmarking utility code to make it cleaner, more reliable, and easier to maintain. I’ve introduced a centralized collect_gpu_metrics() helper for GPU monitoring, added a validate() method in BenchmarkConfig to catch invalid configs early, and improved type hints for better readabi... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40871/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40871/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40870 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40870/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40870/comments | https://api.github.com/repos/huggingface/transformers/issues/40870/events | https://github.com/huggingface/transformers/pull/40870 | 3,414,290,692 | PR_kwDOCUB6oc6ocliV | 40,870 | Reduce vRAM usage during generation by allowing to transfer logits to CPU | {
"login": "SamuelBarryCS",
"id": 127697809,
"node_id": "U_kgDOB5yDkQ",
"avatar_url": "https://avatars.githubusercontent.com/u/127697809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SamuelBarryCS",
"html_url": "https://github.com/SamuelBarryCS",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | null | [] | 2025-09-13T23:36:29 | 2025-09-19T10:51:06 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40870",
"html_url": "https://github.com/huggingface/transformers/pull/40870",
"diff_url": "https://github.com/huggingface/transformers/pull/40870.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40870.patch",
"merged_at... | ## What
- Fixes https://github.com/huggingface/transformers/issues/40794 by adding a parameter offload_logits_to_cpu to GenerationConfig, which transfers logits and scores tensors to the CPU after generation.
- Frees up memory during large runs, trading decreased vRAM usage for CPU/GPU communication time, enabling p... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40870/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40870/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40869 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40869/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40869/comments | https://api.github.com/repos/huggingface/transformers/issues/40869/events | https://github.com/huggingface/transformers/pull/40869 | 3,414,150,461 | PR_kwDOCUB6oc6ocHQO | 40,869 | Bug: Fix device/dtype mismatch in DetaForObjectDetection bias initialization . | {
"login": "Aniketsy",
"id": 148300120,
"node_id": "U_kgDOCNbhWA",
"avatar_url": "https://avatars.githubusercontent.com/u/148300120?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Aniketsy",
"html_url": "https://github.com/Aniketsy",
"followers_url": "https://api.github.com/users/Aniketsy/... | [] | closed | false | null | [] | null | [] | 2025-09-13T21:13:14 | 2025-09-15T12:12:21 | 2025-09-15T12:12:21 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40869",
"html_url": "https://github.com/huggingface/transformers/pull/40869",
"diff_url": "https://github.com/huggingface/transformers/pull/40869.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40869.patch",
"merged_at... | #40853
This PR fixes a bug that prevented DETA object detection models from loading in recent Transformers versions due to a device/dtype mismatch when initializing self.class_embed.bias.data. The fix ensures the tensor is created on the correct device and with the correct dtype, resolving the error.
Please let me kno... | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40869/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40869/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40868 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40868/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40868/comments | https://api.github.com/repos/huggingface/transformers/issues/40868/events | https://github.com/huggingface/transformers/issues/40868 | 3,413,768,047 | I_kwDOCUB6oc7Lefdv | 40,868 | Review and update the Code of Conduct | {
"login": "wiwdep-netizen",
"id": 227674328,
"node_id": "U_kgDODZII2A",
"avatar_url": "https://avatars.githubusercontent.com/u/227674328?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wiwdep-netizen",
"html_url": "https://github.com/wiwdep-netizen",
"followers_url": "https://api.github.c... | [
{
"id": 9258341780,
"node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop",
"name": "Code agent slop",
"color": "C59579",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-09-13T16:18:52 | 2025-09-15T11:48:51 | 2025-09-15T11:48:51 | NONE | null | null | null | null | This issue tracks the review and potential update of the project's Code of Conduct. The goal is to ensure our code of conduct is clear, comprehensive, and reflects our community’s values.
Sub-issues will address:
- Identifying any gaps in the current Code of Conduct
- Proposing improvements or clarifications
- Impleme... | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40868/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40868/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40867 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40867/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40867/comments | https://api.github.com/repos/huggingface/transformers/issues/40867/events | https://github.com/huggingface/transformers/issues/40867 | 3,413,712,832 | I_kwDOCUB6oc7LeR_A | 40,867 | bug with AutoVideoProcessor for VJEPA 2 | {
"login": "FrancoisPorcher",
"id": 93766133,
"node_id": "U_kgDOBZbB9Q",
"avatar_url": "https://avatars.githubusercontent.com/u/93766133?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FrancoisPorcher",
"html_url": "https://github.com/FrancoisPorcher",
"followers_url": "https://api.github.... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-09-13T15:41:29 | 2025-10-01T08:55:15 | 2025-10-01T08:55:15 | NONE | null | null | null | null | ### System Info
### Bug Report
Hi,
The `AutoVideoProcessor` for **VJEPA 2** was working fine, but after upgrading `transformers` to **4.56.1** it stopped working.
I’m not sure if the issue comes from `accelerate` or from the `AutoVideoProcessor` itself.
---
### Code Snippet
```python
video = self.preprocessor(... | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40867/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40867/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40866 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40866/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40866/comments | https://api.github.com/repos/huggingface/transformers/issues/40866/events | https://github.com/huggingface/transformers/pull/40866 | 3,413,299,622 | PR_kwDOCUB6oc6oZQ8j | 40,866 | Updated the model card for TimeSformer | {
"login": "mreraser",
"id": 33192762,
"node_id": "MDQ6VXNlcjMzMTkyNzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/33192762?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mreraser",
"html_url": "https://github.com/mreraser",
"followers_url": "https://api.github.com/users/mre... | [] | closed | false | null | [] | null | [] | 2025-09-13T12:01:13 | 2025-09-18T17:23:52 | 2025-09-18T17:23:52 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40866",
"html_url": "https://github.com/huggingface/transformers/pull/40866",
"diff_url": "https://github.com/huggingface/transformers/pull/40866.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40866.patch",
"merged_at... | # What does this PR do?
As suggested in this issue - https://github.com/huggingface/transformers/issues/36979#issue-2947704577 - this PR updates the documentation of the [TimeSformer](https://huggingface.co/docs/transformers/main/model_doc/timesformer) model, which will now be aligned with the standardized format fo... | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/ste... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40866/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40866/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40865 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40865/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40865/comments | https://api.github.com/repos/huggingface/transformers/issues/40865/events | https://github.com/huggingface/transformers/pull/40865 | 3,413,237,731 | PR_kwDOCUB6oc6oZDJM | 40,865 | Updated the model card for ViViT | {
"login": "mreraser",
"id": 33192762,
"node_id": "MDQ6VXNlcjMzMTkyNzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/33192762?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mreraser",
"html_url": "https://github.com/mreraser",
"followers_url": "https://api.github.com/users/mre... | [] | closed | false | null | [] | null | [] | 2025-09-13T11:25:12 | 2025-09-18T17:23:38 | 2025-09-18T17:23:38 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40865",
"html_url": "https://github.com/huggingface/transformers/pull/40865",
"diff_url": "https://github.com/huggingface/transformers/pull/40865.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40865.patch",
"merged_at... | # What does this PR do?
As suggested in this issue - https://github.com/huggingface/transformers/issues/36979#issue-2947704577 - this PR updates the documentation of the [ViViT](https://huggingface.co/docs/transformers/main/model_doc/vivit) model, which will now be aligned with the standardized format for all the do... | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/ste... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40865/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40865/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40864 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40864/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40864/comments | https://api.github.com/repos/huggingface/transformers/issues/40864/events | https://github.com/huggingface/transformers/pull/40864 | 3,413,232,799 | PR_kwDOCUB6oc6oZCDh | 40,864 | remove dummy EncodingFast | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-13T11:21:48 | 2025-09-16T13:05:13 | 2025-09-16T12:56:11 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40864",
"html_url": "https://github.com/huggingface/transformers/pull/40864",
"diff_url": "https://github.com/huggingface/transformers/pull/40864.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40864.patch",
"merged_at... | # What does this PR do?
Remove the dummy EncodingFast class. It's safer to always use the real EncodingFast. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40864/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40864/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40863 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40863/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40863/comments | https://api.github.com/repos/huggingface/transformers/issues/40863/events | https://github.com/huggingface/transformers/pull/40863 | 3,412,862,715 | PR_kwDOCUB6oc6oX80r | 40,863 | [VisionEncoderDecoderModel] Update loss function | {
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | null | [] | 2025-09-13T08:12:18 | 2025-10-14T14:03:01 | 2025-10-14T14:03:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40863",
"html_url": "https://github.com/huggingface/transformers/pull/40863",
"diff_url": "https://github.com/huggingface/transformers/pull/40863.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40863.patch",
"merged_at... | # What does this PR do?
Models like Donut are currently broken on main, they can't be fine-tuned. In order to unblock users at #39473, this PR reverts #36753.
It looks like the `ForCausalLMLoss` class shifts the labels, however the VisionEncoderDecoderModel class does not expect shifted labels as seen [here](http... | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMar... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40863/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40863/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40862 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40862/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40862/comments | https://api.github.com/repos/huggingface/transformers/issues/40862/events | https://github.com/huggingface/transformers/pull/40862 | 3,412,784,458 | PR_kwDOCUB6oc6oXrlx | 40,862 | Redirect MI355 CI results to dummy dataset | {
"login": "ahadnagy",
"id": 21314428,
"node_id": "MDQ6VXNlcjIxMzE0NDI4",
"avatar_url": "https://avatars.githubusercontent.com/u/21314428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ahadnagy",
"html_url": "https://github.com/ahadnagy",
"followers_url": "https://api.github.com/users/aha... | [] | closed | false | null | [] | null | [] | 2025-09-13T07:17:13 | 2025-09-14T16:42:50 | 2025-09-14T16:42:50 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40862",
"html_url": "https://github.com/huggingface/transformers/pull/40862",
"diff_url": "https://github.com/huggingface/transformers/pull/40862.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40862.patch",
"merged_at... | # What does this PR do?
This PR will temporarily redirect the MI355 CI results to a dummy dataset until the runners become stable and the isolation of results in the main dataset is sorted out.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the ca... | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.c... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40862/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40862/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40861 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40861/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40861/comments | https://api.github.com/repos/huggingface/transformers/issues/40861/events | https://github.com/huggingface/transformers/pull/40861 | 3,412,667,357 | PR_kwDOCUB6oc6oXS_F | 40,861 | Support n_groups>1 for mamba2 | {
"login": "tdoublep",
"id": 7945038,
"node_id": "MDQ6VXNlcjc5NDUwMzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/7945038?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tdoublep",
"html_url": "https://github.com/tdoublep",
"followers_url": "https://api.github.com/users/tdoub... | [] | open | false | null | [] | null | [] | 2025-09-13T05:37:17 | 2025-09-15T11:04:07 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40861",
"html_url": "https://github.com/huggingface/transformers/pull/40861",
"diff_url": "https://github.com/huggingface/transformers/pull/40861.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40861.patch",
"merged_at... | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40861/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40861/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40860 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40860/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40860/comments | https://api.github.com/repos/huggingface/transformers/issues/40860/events | https://github.com/huggingface/transformers/pull/40860 | 3,412,355,199 | PR_kwDOCUB6oc6oWO8_ | 40,860 | Use torch.expm1 and torch.log1p for better numerical results | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-13T01:18:09 | 2025-09-15T12:13:39 | 2025-09-15T11:54:14 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40860",
"html_url": "https://github.com/huggingface/transformers/pull/40860",
"diff_url": "https://github.com/huggingface/transformers/pull/40860.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40860.patch",
"merged_at... | # What does this PR do?
Detected by TorchFix
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40860/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40860/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40859 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40859/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40859/comments | https://api.github.com/repos/huggingface/transformers/issues/40859/events | https://github.com/huggingface/transformers/pull/40859 | 3,412,097,343 | PR_kwDOCUB6oc6oVWXw | 40,859 | 🚨 [lightglue] fix: matches order changed because of early stopped indices | {
"login": "sbucaille",
"id": 24275548,
"node_id": "MDQ6VXNlcjI0Mjc1NTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sbucaille",
"html_url": "https://github.com/sbucaille",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | null | [] | 2025-09-12T22:39:30 | 2025-09-19T16:52:44 | 2025-09-19T15:41:22 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40859",
"html_url": "https://github.com/huggingface/transformers/pull/40859",
"diff_url": "https://github.com/huggingface/transformers/pull/40859.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40859.patch",
"merged_at... | # What does this PR do?
Fixes https://github.com/cvg/LightGlue/issues/171#issuecomment-3284295107
A bug is present in LightGlue when using batching.
The way early stopped indices were handled made the order of matches change.
Example :
```python
# Input being
[[image2, image0], [image2, image0], [image1, image1]]
... | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/fo... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40859/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40859/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40858 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40858/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40858/comments | https://api.github.com/repos/huggingface/transformers/issues/40858/events | https://github.com/huggingface/transformers/issues/40858 | 3,411,882,835 | I_kwDOCUB6oc7LXTNT | 40,858 | torch.no_grad() yields NaN values on mps device, 4D attention mask | {
"login": "AmitMY",
"id": 5757359,
"node_id": "MDQ6VXNlcjU3NTczNTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5757359?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AmitMY",
"html_url": "https://github.com/AmitMY",
"followers_url": "https://api.github.com/users/AmitMY/foll... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | open | false | null | [] | null | [] | 2025-09-12T20:40:36 | 2025-10-13T11:29:17 | null | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.56.0
- Platform: macOS-15.6.1-arm64-arm-64bit
- Python version: 3.12.2
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1 (NA... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40858/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40858/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40857 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40857/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40857/comments | https://api.github.com/repos/huggingface/transformers/issues/40857/events | https://github.com/huggingface/transformers/pull/40857 | 3,411,668,330 | PR_kwDOCUB6oc6oT4bA | 40,857 | Token | {
"login": "ArkVex",
"id": 159469387,
"node_id": "U_kgDOCYFPSw",
"avatar_url": "https://avatars.githubusercontent.com/u/159469387?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArkVex",
"html_url": "https://github.com/ArkVex",
"followers_url": "https://api.github.com/users/ArkVex/follower... | [] | open | false | null | [] | null | [] | 2025-09-12T19:10:26 | 2025-09-19T06:12:33 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40857",
"html_url": "https://github.com/huggingface/transformers/pull/40857",
"diff_url": "https://github.com/huggingface/transformers/pull/40857.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40857.patch",
"merged_at... | # What does this PR do?
This PR fixes the calculation of `train_tokens_per_second` when resuming training from a checkpoint. Previously, the metric was calculated using global state, which could result in unrealistically high values after resuming. Now, the timer and token counters are reset when resuming, so the me... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40857/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40857/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40856 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40856/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40856/comments | https://api.github.com/repos/huggingface/transformers/issues/40856/events | https://github.com/huggingface/transformers/pull/40856 | 3,411,615,799 | PR_kwDOCUB6oc6oTsup | 40,856 | 🔴Make `center_crop` fast equivalent to slow | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | null | [] | 2025-09-12T18:52:48 | 2025-09-16T16:01:39 | 2025-09-16T16:01:39 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40856",
"html_url": "https://github.com/huggingface/transformers/pull/40856",
"diff_url": "https://github.com/huggingface/transformers/pull/40856.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40856.patch",
"merged_at... | # What does this PR do?
Use a custom `center_crop` function to be equivalent to the one used in slow processors.
The only difference from the torchvision one is that instead of using `int(round(..))` to define `crop_top` and `crop_left`, which rounds halves towards the nearest even number, we just use `int(...)` to always round down.
... | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40856/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/40856/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40855 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40855/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40855/comments | https://api.github.com/repos/huggingface/transformers/issues/40855/events | https://github.com/huggingface/transformers/pull/40855 | 3,411,571,897 | PR_kwDOCUB6oc6oTjUM | 40,855 | [`VaultGemma`] Update expectations in integration tests | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/follow... | [] | closed | false | null | [] | null | [] | 2025-09-12T18:37:29 | 2025-09-15T10:46:32 | 2025-09-15T10:46:30 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40855",
"html_url": "https://github.com/huggingface/transformers/pull/40855",
"diff_url": "https://github.com/huggingface/transformers/pull/40855.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40855.patch",
"merged_at... | As per title
cc @Cyrilvallez @ArthurZucker | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40855/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40855/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40854 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40854/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40854/comments | https://api.github.com/repos/huggingface/transformers/issues/40854/events | https://github.com/huggingface/transformers/pull/40854 | 3,410,976,333 | PR_kwDOCUB6oc6oRdYJ | 40,854 | [tests] move generative tests away from `test_modeling_common.py` | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | null | [] | 2025-09-12T15:41:13 | 2025-09-12T16:15:16 | 2025-09-12T16:12:28 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40854",
"html_url": "https://github.com/huggingface/transformers/pull/40854",
"diff_url": "https://github.com/huggingface/transformers/pull/40854.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40854.patch",
"merged_at... | # What does this PR do?
skips 🔫 (104k -> 101k tests in `tests/models`)
Moves generative tests away from `test_modeling_common.py`, which should be reserved for generalist tests.
TL;DR:
- a test loops over `self.all_generative_model_classes` -> moved to `GenerationTesterMixin`
- a test relies on `AutoModelFo... | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40854/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40854/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40853 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40853/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40853/comments | https://api.github.com/repos/huggingface/transformers/issues/40853/events | https://github.com/huggingface/transformers/issues/40853 | 3,410,975,603 | I_kwDOCUB6oc7LT1tz | 40,853 | Top Three Models on Object Detection Leaderboard Won't Load on Newest Version | {
"login": "waylonflinn",
"id": 804108,
"node_id": "MDQ6VXNlcjgwNDEwOA==",
"avatar_url": "https://avatars.githubusercontent.com/u/804108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/waylonflinn",
"html_url": "https://github.com/waylonflinn",
"followers_url": "https://api.github.com/user... | [
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First... | closed | false | null | [] | null | [] | 2025-09-12T15:41:02 | 2025-09-15T15:59:44 | 2025-09-15T15:23:14 | NONE | null | null | null | null | ### System Info
The top three models on the Object Detection leaderboard won't load in the latest version.
"jozhang97/deta-swin-large"
"jozhang97/deta-resnet-50-24-epochs"
"jozhang97/deta-resnet-50"
@Cyrilvallez
Versions 4.27.1, 4.49.0 and 4.50.3 are confirmed to work.
Version 4.51.0 is broken and gives the follo... | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40853/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40853/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40852 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40852/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40852/comments | https://api.github.com/repos/huggingface/transformers/issues/40852/events | https://github.com/huggingface/transformers/pull/40852 | 3,410,791,535 | PR_kwDOCUB6oc6oQ04v | 40,852 | [test] Fix test_eager_matches_sdpa incorrectly skipped | {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers"... | [] | closed | false | null | [] | null | [] | 2025-09-12T14:45:42 | 2025-09-12T16:13:31 | 2025-09-12T16:07:48 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40852",
"html_url": "https://github.com/huggingface/transformers/pull/40852",
"diff_url": "https://github.com/huggingface/transformers/pull/40852.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40852.patch",
"merged_at... | # What does this PR do?
After the introduction of `TransformersKwargs`, test_eager_matches_sdpa is incorrectly skipped in the output_attentions case because the current `"output_attentions" in inspect.signature(model_sdpa.forward).parameters` is not enough to find `"output_attentions"` in typed kwargs.
Moreover, it seems... | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40852/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40852/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40851 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40851/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40851/comments | https://api.github.com/repos/huggingface/transformers/issues/40851/events | https://github.com/huggingface/transformers/pull/40851 | 3,410,754,020 | PR_kwDOCUB6oc6oQsqH | 40,851 | add: differential privacy research model | {
"login": "RyanMullins",
"id": 868555,
"node_id": "MDQ6VXNlcjg2ODU1NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/868555?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RyanMullins",
"html_url": "https://github.com/RyanMullins",
"followers_url": "https://api.github.com/user... | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-09-12T14:34:38 | 2025-09-12T15:36:04 | 2025-09-12T15:36:04 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40851",
"html_url": "https://github.com/huggingface/transformers/pull/40851",
"diff_url": "https://github.com/huggingface/transformers/pull/40851.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40851.patch",
"merged_at... | # What does this PR do?
This PR adds VaultGemma, an LLM trained with sequence-level differential privacy (DP).
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface... | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40851/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40851/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40850 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40850/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40850/comments | https://api.github.com/repos/huggingface/transformers/issues/40850/events | https://github.com/huggingface/transformers/pull/40850 | 3,410,619,954 | PR_kwDOCUB6oc6oQPD6 | 40,850 | Fix loading logic flaw with regards to unexpected and missing keys | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-12T13:57:58 | 2025-09-24T14:44:44 | 2025-09-24T14:44:43 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40850",
"html_url": "https://github.com/huggingface/transformers/pull/40850",
"diff_url": "https://github.com/huggingface/transformers/pull/40850.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40850.patch",
"merged_at... | As per tests, currently there were some paths failing when loading a checkpoint with unrelated weights (which are well defined in the "unexpected weights on load"). This makes sure that this flag is respected and to ignore such weights. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40850/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40850/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40849 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40849/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40849/comments | https://api.github.com/repos/huggingface/transformers/issues/40849/events | https://github.com/huggingface/transformers/issues/40849 | 3,410,506,076 | I_kwDOCUB6oc7LSDFc | 40,849 | The chat prompt template for google/gemma-3-270m-it omits the system message. | {
"login": "umang-0801",
"id": 178144901,
"node_id": "U_kgDOCp5GhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/178144901?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/umang-0801",
"html_url": "https://github.com/umang-0801",
"followers_url": "https://api.github.com/users/uma... | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | open | false | null | [] | null | [] | 2025-09-12T13:24:13 | 2025-10-15T00:42:10 | null | NONE | null | null | null | null | ### System Info
Objective: To use [`google/gemma-3-270m-it`](https://huggingface.co/google/gemma-3-270m-it) for a chat application.
Problem Description: The tokenizer provides an `apply_chat_template` function which
1. Fails to include the system message in the prompt template with the right tag.
To reproduce this... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40849/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/40849/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40848 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40848/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40848/comments | https://api.github.com/repos/huggingface/transformers/issues/40848/events | https://github.com/huggingface/transformers/pull/40848 | 3,410,424,390 | PR_kwDOCUB6oc6oPjwJ | 40,848 | [Qwen3 Next] Use numerically stable `rsqrt` | {
"login": "thalahors",
"id": 178652170,
"node_id": "U_kgDOCqYECg",
"avatar_url": "https://avatars.githubusercontent.com/u/178652170?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thalahors",
"html_url": "https://github.com/thalahors",
"followers_url": "https://api.github.com/users/thalah... | [] | closed | false | null | [] | null | [] | 2025-09-12T13:05:26 | 2025-09-15T13:06:09 | 2025-09-15T10:45:14 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40848",
"html_url": "https://github.com/huggingface/transformers/pull/40848",
"diff_url": "https://github.com/huggingface/transformers/pull/40848.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40848.patch",
"merged_at... | # What does this PR do?
Uses numerically stable `rsqrt`
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull... | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40848/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40848/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40847 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40847/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40847/comments | https://api.github.com/repos/huggingface/transformers/issues/40847/events | https://github.com/huggingface/transformers/pull/40847 | 3,410,415,851 | PR_kwDOCUB6oc6oPh2C | 40,847 | [config] accept non-full dtypes | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | null | [] | 2025-09-12T13:03:44 | 2025-09-12T13:07:55 | 2025-09-12T13:07:45 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40847",
"html_url": "https://github.com/huggingface/transformers/pull/40847",
"diff_url": "https://github.com/huggingface/transformers/pull/40847.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40847.patch",
"merged_at... | # What does this PR do?
Fixes the error in the following script:
(root cause: the existing logic was expecting things like `torch_dtype="torch.float32"`, as opposed to `torch_dtype="float32"`. With this PR, we now accept both)
```py
from transformers import AutoConfig
config = AutoConfig.from_pretrained("BAAI/Emu... | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40847/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40847/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40846 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40846/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40846/comments | https://api.github.com/repos/huggingface/transformers/issues/40846/events | https://github.com/huggingface/transformers/pull/40846 | 3,410,112,269 | PR_kwDOCUB6oc6oOf4G | 40,846 | [tests] re-enable aria fast tests | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | null | [] | 2025-09-12T11:34:17 | 2025-09-12T16:43:58 | 2025-09-12T14:14:54 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40846",
"html_url": "https://github.com/huggingface/transformers/pull/40846",
"diff_url": "https://github.com/huggingface/transformers/pull/40846.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40846.patch",
"merged_at... | # What does this PR do?
Aria tests were decorated with `@slow` in #38615 because they were slow. The tests were slow because the test model was quite large.
This PR:
- reduces the size of the test aria model and removes `@slow` (non-slow test runtime reduced by 66% on my machine, ~1min -> ~20 secs)
- fixes thin... | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40846/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40846/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40845 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40845/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40845/comments | https://api.github.com/repos/huggingface/transformers/issues/40845/events | https://github.com/huggingface/transformers/pull/40845 | 3,410,085,005 | PR_kwDOCUB6oc6oOaCl | 40,845 | Fix typos in src and tests | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-12T11:24:42 | 2025-09-19T13:22:02 | 2025-09-19T13:18:38 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40845",
"html_url": "https://github.com/huggingface/transformers/pull/40845",
"diff_url": "https://github.com/huggingface/transformers/pull/40845.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40845.patch",
"merged_at... | # What does this PR do?
Fix more typos
## Before submitting
- [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
... | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.githu... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40845/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40845/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40844 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40844/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40844/comments | https://api.github.com/repos/huggingface/transformers/issues/40844/events | https://github.com/huggingface/transformers/pull/40844 | 3,409,911,797 | PR_kwDOCUB6oc6oN0D7 | 40,844 | Use checkpoint in auto_class_docstring | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyeve... | [] | closed | false | null | [] | null | [] | 2025-09-12T10:26:10 | 2025-09-13T00:52:46 | 2025-09-13T00:49:19 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40844",
"html_url": "https://github.com/huggingface/transformers/pull/40844",
"diff_url": "https://github.com/huggingface/transformers/pull/40844.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40844.patch",
"merged_at... | # What does this PR do?
Using checkpoint in auto_class_docstring. | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/use... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40844/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40844/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40843 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40843/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40843/comments | https://api.github.com/repos/huggingface/transformers/issues/40843/events | https://github.com/huggingface/transformers/pull/40843 | 3,409,772,966 | PR_kwDOCUB6oc6oNVIx | 40,843 | Add VideoProcessors to auto-backend requirements | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | null | [] | 2025-09-12T09:51:09 | 2025-09-12T10:21:14 | 2025-09-12T10:21:12 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40843",
"html_url": "https://github.com/huggingface/transformers/pull/40843",
"diff_url": "https://github.com/huggingface/transformers/pull/40843.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40843.patch",
"merged_at... | # What does this PR do?
As per the title. They have the same requirements as fast image processors | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40843/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40843/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40842 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40842/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40842/comments | https://api.github.com/repos/huggingface/transformers/issues/40842/events | https://github.com/huggingface/transformers/pull/40842 | 3,409,590,841 | PR_kwDOCUB6oc6oMspB | 40,842 | Fix the misalignment between the l2norm in GDN of Qwen3-Next and the implementation in the FLA library. | {
"login": "bozheng-hit",
"id": 8787969,
"node_id": "MDQ6VXNlcjg3ODc5Njk=",
"avatar_url": "https://avatars.githubusercontent.com/u/8787969?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bozheng-hit",
"html_url": "https://github.com/bozheng-hit",
"followers_url": "https://api.github.com/us... | [] | closed | false | null | [] | null | [] | 2025-09-12T09:08:35 | 2025-09-12T12:08:01 | 2025-09-12T12:08:01 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40842",
"html_url": "https://github.com/huggingface/transformers/pull/40842",
"diff_url": "https://github.com/huggingface/transformers/pull/40842.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40842.patch",
"merged_at... | Fix the misalignment between the l2norm in GDN of Qwen3-Next and the implementation in the FLA library. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40842/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40842/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40841 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40841/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40841/comments | https://api.github.com/repos/huggingface/transformers/issues/40841/events | https://github.com/huggingface/transformers/issues/40841 | 3,409,482,415 | I_kwDOCUB6oc7LOJKv | 40,841 | Add Support for Ovis2.5 Multi-Modal Model | {
"login": "xschen-beb",
"id": 61721839,
"node_id": "MDQ6VXNlcjYxNzIxODM5",
"avatar_url": "https://avatars.githubusercontent.com/u/61721839?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xschen-beb",
"html_url": "https://github.com/xschen-beb",
"followers_url": "https://api.github.com/use... | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | open | false | null | [] | null | [] | 2025-09-12T08:41:30 | 2025-09-16T11:18:58 | null | NONE | null | null | null | null | ### Model description
Key Features:
Small Model Performance: Optimized training strategies enable small-scale models to achieve higher capability density, demonstrating cross-tier leading advantages.
Enhanced Reasoning Capabilities: Significantly strengthens Chain-of-Thought (CoT) reasoning abilities through the com... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40841/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40841/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |