url string | repository_url string | labels_url string | comments_url string | events_url string | html_url string | id int64 | node_id string | number int64 | title string | user dict | labels list | state string | locked bool | assignee dict | assignees list | milestone null | comments list | created_at timestamp[ms] | updated_at timestamp[ms] | closed_at timestamp[ms] | author_association string | type dict | active_lock_reason null | draft bool | pull_request dict | body string | closed_by dict | reactions dict | timeline_url string | performed_via_github_app null | state_reason string | sub_issues_summary dict | issue_dependencies_summary dict | is_pull_request bool | is_closed bool |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/38934 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38934/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38934/comments | https://api.github.com/repos/huggingface/transformers/issues/38934/events | https://github.com/huggingface/transformers/pull/38934 | 3,162,508,947 | PR_kwDOCUB6oc6bX7L5 | 38,934 | Use newer typing notation | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-20T09:15:51 | 2025-07-17T13:41:06 | 2025-07-17T13:05:21 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38934",
"html_url": "https://github.com/huggingface/transformers/pull/38934",
"diff_url": "https://github.com/huggingface/transformers/pull/38934.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38934.patch",
"merged_at": "2025-07-17T13:05:21"
} | # What does this PR do?
Use newer typing for source files with `from __future__ import annotations`. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38934/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38934/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38933 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38933/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38933/comments | https://api.github.com/repos/huggingface/transformers/issues/38933/events | https://github.com/huggingface/transformers/issues/38933 | 3,162,478,946 | I_kwDOCUB6oc68f5li | 38,933 | ddp_time in TrainingArguments with deepspeed does not work | {
"login": "kindernerd",
"id": 36913314,
"node_id": "MDQ6VXNlcjM2OTEzMzE0",
"avatar_url": "https://avatars.githubusercontent.com/u/36913314?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kindernerd",
"html_url": "https://github.com/kindernerd",
"followers_url": "https://api.github.com/users/kindernerd/followers",
"following_url": "https://api.github.com/users/kindernerd/following{/other_user}",
"gists_url": "https://api.github.com/users/kindernerd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kindernerd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kindernerd/subscriptions",
"organizations_url": "https://api.github.com/users/kindernerd/orgs",
"repos_url": "https://api.github.com/users/kindernerd/repos",
"events_url": "https://api.github.com/users/kindernerd/events{/privacy}",
"received_events_url": "https://api.github.com/users/kindernerd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-20T09:06:09 | 2025-09-11T10:22:59 | 2025-07-09T09:43:29 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.4
- Platform: Linux-5.15.0-72-generic-x86_64-with-glibc2.35
- Python version: 3.11.5
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.4.5
- Accelerate version: 1.8.0
- Accelerate config: not found
- DeepSpeed version: 0.17.1
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: NVIDIA A100-SXM4-80GB
### Who can help?
@SunMarc @zach-huggingface
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
`args = TrainingArguments(ddp_timeout=17)`
`ddp_timeout` does not work; training still times out after the default 600 seconds.
Actually, I found that in `torch.distributed.init_process_group` the timeout is passed in correctly, but in the `new_group` function the timeout is `None` again.
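To make the propagation gap concrete, here is a minimal pure-Python sketch (these are my own stand-ins, not the actual `torch.distributed` functions):

```python
from datetime import timedelta

DEFAULT_TIMEOUT = timedelta(seconds=600)

def init_process_group(timeout=DEFAULT_TIMEOUT):
    # TrainingArguments(ddp_timeout=17) does reach this call correctly
    return {"timeout": timeout}

def new_group(timeout=None):
    # bug pattern: the timeout from init_process_group is not forwarded,
    # so it silently falls back to the 600 s default
    return {"timeout": timeout if timeout is not None else DEFAULT_TIMEOUT}

main_group = init_process_group(timeout=timedelta(seconds=17))
sub_group = new_group()  # timeout not forwarded here
print(main_group["timeout"], sub_group["timeout"])  # 0:00:17 0:10:00
```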
### Expected behavior
The timeout should be consistent with the `ddp_timeout` in `TrainingArguments`.
Same issue as in https://github.com/huggingface/transformers/issues/32036 | {
"login": "kindernerd",
"id": 36913314,
"node_id": "MDQ6VXNlcjM2OTEzMzE0",
"avatar_url": "https://avatars.githubusercontent.com/u/36913314?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kindernerd",
"html_url": "https://github.com/kindernerd",
"followers_url": "https://api.github.com/users/kindernerd/followers",
"following_url": "https://api.github.com/users/kindernerd/following{/other_user}",
"gists_url": "https://api.github.com/users/kindernerd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kindernerd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kindernerd/subscriptions",
"organizations_url": "https://api.github.com/users/kindernerd/orgs",
"repos_url": "https://api.github.com/users/kindernerd/repos",
"events_url": "https://api.github.com/users/kindernerd/events{/privacy}",
"received_events_url": "https://api.github.com/users/kindernerd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38933/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38933/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38932 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38932/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38932/comments | https://api.github.com/repos/huggingface/transformers/issues/38932/events | https://github.com/huggingface/transformers/pull/38932 | 3,162,463,798 | PR_kwDOCUB6oc6bXxKj | 38,932 | Fix more flaky `test_initialization` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-20T09:01:16 | 2025-06-20T15:28:34 | 2025-06-20T15:28:32 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38932",
"html_url": "https://github.com/huggingface/transformers/pull/38932",
"diff_url": "https://github.com/huggingface/transformers/pull/38932.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38932.patch",
"merged_at": "2025-06-20T15:28:32"
} | # What does this PR do?
Same as in #38607
> tests/models/depth_pro/test_modeling_depth_pro.py::DepthProModelTest::test_initialization
run 300 times: number of failures:
before: 11
after: 0
(for a failing job, see https://app.circleci.com/pipelines/github/huggingface/transformers/134769/workflows/5704b318-b0c7-412c-9381-c223cda35cce/jobs/1787781/parallel-runs/2)
> tests/models/swin2sr/test_modeling_swin2sr.py::Swin2SRModelTest::test_initialization
run 300 times: number of failures:
before: 2
after: 0 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38932/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38932/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38931 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38931/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38931/comments | https://api.github.com/repos/huggingface/transformers/issues/38931/events | https://github.com/huggingface/transformers/pull/38931 | 3,162,419,979 | PR_kwDOCUB6oc6bXndq | 38,931 | Skip some tests for now | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-20T08:46:34 | 2025-06-20T09:06:11 | 2025-06-20T09:05:49 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38931",
"html_url": "https://github.com/huggingface/transformers/pull/38931",
"diff_url": "https://github.com/huggingface/transformers/pull/38931.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38931.patch",
"merged_at": "2025-06-20T09:05:49"
} | # What does this PR do?
We should revert once those are fixed on the `datasets` side. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38931/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38931/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38930 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38930/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38930/comments | https://api.github.com/repos/huggingface/transformers/issues/38930/events | https://github.com/huggingface/transformers/pull/38930 | 3,162,353,878 | PR_kwDOCUB6oc6bXZCk | 38,930 | [qwen] refactor attentions for vision/audio | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-20T08:22:17 | 2025-06-24T08:53:53 | 2025-06-24T08:53:53 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38930",
"html_url": "https://github.com/huggingface/transformers/pull/38930",
"diff_url": "https://github.com/huggingface/transformers/pull/38930.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38930.patch",
"merged_at": "2025-06-24T08:53:52"
} | # What does this PR do?
As per title, these modalities were left out when refactoring VLMs earlier | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38930/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38930/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38929 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38929/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38929/comments | https://api.github.com/repos/huggingface/transformers/issues/38929/events | https://github.com/huggingface/transformers/pull/38929 | 3,162,225,507 | PR_kwDOCUB6oc6bW9ZD | 38,929 | Enable XPU doc | {
"login": "jiqing-feng",
"id": 107918818,
"node_id": "U_kgDOBm614g",
"avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiqing-feng",
"html_url": "https://github.com/jiqing-feng",
"followers_url": "https://api.github.com/users/jiqing-feng/followers",
"following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}",
"gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions",
"organizations_url": "https://api.github.com/users/jiqing-feng/orgs",
"repos_url": "https://api.github.com/users/jiqing-feng/repos",
"events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiqing-feng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-20T07:28:52 | 2025-06-30T14:56:55 | 2025-06-30T14:56:55 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38929",
"html_url": "https://github.com/huggingface/transformers/pull/38929",
"diff_url": "https://github.com/huggingface/transformers/pull/38929.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38929.patch",
"merged_at": "2025-06-30T14:56:55"
} | Hi @SunMarc . This PR enables XPU torchao example in the doc, and also fixed a minor bug on the doc. Please review it. Thanks! | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38929/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38929/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38928 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38928/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38928/comments | https://api.github.com/repos/huggingface/transformers/issues/38928/events | https://github.com/huggingface/transformers/issues/38928 | 3,162,195,544 | I_kwDOCUB6oc68e0ZY | 38,928 | Improve CI/CD by completing migration from setup.py to pyproject.toml | {
"login": "ParagEkbote",
"id": 69567729,
"node_id": "MDQ6VXNlcjY5NTY3NzI5",
"avatar_url": "https://avatars.githubusercontent.com/u/69567729?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParagEkbote",
"html_url": "https://github.com/ParagEkbote",
"followers_url": "https://api.github.com/users/ParagEkbote/followers",
"following_url": "https://api.github.com/users/ParagEkbote/following{/other_user}",
"gists_url": "https://api.github.com/users/ParagEkbote/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParagEkbote/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParagEkbote/subscriptions",
"organizations_url": "https://api.github.com/users/ParagEkbote/orgs",
"repos_url": "https://api.github.com/users/ParagEkbote/repos",
"events_url": "https://api.github.com/users/ParagEkbote/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParagEkbote/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-20T07:15:52 | 2025-08-12T12:54:30 | 2025-08-12T12:54:30 | CONTRIBUTOR | null | null | null | null | Since the release of [PEP 621](https://peps.python.org/pep-0621/), the .toml file has been increasingly adopted for its simpler usage and enhanced functionality. I believe that by adopting the .toml file for packaging and dependency management as well, we can make the process of publishing future release versions simpler.
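For illustration, a minimal PEP 621 metadata block (a sketch only; the field values here are illustrative assumptions, not the actual transformers configuration):

```toml
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "transformers"
requires-python = ">=3.9"
# version and dependencies can stay dynamic during the migration
dynamic = ["version", "dependencies"]
```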
For the upcoming release of transformers v5.0, I think the CI/CD practices of the project can move beyond the allennlp package, which has been in read-only mode for a couple of years. WDYT?
cc: @ydshieh, @ivarflakstad | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38928/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38928/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38927 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38927/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38927/comments | https://api.github.com/repos/huggingface/transformers/issues/38927/events | https://github.com/huggingface/transformers/issues/38927 | 3,162,152,366 | I_kwDOCUB6oc68ep2u | 38,927 | Can't load my LoRA checkpoint after gemma3 refactor | {
"login": "jood-canva",
"id": 206628664,
"node_id": "U_kgDODFDnOA",
"avatar_url": "https://avatars.githubusercontent.com/u/206628664?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jood-canva",
"html_url": "https://github.com/jood-canva",
"followers_url": "https://api.github.com/users/jood-canva/followers",
"following_url": "https://api.github.com/users/jood-canva/following{/other_user}",
"gists_url": "https://api.github.com/users/jood-canva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jood-canva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jood-canva/subscriptions",
"organizations_url": "https://api.github.com/users/jood-canva/orgs",
"repos_url": "https://api.github.com/users/jood-canva/repos",
"events_url": "https://api.github.com/users/jood-canva/events{/privacy}",
"received_events_url": "https://api.github.com/users/jood-canva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-20T06:59:34 | 2025-10-07T18:53:15 | 2025-07-10T06:12:42 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.4
- Platform: Linux-6.8.0-1029-aws-x86_64-with-glibc2.35
- Python version: 3.10.15
- Huggingface_hub version: 0.32.2
- Safetensors version: 0.4.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes but not relevant here, it happens on single gpu too
- Using GPU in script?: yes but same error on cpu only
- GPU type: NVIDIA L40S
### Who can help?
Hi @ArthurZucker and @zucchini-nlp
I am using my own implementation of `Gemma3ForConditionalGeneration`. I was using transformers 4.50 for a while and upgraded to 4.52.4. After the update I realised that the `Gemma3ForConditionalGeneration` implementation had changed. Mostly `self.language_model` became `self.model`.
The issue is that when I use `PeftModel.from_pretrained` on my old LoRA checkpoint, it can't find the weights and I get a bunch of
```
Found missing adapter keys while loading the checkpoint: ['base_model.model.model.language_model.layers.0.self_attn.q_proj.lora_A.default.weight', 'base_model.model.model.language_model.layers.0.self_attn.q_proj.lora_B.default.weight', ...
```
I thought the `_checkpoint_conversion_mapping` [attribute](https://github.com/huggingface/transformers/blob/v4.52.4/src/transformers/models/gemma3/modeling_gemma3.py#L1236) would be enough but it isn't. Is there an easy way I can still use my old checkpoint?
Thanks in advance for your help, I really appreciate all the effort you guys make, and sorry if this was explained somewhere in the documentation!
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
I have a custom Gemma model:
```
class MyCustomiGemma(Gemma3ForConditionalGeneration):
    _checkpoint_conversion_mapping = {
        "^language_model.model": "model.language_model",
        "^vision_tower": "model.vision_tower",
        "^multi_modal_projector": "model.multi_modal_projector",
        "^language_model.lm_head": "lm_head",
    }

    def __init__(
        self,
        config: Gemma3Config,
    ):
        super().__init__(config)
        self.vocab_size = config.text_config.vocab_size
        self.model = Gemma3Model(config)
        self.lm_head = nn.Linear(
            config.text_config.hidden_size, config.text_config.vocab_size, bias=False
        )
        self.another_head = nn.Linear(...)
        self.post_init()
```
When using
```
base_model = MyCustomiGemma.from_pretrained()
model = PeftModel.from_pretrained(
    base_model,
    checkpoint_path,
    is_trainable=True,
)
```
I get the `Found missing adapter keys while loading the checkpoint:` warning for all my LoRAs
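The rename itself looks mechanical, so as a stopgap I experimented with rewriting the old adapter keys before loading (a rough sketch based on my assumption that only the module prefix changed; `remap_adapter_key` is my own helper, not a peft API):

```python
import re

# assumed prefix renames (old layout -> new layout after the refactor)
OLD_TO_NEW = {
    r"^base_model\.model\.language_model\.model\.": "base_model.model.model.language_model.",
    r"^base_model\.model\.language_model\.lm_head\.": "base_model.model.lm_head.",
}

def remap_adapter_key(key: str) -> str:
    for pattern, replacement in OLD_TO_NEW.items():
        new_key, n = re.subn(pattern, replacement, key)
        if n:
            return new_key
    return key

old = "base_model.model.language_model.model.layers.0.self_attn.q_proj.lora_A.default.weight"
print(remap_adapter_key(old))
# base_model.model.model.language_model.layers.0.self_attn.q_proj.lora_A.default.weight
```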
### Expected behavior
I think the issue is just a name mapping, and I thought it would be backwards compatible. | {
"login": "jood-canva",
"id": 206628664,
"node_id": "U_kgDODFDnOA",
"avatar_url": "https://avatars.githubusercontent.com/u/206628664?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jood-canva",
"html_url": "https://github.com/jood-canva",
"followers_url": "https://api.github.com/users/jood-canva/followers",
"following_url": "https://api.github.com/users/jood-canva/following{/other_user}",
"gists_url": "https://api.github.com/users/jood-canva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jood-canva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jood-canva/subscriptions",
"organizations_url": "https://api.github.com/users/jood-canva/orgs",
"repos_url": "https://api.github.com/users/jood-canva/repos",
"events_url": "https://api.github.com/users/jood-canva/events{/privacy}",
"received_events_url": "https://api.github.com/users/jood-canva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38927/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38927/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38926 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38926/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38926/comments | https://api.github.com/repos/huggingface/transformers/issues/38926/events | https://github.com/huggingface/transformers/pull/38926 | 3,161,978,970 | PR_kwDOCUB6oc6bWHqo | 38,926 | Clarify Python and framework version support in installation.md | {
"login": "dhyeyinf",
"id": 131277481,
"node_id": "U_kgDOB9MiqQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131277481?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhyeyinf",
"html_url": "https://github.com/dhyeyinf",
"followers_url": "https://api.github.com/users/dhyeyinf/followers",
"following_url": "https://api.github.com/users/dhyeyinf/following{/other_user}",
"gists_url": "https://api.github.com/users/dhyeyinf/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhyeyinf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhyeyinf/subscriptions",
"organizations_url": "https://api.github.com/users/dhyeyinf/orgs",
"repos_url": "https://api.github.com/users/dhyeyinf/repos",
"events_url": "https://api.github.com/users/dhyeyinf/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhyeyinf/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-20T05:35:16 | 2025-06-20T13:42:48 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38926",
"html_url": "https://github.com/huggingface/transformers/pull/38926",
"diff_url": "https://github.com/huggingface/transformers/pull/38926.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38926.patch",
"merged_at": null
} | This PR improves the installation documentation by:
- Clarifying that Transformers supports Python 3.7 and above
- Listing the supported versions of PyTorch, TensorFlow, and Flax
This helps new users better understand environment compatibility. No functional changes.
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38926/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38926/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38925 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38925/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38925/comments | https://api.github.com/repos/huggingface/transformers/issues/38925/events | https://github.com/huggingface/transformers/issues/38925 | 3,161,833,145 | I_kwDOCUB6oc68db65 | 38,925 | Checkpointing broken for classifier training multi-gpu | {
"login": "ojh31",
"id": 67026888,
"node_id": "MDQ6VXNlcjY3MDI2ODg4",
"avatar_url": "https://avatars.githubusercontent.com/u/67026888?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ojh31",
"html_url": "https://github.com/ojh31",
"followers_url": "https://api.github.com/users/ojh31/followers",
"following_url": "https://api.github.com/users/ojh31/following{/other_user}",
"gists_url": "https://api.github.com/users/ojh31/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ojh31/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ojh31/subscriptions",
"organizations_url": "https://api.github.com/users/ojh31/orgs",
"repos_url": "https://api.github.com/users/ojh31/repos",
"events_url": "https://api.github.com/users/ojh31/events{/privacy}",
"received_events_url": "https://api.github.com/users/ojh31/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-20T03:48:11 | 2025-07-28T08:02:59 | 2025-07-28T08:02:59 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.4
- Platform: Linux-5.15.0-141-generic-x86_64-with-glibc2.35
- Python version: 3.11.11
- Huggingface_hub version: 0.33.0
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
@zach-huggingface @SunMarc
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Script:
```python
"""Simple IMDB binary classification training script using Qwen3-14B and HF Trainer."""
import torch
from datasets import DatasetDict, load_dataset
from transformers import (
AutoTokenizer,
AutoModelForSequenceClassification,
)
from transformers.training_args import TrainingArguments
from transformers.trainer import Trainer
from transformers.data.data_collator import DataCollatorWithPadding
from peft import LoraConfig, get_peft_model, TaskType
def tokenize_function(examples, tokenizer, max_length=512):
"""Tokenize the text data."""
return tokenizer(
examples["text"],
truncation=True,
padding=False,
max_length=max_length,
)
def main():
# Model and tokenizer setup
model_name = "Qwen/Qwen3-14B"
print(f"Loading tokenizer from {model_name}...")
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
# Set pad token if not present
if tokenizer.pad_token is None:
tokenizer.pad_token = tokenizer.eos_token
print(f"Loading model from {model_name}...")
model = AutoModelForSequenceClassification.from_pretrained(
model_name,
num_labels=2,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
)
# Set pad_token_id in model config
model.config.pad_token_id = tokenizer.pad_token_id
# Configure LoRA
print("Adding LoRA adapters...")
lora_config = LoraConfig(
r=16,
lora_alpha=32,
target_modules=["q_proj", "v_proj", "k_proj", "o_proj"],
lora_dropout=0.1,
bias="none",
task_type=TaskType.SEQ_CLS,
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# Load IMDB dataset
print("Loading IMDB dataset...")
dataset = load_dataset("imdb")
# Tokenize dataset
print("Tokenizing dataset...")
tokenized_dataset = dataset.map(
lambda examples: tokenize_function(examples, tokenizer),
batched=True,
remove_columns=["text"],
)
assert isinstance(tokenized_dataset, DatasetDict)
# Data collator
data_collator = DataCollatorWithPadding(tokenizer=tokenizer)
# Training arguments
training_args = TrainingArguments(
output_dir="./results_imdb_qwen14b",
num_train_epochs=1,
per_device_train_batch_size=2,
per_device_eval_batch_size=4,
gradient_accumulation_steps=16,
log_level="debug",
save_strategy="steps",
save_steps=1,
bf16=True,
)
# Initialize trainer
trainer = Trainer(
model=model,
args=training_args,
train_dataset=tokenized_dataset["train"],
eval_dataset=tokenized_dataset["test"],
processing_class=tokenizer,
data_collator=data_collator,
)
# Train the model
print("Starting training...")
trainer.train()
if __name__ == "__main__":
main()
```
Launch with `accelerate launch --num_processes=2 --config_file=accelerate_config.yaml script.py`
```yaml
# accelerate config
compute_environment: LOCAL_MACHINE
debug: false
# We want FSDP to shard model parameters between devices.
distributed_type: FSDP
downcast_bf16: "no"
fsdp_config:
fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
fsdp_backward_prefetch: BACKWARD_PRE
fsdp_cpu_ram_efficient_loading: true
fsdp_forward_prefetch: false
fsdp_offload_params: false
fsdp_sharding_strategy: FULL_SHARD
fsdp_state_dict_type: FULL_STATE_DICT
fsdp_sync_module_states: true
fsdp_use_orig_params: true
machine_rank: 0
main_training_function: main
# Previously we used "fp16" since that is how Pythia models were trained,
# but decided to turn off mixed precision for simplicity.
mixed_precision: "no"
num_machines: 1
# We overwrite this with a CLI argument
num_processes: 1
rdzv_backend: static
same_network: true
tpu_env: []
tpu_use_cluster: false
tpu_use_sudo: false
use_cpu: false
```
### Expected behavior
The trainer should save a checkpoint to the local `results_imdb_qwen14b` folder after a single step, but instead it hangs and eventually there is an NCCL timeout. The code never gets past the first call to `save_fsdp_model` -> `_get_model_state_dict` -> `get_peft_model_state_dict` -> `model.state_dict()`. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38925/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38925/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38924 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38924/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38924/comments | https://api.github.com/repos/huggingface/transformers/issues/38924/events | https://github.com/huggingface/transformers/issues/38924 | 3,161,532,906 | I_kwDOCUB6oc68cSnq | 38,924 | Exporting Llava decoder into ONNX format | {
"login": "EricJi150",
"id": 73372943,
"node_id": "MDQ6VXNlcjczMzcyOTQz",
"avatar_url": "https://avatars.githubusercontent.com/u/73372943?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/EricJi150",
"html_url": "https://github.com/EricJi150",
"followers_url": "https://api.github.com/users/EricJi150/followers",
"following_url": "https://api.github.com/users/EricJi150/following{/other_user}",
"gists_url": "https://api.github.com/users/EricJi150/gists{/gist_id}",
"starred_url": "https://api.github.com/users/EricJi150/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/EricJi150/subscriptions",
"organizations_url": "https://api.github.com/users/EricJi150/orgs",
"repos_url": "https://api.github.com/users/EricJi150/repos",
"events_url": "https://api.github.com/users/EricJi150/events{/privacy}",
"received_events_url": "https://api.github.com/users/EricJi150/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | [] | 2025-06-19T23:32:47 | 2025-08-12T08:03:14 | 2025-08-12T08:03:14 | NONE | null | null | null | null | I am working on exporting Llava into ONNX format. I came across this previous issue: https://github.com/huggingface/transformers/issues/33637 which had a notebook that outlined how to export the model in three separate parts. I noticed there wasn't any actual code showing how the decoder was exported, unlike the other two components. Does anyone know how they were able to export the decoder in the original notebook?
Notebook: https://colab.research.google.com/drive/1IhC8YOV68cze0XWGfuqSclnVTt_FskUd?usp=sharing | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38924/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38924/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38923 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38923/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38923/comments | https://api.github.com/repos/huggingface/transformers/issues/38923/events | https://github.com/huggingface/transformers/pull/38923 | 3,161,393,332 | PR_kwDOCUB6oc6bUMuf | 38,923 | Remove deprecated max_size support from YOLOS image processor | {
"login": "kartickkt",
"id": 155653940,
"node_id": "U_kgDOCUcXNA",
"avatar_url": "https://avatars.githubusercontent.com/u/155653940?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kartickkt",
"html_url": "https://github.com/kartickkt",
"followers_url": "https://api.github.com/users/kartickkt/followers",
"following_url": "https://api.github.com/users/kartickkt/following{/other_user}",
"gists_url": "https://api.github.com/users/kartickkt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kartickkt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kartickkt/subscriptions",
"organizations_url": "https://api.github.com/users/kartickkt/orgs",
"repos_url": "https://api.github.com/users/kartickkt/repos",
"events_url": "https://api.github.com/users/kartickkt/events{/privacy}",
"received_events_url": "https://api.github.com/users/kartickkt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-19T21:50:02 | 2025-06-20T13:39:10 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38923",
"html_url": "https://github.com/huggingface/transformers/pull/38923",
"diff_url": "https://github.com/huggingface/transformers/pull/38923.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38923.patch",
"merged_at": null
} | - Updated `test_image_processor_from_dict_with_kwargs` to remove the deprecated `max_size` argument.
- Uses only `{"shortest_edge": ...}` in `size`, consistent with current API.
- Ensures compatibility with the latest image processor spec.
## Before submitting
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Did you write any new necessary tests?
All tests passed locally using `pytest tests/models/yolos/test_image_processing_yolos.py`. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38923/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38923/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38922 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38922/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38922/comments | https://api.github.com/repos/huggingface/transformers/issues/38922/events | https://github.com/huggingface/transformers/pull/38922 | 3,161,084,403 | PR_kwDOCUB6oc6bTK7N | 38,922 | Remove `ALL_LAYERNORM_LAYERS` | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T18:33:06 | 2025-06-20T10:06:50 | 2025-06-20T10:06:48 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38922",
"html_url": "https://github.com/huggingface/transformers/pull/38922",
"diff_url": "https://github.com/huggingface/transformers/pull/38922.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38922.patch",
"merged_at": "2025-06-20T10:06:48"
} | # What does this PR do?
As per the title and as discussed offline. The list is very poorly maintained, and it's better and easier to rely on pattern matching. I personally checked that everywhere it was removed, the new patterns still match the layers. I also took care not to make the new patterns too general, i.e. `norm` is matched only if it's a full layer name (so that we don't match e.g. `normal` or `normalization` or anything else containing `norm`).
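The "full layer name" matching described here can be illustrated with a small regex sketch (an assumption about the pattern shape for illustration, not the exact code in this PR):

```python
import re

# Sketch: match "norm" only as a complete dot-separated component of the
# parameter name, so names that merely contain "norm" (e.g. "normalization"
# or "normal") are not matched.
FULL_NORM = re.compile(r"(?:^|\.)norm(?:\.|$)")

def is_layernorm_param(name: str) -> bool:
    return FULL_NORM.search(name) is not None

print(is_layernorm_param("model.layers.0.norm.weight"))  # True
print(is_layernorm_param("model.normalization.weight"))  # False
```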
cc @SunMarc @ArthurZucker | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38922/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38922/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38921 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38921/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38921/comments | https://api.github.com/repos/huggingface/transformers/issues/38921/events | https://github.com/huggingface/transformers/pull/38921 | 3,161,083,236 | PR_kwDOCUB6oc6bTKr6 | 38,921 | Add T5 summarization example to examples/t5 | {
"login": "robbiefrankie",
"id": 164729958,
"node_id": "U_kgDOCdGUZg",
"avatar_url": "https://avatars.githubusercontent.com/u/164729958?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robbiefrankie",
"html_url": "https://github.com/robbiefrankie",
"followers_url": "https://api.github.com/users/robbiefrankie/followers",
"following_url": "https://api.github.com/users/robbiefrankie/following{/other_user}",
"gists_url": "https://api.github.com/users/robbiefrankie/gists{/gist_id}",
"starred_url": "https://api.github.com/users/robbiefrankie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robbiefrankie/subscriptions",
"organizations_url": "https://api.github.com/users/robbiefrankie/orgs",
"repos_url": "https://api.github.com/users/robbiefrankie/repos",
"events_url": "https://api.github.com/users/robbiefrankie/events{/privacy}",
"received_events_url": "https://api.github.com/users/robbiefrankie/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T18:32:18 | 2025-06-19T18:33:49 | 2025-06-19T18:33:48 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38921",
"html_url": "https://github.com/huggingface/transformers/pull/38921",
"diff_url": "https://github.com/huggingface/transformers/pull/38921.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38921.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "robbiefrankie",
"id": 164729958,
"node_id": "U_kgDOCdGUZg",
"avatar_url": "https://avatars.githubusercontent.com/u/164729958?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robbiefrankie",
"html_url": "https://github.com/robbiefrankie",
"followers_url": "https://api.github.com/users/robbiefrankie/followers",
"following_url": "https://api.github.com/users/robbiefrankie/following{/other_user}",
"gists_url": "https://api.github.com/users/robbiefrankie/gists{/gist_id}",
"starred_url": "https://api.github.com/users/robbiefrankie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robbiefrankie/subscriptions",
"organizations_url": "https://api.github.com/users/robbiefrankie/orgs",
"repos_url": "https://api.github.com/users/robbiefrankie/repos",
"events_url": "https://api.github.com/users/robbiefrankie/events{/privacy}",
"received_events_url": "https://api.github.com/users/robbiefrankie/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38921/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38921/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38920 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38920/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38920/comments | https://api.github.com/repos/huggingface/transformers/issues/38920/events | https://github.com/huggingface/transformers/pull/38920 | 3,161,028,432 | PR_kwDOCUB6oc6bS_JU | 38,920 | Add T5 summarization example to examples/t5 | {
"login": "robbiefrankie",
"id": 164729958,
"node_id": "U_kgDOCdGUZg",
"avatar_url": "https://avatars.githubusercontent.com/u/164729958?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robbiefrankie",
"html_url": "https://github.com/robbiefrankie",
"followers_url": "https://api.github.com/users/robbiefrankie/followers",
"following_url": "https://api.github.com/users/robbiefrankie/following{/other_user}",
"gists_url": "https://api.github.com/users/robbiefrankie/gists{/gist_id}",
"starred_url": "https://api.github.com/users/robbiefrankie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robbiefrankie/subscriptions",
"organizations_url": "https://api.github.com/users/robbiefrankie/orgs",
"repos_url": "https://api.github.com/users/robbiefrankie/repos",
"events_url": "https://api.github.com/users/robbiefrankie/events{/privacy}",
"received_events_url": "https://api.github.com/users/robbiefrankie/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T18:00:26 | 2025-06-19T18:00:44 | 2025-06-19T18:00:43 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38920",
"html_url": "https://github.com/huggingface/transformers/pull/38920",
"diff_url": "https://github.com/huggingface/transformers/pull/38920.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38920.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "robbiefrankie",
"id": 164729958,
"node_id": "U_kgDOCdGUZg",
"avatar_url": "https://avatars.githubusercontent.com/u/164729958?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robbiefrankie",
"html_url": "https://github.com/robbiefrankie",
"followers_url": "https://api.github.com/users/robbiefrankie/followers",
"following_url": "https://api.github.com/users/robbiefrankie/following{/other_user}",
"gists_url": "https://api.github.com/users/robbiefrankie/gists{/gist_id}",
"starred_url": "https://api.github.com/users/robbiefrankie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robbiefrankie/subscriptions",
"organizations_url": "https://api.github.com/users/robbiefrankie/orgs",
"repos_url": "https://api.github.com/users/robbiefrankie/repos",
"events_url": "https://api.github.com/users/robbiefrankie/events{/privacy}",
"received_events_url": "https://api.github.com/users/robbiefrankie/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38920/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38920/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38919 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38919/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38919/comments | https://api.github.com/repos/huggingface/transformers/issues/38919/events | https://github.com/huggingface/transformers/pull/38919 | 3,160,958,879 | PR_kwDOCUB6oc6bSwZS | 38,919 | Remove deprecated classes in modeling_utils.py | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T17:17:00 | 2025-08-19T21:51:22 | 2025-06-19T17:25:20 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38919",
"html_url": "https://github.com/huggingface/transformers/pull/38919",
"diff_url": "https://github.com/huggingface/transformers/pull/38919.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38919.patch",
"merged_at": "2025-06-19T17:25:20"
} | # What does this PR do?
As per the title
cc @qubvel! Got too annoyed to see them, it was time! | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38919/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38919/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38918 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38918/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38918/comments | https://api.github.com/repos/huggingface/transformers/issues/38918/events | https://github.com/huggingface/transformers/issues/38918 | 3,160,958,197 | I_kwDOCUB6oc68aGT1 | 38,918 | Lack of IDE-Specific Authentication Instructions in Hugging Face "Quickstart" Documentation | {
"login": "marcndo",
"id": 178362075,
"node_id": "U_kgDOCqGW2w",
"avatar_url": "https://avatars.githubusercontent.com/u/178362075?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/marcndo",
"html_url": "https://github.com/marcndo",
"followers_url": "https://api.github.com/users/marcndo/followers",
"following_url": "https://api.github.com/users/marcndo/following{/other_user}",
"gists_url": "https://api.github.com/users/marcndo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/marcndo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/marcndo/subscriptions",
"organizations_url": "https://api.github.com/users/marcndo/orgs",
"repos_url": "https://api.github.com/users/marcndo/repos",
"events_url": "https://api.github.com/users/marcndo/events{/privacy}",
"received_events_url": "https://api.github.com/users/marcndo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T17:16:32 | 2025-06-24T18:48:17 | 2025-06-24T18:48:17 | CONTRIBUTOR | null | null | null | null | Explanation:
I’m currently exploring the Transformers library and want to understand its architecture in order to make meaningful contributions. I started with the Quickstart page, particularly the setup section, which provides instructions for getting started with the Hugging Face Hub.
However, I noticed that the documentation appears to be primarily tailored for users working in Jupyter notebooks. The instructions for authentication (using `notebook_login()`) seem to assume that the user is running code within a notebook environment. As someone who is working in PyCharm (and possibly others working in VS Code or other IDEs), I found that there is no clear guidance for authenticating via these IDEs.
It would be helpful to explicitly mention how users working in an IDE like PyCharm or VS Code should authenticate. Specifically, using `huggingface-cli` for authentication in a non-notebook environment could be a good solution. Providing a simple, clear guide on how to authenticate via the CLI or within the IDE would greatly improve the documentation.
Suggestion:
I recommend updating the documentation to include a section specifically addressing authentication when working in IDEs like PyCharm or VS Code.
Please let me know if this suggestion makes sense or if you need any further clarification before I proceed with the update.
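For reference, a minimal sketch of the CLI-based flow this suggestion has in mind (the `huggingface-cli login` command ships with `huggingface_hub`; the token value below is a placeholder, not a real credential):

```shell
# Log in interactively from any terminal or IDE-integrated shell
# (prompts for an access token generated at hf.co/settings/tokens):
huggingface-cli login

# Or authenticate non-interactively, e.g. in a PyCharm/VS Code run
# configuration or CI job, by exporting a token before running the script:
export HF_TOKEN=hf_your_token_here
python your_script.py
```

Either route avoids `notebook_login()` entirely, so it works the same in any non-notebook environment.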
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38918/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38918/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38917 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38917/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38917/comments | https://api.github.com/repos/huggingface/transformers/issues/38917/events | https://github.com/huggingface/transformers/issues/38917 | 3,160,935,515 | I_kwDOCUB6oc68aAxb | 38,917 | Add past_key_values as inputs to TorchExportableModuleWithStaticCache forward | {
"login": "justinchuby",
"id": 11205048,
"node_id": "MDQ6VXNlcjExMjA1MDQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/11205048?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/justinchuby",
"html_url": "https://github.com/justinchuby",
"followers_url": "https://api.github.com/users/justinchuby/followers",
"following_url": "https://api.github.com/users/justinchuby/following{/other_user}",
"gists_url": "https://api.github.com/users/justinchuby/gists{/gist_id}",
"starred_url": "https://api.github.com/users/justinchuby/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/justinchuby/subscriptions",
"organizations_url": "https://api.github.com/users/justinchuby/orgs",
"repos_url": "https://api.github.com/users/justinchuby/repos",
"events_url": "https://api.github.com/users/justinchuby/events{/privacy}",
"received_events_url": "https://api.github.com/users/justinchuby/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-06-19T17:05:24 | 2025-06-20T09:01:16 | null | CONTRIBUTOR | null | null | null | null | ### Feature request
With https://github.com/huggingface/transformers/pull/38879, exporting with static cache as input should be supported in https://github.com/huggingface/transformers/blob/797860c68cfd8bd3ad38ce312540445073f76b30/src/transformers/integrations/executorch.py#L288. This helps the exported program preserve proper input order for downstream conversion (e.g. torch.onnx.export).
Related
- https://github.com/pytorch/pytorch/issues/155862
cc @xadupre @yuanyao-nv @guangy10
### Motivation
Better preservation of exported program signatures
### Your contribution
Yes | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38917/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38917/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/38916 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38916/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38916/comments | https://api.github.com/repos/huggingface/transformers/issues/38916/events | https://github.com/huggingface/transformers/pull/38916 | 3,160,665,888 | PR_kwDOCUB6oc6bRxF8 | 38,916 | Fix custom generate from local directory | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T15:11:21 | 2025-06-20T16:36:58 | 2025-06-20T16:36:58 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38916",
"html_url": "https://github.com/huggingface/transformers/pull/38916",
"diff_url": "https://github.com/huggingface/transformers/pull/38916.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38916.patch",
"merged_at": "2025-06-20T16:36:58"
} | Passing a local folder to `custom_generate` via `model.generate(**inputs, custom_generate="custom_generate_example", ...)` fails because
1. `shutil.copy` in `src/transformers/dynamic_module_utils.py` silently fails if the parent directory structure doesn't exist.
2. The relative imports do not get correctly copied to the fresh submodule in .cache.
So, this PR does:
1. Create parent dirs before copying files (custom_generate dir)
2. Correctly copy relative import files to the submodule folder.
3. Update docs. | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38916/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38916/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38915 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38915/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38915/comments | https://api.github.com/repos/huggingface/transformers/issues/38915/events | https://github.com/huggingface/transformers/pull/38915 | 3,160,607,212 | PR_kwDOCUB6oc6bRkTg | 38,915 | Small fixes for utils/check_docstrings.py | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T14:49:31 | 2025-07-11T14:36:11 | 2025-07-11T14:36:11 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38915",
"html_url": "https://github.com/huggingface/transformers/pull/38915",
"diff_url": "https://github.com/huggingface/transformers/pull/38915.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38915.patch",
"merged_at": "2025-07-11T14:36:10"
} | 1. fix ast deprecations for python 3.14 by replacing:
1. `node.n` by `node.value` ([deprecation notice](https://github.com/python/cpython/blob/af48f39a440a5c23d89508c038da5d560e74fe48/Lib/ast.py#L530))
2. `ast.Num` by `ast.Constant` ([deprecation notice](https://github.com/python/cpython/blob/af48f39a440a5c23d89508c038da5d560e74fe48/Lib/ast.py#L1822-L1830))
2. When `--fix_and_overwrite` is passed but fixing fails, exit with a detailed exception instead of failing silently. | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38915/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38915/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38914 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38914/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38914/comments | https://api.github.com/repos/huggingface/transformers/issues/38914/events | https://github.com/huggingface/transformers/issues/38914 | 3,160,267,954 | I_kwDOCUB6oc68Xdyy | 38,914 | Error: StaticCache.__init__() got an unexpected keyword argument 'batch_size' | {
"login": "rsxdalv",
"id": 6757283,
"node_id": "MDQ6VXNlcjY3NTcyODM=",
"avatar_url": "https://avatars.githubusercontent.com/u/6757283?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rsxdalv",
"html_url": "https://github.com/rsxdalv",
"followers_url": "https://api.github.com/users/rsxdalv/followers",
"following_url": "https://api.github.com/users/rsxdalv/following{/other_user}",
"gists_url": "https://api.github.com/users/rsxdalv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rsxdalv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rsxdalv/subscriptions",
"organizations_url": "https://api.github.com/users/rsxdalv/orgs",
"repos_url": "https://api.github.com/users/rsxdalv/repos",
"events_url": "https://api.github.com/users/rsxdalv/events{/privacy}",
"received_events_url": "https://api.github.com/users/rsxdalv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T13:01:46 | 2025-08-23T13:56:15 | 2025-08-22T08:03:20 | NONE | null | null | null | null | Upgrading transformers breaks the 'new' name and instead reverts back to the 'deprecated' name...
```
Attempting uninstall: transformers
Found existing installation: transformers 4.46.1
Uninstalling transformers-4.46.1:
Successfully uninstalled transformers-4.46.1
Successfully installed tokenizers-0.21.1 transformers-4.52.4
```
Code went from:
```
def __init__(
self,
config: PretrainedConfig,
batch_size: int = None,
max_cache_len: int = None,
device: torch.device = None,
dtype: torch.dtype = torch.float32,
max_batch_size: Optional[int] = None,
layer_device_map: Optional[Dict[int, Union[str, torch.device, int]]] = None,
) -> None:
super().__init__()
if max_batch_size is not None:
logger.warning_once(
f"The 'max_batch_size' argument of {self.__class__.__name__} is deprecated and will be removed in "
"v4.46. Use the more precisely named 'batch_size' argument instead."
)
self.batch_size = batch_size or max_batch_size
self.max_cache_len = config.max_position_embeddings if max_cache_len is None else max_cache_len
```
to
```
def __init__(
self,
config: PretrainedConfig,
max_batch_size: int,
max_cache_len: Optional[int] = None,
device: Union[torch.device, str, None] = None,
dtype: torch.dtype = torch.float32,
layer_device_map: Optional[Dict[int, Union[str, torch.device, int]]] = None,
) -> None:
super().__init__()
self.max_batch_size = max_batch_size
self.max_cache_len = config.max_position_embeddings if max_cache_len is None else max_cache_len
```
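For downstream code that must run on both sides of this change, one workaround is a version-agnostic shim that inspects the installed signature (a sketch; `FakeStaticCache` below is a hypothetical stand-in mirroring the post-upgrade signature, not the real class):

```python
import inspect

def cache_init_kwargs(cache_cls, batch_size, max_cache_len):
    # Pick whichever batch-size parameter name the installed version accepts:
    # newer releases expose only 'max_batch_size', older ones had 'batch_size'.
    params = inspect.signature(cache_cls.__init__).parameters
    key = "max_batch_size" if "max_batch_size" in params else "batch_size"
    return {key: batch_size, "max_cache_len": max_cache_len}

# Hypothetical stand-in mirroring the post-upgrade signature:
class FakeStaticCache:
    def __init__(self, config, max_batch_size, max_cache_len=None):
        self.max_batch_size = max_batch_size
        self.max_cache_len = max_cache_len

print(cache_init_kwargs(FakeStaticCache, batch_size=2, max_cache_len=128))
# {'max_batch_size': 2, 'max_cache_len': 128}
```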
Edit:
One more:
`Error: 'StaticCache' object has no attribute 'dtype'` | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38914/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38914/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38913 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38913/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38913/comments | https://api.github.com/repos/huggingface/transformers/issues/38913/events | https://github.com/huggingface/transformers/pull/38913 | 3,160,239,014 | PR_kwDOCUB6oc6bQTxC | 38,913 | Apply GradientCheckpointingLayer to the whole repo | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T12:51:32 | 2025-06-23T14:49:43 | 2025-06-23T12:24:48 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38913",
"html_url": "https://github.com/huggingface/transformers/pull/38913",
"diff_url": "https://github.com/huggingface/transformers/pull/38913.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38913.patch",
"merged_at": "2025-06-23T12:24:48"
} | # What does this PR do?
Apply `GradientCheckpointingLayer` to the remaining models in the repository.
Most of the PR consists of similar changes applied to each model:
1) Add import for `GradientCheckpointingLayer`
2) Inherit `*Layer` modules from `GradientCheckpointingLayer`
3) Remove `if/else` path for gradient checkpointing, keeping the `else` path only
3a) Some changes were required to make sure all tensors with gradients were passed as positional arguments.
Additionally, `GradientCheckpointingLayer` was modified slightly. I added handling for `use_cache` and `past_key_values` within the layer to disable them in case gradient checkpointing is enabled.
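The dispatch described above can be sketched without a torch dependency (a minimal illustration under stated assumptions: `fake_checkpoint` stands in for `torch.utils.checkpoint.checkpoint`, and the class below is an illustrative mock, not the actual implementation):

```python
def fake_checkpoint(fn, *args, **kwargs):
    # Stand-in for torch.utils.checkpoint.checkpoint: a real checkpoint
    # drops intermediate activations here and recomputes them during the
    # backward pass; the sketch just forwards the call.
    return fn(*args, **kwargs)

class GradientCheckpointingLayerSketch:
    gradient_checkpointing = False
    training = True

    def forward(self, hidden_states, use_cache=True, past_key_values=None):
        # Echo the effective arguments so the dispatch is observable.
        return {"hidden_states": hidden_states, "use_cache": use_cache,
                "past_key_values": past_key_values}

    def __call__(self, *args, **kwargs):
        if self.gradient_checkpointing and self.training:
            # Caching is incompatible with activation recomputation, so the
            # layer disables it before routing through the checkpoint.
            kwargs["use_cache"] = False
            kwargs["past_key_values"] = None
            return fake_checkpoint(self.forward, *args, **kwargs)
        return self.forward(*args, **kwargs)

layer = GradientCheckpointingLayerSketch()
layer.gradient_checkpointing = True
out = layer("h", use_cache=True, past_key_values="cache")
print(out["use_cache"], out["past_key_values"])  # False None
```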
We still have to keep some redundant code, though:
#### Case 1.
```py
if self.gradient_checkpointing and self.training and use_cache:
logger.warning_once(
"`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`."
)
use_cache = False
```
because, later, most of the models rely on the `use_cache` parameter as follows
```py
if use_cache:
next_decoder_cache = layer_outputs[2 if output_attentions else 1]
```
If this is handled only by `GradientCheckpointingLayer` and not also modified in the outer module, it leads to an `IndexError`.
#### Case 2.
In some cases layer parameters order doesn't allow to handle `past_key_values` as kwargs, e.g. GPT2
```py
outputs = block(
hidden_states,
past_key_values if not (self.gradient_checkpointing and self.training) else None,
cache_position,
causal_mask,
head_mask[i],
encoder_hidden_states, # as a positional argument for gradient checkpointing
encoder_attention_mask=encoder_attention_mask,
use_cache=use_cache,
output_attentions=output_attentions,
**kwargs,
)
```
We have to pass all params up to `encoder_hidden_states` as positional args (tensors that require grads have to be passed that way), so `past_key_values` is also passed as a positional argument and resolved manually.
Alternatively, we could refactor the layers' parameter order, but that would be a breaking change. Not that many models are affected, and mostly older ones.
### Not supported models
Also, there are a couple of exceptions where `GradientCheckpointingLayer` does not work. I tried to fix them, but didn't go too far and just kept them in their original state:
- zamba / zamba2
- mllama
cc @ArthurZucker @Cyrilvallez | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38913/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38913/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38912 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38912/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38912/comments | https://api.github.com/repos/huggingface/transformers/issues/38912/events | https://github.com/huggingface/transformers/pull/38912 | 3,160,003,598 | PR_kwDOCUB6oc6bPfKB | 38,912 | Modernbert fixes | {
"login": "remi-or",
"id": 83456801,
"node_id": "MDQ6VXNlcjgzNDU2ODAx",
"avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/remi-or",
"html_url": "https://github.com/remi-or",
"followers_url": "https://api.github.com/users/remi-or/followers",
"following_url": "https://api.github.com/users/remi-or/following{/other_user}",
"gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}",
"starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/remi-or/subscriptions",
"organizations_url": "https://api.github.com/users/remi-or/orgs",
"repos_url": "https://api.github.com/users/remi-or/repos",
"events_url": "https://api.github.com/users/remi-or/events{/privacy}",
"received_events_url": "https://api.github.com/users/remi-or/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T11:41:54 | 2025-06-20T09:22:33 | 2025-06-20T09:22:33 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38912",
"html_url": "https://github.com/huggingface/transformers/pull/38912",
"diff_url": "https://github.com/huggingface/transformers/pull/38912.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38912.patch",
"merged_at": "2025-06-20T09:22:33"
} | This PR fixes two things that led to several test failures in `modernbert`:
- there was a deprecated argument, `pos_idx_in_fp32`, passed to flash_attention's `RotaryEmbedding` (class is [here](https://github.com/Dao-AILab/flash-attention/blob/32c491f8c592650c38374545738f589cc3b73c5f/flash_attn/layers/rotary.py#L331)), which caused the error `TypeError: RotaryEmbedding.__init__() got an unexpected keyword argument 'pos_idx_in_fp32'`. This did not happen on A100 because that code path was not used.
- the test `test_sdpa_can_dispatch_on_flash` was not skipped, although `modernbert` always sets the attention mask and thus the test fails. It also failed on A100. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38912/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38912/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38911 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38911/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38911/comments | https://api.github.com/repos/huggingface/transformers/issues/38911/events | https://github.com/huggingface/transformers/pull/38911 | 3,159,992,741 | PR_kwDOCUB6oc6bPcxs | 38,911 | feat: add flexible Liger Kernel configuration to TrainingArguments | {
"login": "hamza-hcompany",
"id": 208606724,
"node_id": "U_kgDODG8WBA",
"avatar_url": "https://avatars.githubusercontent.com/u/208606724?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hamza-hcompany",
"html_url": "https://github.com/hamza-hcompany",
"followers_url": "https://api.github.com/users/hamza-hcompany/followers",
"following_url": "https://api.github.com/users/hamza-hcompany/following{/other_user}",
"gists_url": "https://api.github.com/users/hamza-hcompany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hamza-hcompany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hamza-hcompany/subscriptions",
"organizations_url": "https://api.github.com/users/hamza-hcompany/orgs",
"repos_url": "https://api.github.com/users/hamza-hcompany/repos",
"events_url": "https://api.github.com/users/hamza-hcompany/events{/privacy}",
"received_events_url": "https://api.github.com/users/hamza-hcompany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T11:38:08 | 2025-06-19T15:54:48 | 2025-06-19T15:54:08 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38911",
"html_url": "https://github.com/huggingface/transformers/pull/38911",
"diff_url": "https://github.com/huggingface/transformers/pull/38911.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38911.patch",
"merged_at": "2025-06-19T15:54:08"
} | # What does this PR do?
Add support for granular Liger Kernel configuration through a new `liger_kernel_config` parameter in TrainingArguments. This allows users to selectively enable/disable specific kernels (rope, swiglu, cross_entropy, etc.) instead of the current approach, which relies on the default configuration.
Features:
- Add `liger_kernel_config` dict parameter to TrainingArguments
- Support selective kernel application for all supported models
- Maintain full backward compatibility with existing `use_liger_kernel` flag
Example usage:
```python
TrainingArguments(
use_liger_kernel=True,
liger_kernel_config={
"rope": True,
"swiglu": True,
"cross_entropy": False,
"fused_linear_cross_entropy": True
}
)
```
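For illustration, the merge of a user-provided `liger_kernel_config` with defaults could be sketched as follows. The default values below are assumptions for the sketch, not the library's actual defaults, and `resolve_liger_kwargs` is a hypothetical helper name:

```python
# Hedged sketch: keys the user leaves out of liger_kernel_config fall back to
# assumed defaults; keys the user sets always win.
def resolve_liger_kwargs(liger_kernel_config=None):
    defaults = {
        "rope": True,
        "swiglu": True,
        "cross_entropy": False,
        "fused_linear_cross_entropy": True,
    }
    return {**defaults, **(liger_kernel_config or {})}

print(resolve_liger_kwargs({"cross_entropy": True}))
```

With this shape, the existing `use_liger_kernel=True` behavior is preserved when no config is given, which is how the backward compatibility claim above would hold.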
Closes #38905
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38911/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38911/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38910 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38910/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38910/comments | https://api.github.com/repos/huggingface/transformers/issues/38910/events | https://github.com/huggingface/transformers/issues/38910 | 3,159,964,028 | I_kwDOCUB6oc68WTl8 | 38,910 | `load_balancing_loss_func` doesn't support 4D attention mask | {
"login": "Vermouth7",
"id": 66453086,
"node_id": "MDQ6VXNlcjY2NDUzMDg2",
"avatar_url": "https://avatars.githubusercontent.com/u/66453086?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Vermouth7",
"html_url": "https://github.com/Vermouth7",
"followers_url": "https://api.github.com/users/Vermouth7/followers",
"following_url": "https://api.github.com/users/Vermouth7/following{/other_user}",
"gists_url": "https://api.github.com/users/Vermouth7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Vermouth7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Vermouth7/subscriptions",
"organizations_url": "https://api.github.com/users/Vermouth7/orgs",
"repos_url": "https://api.github.com/users/Vermouth7/repos",
"events_url": "https://api.github.com/users/Vermouth7/events{/privacy}",
"received_events_url": "https://api.github.com/users/Vermouth7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-19T11:26:54 | 2025-08-06T01:49:36 | 2025-07-30T08:02:55 | NONE | null | null | null | null | ### System Info
- accelerate 1.7.0
- datasets 3.6.0
- deepspeed 0.16.9
- huggingface-hub 0.32.4
- llamafactory 0.9.3.dev0
- numpy 1.26.4
- torch 2.6.0+cu124
- transformers 4.51.0
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. Install `llamafactory 0.9.3`
2. Enable `output_router_logits` argument
3. Run full sft script:
```
model_name_or_path: /data/Qwen3-235B-A22B
trust_remote_code: true
stage: sft
do_train: true
finetuning_type: full
deepspeed: examples/deepspeed/ds_z3_config.json
dataset: sharegpt4_zh
template: qwen3
cutoff_len: 4096
overwrite_cache: true
preprocessing_num_workers: 16
dataloader_num_workers: 4
output_dir: /data/Qwen3-235B-A22B-sft
logging_steps: 1
save_steps: 1000
plot_loss: true
overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
per_device_train_batch_size: 1
gradient_accumulation_steps: 4
learning_rate: 1.0e-5
num_train_epochs: 3.0
lr_scheduler_type: constant
warmup_ratio: 0.0
bf16: true
ddp_timeout: 180000000
resume_from_checkpoint: null
enable_thinking: true
max_steps: 2000
weight_decay: 0.0
seed: 42
packing: true
neat_packing: true
preprocessing_batch_size: 100000
moe_aux_loss_coef: 0.001
adam_beta1: 0.9
adam_beta2: 0.999
```
### Expected behavior
When I use `llamafactory` to fully finetune a Qwen3 MoE model and enable calculation of `aux_loss`, it causes a dimension error when unpacking the attention mask. The detailed traceback is shown below.
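For reference, a hedged sketch of one possible caller-side workaround (an assumption for illustration, not a fix from the library): collapse a 4D additive mask of shape `[batch, 1, query_len, key_len]` back to a 2D `[batch, seq_len]` padding mask before `load_balancing_loss_func` unpacks `batch_size, sequence_length`.

```python
import torch

def to_2d_padding_mask(attention_mask):
    # Assumes the additive-mask convention: 0 where attention is allowed,
    # a large negative value where it is masked.
    if attention_mask is not None and attention_mask.dim() == 4:
        # The last query row of a causal mask covers every key position.
        attention_mask = (attention_mask[:, 0, -1, :] == 0).long()
    return attention_mask

mask_4d = torch.zeros(2, 1, 5, 5)   # fully visible additive mask
mask_2d = to_2d_padding_mask(mask_4d)
print(mask_2d.shape)                # torch.Size([2, 5])
```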
- Traceback
```
[rank2]: File "/home/temp/LLaMA-Factory/src/llamafactory/launcher.py", line 23, in <module>
[rank2]: launch()
[rank2]: File "/home/temp/LLaMA-Factory/src/llamafactory/launcher.py", line 19, in launch
[rank2]: run_exp()
[rank2]: File "/home/temp/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank2]: _training_function(config={"args": args, "callbacks": callbacks})
[rank2]: File "/home/temp/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank2]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank2]: File "/home/temp/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 105, in run_sft
[rank2]: train_result = trainer.train(resume_from_checkpoint=training_args.resume_from_checkpoint)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/transformers/trainer.py", line 2240, in train
[rank2]: return inner_training_loop(
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/transformers/trainer.py", line 2555, in _inner_training_loop
[rank2]: tr_loss_step = self.training_step(model, inputs, num_items_in_batch)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/transformers/trainer.py", line 3745, in training_step
[rank2]: loss = self.compute_loss(model, inputs, num_items_in_batch=num_items_in_batch)
[rank2]: File "/home/temp/LLaMA-Factory/src/llamafactory/train/sft/trainer.py", line 105, in compute_loss
[rank2]: return super().compute_loss(model, inputs, *args, **kwargs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/transformers/trainer.py", line 3810, in compute_loss
[rank2]: outputs = model(**inputs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
[rank2]: return self._call_impl(*args, **kwargs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
[rank2]: return forward_call(*args, **kwargs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/deepspeed/utils/nvtx.py", line 20, in wrapped_fn
[rank2]: ret_val = func(*args, **kwargs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/deepspeed/runtime/engine.py", line 2054, in forward
[rank2]: loss = self.module(*inputs, **kwargs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
[rank2]: return self._call_impl(*args, **kwargs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1845, in _call_impl
[rank2]: return inner()
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1793, in inner
[rank2]: result = forward_call(*args, **kwargs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/transformers/utils/generic.py", line 969, in wrapper
[rank2]: output = func(self, *args, **kwargs)
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/transformers/models/qwen3_moe/modeling_qwen3_moe.py", line 1007, in forward
[rank2]: aux_loss = load_balancing_loss_func(
[rank2]: File "/root/miniconda3/envs/temp/lib/python3.10/site-packages/transformers/models/qwen3_moe/modeling_qwen3_moe.py", line 812, in load_balancing_loss_func
[rank2]: batch_size,sequence_length = attention_mask.shape
[rank2]: ValueError: too many values to unpack (expected 2)
``` | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38910/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38910/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38909 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38909/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38909/comments | https://api.github.com/repos/huggingface/transformers/issues/38909/events | https://github.com/huggingface/transformers/pull/38909 | 3,159,914,662 | PR_kwDOCUB6oc6bPLu4 | 38,909 | Add kyutai stt | {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T11:07:14 | 2025-06-24T16:01:16 | 2025-06-24T16:01:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38909",
"html_url": "https://github.com/huggingface/transformers/pull/38909",
"diff_url": "https://github.com/huggingface/transformers/pull/38909.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38909.patch",
"merged_at": "2025-06-24T16:01:16"
} | # What does this PR do?
Adds Kyutai's new STT model 🚀 | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38909/reactions",
"total_count": 7,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 7,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38909/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38908 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38908/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38908/comments | https://api.github.com/repos/huggingface/transformers/issues/38908/events | https://github.com/huggingface/transformers/pull/38908 | 3,159,887,146 | PR_kwDOCUB6oc6bPFoU | 38,908 | Add support to use config dtype in HybridChunkedCache | {
"login": "vivekkhandelwal1",
"id": 68822896,
"node_id": "MDQ6VXNlcjY4ODIyODk2",
"avatar_url": "https://avatars.githubusercontent.com/u/68822896?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vivekkhandelwal1",
"html_url": "https://github.com/vivekkhandelwal1",
"followers_url": "https://api.github.com/users/vivekkhandelwal1/followers",
"following_url": "https://api.github.com/users/vivekkhandelwal1/following{/other_user}",
"gists_url": "https://api.github.com/users/vivekkhandelwal1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vivekkhandelwal1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vivekkhandelwal1/subscriptions",
"organizations_url": "https://api.github.com/users/vivekkhandelwal1/orgs",
"repos_url": "https://api.github.com/users/vivekkhandelwal1/repos",
"events_url": "https://api.github.com/users/vivekkhandelwal1/events{/privacy}",
"received_events_url": "https://api.github.com/users/vivekkhandelwal1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-19T10:56:43 | 2025-06-23T15:25:32 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38908",
"html_url": "https://github.com/huggingface/transformers/pull/38908",
"diff_url": "https://github.com/huggingface/transformers/pull/38908.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38908.patch",
"merged_at": null
} | # What does this PR do?
This PR fixes an issue with the `HybridChunkedCache` initialization: it does not take into account the dtype present in the model config. See: https://github.com/huggingface/transformers/blob/b949747b54b6d81c5e4ab93c4d98ebc7a5901b31/src/transformers/cache_utils.py#L1806 | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38908/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38908/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38907 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38907/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38907/comments | https://api.github.com/repos/huggingface/transformers/issues/38907/events | https://github.com/huggingface/transformers/pull/38907 | 3,159,649,165 | PR_kwDOCUB6oc6bORzE | 38,907 | Skip sdpa tests if submodule does not support sdpa | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T09:32:29 | 2025-06-19T13:11:02 | 2025-06-19T13:11:02 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38907",
"html_url": "https://github.com/huggingface/transformers/pull/38907",
"diff_url": "https://github.com/huggingface/transformers/pull/38907.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38907.patch",
"merged_at": "2025-06-19T13:11:01"
} | The tests `test_sdpa_can_dispatch_on_flash` and `test_sdpa_can_compile_dynamic` fail if a submodule of the model being tested does not support sdpa.
An example of this is `Blip2ModelTest` (and friends) because it has the submodule `Blip2QFormerModel` which does not support sdpa.
With this PR, these tests should now be skipped once a submodule with `_supports_sdpa = False` is detected.
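Conceptually, the skip condition can be sketched as below. The helper name and module classes are hypothetical; in practice the iterable would be a PyTorch model's `model.modules()`:

```python
# Walk every submodule and report whether any class explicitly opts out of
# SDPA; modules without the attribute are assumed to support it.
def any_submodule_lacks_sdpa(modules):
    return any(getattr(m, "_supports_sdpa", True) is False for m in modules)

class SdpaModule:
    _supports_sdpa = True

class NoSdpaModule:
    _supports_sdpa = False

print(any_submodule_lacks_sdpa([SdpaModule(), NoSdpaModule()]))  # True
```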
This addresses several of the "hidden failing tests" described in #38820 | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38907/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38907/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38906 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38906/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38906/comments | https://api.github.com/repos/huggingface/transformers/issues/38906/events | https://github.com/huggingface/transformers/pull/38906 | 3,159,613,467 | PR_kwDOCUB6oc6bOJ7J | 38,906 | Add mistral common support | {
"login": "juliendenize",
"id": 40604584,
"node_id": "MDQ6VXNlcjQwNjA0NTg0",
"avatar_url": "https://avatars.githubusercontent.com/u/40604584?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/juliendenize",
"html_url": "https://github.com/juliendenize",
"followers_url": "https://api.github.com/users/juliendenize/followers",
"following_url": "https://api.github.com/users/juliendenize/following{/other_user}",
"gists_url": "https://api.github.com/users/juliendenize/gists{/gist_id}",
"starred_url": "https://api.github.com/users/juliendenize/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/juliendenize/subscriptions",
"organizations_url": "https://api.github.com/users/juliendenize/orgs",
"repos_url": "https://api.github.com/users/juliendenize/repos",
"events_url": "https://api.github.com/users/juliendenize/events{/privacy}",
"received_events_url": "https://api.github.com/users/juliendenize/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T09:23:16 | 2025-07-11T16:26:59 | 2025-07-11T16:26:58 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38906",
"html_url": "https://github.com/huggingface/transformers/pull/38906",
"diff_url": "https://github.com/huggingface/transformers/pull/38906.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38906.patch",
"merged_at": "2025-07-11T16:26:58"
} | # What does this PR do?
This PR aims to add support for mistral-common to ensure the best usage of Mistral models. Currently, Mistral tokenizers are converted to the HF format, which can cause discrepancies with official usage. We'd like to propose a soft dependency and a smooth integration into the HF ecosystem via a `MistralCommonTokenizer` that implements the usual methods of Transformers' pretrained tokenizers.
To best align with the expected behavior and support various Transformers features, a PR has been opened simultaneously in mistral-common: https://github.com/mistralai/mistral-common/pull/104.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38906/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38906/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38905 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38905/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38905/comments | https://api.github.com/repos/huggingface/transformers/issues/38905/events | https://github.com/huggingface/transformers/issues/38905 | 3,159,472,127 | I_kwDOCUB6oc68Ubf_ | 38,905 | Add flexible configuration for Liger Kernel in TrainingArguments | {
"login": "hamza-hcompany",
"id": 208606724,
"node_id": "U_kgDODG8WBA",
"avatar_url": "https://avatars.githubusercontent.com/u/208606724?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hamza-hcompany",
"html_url": "https://github.com/hamza-hcompany",
"followers_url": "https://api.github.com/users/hamza-hcompany/followers",
"following_url": "https://api.github.com/users/hamza-hcompany/following{/other_user}",
"gists_url": "https://api.github.com/users/hamza-hcompany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hamza-hcompany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hamza-hcompany/subscriptions",
"organizations_url": "https://api.github.com/users/hamza-hcompany/orgs",
"repos_url": "https://api.github.com/users/hamza-hcompany/repos",
"events_url": "https://api.github.com/users/hamza-hcompany/events{/privacy}",
"received_events_url": "https://api.github.com/users/hamza-hcompany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | closed | false | null | [] | null | [] | 2025-06-19T08:36:43 | 2025-06-19T15:54:09 | 2025-06-19T15:54:09 | CONTRIBUTOR | null | null | null | null | ### Feature request
The current Liger Kernel integration in HuggingFace Transformers only supports a boolean `use_liger_kernel` flag that applies all available kernels. The underlying `_apply_liger_kernel_to_instance` method supports granular configuration through `kwargs`, but this flexibility isn't exposed at the `Trainer` level.
Current implementation:
```python
# Only boolean flag available
training_args = TrainingArguments(
    use_liger_kernel=True,  # all-or-nothing; no granular selection
...
)
```
Proposed enhancement:
```python
# Add granular configuration support
training_args = TrainingArguments(
use_liger_kernel=True,
liger_kernel_config={
"rope": True,
"swiglu": True,
"cross_entropy": True,
"fused_linear_cross_entropy": False,
"rms_norm": False
},
...
)
```
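The pass-through itself could look roughly like the sketch below. Function and helper names here are assumptions for illustration, not the actual `Trainer` internals; the stub stands in for liger-kernel's real patching API and simply records what it receives. The boolean flag keeps its current behavior, and the optional dict is forwarded as `**kwargs`:

```python
# Hypothetical sketch of threading a granular config dict through to the
# patching helper; `_apply_liger_kernel_to_instance` is stubbed out here.
applied_kwargs = {}

def _apply_liger_kernel_to_instance(model, **kwargs):
    # Stub: the real helper patches individual kernels based on kwargs
    applied_kwargs.update(kwargs)
    return model

def maybe_apply_liger_kernel(model, use_liger_kernel, liger_kernel_config=None):
    """Forward the optional config as **kwargs; None keeps today's all-kernels default."""
    if not use_liger_kernel:
        return model
    return _apply_liger_kernel_to_instance(model, **(liger_kernel_config or {}))

model = maybe_apply_liger_kernel(
    "dummy-model",
    use_liger_kernel=True,
    liger_kernel_config={"rope": True, "rms_norm": False},
)
print(applied_kwargs)  # {'rope': True, 'rms_norm': False}
```

With `liger_kernel_config=None`, the helper receives no kwargs and behaves exactly like the current all-or-nothing flag.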
### Motivation
- Debugging: Ability to isolate specific kernels when troubleshooting
- Compatibility: Some kernels might not be compatible with certain setups
- API Consistency: `Trainer` should expose same flexibility as low-level patching APIs
### Your contribution
I am willing to submit a PR | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38905/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38905/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38904 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38904/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38904/comments | https://api.github.com/repos/huggingface/transformers/issues/38904/events | https://github.com/huggingface/transformers/pull/38904 | 3,159,436,637 | PR_kwDOCUB6oc6bNlRz | 38,904 | Fix `fsmt` tests | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T08:23:50 | 2025-06-19T08:56:36 | 2025-06-19T08:56:34 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38904",
"html_url": "https://github.com/huggingface/transformers/pull/38904",
"diff_url": "https://github.com/huggingface/transformers/pull/38904.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38904.patch",
"merged_at": "2025-06-19T08:56:34"
} | # What does this PR do?
I made comments in the `Files changed` tab, but in short, the safetensors files in the Hub repository are broken.
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38904/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38904/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38903 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38903/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38903/comments | https://api.github.com/repos/huggingface/transformers/issues/38903/events | https://github.com/huggingface/transformers/issues/38903 | 3,159,309,917 | I_kwDOCUB6oc68Tz5d | 38,903 | Incorrect comparison when generating `special_image_mask` in Gemma3Model | {
"login": "caicai0402",
"id": 71309688,
"node_id": "MDQ6VXNlcjcxMzA5Njg4",
"avatar_url": "https://avatars.githubusercontent.com/u/71309688?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/caicai0402",
"html_url": "https://github.com/caicai0402",
"followers_url": "https://api.github.com/users/caicai0402/followers",
"following_url": "https://api.github.com/users/caicai0402/following{/other_user}",
"gists_url": "https://api.github.com/users/caicai0402/gists{/gist_id}",
"starred_url": "https://api.github.com/users/caicai0402/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/caicai0402/subscriptions",
"organizations_url": "https://api.github.com/users/caicai0402/orgs",
"repos_url": "https://api.github.com/users/caicai0402/repos",
"events_url": "https://api.github.com/users/caicai0402/events{/privacy}",
"received_events_url": "https://api.github.com/users/caicai0402/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-19T07:37:19 | 2025-06-19T09:15:46 | 2025-06-19T09:15:46 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.4
- Platform: Linux-5.15.0-139-generic-x86_64-with-glibc2.31
- Python version: 3.11.11
- Huggingface_hub version: 0.32.3
- Safetensors version: 0.5.3
- Accelerate version: 1.4.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.7.0+cu126 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: No
- GPU type: NVIDIA RTX A6000
### Who can help?
@amyeroberts, @qubvel
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
from transformers import Gemma3Processor, Gemma3Model
import torch
model_id = "google/gemma-3-4b-it"
processor = Gemma3Processor.from_pretrained(model_id)
model = Gemma3Model.from_pretrained(model_id)
messages = [
{
"role": "user",
"content": [
{"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/p-blog/candy.JPG"},
{"type": "text", "text": "What animal is on the candy?"}
],
}
]
inputs = processor.apply_chat_template(
messages,
tokenize=True,
return_dict=True,
return_tensors="pt"
).to(model.device, dtype=torch.bfloat16)
input_embeds = model.get_input_embeddings()(inputs["input_ids"])
output = model(inputs_embeds=input_embeds, pixel_values=inputs["pixel_values"])
```
### Expected behavior
- ### Files
[transformers/src/transformers/models/gemma3/modeling_gemma3.py (line 1188)](https://github.com/huggingface/transformers/blob/v4.52.3/src/transformers/models/gemma3/modeling_gemma3.py#L1188)
[transformers/src/transformers/models/gemma3/modular_gemma3.py (line 866)](https://github.com/huggingface/transformers/blob/v4.52.3/src/transformers/models/gemma3/modular_gemma3.py#L866)
- ### Description
In `Gemma3Model.forward()`, when `pixel_values` is provided and `input_ids` is `None`, the model attempts to compute `special_image_mask` by comparing each `inputs_embeds` vector to the image token embedding.
**Current logic**
```python
special_image_mask = inputs_embeds == self.get_input_embeddings()(
torch.tensor(self.config.image_token_id, dtype=torch.long, device=inputs_embeds.device)
)
```
This performs an elementwise comparison between a tensor of shape `[batch_size, seq_len, hidden_dim]` and a single embedding vector of shape `[hidden_dim]`. As a result, the mask may contain `True` values at incorrect positions if only some dimensions match, which causes the following check to fail:
```python
inputs_embeds[special_image_mask].numel() != image_features.numel()
```
**Suggested Fix**
```python
image_embed = self.get_input_embeddings()(
torch.tensor(self.config.image_token_id, dtype=torch.long, device=inputs_embeds.device)
)
special_image_mask = (inputs_embeds == image_embed).all(dim=-1).unsqueeze(-1).expand_as(inputs_embeds)
```
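For illustration, a minimal NumPy sketch of the broadcasting pitfall (toy values, not actual model embeddings): the elementwise `==` marks every individually matching dimension, while reducing with `.all()` over the hidden axis marks only full matches:

```python
import numpy as np

# Toy stand-ins (batch=1, seq_len=3, hidden=4). Values chosen so that the
# text embedding shares *some* dimensions with the image embedding, which
# is exactly what triggers the stray True values.
image_embed = np.array([0.0, 1.0, 0.0, 2.0])
text_embed = np.array([0.0, 5.0, 0.0, 7.0])  # matches image_embed in dims 0 and 2
inputs_embeds = np.stack([text_embed, image_embed, text_embed])[None]  # [1, 3, 4]

# Buggy logic: elementwise ==, with stray True entries at text positions
buggy_mask = inputs_embeds == image_embed
print(int(buggy_mask.sum()))  # 8: the 4 image dims plus 2 stray dims per text token

# Fixed logic: a position is an image token only if every hidden dim matches
fixed_mask = (inputs_embeds == image_embed).all(axis=-1)
print(fixed_mask.tolist())  # [[False, True, False]]
```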
| {
"login": "caicai0402",
"id": 71309688,
"node_id": "MDQ6VXNlcjcxMzA5Njg4",
"avatar_url": "https://avatars.githubusercontent.com/u/71309688?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/caicai0402",
"html_url": "https://github.com/caicai0402",
"followers_url": "https://api.github.com/users/caicai0402/followers",
"following_url": "https://api.github.com/users/caicai0402/following{/other_user}",
"gists_url": "https://api.github.com/users/caicai0402/gists{/gist_id}",
"starred_url": "https://api.github.com/users/caicai0402/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/caicai0402/subscriptions",
"organizations_url": "https://api.github.com/users/caicai0402/orgs",
"repos_url": "https://api.github.com/users/caicai0402/repos",
"events_url": "https://api.github.com/users/caicai0402/events{/privacy}",
"received_events_url": "https://api.github.com/users/caicai0402/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38903/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38903/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38902 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38902/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38902/comments | https://api.github.com/repos/huggingface/transformers/issues/38902/events | https://github.com/huggingface/transformers/pull/38902 | 3,159,233,208 | PR_kwDOCUB6oc6bM5bg | 38,902 | Skip and track hidden failing tests | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T07:10:40 | 2025-10-16T22:45:04 | 2025-10-16T22:45:04 | MEMBER | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38902",
"html_url": "https://github.com/huggingface/transformers/pull/38902",
"diff_url": "https://github.com/huggingface/transformers/pull/38902.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38902.patch",
"merged_at": null
Disables several tests that have been skipped in CI but would have failed had they not been skipped.
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38902/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38902/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38901 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38901/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38901/comments | https://api.github.com/repos/huggingface/transformers/issues/38901/events | https://github.com/huggingface/transformers/pull/38901 | 3,159,195,561 | PR_kwDOCUB6oc6bMxXM | 38,901 | Fix initialization of OneFormer | {
"login": "bvantuan",
"id": 37981884,
"node_id": "MDQ6VXNlcjM3OTgxODg0",
"avatar_url": "https://avatars.githubusercontent.com/u/37981884?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bvantuan",
"html_url": "https://github.com/bvantuan",
"followers_url": "https://api.github.com/users/bvantuan/followers",
"following_url": "https://api.github.com/users/bvantuan/following{/other_user}",
"gists_url": "https://api.github.com/users/bvantuan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bvantuan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bvantuan/subscriptions",
"organizations_url": "https://api.github.com/users/bvantuan/orgs",
"repos_url": "https://api.github.com/users/bvantuan/repos",
"events_url": "https://api.github.com/users/bvantuan/events{/privacy}",
"received_events_url": "https://api.github.com/users/bvantuan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T06:57:50 | 2025-06-27T10:39:38 | 2025-06-27T10:39:38 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38901",
"html_url": "https://github.com/huggingface/transformers/pull/38901",
"diff_url": "https://github.com/huggingface/transformers/pull/38901.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38901.patch",
"merged_at": "2025-06-27T10:39:38"
} | # What does this PR do?
Following #38061 and #38864, I noticed the same error for OneFormer due to duplicate initializations. I essentially applied the same fix as in #38864.
Code to reproduce the bug:
```python
from transformers import (
AutoModelForImageClassification,
OneFormerForUniversalSegmentation,
OneFormerConfig,
)
backbone_name = "microsoft/resnet-18"
# backbone_name = "microsoft/swin-tiny-patch4-window7-224"
def params_match(params1, params2):
return all((p1 == p2).all() for p1, p2 in zip(params1, params2))
# load pretrained backbone model
backbone_model = AutoModelForImageClassification.from_pretrained(backbone_name)
# load OneFormerConfig with a pretrained backbone
config = OneFormerConfig(
backbone=backbone_name,
use_pretrained_backbone=True,
)
oneformer = OneFormerForUniversalSegmentation(config)
# AssertionError: parameters don't match
assert params_match(
backbone_model.base_model.encoder.parameters(),
oneformer.model.pixel_level_module.encoder.encoder.parameters(),
)
```
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@Cyrilvallez
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38901/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38901/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38900 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38900/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38900/comments | https://api.github.com/repos/huggingface/transformers/issues/38900/events | https://github.com/huggingface/transformers/pull/38900 | 3,159,164,440 | PR_kwDOCUB6oc6bMqqh | 38,900 | Deprecate AutoModelForVision2Seq | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T06:44:46 | 2025-07-14T09:42:06 | 2025-07-14T09:42:06 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38900",
"html_url": "https://github.com/huggingface/transformers/pull/38900",
"diff_url": "https://github.com/huggingface/transformers/pull/38900.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38900.patch",
"merged_at": "2025-07-14T09:42:06"
} | # What does this PR do?
As per the title — we'll remove it eventually anyway, so let's start raising warnings now. I'm already asking all new models not to use `Vision2Seq`.
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38900/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38900/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38899 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38899/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38899/comments | https://api.github.com/repos/huggingface/transformers/issues/38899/events | https://github.com/huggingface/transformers/pull/38899 | 3,158,909,905 | PR_kwDOCUB6oc6bLziY | 38,899 | fix: add __bool__ operator to tokenizer to avoid bloated asserts | {
"login": "kallewoof",
"id": 250224,
"node_id": "MDQ6VXNlcjI1MDIyNA==",
"avatar_url": "https://avatars.githubusercontent.com/u/250224?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kallewoof",
"html_url": "https://github.com/kallewoof",
"followers_url": "https://api.github.com/users/kallewoof/followers",
"following_url": "https://api.github.com/users/kallewoof/following{/other_user}",
"gists_url": "https://api.github.com/users/kallewoof/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kallewoof/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kallewoof/subscriptions",
"organizations_url": "https://api.github.com/users/kallewoof/orgs",
"repos_url": "https://api.github.com/users/kallewoof/repos",
"events_url": "https://api.github.com/users/kallewoof/events{/privacy}",
"received_events_url": "https://api.github.com/users/kallewoof/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T04:44:26 | 2025-06-23T14:42:14 | 2025-06-23T14:32:17 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38899",
"html_url": "https://github.com/huggingface/transformers/pull/38899",
"diff_url": "https://github.com/huggingface/transformers/pull/38899.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38899.patch",
"merged_at": "2025-06-23T14:32:17"
} | When a user does `assert tokenizer` to ensure that the tokenizer is not None, they inadvertently set off a rather expensive process in the `__len__()` operator. This fix adds a trivial `__bool__()` that returns True, so that asserting a `None` tokenizer still fails while asserting an actual tokenizer succeeds, without invoking the length operator.
A user can already avoid this by writing `assert tokenizer is not None`, but failing to do so is a big unexpected gotcha that personally shot me in the foot, so I figured I'd at least propose a fix, which is trivial.
# What does this PR do?
This PR results in a roughly 300,000x speed-up when trying to `assert` a tokenizer object:
```python
import time
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("some-tokenizer")
start = time.perf_counter()
for i in range(4000):
assert tokenizer
elapsed = time.perf_counter() - start
print(elapsed)
# PRE PR: 50.92993460899743
# POST PR: 0.0001564559934195131
```
## Before submitting
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker or @n1t0 might have opinions, otherwise not sure who might review this. It's very trivial.
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38899/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38899/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38898 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38898/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38898/comments | https://api.github.com/repos/huggingface/transformers/issues/38898/events | https://github.com/huggingface/transformers/issues/38898 | 3,158,821,711 | I_kwDOCUB6oc68R8tP | 38,898 | Qwen2_5OmniProcessor.__init__() got multiple values for argument 'image_processor' | {
"login": "WenmuZhou",
"id": 12406017,
"node_id": "MDQ6VXNlcjEyNDA2MDE3",
"avatar_url": "https://avatars.githubusercontent.com/u/12406017?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/WenmuZhou",
"html_url": "https://github.com/WenmuZhou",
"followers_url": "https://api.github.com/users/WenmuZhou/followers",
"following_url": "https://api.github.com/users/WenmuZhou/following{/other_user}",
"gists_url": "https://api.github.com/users/WenmuZhou/gists{/gist_id}",
"starred_url": "https://api.github.com/users/WenmuZhou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/WenmuZhou/subscriptions",
"organizations_url": "https://api.github.com/users/WenmuZhou/orgs",
"repos_url": "https://api.github.com/users/WenmuZhou/repos",
"events_url": "https://api.github.com/users/WenmuZhou/events{/privacy}",
"received_events_url": "https://api.github.com/users/WenmuZhou/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-19T03:51:19 | 2025-07-01T03:36:07 | 2025-07-01T03:36:07 | NONE | null | null | null | null | ### System Info
### Your current environment
```text
INFO 06-19 02:57:16 [__init__.py:244] Automatically detected platform cuda.
Collecting environment information...
==============================
System Info
==============================
OS : Ubuntu 22.04.4 LTS (x86_64)
GCC version : (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
Clang version : Could not collect
CMake version : version 3.30.2
Libc version : glibc-2.35
==============================
PyTorch Info
==============================
PyTorch version : 2.7.0+cu126
Is debug build : False
CUDA used to build PyTorch : 12.6
ROCM used to build PyTorch : N/A
==============================
Python Environment
==============================
Python version : 3.10.12 (main, Jul 29 2024, 16:56:48) [GCC 11.4.0] (64-bit runtime)
Python platform : Linux-5.4.0-167-generic-x86_64-with-glibc2.35
==============================
CUDA / GPU Info
==============================
Is CUDA available : True
CUDA runtime version : 12.6.20
CUDA_MODULE_LOADING set to : LAZY
GPU models and configuration : GPU 0: NVIDIA A100-SXM4-40GB
Nvidia driver version : 535.216.03
cuDNN version : Probably one of the following:
/usr/lib/x86_64-linux-gnu/libcudnn.so.9.3.0
/usr/lib/x86_64-linux-gnu/libcudnn_adv.so.9.3.0
/usr/lib/x86_64-linux-gnu/libcudnn_cnn.so.9.3.0
/usr/lib/x86_64-linux-gnu/libcudnn_engines_precompiled.so.9.3.0
/usr/lib/x86_64-linux-gnu/libcudnn_engines_runtime_compiled.so.9.3.0
/usr/lib/x86_64-linux-gnu/libcudnn_graph.so.9.3.0
/usr/lib/x86_64-linux-gnu/libcudnn_heuristic.so.9.3.0
/usr/lib/x86_64-linux-gnu/libcudnn_ops.so.9.3.0
HIP runtime version : N/A
MIOpen runtime version : N/A
Is XNNPACK available : True
==============================
CPU Info
==============================
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Address sizes: 43 bits physical, 48 bits virtual
Byte Order: Little Endian
CPU(s): 128
On-line CPU(s) list: 0-127
Vendor ID: AuthenticAMD
Model name: AMD EPYC 7763 64-Core Processor
CPU family: 25
Model: 1
Thread(s) per core: 1
Core(s) per socket: 64
Socket(s): 2
Stepping: 1
Frequency boost: enabled
CPU max MHz: 2450.0000
CPU min MHz: 1500.0000
BogoMIPS: 4890.72
Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf pni pclmulqdq monitor ssse3 fma cx16 pcid sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 invpcid_single hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 erms invpcid cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr wbnoinvd arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold v_vmsave_vmload vgif umip pku ospke vaes vpclmulqdq rdpid overflow_recov succor smca sme sev sev_es
Virtualization: AMD-V
L1d cache: 4 MiB (128 instances)
L1i cache: 4 MiB (128 instances)
L2 cache: 64 MiB (128 instances)
L3 cache: 512 MiB (16 instances)
NUMA node(s): 4
NUMA node0 CPU(s): 0-31
NUMA node1 CPU(s): 32-63
NUMA node2 CPU(s): 64-95
NUMA node3 CPU(s): 96-127
Vulnerability Gather data sampling: Not affected
Vulnerability Itlb multihit: Not affected
Vulnerability L1tf: Not affected
Vulnerability Mds: Not affected
Vulnerability Meltdown: Not affected
Vulnerability Mmio stale data: Not affected
Vulnerability Retbleed: Not affected
Vulnerability Spec store bypass: Vulnerable
Vulnerability Spectre v1: Vulnerable: __user pointer sanitization and usercopy barriers only; no swapgs barriers
Vulnerability Spectre v2: Vulnerable, IBPB: disabled, STIBP: disabled, PBRSB-eIBRS: Not affected
Vulnerability Srbds: Not affected
Vulnerability Tsx async abort: Not affected
==============================
Versions of relevant libraries
==============================
[pip3] flake8==7.1.1
[pip3] mypy-extensions==1.0.0
[pip3] numpy==1.26.4
[pip3] nvidia-cublas-cu12==12.6.4.1
[pip3] nvidia-cuda-cupti-cu12==12.6.80
[pip3] nvidia-cuda-nvrtc-cu12==12.6.77
[pip3] nvidia-cuda-runtime-cu12==12.6.77
[pip3] nvidia-cudnn-cu12==9.5.1.17
[pip3] nvidia-cudnn-frontend==1.5.2
[pip3] nvidia-cufft-cu12==11.3.0.4
[pip3] nvidia-cufile-cu12==1.11.1.6
[pip3] nvidia-curand-cu12==10.3.7.77
[pip3] nvidia-cusolver-cu12==11.7.1.2
[pip3] nvidia-cusparse-cu12==12.5.4.2
[pip3] nvidia-cusparselt-cu12==0.6.3
[pip3] nvidia-dali-cuda120==1.40.0
[pip3] nvidia-ml-py==12.575.51
[pip3] nvidia-ml-py3==7.352.0
[pip3] nvidia-modelopt==0.15.0
[pip3] nvidia-nccl-cu12==2.26.2
[pip3] nvidia-nvimgcodec-cu12==0.3.0.5
[pip3] nvidia-nvjitlink-cu12==12.6.85
[pip3] nvidia-nvtx-cu12==12.6.77
[pip3] nvidia-pyindex==1.0.9
[pip3] nvidia-smi==0.1.3
[pip3] onnx==1.16.1
[pip3] onnxruntime-gpu==1.17.1
[pip3] onnxsim==0.4.36
[pip3] open-clip-torch==2.24.0
[pip3] optree==0.13.0
[pip3] pynvml==12.0.0
[pip3] pytorch-lightning==2.2.4
[pip3] pytorch-triton==3.0.0+dedb7bdf3
[pip3] pyzmq==26.2.0
[pip3] sentence-transformers==4.1.0
[pip3] torch==2.7.0
[pip3] torchaudio==2.7.0
[pip3] torchmetrics==1.4.0.post0
[pip3] torchpack==0.3.1
[pip3] torchprofile==0.0.4
[pip3] torchvision==0.22.0
[pip3] transformers==f7b21822e32fba8bd92a939db7f352d1623f09e4
[pip3] transformers-stream-generator==0.0.5
[pip3] triton==3.3.0
[conda] Could not collect
==============================
vLLM Info
==============================
ROCM Version : Could not collect
Neuron SDK Version : N/A
vLLM Version : 0.9.1
vLLM Build Flags:
CUDA Archs: 5.2 6.0 6.1 7.0 7.2 7.5 8.0 8.6 8.7 9.0+PTX; ROCm: Disabled; Neuron: Disabled
GPU Topology:
GPU0 NIC0 NIC1 NIC2 NIC3 CPU Affinity NUMA Affinity GPU NUMA ID
GPU0 X PXB SYS SYS SYS 32-63 1 N/A
NIC0 PXB X SYS SYS SYS
NIC1 SYS SYS X SYS SYS
NIC2 SYS SYS SYS X PIX
NIC3 SYS SYS SYS PIX X
Legend:
X = Self
SYS = Connection traversing PCIe as well as the SMP interconnect between NUMA nodes (e.g., QPI/UPI)
NODE = Connection traversing PCIe as well as the interconnect between PCIe Host Bridges within a NUMA node
PHB = Connection traversing PCIe as well as a PCIe Host Bridge (typically the CPU)
PXB = Connection traversing multiple PCIe bridges (without traversing the PCIe Host Bridge)
PIX = Connection traversing at most a single PCIe bridge
NV# = Connection traversing a bonded set of # NVLinks
NIC Legend:
NIC0: mlx5_0
NIC1: mlx5_1
NIC2: mlx5_2
NIC3: mlx5_3
==============================
Environment Variables
==============================
NVIDIA_VISIBLE_DEVICES=GPU-e77165e1-0491-57a4-d10c-852cbce0cf61
CUBLAS_VERSION=12.6.0.22
NVIDIA_REQUIRE_CUDA=cuda>=9.0
CUDA_CACHE_DISABLE=1
TORCH_CUDA_ARCH_LIST=5.2 6.0 6.1 7.0 7.2 7.5 8.0 8.6 8.7 9.0+PTX
NCCL_VERSION=2.22.3
NVIDIA_DRIVER_CAPABILITIES=video,compute,utility,graphics
NVIDIA_PRODUCT_NAME=PyTorch
CUDA_VERSION=12.6.0.022
PYTORCH_VERSION=2.5.0a0+872d972
PYTORCH_BUILD_NUMBER=0
CUDNN_FRONTEND_VERSION=1.5.2
CUDNN_VERSION=9.3.0.75
PYTORCH_HOME=/opt/pytorch/pytorch
LD_LIBRARY_PATH=/usr/local/lib/python3.10/dist-packages/torch/lib:/usr/local/lib/python3.10/dist-packages/torch_tensorrt/lib:/usr/local/cuda/compat/lib:/usr/local/nvidia/lib:/usr/local/nvidia/lib64
NVIDIA_BUILD_ID=107063150
CUDA_DRIVER_VERSION=560.35.03
PYTORCH_BUILD_VERSION=2.5.0a0+872d972
CUDA_HOME=/usr/local/cuda
CUDA_HOME=/usr/local/cuda
CUDA_MODULE_LOADING=LAZY
NVIDIA_REQUIRE_JETPACK_HOST_MOUNTS=
NVIDIA_PYTORCH_VERSION=24.08
TORCH_ALLOW_TF32_CUBLAS_OVERRIDE=1
NCCL_CUMEM_ENABLE=0
PYTORCH_NVML_BASED_CUDA_CHECK=1
TORCHINDUCTOR_COMPILE_THREADS=1
```
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
### 🐛 Describe the bug
When I use vLLM to run inference with the Qwen/Qwen2.5-Omni-3B model, I hit an initialization error
```python
from vllm import LLM, SamplingParams
llm = LLM(model="Qwen/Qwen2.5-Omni-3B")
outputs = llm.generate('你是谁', SamplingParams(temperature=0.8, top_p=0.95))
print(outputs)
```
The complete log is as follows
```log
INFO 06-19 02:48:20 [__init__.py:244] Automatically detected platform cuda.
Unrecognized keys in `rope_scaling` for 'rope_type'='default': {'mrope_section'}
INFO 06-19 02:49:07 [config.py:823] This model supports multiple tasks: {'classify', 'reward', 'score', 'generate', 'embed'}. Defaulting to 'generate'.
INFO 06-19 02:49:08 [config.py:2195] Chunked prefill is enabled with max_num_batched_tokens=8192.
INFO 06-19 02:49:16 [core.py:455] Waiting for init message from front-end.
INFO 06-19 02:49:16 [core.py:70] Initializing a V1 LLM engine (v0.9.1) with config: model='Qwen/Qwen2.5-Omni-3B', speculative_config=None, tokenizer='Qwen/Qwen2.5-Omni-3B', skip_tokenizer_init=False, tokenizer_mode=auto, revision=None, override_neuron_config={}, tokenizer_revision=None, trust_remote_code=False, dtype=torch.bfloat16, max_seq_len=32768, download_dir=None, load_format=auto, tensor_parallel_size=1, pipeline_parallel_size=1, disable_custom_all_reduce=False, quantization=None, enforce_eager=False, kv_cache_dtype=auto, device_config=cuda, decoding_config=DecodingConfig(backend='auto', disable_fallback=False, disable_any_whitespace=False, disable_additional_properties=False, reasoning_backend=''), observability_config=ObservabilityConfig(show_hidden_metrics_for_version=None, otlp_traces_endpoint=None, collect_detailed_traces=None), seed=0, served_model_name=Qwen/Qwen2.5-Omni-3B, num_scheduler_steps=1, multi_step_stream_outputs=True, enable_prefix_caching=True, chunked_prefill_enabled=True, use_async_output_proc=True, pooler_config=None, compilation_config={"level":3,"debug_dump_path":"","cache_dir":"","backend":"","custom_ops":["none"],"splitting_ops":["vllm.unified_attention","vllm.unified_attention_with_output"],"use_inductor":true,"compile_sizes":[],"inductor_compile_config":{"enable_auto_functionalized_v2":false},"inductor_passes":{},"use_cudagraph":true,"cudagraph_num_of_warmups":1,"cudagraph_capture_sizes":[512,504,496,488,480,472,464,456,448,440,432,424,416,408,400,392,384,376,368,360,352,344,336,328,320,312,304,296,288,280,272,264,256,248,240,232,224,216,208,200,192,184,176,168,160,152,144,136,128,120,112,104,96,88,80,72,64,56,48,40,32,24,16,8,4,2,1],"cudagraph_copy_inputs":false,"full_cuda_graph":false,"max_capture_size":512,"local_cache_dir":null}
WARNING 06-19 02:49:17 [utils.py:2737] Methods determine_num_available_blocks,device_config,get_cache_block_size_bytes,initialize_cache not implemented in <vllm.v1.worker.gpu_worker.Worker object at 0x7f531dc0a050>
INFO 06-19 02:49:18 [parallel_state.py:1065] rank 0 in world size 1 is assigned as DP rank 0, PP rank 0, TP rank 0, EP rank 0
Using a slow image processor as `use_fast` is unset and a slow processor was saved with this model. `use_fast=True` will be the default behavior in v4.52, even if the model was saved with a slow processor. This will result in minor differences in outputs. You'll still be able to use a slow processor with `use_fast=False`.
You have video processor config saved in `preprocessor.json` file which is deprecated. Video processor configs should be saved in their own `video_preprocessor.json` file. You can rename the file or load and save the processor back which renames it automatically. Loading from `preprocessor.json` will be removed in v5.0.
ERROR 06-19 02:49:35 [core.py:515] EngineCore failed to start.
ERROR 06-19 02:49:35 [core.py:515] Traceback (most recent call last):
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core.py", line 506, in run_engine_core
ERROR 06-19 02:49:35 [core.py:515] engine_core = EngineCoreProc(*args, **kwargs)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core.py", line 390, in __init__
ERROR 06-19 02:49:35 [core.py:515] super().__init__(vllm_config, executor_class, log_stats,
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core.py", line 76, in __init__
ERROR 06-19 02:49:35 [core.py:515] self.model_executor = executor_class(vllm_config)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/executor/executor_base.py", line 53, in __init__
ERROR 06-19 02:49:35 [core.py:515] self._init_executor()
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/executor/uniproc_executor.py", line 47, in _init_executor
ERROR 06-19 02:49:35 [core.py:515] self.collective_rpc("init_device")
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/executor/uniproc_executor.py", line 57, in collective_rpc
ERROR 06-19 02:49:35 [core.py:515] answer = run_method(self.driver_worker, method, args, kwargs)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/utils.py", line 2671, in run_method
ERROR 06-19 02:49:35 [core.py:515] return func(*args, **kwargs)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/worker/worker_base.py", line 606, in init_device
ERROR 06-19 02:49:35 [core.py:515] self.worker.init_device() # type: ignore
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/worker/gpu_worker.py", line 160, in init_device
ERROR 06-19 02:49:35 [core.py:515] self.model_runner: GPUModelRunner = GPUModelRunner(
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/worker/gpu_model_runner.py", line 129, in __init__
ERROR 06-19 02:49:35 [core.py:515] encoder_compute_budget, encoder_cache_size = compute_encoder_budget(
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/core/encoder_cache_manager.py", line 95, in compute_encoder_budget
ERROR 06-19 02:49:35 [core.py:515] ) = _compute_encoder_budget_multimodal(
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/core/encoder_cache_manager.py", line 125, in _compute_encoder_budget_multimodal
ERROR 06-19 02:49:35 [core.py:515] .get_max_tokens_per_item_by_nonzero_modality(model_config)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/registry.py", line 153, in get_max_tokens_per_item_by_nonzero_modality
ERROR 06-19 02:49:35 [core.py:515] mm_limits = self.get_mm_limits_per_prompt(model_config)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/registry.py", line 206, in get_mm_limits_per_prompt
ERROR 06-19 02:49:35 [core.py:515] processor = self.create_processor(model_config, disable_cache=False)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/registry.py", line 281, in create_processor
ERROR 06-19 02:49:35 [core.py:515] return factories.build_processor(ctx, cache=cache)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/registry.py", line 88, in build_processor
ERROR 06-19 02:49:35 [core.py:515] return self.processor(info, dummy_inputs_builder, cache=cache)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/processing.py", line 1131, in __init__
ERROR 06-19 02:49:35 [core.py:515] self.data_parser = self._get_data_parser()
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/model_executor/models/qwen2_5_omni_thinker.py", line 238, in _get_data_parser
ERROR 06-19 02:49:35 [core.py:515] feature_extractor = self.info.get_feature_extractor()
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/model_executor/models/qwen2_5_omni_thinker.py", line 170, in get_feature_extractor
ERROR 06-19 02:49:35 [core.py:515] hf_processor = self.get_hf_processor(sampling_rate=sampling_rate)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/model_executor/models/qwen2_5_omni_thinker.py", line 147, in get_hf_processor
ERROR 06-19 02:49:35 [core.py:515] processor = self.ctx.get_hf_processor(
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/inputs/registry.py", line 131, in get_hf_processor
ERROR 06-19 02:49:35 [core.py:515] return super().get_hf_processor(
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/inputs/registry.py", line 94, in get_hf_processor
ERROR 06-19 02:49:35 [core.py:515] return cached_processor_from_config(
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/transformers_utils/processor.py", line 110, in cached_processor_from_config
ERROR 06-19 02:49:35 [core.py:515] return cached_get_processor(
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/transformers_utils/processor.py", line 72, in get_processor
ERROR 06-19 02:49:35 [core.py:515] processor = processor_factory.from_pretrained(
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/transformers/processing_utils.py", line 1213, in from_pretrained
ERROR 06-19 02:49:35 [core.py:515] return cls.from_args_and_dict(args, processor_dict, **kwargs)
ERROR 06-19 02:49:35 [core.py:515] File "/home/jun.zhou10/.local/lib/python3.10/site-packages/transformers/processing_utils.py", line 1014, in from_args_and_dict
ERROR 06-19 02:49:35 [core.py:515] processor = cls(*args, **valid_kwargs)
ERROR 06-19 02:49:35 [core.py:515] TypeError: Qwen2_5OmniProcessor.__init__() got multiple values for argument 'image_processor'
Process EngineCore_0:
Traceback (most recent call last):
File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core.py", line 519, in run_engine_core
raise e
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core.py", line 506, in run_engine_core
engine_core = EngineCoreProc(*args, **kwargs)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core.py", line 390, in __init__
super().__init__(vllm_config, executor_class, log_stats,
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core.py", line 76, in __init__
self.model_executor = executor_class(vllm_config)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/executor/executor_base.py", line 53, in __init__
self._init_executor()
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/executor/uniproc_executor.py", line 47, in _init_executor
self.collective_rpc("init_device")
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/executor/uniproc_executor.py", line 57, in collective_rpc
answer = run_method(self.driver_worker, method, args, kwargs)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/utils.py", line 2671, in run_method
return func(*args, **kwargs)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/worker/worker_base.py", line 606, in init_device
self.worker.init_device() # type: ignore
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/worker/gpu_worker.py", line 160, in init_device
self.model_runner: GPUModelRunner = GPUModelRunner(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/worker/gpu_model_runner.py", line 129, in __init__
encoder_compute_budget, encoder_cache_size = compute_encoder_budget(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/core/encoder_cache_manager.py", line 95, in compute_encoder_budget
) = _compute_encoder_budget_multimodal(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/core/encoder_cache_manager.py", line 125, in _compute_encoder_budget_multimodal
.get_max_tokens_per_item_by_nonzero_modality(model_config)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/registry.py", line 153, in get_max_tokens_per_item_by_nonzero_modality
mm_limits = self.get_mm_limits_per_prompt(model_config)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/registry.py", line 206, in get_mm_limits_per_prompt
processor = self.create_processor(model_config, disable_cache=False)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/registry.py", line 281, in create_processor
return factories.build_processor(ctx, cache=cache)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/registry.py", line 88, in build_processor
return self.processor(info, dummy_inputs_builder, cache=cache)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/multimodal/processing.py", line 1131, in __init__
self.data_parser = self._get_data_parser()
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/model_executor/models/qwen2_5_omni_thinker.py", line 238, in _get_data_parser
feature_extractor = self.info.get_feature_extractor()
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/model_executor/models/qwen2_5_omni_thinker.py", line 170, in get_feature_extractor
hf_processor = self.get_hf_processor(sampling_rate=sampling_rate)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/model_executor/models/qwen2_5_omni_thinker.py", line 147, in get_hf_processor
processor = self.ctx.get_hf_processor(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/inputs/registry.py", line 131, in get_hf_processor
return super().get_hf_processor(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/inputs/registry.py", line 94, in get_hf_processor
return cached_processor_from_config(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/transformers_utils/processor.py", line 110, in cached_processor_from_config
return cached_get_processor(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/transformers_utils/processor.py", line 72, in get_processor
processor = processor_factory.from_pretrained(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/transformers/processing_utils.py", line 1213, in from_pretrained
return cls.from_args_and_dict(args, processor_dict, **kwargs)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/transformers/processing_utils.py", line 1014, in from_args_and_dict
processor = cls(*args, **valid_kwargs)
TypeError: Qwen2_5OmniProcessor.__init__() got multiple values for argument 'image_processor'
Traceback (most recent call last):
File "/dc-hl/jun.zhou10/swift/eval/infer_vllm.py", line 65, in <module>
llm = LLM(model="Qwen/Qwen2.5-Omni-3B")
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 243, in __init__
self.llm_engine = LLMEngine.from_engine_args(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 501, in from_engine_args
return engine_cls.from_vllm_config(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/llm_engine.py", line 124, in from_vllm_config
return cls(vllm_config=vllm_config,
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/llm_engine.py", line 101, in __init__
self.engine_core = EngineCoreClient.make_client(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core_client.py", line 75, in make_client
return SyncMPClient(vllm_config, executor_class, log_stats)
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core_client.py", line 558, in __init__
super().__init__(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core_client.py", line 422, in __init__
self._init_engines_direct(vllm_config, local_only,
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core_client.py", line 491, in _init_engines_direct
self._wait_for_engine_startup(handshake_socket, input_address,
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/engine/core_client.py", line 511, in _wait_for_engine_startup
wait_for_engine_startup(
File "/home/jun.zhou10/.local/lib/python3.10/site-packages/vllm/v1/utils.py", line 494, in wait_for_engine_startup
raise RuntimeError("Engine core initialization failed. "
RuntimeError: Engine core initialization failed. See root cause above. Failed core proc(s): {}
```
But it runs normally under transformers 4.52.4
### Expected behavior
Get the correct output | {
"login": "WenmuZhou",
"id": 12406017,
"node_id": "MDQ6VXNlcjEyNDA2MDE3",
"avatar_url": "https://avatars.githubusercontent.com/u/12406017?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/WenmuZhou",
"html_url": "https://github.com/WenmuZhou",
"followers_url": "https://api.github.com/users/WenmuZhou/followers",
"following_url": "https://api.github.com/users/WenmuZhou/following{/other_user}",
"gists_url": "https://api.github.com/users/WenmuZhou/gists{/gist_id}",
"starred_url": "https://api.github.com/users/WenmuZhou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/WenmuZhou/subscriptions",
"organizations_url": "https://api.github.com/users/WenmuZhou/orgs",
"repos_url": "https://api.github.com/users/WenmuZhou/repos",
"events_url": "https://api.github.com/users/WenmuZhou/events{/privacy}",
"received_events_url": "https://api.github.com/users/WenmuZhou/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38898/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38898/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38897 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38897/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38897/comments | https://api.github.com/repos/huggingface/transformers/issues/38897/events | https://github.com/huggingface/transformers/pull/38897 | 3,158,509,622 | PR_kwDOCUB6oc6bKdL_ | 38,897 | Fix unnecessary super calls | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-19T00:24:02 | 2025-06-19T13:55:18 | 2025-06-19T11:45:51 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38897",
"html_url": "https://github.com/huggingface/transformers/pull/38897",
"diff_url": "https://github.com/huggingface/transformers/pull/38897.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38897.patch",
"merged_at": "2025-06-19T11:45:51"
} | # What does this PR do?
These are detected and fixed by `ruff UP008`. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38897/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38897/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38896 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38896/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38896/comments | https://api.github.com/repos/huggingface/transformers/issues/38896/events | https://github.com/huggingface/transformers/pull/38896 | 3,158,327,186 | PR_kwDOCUB6oc6bJ1eD | 38,896 | Update SuperPoint model card | {
"login": "sbucaille",
"id": 24275548,
"node_id": "MDQ6VXNlcjI0Mjc1NTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sbucaille",
"html_url": "https://github.com/sbucaille",
"followers_url": "https://api.github.com/users/sbucaille/followers",
"following_url": "https://api.github.com/users/sbucaille/following{/other_user}",
"gists_url": "https://api.github.com/users/sbucaille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sbucaille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sbucaille/subscriptions",
"organizations_url": "https://api.github.com/users/sbucaille/orgs",
"repos_url": "https://api.github.com/users/sbucaille/repos",
"events_url": "https://api.github.com/users/sbucaille/events{/privacy}",
"received_events_url": "https://api.github.com/users/sbucaille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T22:22:23 | 2025-07-06T13:22:41 | 2025-06-26T17:13:06 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38896",
"html_url": "https://github.com/huggingface/transformers/pull/38896",
"diff_url": "https://github.com/huggingface/transformers/pull/38896.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38896.patch",
"merged_at": "2025-06-26T17:13:06"
} | # What does this PR do?
Updates SuperPoint model card for #36979
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
@stevhliu
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38896/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38896/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38895 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38895/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38895/comments | https://api.github.com/repos/huggingface/transformers/issues/38895/events | https://github.com/huggingface/transformers/pull/38895 | 3,157,922,068 | PR_kwDOCUB6oc6bIchU | 38,895 | Sandeepyadav1478/2025 06 19 deberta v2 model card update | {
"login": "sandeepyadav1478",
"id": 31266882,
"node_id": "MDQ6VXNlcjMxMjY2ODgy",
"avatar_url": "https://avatars.githubusercontent.com/u/31266882?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sandeepyadav1478",
"html_url": "https://github.com/sandeepyadav1478",
"followers_url": "https://api.github.com/users/sandeepyadav1478/followers",
"following_url": "https://api.github.com/users/sandeepyadav1478/following{/other_user}",
"gists_url": "https://api.github.com/users/sandeepyadav1478/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sandeepyadav1478/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sandeepyadav1478/subscriptions",
"organizations_url": "https://api.github.com/users/sandeepyadav1478/orgs",
"repos_url": "https://api.github.com/users/sandeepyadav1478/repos",
"events_url": "https://api.github.com/users/sandeepyadav1478/events{/privacy}",
"received_events_url": "https://api.github.com/users/sandeepyadav1478/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T19:15:37 | 2025-06-27T17:35:30 | 2025-06-27T17:35:30 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38895",
"html_url": "https://github.com/huggingface/transformers/pull/38895",
"diff_url": "https://github.com/huggingface/transformers/pull/38895.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38895.patch",
"merged_at": "2025-06-27T17:35:30"
} | # What does this PR do?
Updates the DeBERTa-v2 card according to the description in [#36979](https://github.com/huggingface/transformers/issues/36979) to standardize all model cards' look.
The PR includes:
- Model description
- Examples for Pipeline, AutoModel, and command line usage
Note: AttentionMaskVisualizer is not applicable either, as DeBERTa-v2 is currently not supported
- [x] This PR improves the docs
Who can review?
@stevhliu Please have a look and let me know if any further work is needed. Thank you!
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38895/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38895/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38894 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38894/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38894/comments | https://api.github.com/repos/huggingface/transformers/issues/38894/events | https://github.com/huggingface/transformers/pull/38894 | 3,157,847,410 | PR_kwDOCUB6oc6bIMPe | 38,894 | docs: update LLaVA-NeXT model card | {
"login": "Bpriya42",
"id": 81420085,
"node_id": "MDQ6VXNlcjgxNDIwMDg1",
"avatar_url": "https://avatars.githubusercontent.com/u/81420085?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Bpriya42",
"html_url": "https://github.com/Bpriya42",
"followers_url": "https://api.github.com/users/Bpriya42/followers",
"following_url": "https://api.github.com/users/Bpriya42/following{/other_user}",
"gists_url": "https://api.github.com/users/Bpriya42/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Bpriya42/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Bpriya42/subscriptions",
"organizations_url": "https://api.github.com/users/Bpriya42/orgs",
"repos_url": "https://api.github.com/users/Bpriya42/repos",
"events_url": "https://api.github.com/users/Bpriya42/events{/privacy}",
"received_events_url": "https://api.github.com/users/Bpriya42/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T18:44:08 | 2025-07-09T18:32:40 | 2025-07-09T18:32:40 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38894",
"html_url": "https://github.com/huggingface/transformers/pull/38894",
"diff_url": "https://github.com/huggingface/transformers/pull/38894.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38894.patch",
"merged_at": "2025-07-09T18:32:40"
} | # What does this PR do?
As mentioned in the issue https://github.com/huggingface/transformers/issues/36979 this PR updates the documentation of the LLaVA-NeXT model, which will now be aligned with the standardized format for all the docs.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@stevhliu, please let me know if any changes are needed.
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38894/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38894/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38893 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38893/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38893/comments | https://api.github.com/repos/huggingface/transformers/issues/38893/events | https://github.com/huggingface/transformers/pull/38893 | 3,157,664,957 | PR_kwDOCUB6oc6bHkVn | 38,893 | Fix/deprecate max size conditional detr | {
"login": "druvdub",
"id": 59387969,
"node_id": "MDQ6VXNlcjU5Mzg3OTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/59387969?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/druvdub",
"html_url": "https://github.com/druvdub",
"followers_url": "https://api.github.com/users/druvdub/followers",
"following_url": "https://api.github.com/users/druvdub/following{/other_user}",
"gists_url": "https://api.github.com/users/druvdub/gists{/gist_id}",
"starred_url": "https://api.github.com/users/druvdub/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/druvdub/subscriptions",
"organizations_url": "https://api.github.com/users/druvdub/orgs",
"repos_url": "https://api.github.com/users/druvdub/repos",
"events_url": "https://api.github.com/users/druvdub/events{/privacy}",
"received_events_url": "https://api.github.com/users/druvdub/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-18T17:25:18 | 2025-06-18T17:25:18 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38893",
"html_url": "https://github.com/huggingface/transformers/pull/38893",
"diff_url": "https://github.com/huggingface/transformers/pull/38893.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38893.patch",
"merged_at": null
} | # What does this PR do?
Fixes #37939
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case #37939
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@amyeroberts, @qubvel | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38893/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38893/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38892 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38892/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38892/comments | https://api.github.com/repos/huggingface/transformers/issues/38892/events | https://github.com/huggingface/transformers/pull/38892 | 3,157,422,302 | PR_kwDOCUB6oc6bGv-Q | 38,892 | Allow make-fixup on main branch, albeit slowly | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T16:00:09 | 2025-06-19T14:23:01 | 2025-06-19T14:23:00 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38892",
"html_url": "https://github.com/huggingface/transformers/pull/38892",
"diff_url": "https://github.com/huggingface/transformers/pull/38892.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38892.patch",
"merged_at": "2025-06-19T14:22:59"
} | I've seen a couple of cases of users making a fork and then developing on their `main` branch. This causes problems, but one of the more confusing ones is that our style tools stop working because they compare against `main` to generate a diff. Ideally, people wouldn't do this, but we can reduce the errors a little with a couple of tweaks.
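One way to picture the fallback described below, as a rough sketch with assumed names (the real logic lives in the repository's style scripts, which diff against `main` to pick the files to check):

```python
# Hedged sketch only; function and return values are invented for illustration.
def files_to_check(current_branch: str) -> str:
    if current_branch == "main":
        # No base branch to diff against: fall back to checking everything.
        # Slow, but correct instead of erroring out.
        return "ALL_FILES"
    # Normal case: only files changed relative to main.
    return "DIFF_AGAINST_MAIN"

print(files_to_check("main"))        # ALL_FILES
print(files_to_check("my-feature"))  # DIFF_AGAINST_MAIN
```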
This PR updates the behaviour - if you're running on `main`, no diff is generated and all files are checked. This is slow, but at least it's correct and doesn't throw an error! | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38892/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38892/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38891 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38891/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38891/comments | https://api.github.com/repos/huggingface/transformers/issues/38891/events | https://github.com/huggingface/transformers/issues/38891 | 3,157,374,654 | I_kwDOCUB6oc68Mba- | 38,891 | Add EdgeTam to Transformers 🤗 | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | open | false | null | [] | null | [] | 2025-06-18T15:46:45 | 2025-06-19T17:01:29 | null | MEMBER | null | null | null | null | ### Model description
Happy to take this one, as I'm working on Sam2 and the implementation looks very similar.
I'll have to dive deeper into the code to check whether this even needs to be a separate model.
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
https://github.com/facebookresearch/EdgeTAM | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38891/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38891/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/38889 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38889/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38889/comments | https://api.github.com/repos/huggingface/transformers/issues/38889/events | https://github.com/huggingface/transformers/issues/38889 | 3,157,287,284 | I_kwDOCUB6oc68MGF0 | 38,889 | The wrong config parameter found in src/transformers/models/qwen2_5_vl/configuration_qwen2_5_vl.py. | {
"login": "Bytes-Lin",
"id": 73384757,
"node_id": "MDQ6VXNlcjczMzg0NzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/73384757?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Bytes-Lin",
"html_url": "https://github.com/Bytes-Lin",
"followers_url": "https://api.github.com/users/Bytes-Lin/followers",
"following_url": "https://api.github.com/users/Bytes-Lin/following{/other_user}",
"gists_url": "https://api.github.com/users/Bytes-Lin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Bytes-Lin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Bytes-Lin/subscriptions",
"organizations_url": "https://api.github.com/users/Bytes-Lin/orgs",
"repos_url": "https://api.github.com/users/Bytes-Lin/repos",
"events_url": "https://api.github.com/users/Bytes-Lin/events{/privacy}",
"received_events_url": "https://api.github.com/users/Bytes-Lin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-18T15:18:29 | 2025-07-27T08:02:37 | 2025-07-27T08:02:37 | NONE | null | null | null | null | ### System Info
According to the Qwen2.5-VL Technical Report, the ViT channel and the input channel of the VL merger should be 1280.

When I load the pretrained Qwen2.5-VL-7B model, the input hidden size of the merger is clearly 1280. After the merge operation, the in channel of the linear layer is 5120. That is correct.


But in qwen2_5_vl/configuration_qwen2_5_vl.py, we can see the hidden size in the VisionConfig is set to 3584, so the in channel of the merger becomes 3584 instead of 1280. This confuses me: is this parameter wrong?
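For what it's worth, the 5120 figure in the checkpoint screenshot is consistent with a 1280-channel ViT. A minimal sketch of the arithmetic, with values taken from the report and screenshots (the variable names are mine, not the actual transformers code):

```python
# Illustrative arithmetic only, not the real configuration class.
vision_hidden_size = 1280   # ViT channel per the Qwen2.5-VL report
spatial_merge_size = 2      # the merger fuses a 2x2 window of patches

# in_features of the merger's first linear layer
merger_in_features = vision_hidden_size * spatial_merge_size ** 2
print(merger_in_features)  # 5120, matching the loaded Qwen2.5-VL-7B weights
```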


### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
[ hidden_size=3584,](https://github.com/huggingface/transformers/blob/9cd7570f34fdb833ed874b2eba4d4ea3ae9ccb03/src/transformers/models/qwen2_5_vl/configuration_qwen2_5_vl.py#L37)
### Expected behavior
Please make sure the parameter is correct. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38889/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38889/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38888 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38888/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38888/comments | https://api.github.com/repos/huggingface/transformers/issues/38888/events | https://github.com/huggingface/transformers/pull/38888 | 3,157,211,134 | PR_kwDOCUB6oc6bGB0D | 38,888 | continue to fix distributed_type from TPU to XLA in LM examples (#38652) | {
"login": "PT0X0E",
"id": 22206188,
"node_id": "MDQ6VXNlcjIyMjA2MTg4",
"avatar_url": "https://avatars.githubusercontent.com/u/22206188?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PT0X0E",
"html_url": "https://github.com/PT0X0E",
"followers_url": "https://api.github.com/users/PT0X0E/followers",
"following_url": "https://api.github.com/users/PT0X0E/following{/other_user}",
"gists_url": "https://api.github.com/users/PT0X0E/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PT0X0E/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PT0X0E/subscriptions",
"organizations_url": "https://api.github.com/users/PT0X0E/orgs",
"repos_url": "https://api.github.com/users/PT0X0E/repos",
"events_url": "https://api.github.com/users/PT0X0E/events{/privacy}",
"received_events_url": "https://api.github.com/users/PT0X0E/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-18T14:56:25 | 2025-06-19T11:29:57 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38888",
"html_url": "https://github.com/huggingface/transformers/pull/38888",
"diff_url": "https://github.com/huggingface/transformers/pull/38888.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38888.patch",
"merged_at": null
} | # What does this PR do?
In some language-modeling and image-pretraining examples, the `_no_trainer` versions of the training scripts check whether `accelerator.distributed_type` is `TPU`. However, `TPU` is deprecated in newer versions of the Accelerate library; these checks should be changed to `XLA`.
This is a follow-up fix of #38652 .
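A minimal sketch of the pattern being fixed (illustrative only — the real scripts compare against `accelerate`'s `DistributedType` enum, which the stand-in below merely mimics):

```python
from enum import Enum


# Stand-in for accelerate.utils.DistributedType (illustrative, not the real
# enum): newer Accelerate releases expose the TPU backend under the name XLA.
class DistributedType(Enum):
    XLA = "XLA"
    MULTI_GPU = "MULTI_GPU"


def is_xla(distributed_type):
    # Before: `distributed_type == DistributedType.TPU` — breaks once the
    # deprecated member is removed. After: compare against XLA instead.
    return distributed_type == DistributedType.XLA


print(is_xla(DistributedType.XLA))        # True
print(is_xla(DistributedType.MULTI_GPU))  # False
```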
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
#38652
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38888/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38888/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38887 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38887/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38887/comments | https://api.github.com/repos/huggingface/transformers/issues/38887/events | https://github.com/huggingface/transformers/pull/38887 | 3,157,105,875 | PR_kwDOCUB6oc6bFrzp | 38,887 | Add RF-DETR | {
"login": "ZaraCook",
"id": 118310920,
"node_id": "U_kgDOBw1ICA",
"avatar_url": "https://avatars.githubusercontent.com/u/118310920?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZaraCook",
"html_url": "https://github.com/ZaraCook",
"followers_url": "https://api.github.com/users/ZaraCook/followers",
"following_url": "https://api.github.com/users/ZaraCook/following{/other_user}",
"gists_url": "https://api.github.com/users/ZaraCook/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZaraCook/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZaraCook/subscriptions",
"organizations_url": "https://api.github.com/users/ZaraCook/orgs",
"repos_url": "https://api.github.com/users/ZaraCook/repos",
"events_url": "https://api.github.com/users/ZaraCook/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZaraCook/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T14:27:36 | 2025-09-13T00:43:42 | 2025-07-05T14:15:05 | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38887",
"html_url": "https://github.com/huggingface/transformers/pull/38887",
"diff_url": "https://github.com/huggingface/transformers/pull/38887.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38887.patch",
"merged_at": null
} | # What does this PR do?
Draft PR to add RF-DETR based on: https://github.com/huggingface/transformers/pull/36895
| {
"login": "ZaraCook",
"id": 118310920,
"node_id": "U_kgDOBw1ICA",
"avatar_url": "https://avatars.githubusercontent.com/u/118310920?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZaraCook",
"html_url": "https://github.com/ZaraCook",
"followers_url": "https://api.github.com/users/ZaraCook/followers",
"following_url": "https://api.github.com/users/ZaraCook/following{/other_user}",
"gists_url": "https://api.github.com/users/ZaraCook/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZaraCook/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZaraCook/subscriptions",
"organizations_url": "https://api.github.com/users/ZaraCook/orgs",
"repos_url": "https://api.github.com/users/ZaraCook/repos",
"events_url": "https://api.github.com/users/ZaraCook/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZaraCook/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38887/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38887/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38886 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38886/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38886/comments | https://api.github.com/repos/huggingface/transformers/issues/38886/events | https://github.com/huggingface/transformers/pull/38886 | 3,157,096,823 | PR_kwDOCUB6oc6bFpyd | 38,886 | Allow compile with bnb | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-18T14:25:16 | 2025-07-03T09:15:36 | null | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38886",
"html_url": "https://github.com/huggingface/transformers/pull/38886",
"diff_url": "https://github.com/huggingface/transformers/pull/38886.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38886.patch",
"merged_at": null
} | # What does this PR do?
This PR enables compilation when generating with bnb models. This is supported with the latest bnb. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38886/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38886/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38885 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38885/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38885/comments | https://api.github.com/repos/huggingface/transformers/issues/38885/events | https://github.com/huggingface/transformers/pull/38885 | 3,156,911,838 | PR_kwDOCUB6oc6bFCZ4 | 38,885 | Fix loop var naming | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T13:32:26 | 2025-06-18T13:46:04 | 2025-06-18T13:45:01 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38885",
"html_url": "https://github.com/huggingface/transformers/pull/38885",
"diff_url": "https://github.com/huggingface/transformers/pull/38885.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38885.patch",
"merged_at": "2025-06-18T13:45:01"
} | Our code contained the loop `for tuple in gen:` which clobbered the built-in Python `tuple`. This was a silent error that surfaced when we did some recent type hint fixes.
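A self-contained sketch of the bug (not the repository's actual loop), showing how the shadowing breaks later uses of the built-in:

```python
gen = [(1, 2), (3, 4)]

# Buggy pattern: the loop variable shadows the built-in `tuple` type.
for tuple in gen:  # noqa: A001 -- deliberate shadowing for illustration
    pass

try:
    tuple([5, 6])  # `tuple` is now (3, 4), a plain tuple -- not callable
except TypeError as err:
    print(err)

del tuple  # remove the shadowing name so the built-in is visible again

# Fixed pattern: a descriptive name leaves the built-in intact.
for pair in gen:
    first, second = pair

assert tuple([5, 6]) == (5, 6)
```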
This PR fixes it by renaming the loop variable to something that doesn't overwrite a Python built-in like that! | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38885/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38885/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38884 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38884/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38884/comments | https://api.github.com/repos/huggingface/transformers/issues/38884/events | https://github.com/huggingface/transformers/pull/38884 | 3,156,806,440 | PR_kwDOCUB6oc6bErGR | 38,884 | Llama 4 conversion fix for moe models | {
"login": "pcuenca",
"id": 1177582,
"node_id": "MDQ6VXNlcjExNzc1ODI=",
"avatar_url": "https://avatars.githubusercontent.com/u/1177582?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pcuenca",
"html_url": "https://github.com/pcuenca",
"followers_url": "https://api.github.com/users/pcuenca/followers",
"following_url": "https://api.github.com/users/pcuenca/following{/other_user}",
"gists_url": "https://api.github.com/users/pcuenca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pcuenca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pcuenca/subscriptions",
"organizations_url": "https://api.github.com/users/pcuenca/orgs",
"repos_url": "https://api.github.com/users/pcuenca/repos",
"events_url": "https://api.github.com/users/pcuenca/events{/privacy}",
"received_events_url": "https://api.github.com/users/pcuenca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-18T13:05:01 | 2025-06-18T13:18:11 | null | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38884",
"html_url": "https://github.com/huggingface/transformers/pull/38884",
"diff_url": "https://github.com/huggingface/transformers/pull/38884.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38884.patch",
"merged_at": null
} | I think I broke it when adding support for Llama Guard. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38884/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38884/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38883 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38883/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38883/comments | https://api.github.com/repos/huggingface/transformers/issues/38883/events | https://github.com/huggingface/transformers/pull/38883 | 3,156,692,780 | PR_kwDOCUB6oc6bESdy | 38,883 | More PYUP fixes | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T12:29:37 | 2025-06-19T01:26:15 | 2025-06-18T13:38:09 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38883",
"html_url": "https://github.com/huggingface/transformers/pull/38883",
"diff_url": "https://github.com/huggingface/transformers/pull/38883.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38883.patch",
"merged_at": "2025-06-18T13:38:09"
} | # What does this PR do?
Most changes are using f-string. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38883/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38883/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38882 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38882/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38882/comments | https://api.github.com/repos/huggingface/transformers/issues/38882/events | https://github.com/huggingface/transformers/pull/38882 | 3,156,243,943 | PR_kwDOCUB6oc6bCxML | 38,882 | 🔴 Update default `dtype` for pipelines to `auto` | {
"login": "Vaibhavs10",
"id": 18682411,
"node_id": "MDQ6VXNlcjE4NjgyNDEx",
"avatar_url": "https://avatars.githubusercontent.com/u/18682411?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Vaibhavs10",
"html_url": "https://github.com/Vaibhavs10",
"followers_url": "https://api.github.com/users/Vaibhavs10/followers",
"following_url": "https://api.github.com/users/Vaibhavs10/following{/other_user}",
"gists_url": "https://api.github.com/users/Vaibhavs10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Vaibhavs10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Vaibhavs10/subscriptions",
"organizations_url": "https://api.github.com/users/Vaibhavs10/orgs",
"repos_url": "https://api.github.com/users/Vaibhavs10/repos",
"events_url": "https://api.github.com/users/Vaibhavs10/events{/privacy}",
"received_events_url": "https://api.github.com/users/Vaibhavs10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T09:59:02 | 2025-06-24T08:39:19 | 2025-06-24T08:39:18 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38882",
"html_url": "https://github.com/huggingface/transformers/pull/38882",
"diff_url": "https://github.com/huggingface/transformers/pull/38882.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38882.patch",
"merged_at": "2025-06-24T08:39:18"
} | # What does this PR do?
Issue:
Currently, the auto-generated code snippets on the Hub don’t account for torch_dtype, resulting in models being loaded in float32 by default. This often leads to unnecessarily high memory usage and frequent OOM errors, especially in environments like Colab.
For example:
The [Qwen-4B Colab](https://huggingface.co/Qwen/Qwen3-4B/colab) OOMs when loaded without torch_dtype (weights are bf16 but default to float32).
Fix:
This PR changes the default torch_dtype in pipeline to "auto", which ensures the model loads in the dtype it was saved with (e.g., bf16). This reduces memory overhead and aligns with the model’s intended configuration.
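A rough, pure-Python stand-in for the behavioural difference (the real resolution happens inside `transformers.pipeline` and `from_pretrained`; the function name and defaults below are illustrative):

```python
def resolve_dtype(requested, checkpoint_dtype="bfloat16"):
    """Illustrative stand-in for torch_dtype resolution when loading a model.

    "auto" defers to the dtype the checkpoint was saved in (e.g. bf16 for
    Qwen3-4B), an explicit dtype is honoured as-is, and None reproduces the
    old float32 default that caused the OOMs described above.
    """
    if requested == "auto":
        return checkpoint_dtype
    return requested if requested is not None else "float32"


print(resolve_dtype(None))    # float32  (old pipeline default)
print(resolve_dtype("auto"))  # bfloat16 (new pipeline default)
```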
Fixes # (issue)
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR. | {
"login": "Vaibhavs10",
"id": 18682411,
"node_id": "MDQ6VXNlcjE4NjgyNDEx",
"avatar_url": "https://avatars.githubusercontent.com/u/18682411?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Vaibhavs10",
"html_url": "https://github.com/Vaibhavs10",
"followers_url": "https://api.github.com/users/Vaibhavs10/followers",
"following_url": "https://api.github.com/users/Vaibhavs10/following{/other_user}",
"gists_url": "https://api.github.com/users/Vaibhavs10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Vaibhavs10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Vaibhavs10/subscriptions",
"organizations_url": "https://api.github.com/users/Vaibhavs10/orgs",
"repos_url": "https://api.github.com/users/Vaibhavs10/repos",
"events_url": "https://api.github.com/users/Vaibhavs10/events{/privacy}",
"received_events_url": "https://api.github.com/users/Vaibhavs10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38882/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 3,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38882/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38881 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38881/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38881/comments | https://api.github.com/repos/huggingface/transformers/issues/38881/events | https://github.com/huggingface/transformers/pull/38881 | 3,156,146,798 | PR_kwDOCUB6oc6bCcRB | 38,881 | [video processor] fix slow tests | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T09:28:26 | 2025-06-18T20:39:56 | 2025-06-18T20:39:56 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38881",
"html_url": "https://github.com/huggingface/transformers/pull/38881",
"diff_url": "https://github.com/huggingface/transformers/pull/38881.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38881.patch",
"merged_at": "2025-06-18T20:39:56"
} | # What does this PR do?
Fixes failing CI. Some models like InternVL share an image processor with a different model (GotOCR). So when we check for an `image_processor_type` and infer that changing `Image -> Video` gives us the video processor, it fails, because GotOCR has no video processing and should not have. Thus we need to check against the mapping to be safe, and if the inferred name is not in the mapping, fall back to `type(config.json)` as a last resort
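A sketch of the fallback logic (class names and the mapping below are examples, not the actual transformers registry):

```python
# Illustrative set of registered video processor classes.
VIDEO_PROCESSOR_MAPPING = {"InternVLVideoProcessor", "LlavaVideoProcessor"}


def infer_video_processor(image_processor_type, config_type):
    # Infer "...VideoProcessor" from the image processor name, but only
    # trust it if such a class is actually registered; otherwise fall
    # back to the type recorded in config.json.
    inferred = image_processor_type.replace("ImageProcessor", "VideoProcessor")
    if inferred in VIDEO_PROCESSOR_MAPPING:
        return inferred
    return config_type


# InternVL shares GotOCR's image processor, so naive inference yields a
# non-existent "GotOCR2VideoProcessor" and the config type wins instead.
print(infer_video_processor("GotOCR2ImageProcessor", "InternVLVideoProcessor"))
```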
Video processor BC is giving me headaches 😿
cc @ydshieh | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38881/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38881/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38880 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38880/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38880/comments | https://api.github.com/repos/huggingface/transformers/issues/38880/events | https://github.com/huggingface/transformers/pull/38880 | 3,155,992,051 | PR_kwDOCUB6oc6bB61V | 38,880 | [video processor] support torchcodec and decrease cuda memory usage | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T08:38:53 | 2025-06-25T08:23:37 | 2025-06-25T08:23:37 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38880",
"html_url": "https://github.com/huggingface/transformers/pull/38880",
"diff_url": "https://github.com/huggingface/transformers/pull/38880.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38880.patch",
"merged_at": "2025-06-25T08:23:37"
} | # What does this PR do?
As per title, adds a small utility for loading videos with torchcodec. Note that we don't use torchcodec to its fullest, i.e. loading to device or streaming. Loading to device incurs high memory usage because we load the whole video and sample only after that. For streaming and other features, let's do one thing at a time gradually and see how it fits in the codebase
For now we just deprecate `read_video_torchvision` which is anyway deprecated in torchvision for the next 2 minor releases. Users are nudged to use `torchcodec` instead
Also I noticed that there was a high GPU memory spike with long videos, because we moved the whole video to GPU before processing. This PR moves the device-placement after sampling so only the sampled frames are on device
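A toy illustration of the reordering (frames are plain ints here and `moved` stands in for device transfers; no real tensors or GPUs involved):

```python
def preprocess(frames, sample_rate, moved):
    # Old behaviour (memory spike): move *all* frames to the device, then
    # sample. New behaviour: sample first, then move only the kept frames.
    sampled = frames[::sample_rate]
    for frame in sampled:
        moved.append(frame)  # stand-in for `frame.to("cuda")`
    return sampled


moved = []
out = preprocess(list(range(100)), 10, moved)
print(len(moved))  # 10 frames moved instead of 100
```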
The next PR will be on using torchcodec to load audio from video files, seems like it is better than librosa and supports more formats. I will still need to test. Ideally making torchcodec the default would be the final goal, as we test and iterate | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38880/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38880/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38879 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38879/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38879/comments | https://api.github.com/repos/huggingface/transformers/issues/38879/events | https://github.com/huggingface/transformers/pull/38879 | 3,155,620,658 | PR_kwDOCUB6oc6bArap | 38,879 | Add serialization function for StaticCache | {
"login": "xadupre",
"id": 22452781,
"node_id": "MDQ6VXNlcjIyNDUyNzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/22452781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xadupre",
"html_url": "https://github.com/xadupre",
"followers_url": "https://api.github.com/users/xadupre/followers",
"following_url": "https://api.github.com/users/xadupre/following{/other_user}",
"gists_url": "https://api.github.com/users/xadupre/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xadupre/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xadupre/subscriptions",
"organizations_url": "https://api.github.com/users/xadupre/orgs",
"repos_url": "https://api.github.com/users/xadupre/repos",
"events_url": "https://api.github.com/users/xadupre/events{/privacy}",
"received_events_url": "https://api.github.com/users/xadupre/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T06:21:45 | 2025-08-05T18:10:52 | 2025-08-05T18:10:52 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38879",
"html_url": "https://github.com/huggingface/transformers/pull/38879",
"diff_url": "https://github.com/huggingface/transformers/pull/38879.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38879.patch",
"merged_at": null
} | # What does this PR do?
Implements serialization functions for StaticCache similar to the one implemented for DynamicCache. Fixes https://github.com/pytorch/pytorch/issues/155862.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "xadupre",
"id": 22452781,
"node_id": "MDQ6VXNlcjIyNDUyNzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/22452781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xadupre",
"html_url": "https://github.com/xadupre",
"followers_url": "https://api.github.com/users/xadupre/followers",
"following_url": "https://api.github.com/users/xadupre/following{/other_user}",
"gists_url": "https://api.github.com/users/xadupre/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xadupre/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xadupre/subscriptions",
"organizations_url": "https://api.github.com/users/xadupre/orgs",
"repos_url": "https://api.github.com/users/xadupre/repos",
"events_url": "https://api.github.com/users/xadupre/events{/privacy}",
"received_events_url": "https://api.github.com/users/xadupre/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38879/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38879/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38878 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38878/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38878/comments | https://api.github.com/repos/huggingface/transformers/issues/38878/events | https://github.com/huggingface/transformers/issues/38878 | 3,155,545,376 | I_kwDOCUB6oc68Fc0g | 38,878 | 📄 Information about the usage of models and code with non-commercial LICENSE on transformers and diffusers | {
"login": "ishandutta0098",
"id": 47643789,
"node_id": "MDQ6VXNlcjQ3NjQzNzg5",
"avatar_url": "https://avatars.githubusercontent.com/u/47643789?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ishandutta0098",
"html_url": "https://github.com/ishandutta0098",
"followers_url": "https://api.github.com/users/ishandutta0098/followers",
"following_url": "https://api.github.com/users/ishandutta0098/following{/other_user}",
"gists_url": "https://api.github.com/users/ishandutta0098/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ishandutta0098/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ishandutta0098/subscriptions",
"organizations_url": "https://api.github.com/users/ishandutta0098/orgs",
"repos_url": "https://api.github.com/users/ishandutta0098/repos",
"events_url": "https://api.github.com/users/ishandutta0098/events{/privacy}",
"received_events_url": "https://api.github.com/users/ishandutta0098/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T05:44:45 | 2025-06-20T14:02:40 | 2025-06-20T14:02:40 | NONE | null | null | null | null | Hi Team,
I want to understand how HuggingFace handles code and models that are not licensed for commercial use in its repositories, even though the repositories themselves are licensed for commercial use. The `transformers` and `diffusers` libraries are both Apache-licensed, yet both also cater to non-commercial models.
I have read about model cards, and wanted to know whether it is enough to just add the original license information in them (along with mentioning the original license in the code headers), or whether any extra steps are being taken.
Also I believe the inference of these models would require code from the original authors itself, what if the original code is under a different license?
I am asking this question because I am building my first open source package, [mukh](https://github.com/ishandutta0098/mukh), a Face Analysis library whose structure is inspired by HuggingFace: I provide a unified API for multiple models around Face Analysis tasks.
I have kept the repository under the Apache License, but I have one model, and the code for that respective model, which is under the GPL License.
Here are things I came across which can be done -
1. Mention the original License in the code files and models being used, and use them in my repository with the current Apache License
2. Convert my repository to GPL License
3. Remove the model from my repository and retain the Apache License
Is the first point valid? Or do I have to go ahead with 2 or 3?
Any suggestions are appreciated 🤗 | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38878/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38878/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38877 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38877/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38877/comments | https://api.github.com/repos/huggingface/transformers/issues/38877/events | https://github.com/huggingface/transformers/pull/38877 | 3,155,493,846 | PR_kwDOCUB6oc6bAQjI | 38,877 | DOC: Clarify attention_mask usage in BertModel forward method | {
"login": "dhyeyinf",
"id": 131277481,
"node_id": "U_kgDOB9MiqQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131277481?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhyeyinf",
"html_url": "https://github.com/dhyeyinf",
"followers_url": "https://api.github.com/users/dhyeyinf/followers",
"following_url": "https://api.github.com/users/dhyeyinf/following{/other_user}",
"gists_url": "https://api.github.com/users/dhyeyinf/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhyeyinf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhyeyinf/subscriptions",
"organizations_url": "https://api.github.com/users/dhyeyinf/orgs",
"repos_url": "https://api.github.com/users/dhyeyinf/repos",
"events_url": "https://api.github.com/users/dhyeyinf/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhyeyinf/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-18T05:12:05 | 2025-06-20T16:47:35 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38877",
"html_url": "https://github.com/huggingface/transformers/pull/38877",
"diff_url": "https://github.com/huggingface/transformers/pull/38877.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38877.patch",
"merged_at": null
} | This PR improves the documentation of the `attention_mask` parameter in the `forward()` method of the `BertModel` class.
### Changes made:
- Added a clear explanation of how 1s and 0s in `attention_mask` control attention behavior
- Included a minimal example tensor to demonstrate typical usage
- Follows the Hugging Face docstring style for consistency
The goal is to make the docstring more understandable and helpful for beginners exploring the source code.
Please let me know if you'd like any phrasing or formatting adjusted — happy to revise!
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38877/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38877/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38876 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38876/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38876/comments | https://api.github.com/repos/huggingface/transformers/issues/38876/events | https://github.com/huggingface/transformers/pull/38876 | 3,155,364,778 | PR_kwDOCUB6oc6a_1Yx | 38,876 | [bugfix] fix ATTN_MASK_NPU device mismatch error on multi-device NPU … | {
"login": "qykong",
"id": 7960549,
"node_id": "MDQ6VXNlcjc5NjA1NDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/7960549?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qykong",
"html_url": "https://github.com/qykong",
"followers_url": "https://api.github.com/users/qykong/followers",
"following_url": "https://api.github.com/users/qykong/following{/other_user}",
"gists_url": "https://api.github.com/users/qykong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qykong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qykong/subscriptions",
"organizations_url": "https://api.github.com/users/qykong/orgs",
"repos_url": "https://api.github.com/users/qykong/repos",
"events_url": "https://api.github.com/users/qykong/events{/privacy}",
"received_events_url": "https://api.github.com/users/qykong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T03:46:53 | 2025-06-18T16:26:57 | 2025-06-18T16:26:23 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38876",
"html_url": "https://github.com/huggingface/transformers/pull/38876",
"diff_url": "https://github.com/huggingface/transformers/pull/38876.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38876.patch",
"merged_at": "2025-06-18T16:26:23"
} | # What does this PR do?
This PR fixes a device mismatch error that occurs when using NPU flash attention functions across multiple devices. Currently, running the following test script with more than one NPU
```
import torch
from transformers import Qwen2_5_VLForConditionalGeneration, AutoProcessor
from qwen_vl_utils import process_vision_info
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
"/serving_model/qwen2.5-vl-7b-instruct", torch_dtype=torch.bfloat16, device_map="auto", attn_implementation="flash_attention_2"
)
processor = AutoProcessor.from_pretrained("/serving_model/qwen2.5-vl-7b-instruct")
messages = [
{
"role": "user",
"content": [
{
"type": "image",
"image": "https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg",
},
{"type": "text", "text": "Describe this image."},
],
}
]
text = processor.apply_chat_template(
messages, tokenize=False, add_generation_prompt=True
)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
text=[text],
images=image_inputs,
videos=video_inputs,
padding=True,
return_tensors="pt",
)
inputs = inputs.to(model.device)
generated_ids = model.generate(**inputs, max_new_tokens=128)
generated_ids_trimmed = [
out_ids[len(in_ids) :] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_text = processor.batch_decode(
generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(output_text)
```
will result in
```
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, npu:1 and npu:0! (when checking argument for argument atten_mask in method wrapper__npu_fusion_attention)
```
Solution: replaced the global ATTN_MASK_NPU variable with a device-specific cache
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38876/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38876/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38875 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38875/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38875/comments | https://api.github.com/repos/huggingface/transformers/issues/38875/events | https://github.com/huggingface/transformers/pull/38875 | 3,155,260,108 | PR_kwDOCUB6oc6a_f-1 | 38,875 | add pytorch-xpu Dockerfile | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-18T02:31:15 | 2025-06-23T23:53:04 | 2025-06-20T09:42:44 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38875",
"html_url": "https://github.com/huggingface/transformers/pull/38875",
"diff_url": "https://github.com/huggingface/transformers/pull/38875.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38875.patch",
"merged_at": "2025-06-20T09:42:44"
} | null | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38875/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38875/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38874 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38874/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38874/comments | https://api.github.com/repos/huggingface/transformers/issues/38874/events | https://github.com/huggingface/transformers/issues/38874 | 3,155,134,344 | I_kwDOCUB6oc68D4eI | 38,874 | Reproducibility Issue of Siglip2 with Blackwell Architecture GPUs (RTX 5090) | {
"login": "silverstar0727",
"id": 49096513,
"node_id": "MDQ6VXNlcjQ5MDk2NTEz",
"avatar_url": "https://avatars.githubusercontent.com/u/49096513?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/silverstar0727",
"html_url": "https://github.com/silverstar0727",
"followers_url": "https://api.github.com/users/silverstar0727/followers",
"following_url": "https://api.github.com/users/silverstar0727/following{/other_user}",
"gists_url": "https://api.github.com/users/silverstar0727/gists{/gist_id}",
"starred_url": "https://api.github.com/users/silverstar0727/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/silverstar0727/subscriptions",
"organizations_url": "https://api.github.com/users/silverstar0727/orgs",
"repos_url": "https://api.github.com/users/silverstar0727/repos",
"events_url": "https://api.github.com/users/silverstar0727/events{/privacy}",
"received_events_url": "https://api.github.com/users/silverstar0727/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-18T01:07:30 | 2025-07-27T08:02:39 | 2025-07-27T08:02:39 | NONE | null | null | null | null | ### System Info
- transformers version: `4.52.4`
- Platform: Linux
- Python version: 3.12
- PyTorch version: 2.7.0+cu128
- CUDA version: 12.8
- GPU: NVIDIA GeForce RTX 5090 (Blackwell architecture)
- Driver Version: 570.144
### Who can help?
## Who can help?
@amyeroberts @qubvel (vision models - SigLIP2 related issue)
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
### Problem Description
**Non-deterministic behavior specifically on Blackwell architecture GPUs (RTX 5090)** when running identical inference code. The same SigLIP2 model produces different outputs across runs on RTX 5090, while maintaining perfect reproducibility on other GPU architectures (A6000, RTX 4090).
### Environment Comparison
- ✅ **NVIDIA A6000 (Ampere)**: Perfect reproducibility without any deterministic flags
- ✅ **NVIDIA RTX 4090 (Ada Lovelace)**: Perfect reproducibility without any deterministic flags
- ❌ **NVIDIA RTX 5090 (Blackwell)**: Non-deterministic behavior even with all deterministic settings enabled
### Steps to reproduce the behavior:
1. **Set all deterministic flags** (this still doesn't fix the issue on RTX 5090):
```python
import os
os.environ['CUBLAS_WORKSPACE_CONFIG'] = ':4096:8'
import torch
torch.use_deterministic_algorithms(True)
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
torch.manual_seed(42)
torch.cuda.manual_seed_all(42)
```
2. **Run the reproduction script twice**:
```python
import torch
import pandas as pd
from datasets import load_dataset
from transformers import pipeline
from transformers.pipelines.pt_utils import KeyDataset
from tqdm import tqdm
import os
from accelerate import Accelerator
from accelerate.utils import gather_object
def set_deterministic():
"""Set deterministic behavior"""
os.environ['CUBLAS_WORKSPACE_CONFIG'] = ':4096:8'
torch.use_deterministic_algorithms(True)
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
torch.manual_seed(42)
torch.cuda.manual_seed_all(42)
def run_test(run_id):
"""Run reproducibility test"""
set_deterministic()
accelerator = Accelerator()
# Load CIFAR-10 dataset (100 images)
ds = load_dataset("cifar10", split="test[:100]")
# Create pipeline with SigLIP2
classifier = pipeline(
"image-classification",
model="google/siglip2-giant-opt-patch16-384",
device_map={"": accelerator.process_index},
trust_remote_code=True,
)
if hasattr(classifier.model, 'eval'):
classifier.model.eval()
accelerator.wait_for_everyone()
# Process images
with accelerator.split_between_processes(ds) as subset:
img_ds = KeyDataset(subset, "img")
rows = []
for idx, out in enumerate(
tqdm(
classifier(img_ds, batch_size=32, top_k=50),
total=len(img_ds),
disable=not accelerator.is_main_process,
)
):
sorted_out = sorted(out, key=lambda x: x["label"])
result_dict = {"ID": idx}
for r in sorted_out:
result_dict[r["label"]] = r["score"]
rows.append(result_dict)
rows = gather_object(rows)
if accelerator.is_main_process:
df = pd.DataFrame(rows)
columns = ["ID"] + sorted([c for c in df.columns if c != "ID"])
df = df.reindex(columns, axis=1)
filename = f"siglip2_test_run{run_id}.csv"
df.to_csv(filename, index=False)
return filename
return None
# Run twice and compare
file1 = run_test(1)
file2 = run_test(2)
# Compare results
df1 = pd.read_csv(file1)
df2 = pd.read_csv(file2)
numeric_cols = [col for col in df1.columns if col != "ID"]
for col in numeric_cols[:2]:
diff = (df1[col] - df2[col]).abs()
print(f"{col}: max_diff = {diff.max():.2e}")
```
3. **Run on RTX 5090**
### Actual Results on RTX 5090
```
🔍 COMPARING RESULTS:
❌ LABEL_0: max_diff = 4.63e-03
❌ LABEL_1: max_diff = 4.63e-03
📋 SUMMARY:
Maximum difference: 4.63e-03
⚠️ REPRODUCIBILITY ISSUES DETECTED!
🔍 Worst case in 'LABEL_1':
Row 82: 0.4096104801 vs 0.4049816132
```
**Error magnitude**: Up to `4.63e-03` difference between identical runs - this is a **significant reproducibility issue**.
### Investigation Results
- ✅ All deterministic PyTorch settings properly configured
- ✅ Same code works reproducibly on Ampere (A6000) and Ada Lovelace (4090) architectures
- ✅ Tested with Accelerate library (identical to production usage)
- ✅ Model set to eval mode explicitly
- ❌ Issue persists specifically on Blackwell architecture (RTX 5090)
### Expected behavior
Identical inference runs should produce identical outputs when deterministic flags are set, regardless of GPU architecture. This behavior works correctly on:
- NVIDIA A6000 (Ampere architecture)
- NVIDIA RTX 4090 (Ada Lovelace architecture)
## Additional Context
This appears to be a **Blackwell architecture-specific issue** where new CUDA kernels or hardware optimizations don't fully respect PyTorch's deterministic settings. The issue has significant practical implications:
1. **Model evaluation inconsistency**: Results vary between runs on the same hardware
2. **Research reproducibility**: Experiments cannot be reliably reproduced
3. **Production reliability**: Model outputs are unpredictable
### Hardware Specifications
- **RTX 5090**: Compute Capability (12, 0), CUDA 12.8, Driver 570.144
- **Memory**: 32GB GDDR7
- **Architecture**: Blackwell with 5th gen RT cores, 4th gen Tensor cores
### Potential Root Causes
- New Blackwell CUDA kernels may not implement deterministic algorithms
- Updated attention mechanisms (Flash Attention 3.0) might be non-deterministic
- Memory coalescing patterns could be different from previous architectures
| {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38874/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38874/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38873 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38873/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38873/comments | https://api.github.com/repos/huggingface/transformers/issues/38873/events | https://github.com/huggingface/transformers/issues/38873 | 3,155,016,889 | I_kwDOCUB6oc68Dby5 | 38,873 | Inconsistency between from_pretrained and save_pretrained API | {
"login": "jiafatom",
"id": 30608893,
"node_id": "MDQ6VXNlcjMwNjA4ODkz",
"avatar_url": "https://avatars.githubusercontent.com/u/30608893?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiafatom",
"html_url": "https://github.com/jiafatom",
"followers_url": "https://api.github.com/users/jiafatom/followers",
"following_url": "https://api.github.com/users/jiafatom/following{/other_user}",
"gists_url": "https://api.github.com/users/jiafatom/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiafatom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiafatom/subscriptions",
"organizations_url": "https://api.github.com/users/jiafatom/orgs",
"repos_url": "https://api.github.com/users/jiafatom/repos",
"events_url": "https://api.github.com/users/jiafatom/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiafatom/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-17T23:47:10 | 2025-06-18T12:23:52 | 2025-06-18T12:23:52 | CONTRIBUTOR | null | null | null | null | ### System Info
I have a phi-4-mini-reasoning model, here the config:
https://huggingface.co/microsoft/Phi-4-mini-instruct/blob/main/tokenizer_config.json
I downloaded it locally, ran `tokenizer = AutoTokenizer.from_pretrained(input_folder)` and then `tokenizer.save_pretrained(output_folder)`. The output tokenizer_config.json now has this diff:
```
103a104
> "chat_template": "{{ '<|system|>Your name is Phi, an AI math expert developed by Microsoft.' }}{% for message in messages %}{% if message['role'] == 'system' %} {{ message['content'] }}{% if 'tools' in message and message['tools'] is not none %}{{ '<|tool|>' + message['tools'] + '<|/tool|>' }}{% endif %}{% endif %}{% endfor %}{{ '<|end|>' }}{% for message in messages %}{% if message['role'] != 'system' %}{{ '<|' + message['role'] + '|>' + message['content'] + '<|end|>' }}{% endif %}{% endfor %}{% if add_generation_prompt %}{{ '<|assistant|>' }}{% else %}{{ eos_token }}{% endif %}",
106,107d106
< "extra_special_tokens": {},
< "max_length": 1024,
110d108
< "stride": 0,
112,113d109
< "truncation_side": "right",
< "truncation_strategy": "longest_first",
```
I am using
tokenizers 0.21.1
transformers 4.52.3
Could someone help here? Thank you!
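For reference, the kind of key-level diff shown above can be reproduced with a few lines of plain Python (a sketch independent of the transformers API; the sample dicts below are illustrative stand-ins, not the full configs):

```python
def config_key_diff(before, after):
    """Return keys only present in the saved config and keys only in the original."""
    added = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    return added, removed

# Illustrative subsets of the two tokenizer_config.json files
before = {"max_length": 1024, "stride": 0, "model_max_length": 131072}
after = {"chat_template": "...", "model_max_length": 131072}

added, removed = config_key_diff(before, after)
print(added)    # -> ['chat_template']
print(removed)  # -> ['max_length', 'stride']
```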
### Who can help?
@ArthurZucker and @itazap
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
See above.
### Expected behavior
See above. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38873/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38873/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38872 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38872/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38872/comments | https://api.github.com/repos/huggingface/transformers/issues/38872/events | https://github.com/huggingface/transformers/pull/38872 | 3,154,817,867 | PR_kwDOCUB6oc6a-H3B | 38,872 | Fix loss scaling in Trainer during final step of gradient accumulation | {
"login": "andrewtran117",
"id": 89867547,
"node_id": "MDQ6VXNlcjg5ODY3NTQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/89867547?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/andrewtran117",
"html_url": "https://github.com/andrewtran117",
"followers_url": "https://api.github.com/users/andrewtran117/followers",
"following_url": "https://api.github.com/users/andrewtran117/following{/other_user}",
"gists_url": "https://api.github.com/users/andrewtran117/gists{/gist_id}",
"starred_url": "https://api.github.com/users/andrewtran117/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/andrewtran117/subscriptions",
"organizations_url": "https://api.github.com/users/andrewtran117/orgs",
"repos_url": "https://api.github.com/users/andrewtran117/repos",
"events_url": "https://api.github.com/users/andrewtran117/events{/privacy}",
"received_events_url": "https://api.github.com/users/andrewtran117/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T21:30:36 | 2025-06-18T04:37:32 | 2025-06-18T04:37:32 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38872",
"html_url": "https://github.com/huggingface/transformers/pull/38872",
"diff_url": "https://github.com/huggingface/transformers/pull/38872.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38872.patch",
"merged_at": null
} | Fix for bug in issue (#38837)
# What does this PR do?
This PR fixes a bug in the `Trainer.training_step` method where the loss is always divided by `gradient_accumulation_steps`, even on the final accumulation cycle of an epoch, which may contain fewer batches. This leads to an artificially small loss during the last optimizer steps. For example, the following buggy output was provided by @hutaiHang:
"
{'loss': 0.9984, 'grad_norm': 36.21815490722656, 'learning_rate': 0.0, 'epoch': 0.4}
{'loss': 0.9984, 'grad_norm': 36.21815490722656, 'learning_rate': 0.0, 'epoch': 0.8}
{'loss': 0.4992, 'grad_norm': 18.10907745361328, 'learning_rate': 0.0, 'epoch': 1.0} <-- The problem!
{'loss': 0.9984, 'grad_norm': 36.21815490722656, 'learning_rate': 0.0, 'epoch': 1.4}
"
In this PR, I fix the issue by tracking the number of batches accumulated before each optimizer step. Rather than dividing the loss by `gradient_accumulation_steps`, the loss is now divided by the true number of accumulated batches.
Fixes #38837
Loss is incorrectly scaled in Trainer during the last step with gradient accumulation when the final batch is smaller than accumulation steps.
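The scaling fix can be sketched in isolation (a minimal plain-Python illustration, not the actual `Trainer` code; the function name `scaled_losses` is hypothetical):

```python
def scaled_losses(micro_batch_losses, gradient_accumulation_steps):
    """Scale each micro-batch loss by the number of batches actually
    accumulated in its cycle, not by the fixed accumulation-step count.

    The last cycle of an epoch may hold fewer micro-batches; dividing
    by the fixed `gradient_accumulation_steps` there under-reports the loss.
    """
    scaled = []
    n = len(micro_batch_losses)
    for start in range(0, n, gradient_accumulation_steps):
        cycle = micro_batch_losses[start:start + gradient_accumulation_steps]
        true_steps = len(cycle)  # may be smaller on the final cycle
        scaled.extend(loss / true_steps for loss in cycle)
    return scaled

# 5 micro-batches with accumulation of 2: the last cycle holds a single
# batch, so its loss is divided by 1, not by 2.
print(scaled_losses([1.0, 1.0, 1.0, 1.0, 1.0], 2))  # -> [0.5, 0.5, 0.5, 0.5, 1.0]
```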
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Tagging same people from original issue: @zach-huggingface @SunMarc
| {
"login": "andrewtran117",
"id": 89867547,
"node_id": "MDQ6VXNlcjg5ODY3NTQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/89867547?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/andrewtran117",
"html_url": "https://github.com/andrewtran117",
"followers_url": "https://api.github.com/users/andrewtran117/followers",
"following_url": "https://api.github.com/users/andrewtran117/following{/other_user}",
"gists_url": "https://api.github.com/users/andrewtran117/gists{/gist_id}",
"starred_url": "https://api.github.com/users/andrewtran117/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/andrewtran117/subscriptions",
"organizations_url": "https://api.github.com/users/andrewtran117/orgs",
"repos_url": "https://api.github.com/users/andrewtran117/repos",
"events_url": "https://api.github.com/users/andrewtran117/events{/privacy}",
"received_events_url": "https://api.github.com/users/andrewtran117/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38872/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38872/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38871 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38871/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38871/comments | https://api.github.com/repos/huggingface/transformers/issues/38871/events | https://github.com/huggingface/transformers/issues/38871 | 3,154,813,118 | I_kwDOCUB6oc68CqC- | 38,871 | Qwen3 thinking flag is flipped | {
"login": "rasbt",
"id": 5618407,
"node_id": "MDQ6VXNlcjU2MTg0MDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5618407?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rasbt",
"html_url": "https://github.com/rasbt",
"followers_url": "https://api.github.com/users/rasbt/followers",
"following_url": "https://api.github.com/users/rasbt/following{/other_user}",
"gists_url": "https://api.github.com/users/rasbt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rasbt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rasbt/subscriptions",
"organizations_url": "https://api.github.com/users/rasbt/orgs",
"repos_url": "https://api.github.com/users/rasbt/repos",
"events_url": "https://api.github.com/users/rasbt/events{/privacy}",
"received_events_url": "https://api.github.com/users/rasbt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-17T21:28:02 | 2025-06-18T12:15:13 | 2025-06-18T12:15:13 | NONE | null | null | null | null | ### System Info
```python
import transformers
transformers.__version__
```
```
'4.52.4'
```
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Initialize the Qwen3 tokenizer and run it with and without thinking enabled, as shown below.
### Expected behavior
Looks like `enable_thinking=True` removes the `<think></think>` tokens, but shouldn't it be the other way around?
I.e.,
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-0.6B-Base")
prompt = "Give me a short introduction to large language models."
messages = [
{"role": "user", "content": prompt},
]
token_ids = tokenizer.apply_chat_template(
messages,
tokenize=True,
add_generation_prompt=True,
enable_thinking=True,
)
tokenizer.decode(token_ids)
```
does not have any think tokens:

However, `enable_thinking=False` adds thinking tokens:

| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38871/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38871/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38870 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38870/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38870/comments | https://api.github.com/repos/huggingface/transformers/issues/38870/events | https://github.com/huggingface/transformers/issues/38870 | 3,154,652,298 | I_kwDOCUB6oc68CCyK | 38,870 | Safetensors deserializing silently mishandles tied parameters | {
"login": "edmcman",
"id": 1017189,
"node_id": "MDQ6VXNlcjEwMTcxODk=",
"avatar_url": "https://avatars.githubusercontent.com/u/1017189?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/edmcman",
"html_url": "https://github.com/edmcman",
"followers_url": "https://api.github.com/users/edmcman/followers",
"following_url": "https://api.github.com/users/edmcman/following{/other_user}",
"gists_url": "https://api.github.com/users/edmcman/gists{/gist_id}",
"starred_url": "https://api.github.com/users/edmcman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/edmcman/subscriptions",
"organizations_url": "https://api.github.com/users/edmcman/orgs",
"repos_url": "https://api.github.com/users/edmcman/repos",
"events_url": "https://api.github.com/users/edmcman/events{/privacy}",
"received_events_url": "https://api.github.com/users/edmcman/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-17T20:13:56 | 2025-10-26T08:03:43 | 2025-10-26T08:03:43 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.4
- Platform: Linux-6.1.123+-x86_64-with-glibc2.35
- Python version: 3.11.13
- Huggingface_hub version: 0.33.0
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (False)
- Tensorflow version (GPU?): 2.18.0 (False)
- Flax version (CPU?/GPU?/TPU?): 0.10.6 (cpu)
- Jax version: 0.5.2
- JaxLib version: 0.5.1
- Using distributed or parallel set-up in script?: no
### Who can help?
@Cyrilvallez (this is a pretty rough guess)
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
https://colab.research.google.com/drive/1FwWTcK9N9nDiLeDyuW9ckeLSzZfpI7wJ?usp=sharing
### Expected behavior
I expected that my model would be loaded and that `transformer.wte.weight` would have weights loaded from the safetensors checkpoint. Instead, that parameter is left uninitialized and causes `accelerate` to crash.
At the very least, I expected a warning about the missing keys. There is actually some code to warn about this, but in this case it did not fire because the missing key is a tied parameter.
I can't pin the bug down precisely, but it has to do with safetensors deserialization of tied parameters. The tied parameter's weights are not actually set from the saved checkpoint and are just the blank `meta` versions.
Disabling safetensors avoids the problem.
Setting `tie_word_embeddings=False` oddly does emit a warning:
```
Some weights of GPTBigCodeForCausalLM were not initialized from the model checkpoint at ejschwartz/resym-fielddecoder and are newly initialized: ['transformer.wte.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
```
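The failure mode can be illustrated with a plain-dict sketch (hypothetical helper, not the actual transformers loading code; the key names mirror this model but the values are made up): a tied key absent from the checkpoint should be resolved from its tie target, and anything still unresolved should be reported rather than left silent.

```python
def load_with_ties(checkpoint, expected_keys, tied_keys):
    """Resolve each expected key from the checkpoint, falling back to the
    parameter it is tied to; report keys that remain uninitialized.

    `tied_keys` maps a possibly-missing key to the key it shares weights
    with, e.g. {"transformer.wte.weight": "lm_head.weight"}.
    """
    state, missing = {}, []
    for key in expected_keys:
        if key in checkpoint:
            state[key] = checkpoint[key]
        elif key in tied_keys and tied_keys[key] in checkpoint:
            state[key] = checkpoint[tied_keys[key]]  # materialize the tie
        else:
            missing.append(key)  # should trigger a warning, not pass silently
    return state, missing

ckpt = {"lm_head.weight": [[0.1, 0.2]]}  # tied target saved, tied key omitted
state, missing = load_with_ties(
    ckpt,
    expected_keys=["transformer.wte.weight", "lm_head.weight"],
    tied_keys={"transformer.wte.weight": "lm_head.weight"},
)
print(missing)  # -> []
```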
Here is the original [issue](https://github.com/huggingface/accelerate/issues/3617) I opened in accelerate, but I think that the model accelerate is operating on is invalid. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38870/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38870/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38869 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38869/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38869/comments | https://api.github.com/repos/huggingface/transformers/issues/38869/events | https://github.com/huggingface/transformers/pull/38869 | 3,154,461,417 | PR_kwDOCUB6oc6a89Ym | 38,869 | [docs] TP | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T18:53:43 | 2025-06-17T19:07:27 | 2025-06-17T19:07:27 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38869",
"html_url": "https://github.com/huggingface/transformers/pull/38869",
"diff_url": "https://github.com/huggingface/transformers/pull/38869.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38869.patch",
"merged_at": null
} | Update TP docs | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38869/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38869/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38868 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38868/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38868/comments | https://api.github.com/repos/huggingface/transformers/issues/38868/events | https://github.com/huggingface/transformers/pull/38868 | 3,154,425,663 | PR_kwDOCUB6oc6a81uo | 38,868 | Post-PR fixes! | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T18:39:24 | 2025-06-17T18:58:49 | 2025-06-17T18:58:47 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38868",
"html_url": "https://github.com/huggingface/transformers/pull/38868",
"diff_url": "https://github.com/huggingface/transformers/pull/38868.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38868.patch",
"merged_at": "2025-06-17T18:58:47"
} | Divergent branches mean the CI is red after #38797, this PR fixes it by applying the style fixes to new files too! | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38868/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38868/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38867 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38867/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38867/comments | https://api.github.com/repos/huggingface/transformers/issues/38867/events | https://github.com/huggingface/transformers/pull/38867 | 3,154,256,994 | PR_kwDOCUB6oc6a8SHx | 38,867 | null deepspeed_plugin in args for wandb callback fake trainer | {
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T17:37:01 | 2025-06-18T13:10:31 | 2025-06-18T13:10:22 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38867",
"html_url": "https://github.com/huggingface/transformers/pull/38867",
"diff_url": "https://github.com/huggingface/transformers/pull/38867.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38867.patch",
"merged_at": "2025-06-18T13:10:22"
} | # What does this PR do?
Follow-up to https://github.com/huggingface/transformers/pull/38101, as a similar error still occurs:
```
[rank0]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer.py", line 2720, in _inner_training_loop
[rank0]: self.control = self.callback_handler.on_train_end(args, self.state, self.control)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer_callback.py", line 509, in on_train_end
[rank0]: return self.call_event("on_train_end", args, state, control)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer_callback.py", line 556, in call_event
[rank0]: result = getattr(callback, event)(
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/integrations/integration_utils.py", line 941, in on_train_end
[rank0]: fake_trainer = Trainer(
[rank0]: ^^^^^^^^
[rank0]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
[rank0]: return func(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer.py", line 471, in __init__
[rank0]: self.create_accelerator_and_postprocess()
[rank0]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer.py", line 5176, in create_accelerator_and_postprocess
[rank0]: self.accelerator = Accelerator(**args)
[rank0]: ^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/accelerate/accelerator.py", line 335, in __init__
[rank0]: raise NotImplementedError(
[rank0]: NotImplementedError: You cannot pass in a `deepspeed_plugin` when creating a second `Accelerator`. Please make sure the first `Accelerator` is initialized with all the plugins you want to use.
wandb:
```
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38867/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38867/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38866 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38866/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38866/comments | https://api.github.com/repos/huggingface/transformers/issues/38866/events | https://github.com/huggingface/transformers/issues/38866 | 3,154,154,478 | I_kwDOCUB6oc68AJPu | 38,866 | Super slow inference using `eager` attention with `Llama-4-Scout-17B-16E-Instruct` | {
"login": "Tizzzzy",
"id": 107573421,
"node_id": "U_kgDOBmlwrQ",
"avatar_url": "https://avatars.githubusercontent.com/u/107573421?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Tizzzzy",
"html_url": "https://github.com/Tizzzzy",
"followers_url": "https://api.github.com/users/Tizzzzy/followers",
"following_url": "https://api.github.com/users/Tizzzzy/following{/other_user}",
"gists_url": "https://api.github.com/users/Tizzzzy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Tizzzzy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Tizzzzy/subscriptions",
"organizations_url": "https://api.github.com/users/Tizzzzy/orgs",
"repos_url": "https://api.github.com/users/Tizzzzy/repos",
"events_url": "https://api.github.com/users/Tizzzzy/events{/privacy}",
"received_events_url": "https://api.github.com/users/Tizzzzy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-17T16:54:18 | 2025-06-18T11:59:58 | 2025-06-18T11:57:46 | NONE | null | null | null | null | ### System Info
Env:
```
torch 2.7.1 pypi_0 pypi
torchvision 0.22.1 pypi_0 pypi
tqdm 4.67.1 pypi_0 pypi
transformers 4.51.0 pypi_0 pypi
pillow 11.2.1 pypi_0 pypi
```
My code:
``` python
import torch
from transformers import AutoProcessor, Llama4ForConditionalGeneration
model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"
processor = AutoProcessor.from_pretrained(model_id)
model = Llama4ForConditionalGeneration.from_pretrained(
model_id,
attn_implementation="eager",
device_map="auto",
torch_dtype=torch.bfloat16,
    cache_dir=cache_dir,  # assumes cache_dir is defined earlier
)
messages = {...}
inputs = processor.apply_chat_template(
messages,
add_generation_prompt=True,
tokenize=True,
return_dict=True,
return_tensors="pt",
).to("cuda")
# print(inputs)
outputs = model.generate(
**inputs,
max_new_tokens=128,
)
responses = processor.batch_decode(outputs[:, inputs["input_ids"].shape[-1]:])
```
If I use `flex_attention` when loading the model, I get this error (same as #37323):
TypeError: pad(): argument 'pad' failed to unpack the object at pos 2 with error "type must be tuple of ints, but got NoneType"
However, if I use `eager` or `sdpa` attention, the inference speed is unreasonably slow, around 30 minutes per question.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
The above env and code should reproduce the error.
### Expected behavior
I want the inference speed to be faster | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38866/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38866/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38865 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38865/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38865/comments | https://api.github.com/repos/huggingface/transformers/issues/38865/events | https://github.com/huggingface/transformers/pull/38865 | 3,154,065,763 | PR_kwDOCUB6oc6a7o36 | 38,865 | Fix `qwen3_moe` tests | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T16:17:49 | 2025-06-18T12:36:05 | 2025-06-18T12:36:03 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38865",
"html_url": "https://github.com/huggingface/transformers/pull/38865",
"diff_url": "https://github.com/huggingface/transformers/pull/38865.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38865.patch",
"merged_at": "2025-06-18T12:36:03"
} | # What does this PR do?
See comments.
All tests pass on the desired runners (multi-A10 runners).
I haven't checked the flash attn test; let's make the effort once I switch to A10 runners (soon).
(Loading the FA model would require not using the newly added `get_model`.) | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38865/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38865/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38864 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38864/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38864/comments | https://api.github.com/repos/huggingface/transformers/issues/38864/events | https://github.com/huggingface/transformers/pull/38864 | 3,154,008,306 | PR_kwDOCUB6oc6a7cWk | 38,864 | 🚨🚨 Fix initialization of Mask2Former | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T15:59:29 | 2025-06-18T07:46:24 | 2025-06-18T07:46:22 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38864",
"html_url": "https://github.com/huggingface/transformers/pull/38864",
"diff_url": "https://github.com/huggingface/transformers/pull/38864.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38864.patch",
"merged_at": "2025-06-18T07:46:22"
} | # What does this PR do?
Supersedes https://github.com/huggingface/transformers/pull/38512 for simplicity. Because the initialization was so bad originally, I needed to make extra sure to follow the module graph very carefully to retain BC init. Also, the original PR had a lot of unwanted commits/contributors.
@bvantuan I added you as a co-author, let me know if that works for you!
For posterity, the only BC break here is the init of `DeformableDetrMultiscaleDeformableAttention` - it was previously overridden by the block for `Mask2FormerPixelLevelModule`, which set the `Linear` layer back to a normal distribution, since `_init_weights` is applied to all modules in the graph in a depth-first manner.
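For readers less familiar with the traversal: a toy sketch (not transformers code; `Node` and the module names are illustrative) of how a children-first, depth-first `apply` lets a parent block's init run after, and thus override, a child's:

```python
class Node:
    """Minimal stand-in for torch.nn.Module's children-first apply()."""

    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def apply(self, fn):
        # Children are visited before the node itself, so a parent's
        # init block runs last and can overwrite a child's init.
        for child in self.children:
            child.apply(fn)
        fn(self)
        return self


order = []
tree = Node("model", [Node("pixel_module", [Node("linear")]), Node("decoder")])
tree.apply(lambda m: order.append(m.name))
assert order == ["linear", "pixel_module", "decoder", "model"]
```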
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38864/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38864/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38863 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38863/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38863/comments | https://api.github.com/repos/huggingface/transformers/issues/38863/events | https://github.com/huggingface/transformers/pull/38863 | 3,153,635,102 | PR_kwDOCUB6oc6a6LaH | 38,863 | Fix num_return_sequences by overwriting WhisperGenerationMixin | {
"login": "MatthiasLeimeisterSonos",
"id": 105651002,
"node_id": "U_kgDOBkwbOg",
"avatar_url": "https://avatars.githubusercontent.com/u/105651002?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MatthiasLeimeisterSonos",
"html_url": "https://github.com/MatthiasLeimeisterSonos",
"followers_url": "https://api.github.com/users/MatthiasLeimeisterSonos/followers",
"following_url": "https://api.github.com/users/MatthiasLeimeisterSonos/following{/other_user}",
"gists_url": "https://api.github.com/users/MatthiasLeimeisterSonos/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MatthiasLeimeisterSonos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MatthiasLeimeisterSonos/subscriptions",
"organizations_url": "https://api.github.com/users/MatthiasLeimeisterSonos/orgs",
"repos_url": "https://api.github.com/users/MatthiasLeimeisterSonos/repos",
"events_url": "https://api.github.com/users/MatthiasLeimeisterSonos/events{/privacy}",
"received_events_url": "https://api.github.com/users/MatthiasLeimeisterSonos/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T14:02:32 | 2025-06-17T14:02:44 | 2025-06-17T14:02:44 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38863",
"html_url": "https://github.com/huggingface/transformers/pull/38863",
"diff_url": "https://github.com/huggingface/transformers/pull/38863.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38863.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "MatthiasLeimeisterSonos",
"id": 105651002,
"node_id": "U_kgDOBkwbOg",
"avatar_url": "https://avatars.githubusercontent.com/u/105651002?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MatthiasLeimeisterSonos",
"html_url": "https://github.com/MatthiasLeimeisterSonos",
"followers_url": "https://api.github.com/users/MatthiasLeimeisterSonos/followers",
"following_url": "https://api.github.com/users/MatthiasLeimeisterSonos/following{/other_user}",
"gists_url": "https://api.github.com/users/MatthiasLeimeisterSonos/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MatthiasLeimeisterSonos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MatthiasLeimeisterSonos/subscriptions",
"organizations_url": "https://api.github.com/users/MatthiasLeimeisterSonos/orgs",
"repos_url": "https://api.github.com/users/MatthiasLeimeisterSonos/repos",
"events_url": "https://api.github.com/users/MatthiasLeimeisterSonos/events{/privacy}",
"received_events_url": "https://api.github.com/users/MatthiasLeimeisterSonos/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38863/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38863/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38862 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38862/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38862/comments | https://api.github.com/repos/huggingface/transformers/issues/38862/events | https://github.com/huggingface/transformers/pull/38862 | 3,153,388,642 | PR_kwDOCUB6oc6a5WL8 | 38,862 | Fix `qwen3` tests | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T12:48:41 | 2025-06-17T15:30:09 | 2025-06-17T13:21:37 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38862",
"html_url": "https://github.com/huggingface/transformers/pull/38862",
"diff_url": "https://github.com/huggingface/transformers/pull/38862.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38862.patch",
"merged_at": "2025-06-17T13:21:37"
} | # What does this PR do?
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38862/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38862/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38861 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38861/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38861/comments | https://api.github.com/repos/huggingface/transformers/issues/38861/events | https://github.com/huggingface/transformers/pull/38861 | 3,153,285,238 | PR_kwDOCUB6oc6a4_6s | 38,861 | Add SamImageProcessorFast with 4x performance improvement | {
"login": "leochlon",
"id": 60907456,
"node_id": "MDQ6VXNlcjYwOTA3NDU2",
"avatar_url": "https://avatars.githubusercontent.com/u/60907456?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leochlon",
"html_url": "https://github.com/leochlon",
"followers_url": "https://api.github.com/users/leochlon/followers",
"following_url": "https://api.github.com/users/leochlon/following{/other_user}",
"gists_url": "https://api.github.com/users/leochlon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leochlon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leochlon/subscriptions",
"organizations_url": "https://api.github.com/users/leochlon/orgs",
"repos_url": "https://api.github.com/users/leochlon/repos",
"events_url": "https://api.github.com/users/leochlon/events{/privacy}",
"received_events_url": "https://api.github.com/users/leochlon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-17T12:15:00 | 2025-06-19T08:04:30 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38861",
"html_url": "https://github.com/huggingface/transformers/pull/38861",
"diff_url": "https://github.com/huggingface/transformers/pull/38861.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38861.patch",
"merged_at": null
} | ## 🎯 Summary
Implements GPU-accelerated `SamImageProcessorFast` following the BEiT pattern, providing 4x performance improvement over the original CPU-bound processor while maintaining 100% compatibility.
fixes https://github.com/huggingface/transformers/pull/36999 | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38861/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38861/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38860 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38860/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38860/comments | https://api.github.com/repos/huggingface/transformers/issues/38860/events | https://github.com/huggingface/transformers/pull/38860 | 3,153,256,487 | PR_kwDOCUB6oc6a452- | 38,860 | Add kwargs for timm.create_model in TimmWrapper | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-06-17T12:05:05 | 2025-06-25T20:13:50 | 2025-06-20T12:00:09 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38860",
"html_url": "https://github.com/huggingface/transformers/pull/38860",
"diff_url": "https://github.com/huggingface/transformers/pull/38860.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38860.patch",
"merged_at": "2025-06-20T12:00:09"
} | # What does this PR do?
Add kwargs for timm.create_model in TimmWrapper.
requested in:
- https://github.com/huggingface/transformers/pull/37878
related to:
- https://github.com/huggingface/transformers/pull/35819
Kwargs are added to the config to save them. `timm` will read them, but for models without kwargs it doesn't break anything.
To initialize from a config:
```py
config = TimmWrapperConfig.from_pretrained(
"timm/vit_base_patch32_clip_448.laion2b_ft_in12k_in1k",
model_args={"depth": 3},
)
model = TimmWrapperModel(config)
```
To initialize the model from a pretrained checkpoint:
```py
model = TimmWrapperModel.from_pretrained(
"timm/vit_base_patch32_clip_448.laion2b_ft_in12k_in1k",
model_args={"depth": 3},
)
```
cc @zucchini-nlp @rwightman
| {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38860/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 3,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38860/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38859 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38859/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38859/comments | https://api.github.com/repos/huggingface/transformers/issues/38859/events | https://github.com/huggingface/transformers/pull/38859 | 3,153,078,340 | PR_kwDOCUB6oc6a4TKz | 38,859 | Add MobileViT fast image processor | {
"login": "leochlon",
"id": 60907456,
"node_id": "MDQ6VXNlcjYwOTA3NDU2",
"avatar_url": "https://avatars.githubusercontent.com/u/60907456?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leochlon",
"html_url": "https://github.com/leochlon",
"followers_url": "https://api.github.com/users/leochlon/followers",
"following_url": "https://api.github.com/users/leochlon/following{/other_user}",
"gists_url": "https://api.github.com/users/leochlon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leochlon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leochlon/subscriptions",
"organizations_url": "https://api.github.com/users/leochlon/orgs",
"repos_url": "https://api.github.com/users/leochlon/repos",
"events_url": "https://api.github.com/users/leochlon/events{/privacy}",
"received_events_url": "https://api.github.com/users/leochlon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-17T11:05:42 | 2025-06-19T08:05:16 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38859",
"html_url": "https://github.com/huggingface/transformers/pull/38859",
"diff_url": "https://github.com/huggingface/transformers/pull/38859.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38859.patch",
"merged_at": null
} | ## Summary
This PR adds a fast image processor for MobileViT models, providing significant performance improvements while maintaining full functional equivalence with the existing slow processor.
## Changes
- **Added**: `MobileViTImageProcessorFast` class in `src/transformers/models/mobilevit/image_processing_mobilevit_fast.py`
- **Enhanced**: Test coverage for dual processor testing in existing test file
- **Implemented**: Custom channel flipping support (RGB→BGR) via `do_flip_channel_order` parameter
## Performance Improvements
- **Average speedup**: 1.35x across different batch sizes
- **Optimal performance**: 1.8x speedup for medium batches (16-32 images)
- **GPU acceleration**: Uses PyTorch/torchvision for batched tensor operations
## Technical Implementation
- **Channel Flipping**: Custom `_preprocess` method handles RGB→BGR conversion (required for MobileViT)
- **Size Handling**: Maintains `shortest_edge` format consistency with slow processor via `default_to_square=False`
- **Normalization**: Properly disabled to match slow processor behavior (`do_normalize=None`)
- **Code Quality**: Follows HuggingFace patterns, passes all style checks
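A batched RGB→BGR flip of the kind described above can be sketched as follows (an illustrative sketch, not the PR's actual code; `flip_channel_order` is a hypothetical helper name):

```python
import torch

def flip_channel_order(images: torch.Tensor) -> torch.Tensor:
    # Reverse the channel axis of a (batch, channels, height, width) tensor,
    # turning RGB into BGR in a single batched operation.
    return images.flip(1)

batch = torch.arange(24, dtype=torch.float32).reshape(2, 3, 2, 2)
flipped = flip_channel_order(batch)
assert torch.equal(flipped[:, 0], batch[:, 2])  # old B channel is now first
assert torch.equal(flipped[:, 2], batch[:, 0])  # old R channel is now last
```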
## Testing
- ✅ All 18 existing tests pass (2 expected skips)
- ✅ Functional equivalence verified between slow and fast processors
- ✅ Performance benchmarks confirm speedup
- ✅ Both processors produce identical outputs
## Backward Compatibility
- ✅ No breaking changes to existing MobileViT workflows
- ✅ Maintains full compatibility with slow processor parameters
- ✅ Drop-in replacement for performance-critical applications
## Implementation Notes
The custom `_preprocess` method was necessary because `BaseImageProcessorFast` does not support the `do_flip_channel_order` parameter required by MobileViT models. This follows the same pattern used by other fast processors (LayoutLMv2, DepthPro) that require specialized preprocessing steps. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38859/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38859/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38858 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38858/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38858/comments | https://api.github.com/repos/huggingface/transformers/issues/38858/events | https://github.com/huggingface/transformers/issues/38858 | 3,152,958,077 | I_kwDOCUB6oc677lJ9 | 38,858 | BLIP-2 regression in 4.52.1 | {
"login": "ageron",
"id": 76661,
"node_id": "MDQ6VXNlcjc2NjYx",
"avatar_url": "https://avatars.githubusercontent.com/u/76661?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ageron",
"html_url": "https://github.com/ageron",
"followers_url": "https://api.github.com/users/ageron/followers",
"following_url": "https://api.github.com/users/ageron/following{/other_user}",
"gists_url": "https://api.github.com/users/ageron/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ageron/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ageron/subscriptions",
"organizations_url": "https://api.github.com/users/ageron/orgs",
"repos_url": "https://api.github.com/users/ageron/repos",
"events_url": "https://api.github.com/users/ageron/events{/privacy}",
"received_events_url": "https://api.github.com/users/ageron/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-17T10:23:50 | 2025-06-18T07:20:11 | 2025-06-18T07:20:11 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.4
- Platform: Linux-6.1.123+-x86_64-with-glibc2.35
- Python version: 3.11.13
- Huggingface_hub version: 0.33.0
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): 2.18.0 (True)
- Flax version (CPU?/GPU?/TPU?): 0.10.6 (gpu)
- Jax version: 0.5.2
- JaxLib version: 0.5.1
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: no
- GPU type: Tesla T4
### Who can help?
@amyeroberts, @qubvel
### Information
- [x] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I'm getting different results from `Blip2ForConditionalGeneration` depending on the Transformers library version. Showing the usual [photo of two cats sleeping on a couch](http://images.cocodataset.org/val2017/000000039769.jpg), I get the following results:
* ✅ 4.51.3: "two cats laying on a couch"
* ❌ 4.52.1: "a"
* ❌ 4.52.2: "a"
* ❌ 4.52.3: "a"
* ❌ 4.52.4: "a woman is standing in front of a pink background"
Here's a short code example to reproduce the issue (see [this gist](https://colab.research.google.com/gist/ageron/8a3c87d9af26f0517fcf813af1ab567d/blip-2-issue.ipynb)):
```python
import torch
from transformers import Blip2Processor, Blip2ForConditionalGeneration
import requests
from PIL import Image
device = "cpu" # change to "cuda" or "mps" if needed
model_id = "Salesforce/blip2-opt-2.7b"
blip2_processor = Blip2Processor.from_pretrained(model_id)
blip2_model = Blip2ForConditionalGeneration.from_pretrained(
    model_id, device_map=device, torch_dtype=torch.float16)
image_url = "http://images.cocodataset.org/val2017/000000039769.jpg"  # two cats
image = Image.open(requests.get(image_url, stream=True).raw)
inputs = blip2_processor(images=image, return_tensors="pt")
inputs = inputs.to(device, dtype=torch.float16)
with torch.no_grad():
    generated_ids = blip2_model.generate(**inputs)
generated_text = blip2_processor.batch_decode(generated_ids,
                                              skip_special_tokens=True)
print(generated_text)
```
Note: I thought it might be similar to #38514 but it doesn't seem so. I tried applying #38510 but it didn't solve the issue.
### Expected behavior
I'd like the same behavior as in 4.51.3, i.e., printing "two cats laying on a couch" rather than "a" or "a woman is standing in front of a pink background".
Thanks! | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38858/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38858/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38857 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38857/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38857/comments | https://api.github.com/repos/huggingface/transformers/issues/38857/events | https://github.com/huggingface/transformers/pull/38857 | 3,152,851,371 | PR_kwDOCUB6oc6a3hhy | 38,857 | Clarify per_device_train_batch_size scaling in TrainingArguments (#38… | {
"login": "Shohail-Ismail",
"id": 149825575,
"node_id": "U_kgDOCO4oJw",
"avatar_url": "https://avatars.githubusercontent.com/u/149825575?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Shohail-Ismail",
"html_url": "https://github.com/Shohail-Ismail",
"followers_url": "https://api.github.com/users/Shohail-Ismail/followers",
"following_url": "https://api.github.com/users/Shohail-Ismail/following{/other_user}",
"gists_url": "https://api.github.com/users/Shohail-Ismail/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Shohail-Ismail/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Shohail-Ismail/subscriptions",
"organizations_url": "https://api.github.com/users/Shohail-Ismail/orgs",
"repos_url": "https://api.github.com/users/Shohail-Ismail/repos",
"events_url": "https://api.github.com/users/Shohail-Ismail/events{/privacy}",
"received_events_url": "https://api.github.com/users/Shohail-Ismail/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T09:51:49 | 2025-07-08T00:23:02 | 2025-07-07T16:57:42 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38857",
"html_url": "https://github.com/huggingface/transformers/pull/38857",
"diff_url": "https://github.com/huggingface/transformers/pull/38857.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38857.patch",
"merged_at": "2025-07-07T16:57:42"
} | ## What does this PR do?
This PR clarifies in the `TrainingArguments` docstring that `per_device_train_batch_size`
is multiplied by the number of devices when training on multiple GPUs or with distributed training.
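As a quick arithmetic illustration of the clarified behavior (hypothetical numbers, not library defaults):

```python
# The effective train batch size scales with both the device count and
# gradient accumulation, not just per_device_train_batch_size.
per_device_train_batch_size = 8
num_devices = 4
gradient_accumulation_steps = 2

effective_batch_size = (
    per_device_train_batch_size * num_devices * gradient_accumulation_steps
)
assert effective_batch_size == 64
```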
Closes #38484
## Before submitting
- [x] This PR fixes a typo or improves the docs
## Who can review?
- @zach-huggingface
- @SunMarc
- @qgallouedec
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38857/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38857/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38856 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38856/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38856/comments | https://api.github.com/repos/huggingface/transformers/issues/38856/events | https://github.com/huggingface/transformers/pull/38856 | 3,152,175,793 | PR_kwDOCUB6oc6a1Px_ | 38,856 | Fix(informer): Correct tensor shape for input_size=1 | {
"login": "Flink-ddd",
"id": 180720690,
"node_id": "U_kgDOCsWUMg",
"avatar_url": "https://avatars.githubusercontent.com/u/180720690?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Flink-ddd",
"html_url": "https://github.com/Flink-ddd",
"followers_url": "https://api.github.com/users/Flink-ddd/followers",
"following_url": "https://api.github.com/users/Flink-ddd/following{/other_user}",
"gists_url": "https://api.github.com/users/Flink-ddd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Flink-ddd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Flink-ddd/subscriptions",
"organizations_url": "https://api.github.com/users/Flink-ddd/orgs",
"repos_url": "https://api.github.com/users/Flink-ddd/repos",
"events_url": "https://api.github.com/users/Flink-ddd/events{/privacy}",
"received_events_url": "https://api.github.com/users/Flink-ddd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T06:03:45 | 2025-06-23T09:50:52 | 2025-06-23T09:50:52 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38856",
"html_url": "https://github.com/huggingface/transformers/pull/38856",
"diff_url": "https://github.com/huggingface/transformers/pull/38856.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38856.patch",
"merged_at": "2025-06-23T09:50:52"
} | Hi @Rocketknight1, thanks for the great guidance on the previous PR!
This new pull request follows your suggestion to fix the bug at its source. It resolves a `RuntimeError` that occurs in time series models inheriting from `TimeSeriesTransformerModel` (such as `InformerModel`) when `config.input_size` is set to 1.
The root cause was that when `input_size=1`, the `loc` and `scale` tensors calculated by the scaler retained an extra dimension (e.g., shape `[B, 1, 1]` instead of `[B, 1]`). This incorrect shape caused a dimension mismatch error during a later `expand()` operation.
Instead of overriding the method in the child class, this PR applies a minimal and robust fix directly to the `create_network_inputs` method in the parent `TimeSeriesTransformerModel`. It refactors the logic to unconditionally apply `.squeeze(1)` to both the `loc` and `scale` tensors. This approach handles all `input_size` cases correctly and avoids code duplication.
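The shape mismatch can be reproduced with a minimal sketch (an illustrative mean scaler, not the library's actual implementation):

```python
import torch

batch_size, context_length, input_size = 4, 10, 1
context = torch.randn(batch_size, context_length, input_size)

# A scaler that keeps the reduced time dimension yields (batch, 1, input_size),
# i.e. (4, 1, 1) when input_size == 1, instead of the expected (4, 1).
loc = context.mean(dim=1, keepdim=True)
scale = context.std(dim=1, keepdim=True)
assert loc.shape == (4, 1, 1)

# Unconditionally squeezing dim 1 restores the (batch, input_size) shape that
# the later expand() call expects, for any input_size.
loc, scale = loc.squeeze(1), scale.squeeze(1)
assert loc.shape == (4, 1) and scale.shape == (4, 1)
```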
Fixes #38745 | {
"login": "kashif",
"id": 8100,
"node_id": "MDQ6VXNlcjgxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kashif",
"html_url": "https://github.com/kashif",
"followers_url": "https://api.github.com/users/kashif/followers",
"following_url": "https://api.github.com/users/kashif/following{/other_user}",
"gists_url": "https://api.github.com/users/kashif/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kashif/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kashif/subscriptions",
"organizations_url": "https://api.github.com/users/kashif/orgs",
"repos_url": "https://api.github.com/users/kashif/repos",
"events_url": "https://api.github.com/users/kashif/events{/privacy}",
"received_events_url": "https://api.github.com/users/kashif/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38856/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38856/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38855 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38855/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38855/comments | https://api.github.com/repos/huggingface/transformers/issues/38855/events | https://github.com/huggingface/transformers/pull/38855 | 3,151,996,714 | PR_kwDOCUB6oc6a0qiQ | 38,855 | LlamaAttention forward function type hint is incorrect from new Branch | {
"login": "ArkVex",
"id": 159469387,
"node_id": "U_kgDOCYFPSw",
"avatar_url": "https://avatars.githubusercontent.com/u/159469387?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArkVex",
"html_url": "https://github.com/ArkVex",
"followers_url": "https://api.github.com/users/ArkVex/followers",
"following_url": "https://api.github.com/users/ArkVex/following{/other_user}",
"gists_url": "https://api.github.com/users/ArkVex/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArkVex/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArkVex/subscriptions",
"organizations_url": "https://api.github.com/users/ArkVex/orgs",
"repos_url": "https://api.github.com/users/ArkVex/repos",
"events_url": "https://api.github.com/users/ArkVex/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArkVex/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T04:14:05 | 2025-07-14T22:10:18 | 2025-07-14T22:10:18 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38855",
"html_url": "https://github.com/huggingface/transformers/pull/38855",
"diff_url": "https://github.com/huggingface/transformers/pull/38855.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38855.patch",
"merged_at": null
} | Hi, this PR fixes a small issue in the LlamaAttention class. The return type in the forward method currently shows three values, but the function actually returns only two. This seems to have been missed during the attention refactor (possibly in PR https://github.com/huggingface/transformers/pull/35235).
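For illustration, the corrected annotation looks like this (a sketch, not the actual `transformers` source):

```python
from typing import Optional

import torch

def forward_sketch(
    hidden_states: torch.Tensor,
) -> tuple[torch.Tensor, Optional[torch.Tensor]]:
    # The annotation should describe the two returned values
    # (attn_output, attn_weights), not three.
    attn_output = hidden_states  # placeholder for the real attention computation
    attn_weights = None          # only populated when output_attentions is set
    return attn_output, attn_weights

out, weights = forward_sketch(torch.zeros(1, 4))
assert out.shape == (1, 4) and weights is None
```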
I’ve updated the type hint to reflect the actual return values, just to avoid confusion for anyone reading or using the code. Let me know if any other changes are needed. Happy to help! | {
"login": "ArkVex",
"id": 159469387,
"node_id": "U_kgDOCYFPSw",
"avatar_url": "https://avatars.githubusercontent.com/u/159469387?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArkVex",
"html_url": "https://github.com/ArkVex",
"followers_url": "https://api.github.com/users/ArkVex/followers",
"following_url": "https://api.github.com/users/ArkVex/following{/other_user}",
"gists_url": "https://api.github.com/users/ArkVex/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArkVex/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArkVex/subscriptions",
"organizations_url": "https://api.github.com/users/ArkVex/orgs",
"repos_url": "https://api.github.com/users/ArkVex/repos",
"events_url": "https://api.github.com/users/ArkVex/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArkVex/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38855/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38855/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38854 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38854/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38854/comments | https://api.github.com/repos/huggingface/transformers/issues/38854/events | https://github.com/huggingface/transformers/issues/38854 | 3,151,955,979 | I_kwDOCUB6oc673wgL | 38,854 | scale loss per token/local sequence for discrete system representation | {
"login": "wesboyt",
"id": 30701972,
"node_id": "MDQ6VXNlcjMwNzAxOTcy",
"avatar_url": "https://avatars.githubusercontent.com/u/30701972?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wesboyt",
"html_url": "https://github.com/wesboyt",
"followers_url": "https://api.github.com/users/wesboyt/followers",
"following_url": "https://api.github.com/users/wesboyt/following{/other_user}",
"gists_url": "https://api.github.com/users/wesboyt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wesboyt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wesboyt/subscriptions",
"organizations_url": "https://api.github.com/users/wesboyt/orgs",
"repos_url": "https://api.github.com/users/wesboyt/repos",
"events_url": "https://api.github.com/users/wesboyt/events{/privacy}",
"received_events_url": "https://api.github.com/users/wesboyt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-06-17T03:56:28 | 2025-07-03T03:17:21 | null | NONE | null | null | null | null | ### Feature request
I have implemented a scaling loss for my discrete game use case.
I would like to be able to hand my model a batch with a map of sequences and their associated randomness (or importance) and then scale the loss relative to the next token's predictability.
My implementation works but is very slow relative to:
https://github.com/huggingface/transformers/blob/v4.52.3/src/transformers/trainer.py#L3795
### Motivation
My system has highly variable regions that are truly random and prevent my models from converging without manual training, which is incredibly inefficient.
### Your contribution
My implementation:
```python
for val in train:
    val = torch.tensor(val['input_ids'], dtype=torch.long).to(device)
    logits = model(val).logits
    ttl_loss = torch.zeros(1, dtype=torch.float32).to(device)
    for i in full_range:
        loss = loss_function(logits[:, i - 1], val[:, i])
        if i > 26 or i < 6:
            losses.append(loss.item())
        else:
            loss = loss * 0.1
        ttl_loss += loss
    ttl_loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    lr_scheduler.step()
```
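For comparison, the same per-position weighting can be vectorized with `reduction="none"` cross-entropy, which avoids the slow Python loop (a sketch with illustrative names, not a proposed Transformers API):

```python
import torch
import torch.nn.functional as F

def weighted_token_loss(logits, labels, token_weights):
    # logits: (batch, seq, vocab); labels: (batch, seq);
    # token_weights: (seq - 1,) weight on predicting each next token.
    shift_logits = logits[:, :-1].reshape(-1, logits.size(-1))
    shift_labels = labels[:, 1:].reshape(-1)
    per_token = F.cross_entropy(shift_logits, shift_labels, reduction="none")
    per_token = per_token.view(labels.size(0), -1) * token_weights
    return per_token.sum()

logits = torch.randn(2, 8, 50)
labels = torch.randint(0, 50, (2, 8))
weights = torch.ones(7)
weights[2:5] = 0.1  # down-weight the truly random positions
loss = weighted_token_loss(logits, labels, weights)
```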
I improved accuracy for these desirable, rarer sequences from 88% to 96% in only 20,000 sentences. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38854/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38854/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/38853 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38853/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38853/comments | https://api.github.com/repos/huggingface/transformers/issues/38853/events | https://github.com/huggingface/transformers/pull/38853 | 3,151,711,515 | PR_kwDOCUB6oc6azu8I | 38,853 | Update bamba model card | {
"login": "druvdub",
"id": 59387969,
"node_id": "MDQ6VXNlcjU5Mzg3OTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/59387969?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/druvdub",
"html_url": "https://github.com/druvdub",
"followers_url": "https://api.github.com/users/druvdub/followers",
"following_url": "https://api.github.com/users/druvdub/following{/other_user}",
"gists_url": "https://api.github.com/users/druvdub/gists{/gist_id}",
"starred_url": "https://api.github.com/users/druvdub/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/druvdub/subscriptions",
"organizations_url": "https://api.github.com/users/druvdub/orgs",
"repos_url": "https://api.github.com/users/druvdub/repos",
"events_url": "https://api.github.com/users/druvdub/events{/privacy}",
"received_events_url": "https://api.github.com/users/druvdub/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T00:58:13 | 2025-06-18T23:01:25 | 2025-06-18T23:01:25 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38853",
"html_url": "https://github.com/huggingface/transformers/pull/38853",
"diff_url": "https://github.com/huggingface/transformers/pull/38853.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38853.patch",
"merged_at": "2025-06-18T23:01:25"
} | # What does this PR do?
Updates `bamba` model card for #36979
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. #36979
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@stevhliu
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38853/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38853/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38852 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38852/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38852/comments | https://api.github.com/repos/huggingface/transformers/issues/38852/events | https://github.com/huggingface/transformers/pull/38852 | 3,151,692,772 | PR_kwDOCUB6oc6azrM2 | 38,852 | enable misc test cases on XPU | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T00:44:17 | 2025-06-18T23:11:10 | 2025-06-18T07:20:49 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38852",
"html_url": "https://github.com/huggingface/transformers/pull/38852",
"diff_url": "https://github.com/huggingface/transformers/pull/38852.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38852.patch",
"merged_at": "2025-06-18T07:20:49"
} | @ydshieh, please help review, thanks very much. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38852/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38852/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38851 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38851/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38851/comments | https://api.github.com/repos/huggingface/transformers/issues/38851/events | https://github.com/huggingface/transformers/issues/38851 | 3,151,647,682 | I_kwDOCUB6oc672lPC | 38,851 | Should `compute_metrics` only run on the main process when doing DDP? | {
"login": "TIE666",
"id": 47147482,
"node_id": "MDQ6VXNlcjQ3MTQ3NDgy",
"avatar_url": "https://avatars.githubusercontent.com/u/47147482?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TIE666",
"html_url": "https://github.com/TIE666",
"followers_url": "https://api.github.com/users/TIE666/followers",
"following_url": "https://api.github.com/users/TIE666/following{/other_user}",
"gists_url": "https://api.github.com/users/TIE666/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TIE666/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TIE666/subscriptions",
"organizations_url": "https://api.github.com/users/TIE666/orgs",
"repos_url": "https://api.github.com/users/TIE666/repos",
"events_url": "https://api.github.com/users/TIE666/events{/privacy}",
"received_events_url": "https://api.github.com/users/TIE666/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-17T00:09:43 | 2025-07-25T08:02:33 | 2025-07-25T08:02:33 | NONE | null | null | null | null | Hi, I want to know: when training and evaluating on a multi-GPU setup (DDP using `Trainer` and `accelerate`), does `compute_metrics` only need to be run on the main process?
The reason is that `trainer` itself already does `gather_for_metrics` ([here](https://github.com/huggingface/transformers/blob/v4.51-release/src/transformers/trainer.py#L4373)), which I suppose collects all predictions (logits) and labels across processes, so running `compute_metrics` from multiple processes again would be duplicated work, no?
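For context, a minimal sketch of the gating idea (not `Trainer`'s actual internals; the `process_index` argument is a hypothetical stand-in for however the rank is passed in): once predictions are gathered on every rank, the metric math itself can be skipped everywhere except the main process.

```python
# Hedged sketch: after gather_for_metrics, every rank holds the full
# logits/labels, so computing the metric on each rank is duplicated work.
# Gating on the process index lets only rank 0 do the computation.
def compute_metrics(eval_pred, process_index=0):
    if process_index != 0:          # non-main ranks no-op
        return {}
    logits, labels = eval_pred
    # argmax over the class dimension, pure-Python for illustration
    preds = [max(range(len(row)), key=row.__getitem__) for row in logits]
    correct = sum(p == y for p, y in zip(preds, labels))
    return {"accuracy": correct / len(labels)}

# toy "gathered" batch: 4 samples x 2 classes
logits = [[0.9, 0.1], [0.2, 0.8], [0.7, 0.3], [0.4, 0.6]]
labels = [0, 1, 0, 0]
print(compute_metrics((logits, labels)))                    # main rank computes
print(compute_metrics((logits, labels), process_index=1))   # other ranks skip
```

In a real script the rank check would come from `TrainingArguments.process_index` or `accelerator.is_main_process` rather than a function argument.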
to add:
I am using `batch_eval_metrics`, where I first spotted that if I run the training script (a modified version of `run_clm.py`) with `accelerate launch`, `compute_metrics` is always called multiple times, but the logits from `EvalPrediction` for each call are of size `per_device_eval_batch_size` * the number of GPUs I am using. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38851/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38851/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38850 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38850/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38850/comments | https://api.github.com/repos/huggingface/transformers/issues/38850/events | https://github.com/huggingface/transformers/pull/38850 | 3,151,214,719 | PR_kwDOCUB6oc6ayDcH | 38,850 | fix: fixed wrong concatenation which made batching results wrong | {
"login": "sbucaille",
"id": 24275548,
"node_id": "MDQ6VXNlcjI0Mjc1NTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sbucaille",
"html_url": "https://github.com/sbucaille",
"followers_url": "https://api.github.com/users/sbucaille/followers",
"following_url": "https://api.github.com/users/sbucaille/following{/other_user}",
"gists_url": "https://api.github.com/users/sbucaille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sbucaille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sbucaille/subscriptions",
"organizations_url": "https://api.github.com/users/sbucaille/orgs",
"repos_url": "https://api.github.com/users/sbucaille/repos",
"events_url": "https://api.github.com/users/sbucaille/events{/privacy}",
"received_events_url": "https://api.github.com/users/sbucaille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
},
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-06-16T20:23:25 | 2025-07-06T13:23:21 | 2025-07-01T12:14:44 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38850",
"html_url": "https://github.com/huggingface/transformers/pull/38850",
"diff_url": "https://github.com/huggingface/transformers/pull/38850.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38850.patch",
"merged_at": "2025-07-01T12:14:44"
} | # What does this PR do?
Fixes #38348
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@qubvel | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38850/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38850/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38849 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38849/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38849/comments | https://api.github.com/repos/huggingface/transformers/issues/38849/events | https://github.com/huggingface/transformers/pull/38849 | 3,151,162,197 | PR_kwDOCUB6oc6ax4EM | 38,849 | Remove merge conflict artifacts in Albert model doc | {
"login": "druvdub",
"id": 59387969,
"node_id": "MDQ6VXNlcjU5Mzg3OTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/59387969?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/druvdub",
"html_url": "https://github.com/druvdub",
"followers_url": "https://api.github.com/users/druvdub/followers",
"following_url": "https://api.github.com/users/druvdub/following{/other_user}",
"gists_url": "https://api.github.com/users/druvdub/gists{/gist_id}",
"starred_url": "https://api.github.com/users/druvdub/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/druvdub/subscriptions",
"organizations_url": "https://api.github.com/users/druvdub/orgs",
"repos_url": "https://api.github.com/users/druvdub/repos",
"events_url": "https://api.github.com/users/druvdub/events{/privacy}",
"received_events_url": "https://api.github.com/users/druvdub/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T20:03:20 | 2025-06-16T21:21:19 | 2025-06-16T21:21:19 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38849",
"html_url": "https://github.com/huggingface/transformers/pull/38849",
"diff_url": "https://github.com/huggingface/transformers/pull/38849.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38849.patch",
"merged_at": "2025-06-16T21:21:19"
} | # What does this PR do?
Fixes artifacts introduced in #37753
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. #37753 #36979
## Who can review?
@stevhliu | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38849/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38849/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38848 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38848/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38848/comments | https://api.github.com/repos/huggingface/transformers/issues/38848/events | https://github.com/huggingface/transformers/pull/38848 | 3,151,105,358 | PR_kwDOCUB6oc6axroa | 38,848 | Update model card for auto | {
"login": "druvdub",
"id": 59387969,
"node_id": "MDQ6VXNlcjU5Mzg3OTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/59387969?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/druvdub",
"html_url": "https://github.com/druvdub",
"followers_url": "https://api.github.com/users/druvdub/followers",
"following_url": "https://api.github.com/users/druvdub/following{/other_user}",
"gists_url": "https://api.github.com/users/druvdub/gists{/gist_id}",
"starred_url": "https://api.github.com/users/druvdub/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/druvdub/subscriptions",
"organizations_url": "https://api.github.com/users/druvdub/orgs",
"repos_url": "https://api.github.com/users/druvdub/repos",
"events_url": "https://api.github.com/users/druvdub/events{/privacy}",
"received_events_url": "https://api.github.com/users/druvdub/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T19:42:32 | 2025-06-16T21:21:13 | 2025-06-16T20:48:19 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38848",
"html_url": "https://github.com/huggingface/transformers/pull/38848",
"diff_url": "https://github.com/huggingface/transformers/pull/38848.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38848.patch",
"merged_at": null
} | # What does this PR do?
Updated Auto Class docs for #36979
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. #36979
## Who can review?
@stevhliu
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation:
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38848/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38848/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38847 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38847/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38847/comments | https://api.github.com/repos/huggingface/transformers/issues/38847/events | https://github.com/huggingface/transformers/pull/38847 | 3,150,698,148 | PR_kwDOCUB6oc6awTCM | 38,847 | Docs: Add custom fine-tuning tutorial to TrOCR model page | {
"login": "Ashutosh-4485",
"id": 63778450,
"node_id": "MDQ6VXNlcjYzNzc4NDUw",
"avatar_url": "https://avatars.githubusercontent.com/u/63778450?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Ashutosh-4485",
"html_url": "https://github.com/Ashutosh-4485",
"followers_url": "https://api.github.com/users/Ashutosh-4485/followers",
"following_url": "https://api.github.com/users/Ashutosh-4485/following{/other_user}",
"gists_url": "https://api.github.com/users/Ashutosh-4485/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Ashutosh-4485/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Ashutosh-4485/subscriptions",
"organizations_url": "https://api.github.com/users/Ashutosh-4485/orgs",
"repos_url": "https://api.github.com/users/Ashutosh-4485/repos",
"events_url": "https://api.github.com/users/Ashutosh-4485/events{/privacy}",
"received_events_url": "https://api.github.com/users/Ashutosh-4485/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T17:09:11 | 2025-06-20T15:42:24 | 2025-06-18T16:38:58 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38847",
"html_url": "https://github.com/huggingface/transformers/pull/38847",
"diff_url": "https://github.com/huggingface/transformers/pull/38847.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38847.patch",
"merged_at": "2025-06-18T16:38:58"
} | Docs: add community fine‑tuning notebook link to TrOCR page
# What does this PR do?
Adds a link to a community tutorial notebook in the TrOCR model docs.
The notebook demonstrates fine-tuning TrOCR on a custom dataset and includes multiple training strategies:
- Train all parameters
- Train encoder only (freeze decoder)
- Train decoder only (freeze encoder)
- Train only the last N layers of both encoder and decoder
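The strategies above boil down to selecting which parameter groups stay trainable. A minimal name-based sketch of that selection logic (the parameter names and layer layout below are hypothetical stand-ins, not TrOCR's real module names; in practice you would iterate `model.named_parameters()` and toggle `p.requires_grad`):

```python
# Hedged sketch of name-based parameter selection for the four strategies.
def trainable_mask(param_names, strategy, last_n=2, num_layers=12):
    keep = set()
    for name in param_names:
        part, _, rest = name.partition(".")          # "encoder" or "decoder"
        layer = int(rest.split(".")[1]) if rest.startswith("layer.") else None
        if strategy == "all":
            keep.add(name)
        elif strategy == "encoder_only" and part == "encoder":
            keep.add(name)
        elif strategy == "decoder_only" and part == "decoder":
            keep.add(name)
        elif strategy == "last_n" and layer is not None and layer >= num_layers - last_n:
            keep.add(name)
    return keep

# hypothetical 12-layer encoder + 12-layer decoder
names = [f"{m}.layer.{i}.weight" for m in ("encoder", "decoder") for i in range(12)]
print(sorted(trainable_mask(names, "last_n", last_n=2)))
```

With a real model, the equivalent step is `p.requires_grad = name in mask` over `named_parameters()` before building the optimizer.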
Tutorial link: https://github.com/Ashutosh-4485/trocr-custom-fine-tune/blob/66010bcdaf45313b829fa83b3c1ba20002e3c323/FineTune_TrOCR_on_Custom_data.ipynb
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38847/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38847/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38846 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38846/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38846/comments | https://api.github.com/repos/huggingface/transformers/issues/38846/events | https://github.com/huggingface/transformers/issues/38846 | 3,150,225,664 | I_kwDOCUB6oc67xKEA | 38,846 | video_auto_processing.py breaks everything | {
"login": "lucasjinreal",
"id": 21303438,
"node_id": "MDQ6VXNlcjIxMzAzNDM4",
"avatar_url": "https://avatars.githubusercontent.com/u/21303438?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lucasjinreal",
"html_url": "https://github.com/lucasjinreal",
"followers_url": "https://api.github.com/users/lucasjinreal/followers",
"following_url": "https://api.github.com/users/lucasjinreal/following{/other_user}",
"gists_url": "https://api.github.com/users/lucasjinreal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lucasjinreal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lucasjinreal/subscriptions",
"organizations_url": "https://api.github.com/users/lucasjinreal/orgs",
"repos_url": "https://api.github.com/users/lucasjinreal/repos",
"events_url": "https://api.github.com/users/lucasjinreal/events{/privacy}",
"received_events_url": "https://api.github.com/users/lucasjinreal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-16T14:24:56 | 2025-07-25T08:02:35 | 2025-07-25T08:02:35 | NONE | null | null | null | null | ### System Info
The video preprocessor now breaks everything.
I have a model which needs the Qwen2.5 VL preprocessor. It was OK before, but now it cannot load and breaks.
My model just needs the Qwen2.5 VL preprocessor; it doesn't have the Qwen2.5 VL arch, it's a customized model.
But now the video preprocessor will read the config arch in my model, rather than just `preprocessor_config.json`.
This is weird.
You have video processor config saved in `preprocessor.json` file which is deprecated. Video processor configs should be saved in their own `video_preprocessor.json` file. You can rename the file or load and save the processor back which renames it automatically. Loading from `preprocessor.json` will be removed in v5.0.
```
from transformers import AutoProcessor, Qwen2_5_VLProcessor
processor_path = './checkpoints/Qwen3-VL-2B-Unofficial'
a = Qwen2_5_VLProcessor.from_pretrained(
processor_path, trust_remote_code=True
)
print(a)
```
This code used to work without requiring that `config.json` be a registered model, but now it requires `config.json` to be a registered model, even though I just need `preprocessor_config.json`.
Please remove the video processor code; it's useless and introduces many bugs!
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
from transformers import AutoProcessor, Qwen2_5_VLProcessor
processor_path = './checkpoints/Qwen3-VL-2B-Unofficial'
a = Qwen2_5_VLProcessor.from_pretrained(
processor_path, trust_remote_code=True
)
print(a)
### Expected behavior
from transformers import AutoProcessor, Qwen2_5_VLProcessor
processor_path = './checkpoints/Qwen3-VL-2B-Unofficial'
a = Qwen2_5_VLProcessor.from_pretrained(
processor_path, trust_remote_code=True
)
print(a) | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38846/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38846/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38845 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38845/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38845/comments | https://api.github.com/repos/huggingface/transformers/issues/38845/events | https://github.com/huggingface/transformers/pull/38845 | 3,150,068,751 | PR_kwDOCUB6oc6auKZy | 38,845 | Fix `qwen2_5_vl` tests | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T13:39:44 | 2025-06-17T08:55:26 | 2025-06-17T08:55:24 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38845",
"html_url": "https://github.com/huggingface/transformers/pull/38845",
"diff_url": "https://github.com/huggingface/transformers/pull/38845.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38845.patch",
"merged_at": "2025-06-17T08:55:24"
} | # What does this PR do?
As usual, trying to update/fix some value mismatches and OOMs to make CI better.
Now passing on both A10 and T4 with torch 2.7.1. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38845/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38845/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38844 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38844/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38844/comments | https://api.github.com/repos/huggingface/transformers/issues/38844/events | https://github.com/huggingface/transformers/pull/38844 | 3,149,872,281 | PR_kwDOCUB6oc6atfZV | 38,844 | Fix ReDOS in tokenizer digit substitution | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T12:41:05 | 2025-06-19T13:53:54 | 2025-06-19T13:53:53 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38844",
"html_url": "https://github.com/huggingface/transformers/pull/38844",
"diff_url": "https://github.com/huggingface/transformers/pull/38844.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38844.patch",
"merged_at": "2025-06-19T13:53:53"
} | We use possessive quantifiers, via the third-party `regex` package on Python < 3.11 or the built-in `re` module on Python >= 3.11, to avoid a huge slowdown here.
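As a rough illustration of the idea (a standalone sketch, not the actual transformers patch — the pattern and helper name here are made up): possessive quantifiers such as `\d++` give up backtracking, which is what makes catastrophic ReDoS inputs cheap to reject. The stdlib `re` module only supports them from Python 3.11 onward, so earlier interpreters need the `regex` package; this sketch simply falls back to a plain quantifier.

```python
import sys
import re

# Possessive quantifiers (e.g. r"\d++") are supported by stdlib `re`
# only on Python >= 3.11; older versions would need the third-party
# `regex` package. Here we fall back to a non-possessive pattern so the
# sketch runs anywhere.
if sys.version_info >= (3, 11):
    digit_pattern = re.compile(r"\d++")  # possessive: no backtracking
else:
    digit_pattern = re.compile(r"\d+")

def replace_digits(text: str) -> str:
    # Replace each run of digits with zeros of the same length.
    return digit_pattern.sub(lambda m: "0" * len(m.group()), text)

print(replace_digits("call 42 or 7"))  # call 00 or 0
```

Both branches produce the same substitutions; the possessive variant just cannot be driven into exponential backtracking by adversarial input.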
cc @Michellehbn ! | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38844/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38844/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38843 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38843/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38843/comments | https://api.github.com/repos/huggingface/transformers/issues/38843/events | https://github.com/huggingface/transformers/issues/38843 | 3,149,596,109 | I_kwDOCUB6oc67uwXN | 38,843 | Error when create ModernBert model with flash attention TypeError: RotaryEmbedding.__init__() got an unexpected keyword argument 'pos_idx_in_fp32' | {
"login": "KabaevAnton",
"id": 7778551,
"node_id": "MDQ6VXNlcjc3Nzg1NTE=",
"avatar_url": "https://avatars.githubusercontent.com/u/7778551?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/KabaevAnton",
"html_url": "https://github.com/KabaevAnton",
"followers_url": "https://api.github.com/users/KabaevAnton/followers",
"following_url": "https://api.github.com/users/KabaevAnton/following{/other_user}",
"gists_url": "https://api.github.com/users/KabaevAnton/gists{/gist_id}",
"starred_url": "https://api.github.com/users/KabaevAnton/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/KabaevAnton/subscriptions",
"organizations_url": "https://api.github.com/users/KabaevAnton/orgs",
"repos_url": "https://api.github.com/users/KabaevAnton/repos",
"events_url": "https://api.github.com/users/KabaevAnton/events{/privacy}",
"received_events_url": "https://api.github.com/users/KabaevAnton/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-16T11:11:47 | 2025-07-27T08:02:42 | 2025-07-27T08:02:42 | NONE | null | null | null | null | ### System Info
linux ubuntu 22.04
Python 3.12.4
transformers 4.52.4
flash-attn 2.8.0.post2
### Who can help?
@ArthurZucker
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. pip install flash-attn
2. config = AutoConfig.from_pretrained("answerdotai/ModernBERT-base")
3. model = AutoModelForMaskedLM.from_config(config)
4. TypeError: RotaryEmbedding.__init__() got an unexpected keyword argument 'pos_idx_in_fp32'
The issue is as follows: when you try to create a ModernBert model with flash attention, it uses
```python
class ModernBertUnpaddedRotaryEmbedding(RotaryEmbedding):
    """
    The rotary position embeddings applied directly to unpadded sequences.
    """

    def __init__(
        self,
        dim: int,
        base: float = 10000.0,
        max_seqlen: Optional[int] = None,
        device: Optional[torch.device] = None,
        dtype: Optional[torch.dtype] = None,
    ):
        """
        max_seqlen: if max_seqlen, device, and dtype are provided, we precompute the cos_sin_cache
        up to max_seqlen. If the max_seqlen, device, or dtype during training/inference differ,
        the cos_sin_cache will be recomputed during the forward pass.
        """
        super().__init__(dim=dim, base=base, pos_idx_in_fp32=True, device=device, interleaved=False)
        self.max_seqlen = max_seqlen
        if max_seqlen is not None and device is not None and dtype is not None:
            self._update_cos_sin_cache(max_seqlen, device=device, dtype=dtype)
```
`ModernBertUnpaddedRotaryEmbedding` passes the `pos_idx_in_fp32` parameter to its superclass `RotaryEmbedding`, but the only parameter that class accepts is `dim`:
```python
class RotaryEmbedding(torch.nn.Module):
    """
    Rotary position embeddings based on those in
    [RoFormer](https://huggingface.co/docs/transformers/model_doc/roformer). Query and keys are transformed by rotation
    matrices which depend on their relative positions.
    """

    def __init__(self, dim: int):
        super().__init__()
        # Generate and save the inverse frequency buffer (non trainable)
        inv_freq = 1.0 / (10000 ** (torch.arange(0, dim, 2, dtype=torch.int64).float() / dim))
        inv_freq = inv_freq
        self.register_buffer("inv_freq", inv_freq)
        self._seq_len_cached = None
        self._cos_cached = None
        self._sin_cached = None
```
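One generic way to paper over this kind of signature mismatch (a hypothetical workaround sketch, not the transformers fix — the class and helper names below are invented for illustration) is to inspect the installed base class and forward only the keyword arguments it actually accepts:

```python
import inspect

# Stand-in for an older base class that only accepts `dim`.
class RotaryEmbeddingOld:
    def __init__(self, dim: int):
        self.dim = dim

def accepted_kwargs(base_cls, candidate_kwargs):
    # Keep only the kwargs that appear in the base __init__ signature.
    params = inspect.signature(base_cls.__init__).parameters
    return {k: v for k, v in candidate_kwargs.items() if k in params}

kwargs = accepted_kwargs(
    RotaryEmbeddingOld,
    {"dim": 8, "pos_idx_in_fp32": True, "interleaved": False},
)
print(kwargs)  # {'dim': 8}
```

Extra arguments like `pos_idx_in_fp32` are silently dropped when the base class does not support them, so `super().__init__(**kwargs)` would no longer raise a `TypeError`.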
### Expected behavior
Works without error | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38843/reactions",
"total_count": 7,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 4
} | https://api.github.com/repos/huggingface/transformers/issues/38843/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38842 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38842/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38842/comments | https://api.github.com/repos/huggingface/transformers/issues/38842/events | https://github.com/huggingface/transformers/pull/38842 | 3,149,550,921 | PR_kwDOCUB6oc6asY9E | 38,842 | Fix incorrect width ratio calculation in Llama4 image processor | {
"login": "Jingxiang-Zhang",
"id": 50895846,
"node_id": "MDQ6VXNlcjUwODk1ODQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/50895846?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jingxiang-Zhang",
"html_url": "https://github.com/Jingxiang-Zhang",
"followers_url": "https://api.github.com/users/Jingxiang-Zhang/followers",
"following_url": "https://api.github.com/users/Jingxiang-Zhang/following{/other_user}",
"gists_url": "https://api.github.com/users/Jingxiang-Zhang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jingxiang-Zhang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jingxiang-Zhang/subscriptions",
"organizations_url": "https://api.github.com/users/Jingxiang-Zhang/orgs",
"repos_url": "https://api.github.com/users/Jingxiang-Zhang/repos",
"events_url": "https://api.github.com/users/Jingxiang-Zhang/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jingxiang-Zhang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T10:55:50 | 2025-06-17T07:34:14 | 2025-06-17T07:33:37 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38842",
"html_url": "https://github.com/huggingface/transformers/pull/38842",
"diff_url": "https://github.com/huggingface/transformers/pull/38842.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38842.patch",
"merged_at": "2025-06-17T07:33:36"
} | In the `Llama4ImageProcessorFast` class ([source](https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama4/image_processing_llama4_fast.py#L432)), the image resizing ratios for height and width are incorrectly calculated using the image height for both dimensions:
```python
ratio_h, ratio_w = (
target_size[0] // size.height,
target_size[1] // size.height,
)
```
Here, `size` is defined as:
```python
size = {"height": 336, "width": 336}
```
This configuration is also present in the [preprocessor_config.json](https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E/blob/main/preprocessor_config.json):
```python
"size": {
"height": 336,
"width": 336
}
```
Currently, the bug does not cause visible issues because the height and width are equal. However, this would lead to incorrect behavior if the dimensions were different.
**Suggested Fix**:
Update the width ratio calculation to use `size.width` instead of `size.height`:
```python
ratio_h, ratio_w = (
target_size[0] // size.height,
target_size[1] // size.width,
)
```
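A quick standalone sketch (illustrative numbers, not the processor code) makes the difference concrete: with a square tile size the bug is invisible, but with a non-square tile size the height-only version computes the wrong width ratio.

```python
def tile_ratios(target_size, size_height, size_width):
    # Corrected calculation: the width ratio divides by size_width.
    return target_size[0] // size_height, target_size[1] // size_width

# Square tile size: bug is invisible, both versions agree.
print(tile_ratios((672, 672), 336, 336))  # (2, 2)
# Non-square tile size: the buggy version (height used for both
# dimensions) would return (2, 2); the corrected one returns:
print(tile_ratios((672, 672), 336, 168))  # (2, 4)
```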
This change ensures correct aspect ratio handling in cases where the input size is not square. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38842/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38842/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38841 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38841/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38841/comments | https://api.github.com/repos/huggingface/transformers/issues/38841/events | https://github.com/huggingface/transformers/pull/38841 | 3,149,050,028 | PR_kwDOCUB6oc6aqsB0 | 38,841 | Fix peft integration | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T08:31:28 | 2025-06-16T08:44:43 | 2025-06-16T08:39:25 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38841",
"html_url": "https://github.com/huggingface/transformers/pull/38841",
"diff_url": "https://github.com/huggingface/transformers/pull/38841.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38841.patch",
"merged_at": "2025-06-16T08:39:25"
} | # What does this PR do?
Otherwise there is a circular import; as the list is only for BC, it's fine to copy it.
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38841/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38841/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38840 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38840/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38840/comments | https://api.github.com/repos/huggingface/transformers/issues/38840/events | https://github.com/huggingface/transformers/pull/38840 | 3,148,892,510 | PR_kwDOCUB6oc6aqKLz | 38,840 | [video processor] fix BC when no video config if found | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T07:37:10 | 2025-06-17T07:20:17 | 2025-06-17T07:20:17 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38840",
"html_url": "https://github.com/huggingface/transformers/pull/38840",
"diff_url": "https://github.com/huggingface/transformers/pull/38840.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38840.patch",
"merged_at": "2025-06-17T07:20:16"
} | # What does this PR do?
For BC we need to check whether there is an image processor, not a feature extractor. Previously, all video processors could only be an instance of an image processor.
Reported in https://github.com/huggingface/transformers/issues/38665#issuecomment-2975331102 | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38840/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38840/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38839 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38839/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38839/comments | https://api.github.com/repos/huggingface/transformers/issues/38839/events | https://github.com/huggingface/transformers/pull/38839 | 3,148,796,225 | PR_kwDOCUB6oc6ap1hK | 38,839 | [DO NOT MERGE] Testing saftensors 0.6.0 | {
"login": "Narsil",
"id": 204321,
"node_id": "MDQ6VXNlcjIwNDMyMQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/204321?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Narsil",
"html_url": "https://github.com/Narsil",
"followers_url": "https://api.github.com/users/Narsil/followers",
"following_url": "https://api.github.com/users/Narsil/following{/other_user}",
"gists_url": "https://api.github.com/users/Narsil/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Narsil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Narsil/subscriptions",
"organizations_url": "https://api.github.com/users/Narsil/orgs",
"repos_url": "https://api.github.com/users/Narsil/repos",
"events_url": "https://api.github.com/users/Narsil/events{/privacy}",
"received_events_url": "https://api.github.com/users/Narsil/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-06-16T07:00:50 | 2025-06-16T14:35:59 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38839",
"html_url": "https://github.com/huggingface/transformers/pull/38839",
"diff_url": "https://github.com/huggingface/transformers/pull/38839.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38839.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38839/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38839/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38838 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38838/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38838/comments | https://api.github.com/repos/huggingface/transformers/issues/38838/events | https://github.com/huggingface/transformers/pull/38838 | 3,148,790,968 | PR_kwDOCUB6oc6ap0Zo | 38,838 | Delete deprecated stuff | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T06:58:41 | 2025-07-10T05:18:45 | 2025-07-10T05:18:44 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38838",
"html_url": "https://github.com/huggingface/transformers/pull/38838",
"diff_url": "https://github.com/huggingface/transformers/pull/38838.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38838.patch",
"merged_at": "2025-07-10T05:18:44"
} | # What does this PR do?
As per title. Removes
- deprecated legacy cache
- the deprecation we had for the new processor API
- `**rope_kwargs` from the RoPE API
- `_seen_tokens` in cache classes
First review @gante as most modifications are around cache/generation | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38838/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38838/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38837 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38837/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38837/comments | https://api.github.com/repos/huggingface/transformers/issues/38837/events | https://github.com/huggingface/transformers/issues/38837 | 3,148,755,950 | I_kwDOCUB6oc67rjPu | 38,837 | Loss is incorrectly scaled in Trainer during the last step with gradient accumulation when the final batch is smaller than accumulation steps. | {
"login": "hutaiHang",
"id": 77798564,
"node_id": "MDQ6VXNlcjc3Nzk4NTY0",
"avatar_url": "https://avatars.githubusercontent.com/u/77798564?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hutaiHang",
"html_url": "https://github.com/hutaiHang",
"followers_url": "https://api.github.com/users/hutaiHang/followers",
"following_url": "https://api.github.com/users/hutaiHang/following{/other_user}",
"gists_url": "https://api.github.com/users/hutaiHang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hutaiHang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hutaiHang/subscriptions",
"organizations_url": "https://api.github.com/users/hutaiHang/orgs",
"repos_url": "https://api.github.com/users/hutaiHang/repos",
"events_url": "https://api.github.com/users/hutaiHang/events{/privacy}",
"received_events_url": "https://api.github.com/users/hutaiHang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-06-16T06:44:03 | 2025-07-29T15:12:32 | 2025-07-29T15:12:32 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: Linux-5.10.134-010.ali5000.al8.x86_64-x86_64-with-glibc2.32
- Python version: 3.10.16
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: 0.15.4
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA A800-SXM4-80GB
### Who can help?
@zach-huggingface @SunMarc
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
When using `gradient_accumulation_steps` in the Trainer, the calculated loss is divided by this number before the backward pass. As shown in [this](https://github.com/huggingface/transformers/blob/d5d007a1a0f0c11a726a54c8f00bd71825f84d02/src/transformers/trainer.py#L3790):
```python
if (not self.model_accepts_loss_kwargs or num_items_in_batch is None) and self.compute_loss_func is None:
loss = loss / self.args.gradient_accumulation_steps
```
This is intended to average the loss over the accumulated steps. However, a problem arises on the very last training step if the remaining number of batches in the dataloader is less than gradient_accumulation_steps.
As shown in [this](https://github.com/huggingface/transformers/blob/d5d007a1a0f0c11a726a54c8f00bd71825f84d02/src/transformers/trainer.py#L2501C1-L2503C59), the last update step uses `num_batches = remainder`, which can be smaller than `args.gradient_accumulation_steps`:
```python
num_batches = args.gradient_accumulation_steps if update_step != (total_updates - 1) else remainder
batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches, args.device)
for i, inputs in enumerate(batch_samples):
    xxxx
```
In this scenario, the loss is still divided by the full gradient_accumulation_steps, even though the actual number of accumulated batches is smaller. This results in a final loss value that is artificially small, leading to an incorrect gradient magnitude for the final optimization step.
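The arithmetic behind the mis-scaling can be sketched in a few lines (a hedged illustration; the per-batch loss value is taken from the log output shown later in this report):

```python
# Hedged numeric sketch of the mis-scaling (loss value illustrative).
gradient_accumulation_steps = 2
per_batch_loss = 0.9984

# A full accumulation cycle: 2 batches, each divided by 2, then summed -> correct mean.
full_cycle = sum(per_batch_loss / gradient_accumulation_steps for _ in range(2))

# The final cycle holds only 1 remaining batch, yet is still divided by 2 -> half the true loss.
last_cycle = per_batch_loss / gradient_accumulation_steps

print(full_cycle)  # 0.9984
print(last_cycle)  # 0.4992
```

Dividing the final cycle by the number of batches actually accumulated (here, 1) instead of `gradient_accumulation_steps` would restore the correct magnitude.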
**To Reproduce**
+ Initialize a Trainer.
+ Use a dataset where the total number of samples is not perfectly divisible by per_device_train_batch_size * gradient_accumulation_steps.
+ Train the model for one epoch.
+ Observe the loss value on the final logging step. It will be significantly smaller than the others if the last accumulation cycle has fewer batches than gradient_accumulation_steps.
A simple code example is below:
```python
import os
os.environ['CUDA_VISIBLE_DEVICES'] = "0"
import torch
from torch.utils.data import TensorDataset
from transformers import (
AutoModelForSequenceClassification,
Trainer,
TrainingArguments,
AutoConfig
)
from transformers.utils import logging as hf_logging
# 1. Define the model and tokenizer
model_name = "bert-base-uncased"
config = AutoConfig.from_pretrained(model_name)
# Set all dropout probabilities to 0.0 to eliminate the randomness of each forward pass
config.hidden_dropout_prob = 0.0
config.attention_probs_dropout_prob = 0.0
model = AutoModelForSequenceClassification.from_pretrained(
model_name,
config=config,
)
# 2. Create a simple dataset
# Total 10 samples, batch_size=2, gradient accumulation=2
# This results in 5 batches in total. The first 4 batches complete two full gradient updates.
# The 5th batch is the last one, forming an accumulation cycle by itself, but with only one batch.
num_samples = 10
train_dataset = [
{
"input_ids": torch.randint(100, 2000, (8,)), # random generate
"attention_mask": torch.ones(8),
"labels": torch.randint(0, 2, (1,)).item()
}
]*num_samples
# 3. Set training parameters
training_args = TrainingArguments(
output_dir="./results",
per_device_train_batch_size=2,
gradient_accumulation_steps=2,
num_train_epochs=2,
logging_steps=1,
report_to="none",
lr_scheduler_type = "constant",
    learning_rate = 0.0  # so parameters are never updated
)
# 4. training
trainer = Trainer(
model=model,
args=training_args,
train_dataset=train_dataset,
)
trainer.train()
```
Running the code, the output log is:
> {'loss': 0.9984, 'grad_norm': 36.21815490722656, 'learning_rate': 0.0, 'epoch': 0.4}
> {'loss': 0.9984, 'grad_norm': 36.21815490722656, 'learning_rate': 0.0, 'epoch': 0.8}
> {'loss': 0.4992, 'grad_norm': 18.10907745361328, 'learning_rate': 0.0, 'epoch': 1.0} <-- **The problem!**
> {'loss': 0.9984, 'grad_norm': 36.21815490722656, 'learning_rate': 0.0, 'epoch': 1.4}
### Expected behavior
The loss scaling should be adjusted based on the actual number of batches accumulated in a given cycle. For the final (and potentially incomplete) accumulation cycle, the loss should be divided by the number of batches actually processed in that cycle, not by the total `gradient_accumulation_steps`. | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38837/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38837/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38836 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38836/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38836/comments | https://api.github.com/repos/huggingface/transformers/issues/38836/events | https://github.com/huggingface/transformers/pull/38836 | 3,148,552,836 | PR_kwDOCUB6oc6apBKk | 38,836 | Fix DTensor import compatibility for PyTorch < 2.5 | {
"login": "Benoqtr",
"id": 155428839,
"node_id": "U_kgDOCUOn5w",
"avatar_url": "https://avatars.githubusercontent.com/u/155428839?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Benoqtr",
"html_url": "https://github.com/Benoqtr",
"followers_url": "https://api.github.com/users/Benoqtr/followers",
"following_url": "https://api.github.com/users/Benoqtr/following{/other_user}",
"gists_url": "https://api.github.com/users/Benoqtr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Benoqtr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Benoqtr/subscriptions",
"organizations_url": "https://api.github.com/users/Benoqtr/orgs",
"repos_url": "https://api.github.com/users/Benoqtr/repos",
"events_url": "https://api.github.com/users/Benoqtr/events{/privacy}",
"received_events_url": "https://api.github.com/users/Benoqtr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T05:04:20 | 2025-06-23T09:25:57 | 2025-06-23T09:25:57 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38836",
"html_url": "https://github.com/huggingface/transformers/pull/38836",
"diff_url": "https://github.com/huggingface/transformers/pull/38836.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38836.patch",
"merged_at": "2025-06-23T09:25:56"
} | # What does this PR do?
This PR fixes a compatibility issue related to `DTensor` import introduced in PyTorch 2.5.
Previously, `DTensor` was imported only under the condition:
```python
if _torch_distributed_available and is_torch_greater_or_equal("2.5"):
from torch.distributed.tensor import DTensor
```
However, this leaves `DTensor` undefined on PyTorch versions below 2.5. As a result, any later use of
`isinstance(some_tensor, DTensor)` raises a `NameError`, because the conditional import was skipped. This PR addresses that issue.
### Changes made:
- Added a fallback `DTensor = None` to ensure the name is always defined.
- Updated downstream code to check that `DTensor` is not `None` before calling `isinstance(..., DTensor)`.
- Ensured safe, version-compatible handling of `DTensor` logic across PyTorch versions.
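The pattern above can be sketched as follows (a hedged illustration of the approach, not the PR's exact diff; the `is_dtensor` helper name is mine):

```python
# Guarded import: keep the DTensor name defined on every PyTorch version.
try:
    from torch.distributed.tensor import DTensor  # present in torch >= 2.5
except ImportError:
    DTensor = None  # fallback so later references never raise NameError

def is_dtensor(obj) -> bool:
    """isinstance check that is safe even when DTensor is unavailable."""
    return DTensor is not None and isinstance(obj, DTensor)

print(is_dtensor([1.0, 2.0]))  # False whether or not DTensor could be imported
```

The key point is that `isinstance(..., DTensor)` is only reached when the import succeeded, so the check degrades gracefully instead of crashing on older versions.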
---
Fixes: N/A (no open issue, but addresses latent compatibility bug)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38836/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38836/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38835 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38835/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38835/comments | https://api.github.com/repos/huggingface/transformers/issues/38835/events | https://github.com/huggingface/transformers/pull/38835 | 3,148,233,911 | PR_kwDOCUB6oc6an-0J | 38,835 | Update roc bert docs | {
"login": "SohamPrabhu",
"id": 62270341,
"node_id": "MDQ6VXNlcjYyMjcwMzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/62270341?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SohamPrabhu",
"html_url": "https://github.com/SohamPrabhu",
"followers_url": "https://api.github.com/users/SohamPrabhu/followers",
"following_url": "https://api.github.com/users/SohamPrabhu/following{/other_user}",
"gists_url": "https://api.github.com/users/SohamPrabhu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SohamPrabhu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SohamPrabhu/subscriptions",
"organizations_url": "https://api.github.com/users/SohamPrabhu/orgs",
"repos_url": "https://api.github.com/users/SohamPrabhu/repos",
"events_url": "https://api.github.com/users/SohamPrabhu/events{/privacy}",
"received_events_url": "https://api.github.com/users/SohamPrabhu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-16T00:56:37 | 2025-06-17T18:02:19 | 2025-06-17T18:02:19 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38835",
"html_url": "https://github.com/huggingface/transformers/pull/38835",
"diff_url": "https://github.com/huggingface/transformers/pull/38835.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38835.patch",
"merged_at": "2025-06-17T18:02:19"
} | # What does this PR do?
This PR updates the RoCBert model card with the changes requested in the linked issue.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
#36979 (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@stevhliu
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38835/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38835/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38834 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38834/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38834/comments | https://api.github.com/repos/huggingface/transformers/issues/38834/events | https://github.com/huggingface/transformers/pull/38834 | 3,147,863,515 | PR_kwDOCUB6oc6am2Hg | 38,834 | Fix broken notebooks link in Italian training docs | {
"login": "VolodymyrBg",
"id": 189780094,
"node_id": "U_kgDOC0_Qfg",
"avatar_url": "https://avatars.githubusercontent.com/u/189780094?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/VolodymyrBg",
"html_url": "https://github.com/VolodymyrBg",
"followers_url": "https://api.github.com/users/VolodymyrBg/followers",
"following_url": "https://api.github.com/users/VolodymyrBg/following{/other_user}",
"gists_url": "https://api.github.com/users/VolodymyrBg/gists{/gist_id}",
"starred_url": "https://api.github.com/users/VolodymyrBg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/VolodymyrBg/subscriptions",
"organizations_url": "https://api.github.com/users/VolodymyrBg/orgs",
"repos_url": "https://api.github.com/users/VolodymyrBg/repos",
"events_url": "https://api.github.com/users/VolodymyrBg/events{/privacy}",
"received_events_url": "https://api.github.com/users/VolodymyrBg/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-06-15T17:35:51 | 2025-06-16T14:38:52 | 2025-06-16T14:38:52 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38834",
"html_url": "https://github.com/huggingface/transformers/pull/38834",
"diff_url": "https://github.com/huggingface/transformers/pull/38834.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38834.patch",
"merged_at": "2025-06-16T14:38:52"
} | Replaced the relative link to the notebooks directory in docs/source/it/training.md with an external link to the official Hugging Face Notebooks repository on GitHub. This ensures users can always access the latest and correct set of example notebooks. | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38834/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38834/timeline | null | null | null | null | true | true |