url string | repository_url string | labels_url string | comments_url string | events_url string | html_url string | id int64 | node_id string | number int64 | title string | user dict | labels list | state string | locked bool | assignee dict | assignees list | milestone null | comments list | created_at timestamp[ms] | updated_at timestamp[ms] | closed_at timestamp[ms] | author_association string | type dict | active_lock_reason null | draft bool | pull_request dict | body string | closed_by dict | reactions dict | timeline_url string | performed_via_github_app null | state_reason string | sub_issues_summary dict | issue_dependencies_summary dict | is_pull_request bool | is_closed bool |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/37621 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37621/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37621/comments | https://api.github.com/repos/huggingface/transformers/issues/37621/events | https://github.com/huggingface/transformers/pull/37621 | 3,005,479,069 | PR_kwDOCUB6oc6TJnCS | 37,621 | Fix ValueError when eval_do_concat_batches=False with examples | {
"login": "jeffhataws",
"id": 56947987,
"node_id": "MDQ6VXNlcjU2OTQ3OTg3",
"avatar_url": "https://avatars.githubusercontent.com/u/56947987?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jeffhataws",
"html_url": "https://github.com/jeffhataws",
"followers_url": "https://api.github.com/users/jeffhataws/followers",
"following_url": "https://api.github.com/users/jeffhataws/following{/other_user}",
"gists_url": "https://api.github.com/users/jeffhataws/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jeffhataws/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeffhataws/subscriptions",
"organizations_url": "https://api.github.com/users/jeffhataws/orgs",
"repos_url": "https://api.github.com/users/jeffhataws/repos",
"events_url": "https://api.github.com/users/jeffhataws/events{/privacy}",
"received_events_url": "https://api.github.com/users/jeffhataws/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T18:03:10 | 2025-04-22T10:13:26 | 2025-04-22T10:13:25 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37621",
"html_url": "https://github.com/huggingface/transformers/pull/37621",
"diff_url": "https://github.com/huggingface/transformers/pull/37621.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37621.patch",
"merged_at": "2025-04-22T10:13:25"
} | https://github.com/huggingface/transformers/issues/37593
# What does this PR do?
Fixes [#37593](https://github.com/huggingface/transformers/issues/37593)
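For context, here is a toy sketch (a simplified pure-Python illustration, not the actual `Trainer` code) of the difference `eval_do_concat_batches=False` makes: eval outputs are kept as a list of per-batch results instead of being concatenated into one flat sequence, and downstream code that assumes the flat form is where the `ValueError` came from.

```python
# Toy sketch (assumption: simplified, not the actual Trainer implementation)
# of accumulating eval-loop outputs with and without batch concatenation.
def accumulate(batches, do_concat=True):
    if do_concat:
        # default behavior: one flat sequence across all batches
        out = []
        for b in batches:
            out.extend(b)
        return out
    # eval_do_concat_batches=False: keep a list of per-batch outputs;
    # consumers must handle the extra level of nesting
    return list(batches)

print(accumulate([[1, 2], [3, 4]], do_concat=True))   # [1, 2, 3, 4]
print(accumulate([[1, 2], [3, 4]], do_concat=False))  # [[1, 2], [3, 4]]
```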
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [X] Was this discussed/approved via a GitHub issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@SunMarc
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37621/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37621/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37620 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37620/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37620/comments | https://api.github.com/repos/huggingface/transformers/issues/37620/events | https://github.com/huggingface/transformers/pull/37620 | 3,005,448,414 | PR_kwDOCUB6oc6TJgES | 37,620 | Fix InternVL attention when using qk_norm (38B and 78B) | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T17:49:26 | 2025-04-19T19:39:09 | 2025-04-19T19:39:08 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37620",
"html_url": "https://github.com/huggingface/transformers/pull/37620",
"diff_url": "https://github.com/huggingface/transformers/pull/37620.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37620.patch",
"merged_at": "2025-04-19T19:39:08"
} | Fixes an issue with InternVLVision attention where the qk norm was not applied correctly.
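As a toy illustration (a simplified single-head, pure-Python sketch — not the transformers InternVL implementation): with `qk_norm` enabled, the query and key vectors must be normalized *before* the attention score is computed; applying the normalization anywhere else silently changes the scores.

```python
import math

# Toy sketch (assumption: simplified single-head attention score;
# not the transformers InternVL code).
def rms_norm(vec, eps=1e-6):
    # scale each element so the vector has unit root-mean-square
    scale = math.sqrt(sum(x * x for x in vec) / len(vec) + eps)
    return [x / scale for x in vec]

def attn_score(q, k, use_qk_norm=True):
    if use_qk_norm:
        # the norm is applied to q and k *before* the dot product
        q, k = rms_norm(q), rms_norm(k)
    dot = sum(a * b for a, b in zip(q, k))
    return dot / math.sqrt(len(q))
```

With `q = k = [1.0, 0.0]`, the normalized score differs from the un-normalized one, which is why applying the norm in the wrong place produces wrong attention outputs rather than an error.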
Cc @Cyrilvallez | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37620/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37620/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37619 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37619/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37619/comments | https://api.github.com/repos/huggingface/transformers/issues/37619/events | https://github.com/huggingface/transformers/pull/37619 | 3,005,445,560 | PR_kwDOCUB6oc6TJfcz | 37,619 | Updated model card for mbart and mbart50 | {
"login": "Vishesh-Mistry",
"id": 102403616,
"node_id": "U_kgDOBhqOIA",
"avatar_url": "https://avatars.githubusercontent.com/u/102403616?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Vishesh-Mistry",
"html_url": "https://github.com/Vishesh-Mistry",
"followers_url": "https://api.github.com/users/Vishesh-Mistry/followers",
"following_url": "https://api.github.com/users/Vishesh-Mistry/following{/other_user}",
"gists_url": "https://api.github.com/users/Vishesh-Mistry/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Vishesh-Mistry/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Vishesh-Mistry/subscriptions",
"organizations_url": "https://api.github.com/users/Vishesh-Mistry/orgs",
"repos_url": "https://api.github.com/users/Vishesh-Mistry/repos",
"events_url": "https://api.github.com/users/Vishesh-Mistry/events{/privacy}",
"received_events_url": "https://api.github.com/users/Vishesh-Mistry/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T17:46:51 | 2025-04-22T19:26:47 | 2025-04-22T19:26:47 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37619",
"html_url": "https://github.com/huggingface/transformers/pull/37619",
"diff_url": "https://github.com/huggingface/transformers/pull/37619.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37619.patch",
"merged_at": "2025-04-22T19:26:47"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Addresses #36979 so that the documentation for mbart and mbart50 align with the standardized format for all model cards.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu, please let me know about any changes that may be required.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37619/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37619/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37618 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37618/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37618/comments | https://api.github.com/repos/huggingface/transformers/issues/37618/events | https://github.com/huggingface/transformers/pull/37618 | 3,005,432,200 | PR_kwDOCUB6oc6TJckY | 37,618 | Bump torch from 2.2.0 to 2.6.0 in /examples/flax/vision | {
"login": "dependabot[bot]",
"id": 49699333,
"node_id": "MDM6Qm90NDk2OTkzMzM=",
"avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dependabot%5Bbot%5D",
"html_url": "https://github.com/apps/dependabot",
"followers_url": "https://api.github.com/users/dependabot%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/dependabot%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/dependabot%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dependabot%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dependabot%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/dependabot%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/dependabot%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/dependabot%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/dependabot%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1905493434,
"node_id": "MDU6TGFiZWwxOTA1NDkzNDM0",
"url": "https://api.github.com/repos/huggingface/transformers/labels/dependencies",
"name": "dependencies",
"color": "0366d6",
"default": false,
"description": "Pull requests that update a dependency file"
},
{
"id": 6410654816,
"node_id": "LA_kwDOCUB6oc8AAAABfhrUYA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/python",
"name": "python",
"color": "2b67c6",
"default": false,
"description": "Pull requests that update Python code"
}
] | closed | false | null | [] | null | [] | 2025-04-18T17:37:43 | 2025-05-30T13:04:54 | 2025-05-30T13:04:53 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37618",
"html_url": "https://github.com/huggingface/transformers/pull/37618",
"diff_url": "https://github.com/huggingface/transformers/pull/37618.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37618.patch",
"merged_at": "2025-05-30T13:04:53"
} | Bumps [torch](https://github.com/pytorch/pytorch) from 2.2.0 to 2.6.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/pytorch/pytorch/releases">torch's releases</a>.</em></p>
<blockquote>
<h2>PyTorch 2.6.0 Release</h2>
<ul>
<li>Highlights</li>
<li>Tracked Regressions</li>
<li>Backwards Incompatible Change</li>
<li>Deprecations</li>
<li>New Features</li>
<li>Improvements</li>
<li>Bug fixes</li>
<li>Performance</li>
<li>Documentation</li>
<li>Developers</li>
</ul>
<h2><strong>Highlights</strong></h2>
<p>We are excited to announce the release of PyTorch® 2.6 (<a href="https://github.com/pytorch/pytorch/releases/tag/v2.6.0">release notes</a>)! This release features multiple improvements for PT2: <code>torch.compile</code> can now be used with Python 3.13; new performance-related knob <code>torch.compiler.set_stance</code>; several AOTInductor enhancements. Besides the PT2 improvements, another highlight is FP16 support on X86 CPUs.</p>
<p>NOTE: Starting with this release we are not going to publish on Conda, please see <a href="https://redirect.github.com/pytorch/pytorch/issues/138506">[Announcement] Deprecating PyTorch’s official Anaconda channel</a> for the details.</p>
<p>For this release the experimental Linux binaries shipped with CUDA 12.6.3 (as well as Linux Aarch64, Linux ROCm 6.2.4, and Linux XPU binaries) are built with CXX11_ABI=1 and are <a href="https://dev-discuss.pytorch.org/t/pytorch-linux-wheels-switching-to-new-wheel-build-platform-manylinux-2-28-on-november-12-2024/2581">using the Manylinux 2.28 build platform</a>. If you build PyTorch extensions with custom C++ or CUDA extensions, please update these builds to use CXX_ABI=1 as well and report any issues you are seeing. For the next PyTorch 2.7 release we plan to switch all Linux builds to Manylinux 2.28 and CXX11_ABI=1, please see <a href="https://redirect.github.com/pytorch/pytorch/issues/123649">[RFC] PyTorch next wheel build platform: manylinux-2.28</a> for the details and discussion.</p>
<p>Also in this release as an important security improvement measure we have changed the default value for <code>weights_only</code> parameter of <code>torch.load</code>. This is a backward compatibility-breaking change, please see <a href="https://dev-discuss.pytorch.org/t/bc-breaking-change-torch-load-is-being-flipped-to-use-weights-only-true-by-default-in-the-nightlies-after-137602/2573">this forum post</a> for more details.</p>
<p>This release is composed of 3892 commits from 520 contributors since PyTorch 2.5. We want to sincerely thank our dedicated community for your contributions. As always, we encourage you to try these out and report any issues as we improve PyTorch. More information about how to get started with the PyTorch 2-series can be found at our <a href="https://pytorch.org/get-started/pytorch-2.0/">Getting Started</a> page.</p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="https://github.com/pytorch/pytorch/commit/1eba9b3aa3c43f86f4a2c807ac8e12c4a7767340"><code>1eba9b3</code></a> change the test wheel to release wheel when release wheel available (<a href="https://redirect.github.com/pytorch/pytorch/issues/145884">#145884</a>)</li>
<li><a href="https://github.com/pytorch/pytorch/commit/2236df1770800ffea5697b11b0bb0d910b2e59e1"><code>2236df1</code></a> [CUDA] Change slim-wheel libraries load order (<a href="https://redirect.github.com/pytorch/pytorch/issues/145662">#145662</a>)</li>
<li><a href="https://github.com/pytorch/pytorch/commit/32070409668513144e48390380a3911fc5cd03e0"><code>3207040</code></a> [CD] Fix slim-wheel cuda_nvrtc import problem (<a href="https://redirect.github.com/pytorch/pytorch/issues/145614">#145614</a>)</li>
<li><a href="https://github.com/pytorch/pytorch/commit/ca3c3a63b8087d405babf95fcce263cf1b8dd35e"><code>ca3c3a6</code></a> [Release-Only] Remove ptx from Linux CUDA 12.6 binary builds (<a href="https://redirect.github.com/pytorch/pytorch/issues/145616">#145616</a>)</li>
<li><a href="https://github.com/pytorch/pytorch/commit/7be6b5db475a5885379751953f67656ca1ce6edd"><code>7be6b5d</code></a> Fix IdentationError of code example (<a href="https://redirect.github.com/pytorch/pytorch/issues/145525">#145525</a>)</li>
<li><a href="https://github.com/pytorch/pytorch/commit/dcb8ad070f15e8444dcb4e415bde38142cda5e9d"><code>dcb8ad0</code></a> update get start xpu (<a href="https://redirect.github.com/pytorch/pytorch/issues/145286">#145286</a>)</li>
<li><a href="https://github.com/pytorch/pytorch/commit/8d4b8a920a2172523deb95bf20e8e52d50649c04"><code>8d4b8a9</code></a> Prevent legacy_load when weights_only=True (correctly) (<a href="https://redirect.github.com/pytorch/pytorch/issues/145111">#145111</a>)</li>
<li><a href="https://github.com/pytorch/pytorch/commit/9c34a2076b37c040b1b9a48b729656fe9e992949"><code>9c34a20</code></a> Revert "Prevent _legacy_load with weights_only=True (<a href="https://redirect.github.com/pytorch/pytorch/issues/144993">#144993</a>)"</li>
<li><a href="https://github.com/pytorch/pytorch/commit/cd15d7b29fea0886d1ae655da9bec767caa8c672"><code>cd15d7b</code></a> Prevent _legacy_load with weights_only=True (<a href="https://redirect.github.com/pytorch/pytorch/issues/144993">#144993</a>)</li>
<li><a href="https://github.com/pytorch/pytorch/commit/a2639bc255cd9eb93250c3bd1aea01f74538ad76"><code>a2639bc</code></a> [Release/2.6] Enable python-3.13t aarch64 builds (<a href="https://redirect.github.com/pytorch/pytorch/issues/144878">#144878</a>)</li>
<li>Additional commits viewable in <a href="https://github.com/pytorch/pytorch/compare/v2.2.0...v2.6.0">compare view</a></li>
</ul>
</details>
<br />
[](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
<details>
<summary>Dependabot commands and options</summary>
<br />
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/huggingface/transformers/network/alerts).
</details> | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37618/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37618/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37617 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37617/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37617/comments | https://api.github.com/repos/huggingface/transformers/issues/37617/events | https://github.com/huggingface/transformers/pull/37617 | 3,005,218,652 | PR_kwDOCUB6oc6TIuBj | 37,617 | 🚨 rm already deprecated padding max length | {
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github.com/users/itazap/followers",
"following_url": "https://api.github.com/users/itazap/following{/other_user}",
"gists_url": "https://api.github.com/users/itazap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itazap/subscriptions",
"organizations_url": "https://api.github.com/users/itazap/orgs",
"repos_url": "https://api.github.com/users/itazap/repos",
"events_url": "https://api.github.com/users/itazap/events{/privacy}",
"received_events_url": "https://api.github.com/users/itazap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T15:50:07 | 2025-05-01T13:21:57 | 2025-05-01T13:21:56 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37617",
"html_url": "https://github.com/huggingface/transformers/pull/37617",
"diff_url": "https://github.com/huggingface/transformers/pull/37617.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37617.patch",
"merged_at": "2025-05-01T13:21:56"
} | This has been deprecated since 2022, and there are no remaining instances of its use anywhere! Everything was converted from `pad_to_max_length=True` --> `padding=True, max_length=n`.
There are plenty of other tests covering the `padding` parameters, including `padding=True, max_length=n`, so it is safe to remove `def test_padding_to_max_length`.
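As a toy sketch (a simplified pure-Python illustration, not the tokenizers implementation) of the padding semantics the removed flag duplicated: padding either targets a fixed `max_length` or the longest sequence in the batch.

```python
# Toy sketch (assumption: simplified, not the tokenizers implementation) of
# the two padding behaviors the removed pad_to_max_length flag overlapped with.
def pad(sequences, max_length=None, pad_id=0):
    # with max_length set, every sequence is padded to that fixed length
    # (what pad_to_max_length=True used to do); otherwise pad to the
    # longest sequence in the batch (padding=True / "longest")
    target = max_length if max_length is not None else max(len(s) for s in sequences)
    return [s + [pad_id] * (target - len(s)) for s in sequences]

print(pad([[1, 2], [3]]))                # [[1, 2], [3, 0]]
print(pad([[1, 2], [3]], max_length=4))  # [[1, 2, 0, 0], [3, 0, 0, 0]]
```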
The overridden tests also need to be removed, but I want to confirm in review first. | {
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github.com/users/itazap/followers",
"following_url": "https://api.github.com/users/itazap/following{/other_user}",
"gists_url": "https://api.github.com/users/itazap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itazap/subscriptions",
"organizations_url": "https://api.github.com/users/itazap/orgs",
"repos_url": "https://api.github.com/users/itazap/repos",
"events_url": "https://api.github.com/users/itazap/events{/privacy}",
"received_events_url": "https://api.github.com/users/itazap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37617/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37617/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37616 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37616/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37616/comments | https://api.github.com/repos/huggingface/transformers/issues/37616/events | https://github.com/huggingface/transformers/pull/37616 | 3,005,032,159 | PR_kwDOCUB6oc6TIFYu | 37,616 | Fast image processor for VitMatte added and bug in slow version fixed | {
"login": "henrikm11",
"id": 126027334,
"node_id": "U_kgDOB4MGRg",
"avatar_url": "https://avatars.githubusercontent.com/u/126027334?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/henrikm11",
"html_url": "https://github.com/henrikm11",
"followers_url": "https://api.github.com/users/henrikm11/followers",
"following_url": "https://api.github.com/users/henrikm11/following{/other_user}",
"gists_url": "https://api.github.com/users/henrikm11/gists{/gist_id}",
"starred_url": "https://api.github.com/users/henrikm11/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/henrikm11/subscriptions",
"organizations_url": "https://api.github.com/users/henrikm11/orgs",
"repos_url": "https://api.github.com/users/henrikm11/repos",
"events_url": "https://api.github.com/users/henrikm11/events{/privacy}",
"received_events_url": "https://api.github.com/users/henrikm11/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T14:04:30 | 2025-04-28T18:51:51 | 2025-04-28T18:51:51 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37616",
"html_url": "https://github.com/huggingface/transformers/pull/37616",
"diff_url": "https://github.com/huggingface/transformers/pull/37616.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37616.patch",
"merged_at": "2025-04-28T18:51:50"
} | -) Added a fast image processor for VitMatte
-) fixed a bug in the slow image processor that handled images incorrectly for the input format ChannelDimension.FIRST: the trimaps were not added along the correct dimension. This bug was also reflected in the tests, which passed incorrectly shaped trimaps to the preprocess function.
-) adjusted old tests to also cover the fast image processor
-) expanded tests
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37616/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37616/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37615 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37615/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37615/comments | https://api.github.com/repos/huggingface/transformers/issues/37615/events | https://github.com/huggingface/transformers/issues/37615 | 3,004,987,631 | I_kwDOCUB6oc6zHHjv | 37,615 | Getting Warnings When Instantiating Object Detection Models Due to Meta Tensor Initialization | {
"login": "HichTala",
"id": 98521878,
"node_id": "U_kgDOBd9TFg",
"avatar_url": "https://avatars.githubusercontent.com/u/98521878?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HichTala",
"html_url": "https://github.com/HichTala",
"followers_url": "https://api.github.com/users/HichTala/followers",
"following_url": "https://api.github.com/users/HichTala/following{/other_user}",
"gists_url": "https://api.github.com/users/HichTala/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HichTala/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HichTala/subscriptions",
"organizations_url": "https://api.github.com/users/HichTala/orgs",
"repos_url": "https://api.github.com/users/HichTala/repos",
"events_url": "https://api.github.com/users/HichTala/events{/privacy}",
"received_events_url": "https://api.github.com/users/HichTala/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-18T13:40:57 | 2025-07-14T08:03:28 | 2025-07-14T08:03:28 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: Linux-6.11.0-21-generic-x86_64-with-glibc2.39
- Python version: 3.12.4
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: - compute_environment: LOCAL_MACHINE
- distributed_type: NO
- mixed_precision: no
- use_cpu: False
- debug: False
- num_processes: 1
- machine_rank: 0
- num_machines: 1
- gpu_ids: 1
- rdzv_backend: static
- same_network: True
- main_training_function: main
- enable_cpu_affinity: False
- downcast_bf16: no
- tpu_use_cluster: False
- tpu_use_sudo: False
- tpu_env: []
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA RTX 3500 Ada Generation Laptop GPU
### Who can help?
@amyeroberts, @qubvel
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
When instantiating object detection models (e.g., `facebook/detr-resnet-50`) using `AutoModelForObjectDetection`, PyTorch emits a flood of warnings.
Minimal Reproducible Example
```python
from transformers import AutoConfig, AutoModelForObjectDetection
config = AutoConfig.from_pretrained(
"facebook/detr-resnet-50",
)
model = AutoModelForObjectDetection.from_pretrained(
"facebook/detr-resnet-50",
config=config,
)
```
The warning may also be printed when instantiating other models through the auto mapping...
### Expected behavior
When instantiating object detection models (e.g., `facebook/detr-resnet-50`) using `AutoModelForObjectDetection`, PyTorch emits a flood of warnings like:
```shell
UserWarning: for module.name.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
```
From my understanding, the model is instantiated inside this `with` clause:
https://github.com/huggingface/transformers/blob/a1b82563f11d9101d54b06fd61aef8c90f63c9d2/src/transformers/modeling_utils.py#L4411-L4413
And here one of the `model_init_context` element is this function:
https://github.com/huggingface/transformers/blob/a1b82563f11d9101d54b06fd61aef8c90f63c9d2/src/transformers/integrations/accelerate.py#L36-L67
That puts the model on the `meta` device, which means (from my understanding) that the weights are placeholder tensors (they have shape/dtype but no real data). Later in the code, the real weights are loaded from the `state_dict` with PyTorch's `load_state_dict` function, but without passing the flag `assign=True`.
The thing is, PyTorch's `load_state_dict` function is called in `timm`'s `_builder` function:
https://github.com/huggingface/pytorch-image-models/blob/3ff38990264d7464ffd43ac2ddc26da04ff6355c/timm/models/_builder.py#L270
I am wondering if I should open an issue in the `timm` repo instead of this one?
In previous versions of `transformers`, the `model_init_context` list did not contain this context manager unless the variable `low_cpu_mem_usage` was `True`. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37615/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37615/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37614 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37614/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37614/comments | https://api.github.com/repos/huggingface/transformers/issues/37614/events | https://github.com/huggingface/transformers/pull/37614 | 3,004,960,849 | PR_kwDOCUB6oc6TH1jr | 37,614 | [test] update `test_past_key_values_format` | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T13:26:09 | 2025-04-22T10:07:37 | 2025-04-22T10:07:34 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37614",
"html_url": "https://github.com/huggingface/transformers/pull/37614",
"diff_url": "https://github.com/huggingface/transformers/pull/37614.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37614.patch",
"merged_at": "2025-04-22T10:07:34"
} | # What does this PR do?
`tests/models/speecht5/test_modeling_speecht5.py::SpeechT5ForSpeechToTextTest::test_past_key_values_format` was flaky, so a revisit was in order.
This PR rewrites `test_past_key_values_format` so that:
1. It works with `Cache` objects
2. We can pass custom expected cache shapes for advanced models like DeepSeek V3
3. The checks performed are exactly the same as before
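The kind of shape check the test performs can be sketched as follows. This is an illustration with hypothetical shape values and the legacy tuple cache format (one `(key, value)` pair per layer), not the actual test code:

```python
import torch

# Hypothetical expected shape for a vanilla decoder cache entry:
# (batch_size, num_key_value_heads, seq_len, head_dim).
batch_size, num_kv_heads, seq_len, head_dim = 2, 4, 7, 16
expected = (batch_size, num_kv_heads, seq_len, head_dim)

# A stand-in for `outputs.past_key_values` in the legacy tuple format.
past_key_values = [
    (torch.zeros(expected), torch.zeros(expected)) for _ in range(3)
]

# Check every layer's key and value tensors against the expected shape.
for key, value in past_key_values:
    assert tuple(key.shape) == expected
    assert tuple(value.shape) == expected
```

Models with non-standard caches (e.g. DeepSeek V3's compressed KV cache) would pass a different `expected` shape instead of the default one.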
In the process, many model-level skips were removed 🤗 | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37614/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37614/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37613 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37613/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37613/comments | https://api.github.com/repos/huggingface/transformers/issues/37613/events | https://github.com/huggingface/transformers/pull/37613 | 3,004,924,946 | PR_kwDOCUB6oc6THton | 37,613 | Fixes #37219 : RecurrentGemma crashes for inputs longer than sliding window length | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T13:07:31 | 2025-04-22T10:21:16 | 2025-04-22T10:21:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37613",
"html_url": "https://github.com/huggingface/transformers/pull/37613",
"diff_url": "https://github.com/huggingface/transformers/pull/37613.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37613.patch",
"merged_at": "2025-04-22T10:21:16"
} | ### Fix: Long-context bug in RecurrentGemma generation (#37219)
This PR resolves a shape mismatch error during generation with long prompts in `RecurrentGemma`, caused by the `attention_mask` not being cropped to the model’s sliding attention window.
The model expects a fixed-size attention window (`attention_window_size`) during decoding, but the generic `prepare_inputs_for_generation` does not enforce this. This leads to a mismatch in `_update_causal_mask()` when the prompt length exceeds the window.
**Fix:**
Crop `attention_mask` to the last `attention_window_size` tokens inside `_update_causal_mask()`.
This makes `RecurrentGemma` compatible with the general `prepare_inputs_for_generation`, and restores long-prompt generation without regressions.
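The cropping step can be sketched with plain tensor slicing (hypothetical sizes; the real fix lives inside `_update_causal_mask()`):

```python
import torch

# Hypothetical values: a prompt longer than the model's attention window.
attention_window_size = 2048
batch_size, prompt_len = 2, 3000
attention_mask = torch.ones(batch_size, prompt_len, dtype=torch.long)

# Keep only the last `attention_window_size` positions, matching the
# fixed-size window the recurrent blocks expect during decoding.
if attention_mask.shape[-1] > attention_window_size:
    attention_mask = attention_mask[:, -attention_window_size:]

assert attention_mask.shape[-1] == attention_window_size
```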
### Tests
There was a previous `long_context` test, but its prompt was only 313 tokens, shorter than the 2048-token context window. Added a new sequence to the test that covers this case.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@gante
| {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37613/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37613/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37612 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37612/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37612/comments | https://api.github.com/repos/huggingface/transformers/issues/37612/events | https://github.com/huggingface/transformers/pull/37612 | 3,004,883,730 | PR_kwDOCUB6oc6THkhD | 37,612 | [causal mask] fix preparation with multi-gpu | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T12:46:34 | 2025-04-25T07:34:18 | 2025-04-25T07:34:18 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37612",
"html_url": "https://github.com/huggingface/transformers/pull/37612",
"diff_url": "https://github.com/huggingface/transformers/pull/37612.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37612.patch",
"merged_at": "2025-04-25T07:34:18"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/37606
This happens when `cache_position` and `inputs_embeds` are on different devices, which doesn't affect LMs. It affects multimodal models only when the multimodal encoder is big and gets allocated to the first device, so `embed_tokens` gets allocated to the next device (`idx=1`). The error is triggered whenever a batched input is used, where a `causal_mask` has to be built instead of being `None`.
We could also modify only the VLMs' `_update_causal_mask`, but I think it's easier if we have one source of truth (llama) from which everything is copied. The current fix doesn't change anything for LMs, where `cache_position` and `inputs_embeds` are always on the same device.
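The device alignment itself is a one-liner; a sketch with CPU stand-ins (in practice the two tensors would sit on different CUDA devices, e.g. `cuda:0` and `cuda:1`):

```python
import torch

# Stand-ins for tensors that can end up on different devices when a large
# multimodal encoder pushes `embed_tokens` onto the next device.
inputs_embeds = torch.zeros(1, 5, 8)   # e.g. on "cuda:1" in practice
cache_position = torch.arange(5)       # e.g. on "cuda:0" in practice

# Align positions with the embeddings before building the causal mask.
cache_position = cache_position.to(inputs_embeds.device)

# Build a minimal causal mask: -inf above the diagonal, 0 elsewhere.
seq_len = inputs_embeds.shape[1]
causal_mask = torch.triu(
    torch.full((seq_len, seq_len), float("-inf")), diagonal=1
)
assert cache_position.device == inputs_embeds.device
```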
Added a small test that will fail on current `main`; the tests currently failing in CI are not related. Reviewers can look only at `llama` and `utils`; the other files are copied from
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37612/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37612/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37611 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37611/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37611/comments | https://api.github.com/repos/huggingface/transformers/issues/37611/events | https://github.com/huggingface/transformers/pull/37611 | 3,004,835,401 | PR_kwDOCUB6oc6THZlm | 37,611 | Add FastImageProcessor for InstructBLIPVideo | {
"login": "olccihyeon",
"id": 36918246,
"node_id": "MDQ6VXNlcjM2OTE4MjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/36918246?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/olccihyeon",
"html_url": "https://github.com/olccihyeon",
"followers_url": "https://api.github.com/users/olccihyeon/followers",
"following_url": "https://api.github.com/users/olccihyeon/following{/other_user}",
"gists_url": "https://api.github.com/users/olccihyeon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/olccihyeon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/olccihyeon/subscriptions",
"organizations_url": "https://api.github.com/users/olccihyeon/orgs",
"repos_url": "https://api.github.com/users/olccihyeon/repos",
"events_url": "https://api.github.com/users/olccihyeon/events{/privacy}",
"received_events_url": "https://api.github.com/users/olccihyeon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-18T12:24:10 | 2025-04-22T19:08:14 | null | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37611",
"html_url": "https://github.com/huggingface/transformers/pull/37611",
"diff_url": "https://github.com/huggingface/transformers/pull/37611.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37611.patch",
"merged_at": null
} | # What does this PR do?
Add Fast Image Processor for InstructBLIPVideo (Issue https://github.com/huggingface/transformers/issues/36978)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@yonigozlan
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37611/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37611/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37610 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37610/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37610/comments | https://api.github.com/repos/huggingface/transformers/issues/37610/events | https://github.com/huggingface/transformers/pull/37610 | 3,004,731,852 | PR_kwDOCUB6oc6THDTA | 37,610 | ✨ Add EoMT Model || 🚨 Fix Mask2Former loss calculation | {
"login": "yaswanth19",
"id": 82788246,
"node_id": "MDQ6VXNlcjgyNzg4MjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/82788246?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yaswanth19",
"html_url": "https://github.com/yaswanth19",
"followers_url": "https://api.github.com/users/yaswanth19/followers",
"following_url": "https://api.github.com/users/yaswanth19/following{/other_user}",
"gists_url": "https://api.github.com/users/yaswanth19/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yaswanth19/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yaswanth19/subscriptions",
"organizations_url": "https://api.github.com/users/yaswanth19/orgs",
"repos_url": "https://api.github.com/users/yaswanth19/repos",
"events_url": "https://api.github.com/users/yaswanth19/events{/privacy}",
"received_events_url": "https://api.github.com/users/yaswanth19/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
},
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-04-18T11:12:54 | 2025-06-27T14:29:06 | 2025-06-27T12:18:18 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37610",
"html_url": "https://github.com/huggingface/transformers/pull/37610",
"diff_url": "https://github.com/huggingface/transformers/pull/37610.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37610.patch",
"merged_at": "2025-06-27T12:18:18"
} | # What does this PR do?
Fixes #37171 and continuation of #37392
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37610/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37610/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37609 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37609/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37609/comments | https://api.github.com/repos/huggingface/transformers/issues/37609/events | https://github.com/huggingface/transformers/pull/37609 | 3,004,701,410 | PR_kwDOCUB6oc6TG8o4 | 37,609 | [Bugfix] Set default value for output_attentions parameter in Gemma3T… | {
"login": "chenin-wang",
"id": 88609203,
"node_id": "MDQ6VXNlcjg4NjA5MjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/88609203?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chenin-wang",
"html_url": "https://github.com/chenin-wang",
"followers_url": "https://api.github.com/users/chenin-wang/followers",
"following_url": "https://api.github.com/users/chenin-wang/following{/other_user}",
"gists_url": "https://api.github.com/users/chenin-wang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chenin-wang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chenin-wang/subscriptions",
"organizations_url": "https://api.github.com/users/chenin-wang/orgs",
"repos_url": "https://api.github.com/users/chenin-wang/repos",
"events_url": "https://api.github.com/users/chenin-wang/events{/privacy}",
"received_events_url": "https://api.github.com/users/chenin-wang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T10:51:43 | 2025-04-19T09:07:16 | 2025-04-19T09:03:54 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37609",
"html_url": "https://github.com/huggingface/transformers/pull/37609",
"diff_url": "https://github.com/huggingface/transformers/pull/37609.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37609.patch",
"merged_at": null
} | # What does this PR do?
[Bugfix] Set default value for output_attentions parameter in Gemma3TextModel
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a GitHub issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker @amyeroberts
| {
"login": "chenin-wang",
"id": 88609203,
"node_id": "MDQ6VXNlcjg4NjA5MjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/88609203?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chenin-wang",
"html_url": "https://github.com/chenin-wang",
"followers_url": "https://api.github.com/users/chenin-wang/followers",
"following_url": "https://api.github.com/users/chenin-wang/following{/other_user}",
"gists_url": "https://api.github.com/users/chenin-wang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chenin-wang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chenin-wang/subscriptions",
"organizations_url": "https://api.github.com/users/chenin-wang/orgs",
"repos_url": "https://api.github.com/users/chenin-wang/repos",
"events_url": "https://api.github.com/users/chenin-wang/events{/privacy}",
"received_events_url": "https://api.github.com/users/chenin-wang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37609/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37609/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37608 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37608/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37608/comments | https://api.github.com/repos/huggingface/transformers/issues/37608/events | https://github.com/huggingface/transformers/pull/37608 | 3,004,599,979 | PR_kwDOCUB6oc6TGnEA | 37,608 | [don't merge] Check fork 2 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T09:55:29 | 2025-04-22T16:23:35 | 2025-04-22T16:23:35 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37608",
"html_url": "https://github.com/huggingface/transformers/pull/37608",
"diff_url": "https://github.com/huggingface/transformers/pull/37608.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37608.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a GitHub issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37608/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37608/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37607 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37607/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37607/comments | https://api.github.com/repos/huggingface/transformers/issues/37607/events | https://github.com/huggingface/transformers/pull/37607 | 3,004,582,098 | PR_kwDOCUB6oc6TGjLf | 37,607 | Check fork | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T09:45:57 | 2025-04-18T10:14:58 | 2025-04-18T09:56:07 | COLLABORATOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37607",
"html_url": "https://github.com/huggingface/transformers/pull/37607",
"diff_url": "https://github.com/huggingface/transformers/pull/37607.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37607.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a GitHub issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37607/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37607/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37606 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37606/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37606/comments | https://api.github.com/repos/huggingface/transformers/issues/37606/events | https://github.com/huggingface/transformers/issues/37606 | 3,004,541,935 | I_kwDOCUB6oc6zFavv | 37,606 | Qwen 2.5 VL Batch Inference Error: tensors not on the same device | {
"login": "YiqunChen1999",
"id": 47789930,
"node_id": "MDQ6VXNlcjQ3Nzg5OTMw",
"avatar_url": "https://avatars.githubusercontent.com/u/47789930?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YiqunChen1999",
"html_url": "https://github.com/YiqunChen1999",
"followers_url": "https://api.github.com/users/YiqunChen1999/followers",
"following_url": "https://api.github.com/users/YiqunChen1999/following{/other_user}",
"gists_url": "https://api.github.com/users/YiqunChen1999/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YiqunChen1999/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YiqunChen1999/subscriptions",
"organizations_url": "https://api.github.com/users/YiqunChen1999/orgs",
"repos_url": "https://api.github.com/users/YiqunChen1999/repos",
"events_url": "https://api.github.com/users/YiqunChen1999/events{/privacy}",
"received_events_url": "https://api.github.com/users/YiqunChen1999/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-18T09:24:05 | 2025-04-25T07:34:19 | 2025-04-25T07:34:19 | NONE | null | null | null | null | ### System Info
```
Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.
- `transformers` version: 4.51.3
- Platform: Linux-5.15.0-127-generic-x86_64-with-glibc2.31
- Python version: 3.11.11
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: - compute_environment: LOCAL_MACHINE
- distributed_type: MULTI_GPU
- mixed_precision: no
- use_cpu: False
- debug: True
- num_processes: 4
- machine_rank: 0
- num_machines: 1
- gpu_ids: all
- rdzv_backend: static
- same_network: True
- main_training_function: main
- enable_cpu_affinity: False
- downcast_bf16: no
- tpu_use_cluster: False
- tpu_use_sudo: False
- tpu_env: []
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu118 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA GeForce RTX 3090
```
### Who can help?
@zucchini-nlp
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I am trying batch inference following the demo from https://huggingface.co/Qwen/Qwen2.5-VL-3B-Instruct with `transformers==4.51.3`. I can successfully run the single-sample demo, but batch inference fails with:
```
Traceback (most recent call last):
File "/PATH/TO/DEMO/DIR/demo.py", line 96, in <module>
generated_ids = model.generate(**inputs, max_new_tokens=128)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
... # ignored
File "MY_CONDA_ENV/lib/python3.11/site-packages/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py", line 1334, in _prepare_4d_causal_attention_mask_with_cache_position
diagonal_attend_mask = torch.arange(target_length, device=device) > cache_position.reshape(-1, 1)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cuda:0!
```
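When `device_map="auto"` shards a model across several GPUs like this, a useful first debugging step can be to inspect where each module landed; models dispatched through Accelerate expose this as `model.hf_device_map`. The helper below is a minimal, hypothetical sketch of that inspection — the `example_map` is illustrative only, not the real Qwen2.5-VL layout:

```python
# Group a device map (module name -> device) by device, so you can see at a
# glance which submodules ended up on which GPU when chasing a
# "tensors not on the same device" error.
def group_by_device(device_map):
    groups = {}
    for module, device in device_map.items():
        # Devices may be ints (GPU index), "cpu", or "disk"; normalize to str.
        groups.setdefault(str(device), []).append(module)
    return groups


# Illustrative map only (in practice: group_by_device(model.hf_device_map)).
example_map = {
    "model.embed_tokens": 0,
    "model.layers.0": 0,
    "model.layers.1": 1,
    "lm_head": 1,
}
print(group_by_device(example_map))
```

This makes it easy to spot, for instance, that the embedding layer and the head sit on different devices, which is the usual setup in which a tensor created on one shard meets a tensor from another.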
How I run the script:
```bash
CUDA_VISIBLE_DEVICES=0,1,2,3 python demo.py
```
My code (copied from the above url):
<details>
```python
from transformers import Qwen2_5_VLForConditionalGeneration, AutoTokenizer, AutoProcessor
from qwen_vl_utils import process_vision_info
# default: Load the model on the available device(s)
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
"Qwen/Qwen2.5-VL-3B-Instruct", torch_dtype="auto", device_map="auto"
)
# We recommend enabling flash_attention_2 for better acceleration and memory saving, especially in multi-image and video scenarios.
# model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
# "Qwen/Qwen2.5-VL-3B-Instruct",
# torch_dtype=torch.bfloat16,
# attn_implementation="flash_attention_2",
# device_map="auto",
# )
# default processor
processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-3B-Instruct")
# The default range for the number of visual tokens per image in the model is 4-16384.
# You can set min_pixels and max_pixels according to your needs, such as a token range of 256-1280, to balance performance and cost.
# min_pixels = 256*28*28
# max_pixels = 1280*28*28
# processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-3B-Instruct", min_pixels=min_pixels, max_pixels=max_pixels)
messages = [
{
"role": "user",
"content": [
{
"type": "image",
"image": "https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg",
},
{"type": "text", "text": "Describe this image."},
],
}
]
# Preparation for inference
text = processor.apply_chat_template(
messages, tokenize=False, add_generation_prompt=True
)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
text=[text],
images=image_inputs,
videos=video_inputs,
padding=True,
return_tensors="pt",
)
inputs = inputs.to("cuda")
# Inference: Generation of the output
generated_ids = model.generate(**inputs, max_new_tokens=128)
generated_ids_trimmed = [
out_ids[len(in_ids) :] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_text = processor.batch_decode(
generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(output_text)
# Sample messages for batch inference
messages1 = [
{
"role": "user",
"content": [
{"type": "image", "image": "https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg"},
{"type": "text", "text": "What are the common elements in these pictures?"},
],
}
]
messages2 = [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Who are you?"},
]
# Combine messages for batch processing
messages = [messages1, messages2]
# Preparation for batch inference
texts = [
processor.apply_chat_template(msg, tokenize=False, add_generation_prompt=True)
for msg in messages
]
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
text=texts,
images=image_inputs,
videos=video_inputs,
padding=True,
return_tensors="pt",
)
inputs = inputs.to("cuda")
# Batch Inference
generated_ids = model.generate(**inputs, max_new_tokens=128)
generated_ids_trimmed = [
out_ids[len(in_ids) :] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_texts = processor.batch_decode(
generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(output_texts)
```
</details>
### Expected behavior
The model should forward the batched inputs and generate output normally. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37606/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
} | https://api.github.com/repos/huggingface/transformers/issues/37606/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37605 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37605/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37605/comments | https://api.github.com/repos/huggingface/transformers/issues/37605/events | https://github.com/huggingface/transformers/pull/37605 | 3,004,540,180 | PR_kwDOCUB6oc6TGaDF | 37,605 | [do not merge] ci check | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T09:23:19 | 2025-04-18T10:20:46 | 2025-04-18T10:20:46 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37605",
"html_url": "https://github.com/huggingface/transformers/pull/37605",
"diff_url": "https://github.com/huggingface/transformers/pull/37605.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37605.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a GitHub issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37605/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37605/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37604 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37604/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37604/comments | https://api.github.com/repos/huggingface/transformers/issues/37604/events | https://github.com/huggingface/transformers/pull/37604 | 3,004,529,783 | PR_kwDOCUB6oc6TGXtC | 37,604 | [kernels] use original forward at compile time | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T09:18:48 | 2025-04-21T12:23:27 | 2025-04-21T12:22:47 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37604",
"html_url": "https://github.com/huggingface/transformers/pull/37604",
"diff_url": "https://github.com/huggingface/transformers/pull/37604.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37604.patch",
"merged_at": "2025-04-21T12:22:47"
} | # What does this PR do?
`kernels` and `torch.compile` are not yet compatible with each other. Although we can skip custom kernels when the package is not installed, adding an error message is also not feasible -- we can't throw exceptions at compile time.
This PR hijacks the `kernels` decorator to add a compile-friendly path: until `kernels` supports `torch.compile`, let's use the original `forward`. | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37604/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37604/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37603 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37603/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37603/comments | https://api.github.com/repos/huggingface/transformers/issues/37603/events | https://github.com/huggingface/transformers/pull/37603 | 3,004,519,397 | PR_kwDOCUB6oc6TGVWm | 37,603 | [VLMs] fix flash-attention tests | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T09:15:21 | 2025-04-27T17:16:52 | 2025-04-24T09:48:11 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37603",
"html_url": "https://github.com/huggingface/transformers/pull/37603",
"diff_url": "https://github.com/huggingface/transformers/pull/37603.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37603.patch",
"merged_at": "2025-04-24T09:48:11"
} | # What does this PR do?
FA2 with upcasted layer norm was failing for all VLMs before. The issue lies in the composite VLM config structure: modifying the base config doesn't update the sub-configs (which are also deepcopied in the model with `XXX._from_config`)
The solution is to apply the config update recursively on the model's children whenever a PretrainedModel is found. Though I am not sure whether we need any recursive calls in `prepare_model_for_quantization`; it doesn't seem to update the config | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37603/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37603/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37602 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37602/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37602/comments | https://api.github.com/repos/huggingface/transformers/issues/37602/events | https://github.com/huggingface/transformers/pull/37602 | 3,004,409,257 | PR_kwDOCUB6oc6TF8Pb | 37,602 | [chat template] separate jinja logic from tokenizers | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T08:36:05 | 2025-05-07T12:18:04 | 2025-05-07T12:18:04 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37602",
"html_url": "https://github.com/huggingface/transformers/pull/37602",
"diff_url": "https://github.com/huggingface/transformers/pull/37602.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37602.patch",
"merged_at": "2025-05-07T12:18:04"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/36713
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37602/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37602/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37601 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37601/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37601/comments | https://api.github.com/repos/huggingface/transformers/issues/37601/events | https://github.com/huggingface/transformers/pull/37601 | 3,004,391,916 | PR_kwDOCUB6oc6TF4XK | 37,601 | trigger CI | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T08:28:06 | 2025-04-22T16:23:49 | 2025-04-22T16:23:49 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37601",
"html_url": "https://github.com/huggingface/transformers/pull/37601",
"diff_url": "https://github.com/huggingface/transformers/pull/37601.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37601.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37601/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37601/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37600 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37600/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37600/comments | https://api.github.com/repos/huggingface/transformers/issues/37600/events | https://github.com/huggingface/transformers/pull/37600 | 3,004,277,153 | PR_kwDOCUB6oc6TFez4 | 37,600 | docs: Details for ambiguous channel dimension assignment | {
"login": "yaner-here",
"id": 26623948,
"node_id": "MDQ6VXNlcjI2NjIzOTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/26623948?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yaner-here",
"html_url": "https://github.com/yaner-here",
"followers_url": "https://api.github.com/users/yaner-here/followers",
"following_url": "https://api.github.com/users/yaner-here/following{/other_user}",
"gists_url": "https://api.github.com/users/yaner-here/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yaner-here/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yaner-here/subscriptions",
"organizations_url": "https://api.github.com/users/yaner-here/orgs",
"repos_url": "https://api.github.com/users/yaner-here/repos",
"events_url": "https://api.github.com/users/yaner-here/events{/privacy}",
"received_events_url": "https://api.github.com/users/yaner-here/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T07:33:32 | 2025-04-29T15:12:38 | 2025-04-29T15:12:38 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37600",
"html_url": "https://github.com/huggingface/transformers/pull/37600",
"diff_url": "https://github.com/huggingface/transformers/pull/37600.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37600.patch",
"merged_at": "2025-04-29T15:12:38"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Refined docs on how to assign channel dimension of inputs.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@stevhliu
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37600/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37600/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37599 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37599/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37599/comments | https://api.github.com/repos/huggingface/transformers/issues/37599/events | https://github.com/huggingface/transformers/pull/37599 | 3,004,276,993 | PR_kwDOCUB6oc6TFexr | 37,599 | enable cpu offloading for Bark on xpu | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T07:33:26 | 2025-04-23T22:34:27 | 2025-04-23T09:37:15 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37599",
"html_url": "https://github.com/huggingface/transformers/pull/37599",
"diff_url": "https://github.com/huggingface/transformers/pull/37599.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37599.patch",
"merged_at": "2025-04-23T09:37:15"
} | **command**
pytest -rA tests/models/bark/test_modeling_bark.py::BarkModelIntegrationTests::test_generate_end_to_end_with_offload
**after this PR**
PASSED | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37599/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37599/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37598 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37598/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37598/comments | https://api.github.com/repos/huggingface/transformers/issues/37598/events | https://github.com/huggingface/transformers/pull/37598 | 3,003,899,806 | PR_kwDOCUB6oc6TEKQ8 | 37,598 | Fixing the example in generation strategy doc | {
"login": "jeasinema",
"id": 10633528,
"node_id": "MDQ6VXNlcjEwNjMzNTI4",
"avatar_url": "https://avatars.githubusercontent.com/u/10633528?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jeasinema",
"html_url": "https://github.com/jeasinema",
"followers_url": "https://api.github.com/users/jeasinema/followers",
"following_url": "https://api.github.com/users/jeasinema/following{/other_user}",
"gists_url": "https://api.github.com/users/jeasinema/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jeasinema/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeasinema/subscriptions",
"organizations_url": "https://api.github.com/users/jeasinema/orgs",
"repos_url": "https://api.github.com/users/jeasinema/repos",
"events_url": "https://api.github.com/users/jeasinema/events{/privacy}",
"received_events_url": "https://api.github.com/users/jeasinema/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T03:57:31 | 2025-04-18T22:23:55 | 2025-04-18T19:50:17 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37598",
"html_url": "https://github.com/huggingface/transformers/pull/37598",
"diff_url": "https://github.com/huggingface/transformers/pull/37598.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37598.patch",
"merged_at": "2025-04-18T19:50:17"
} | The prompt text shown in the example does not match what is inside the generated output. As the generated output always includes the prompt, the correct prompt should be "Hugging Face is an open-source company".
## Before submitting
- [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37598/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37598/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37597 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37597/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37597/comments | https://api.github.com/repos/huggingface/transformers/issues/37597/events | https://github.com/huggingface/transformers/pull/37597 | 3,003,681,597 | PR_kwDOCUB6oc6TDbC9 | 37,597 | Fix qwen2_5 get_rope_index tensor device locations | {
"login": "rphmeier",
"id": 10121380,
"node_id": "MDQ6VXNlcjEwMTIxMzgw",
"avatar_url": "https://avatars.githubusercontent.com/u/10121380?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rphmeier",
"html_url": "https://github.com/rphmeier",
"followers_url": "https://api.github.com/users/rphmeier/followers",
"following_url": "https://api.github.com/users/rphmeier/following{/other_user}",
"gists_url": "https://api.github.com/users/rphmeier/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rphmeier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rphmeier/subscriptions",
"organizations_url": "https://api.github.com/users/rphmeier/orgs",
"repos_url": "https://api.github.com/users/rphmeier/repos",
"events_url": "https://api.github.com/users/rphmeier/events{/privacy}",
"received_events_url": "https://api.github.com/users/rphmeier/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-18T00:45:57 | 2025-04-24T22:22:53 | 2025-04-24T14:04:38 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37597",
"html_url": "https://github.com/huggingface/transformers/pull/37597",
"diff_url": "https://github.com/huggingface/transformers/pull/37597.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37597.patch",
"merged_at": "2025-04-24T14:04:38"
} | # What does this PR do?
This fixes the torch device mismatch errors in the Qwen-2.5 integration when applied to video inputs.
Before, tensors were mixed between the CPU and cuda devices, leading to runtime errors: `RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument tensors in method wrapper_CUDA_cat)`
These codepaths were only taken when video inputs were supplied, but were always taken when following the instructions and example code on the huggingface hub. For example, the sample code in the [Qwen 2.5 7B Instruct Video Inference section](https://huggingface.co/Qwen/Qwen2.5-VL-7B-Instruct) would unconditionally fail.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
Tagging these people at the suggestion of the
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37597/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37597/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37596 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37596/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37596/comments | https://api.github.com/repos/huggingface/transformers/issues/37596/events | https://github.com/huggingface/transformers/pull/37596 | 3,003,522,851 | PR_kwDOCUB6oc6TC49I | 37,596 | Tests for the new Tensor Parallel integration | {
"login": "ailunc",
"id": 131329865,
"node_id": "U_kgDOB9PvSQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131329865?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ailunc",
"html_url": "https://github.com/ailunc",
"followers_url": "https://api.github.com/users/ailunc/followers",
"following_url": "https://api.github.com/users/ailunc/following{/other_user}",
"gists_url": "https://api.github.com/users/ailunc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ailunc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ailunc/subscriptions",
"organizations_url": "https://api.github.com/users/ailunc/orgs",
"repos_url": "https://api.github.com/users/ailunc/repos",
"events_url": "https://api.github.com/users/ailunc/events{/privacy}",
"received_events_url": "https://api.github.com/users/ailunc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-17T21:59:49 | 2025-04-18T08:19:47 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37596",
"html_url": "https://github.com/huggingface/transformers/pull/37596",
"diff_url": "https://github.com/huggingface/transformers/pull/37596.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37596.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #37557 (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. https://github.com/huggingface/transformers/issues/37557
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37596/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37596/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37595 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37595/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37595/comments | https://api.github.com/repos/huggingface/transformers/issues/37595/events | https://github.com/huggingface/transformers/issues/37595 | 3,003,489,945 | I_kwDOCUB6oc6zBZ6Z | 37,595 | Unable to load certain models | {
"login": "zaddy6",
"id": 49834299,
"node_id": "MDQ6VXNlcjQ5ODM0Mjk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49834299?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zaddy6",
"html_url": "https://github.com/zaddy6",
"followers_url": "https://api.github.com/users/zaddy6/followers",
"following_url": "https://api.github.com/users/zaddy6/following{/other_user}",
"gists_url": "https://api.github.com/users/zaddy6/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zaddy6/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zaddy6/subscriptions",
"organizations_url": "https://api.github.com/users/zaddy6/orgs",
"repos_url": "https://api.github.com/users/zaddy6/repos",
"events_url": "https://api.github.com/users/zaddy6/events{/privacy}",
"received_events_url": "https://api.github.com/users/zaddy6/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-17T21:36:42 | 2025-06-26T08:03:39 | 2025-06-26T08:03:39 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: Linux-5.15.0-130-generic-x86_64-with-glibc2.35
- Python version: 3.11.11
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: 0.16.6
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I noticed I am unable to load a lot of models.
So far I have tried:
HuggingFaceTB/SmolLM2-1.7B-Instruct
Qwen/Qwen2.5-Coder-7B-Instruct
```
ValueError: Unrecognized model in HuggingFaceTB/SmolLM2-1.7B-Instruct. Should have a `model_type` key in its config.json, or contain one of the following strings in its name: albert, align, altclip, aria, aria_text, audio-spectrogram-transformer, autoformer, aya_vision, bamba, bark, bart, beit, bert, bert-generation, big_bird, bigbird_pegasus, biogpt, bit, blenderbot, blenderbot-small, blip, blip-2, blip_2_qformer, bloom, bridgetower, bros, camembert, canine, chameleon, chinese_clip, chinese_clip_vision_model, clap, clip, clip_text_model, clip_vision_model,
```
Code
```
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
# Load the base model and tokenizer
base_model = AutoModelForCausalLM.from_pretrained("HuggingFaceTB/SmolLM2-1.7B-Instruct")
tokenizer = AutoTokenizer.from_pretrained("HuggingFaceTB/SmolLM2-1.7B-Instruct")
```
### Expected behavior
models should load without errors | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37595/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37595/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37594 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37594/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37594/comments | https://api.github.com/repos/huggingface/transformers/issues/37594/events | https://github.com/huggingface/transformers/pull/37594 | 3,003,443,888 | PR_kwDOCUB6oc6TCnim | 37,594 | Update Phi4 converter | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T21:06:53 | 2025-04-17T21:32:54 | 2025-04-17T21:08:24 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37594",
"html_url": "https://github.com/huggingface/transformers/pull/37594",
"diff_url": "https://github.com/huggingface/transformers/pull/37594.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37594.patch",
"merged_at": "2025-04-17T21:08:24"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37594/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37594/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37593 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37593/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37593/comments | https://api.github.com/repos/huggingface/transformers/issues/37593/events | https://github.com/huggingface/transformers/issues/37593 | 3,003,334,472 | I_kwDOCUB6oc6zAz9I | 37,593 | When using --eval_do_concat_batches=False with run_glue.py example, I get "ValueError: Predictions and/or references don't match the expected format." | {
"login": "jeffhataws",
"id": 56947987,
"node_id": "MDQ6VXNlcjU2OTQ3OTg3",
"avatar_url": "https://avatars.githubusercontent.com/u/56947987?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jeffhataws",
"html_url": "https://github.com/jeffhataws",
"followers_url": "https://api.github.com/users/jeffhataws/followers",
"following_url": "https://api.github.com/users/jeffhataws/following{/other_user}",
"gists_url": "https://api.github.com/users/jeffhataws/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jeffhataws/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeffhataws/subscriptions",
"organizations_url": "https://api.github.com/users/jeffhataws/orgs",
"repos_url": "https://api.github.com/users/jeffhataws/repos",
"events_url": "https://api.github.com/users/jeffhataws/events{/privacy}",
"received_events_url": "https://api.github.com/users/jeffhataws/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-17T20:00:38 | 2025-04-28T18:06:14 | 2025-04-28T18:06:14 | CONTRIBUTOR | null | null | null | null | ### System Info
Ran on c5.4xlarge CPU instance with Ubuntu22
- `transformers` version: 4.44.0
- Platform: Linux-6.8.0-1024-aws-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- PyTorch version (GPU?): 2.6.0+cu124 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
_No response_
### Information
- [X] The official example scripts
- [ ] My own modified scripts
### Tasks
- [X] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Setup:
```
export HF_VER=4.51.3
pip install -U transformers==$HF_VER datasets evaluate scikit-learn
cd ~/
git clone https://github.com/huggingface/transformers --branch v$HF_VER
cd ~/transformers/examples/pytorch/text-classification
```
Then run the following script:
```
#!/usr/bin/env bash
set -eExuo
export TASK_NAME=mrpc
torchrun --nproc_per_node=1 ./run_glue.py \
--model_name_or_path gaunernst/bert-tiny-uncased \
--task_name $TASK_NAME \
--do_eval \
--max_seq_length 128 \
--per_device_train_batch_size 16 \
--max_eval_samples 1000 \
--learning_rate 2e-5 \
--num_train_epochs 5 \
--save_total_limit 1 \
--eval_do_concat_batches False \
--overwrite_output_dir \
--output_dir /tmp/$TASK_NAME/ |& tee log_run
```
You will see the error:
```
***** Running Evaluation ***** [INFO|trainer.py:3831] 2025-04-17 18:48:47,507 >> Num examples = 408 [INFO|trainer.py:3834] 2025-04-17 18:48:47,507 >> Batch size = 8 100%|██████████| 51/51 [02:22<00:00, 2.84s/it][rank0]: Traceback (most recent call last): [rank0]: File "/home/ubuntu/transformers/examples/pytorch/text-classification/./run_glue.py", line 637, in <module>
[rank0]: main()
[rank0]: File "/home/ubuntu/transformers/examples/pytorch/text-classification/./run_glue.py", line 575, in main
[rank0]: metrics = trainer.evaluate(eval_dataset=eval_dataset)
[rank0]: File "/home/ubuntu/test_venv_py310/lib/python3.10/site-packages/transformers/trainer.py", line 3676, in evaluate
[rank0]: output = eval_loop(
[rank0]: File "/home/ubuntu/test_venv_py310/lib/python3.10/site-packages/transformers/trainer.py", line 3966, in evaluation_loop
[rank0]: metrics = self.compute_metrics(EvalPrediction(predictions=all_preds, label_ids=all_labels))
[rank0]: File "/home/ubuntu/transformers/examples/pytorch/text-classification/./run_glue.py", line 513, in compute_metrics
[rank0]: result = metric.compute(predictions=preds, references=p.label_ids)
[rank0]: File "/home/ubuntu/test_venv_py310/lib/python3.10/site-packages/evaluate/module.py", line 455, in compute
[rank0]: self.add_batch(**inputs)
[rank0]: File "/home/ubuntu/test_venv_py310/lib/python3.10/site-packages/evaluate/module.py", line 546, in add_batch
[rank0]: raise ValueError(error_msg) from None
[rank0]: ValueError: Predictions and/or references don't match the expected format.
[rank0]: Expected format: {'predictions': Value(dtype='int64', id=None), 'references': Value(dtype='int64', id=None)},
[rank0]: Input predictions: [[1 4]
[rank0]: [0 4]
[rank0]: [0 1]
[rank0]: [1 1]
[rank0]: [0 1]
[rank0]: [3 2]
[rank0]: [1 2]
[rank0]: [0 0]
[rank0]: [2 1]
[rank0]: [0 3]
[rank0]: [0 0]
[rank0]: [1 0]
[rank0]: [1 1]
[rank0]: [3 4]
[rank0]: [0 3]
[rank0]: [0 1]
[rank0]: [0 3]
[rank0]: [6 0]
[rank0]: [0 3]
[rank0]: [0 4]
[rank0]: [0 0]
[rank0]: [4 1]
[rank0]: [4 5]
[rank0]: [0 0]
[rank0]: [0 2]
[rank0]: [0 0]
[rank0]: [0 2]
[rank0]: [7 7]
[rank0]: [0 1]
[rank0]: [0 3]
[rank0]: [0 3]
[rank0]: [0 1]
[rank0]: [0 1]
[rank0]: [0 1]
[rank0]: [2 1]
[rank0]: [0 3]
[rank0]: [7 7]
[rank0]: [0 1]
[rank0]: [0 4]
[rank0]: [0 0]
[rank0]: [1 3]
[rank0]: [2 0]
[rank0]: [7 0]
[rank0]: [1 4]
[rank0]: [0 3]
[rank0]: [1 0]
[rank0]: [0 1]
[rank0]: [7 4]
[rank0]: [0 3]
[rank0]: [4 3]
[rank0]: [0 2]],
[rank0]: Input references: [array([1, 0, 0, 1, 0, 1, 0, 1]), array([1, 1, 1, 0, 0, 1, 1, 1]), array([1, 0, 1, 0, 0, 1, 0, 1]), ..., array([1, 0, 1, 0, 1, 1, 1, 1]), a
rray([1, 1, 1, 1, 1, 1, 1, 1]), array([0, 1, 1, 0, 1, 1, 0, 1])]
100%|██████████| 51/51 [02:22<00:00, 2.79s/it]
E0417 18:51:13.767000 4267 torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 0 (pid: 4285) of binary: /home/ubuntu/test_venv_py31
0/bin/python3
Traceback (most recent call last):
File "/home/ubuntu/test_venv_py310/bin/torchrun", line 8, in <module>
sys.exit(main())
```
### Expected behavior
No error | {
"login": "jeffhataws",
"id": 56947987,
"node_id": "MDQ6VXNlcjU2OTQ3OTg3",
"avatar_url": "https://avatars.githubusercontent.com/u/56947987?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jeffhataws",
"html_url": "https://github.com/jeffhataws",
"followers_url": "https://api.github.com/users/jeffhataws/followers",
"following_url": "https://api.github.com/users/jeffhataws/following{/other_user}",
"gists_url": "https://api.github.com/users/jeffhataws/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jeffhataws/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeffhataws/subscriptions",
"organizations_url": "https://api.github.com/users/jeffhataws/orgs",
"repos_url": "https://api.github.com/users/jeffhataws/repos",
"events_url": "https://api.github.com/users/jeffhataws/events{/privacy}",
"received_events_url": "https://api.github.com/users/jeffhataws/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37593/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37593/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37592 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37592/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37592/comments | https://api.github.com/repos/huggingface/transformers/issues/37592/events | https://github.com/huggingface/transformers/pull/37592 | 3,003,206,742 | PR_kwDOCUB6oc6TBz4M | 37,592 | Restructure torchao quantization examples | {
"login": "jerryzh168",
"id": 4958441,
"node_id": "MDQ6VXNlcjQ5NTg0NDE=",
"avatar_url": "https://avatars.githubusercontent.com/u/4958441?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jerryzh168",
"html_url": "https://github.com/jerryzh168",
"followers_url": "https://api.github.com/users/jerryzh168/followers",
"following_url": "https://api.github.com/users/jerryzh168/following{/other_user}",
"gists_url": "https://api.github.com/users/jerryzh168/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jerryzh168/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jerryzh168/subscriptions",
"organizations_url": "https://api.github.com/users/jerryzh168/orgs",
"repos_url": "https://api.github.com/users/jerryzh168/repos",
"events_url": "https://api.github.com/users/jerryzh168/events{/privacy}",
"received_events_url": "https://api.github.com/users/jerryzh168/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T18:43:07 | 2025-04-22T09:20:35 | 2025-04-22T09:20:34 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37592",
"html_url": "https://github.com/huggingface/transformers/pull/37592",
"diff_url": "https://github.com/huggingface/transformers/pull/37592.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37592.patch",
"merged_at": "2025-04-22T09:20:34"
} | Summary:
Mainly restructured the examples by hardware, listing the recommended quantization methods for each: H100 GPU, A100 GPU, and CPU.
Also added an example for `push_to_hub`.
Test Plan:
not required
Reviewers:
Subscribers:
Tasks:
Tags:
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37592/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37592/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37591 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37591/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37591/comments | https://api.github.com/repos/huggingface/transformers/issues/37591/events | https://github.com/huggingface/transformers/pull/37591 | 3,003,138,108 | PR_kwDOCUB6oc6TBlGH | 37,591 | Fix some GPU OOM after #37553 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T18:03:26 | 2025-04-18T08:09:21 | 2025-04-18T08:09:19 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37591",
"html_url": "https://github.com/huggingface/transformers/pull/37591",
"diff_url": "https://github.com/huggingface/transformers/pull/37591.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37591.patch",
"merged_at": "2025-04-18T08:09:19"
} | # What does this PR do?
I can't say why #37553 makes some tests hit GPU OOM; the only way I can get those tests to pass is to do some CUDA cleanup in a few places. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37591/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37591/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37590 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37590/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37590/comments | https://api.github.com/repos/huggingface/transformers/issues/37590/events | https://github.com/huggingface/transformers/pull/37590 | 3,003,097,955 | PR_kwDOCUB6oc6TBcZr | 37,590 | :rotating_light: :rotating_light: Inherited CausalLM Tests | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T17:40:15 | 2025-05-23T17:29:55 | 2025-05-23T17:29:31 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37590",
"html_url": "https://github.com/huggingface/transformers/pull/37590",
"diff_url": "https://github.com/huggingface/transformers/pull/37590.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37590.patch",
"merged_at": "2025-05-23T17:29:31"
} | This is an experimental PR to see if we can inherit tests for causal language models, which should reduce the amount of boilerplate model authors need to write by a lot.
Right now it's **very** experimental and will only touch a couple of models! Expect lots of failures for a while.
Update: It was stable and working for ~20 models, so I'm going to merge it at this point and migrate other models to it over time. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37590/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37590/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37589 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37589/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37589/comments | https://api.github.com/repos/huggingface/transformers/issues/37589/events | https://github.com/huggingface/transformers/pull/37589 | 3,002,952,337 | PR_kwDOCUB6oc6TA83F | 37,589 | Add config validation and style tweaks | {
"login": "Kirire",
"id": 27903062,
"node_id": "MDQ6VXNlcjI3OTAzMDYy",
"avatar_url": "https://avatars.githubusercontent.com/u/27903062?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kirire",
"html_url": "https://github.com/Kirire",
"followers_url": "https://api.github.com/users/Kirire/followers",
"following_url": "https://api.github.com/users/Kirire/following{/other_user}",
"gists_url": "https://api.github.com/users/Kirire/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kirire/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kirire/subscriptions",
"organizations_url": "https://api.github.com/users/Kirire/orgs",
"repos_url": "https://api.github.com/users/Kirire/repos",
"events_url": "https://api.github.com/users/Kirire/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kirire/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T16:26:20 | 2025-05-14T12:23:13 | 2025-05-14T12:22:11 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37589",
"html_url": "https://github.com/huggingface/transformers/pull/37589",
"diff_url": "https://github.com/huggingface/transformers/pull/37589.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37589.patch",
"merged_at": "2025-05-14T12:22:11"
} | **Summary of changes:**
1. **Renamed `input_states` to `hidden_states`** in the `torch_forward` method signature and body, to align with HuggingFace Transformers' naming conventions:
```python
def torch_forward(self, hidden_states: torch.Tensor, ...)
```
This change improves consistency with the rest of the codebase and aligns better with the standard used in other HuggingFace models.
2. **Removed redundant attention masking** from the `forward` method:
```python
if attention_mask is not None ...
hidden_states = (hidden_states * attention_mask[:, :, None]).to(dtype)
```
This operation is already handled later in `torch_forward`, so it's no longer necessary here. Removing it avoids potential duplication and keeps the logic centralized.
3. **Added a configuration validation check in `Mamba2Config.__init__`**:
```python
if hidden_size * expand != num_heads * head_dim:
raise AttributeError(...)
```
This ensures that the configuration is internally consistent, and helps users catch misconfigurations early by raising a clear error during initialization.
Related to #37554
4. **Improved Loss Handling During Gradient Accumulation**
Modifies the loss computation by replacing the hardcoded use of `CrossEntropyLoss`:
```python
loss_fct = CrossEntropyLoss()
loss = loss_fct(shift_logits.view(-1, shift_logits.size(-1)), shift_labels.view(-1))
```
with a more flexible call to a configurable `loss_function`:
```python
loss = self.loss_function(logits=logits, labels=labels, vocab_size=self.config.vocab_size, **kwargs)
```
This change improves support for gradient accumulation scenarios and allows for easier customization of model-specific loss functions.
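The consistency check added in item 3 above can be exercised with a minimal stand-in class. This is only an illustrative sketch — the class name, default sizes, and error message are placeholders, not the real `Mamba2Config`:

```python
class ConfigSketch:
    """Minimal stand-in for the hidden_size * expand == num_heads * head_dim
    consistency check; not the real Mamba2Config."""

    def __init__(self, hidden_size=768, expand=2, num_heads=24, head_dim=64):
        # Defaults chosen so the identity holds: 768 * 2 == 24 * 64 == 1536.
        if hidden_size * expand != num_heads * head_dim:
            raise AttributeError(
                f"hidden_size * expand ({hidden_size * expand}) must equal "
                f"num_heads * head_dim ({num_heads * head_dim})"
            )
        self.hidden_size = hidden_size
        self.expand = expand
        self.num_heads = num_heads
        self.head_dim = head_dim


ConfigSketch()  # 768 * 2 == 24 * 64, so construction succeeds

try:
    ConfigSketch(head_dim=32)  # 768 * 2 != 24 * 32, so this raises early
except AttributeError as err:
    pass  # misconfiguration is caught at init time rather than mid-forward
```

Catching the mismatch in `__init__` surfaces the error at construction time with a clear message, instead of as an opaque shape error deep in the forward pass.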
Related to #34191 | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37589/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37589/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37588 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37588/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37588/comments | https://api.github.com/repos/huggingface/transformers/issues/37588/events | https://github.com/huggingface/transformers/pull/37588 | 3,002,947,081 | PR_kwDOCUB6oc6TA7u8 | 37,588 | Remove unused ConfigTester attributes | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T16:23:25 | 2025-04-17T16:51:43 | 2025-04-17T16:26:20 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37588",
"html_url": "https://github.com/huggingface/transformers/pull/37588",
"diff_url": "https://github.com/huggingface/transformers/pull/37588.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37588.patch",
"merged_at": null
} | EDIT: These attributes were definitely used in some cases, my bad! | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37588/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37588/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37587 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37587/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37587/comments | https://api.github.com/repos/huggingface/transformers/issues/37587/events | https://github.com/huggingface/transformers/pull/37587 | 3,002,892,969 | PR_kwDOCUB6oc6TAv_i | 37,587 | Flag SpeechT5 flaky test | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T15:56:42 | 2025-04-18T09:35:47 | 2025-04-18T09:35:46 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37587",
"html_url": "https://github.com/huggingface/transformers/pull/37587",
"diff_url": "https://github.com/huggingface/transformers/pull/37587.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37587.patch",
"merged_at": "2025-04-18T09:35:46"
} | # What does this PR do?
Flags a test as flaky - it's been blocking a few PRs. Not sure what is causing it to fail, cc @eustlb here maybe | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37587/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37587/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37586 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37586/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37586/comments | https://api.github.com/repos/huggingface/transformers/issues/37586/events | https://github.com/huggingface/transformers/issues/37586 | 3,002,862,854 | I_kwDOCUB6oc6y_A0G | 37,586 | Leaving model loaded in memory for an inference server causes CUDNN to crash if left alone for awhile - need help making sense of what's happening | {
"login": "rperdon",
"id": 19974945,
"node_id": "MDQ6VXNlcjE5OTc0OTQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/19974945?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rperdon",
"html_url": "https://github.com/rperdon",
"followers_url": "https://api.github.com/users/rperdon/followers",
"following_url": "https://api.github.com/users/rperdon/following{/other_user}",
"gists_url": "https://api.github.com/users/rperdon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rperdon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rperdon/subscriptions",
"organizations_url": "https://api.github.com/users/rperdon/orgs",
"repos_url": "https://api.github.com/users/rperdon/repos",
"events_url": "https://api.github.com/users/rperdon/events{/privacy}",
"received_events_url": "https://api.github.com/users/rperdon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-17T15:43:15 | 2025-04-25T15:22:49 | 2025-04-25T15:20:08 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: Linux-5.15.0-1042-nvidia-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA A100-SXM4-40GB
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
https://huggingface.co/huihui-ai/Qwen2-VL-2B-Instruct-abliterated
Created a docker environment to run CUDA.
Taken this code and run into a flask application run in waitress.
Run inference on an image
Wait. In my case it was left alone for a day, but I will try to test how long it takes before it dies. Try inference again and it crashes.
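For context, the serving pattern described here is "load once at startup, reuse per request". A minimal sketch with a placeholder model — names are illustrative, not the reporter's actual Flask/waitress app:

```python
class PlaceholderVLM:
    """Stand-in for the long-lived Qwen2-VL pipeline; the real app would load
    the model onto the GPU once at process start."""

    def describe(self, image_path: str) -> str:
        return f"description of {image_path}"


MODEL = PlaceholderVLM()  # created once, then reused for the process lifetime


def handle_request(image_path: str) -> str:
    # Each request reuses the already-loaded model; nothing is reloaded,
    # so any failure in the resident model state persists until the
    # process is restarted.
    return MODEL.describe(image_path)
```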
Watching nvidia-smi, I'm not seeing any further GPU memory increase beyond what was used right after the model was loaded.
```
Every 5.0s: free -m                          dgxa100: Thu Apr 17 15:27:36 2025

              total        used        free      shared  buff/cache   available
Mem:        1031854      205868      668095        3939      157889      816722
Swap:             0           0           0
```
Main memory available prior to launching app.
### Expected behavior
After some time CUDNN crashes and does not do inference anymore.

If I try to relaunch the application:

I am forced to exit my docker container. My other docker containers are unaffected and continuing to run. I need to relaunch my docker container to be able to run the app and model for inference again. | {
"login": "rperdon",
"id": 19974945,
"node_id": "MDQ6VXNlcjE5OTc0OTQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/19974945?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rperdon",
"html_url": "https://github.com/rperdon",
"followers_url": "https://api.github.com/users/rperdon/followers",
"following_url": "https://api.github.com/users/rperdon/following{/other_user}",
"gists_url": "https://api.github.com/users/rperdon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rperdon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rperdon/subscriptions",
"organizations_url": "https://api.github.com/users/rperdon/orgs",
"repos_url": "https://api.github.com/users/rperdon/repos",
"events_url": "https://api.github.com/users/rperdon/events{/privacy}",
"received_events_url": "https://api.github.com/users/rperdon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37586/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37586/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37585 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37585/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37585/comments | https://api.github.com/repos/huggingface/transformers/issues/37585/events | https://github.com/huggingface/transformers/pull/37585 | 3,002,854,964 | PR_kwDOCUB6oc6TAn9W | 37,585 | chore: update model card for SigLIP | {
"login": "saswatmeher",
"id": 35535056,
"node_id": "MDQ6VXNlcjM1NTM1MDU2",
"avatar_url": "https://avatars.githubusercontent.com/u/35535056?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saswatmeher",
"html_url": "https://github.com/saswatmeher",
"followers_url": "https://api.github.com/users/saswatmeher/followers",
"following_url": "https://api.github.com/users/saswatmeher/following{/other_user}",
"gists_url": "https://api.github.com/users/saswatmeher/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saswatmeher/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saswatmeher/subscriptions",
"organizations_url": "https://api.github.com/users/saswatmeher/orgs",
"repos_url": "https://api.github.com/users/saswatmeher/repos",
"events_url": "https://api.github.com/users/saswatmeher/events{/privacy}",
"received_events_url": "https://api.github.com/users/saswatmeher/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T15:39:31 | 2025-04-23T09:20:44 | 2025-04-18T20:30:41 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37585",
"html_url": "https://github.com/huggingface/transformers/pull/37585",
"diff_url": "https://github.com/huggingface/transformers/pull/37585.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37585.patch",
"merged_at": "2025-04-18T20:30:41"
} | # What does this PR do?
Update the model card for SigLIP to handle #36979
Fixes #36979
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37585/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37585/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37584 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37584/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37584/comments | https://api.github.com/repos/huggingface/transformers/issues/37584/events | https://github.com/huggingface/transformers/issues/37584 | 3,002,742,647 | I_kwDOCUB6oc6y-jd3 | 37,584 | Error when loading a pretrained model from local file if model has been saved to 2 locations due to config mismatch | {
"login": "sah267",
"id": 6809828,
"node_id": "MDQ6VXNlcjY4MDk4Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/6809828?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sah267",
"html_url": "https://github.com/sah267",
"followers_url": "https://api.github.com/users/sah267/followers",
"following_url": "https://api.github.com/users/sah267/following{/other_user}",
"gists_url": "https://api.github.com/users/sah267/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sah267/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sah267/subscriptions",
"organizations_url": "https://api.github.com/users/sah267/orgs",
"repos_url": "https://api.github.com/users/sah267/repos",
"events_url": "https://api.github.com/users/sah267/events{/privacy}",
"received_events_url": "https://api.github.com/users/sah267/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-17T14:55:25 | 2025-05-28T16:41:31 | 2025-05-28T16:41:30 | NONE | null | null | null | null | ### Summary of Issue
After saving a pre-trained model from Hugging Face to a local folder, you can load it with `from_pretrained`. However, if you then save that same model to a second local folder and try to load it from the second location, you get an error because the values in the config's `auto_map["AutoConfig"]` entry no longer match: the saved config is missing the repository name prefix.
We have found that setting an `_auto_class` value on the model config works around the issue, but we are unsure why.
### System Info
- `transformers` version: 4.45.2
- Platform: macOS-15.3.2-arm64-arm-64bit
- Python version: 3.12.8
- Huggingface_hub version: 0.26.2
- Safetensors version: 0.4.5
- Accelerate version: 1.1.1
- Accelerate config: not found
- PyTorch version (GPU?): 2.5.1 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
import tempfile

from transformers import AutoModel

def test_custom_configuration_model_load():
model_name = "Alibaba-NLP/gte-base-en-v1.5"
with tempfile.TemporaryDirectory() as temp_dir:
AutoModel.from_pretrained(model_name, trust_remote_code=True).save_pretrained(temp_dir)
AutoModel.from_pretrained(temp_dir, trust_remote_code=True)
with tempfile.TemporaryDirectory() as temp_dir:
AutoModel.from_pretrained(model_name, trust_remote_code=True).save_pretrained(temp_dir)
AutoModel.from_pretrained(temp_dir, trust_remote_code=True)
```
**Error and Stacktrace**
```
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.venv/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py:557: in from_pretrained
cls.register(config.__class__, model_class, exist_ok=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
cls = <class 'transformers.models.auto.modeling_auto.AutoModel'>, config_class = <class 'transformers_modules.tmp0b76e0q7.configuration.NewConfig'>
model_class = <class 'transformers_modules.Alibaba-NLP.new-impl.40ced75c3017eb27626c9d4ea981bde21a2662f4.modeling.NewModel'>, exist_ok = True
@classmethod
def register(cls, config_class, model_class, exist_ok=False):
"""
Register a new model for this class.
Args:
config_class ([`PretrainedConfig`]):
The configuration corresponding to the model to register.
model_class ([`PreTrainedModel`]):
The model to register.
"""
if hasattr(model_class, "config_class") and str(model_class.config_class) != str(config_class):
> raise ValueError(
"The model class you are passing has a `config_class` attribute that is not consistent with the "
f"config class you passed (model has {model_class.config_class} and you passed {config_class}. Fix "
"one of those so they match!"
)
E ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has <class 'transformers_modules.Alibaba-NLP.new-impl.40ced75c3017eb27626c9d4ea981bde21a2662f4.configuration.NewConfig'> and you passed <class 'transformers_modules.tmp0b76e0q7.configuration.NewConfig'>. Fix one of those so they match!
```
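For what it's worth, the check that trips in `register` is a plain string comparison of the class objects, which can be reproduced without transformers. The module names below are copied from the stacktrace; that this mirrors the actual check is my assumption:

```python
def make_config_class(module_name: str) -> type:
    """Build a class named NewConfig that appears to live in `module_name`."""
    cls = type("NewConfig", (), {})
    cls.__module__ = module_name
    return cls

# Same class name, different dynamic module paths (as in the stacktrace above)
official = make_config_class("transformers_modules.Alibaba-NLP.new-impl.configuration")
local = make_config_class("transformers_modules.tmp0b76e0q7.configuration")

# str() embeds the module path, so the equality check fails even though the
# two classes are structurally identical
print(str(official) == str(local))  # prints False
```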
See below for the two config file contents:
```
{
"_name_or_path": "Alibaba-NLP/gte-base-en-v1.5",
"architectures": [
"NewModel"
],
"attention_probs_dropout_prob": 0.0,
"auto_map": {
"AutoConfig": "Alibaba-NLP/new-impl--configuration.NewConfig",
"AutoModel": "Alibaba-NLP/new-impl--modeling.NewModel",
"AutoModelForMaskedLM": "Alibaba-NLP/new-impl--modeling.NewForMaskedLM",
"AutoModelForMultipleChoice": "Alibaba-NLP/new-impl--modeling.NewForMultipleChoice",
"AutoModelForQuestionAnswering": "Alibaba-NLP/new-impl--modeling.NewForQuestionAnswering",
"AutoModelForSequenceClassification": "Alibaba-NLP/new-impl--modeling.NewForSequenceClassification",
"AutoModelForTokenClassification": "Alibaba-NLP/new-impl--modeling.NewForTokenClassification"
},
"classifier_dropout": null,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"layer_norm_eps": 1e-12,
"layer_norm_type": "layer_norm",
"logn_attention_clip1": false,
"logn_attention_scale": false,
"max_position_embeddings": 8192,
"model_type": "new",
"num_attention_heads": 12,
"num_hidden_layers": 12,
"pack_qkv": true,
"pad_token_id": 0,
"position_embedding_type": "rope",
"rope_scaling": {
"factor": 2.0,
"type": "ntk"
},
"rope_theta": 500000,
"torch_dtype": "float32",
"transformers_version": "4.45.2",
"type_vocab_size": 0,
"unpad_inputs": false,
"use_memory_efficient_attention": false,
"vocab_size": 30528
}
```
local config after second load:
```
{
"_name_or_path": "Alibaba-NLP/gte-base-en-v1.5",
"architectures": [
"NewModel"
],
"attention_probs_dropout_prob": 0.0,
"auto_map": {
"AutoConfig": "configuration.NewConfig", <---- This is the WRONG value, missing `repo_id` value prefix
"AutoModel": "Alibaba-NLP/new-impl--modeling.NewModel",
"AutoModelForMaskedLM": "Alibaba-NLP/new-impl--modeling.NewForMaskedLM",
"AutoModelForMultipleChoice": "Alibaba-NLP/new-impl--modeling.NewForMultipleChoice",
"AutoModelForQuestionAnswering": "Alibaba-NLP/new-impl--modeling.NewForQuestionAnswering",
"AutoModelForSequenceClassification": "Alibaba-NLP/new-impl--modeling.NewForSequenceClassification",
"AutoModelForTokenClassification": "Alibaba-NLP/new-impl--modeling.NewForTokenClassification"
},
"classifier_dropout": null,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"layer_norm_eps": 1e-12,
"layer_norm_type": "layer_norm",
"logn_attention_clip1": false,
"logn_attention_scale": false,
"max_position_embeddings": 8192,
"model_type": "new",
"num_attention_heads": 12,
"num_hidden_layers": 12,
"pack_qkv": true,
"pad_token_id": 0,
"position_embedding_type": "rope",
"rope_scaling": {
"factor": 2.0,
"type": "ntk"
},
"rope_theta": 500000,
"torch_dtype": "float32",
"transformers_version": "4.45.2",
"type_vocab_size": 0,
"unpad_inputs": false,
"use_memory_efficient_attention": false,
"vocab_size": 30528
}
```
### Expected behavior
It should be possible to save and load a pretrained model as many times as desired. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37584/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37584/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37583 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37583/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37583/comments | https://api.github.com/repos/huggingface/transformers/issues/37583/events | https://github.com/huggingface/transformers/pull/37583 | 3,002,688,486 | PR_kwDOCUB6oc6TAD4c | 37,583 | Refactor phi doc | {
"login": "JihadHammoud02",
"id": 94748033,
"node_id": "U_kgDOBaW9gQ",
"avatar_url": "https://avatars.githubusercontent.com/u/94748033?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JihadHammoud02",
"html_url": "https://github.com/JihadHammoud02",
"followers_url": "https://api.github.com/users/JihadHammoud02/followers",
"following_url": "https://api.github.com/users/JihadHammoud02/following{/other_user}",
"gists_url": "https://api.github.com/users/JihadHammoud02/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JihadHammoud02/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JihadHammoud02/subscriptions",
"organizations_url": "https://api.github.com/users/JihadHammoud02/orgs",
"repos_url": "https://api.github.com/users/JihadHammoud02/repos",
"events_url": "https://api.github.com/users/JihadHammoud02/events{/privacy}",
"received_events_url": "https://api.github.com/users/JihadHammoud02/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T14:34:39 | 2025-04-21T17:31:04 | 2025-04-21T17:31:04 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37583",
"html_url": "https://github.com/huggingface/transformers/pull/37583",
"diff_url": "https://github.com/huggingface/transformers/pull/37583.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37583.patch",
"merged_at": "2025-04-21T17:31:04"
} | Refactored the phi.md model card to follow the Hugging Face documentation conventions for model docs.
Reorganized the structure for consistency with other model docs (title, model overview, usage, and notes).
Included usage examples for:
- `pipeline` (text generation)
- `AutoModel` (text generation with `AutoModelForCausalLM`)
- `transformers-cli` (text classification, to diversify usage)
What was left out:
The original documentation included:
- Usage instructions for Phi-2
- Flash Attention 2 integration tips
- A performance graph
I didn’t include those in this version because there wasn’t a clear conventional place for them in model docs. I focused instead on making the file compliant with the existing documentation structure.
If you'd like me to include the extra content from the original docs (e.g., FlashAttention setup, performance benchmarks, or Phi-2 usage), I’d be happy to revisit and integrate it in a way that fits well with the current doc style. | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37583/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37583/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37582 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37582/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37582/comments | https://api.github.com/repos/huggingface/transformers/issues/37582/events | https://github.com/huggingface/transformers/pull/37582 | 3,002,554,409 | PR_kwDOCUB6oc6S_mjq | 37,582 | Add code examples for creating & fine‑tuning EncoderDecoderModel (fixes #16135) | {
"login": "HarshitaTechWizard",
"id": 173771133,
"node_id": "U_kgDOCluJfQ",
"avatar_url": "https://avatars.githubusercontent.com/u/173771133?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HarshitaTechWizard",
"html_url": "https://github.com/HarshitaTechWizard",
"followers_url": "https://api.github.com/users/HarshitaTechWizard/followers",
"following_url": "https://api.github.com/users/HarshitaTechWizard/following{/other_user}",
"gists_url": "https://api.github.com/users/HarshitaTechWizard/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HarshitaTechWizard/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HarshitaTechWizard/subscriptions",
"organizations_url": "https://api.github.com/users/HarshitaTechWizard/orgs",
"repos_url": "https://api.github.com/users/HarshitaTechWizard/repos",
"events_url": "https://api.github.com/users/HarshitaTechWizard/events{/privacy}",
"received_events_url": "https://api.github.com/users/HarshitaTechWizard/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-17T13:45:33 | 2025-04-17T21:34:15 | null | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37582",
"html_url": "https://github.com/huggingface/transformers/pull/37582",
"diff_url": "https://github.com/huggingface/transformers/pull/37582.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37582.patch",
"merged_at": null
} | This PR implements the first two code‑example requests from Issue #16135:
1. Create & Save an EncoderDecoderModel
- Shows how to load pre‑trained encoder & decoder, then save the combined model.
2. Fine‑Tune an EncoderDecoderModel
- Demonstrates a one‑epoch training loop on a tiny sample dataset.
Both snippets are inserted under the appropriate bullets in `docs/source/model_doc/encoder-decoder.mdx` and have been verified by a local `make docs` build.
Closes #16135.
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37582/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37582/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37581 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37581/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37581/comments | https://api.github.com/repos/huggingface/transformers/issues/37581/events | https://github.com/huggingface/transformers/pull/37581 | 3,002,546,730 | PR_kwDOCUB6oc6S_k37 | 37,581 | Ensure positive warm-up size | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T13:43:18 | 2025-04-17T14:11:56 | 2025-04-17T14:11:54 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37581",
"html_url": "https://github.com/huggingface/transformers/pull/37581",
"diff_url": "https://github.com/huggingface/transformers/pull/37581.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37581.patch",
"merged_at": "2025-04-17T14:11:54"
} | # What does this PR do?
cc @qubvel @ydshieh for viz 🤗 | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37581/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37581/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37580 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37580/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37580/comments | https://api.github.com/repos/huggingface/transformers/issues/37580/events | https://github.com/huggingface/transformers/issues/37580 | 3,002,526,390 | I_kwDOCUB6oc6y9uq2 | 37,580 | Reproduce Grounding DINO LVIS Benchmark Results with HF implementation | {
"login": "sfixl",
"id": 200221651,
"node_id": "U_kgDOC-8j0w",
"avatar_url": "https://avatars.githubusercontent.com/u/200221651?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sfixl",
"html_url": "https://github.com/sfixl",
"followers_url": "https://api.github.com/users/sfixl/followers",
"following_url": "https://api.github.com/users/sfixl/following{/other_user}",
"gists_url": "https://api.github.com/users/sfixl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sfixl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sfixl/subscriptions",
"organizations_url": "https://api.github.com/users/sfixl/orgs",
"repos_url": "https://api.github.com/users/sfixl/repos",
"events_url": "https://api.github.com/users/sfixl/events{/privacy}",
"received_events_url": "https://api.github.com/users/sfixl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T13:35:38 | 2025-05-26T08:16:23 | 2025-05-26T08:16:23 | NONE | null | null | null | null | Has anyone had success reproducing the LVIS benchmark results using the HF implementation of Grounding DINO? Can it be due to the HF implementation, or is it 1:1 the same as the official one?
Unfortunately the authors did not publish any details (or code) how they did the benchmark.
My best results for the GroundingDINO-T model on the LVIS Minival benchmark are 17.2 AP, whereas the paper states 27.4 AP.
I used the official [LVIS API](https://github.com/lvis-dataset/lvis-api) for evaluation. Since it is not possible to prompt for all 1203 LVIS object categories simultaneously, I split it into batches of 20 categories and later accumulated the results.
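For reference, the batching I used is just a simple chunking of the category list; the `batch_size=20` and the accumulation of per-batch results are my own choices, not from the paper:

```python
def batch_categories(categories, batch_size=20):
    """Split the full LVIS category list into prompt-sized batches."""
    return [categories[i:i + batch_size]
            for i in range(0, len(categories), batch_size)]

# 1203 LVIS categories -> 61 batches: 60 full batches of 20 plus one of 3
lvis_categories = [f"category_{i}" for i in range(1203)]
batches = batch_categories(lvis_categories)
print(len(batches), len(batches[-1]))  # prints: 61 3
```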
Also it is not clear what labels they used for the prompts. I expect that the authors manually optimized the prompts. I simply used the class labels, which could (partially) explain the difference. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37580/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37580/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37579 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37579/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37579/comments | https://api.github.com/repos/huggingface/transformers/issues/37579/events | https://github.com/huggingface/transformers/pull/37579 | 3,002,405,647 | PR_kwDOCUB6oc6S_HDD | 37,579 | [phi4] update conversion | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T12:52:25 | 2025-04-17T13:43:04 | 2025-04-17T13:43:04 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37579",
"html_url": "https://github.com/huggingface/transformers/pull/37579",
"diff_url": "https://github.com/huggingface/transformers/pull/37579.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37579.patch",
"merged_at": "2025-04-17T13:43:04"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37579/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37579/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37578 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37578/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37578/comments | https://api.github.com/repos/huggingface/transformers/issues/37578/events | https://github.com/huggingface/transformers/pull/37578 | 3,002,084,273 | PR_kwDOCUB6oc6S-AyP | 37,578 | Fix Quark quantization config | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T10:27:01 | 2025-04-18T05:23:41 | 2025-04-18T05:23:39 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37578",
"html_url": "https://github.com/huggingface/transformers/pull/37578",
"diff_url": "https://github.com/huggingface/transformers/pull/37578.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37578.patch",
"merged_at": "2025-04-18T05:23:39"
} | # What does this PR do?
Raises an `ImportError` in the quantization config if quark is not installed, since it is needed there | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37578/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37578/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37577 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37577/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37577/comments | https://api.github.com/repos/huggingface/transformers/issues/37577/events | https://github.com/huggingface/transformers/pull/37577 | 3,001,967,733 | PR_kwDOCUB6oc6S9nO4 | 37,577 | dont merge trigger CI | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T09:36:21 | 2025-04-17T14:38:57 | 2025-04-17T14:38:57 | COLLABORATOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37577",
"html_url": "https://github.com/huggingface/transformers/pull/37577",
"diff_url": "https://github.com/huggingface/transformers/pull/37577.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37577.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37577/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37577/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37576 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37576/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37576/comments | https://api.github.com/repos/huggingface/transformers/issues/37576/events | https://github.com/huggingface/transformers/pull/37576 | 3,001,957,513 | PR_kwDOCUB6oc6S9k-6 | 37,576 | [VLMs] support attention backends | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T09:32:22 | 2025-05-15T09:13:32 | 2025-05-08T16:18:54 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37576",
"html_url": "https://github.com/huggingface/transformers/pull/37576",
"diff_url": "https://github.com/huggingface/transformers/pull/37576.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37576.patch",
"merged_at": "2025-05-08T16:18:54"
} | # What does this PR do?
As per title, another step closer to vLLM + transformers
What was done:
- Support attention API for VLM related models if not yet done
- Pass `kwargs` so vLLM can forward its attention instances
- Replace all loss computations to `self.loss_fn` (see https://github.com/huggingface/transformers/pull/36044#issuecomment-2746657112)
- Minor cleanup so new models can copy the prettified version; update the return block with `can_return_tuple`
Fixes https://github.com/huggingface/transformers/issues/36557, https://github.com/huggingface/transformers/issues/35634, https://github.com/huggingface/transformers/issues/36904 and fixes https://github.com/huggingface/transformers/issues/33963
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37576/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37576/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37575 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37575/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37575/comments | https://api.github.com/repos/huggingface/transformers/issues/37575/events | https://github.com/huggingface/transformers/pull/37575 | 3,001,897,725 | PR_kwDOCUB6oc6S9YA9 | 37,575 | [Bugfix] Fix flash-attention func param mismatch and softmax_scale default value mistake on Ascend NPU | {
"login": "FightingZhen",
"id": 26176607,
"node_id": "MDQ6VXNlcjI2MTc2NjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/26176607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FightingZhen",
"html_url": "https://github.com/FightingZhen",
"followers_url": "https://api.github.com/users/FightingZhen/followers",
"following_url": "https://api.github.com/users/FightingZhen/following{/other_user}",
"gists_url": "https://api.github.com/users/FightingZhen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/FightingZhen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FightingZhen/subscriptions",
"organizations_url": "https://api.github.com/users/FightingZhen/orgs",
"repos_url": "https://api.github.com/users/FightingZhen/repos",
"events_url": "https://api.github.com/users/FightingZhen/events{/privacy}",
"received_events_url": "https://api.github.com/users/FightingZhen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T09:07:23 | 2025-08-14T01:52:43 | 2025-04-18T09:34:17 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37575",
"html_url": "https://github.com/huggingface/transformers/pull/37575",
"diff_url": "https://github.com/huggingface/transformers/pull/37575.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37575.patch",
"merged_at": "2025-04-18T09:34:17"
} | # What does this PR do?
After adding support for Flash Attention on Ascend NPU ([PR](https://github.com/huggingface/transformers/pull/36696)), we found two bugs in its implementation. This PR fixes both.
**Bugfix1:**
We found that there are places on the `main` branch where func `flash_attn_varlen_func` is called **without keyword-style parameter passing**, like the following code: https://github.com/huggingface/transformers/blob/3bc44eaaeee01b7f0d2d55c9991900b43cafe62d/src/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py#L203
In that case, the parameter order between func `flash_attn_varlen_func` in the `flash-attn` package and func `npu_flash_attn_varlen_func` in the `transformers` package **is not aligned**, which may cause unexpected errors 😞
Therefore, we solve this problem by aligning the mismatched params `max_seqlen_q` and `max_seqlen_k`.
At the same time, we have also checked func `npu_flash_attn_func`, and it does not have this parameter-order mismatch.
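To see why positional calls across two wrappers are fragile, here is a minimal sketch. The two function signatures below are simplified, hypothetical stand-ins (only the seqlen parameters are modeled), not the real `flash-attn` or `transformers` code:

```python
# Two wrappers exposing the "same" API but with two parameters swapped,
# mimicking the max_seqlen_q / max_seqlen_k mismatch described above.
def flash_attn_varlen_func(q, k, v, cu_seqlens_q, cu_seqlens_k, max_seqlen_q, max_seqlen_k):
    return ("q-first", max_seqlen_q, max_seqlen_k)

def npu_flash_attn_varlen_func(q, k, v, cu_seqlens_q, cu_seqlens_k, max_seqlen_k, max_seqlen_q):
    # NOTE: max_seqlen_k comes before max_seqlen_q here.
    return ("k-first", max_seqlen_q, max_seqlen_k)

# Positional call intending max_seqlen_q=4, max_seqlen_k=6
args = (None, None, None, [0, 4], [0, 6], 4, 6)
_, q_len_a, k_len_a = flash_attn_varlen_func(*args)
_, q_len_b, k_len_b = npu_flash_attn_varlen_func(*args)
# The same positional call silently binds 4 and 6 to different parameters:
print(q_len_a, k_len_a)  # 4 6
print(q_len_b, k_len_b)  # 6 4
```

Passing the seqlen arguments by keyword (or aligning the wrapper's parameter order, as this PR does) removes the silent swap.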
**Bugfix2:**
For funcs `npu_flash_attn_func` and `npu_flash_attn_varlen_func`, when the param `softmax_scale` is `None`, it should default to `1.0 / sqrt(q.shape[-1])`.
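As a sketch of this default (assuming `head_dim` is the last dimension of the query shape; this is an illustration, not the actual NPU code):

```python
import math

def resolve_softmax_scale(q_shape, softmax_scale=None):
    # q_shape is the query tensor's shape; head_dim is its last dimension.
    # When softmax_scale is None, fall back to 1 / sqrt(head_dim).
    if softmax_scale is None:
        softmax_scale = 1.0 / math.sqrt(q_shape[-1])
    return softmax_scale

print(resolve_softmax_scale((1, 8, 64)))  # 0.125
```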
Fixes # (issue)
Not related.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
| {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37575/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37575/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37574 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37574/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37574/comments | https://api.github.com/repos/huggingface/transformers/issues/37574/events | https://github.com/huggingface/transformers/issues/37574 | 3,001,861,522 | I_kwDOCUB6oc6y7MWS | 37,574 | Wrong KV cache update for sliding-window attention (SWA) layers when total sequence length reaches window size | {
"login": "plienhar",
"id": 118842459,
"node_id": "U_kgDOBxVkWw",
"avatar_url": "https://avatars.githubusercontent.com/u/118842459?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/plienhar",
"html_url": "https://github.com/plienhar",
"followers_url": "https://api.github.com/users/plienhar/followers",
"following_url": "https://api.github.com/users/plienhar/following{/other_user}",
"gists_url": "https://api.github.com/users/plienhar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/plienhar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/plienhar/subscriptions",
"organizations_url": "https://api.github.com/users/plienhar/orgs",
"repos_url": "https://api.github.com/users/plienhar/repos",
"events_url": "https://api.github.com/users/plienhar/events{/privacy}",
"received_events_url": "https://api.github.com/users/plienhar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | [] | 2025-04-17T08:52:07 | 2025-05-20T10:46:14 | 2025-05-20T10:46:14 | NONE | null | null | null | null | ### System Info
```text
- `transformers` version: 4.51.2
- Platform: Linux-6.8.0-1021-aws-x86_64-with-glibc2.35
- Python version: 3.12.8
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: No
- GPU type: NVIDIA L40S
```
### Who can help?
@ArthurZucker
### Reproduction
```python
from types import SimpleNamespace

import numpy as np
import torch
import torch.nn.functional as F

from transformers.cache_utils import HybridCache


def update_2d_attention_mask(attention_mask_2d: torch.LongTensor, padding_side: str) -> torch.LongTensor:
    # From transformers.utils.GenerationMixin._update_model_kwargs_for_generation
    batch_size, _ = attention_mask_2d.shape
    if padding_side == "left":
        attention_mask_2d = torch.cat([attention_mask_2d, attention_mask_2d.new_ones((batch_size, 1))], dim=1)
    else:
        attention_mask_2d = torch.cat([attention_mask_2d.new_ones((batch_size, 1)), attention_mask_2d], dim=1)
    return attention_mask_2d


def create_cache_positions(attention_mask_2d: torch.LongTensor, is_prefill: bool) -> torch.LongTensor:
    # From transformers.utils.GenerationMixin._get_initial_cache_position
    cache_position = torch.ones_like(attention_mask_2d[0, :], dtype=torch.int64).cumsum(0) - 1
    if is_prefill:
        return cache_position
    else:
        return cache_position[-1:]


config = SimpleNamespace(
    num_hidden_layers=1,
    num_key_value_heads=1,
    num_attention_heads=1,
    head_dim=2,
    hidden_size=2,
    sliding_window=8,
    sliding_window_pattern=4
)
max_batch_size = 1
max_cache_length = 12
kv_cache_manager = HybridCache(
    config=config,
    max_batch_size=max_batch_size,
    max_cache_len=max_cache_length
)

# ** INPUTS **
# 2D attention mask of shape (batch_size, max_input_length) from the tokenizer output
attention_mask_2d = torch.tensor(
    [[1, 1, 1, 1, 1, 1]],
    dtype=torch.int32
)
batch_size, input_sequence_length = attention_mask_2d.shape
# Prefill key update of shape (max_batch_size, num_key_value_heads, max_input_length, head_dim)
key_states = torch.arange(1, input_sequence_length + 1)[None, None, :, None].to(dtype=torch.float32).expand(-1, -1, -1, config.head_dim)
# Prefill cache positions of shape (max_input_length,). Tensor of absolute cache positions where the update will be written to.
cache_position = create_cache_positions(attention_mask_2d=attention_mask_2d, is_prefill=True)

print("---- INITIAL CACHE STATE ----")
print(" - k_cache:")
print(np.array2string(kv_cache_manager.key_cache[0].transpose(3, 2).numpy(), precision=4, suppress_small=True))

# ** PREFILL **
print("\n---- PREFILL ----")
print("> k_cache state before cache update:")
print(np.array2string(kv_cache_manager.key_cache[0].transpose(3, 2).numpy(), precision=4, suppress_small=True))
print("> Cache update inputs:")
print(" - cache_position:")
print(np.array2string(cache_position.numpy(), precision=4, suppress_small=True))
print(" - cache update (keys):")
print(np.array2string(key_states.transpose(3, 2).numpy(), precision=4, suppress_small=True))
k_cache, v_cache = kv_cache_manager.update(
    key_states=key_states,
    value_states=key_states + 1,
    layer_idx=0,
    cache_kwargs={
        "cache_position": cache_position,
        "sliding_window": config.sliding_window,
    }
)
print("> k_cache state after cache update:")
print(np.array2string(kv_cache_manager.key_cache[0].transpose(3, 2).numpy(), precision=4, suppress_small=True))
attention_mask_2d = update_2d_attention_mask(attention_mask_2d=attention_mask_2d, padding_side="left")

# ** TOKEN GENERATION **
for token_gen_step_idx in range(max_cache_length - input_sequence_length):
    print(f"\n---- TOKEN GENERATION STEP #{token_gen_step_idx + 1} ----")
    batch_size, input_sequence_length = attention_mask_2d.shape
    # Token generation key update of shape (max_batch_size, num_key_value_heads, 1, head_dim)
    key_states = torch.tensor([[input_sequence_length]])[:, None, :, None].to(dtype=torch.float32).expand(-1, -1, -1, config.head_dim)
    # Token generation cache_positions of shape (1,). Its unique value is the absolute cache position the update will be written to.
    cache_position = create_cache_positions(attention_mask_2d=attention_mask_2d, is_prefill=False)
    print("> k_cache state before cache update:")
    print(np.array2string(kv_cache_manager.key_cache[0].transpose(3, 2).numpy(), precision=4, suppress_small=True))
    print("> Cache update inputs:")
    print(" - cache_position:")
    print(np.array2string(cache_position.numpy(), precision=4, suppress_small=True))
    print(" - cache update (keys):")
    print(np.array2string(key_states.transpose(3, 2).numpy(), precision=4, suppress_small=True))
    k_cache, v_cache = kv_cache_manager.update(
        key_states=key_states,
        value_states=key_states + 1,
        layer_idx=0,
        cache_kwargs={
            "cache_position": cache_position,
            "sliding_window": config.sliding_window,
        }
    )
    attention_mask_2d = update_2d_attention_mask(attention_mask_2d=attention_mask_2d, padding_side="left")
    print("> k_cache state after cache update:")
    print(np.array2string(kv_cache_manager.key_cache[0].transpose(3, 2).numpy(), precision=4, suppress_small=True))
```
### Expected behavior
Let's take a simple example assuming batch size 1, window size 8 and an input sequence length of 6. The cache is initialized as a tensor
of shape (batch_size, window_size) full of zeros.
* Initial cache state:
```text
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```
1. Assuming keys computed during the prefill phase are `[10.0, 10.1, 10.2, 10.3, 10.4, 10.5]`, final sequence length is 6 and
the cache state after update is:
```text
[10.0, 10.1, 10.2, 10.3, 10.4, 10.5, 0.0, 0.0]
```
2. Assuming the keys computed during token generation step 1 are `[10.6]`, final sequence length is 7 and the cache state after update is:
```text
[10.0, 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 0.0]
```
3. Assuming the keys computed during token generation step 2 are `[10.7]`, final sequence length is 8 and the cache state after update is:
```text
[10.0, 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7]
```
4. Assuming the keys computed during token generation step 3 are `[10.8]`, final sequence length is 9 and now exceeds the window size. The cache must therefore be rolled before being updated and the cache state after update is:
```text
[10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8]
```
5. Assuming the keys computed during token generation step 4 are `[10.9]`, final sequence length is 10 and exceeds the window size. The cache must therefore be rolled before being updated and the cache state after update is:
```text
[10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8, 10.9]
```
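The expected rolling behavior above is simply a fixed-size FIFO over the last window-size positions. A simplified 1-D model with `collections.deque` (an illustration, not the actual cache implementation) reproduces the walkthrough:

```python
from collections import deque

window_size = 8
cache = deque(maxlen=window_size)

# Prefill with 6 keys, then generate 4 more, as in the walkthrough above.
keys = [10.0, 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8, 10.9]
for k in keys[:6]:
    cache.append(k)
assert list(cache) == [10.0, 10.1, 10.2, 10.3, 10.4, 10.5]  # after prefill

for k in keys[6:]:
    cache.append(k)  # deque rolls only once it already holds window_size items

print(list(cache))  # [10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8, 10.9]
```

Note the oldest entry is evicted only when the sequence actually exceeds the window, which is exactly the behavior expected of the cache.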
### Actual behavior
The actual behavior matches the expected behavior in the following cases:
* If the initial prompt length is larger than the window size.
* If the initial prompt length is smaller than the window size & the total sequence length remains below window size - 1.
However, for sequences that start smaller than the window size and grow beyond it during generation,
we observe the following diverging behavior (cf. the Python reproduction code above).
* Initial cache state:
```text
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```
1. Cache state after prefill update:
```text
[10.0, 10.1, 10.2, 10.3, 10.4, 10.5, 0.0, 0.0]
```
2. Cache state after token generation step 1 update:
```text
[10.0, 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 0.0]
```
3. Cache state after token generation step 2 update:
```text
[10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 0.0, 10.7]
```
4. Cache state after token generation step 3 update:
```text
[10.2, 10.3, 10.4, 10.5, 10.6, 0.0, 10.7, 10.8]
```
5. Cache state after token generation step 4 update:
```text
[10.3, 10.4, 10.5, 10.6, 0.0, 10.7, 10.8, 10.9]
```
Essentially, the current cache update implementation rolls the cache one step too early.
Within the code source, this means that the rolling condition should be `to_shift = cache_position > max_cache_len - 1`
instead of `to_shift = cache_position >= max_cache_len - 1`.
Updating one step too early wrongly:
* Shifts all tokens with absolute position lower than or equal to window size - 1 by one position.
* Inserts a zero key & value at position window size.
The impact is long-lasting, as we only recover the expected cache state once the total sequence length reaches twice the window size.
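The off-by-one can be reproduced without any tensors. The following simplified 1-D simulation of the sliding update (a sketch, not the actual `HybridCache` code) contrasts the current `>=` shift condition with the proposed `>` condition:

```python
def sliding_update(cache, pos, value, shift_early):
    """1-D stand-in for the sliding cache update. `shift_early` selects the
    buggy `>=` condition; otherwise the proposed `>` condition is used."""
    n = len(cache)
    to_shift = pos >= n - 1 if shift_early else pos > n - 1
    if to_shift:
        cache = cache[1:] + cache[:1]  # roll left by one position
    cache[min(pos, n - 1)] = value     # write at the (clamped) cache position
    return cache

def run(shift_early):
    cache = [0.0] * 8  # window size 8, initialized to zeros
    for pos, value in enumerate([10.0, 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8]):
        cache = sliding_update(cache, pos, value, shift_early)
    return cache

print(run(shift_early=False))  # [10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8]
print(run(shift_early=True))   # [10.2, 10.3, 10.4, 10.5, 10.6, 0.0, 10.7, 10.8]
```

With `shift_early=True` the roll happens one step before the window is actually full, leaving the spurious zero slot described above; with the `>` condition the final state matches the expected walkthrough.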
### Possible fix
In our understanding, the above behavior can be fixed by changing two lines in `HybridCache._sliding_update` and `SlidingWindowCache._sliding_update`.
```python
from typing import Tuple

import torch


class FixedHybridCache(HybridCache):
    def _sliding_update(self,
                        cache_position: torch.LongTensor,
                        layer_idx: int,
                        key_states: torch.FloatTensor,
                        value_states: torch.FloatTensor,
                        k_out: torch.FloatTensor,
                        v_out: torch.FloatTensor,
                        max_cache_len: int) -> Tuple[torch.FloatTensor, torch.FloatTensor]:
        if cache_position.shape[0] > max_cache_len:
            k_out = key_states[:, :, -max_cache_len:, :]
            v_out = value_states[:, :, -max_cache_len:, :]
            # Assumption: caches are all zeros at this point, `+=` is equivalent to `=` but compile-friendly
            self.key_cache[layer_idx] += k_out
            self.value_cache[layer_idx] += v_out
            # we should return the whole states instead of k_out, v_out to take the whole prompt
            # into consideration when building kv cache instead of just throwing away tokens outside of the window
            return key_states, value_states

        slicing = torch.ones(max_cache_len, dtype=torch.long, device=value_states.device).cumsum(0)
        # >>>>>>>>>>>>>> START OF FIX >>>>>>>>>>>>>>
        # cache_position = cache_position.clamp(0, max_cache_len - 1)
        # to_shift = cache_position >= max_cache_len - 1
        to_shift = cache_position > max_cache_len - 1
        cache_position = cache_position.clamp(0, max_cache_len - 1)
        # <<<<<<<<<<<<<< END OF FIX <<<<<<<<<<<<<<<<
        indices = (slicing + to_shift[-1].int() - 1) % max_cache_len
        k_out = k_out[:, :, indices]
        v_out = v_out[:, :, indices]

        k_out[:, :, cache_position] = key_states
        v_out[:, :, cache_position] = value_states
        # `_.zero_()` followed by `+=` is equivalent to `=`, but compile-friendly (without graph breaks due to assignment)
        self.key_cache[layer_idx].zero_()
        self.value_cache[layer_idx].zero_()
        self.key_cache[layer_idx] += k_out
        self.value_cache[layer_idx] += v_out
        return k_out, v_out
```
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37574/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37574/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37573 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37573/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37573/comments | https://api.github.com/repos/huggingface/transformers/issues/37573/events | https://github.com/huggingface/transformers/pull/37573 | 3,001,641,309 | PR_kwDOCUB6oc6S8gUu | 37,573 | 🚨[VLMs] use only `xxx_token_id` for multimodal tokens | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T07:09:38 | 2025-04-29T10:09:03 | 2025-04-18T15:03:40 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37573",
"html_url": "https://github.com/huggingface/transformers/pull/37573",
"diff_url": "https://github.com/huggingface/transformers/pull/37573.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37573.patch",
"merged_at": "2025-04-18T15:03:40"
} | # What does this PR do?
As per the title, an attempt to standardize config naming when it comes to special tokens. We currently use both `x_index` and `x_id`. Even though the majority of models use `x_index`, I decided that `x_id` aligns better with text models, where we have `pad_token_id` etc.
This is one of the steps to bring the vLLM + transformers integration a bit closer 🤏🏻 From now on, please use `xx_id` whenever a new model is added. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37573/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/37573/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37572 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37572/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37572/comments | https://api.github.com/repos/huggingface/transformers/issues/37572/events | https://github.com/huggingface/transformers/pull/37572 | 3,001,578,324 | PR_kwDOCUB6oc6S8Sk6 | 37,572 | fix 2 encoder_decoder issues on XPU | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T06:39:45 | 2025-04-20T22:50:09 | 2025-04-18T15:49:24 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37572",
"html_url": "https://github.com/huggingface/transformers/pull/37572",
"diff_url": "https://github.com/huggingface/transformers/pull/37572.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37572.patch",
"merged_at": "2025-04-18T15:49:24"
} | PASSED pytest -rA tests/models/encoder_decoder/test_modeling_encoder_decoder.py -k test_roberta2roberta_summarization
PASSED pytest -rA tests/models/encoder_decoder/test_modeling_encoder_decoder.py -k test_bert2gpt2_summarization | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37572/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37572/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37571 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37571/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37571/comments | https://api.github.com/repos/huggingface/transformers/issues/37571/events | https://github.com/huggingface/transformers/pull/37571 | 3,001,520,792 | PR_kwDOCUB6oc6S8GAb | 37,571 | enable 6 modeling cases on XPU | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T06:08:50 | 2025-04-20T22:48:49 | 2025-04-18T10:28:08 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37571",
"html_url": "https://github.com/huggingface/transformers/pull/37571",
"diff_url": "https://github.com/huggingface/transformers/pull/37571.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37571.patch",
"merged_at": "2025-04-18T10:28:08"
} | **4 PASSED**
PASSED tests/models/mpt/test_modeling_mpt.py::MptIntegrationTests::test_model_logits
PASSED tests/models/nemotron/test_modeling_nemotron.py::NemotronIntegrationTest::test_nemotron_8b_generation_eager
PASSED tests/models/bamba/test_modeling_bamba.py::BambaModelIntegrationTest::test_simple_batched_generate_with_padding
PASSED tests/models/bamba/test_modeling_bamba.py::BambaModelIntegrationTest::test_simple_generate
**2 FAILED**
FAILED tests/models/gemma/test_modeling_gemma.py::GemmaIntegrationTest::test_compile_static_cache, will fix in separate PR
FAILED tests/models/nemotron/test_modeling_nemotron.py::NemotronIntegrationTest::test_nemotron_8b_generation_sdpa, will fix in separate PR | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37571/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37571/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37570 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37570/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37570/comments | https://api.github.com/repos/huggingface/transformers/issues/37570/events | https://github.com/huggingface/transformers/issues/37570 | 3,001,363,608 | I_kwDOCUB6oc6y5SyY | 37,570 | How to stream output audio of Qwen2.5-omni-7b | {
"login": "qinxuye",
"id": 357506,
"node_id": "MDQ6VXNlcjM1NzUwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/357506?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qinxuye",
"html_url": "https://github.com/qinxuye",
"followers_url": "https://api.github.com/users/qinxuye/followers",
"following_url": "https://api.github.com/users/qinxuye/following{/other_user}",
"gists_url": "https://api.github.com/users/qinxuye/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qinxuye/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qinxuye/subscriptions",
"organizations_url": "https://api.github.com/users/qinxuye/orgs",
"repos_url": "https://api.github.com/users/qinxuye/repos",
"events_url": "https://api.github.com/users/qinxuye/events{/privacy}",
"received_events_url": "https://api.github.com/users/qinxuye/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T04:16:35 | 2025-07-30T08:03:44 | 2025-07-30T08:03:44 | NONE | null | null | null | null | None of the Qwen2.5-omni-7b examples show how to stream output audio. By passing a streamer, I am able to get streaming text, but how can I get streaming audio output? | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37570/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37570/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37569 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37569/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37569/comments | https://api.github.com/repos/huggingface/transformers/issues/37569/events | https://github.com/huggingface/transformers/pull/37569 | 3,001,283,691 | PR_kwDOCUB6oc6S7TVS | 37,569 | enable 6 granite cases on xpu | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T03:20:07 | 2025-04-22T22:57:15 | 2025-04-22T15:55:02 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37569",
"html_url": "https://github.com/huggingface/transformers/pull/37569",
"diff_url": "https://github.com/huggingface/transformers/pull/37569.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37569.patch",
"merged_at": "2025-04-22T15:55:02"
} | **cases**
tests/models/granitemoe/test_modeling_granitemoe.py::GraniteMoeIntegrationTest::test_model_3b_generation
tests/models/granitemoeshared/test_modeling_granitemoeshared.py::GraniteMoeSharedIntegrationTest::test_model_3b_generation
tests/models/granite/test_modeling_granite.py::GraniteIntegrationTest::test_model_3b_logits
tests/models/granitemoe/test_modeling_granitemoe.py::GraniteMoeIntegrationTest::test_model_3b_logits
tests/models/granitemoeshared/test_modeling_granitemoeshared.py::GraniteMoeSharedIntegrationTest::test_model_3b_logits
tests/models/granite/test_modeling_granite.py::GraniteIntegrationTest::test_model_3b_logits_bf16
**results**
All pass on XPU. XPU has numerical behavior similar to the A100's, so the A100 ground truth also holds on XPU. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37569/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37569/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37568 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37568/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37568/comments | https://api.github.com/repos/huggingface/transformers/issues/37568/events | https://github.com/huggingface/transformers/pull/37568 | 3,001,147,744 | PR_kwDOCUB6oc6S63Kc | 37,568 | Gaudi: Add the bf16 support for hpu | {
"login": "yuanwu2017",
"id": 34643241,
"node_id": "MDQ6VXNlcjM0NjQzMjQx",
"avatar_url": "https://avatars.githubusercontent.com/u/34643241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuanwu2017",
"html_url": "https://github.com/yuanwu2017",
"followers_url": "https://api.github.com/users/yuanwu2017/followers",
"following_url": "https://api.github.com/users/yuanwu2017/following{/other_user}",
"gists_url": "https://api.github.com/users/yuanwu2017/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuanwu2017/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuanwu2017/subscriptions",
"organizations_url": "https://api.github.com/users/yuanwu2017/orgs",
"repos_url": "https://api.github.com/users/yuanwu2017/repos",
"events_url": "https://api.github.com/users/yuanwu2017/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuanwu2017/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T01:34:53 | 2025-04-18T06:25:31 | 2025-04-18T06:00:27 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37568",
"html_url": "https://github.com/huggingface/transformers/pull/37568",
"diff_url": "https://github.com/huggingface/transformers/pull/37568.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37568.patch",
"merged_at": "2025-04-18T06:00:27"
} | # What does this PR do?
Add the bf16 support for hpu, otherwise the following error will occur.
```
[2025-04-17 01:25:52,600] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to hpu (auto detect)
Traceback (most recent call last):
File "/workspace/HuggingFace/tests/workloads/fine-tune/run_lora_clm.py", line 901, in <module>
main()
File "/workspace/HuggingFace/tests/workloads/fine-tune/run_lora_clm.py", line 414, in main
model_args, data_args, training_args, finetune_args = parser.parse_json_file(
File "/workspace/transformers/src/transformers/hf_argparser.py", line 420, in parse_json_file
outputs = self.parse_dict(data, allow_extra_keys=allow_extra_keys)
File "/workspace/transformers/src/transformers/hf_argparser.py", line 393, in parse_dict
obj = dtype(**inputs)
File "<string>", line 131, in __init__
File "/workspace/transformers/src/transformers/training_args.py", line 1692, in __post_init__
raise ValueError(error_message)
ValueError: Your setup doesn't support bf16/gpu.
```
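For context, the error above comes from a device-capability gate in `TrainingArguments.__post_init__`. A minimal sketch of the kind of predicate this PR broadens — the helper name and backend set are the editor's assumptions, not the actual transformers source:

```python
def bf16_supported(device_type: str) -> bool:
    # Hypothetical gate mirroring the __post_init__ check: bf16 training is
    # permitted on these accelerator backends; this PR adds Gaudi's "hpu".
    return device_type in {"cuda", "cpu", "xpu", "npu", "hpu"}


print(bf16_supported("hpu"))  # True with this change; previously the HPU path raised
```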
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "IlyasMoutawwakil",
"id": 57442720,
"node_id": "MDQ6VXNlcjU3NDQyNzIw",
"avatar_url": "https://avatars.githubusercontent.com/u/57442720?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/IlyasMoutawwakil",
"html_url": "https://github.com/IlyasMoutawwakil",
"followers_url": "https://api.github.com/users/IlyasMoutawwakil/followers",
"following_url": "https://api.github.com/users/IlyasMoutawwakil/following{/other_user}",
"gists_url": "https://api.github.com/users/IlyasMoutawwakil/gists{/gist_id}",
"starred_url": "https://api.github.com/users/IlyasMoutawwakil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/IlyasMoutawwakil/subscriptions",
"organizations_url": "https://api.github.com/users/IlyasMoutawwakil/orgs",
"repos_url": "https://api.github.com/users/IlyasMoutawwakil/repos",
"events_url": "https://api.github.com/users/IlyasMoutawwakil/events{/privacy}",
"received_events_url": "https://api.github.com/users/IlyasMoutawwakil/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37568/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37568/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37567 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37567/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37567/comments | https://api.github.com/repos/huggingface/transformers/issues/37567/events | https://github.com/huggingface/transformers/pull/37567 | 3,001,134,038 | PR_kwDOCUB6oc6S60Wv | 37,567 | docs: fix typo | {
"login": "tonyksong",
"id": 38965603,
"node_id": "MDQ6VXNlcjM4OTY1NjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/38965603?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tonyksong",
"html_url": "https://github.com/tonyksong",
"followers_url": "https://api.github.com/users/tonyksong/followers",
"following_url": "https://api.github.com/users/tonyksong/following{/other_user}",
"gists_url": "https://api.github.com/users/tonyksong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tonyksong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tonyksong/subscriptions",
"organizations_url": "https://api.github.com/users/tonyksong/orgs",
"repos_url": "https://api.github.com/users/tonyksong/repos",
"events_url": "https://api.github.com/users/tonyksong/events{/privacy}",
"received_events_url": "https://api.github.com/users/tonyksong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-17T01:26:29 | 2025-04-17T13:54:44 | 2025-04-17T13:54:44 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37567",
"html_url": "https://github.com/huggingface/transformers/pull/37567",
"diff_url": "https://github.com/huggingface/transformers/pull/37567.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37567.patch",
"merged_at": "2025-04-17T13:54:44"
} | # What does this PR do?
Fixes typo
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37567/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37567/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37566 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37566/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37566/comments | https://api.github.com/repos/huggingface/transformers/issues/37566/events | https://github.com/huggingface/transformers/issues/37566 | 3,001,118,229 | I_kwDOCUB6oc6y4W4V | 37,566 | clip gradient not working | {
"login": "jiangix-paper",
"id": 62198809,
"node_id": "MDQ6VXNlcjYyMTk4ODA5",
"avatar_url": "https://avatars.githubusercontent.com/u/62198809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiangix-paper",
"html_url": "https://github.com/jiangix-paper",
"followers_url": "https://api.github.com/users/jiangix-paper/followers",
"following_url": "https://api.github.com/users/jiangix-paper/following{/other_user}",
"gists_url": "https://api.github.com/users/jiangix-paper/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiangix-paper/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiangix-paper/subscriptions",
"organizations_url": "https://api.github.com/users/jiangix-paper/orgs",
"repos_url": "https://api.github.com/users/jiangix-paper/repos",
"events_url": "https://api.github.com/users/jiangix-paper/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiangix-paper/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-17T01:13:18 | 2025-05-25T08:02:31 | 2025-05-25T08:02:31 | NONE | null | null | null | null | ### System Info
trl 0.16.0
deepspeed 0.15.4
accelerate 0.34.0
pytorch 2.5.1
transformers 0.49.0
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Hello, I used accelerate + DeepSpeed ZeRO-3 for distributed GRPO training on 8 A800 GPUs.
For gradient clipping, I set `max_grad_norm=1.0` in the training arguments and `gradient_clipping=1.0` in deepspeed3.yaml.
During training, many of the printed grad_norm values are greater than 1.0.
It seems that these parameters have no effect.
### Expected behavior
I think the grad_norm values should be at most max_grad_norm (or gradient_clipping).
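For context, here is a minimal pure-Python sketch of how `clip_grad_norm_`-style clipping typically behaves. It assumes the logged grad_norm is the *pre-clip* norm, which is what `torch.nn.utils.clip_grad_norm_` returns, so logged values above the threshold do not by themselves prove that clipping is broken:

```python
import math

def clip_grad_norm(grads, max_norm):
    """Mimics torch.nn.utils.clip_grad_norm_: scales grads in place and
    returns the total norm *before* clipping (what trainers usually log)."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    clip_coef = max_norm / (total_norm + 1e-6)
    if clip_coef < 1.0:
        grads[:] = [g * clip_coef for g in grads]
    return total_norm

grads = [3.0, 4.0]                 # total norm = 5.0
reported = clip_grad_norm(grads, max_norm=1.0)
print(reported)                    # 5.0 -> this is the value that shows up in logs
print(math.sqrt(sum(g * g for g in grads)))  # ~1.0 after clipping
```

If the actual post-clip gradient norms exceed the threshold, that would indicate a real bug; the logged pre-clip norm exceeding it is expected.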
| {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37566/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37566/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37565 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37565/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37565/comments | https://api.github.com/repos/huggingface/transformers/issues/37565/events | https://github.com/huggingface/transformers/pull/37565 | 3,001,029,519 | PR_kwDOCUB6oc6S6eEP | 37,565 | Tests for the new Tensor Parallel integration | {
"login": "ailunc",
"id": 131329865,
"node_id": "U_kgDOB9PvSQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131329865?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ailunc",
"html_url": "https://github.com/ailunc",
"followers_url": "https://api.github.com/users/ailunc/followers",
"following_url": "https://api.github.com/users/ailunc/following{/other_user}",
"gists_url": "https://api.github.com/users/ailunc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ailunc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ailunc/subscriptions",
"organizations_url": "https://api.github.com/users/ailunc/orgs",
"repos_url": "https://api.github.com/users/ailunc/repos",
"events_url": "https://api.github.com/users/ailunc/events{/privacy}",
"received_events_url": "https://api.github.com/users/ailunc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T23:58:57 | 2025-04-17T21:57:43 | 2025-04-17T21:57:43 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37565",
"html_url": "https://github.com/huggingface/transformers/pull/37565",
"diff_url": "https://github.com/huggingface/transformers/pull/37565.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37565.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #37557 (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. https://github.com/huggingface/transformers/issues/37557
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ailunc",
"id": 131329865,
"node_id": "U_kgDOB9PvSQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131329865?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ailunc",
"html_url": "https://github.com/ailunc",
"followers_url": "https://api.github.com/users/ailunc/followers",
"following_url": "https://api.github.com/users/ailunc/following{/other_user}",
"gists_url": "https://api.github.com/users/ailunc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ailunc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ailunc/subscriptions",
"organizations_url": "https://api.github.com/users/ailunc/orgs",
"repos_url": "https://api.github.com/users/ailunc/repos",
"events_url": "https://api.github.com/users/ailunc/events{/privacy}",
"received_events_url": "https://api.github.com/users/ailunc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37565/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37565/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37564 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37564/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37564/comments | https://api.github.com/repos/huggingface/transformers/issues/37564/events | https://github.com/huggingface/transformers/pull/37564 | 3,001,000,334 | PR_kwDOCUB6oc6S6Xs1 | 37,564 | enable 6 gemma2 cases on XPU | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T23:34:52 | 2025-04-20T22:46:36 | 2025-04-18T10:10:34 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37564",
"html_url": "https://github.com/huggingface/transformers/pull/37564",
"diff_url": "https://github.com/huggingface/transformers/pull/37564.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37564.patch",
"merged_at": "2025-04-18T10:10:34"
} | **2 pass**
PASSED tests/models/gemma2/test_modeling_gemma2.py::Gemma2IntegrationTest::test_export_static_cache
PASSED tests/models/gemma2/test_modeling_gemma2.py::Gemma2IntegrationTest::test_generation_beyond_sliding_window_3_eager
**1 should skip: XPU doesn't support the flash-attention-2 package as of now**
SKIPPED tests/models/gemma2/test_modeling_gemma2.py::Gemma2IntegrationTest::test_generation_beyond_sliding_window_0_flash_attention_2
**2 flex_attention failed, already in development plan**
FAILED tests/models/gemma2/test_modeling_gemma2.py::Gemma2IntegrationTest::test_generation_beyond_sliding_window_2_flex_attention
FAILED tests/models/gemma2/test_modeling_gemma2.py::Gemma2IntegrationTest::test_model_2b_pipeline_bf16_flex_attention
**1 produces repetitive output when the output sequence is long; under debugging, will submit a separate PR if needed**
tests/models/gemma2/test_modeling_gemma2.py::Gemma2IntegrationTest::test_generation_beyond_sliding_window_1_sdpa
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37564/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37564/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37563 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37563/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37563/comments | https://api.github.com/repos/huggingface/transformers/issues/37563/events | https://github.com/huggingface/transformers/pull/37563 | 3,000,747,700 | PR_kwDOCUB6oc6S5gcx | 37,563 | All models can be initialized on meta device | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T21:07:40 | 2025-04-16T21:45:53 | 2025-04-16T21:26:45 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37563",
"html_url": "https://github.com/huggingface/transformers/pull/37563",
"diff_url": "https://github.com/huggingface/transformers/pull/37563.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37563.patch",
"merged_at": "2025-04-16T21:26:45"
} | # What does this PR do?
Some old models could not be initialized on the meta device because they used dynamic `.item()` or `.tolist()` calls in `__init__`, which do not work with meta initialization since the underlying tensor is on the meta device. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37563/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37563/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37562 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37562/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37562/comments | https://api.github.com/repos/huggingface/transformers/issues/37562/events | https://github.com/huggingface/transformers/pull/37562 | 3,000,629,455 | PR_kwDOCUB6oc6S5HSE | 37,562 | Small fix on context manager detection | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T20:06:30 | 2025-04-17T13:39:46 | 2025-04-17T13:39:44 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37562",
"html_url": "https://github.com/huggingface/transformers/pull/37562",
"diff_url": "https://github.com/huggingface/transformers/pull/37562.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37562.patch",
"merged_at": "2025-04-17T13:39:44"
} | # What does this PR do?
Following feedback in https://github.com/huggingface/transformers/pull/37216!
cc @SunMarc
In the meantime, I switched the test to ensure we _can_ load on meta so that it works for 3rd party libs until we remove it (because it was not the case before https://github.com/huggingface/transformers/pull/37216, as we need to explicitly pass a `device_map=torch.device("meta")`, which is weird and an anti-pattern)
BTW, do you have concrete examples of 3rd party libs using such meta context managers? 🤔 | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37562/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37562/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37561 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37561/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37561/comments | https://api.github.com/repos/huggingface/transformers/issues/37561/events | https://github.com/huggingface/transformers/pull/37561 | 3,000,441,751 | PR_kwDOCUB6oc6S4fc6 | 37,561 | Nougat Fast Image Processor | {
"login": "Player256",
"id": 92082372,
"node_id": "U_kgDOBX0QxA",
"avatar_url": "https://avatars.githubusercontent.com/u/92082372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Player256",
"html_url": "https://github.com/Player256",
"followers_url": "https://api.github.com/users/Player256/followers",
"following_url": "https://api.github.com/users/Player256/following{/other_user}",
"gists_url": "https://api.github.com/users/Player256/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Player256/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Player256/subscriptions",
"organizations_url": "https://api.github.com/users/Player256/orgs",
"repos_url": "https://api.github.com/users/Player256/repos",
"events_url": "https://api.github.com/users/Player256/events{/privacy}",
"received_events_url": "https://api.github.com/users/Player256/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T18:39:48 | 2025-04-27T07:27:26 | 2025-04-27T07:25:14 | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37561",
"html_url": "https://github.com/huggingface/transformers/pull/37561",
"diff_url": "https://github.com/huggingface/transformers/pull/37561.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37561.patch",
"merged_at": null
} | # What does this PR do?
- Part of mega issue #36978: this PR adds a fast image processor for the Nougat series of models.
| {
"login": "Player256",
"id": 92082372,
"node_id": "U_kgDOBX0QxA",
"avatar_url": "https://avatars.githubusercontent.com/u/92082372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Player256",
"html_url": "https://github.com/Player256",
"followers_url": "https://api.github.com/users/Player256/followers",
"following_url": "https://api.github.com/users/Player256/following{/other_user}",
"gists_url": "https://api.github.com/users/Player256/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Player256/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Player256/subscriptions",
"organizations_url": "https://api.github.com/users/Player256/orgs",
"repos_url": "https://api.github.com/users/Player256/repos",
"events_url": "https://api.github.com/users/Player256/events{/privacy}",
"received_events_url": "https://api.github.com/users/Player256/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37561/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37561/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37560 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37560/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37560/comments | https://api.github.com/repos/huggingface/transformers/issues/37560/events | https://github.com/huggingface/transformers/pull/37560 | 3,000,369,940 | PR_kwDOCUB6oc6S4P7C | 37,560 | Enable granite speech 3.3 tests | {
"login": "alex-jw-brooks",
"id": 10740300,
"node_id": "MDQ6VXNlcjEwNzQwMzAw",
"avatar_url": "https://avatars.githubusercontent.com/u/10740300?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alex-jw-brooks",
"html_url": "https://github.com/alex-jw-brooks",
"followers_url": "https://api.github.com/users/alex-jw-brooks/followers",
"following_url": "https://api.github.com/users/alex-jw-brooks/following{/other_user}",
"gists_url": "https://api.github.com/users/alex-jw-brooks/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alex-jw-brooks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alex-jw-brooks/subscriptions",
"organizations_url": "https://api.github.com/users/alex-jw-brooks/orgs",
"repos_url": "https://api.github.com/users/alex-jw-brooks/repos",
"events_url": "https://api.github.com/users/alex-jw-brooks/events{/privacy}",
"received_events_url": "https://api.github.com/users/alex-jw-brooks/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T18:05:28 | 2025-05-06T15:56:18 | 2025-05-06T15:56:18 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37560",
"html_url": "https://github.com/huggingface/transformers/pull/37560",
"diff_url": "https://github.com/huggingface/transformers/pull/37560.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37560.patch",
"merged_at": "2025-05-06T15:56:18"
} | Turns on the tests for Granite Speech now that the 3.3 model is up! Integration tests are gated on peft, since they will fail if the LoRA isn't applied.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [X] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@eustlb | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37560/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37560/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37559 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37559/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37559/comments | https://api.github.com/repos/huggingface/transformers/issues/37559/events | https://github.com/huggingface/transformers/pull/37559 | 3,000,260,469 | PR_kwDOCUB6oc6S34SE | 37,559 | Fix qwen2audio wanr -> warn | {
"login": "alex-jw-brooks",
"id": 10740300,
"node_id": "MDQ6VXNlcjEwNzQwMzAw",
"avatar_url": "https://avatars.githubusercontent.com/u/10740300?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alex-jw-brooks",
"html_url": "https://github.com/alex-jw-brooks",
"followers_url": "https://api.github.com/users/alex-jw-brooks/followers",
"following_url": "https://api.github.com/users/alex-jw-brooks/following{/other_user}",
"gists_url": "https://api.github.com/users/alex-jw-brooks/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alex-jw-brooks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alex-jw-brooks/subscriptions",
"organizations_url": "https://api.github.com/users/alex-jw-brooks/orgs",
"repos_url": "https://api.github.com/users/alex-jw-brooks/repos",
"events_url": "https://api.github.com/users/alex-jw-brooks/events{/privacy}",
"received_events_url": "https://api.github.com/users/alex-jw-brooks/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T17:14:33 | 2025-04-17T13:34:59 | 2025-04-17T13:34:59 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37559",
"html_url": "https://github.com/huggingface/transformers/pull/37559",
"diff_url": "https://github.com/huggingface/transformers/pull/37559.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37559.patch",
"merged_at": "2025-04-17T13:34:59"
} | # What does this PR do?
Fixes a typo, `warnings.wanr` -> `warnings.warn`, in the qwen2 audio processor. Should not really be reachable because of the deprecation decorator, but maybe still nice to fix anyway!
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@eustlb | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37559/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37559/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37558 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37558/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37558/comments | https://api.github.com/repos/huggingface/transformers/issues/37558/events | https://github.com/huggingface/transformers/pull/37558 | 3,000,247,156 | PR_kwDOCUB6oc6S31Yr | 37,558 | [TimesFM] use the main revison instead of revision for integration test | {
"login": "kashif",
"id": 8100,
"node_id": "MDQ6VXNlcjgxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kashif",
"html_url": "https://github.com/kashif",
"followers_url": "https://api.github.com/users/kashif/followers",
"following_url": "https://api.github.com/users/kashif/following{/other_user}",
"gists_url": "https://api.github.com/users/kashif/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kashif/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kashif/subscriptions",
"organizations_url": "https://api.github.com/users/kashif/orgs",
"repos_url": "https://api.github.com/users/kashif/repos",
"events_url": "https://api.github.com/users/kashif/events{/privacy}",
"received_events_url": "https://api.github.com/users/kashif/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T17:08:19 | 2025-04-17T09:26:03 | 2025-04-17T09:26:03 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37558",
"html_url": "https://github.com/huggingface/transformers/pull/37558",
"diff_url": "https://github.com/huggingface/transformers/pull/37558.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37558.patch",
"merged_at": "2025-04-17T09:26:03"
} | # What does this PR do?
Use the main revision for the integration test.
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37558/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37558/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37557 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37557/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37557/comments | https://api.github.com/repos/huggingface/transformers/issues/37557/events | https://github.com/huggingface/transformers/issues/37557 | 3,000,185,338 | I_kwDOCUB6oc6y0zH6 | 37,557 | Missing tests for the new Tensor Parallel integration | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1834088753,
"node_id": "MDU6TGFiZWwxODM0MDg4NzUz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Tests",
"name": "Tests",
"color": "a6fcca",
"default": false,
"description": "Related to tests"
},
{
"id": 2760822153,
"node_id": "MDU6TGFiZWwyNzYwODIyMTUz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Tensor%20Parallel",
"name": "Tensor Parallel",
"color": "1AD0A8",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-04-16T16:38:00 | 2025-05-25T08:02:32 | 2025-05-25T08:02:32 | CONTRIBUTOR | null | null | null | null | With Llama4 and a couple of new models relying more and more on the TP integrations we have, we need to add tests using `torchrun` that:
- test basic TP functionalities (is a colwise split done correctly)
- serve as _functional_ documentation (how is a colwise split done in practice)
Down to write a draft of these soonish, writing it here for the record. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37557/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37557/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37556 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37556/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37556/comments | https://api.github.com/repos/huggingface/transformers/issues/37556/events | https://github.com/huggingface/transformers/issues/37556 | 3,000,149,670 | I_kwDOCUB6oc6y0qam | 37,556 | AutoConfig.from_pretrained on Llama4 models only returns the inner text_config | {
"login": "cg123",
"id": 397199,
"node_id": "MDQ6VXNlcjM5NzE5OQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/397199?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cg123",
"html_url": "https://github.com/cg123",
"followers_url": "https://api.github.com/users/cg123/followers",
"following_url": "https://api.github.com/users/cg123/following{/other_user}",
"gists_url": "https://api.github.com/users/cg123/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cg123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cg123/subscriptions",
"organizations_url": "https://api.github.com/users/cg123/orgs",
"repos_url": "https://api.github.com/users/cg123/repos",
"events_url": "https://api.github.com/users/cg123/events{/privacy}",
"received_events_url": "https://api.github.com/users/cg123/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-16T16:24:15 | 2025-05-19T13:44:03 | 2025-05-19T13:44:03 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: Linux-6.8.0-40-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.3.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: no
- GPU type: NVIDIA H100 PCIe
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
>>> import transformers
>>> cfg = transformers.AutoConfig.from_pretrained("meta-llama/Llama-4-Scout-17B-16E-Instruct")
>>> type(cfg)
<class 'transformers.models.llama.configuration_llama.LlamaConfig'>
```
### Expected behavior
I expect `AutoConfig.from_pretrained` to return the full `Llama4Config`, with both vision and text configs. As is, it's currently quite difficult to work with Llama 4 models. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37556/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/37556/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37555 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37555/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37555/comments | https://api.github.com/repos/huggingface/transformers/issues/37555/events | https://github.com/huggingface/transformers/issues/37555 | 3,000,132,627 | I_kwDOCUB6oc6y0mQT | 37,555 | KeyError: 'general.name' | {
"login": "nitinmukesh",
"id": 2102186,
"node_id": "MDQ6VXNlcjIxMDIxODY=",
"avatar_url": "https://avatars.githubusercontent.com/u/2102186?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nitinmukesh",
"html_url": "https://github.com/nitinmukesh",
"followers_url": "https://api.github.com/users/nitinmukesh/followers",
"following_url": "https://api.github.com/users/nitinmukesh/following{/other_user}",
"gists_url": "https://api.github.com/users/nitinmukesh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nitinmukesh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nitinmukesh/subscriptions",
"organizations_url": "https://api.github.com/users/nitinmukesh/orgs",
"repos_url": "https://api.github.com/users/nitinmukesh/repos",
"events_url": "https://api.github.com/users/nitinmukesh/events{/privacy}",
"received_events_url": "https://api.github.com/users/nitinmukesh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-16T16:17:21 | 2025-05-17T08:04:33 | 2025-05-17T08:04:33 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: Windows-10-10.0.26100-SP0
- Python version: 3.11.9
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0.dev0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA GeForce RTX 4060 Laptop GPU
### Who can help?
@SunMarc
@MekkCyber
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
import torch
from transformers import PreTrainedTokenizerFast, LlamaForCausalLM, CLIPTextModelWithProjection, T5EncoderModel
from diffusers import UniPCMultistepScheduler, HiDreamImagePipeline
from diffusers import AutoencoderKL, HiDreamImageTransformer2DModel
from diffusers import GGUFQuantizationConfig
scheduler = UniPCMultistepScheduler(
flow_shift=3.0,
prediction_type="flow_prediction",
use_flow_sigmas=True,
)
text_encoder_2 = CLIPTextModelWithProjection.from_pretrained("calcuis/hidream-gguf", gguf_file="clip_g_hidream_fp32-f16.gguf")
text_encoder_3 = T5EncoderModel.from_pretrained("calcuis/hidream-gguf", gguf_file="t5xxl_fp32-q4_0.gguf")
tokenizer_4 = PreTrainedTokenizerFast.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
text_encoder_4 = LlamaForCausalLM.from_pretrained("calcuis/hidream-gguf", gguf_file="llama-q2_k.gguf")
```
### Expected behavior
CLIPTextModelWithProjection and T5EncoderModel are not working and throw the same error.
LlamaForCausalLM is working.
```
(sddw-dev) C:\aiOWN\diffuser_webui>python HiDream_GGUF.py
import error: No module named 'triton'
Traceback (most recent call last):
File "C:\aiOWN\diffuser_webui\HiDream_GGUF.py", line 12, in <module>
text_encoder_2 = CLIPTextModelWithProjection.from_pretrained("calcuis/hidream-gguf", gguf_file="clip_g_hidream_fp32-f16.gguf")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\nitin\miniconda3\envs\sddw-dev\Lib\site-packages\transformers\modeling_utils.py", line 279, in _wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\nitin\miniconda3\envs\sddw-dev\Lib\site-packages\transformers\modeling_utils.py", line 4175, in from_pretrained
config, model_kwargs = cls.config_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\nitin\miniconda3\envs\sddw-dev\Lib\site-packages\transformers\configuration_utils.py", line 550, in from_pretrained
config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\nitin\miniconda3\envs\sddw-dev\Lib\site-packages\transformers\configuration_utils.py", line 590, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\nitin\miniconda3\envs\sddw-dev\Lib\site-packages\transformers\configuration_utils.py", line 681, in _get_config_dict
config_dict = load_gguf_checkpoint(resolved_config_file, return_tensors=False)["config"]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\nitin\miniconda3\envs\sddw-dev\Lib\site-packages\transformers\modeling_gguf_pytorch_utils.py", line 369, in load_gguf_checkpoint
model_name = read_field(reader, "general.name")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\nitin\miniconda3\envs\sddw-dev\Lib\site-packages\transformers\modeling_gguf_pytorch_utils.py", line 260, in read_field
value = reader.fields[field]
~~~~~~~~~~~~~^^^^^^^
KeyError: 'general.name'
``` | {
"login": "nitinmukesh",
"id": 2102186,
"node_id": "MDQ6VXNlcjIxMDIxODY=",
"avatar_url": "https://avatars.githubusercontent.com/u/2102186?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nitinmukesh",
"html_url": "https://github.com/nitinmukesh",
"followers_url": "https://api.github.com/users/nitinmukesh/followers",
"following_url": "https://api.github.com/users/nitinmukesh/following{/other_user}",
"gists_url": "https://api.github.com/users/nitinmukesh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nitinmukesh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nitinmukesh/subscriptions",
"organizations_url": "https://api.github.com/users/nitinmukesh/orgs",
"repos_url": "https://api.github.com/users/nitinmukesh/repos",
"events_url": "https://api.github.com/users/nitinmukesh/events{/privacy}",
"received_events_url": "https://api.github.com/users/nitinmukesh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37555/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37555/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37554 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37554/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37554/comments | https://api.github.com/repos/huggingface/transformers/issues/37554/events | https://github.com/huggingface/transformers/issues/37554 | 3,000,123,358 | I_kwDOCUB6oc6y0j_e | 37,554 | Possible reshape error in Mamba2Mixer causing inference issue | {
"login": "Kirire",
"id": 27903062,
"node_id": "MDQ6VXNlcjI3OTAzMDYy",
"avatar_url": "https://avatars.githubusercontent.com/u/27903062?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kirire",
"html_url": "https://github.com/Kirire",
"followers_url": "https://api.github.com/users/Kirire/followers",
"following_url": "https://api.github.com/users/Kirire/following{/other_user}",
"gists_url": "https://api.github.com/users/Kirire/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kirire/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kirire/subscriptions",
"organizations_url": "https://api.github.com/users/Kirire/orgs",
"repos_url": "https://api.github.com/users/Kirire/repos",
"events_url": "https://api.github.com/users/Kirire/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kirire/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-16T16:13:38 | 2025-05-17T08:33:42 | 2025-05-17T08:33:41 | CONTRIBUTOR | null | null | null | null | ### System Info
Hello,
I'm encountering a dimension mismatch error when running inference with the `Mamba2ForCausalLM` class. The error originates from line 579 of the `Mamba2Mixer` class on the `main` branch:
```python
D_residual = self.D[..., None] * pad_tensor_by_size(hidden_states, pad_size)
```
Could it be that there's a dimension mix-up a few lines above (line 572)? Specifically, this line:
```python
hidden_states = hidden_states.reshape(batch_size, seq_len, -1, self.head_dim).float()
```
Shouldn't it be:
```python
hidden_states = hidden_states.reshape(batch_size, seq_len, self.head_dim, -1).float()
```
Thanks in advance,
Cyrile
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
import torch
from transformers import Mamba2Config, Mamba2ForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("TempestTeam/TempestLLM")
config = Mamba2Config()
config.vocab_size = len(tokenizer)
config.bos_token_id = tokenizer.encode(tokenizer.special_tokens_map['bos_token'])[0]
config.eos_token_id = tokenizer.encode(tokenizer.special_tokens_map['eos_token'])[0]
config.pad_token_id = tokenizer.encode(tokenizer.special_tokens_map['pad_token'])[0]
config.num_hidden_layers = 1
config.hidden_size = 16
config.num_heads = 4
config.head_dim = 4
config.state_size = 8
model = Mamba2ForCausalLM(config)
tok = tokenizer("Hello World!", return_tensors="pt")
output = model(**tok)
```
### Expected behavior
An inference pass that completes without any dimension mismatch errors. | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37554/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37554/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37553 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37553/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37553/comments | https://api.github.com/repos/huggingface/transformers/issues/37553/events | https://github.com/huggingface/transformers/pull/37553 | 2,999,904,608 | PR_kwDOCUB6oc6S2qc8 | 37,553 | update `test_can_load_with_global_device_set` with a hack | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T14:50:48 | 2025-04-16T17:48:32 | 2025-04-16T17:48:30 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37553",
"html_url": "https://github.com/huggingface/transformers/pull/37553",
"diff_url": "https://github.com/huggingface/transformers/pull/37553.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37553.patch",
"merged_at": "2025-04-16T17:48:30"
} | # What does this PR do?
`torch.set_default_device` has unexpected impacts on other tests, even if we try to restore the original value by calling it again at the end of the test. See #37551 for one example that is fixed there.
This PR uses `torch._GLOBAL_DEVICE_CONTEXT` to do something closer to the internal mechanism of `torch.set_default_device` when performing the cleanup, see
https://github.com/pytorch/pytorch/blob/e229ce34c4ab8cd4e2800227615be32fb362b1e6/torch/__init__.py#L1205-L1218
Running
> RUN_SLOW=1 python3 -m pytest -v tests/models/mask2former/test_modeling_mask2former.py
will pass on this PR, while it fails on its base commit (b33edf1b).
> FAILED tests/models/mask2former/test_modeling_mask2former.py::Mask2FormerModelTest::test_torch_export - AttributeError: module 'torch._tensor' has no attribute 'split'
>FAILED tests/models/mask2former/test_modeling_mask2former.py::Mask2FormerModelIntegrationTest::test_export - AttributeError: module 'torch._tensor' has no attribute 'split'
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37553/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37553/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37552 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37552/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37552/comments | https://api.github.com/repos/huggingface/transformers/issues/37552/events | https://github.com/huggingface/transformers/pull/37552 | 2,999,792,663 | PR_kwDOCUB6oc6S2SJz | 37,552 | Fix TimesFm doc issue | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T14:12:23 | 2025-04-16T14:46:06 | 2025-04-16T14:28:42 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37552",
"html_url": "https://github.com/huggingface/transformers/pull/37552",
"diff_url": "https://github.com/huggingface/transformers/pull/37552.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37552.patch",
"merged_at": "2025-04-16T14:28:42"
} | # What does this PR do?
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37552/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37552/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37551 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37551/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37551/comments | https://api.github.com/repos/huggingface/transformers/issues/37551/events | https://github.com/huggingface/transformers/pull/37551 | 2,999,655,707 | PR_kwDOCUB6oc6S10ei | 37,551 | Fix device issue for tapas (with `as_tensor`) | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T13:25:11 | 2025-04-16T14:02:56 | 2025-04-16T14:02:53 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37551",
"html_url": "https://github.com/huggingface/transformers/pull/37551",
"diff_url": "https://github.com/huggingface/transformers/pull/37551.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37551.patch",
"merged_at": "2025-04-16T14:02:53"
} | # What does this PR do?
#37216 added a new test, `test_can_load_with_global_device_set`, in which, at the end of the test, we restore the default device by calling
> torch.set_default_device(default_device)
However, this has an unexpected impact on `as_tensor`. After calling `torch.set_default_device`, `as_tensor(T)` is placed on the `cpu` device even if `T` is on `cuda`; previously, `as_tensor(T)` had the same device as `T`.
This caused the `tapas` model to have 30 test failures. This PR sets the device explicitly in the `tapas` code to avoid the issue; I will open an issue on PyTorch's repository later.
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37551/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37551/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37550 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37550/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37550/comments | https://api.github.com/repos/huggingface/transformers/issues/37550/events | https://github.com/huggingface/transformers/pull/37550 | 2,999,172,962 | PR_kwDOCUB6oc6S0LNs | 37,550 | More appropriate cuda warmup in resource-constrained hardware | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T10:19:20 | 2025-04-16T14:07:27 | 2025-04-16T11:40:02 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37550",
"html_url": "https://github.com/huggingface/transformers/pull/37550",
"diff_url": "https://github.com/huggingface/transformers/pull/37550.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37550.patch",
"merged_at": "2025-04-16T11:40:02"
} | # What does this PR do?
As per the title. See https://github.com/huggingface/transformers/issues/37436 for the edge-case issue
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37550/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37550/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37549 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37549/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37549/comments | https://api.github.com/repos/huggingface/transformers/issues/37549/events | https://github.com/huggingface/transformers/pull/37549 | 2,998,994,057 | PR_kwDOCUB6oc6SzkPV | 37,549 | Fix the fsdp config cannot work issue. | {
"login": "yuanwu2017",
"id": 34643241,
"node_id": "MDQ6VXNlcjM0NjQzMjQx",
"avatar_url": "https://avatars.githubusercontent.com/u/34643241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuanwu2017",
"html_url": "https://github.com/yuanwu2017",
"followers_url": "https://api.github.com/users/yuanwu2017/followers",
"following_url": "https://api.github.com/users/yuanwu2017/following{/other_user}",
"gists_url": "https://api.github.com/users/yuanwu2017/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuanwu2017/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuanwu2017/subscriptions",
"organizations_url": "https://api.github.com/users/yuanwu2017/orgs",
"repos_url": "https://api.github.com/users/yuanwu2017/repos",
"events_url": "https://api.github.com/users/yuanwu2017/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuanwu2017/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T09:11:34 | 2025-04-28T08:44:51 | 2025-04-28T08:44:51 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37549",
"html_url": "https://github.com/huggingface/transformers/pull/37549",
"diff_url": "https://github.com/huggingface/transformers/pull/37549.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37549.patch",
"merged_at": "2025-04-28T08:44:51"
} | # What does this PR do?
The FSDP config generated by `accelerate config` has all of its keys prefixed with `fsdp_`. When these settings are passed in directly as parameters, they do not take effect, whereas the same settings passed via an `fsdp_config.json` file do. This patch makes the two methods behave consistently so that both work.
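A hypothetical sketch of the kind of key normalization involved — stripping the `fsdp_` prefix so that both input styles reach the trainer with identical keys (the function name and keys are illustrative, not the actual Trainer implementation):

```python
def normalize_fsdp_config(raw):
    """Strip a leading 'fsdp_' from keys so parameters passed directly
    (fsdp_sharding_strategy=...) match keys from an fsdp_config.json
    file (sharding_strategy=...)."""
    return {key.removeprefix("fsdp_"): value for key, value in raw.items()}

cli_style = {"fsdp_sharding_strategy": "FULL_SHARD", "fsdp_offload_params": False}
json_style = {"sharding_strategy": "FULL_SHARD", "offload_params": False}

# Both styles normalize to the same dict, so they behave consistently.
assert normalize_fsdp_config(cli_style) == normalize_fsdp_config(json_style)
print(normalize_fsdp_config(cli_style))
```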
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37549/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37549/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37548 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37548/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37548/comments | https://api.github.com/repos/huggingface/transformers/issues/37548/events | https://github.com/huggingface/transformers/pull/37548 | 2,998,716,883 | PR_kwDOCUB6oc6SyoCw | 37,548 | enable 6 rt_detr_v2 cases on xpu | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T07:22:54 | 2025-04-17T08:16:50 | 2025-04-16T09:23:56 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37548",
"html_url": "https://github.com/huggingface/transformers/pull/37548",
"diff_url": "https://github.com/huggingface/transformers/pull/37548.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37548.patch",
"merged_at": "2025-04-16T09:23:56"
The 6 cases below PASSED:
tests/models/rt_detr_v2/test_modeling_rt_detr_v2.py::RTDetrV2ModelTest::test_inference_equivalence_for_static_and_dynamic_anchors_0_float32
tests/models/rt_detr_v2/test_modeling_rt_detr_v2.py::RTDetrV2ModelTest::test_inference_equivalence_for_static_and_dynamic_anchors_1_float16
tests/models/rt_detr_v2/test_modeling_rt_detr_v2.py::RTDetrV2ModelTest::test_inference_equivalence_for_static_and_dynamic_anchors_2_bfloat16
tests/models/rt_detr_v2/test_modeling_rt_detr_v2.py::RTDetrV2ModelTest::test_inference_with_different_dtypes_0_float32
tests/models/rt_detr_v2/test_modeling_rt_detr_v2.py::RTDetrV2ModelTest::test_inference_with_different_dtypes_1_float16
tests/models/rt_detr_v2/test_modeling_rt_detr_v2.py::RTDetrV2ModelTest::test_inference_with_different_dtypes_2_bfloat16 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37548/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37548/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37547 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37547/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37547/comments | https://api.github.com/repos/huggingface/transformers/issues/37547/events | https://github.com/huggingface/transformers/pull/37547 | 2,998,632,801 | PR_kwDOCUB6oc6SyV4e | 37,547 | add fromjson to jinja environments | {
"login": "yhg0112",
"id": 5001738,
"node_id": "MDQ6VXNlcjUwMDE3Mzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5001738?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yhg0112",
"html_url": "https://github.com/yhg0112",
"followers_url": "https://api.github.com/users/yhg0112/followers",
"following_url": "https://api.github.com/users/yhg0112/following{/other_user}",
"gists_url": "https://api.github.com/users/yhg0112/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yhg0112/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yhg0112/subscriptions",
"organizations_url": "https://api.github.com/users/yhg0112/orgs",
"repos_url": "https://api.github.com/users/yhg0112/repos",
"events_url": "https://api.github.com/users/yhg0112/events{/privacy}",
"received_events_url": "https://api.github.com/users/yhg0112/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T06:46:27 | 2025-04-22T13:24:59 | 2025-04-22T13:24:58 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37547",
"html_url": "https://github.com/huggingface/transformers/pull/37547",
"diff_url": "https://github.com/huggingface/transformers/pull/37547.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37547.patch",
"merged_at": null
} | # What does this PR do?
Adds a `fromjson` filter to the Jinja environments in `chat_template_util`.
This is particularly useful for handling tool use data that is stored as JSON strings.
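A minimal stdlib-only illustration of what such a filter does — it is essentially `json.loads` exposed to the template engine, so tool-call arguments stored as JSON strings can be accessed as structured data (the sample data below is made up):

```python
import json

def fromjson(value):
    # The filter body: parse a JSON string into Python objects.
    return json.loads(value)

# A tool call whose arguments arrive as a JSON string, as in chat histories:
tool_args = '{"location": "Paris", "unit": "celsius"}'
parsed = fromjson(tool_args)
print(parsed["location"])  # Paris
```

In a Jinja environment this function would be registered under `env.filters["fromjson"]`, after which templates can write `{{ tool.arguments | fromjson }}`.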
## Who can review?
@Rocketknight1
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37547/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37547/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37546 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37546/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37546/comments | https://api.github.com/repos/huggingface/transformers/issues/37546/events | https://github.com/huggingface/transformers/pull/37546 | 2,998,631,367 | PR_kwDOCUB6oc6SyVjj | 37,546 | enable 3 mpt test cases on XPU | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-16T06:46:01 | 2025-04-16T22:39:58 | 2025-04-16T09:23:06 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37546",
"html_url": "https://github.com/huggingface/transformers/pull/37546",
"diff_url": "https://github.com/huggingface/transformers/pull/37546.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37546.patch",
"merged_at": "2025-04-16T09:23:06"
} | PASSED tests/models/mpt/test_modeling_mpt.py::MptIntegrationTests::test_generation
PASSED tests/models/mpt/test_modeling_mpt.py::MptIntegrationTests::test_generation_8k
FAILED tests/models/mpt/test_modeling_mpt.py::MptIntegrationTests::test_generation_batched
The FAILED case is due to our bitsandbytes (bnb) integration issue; we will submit a separate PR once it is fixed, if needed.
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37546/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37546/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37545 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37545/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37545/comments | https://api.github.com/repos/huggingface/transformers/issues/37545/events | https://github.com/huggingface/transformers/issues/37545 | 2,998,063,825 | I_kwDOCUB6oc6ystLR | 37,545 | Expected all tensors to be on the same device, but found at least two devices | {
"login": "ootsuka-repos",
"id": 94523049,
"node_id": "U_kgDOBaJOqQ",
"avatar_url": "https://avatars.githubusercontent.com/u/94523049?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ootsuka-repos",
"html_url": "https://github.com/ootsuka-repos",
"followers_url": "https://api.github.com/users/ootsuka-repos/followers",
"following_url": "https://api.github.com/users/ootsuka-repos/following{/other_user}",
"gists_url": "https://api.github.com/users/ootsuka-repos/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ootsuka-repos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ootsuka-repos/subscriptions",
"organizations_url": "https://api.github.com/users/ootsuka-repos/orgs",
"repos_url": "https://api.github.com/users/ootsuka-repos/repos",
"events_url": "https://api.github.com/users/ootsuka-repos/events{/privacy}",
"received_events_url": "https://api.github.com/users/ootsuka-repos/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-16T01:27:23 | 2025-07-21T08:04:28 | 2025-07-21T08:04:28 | NONE | null | null | null | null | ### System Info
TRL's SFTrainer does not work.
Hardware: RTX 8000 ×2 + RTX 4090D ×2 (4 GPUs total)
pip install transformers==4.51.3 — fails (NG)
pip install transformers==4.49.0 — works (OK)
torch 2.5.1 + CUDA 12.4
Error text:
Expected all tensors to be on the same device, but found at least two devices, cuda:3 and cuda:0!
File "C:\Users\admin\Desktop\llm_sft\train\SFT.py", line 185, in main
trainer.train(
File "C:\Users\admin\Desktop\llm_sft\train\SFT.py", line 190, in <module>
main()
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:3 and cuda:0! | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37545/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37545/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37544 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37544/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37544/comments | https://api.github.com/repos/huggingface/transformers/issues/37544/events | https://github.com/huggingface/transformers/pull/37544 | 2,997,735,019 | PR_kwDOCUB6oc6SvTrH | 37,544 | Fix `pad` image transform for batched inputs | {
"login": "sebasv",
"id": 10614357,
"node_id": "MDQ6VXNlcjEwNjE0MzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/10614357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sebasv",
"html_url": "https://github.com/sebasv",
"followers_url": "https://api.github.com/users/sebasv/followers",
"following_url": "https://api.github.com/users/sebasv/following{/other_user}",
"gists_url": "https://api.github.com/users/sebasv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sebasv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sebasv/subscriptions",
"organizations_url": "https://api.github.com/users/sebasv/orgs",
"repos_url": "https://api.github.com/users/sebasv/repos",
"events_url": "https://api.github.com/users/sebasv/events{/privacy}",
"received_events_url": "https://api.github.com/users/sebasv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
},
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
},
{
"id": 7570656740,
"node_id": "LA_kwDOCUB6oc8AAAABwz8N5A",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Processing",
"name": "Processing",
"color": "1E17DF",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-04-15T22:00:08 | 2025-05-08T09:51:42 | 2025-05-08T09:51:15 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37544",
"html_url": "https://github.com/huggingface/transformers/pull/37544",
"diff_url": "https://github.com/huggingface/transformers/pull/37544.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37544.patch",
"merged_at": "2025-05-08T09:51:15"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #37541
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@amyeroberts, @qubvel
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37544/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37544/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37543 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37543/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37543/comments | https://api.github.com/repos/huggingface/transformers/issues/37543/events | https://github.com/huggingface/transformers/pull/37543 | 2,997,588,626 | PR_kwDOCUB6oc6SuzG4 | 37,543 | TP support for Quark quantized model | {
"login": "amd-xiaoyu12",
"id": 188109516,
"node_id": "U_kgDOCzZSzA",
"avatar_url": "https://avatars.githubusercontent.com/u/188109516?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amd-xiaoyu12",
"html_url": "https://github.com/amd-xiaoyu12",
"followers_url": "https://api.github.com/users/amd-xiaoyu12/followers",
"following_url": "https://api.github.com/users/amd-xiaoyu12/following{/other_user}",
"gists_url": "https://api.github.com/users/amd-xiaoyu12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/amd-xiaoyu12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amd-xiaoyu12/subscriptions",
"organizations_url": "https://api.github.com/users/amd-xiaoyu12/orgs",
"repos_url": "https://api.github.com/users/amd-xiaoyu12/repos",
"events_url": "https://api.github.com/users/amd-xiaoyu12/events{/privacy}",
"received_events_url": "https://api.github.com/users/amd-xiaoyu12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T20:50:40 | 2025-04-24T15:40:44 | 2025-04-24T15:40:44 | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37543",
"html_url": "https://github.com/huggingface/transformers/pull/37543",
"diff_url": "https://github.com/huggingface/transformers/pull/37543.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37543.patch",
"merged_at": null
} | # What does this PR do?
After integrating the AMD Quark quantizer, we aim to leverage the HF tensor parallelism feature to accelerate the inference of the Quark-quantized model. I have identified several areas where modifications are necessary to support model TP. The changes made serve as a demonstration of the required adjustments. Please review and provide feedback.
For the shard_and_distribute_module() function in tensor_parallel.py, please consider adding quantizer awareness. There are several unique Quark parameters that need to be addressed. Refer to the updated version for details. I suggest adding a quantizer parameter to this function and allowing a quantizer-specific function to be used in column-wise and row-wise partition functions.
The modification to src/transformers/modeling_utils.py is a temporary change to make the test code work.
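The column-wise vs. row-wise partitioning discussed above can be sketched as follows. This is an illustrative NumPy-only sketch, not the actual `tensor_parallel.py` implementation; the point is that quantizer-specific tensors (e.g. scales) would need the same axis-aware split as the weight they belong to:

```python
import numpy as np

world_size = 2
# A linear layer's weight, laid out as (out_features, in_features).
weight = np.arange(24, dtype=np.float32).reshape(4, 6)

# Column-wise partition: each rank holds a slice of the output features.
col_shards = np.split(weight, world_size, axis=0)

# Row-wise partition: each rank holds a slice of the input features.
row_shards = np.split(weight, world_size, axis=1)

print([s.shape for s in col_shards])  # [(2, 6), (2, 6)]
print([s.shape for s in row_shards])  # [(4, 3), (4, 3)]
```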
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "amd-xiaoyu12",
"id": 188109516,
"node_id": "U_kgDOCzZSzA",
"avatar_url": "https://avatars.githubusercontent.com/u/188109516?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amd-xiaoyu12",
"html_url": "https://github.com/amd-xiaoyu12",
"followers_url": "https://api.github.com/users/amd-xiaoyu12/followers",
"following_url": "https://api.github.com/users/amd-xiaoyu12/following{/other_user}",
"gists_url": "https://api.github.com/users/amd-xiaoyu12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/amd-xiaoyu12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amd-xiaoyu12/subscriptions",
"organizations_url": "https://api.github.com/users/amd-xiaoyu12/orgs",
"repos_url": "https://api.github.com/users/amd-xiaoyu12/repos",
"events_url": "https://api.github.com/users/amd-xiaoyu12/events{/privacy}",
"received_events_url": "https://api.github.com/users/amd-xiaoyu12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37543/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37543/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37541 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37541/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37541/comments | https://api.github.com/repos/huggingface/transformers/issues/37541/events | https://github.com/huggingface/transformers/issues/37541 | 2,997,402,548 | I_kwDOCUB6oc6yqLu0 | 37,541 | `image_transforms:pad` throws `ValueError` if the input contains a batch dimension | {
"login": "sebasv",
"id": 10614357,
"node_id": "MDQ6VXNlcjEwNjE0MzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/10614357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sebasv",
"html_url": "https://github.com/sebasv",
"followers_url": "https://api.github.com/users/sebasv/followers",
"following_url": "https://api.github.com/users/sebasv/following{/other_user}",
"gists_url": "https://api.github.com/users/sebasv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sebasv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sebasv/subscriptions",
"organizations_url": "https://api.github.com/users/sebasv/orgs",
"repos_url": "https://api.github.com/users/sebasv/repos",
"events_url": "https://api.github.com/users/sebasv/events{/privacy}",
"received_events_url": "https://api.github.com/users/sebasv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-15T19:31:46 | 2025-05-08T09:51:16 | 2025-05-08T09:51:16 | CONTRIBUTOR | null | null | null | null | ### System Info
```
- `transformers` version: 4.31.0
- Platform: macOS-15.3.2-arm64-arm-64bit
- Python version: 3.11.8
- Huggingface_hub version: 0.28.1
- Safetensors version: 0.5.2
- Accelerate version: 1.6.0
- Accelerate config: not found
- PyTorch version (GPU?): 2.6.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: NO
- Using distributed or parallel set-up in script?: NO
```
### Who can help?
@amyeroberts @qubvel
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
I ran into this issue using the `Mask2FormerImageProcessor`, but a minimal reproducible example would be
```python
from transformers.image_transforms import pad
import numpy as np
boring_image = [[[0]]]
batched_boring_image = [boring_image]
pad(image=np.array(batched_boring_image), padding=((0,0),(0,0)))
# output
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (4,) + inhomogeneous part.
```
The problem is that the `pad` function expands the padding argument to `(0, (0,0), (0,0), (0,0), (0,0))`, i.e., the entry for the batch dimension is a scalar rather than a `(before, after)` tuple.
What I don't understand is how I'm the first to hit this issue. This code has been stable for years, still if I do something as straightforward as
```python
image = np.array([[[1, 2, 3]]])
mask = np.array([[[1, 1, 0]]])
print(mask.shape)
instance_id_to_semantic_id = {1: 1}
Mask2FormerImageProcessor.from_pretrained(
"facebook/mask2former-swin-small-coco-instance", do_normalize=False, do_reduce_labels=False, ignore_index=0
)(
images=[image],
segmentation_maps=[mask],
instance_id_to_semantic_id=instance_id_to_semantic_id,
return_tensors="pt",
)
```
I already get this error. Am I doing something wrong here?
### Expected behavior
no exception thrown | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37541/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37541/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37540 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37540/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37540/comments | https://api.github.com/repos/huggingface/transformers/issues/37540/events | https://github.com/huggingface/transformers/pull/37540 | 2,997,132,276 | PR_kwDOCUB6oc6StPSD | 37,540 | Improve `auxiliary_in_channels` default behavior in UperNet | {
"login": "simonreise",
"id": 43753582,
"node_id": "MDQ6VXNlcjQzNzUzNTgy",
"avatar_url": "https://avatars.githubusercontent.com/u/43753582?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/simonreise",
"html_url": "https://github.com/simonreise",
"followers_url": "https://api.github.com/users/simonreise/followers",
"following_url": "https://api.github.com/users/simonreise/following{/other_user}",
"gists_url": "https://api.github.com/users/simonreise/gists{/gist_id}",
"starred_url": "https://api.github.com/users/simonreise/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/simonreise/subscriptions",
"organizations_url": "https://api.github.com/users/simonreise/orgs",
"repos_url": "https://api.github.com/users/simonreise/repos",
"events_url": "https://api.github.com/users/simonreise/events{/privacy}",
"received_events_url": "https://api.github.com/users/simonreise/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-04-15T17:28:39 | 2025-06-17T12:57:28 | 2025-06-17T12:56:46 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37540",
"html_url": "https://github.com/huggingface/transformers/pull/37540",
"diff_url": "https://github.com/huggingface/transformers/pull/37540.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37540.patch",
"merged_at": "2025-06-17T12:56:46"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Implements changes proposed in #37345
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@qubvel
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37540/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37540/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37539 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37539/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37539/comments | https://api.github.com/repos/huggingface/transformers/issues/37539/events | https://github.com/huggingface/transformers/pull/37539 | 2,997,040,054 | PR_kwDOCUB6oc6Ss7iD | 37,539 | Mllama fast image processor | {
"login": "rootonchair",
"id": 23548268,
"node_id": "MDQ6VXNlcjIzNTQ4MjY4",
"avatar_url": "https://avatars.githubusercontent.com/u/23548268?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rootonchair",
"html_url": "https://github.com/rootonchair",
"followers_url": "https://api.github.com/users/rootonchair/followers",
"following_url": "https://api.github.com/users/rootonchair/following{/other_user}",
"gists_url": "https://api.github.com/users/rootonchair/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rootonchair/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rootonchair/subscriptions",
"organizations_url": "https://api.github.com/users/rootonchair/orgs",
"repos_url": "https://api.github.com/users/rootonchair/repos",
"events_url": "https://api.github.com/users/rootonchair/events{/privacy}",
"received_events_url": "https://api.github.com/users/rootonchair/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-15T16:48:58 | 2025-06-27T05:56:35 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37539",
"html_url": "https://github.com/huggingface/transformers/pull/37539",
"diff_url": "https://github.com/huggingface/transformers/pull/37539.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37539.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Related #36978
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37539/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37539/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37538 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37538/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37538/comments | https://api.github.com/repos/huggingface/transformers/issues/37538/events | https://github.com/huggingface/transformers/pull/37538 | 2,997,028,696 | PR_kwDOCUB6oc6Ss5FS | 37,538 | Keep Quark loading through meta device | {
"login": "BowenBao",
"id": 9376104,
"node_id": "MDQ6VXNlcjkzNzYxMDQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/9376104?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BowenBao",
"html_url": "https://github.com/BowenBao",
"followers_url": "https://api.github.com/users/BowenBao/followers",
"following_url": "https://api.github.com/users/BowenBao/following{/other_user}",
"gists_url": "https://api.github.com/users/BowenBao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BowenBao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BowenBao/subscriptions",
"organizations_url": "https://api.github.com/users/BowenBao/orgs",
"repos_url": "https://api.github.com/users/BowenBao/repos",
"events_url": "https://api.github.com/users/BowenBao/events{/privacy}",
"received_events_url": "https://api.github.com/users/BowenBao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T16:43:31 | 2025-04-16T12:36:42 | 2025-04-16T12:19:56 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37538",
"html_url": "https://github.com/huggingface/transformers/pull/37538",
"diff_url": "https://github.com/huggingface/transformers/pull/37538.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37538.patch",
"merged_at": "2025-04-16T12:19:56"
} | # What does this PR do?
Follow-up to #37407 to revert the part of the fix that is unnecessary for Quark. [Discussions](https://github.com/huggingface/transformers/pull/37407#discussion_r2044530489)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@SunMarc @MekkCyber @Cyrilvallez
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37538/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37538/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37537 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37537/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37537/comments | https://api.github.com/repos/huggingface/transformers/issues/37537/events | https://github.com/huggingface/transformers/pull/37537 | 2,996,952,758 | PR_kwDOCUB6oc6Ssoe8 | 37,537 | Fast tokenizer encoding doesn't handle empty string input | {
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github.com/users/itazap/followers",
"following_url": "https://api.github.com/users/itazap/following{/other_user}",
"gists_url": "https://api.github.com/users/itazap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itazap/subscriptions",
"organizations_url": "https://api.github.com/users/itazap/orgs",
"repos_url": "https://api.github.com/users/itazap/repos",
"events_url": "https://api.github.com/users/itazap/events{/privacy}",
"received_events_url": "https://api.github.com/users/itazap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-15T16:11:44 | 2025-06-02T09:20:51 | null | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37537",
"html_url": "https://github.com/huggingface/transformers/pull/37537",
"diff_url": "https://github.com/huggingface/transformers/pull/37537.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37537.patch",
"merged_at": null
} | ```python
batched_input = [(text, text_pair)] if text_pair else [text]
```
to
```python
batched_input = [(text, text_pair)] if text_pair is not None else [text]
```
I believe this if statement is meant to check whether `text_pair` is set. However, in the case of an **empty string**, the condition evaluates to `False`.
This can change the result when a `[SEP]` token is expected for a pair of inputs: the empty-string case is not treated as a "pair" as [intended](https://huggingface.co/docs/tokenizers/en/api/post-processors).
edit: the ByteLevel pre_tokenizer doesn't add a prefix space to empty input, unlike the python version
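A minimal standalone sketch of the truthiness difference (hypothetical helper names, not the actual tokenizer code):

```python
def select_batched_input_buggy(text, text_pair):
    # original check: an empty-string pair is falsy and gets silently dropped
    return [(text, text_pair)] if text_pair else [text]

def select_batched_input_fixed(text, text_pair):
    # fixed check: only None means "no pair"; "" is still a valid pair
    return [(text, text_pair)] if text_pair is not None else [text]

print(select_batched_input_buggy("hello", ""))  # ['hello'] -- pair lost
print(select_batched_input_fixed("hello", ""))  # [('hello', '')] -- pair kept
print(select_batched_input_fixed("hello", None))  # ['hello'] -- no pair, as before
```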
I added a test for comparing empty str output between python and rust versions, and had to overwrite for the tokenizers requiring `boxes` / `xpaths`. However, for future versions with fast we can just test that fast handles 2 empty strings correctly, outputting `[CLS] [SEP] [SEP]`. can apply this instead if preferred | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37537/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37537/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37535 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37535/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37535/comments | https://api.github.com/repos/huggingface/transformers/issues/37535/events | https://github.com/huggingface/transformers/pull/37535 | 2,996,826,020 | PR_kwDOCUB6oc6SsNF6 | 37,535 | Qwen2.5-VL fix redundant cu_window_seqlens | {
"login": "John-Ge",
"id": 41175708,
"node_id": "MDQ6VXNlcjQxMTc1NzA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41175708?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/John-Ge",
"html_url": "https://github.com/John-Ge",
"followers_url": "https://api.github.com/users/John-Ge/followers",
"following_url": "https://api.github.com/users/John-Ge/following{/other_user}",
"gists_url": "https://api.github.com/users/John-Ge/gists{/gist_id}",
"starred_url": "https://api.github.com/users/John-Ge/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/John-Ge/subscriptions",
"organizations_url": "https://api.github.com/users/John-Ge/orgs",
"repos_url": "https://api.github.com/users/John-Ge/repos",
"events_url": "https://api.github.com/users/John-Ge/events{/privacy}",
"received_events_url": "https://api.github.com/users/John-Ge/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-15T15:25:47 | 2025-04-16T19:05:50 | null | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37535",
"html_url": "https://github.com/huggingface/transformers/pull/37535",
"diff_url": "https://github.com/huggingface/transformers/pull/37535.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37535.patch",
"merged_at": null
} | ### What does this PR do?
This PR removes redundant cu_window_seqlens in Qwen2.5-VL
The original `cu_window_seqlens` computation is here: [link to code](https://github.com/huggingface/transformers/blob/356b3cd71d7bfb51c88fea3e8a0c054f3a457ab9/src/transformers/models/qwen2_5_vl/modular_qwen2_5_vl.py#L304)
```python
def get_window_index(self, grid_thw):
window_index: list = []
cu_window_seqlens: list = [0]
window_index_id = 0
vit_merger_window_size = self.window_size // self.spatial_merge_size // self.patch_size
for grid_t, grid_h, grid_w in grid_thw:
llm_grid_h, llm_grid_w = (
grid_h // self.spatial_merge_size,
grid_w // self.spatial_merge_size,
)
index = torch.arange(grid_t * llm_grid_h * llm_grid_w).reshape(grid_t, llm_grid_h, llm_grid_w)
pad_h = vit_merger_window_size - llm_grid_h % vit_merger_window_size
pad_w = vit_merger_window_size - llm_grid_w % vit_merger_window_size
num_windows_h = (llm_grid_h + pad_h) // vit_merger_window_size
num_windows_w = (llm_grid_w + pad_w) // vit_merger_window_size
index_padded = F.pad(index, (0, pad_w, 0, pad_h), "constant", -100)
index_padded = index_padded.reshape(
grid_t,
num_windows_h,
vit_merger_window_size,
num_windows_w,
vit_merger_window_size,
)
index_padded = index_padded.permute(0, 1, 3, 2, 4).reshape(
grid_t,
num_windows_h * num_windows_w,
vit_merger_window_size,
vit_merger_window_size,
)
seqlens = (index_padded != -100).sum([2, 3]).reshape(-1)
index_padded = index_padded.reshape(-1)
index_new = index_padded[index_padded != -100]
window_index.append(index_new + window_index_id)
cu_seqlens_tmp = seqlens.cumsum(0) * self.spatial_merge_unit + cu_window_seqlens[-1]
cu_window_seqlens.extend(cu_seqlens_tmp.tolist())
window_index_id += (grid_t * llm_grid_h * llm_grid_w).item()
window_index = torch.cat(window_index, dim=0)
return window_index, cu_window_seqlens
```
If `grid_thw` is `torch.tensor([1, 8, 8])` and `vit_merger_window_size=4`, the code above returns:
`cu_window_seqlens = [0, 16, 32, 32, 48, 64, 64, 64, 64, 64]`
which has redundant entries, since there are many repeated 32s and 64s in the list.
This is because `pad_h = vit_merger_window_size - llm_grid_h % vit_merger_window_size = 4`, which should have been 0, incurring extra window padding.
These extra numbers are later removed by `cu_window_seqlens = torch.unique_consecutive(cu_window_seqlens)`, so the redundancy does not cause an error.
We can avoid generating these numbers in the first place and skip the extra operations. My PR is very simple: it just calculates the correct window padding and removes the deduplication.
```python
# for correct padding size
pad_h = vit_merger_window_size - llm_grid_h % vit_merger_window_size if llm_grid_h % vit_merger_window_size != 0 else 0
pad_w = vit_merger_window_size - llm_grid_w % vit_merger_window_size if llm_grid_w % vit_merger_window_size != 0 else 0
# remove the deduplication
# cu_window_seqlens = torch.unique_consecutive(cu_window_seqlens)
```
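For illustration, here is a standalone sketch of the padding computation (hypothetical helper names; only the arithmetic is taken from the code above):

```python
def pad_buggy(llm_grid, window):
    # original: always pads, even when llm_grid is already divisible by window
    return window - llm_grid % window

def pad_fixed(llm_grid, window):
    # proposed: pad only when the grid is not already divisible
    return window - llm_grid % window if llm_grid % window != 0 else 0

# divisible case: original pads a full extra window, fix pads nothing
print(pad_buggy(4, 4), pad_fixed(4, 4))  # 4 0
# non-divisible case: both agree
print(pad_buggy(3, 4), pad_fixed(3, 4))  # 1 1
```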
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37535/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37535/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37534 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37534/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37534/comments | https://api.github.com/repos/huggingface/transformers/issues/37534/events | https://github.com/huggingface/transformers/pull/37534 | 2,996,767,206 | PR_kwDOCUB6oc6SsAN3 | 37,534 | Docs: fix docstrings for Gemma3 modeling | {
"login": "kiddj",
"id": 8369639,
"node_id": "MDQ6VXNlcjgzNjk2Mzk=",
"avatar_url": "https://avatars.githubusercontent.com/u/8369639?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kiddj",
"html_url": "https://github.com/kiddj",
"followers_url": "https://api.github.com/users/kiddj/followers",
"following_url": "https://api.github.com/users/kiddj/following{/other_user}",
"gists_url": "https://api.github.com/users/kiddj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kiddj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kiddj/subscriptions",
"organizations_url": "https://api.github.com/users/kiddj/orgs",
"repos_url": "https://api.github.com/users/kiddj/repos",
"events_url": "https://api.github.com/users/kiddj/events{/privacy}",
"received_events_url": "https://api.github.com/users/kiddj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-15T15:05:49 | 2025-05-09T08:06:21 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37534",
"html_url": "https://github.com/huggingface/transformers/pull/37534",
"diff_url": "https://github.com/huggingface/transformers/pull/37534.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37534.patch",
"merged_at": null
} | # What does this PR do?
Fixes the example in the gemma3 docstring.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests? | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37534/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37534/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37533 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37533/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37533/comments | https://api.github.com/repos/huggingface/transformers/issues/37533/events | https://github.com/huggingface/transformers/pull/37533 | 2,996,737,760 | PR_kwDOCUB6oc6Sr5u7 | 37,533 | Fix Mamba2 Grouped SSD Support in the torch_forward Path | {
"login": "cyang49",
"id": 7364402,
"node_id": "MDQ6VXNlcjczNjQ0MDI=",
"avatar_url": "https://avatars.githubusercontent.com/u/7364402?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyang49",
"html_url": "https://github.com/cyang49",
"followers_url": "https://api.github.com/users/cyang49/followers",
"following_url": "https://api.github.com/users/cyang49/following{/other_user}",
"gists_url": "https://api.github.com/users/cyang49/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyang49/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyang49/subscriptions",
"organizations_url": "https://api.github.com/users/cyang49/orgs",
"repos_url": "https://api.github.com/users/cyang49/repos",
"events_url": "https://api.github.com/users/cyang49/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyang49/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T14:56:03 | 2025-04-16T20:22:42 | 2025-04-16T20:16:01 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37533",
"html_url": "https://github.com/huggingface/transformers/pull/37533",
"diff_url": "https://github.com/huggingface/transformers/pull/37533.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37533.patch",
"merged_at": "2025-04-16T20:16:01"
} | # What does this PR do?
We found a bug in the Bamba `torch_forward` implementation where the Mamba2 grouped SSD heads are incorrectly expanded for computation.
The original code uses `torch.repeat`, but this produces a tiled pattern, e.g. when ngroups=4 and num_heads=16, `torch.repeat` gives
```
[W, X, Y, Z] --> [W, X, Y, Z, W, X, Y, Z, W, X, Y, Z, W, X, Y, Z]
```
instead of the desired
```
[W, X, Y, Z] -->[W, W, W, W, X, X, X, X, Y, Y, Y, Y, Z, Z, Z, Z]
```
This causes models using `ngroups > 1` and `ngroups != num_heads` to fail evaluations. We fix it by replacing `torch.repeat` with `torch.repeat_interleave`.
The bug was left undetected for a while, perhaps because the `cuda_forward` path is used by most people, or because Bamba-9B uses ngroups=1.
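A pure-Python sketch of the two expansion patterns above, with plain lists standing in for the tensor operations:

```python
# ngroups = 4 group states W..Z expanded to num_heads = 16
groups = ["W", "X", "Y", "Z"]
repeats = 16 // len(groups)  # 4 heads per group

# torch.repeat-style tiling: the whole sequence is repeated
tiled = groups * repeats

# torch.repeat_interleave-style: each element is repeated in place
interleaved = [g for g in groups for _ in range(repeats)]

print(tiled)        # W X Y Z W X Y Z W X Y Z W X Y Z
print(interleaved)  # W W W W X X X X Y Y Y Y Z Z Z Z
```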
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@fabianlim @ani300 @ArthurZucker
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37533/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37533/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37532 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37532/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37532/comments | https://api.github.com/repos/huggingface/transformers/issues/37532/events | https://github.com/huggingface/transformers/issues/37532 | 2,996,549,914 | I_kwDOCUB6oc6ym7ka | 37,532 | CUDA OOM when running meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8 | {
"login": "ido-deci",
"id": 95032221,
"node_id": "U_kgDOBaoTnQ",
"avatar_url": "https://avatars.githubusercontent.com/u/95032221?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ido-deci",
"html_url": "https://github.com/ido-deci",
"followers_url": "https://api.github.com/users/ido-deci/followers",
"following_url": "https://api.github.com/users/ido-deci/following{/other_user}",
"gists_url": "https://api.github.com/users/ido-deci/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ido-deci/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ido-deci/subscriptions",
"organizations_url": "https://api.github.com/users/ido-deci/orgs",
"repos_url": "https://api.github.com/users/ido-deci/repos",
"events_url": "https://api.github.com/users/ido-deci/events{/privacy}",
"received_events_url": "https://api.github.com/users/ido-deci/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-15T14:04:21 | 2025-09-14T08:04:25 | 2025-09-14T08:04:25 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.0.dev0
- Platform: Linux-5.15.0-1030-nvidia-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: Yes, using `accelerate launch`
- Using GPU in script?: Yes
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
@ArthurZucker
### Information
- [x] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Run `accelerate launch try_llama4.py`, where `try_llama4.py` is:
```python
from transformers import AutoTokenizer, Llama4ForConditionalGeneration
model_id = "meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8"
tokenizer = AutoTokenizer.from_pretrained(model_id)
messages = [
{"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt", return_dict=True)
model = Llama4ForConditionalGeneration.from_pretrained(
model_id,
tp_plan="auto",
torch_dtype="auto",
)
model.eval()
print("LOADED")
outputs = model.generate(**inputs.to(model.device), max_new_tokens=1)
outputs = tokenizer.batch_decode(outputs[:, inputs["input_ids"].shape[-1]:])
print(outputs[0])
```
### Expected behavior
Expected: Print of the model's response.
What actually happens when I try to run on an 8x NVIDIA H100 80GB HBM3 node:
The model loads without problems at around 50GB per GPU, which leaves plenty of room for a short single generation.
However, I hit a CUDA OOM during generation.
This seems to be related to `CompressedLinear`, which permanently converts the FP8 weights to BF16 and so, of course, causes the OOM.
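For a rough sense of scale, a back-of-envelope estimate (assuming ~400B total parameters for the Maverick checkpoint, which is my guess, not an official number) shows why the BF16 upcast alone blows the budget:
```python
# Back-of-envelope memory estimate. ASSUMPTION: ~400e9 total parameters for
# Maverick 17B-128E; the exact count may differ slightly.
TOTAL_PARAMS = 400e9
NUM_GPUS = 8

fp8_gb_per_gpu = TOTAL_PARAMS * 1 / NUM_GPUS / 1e9   # 1 byte per FP8 weight
bf16_gb_per_gpu = TOTAL_PARAMS * 2 / NUM_GPUS / 1e9  # 2 bytes per BF16 weight

print(f"FP8 weights:  ~{fp8_gb_per_gpu:.0f} GB/GPU")   # ~50 GB, matches what I see at load time
print(f"BF16 weights: ~{bf16_gb_per_gpu:.0f} GB/GPU")  # ~100 GB, over the 80 GB of an H100
```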
BTW, I also tried installing the `kernels` package, but then I get a different error:
`AttributeError: 'SequentialLlama4TextExperts' object has no attribute 'gate_up_proj'`
This seems to be because the FP8 version of Maverick stores the expert weights separately for each expert, and the `Llama4TextMoe` kernel from the hub does not support that layout.
Thanks! | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37532/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37532/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37531 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37531/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37531/comments | https://api.github.com/repos/huggingface/transformers/issues/37531/events | https://github.com/huggingface/transformers/pull/37531 | 2,996,500,120 | PR_kwDOCUB6oc6SrEuE | 37,531 | Revert change that breaks on Torch 2.1 | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T13:49:30 | 2025-04-29T12:27:11 | 2025-04-29T12:27:10 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37531",
"html_url": "https://github.com/huggingface/transformers/pull/37531",
"diff_url": "https://github.com/huggingface/transformers/pull/37531.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37531.patch",
"merged_at": "2025-04-29T12:27:10"
} | One of the changes in #37372 is broken on Torch 2.1, which we're still supporting! Reverting that until we bump our minimum Torch version and leaving a `TODO` | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37531/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37531/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37530 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37530/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37530/comments | https://api.github.com/repos/huggingface/transformers/issues/37530/events | https://github.com/huggingface/transformers/pull/37530 | 2,996,403,004 | PR_kwDOCUB6oc6Sqvrh | 37,530 | Fixes hqq by following a new path for bias parameter in pre_quantized models | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T13:16:22 | 2025-04-22T15:06:38 | 2025-04-16T11:58:15 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37530",
"html_url": "https://github.com/huggingface/transformers/pull/37530",
"diff_url": "https://github.com/huggingface/transformers/pull/37530.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37530.patch",
"merged_at": "2025-04-16T11:58:14"
} | # What does this PR do?
Since HQQ overrides the `load_state_dict` method for `HQQLinear`, it directly loads both the weight and bias parameters. This differs from our approach, where we iterate through the parameters one by one and load the bias separately from the weights.
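In other words, for pre-quantized models the standalone bias entry just needs to be skipped when it reaches an `HQQLinear` module. A minimal sketch of that condition (illustrative names only, not the actual transformers code):

```python
# Illustrative sketch: should a checkpoint entry be skipped because HQQ's own
# load path already consumed it together with the packed weights?
def should_skip_param(param_name: str, module, is_pre_quantized: bool) -> bool:
    is_hqq_linear = type(module).__name__ == "HQQLinear"  # duck-typed check
    return is_pre_quantized and is_hqq_linear and param_name.endswith(".bias")
```

With a check like this in place, iterating over the remaining parameters one by one no longer trips over a bias that was already loaded.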
This PR updates the behavior to simply ignore the bias parameter, assuming it was already loaded alongside the weights in the case of pre-quantized models. | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37530/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37530/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37529 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37529/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37529/comments | https://api.github.com/repos/huggingface/transformers/issues/37529/events | https://github.com/huggingface/transformers/pull/37529 | 2,996,346,730 | PR_kwDOCUB6oc6SqjLf | 37,529 | make Llama4TextMoe forward more readable | {
"login": "JJJYmmm",
"id": 92386084,
"node_id": "U_kgDOBYGzJA",
"avatar_url": "https://avatars.githubusercontent.com/u/92386084?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JJJYmmm",
"html_url": "https://github.com/JJJYmmm",
"followers_url": "https://api.github.com/users/JJJYmmm/followers",
"following_url": "https://api.github.com/users/JJJYmmm/following{/other_user}",
"gists_url": "https://api.github.com/users/JJJYmmm/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JJJYmmm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JJJYmmm/subscriptions",
"organizations_url": "https://api.github.com/users/JJJYmmm/orgs",
"repos_url": "https://api.github.com/users/JJJYmmm/repos",
"events_url": "https://api.github.com/users/JJJYmmm/events{/privacy}",
"received_events_url": "https://api.github.com/users/JJJYmmm/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T12:58:50 | 2025-05-28T09:54:45 | 2025-05-28T09:54:45 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37529",
"html_url": "https://github.com/huggingface/transformers/pull/37529",
"diff_url": "https://github.com/huggingface/transformers/pull/37529.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37529.patch",
"merged_at": "2025-05-28T09:54:45"
} | # What does this PR do?
Make the forward function of `Llama4TextMoe` more readable. Since the implementation runs all experts on all tokens directly, there is no need for `torch.gather` or `torch.scatter`; adding a new expert dimension is enough.
Test script below:
```python
import torch
from torch import nn

from transformers import Llama4TextConfig
from transformers.models.llama4.modeling_llama4 import Llama4TextExperts, Llama4TextMLP


class Llama4TextMoe(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.top_k = config.num_experts_per_tok
        self.hidden_dim = config.hidden_size
        self.num_experts = config.num_local_experts
        self.experts = Llama4TextExperts(config)
        self.router = nn.Linear(config.hidden_size, config.num_local_experts, bias=False)
        self.shared_expert = Llama4TextMLP(config)

    def forward(self, hidden_states):
        batch, seq_len, hidden_dim = hidden_states.shape
        hidden_states = hidden_states.view(-1, self.hidden_dim)
        router_logits = self.router(hidden_states).transpose(0, 1)
        tokens_per_expert = batch * seq_len
        router_top_value, router_indices = torch.topk(router_logits.transpose(0, 1), self.top_k, dim=1)
        router_scores = (
            torch.full_like(router_logits.transpose(0, 1), float("-inf"))
            .scatter_(1, router_indices, router_top_value)
            .transpose(0, 1)
        )
        # We do this to make sure we have -inf for non topK tokens before going through the sigmoid!
        # Here we are just creating a tensor to index each and every single one of the hidden states. Let's maybe register a buffer for this!
        router_indices = (
            torch.arange(tokens_per_expert, device=hidden_states.device).view(1, -1).expand(router_scores.size(0), -1)
        )
        router_scores = torch.sigmoid(router_scores.float()).to(hidden_states.dtype)
        router_indices = router_indices.reshape(-1, 1).expand(-1, hidden_dim)
        routed_in = torch.gather(
            input=hidden_states,
            dim=0,
            index=router_indices,
        ).to(hidden_states.device)
        # we gather inputs corresponding to each expert based on the router indices
        routed_in = routed_in * router_scores.reshape(-1, 1)
        routed_out = self.experts(routed_in)
        out = self.shared_expert(hidden_states)
        # now that we finished expert computation -> we scatter add because we gathered previously
        # we have to do this because we used all experts on all tokens. This is faster than the for loop, tho you are compute bound
        # this scales a lot better if you do EP!
        out.scatter_add_(dim=0, index=router_indices, src=routed_out.view(-1, hidden_dim))
        return out, router_scores

    def forward_simple(self, hidden_states):
        hidden_states = hidden_states.view(-1, self.hidden_dim)
        router_logits = self.router(hidden_states).transpose(0, 1)
        router_top_value, router_indices = torch.topk(router_logits.transpose(0, 1), self.top_k, dim=1)
        router_scores = (
            torch.full_like(router_logits.transpose(0, 1), float("-inf"))
            .scatter_(1, router_indices, router_top_value)
            .transpose(0, 1)
        )
        router_scores = torch.sigmoid(router_scores.float()).to(hidden_states.dtype)
        routed_in = hidden_states.repeat(self.num_experts, 1)  # repeat num_experts times rather than torch.gather
        routed_in = routed_in * router_scores.reshape(-1, 1)
        routed_out = self.experts(routed_in)
        out = self.shared_expert(hidden_states)
        out.add_(routed_out.reshape(self.num_experts, -1, self.hidden_dim).sum(dim=0))  # sum over the expert dimension rather than torch.scatter_add_
        return out, router_scores


if __name__ == '__main__':
    config = Llama4TextConfig()
    moe_block = Llama4TextMoe(config).cuda()
    for _ in range(10):
        fake_input = torch.randn(2, 512, config.hidden_size).cuda()  # assume bs = 2, sqlen = 512
        out_ori, score_ori = moe_block(fake_input)
        out_modified, score_modified = moe_block.forward_simple(fake_input)
        assert torch.allclose(out_ori, out_modified, atol=1e-5)
        assert torch.allclose(score_ori, score_modified, atol=1e-5)
    print('pass!')
```
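For reviewers: the equivalence between the two paths hinges on the fact that `sigmoid(-inf)` is exactly zero, so the weights of the non-top-k experts vanish from the sum:

```python
import torch

# Non-selected experts receive -inf router logits; sigmoid(-inf) is exactly 0,
# so multiplying the replicated inputs by the scores zeroes out non-top-k
# experts, and summing over the expert dimension matches the scatter_add path.
s = torch.sigmoid(torch.tensor([float("-inf"), 0.0, 2.0]))
print(s)  # tensor([0.0000, 0.5000, 0.8808])
```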
## Who can review?
Models:
- text models: @ArthurZucker | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37529/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37529/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37528 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37528/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37528/comments | https://api.github.com/repos/huggingface/transformers/issues/37528/events | https://github.com/huggingface/transformers/pull/37528 | 2,996,278,854 | PR_kwDOCUB6oc6SqUQZ | 37,528 | Phi3 | {
"login": "xadupre",
"id": 22452781,
"node_id": "MDQ6VXNlcjIyNDUyNzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/22452781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xadupre",
"html_url": "https://github.com/xadupre",
"followers_url": "https://api.github.com/users/xadupre/followers",
"following_url": "https://api.github.com/users/xadupre/following{/other_user}",
"gists_url": "https://api.github.com/users/xadupre/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xadupre/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xadupre/subscriptions",
"organizations_url": "https://api.github.com/users/xadupre/orgs",
"repos_url": "https://api.github.com/users/xadupre/repos",
"events_url": "https://api.github.com/users/xadupre/events{/privacy}",
"received_events_url": "https://api.github.com/users/xadupre/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-15T12:34:49 | 2025-04-16T13:45:50 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37528",
"html_url": "https://github.com/huggingface/transformers/pull/37528",
"diff_url": "https://github.com/huggingface/transformers/pull/37528.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37528.patch",
"merged_at": null
} | # What does this PR do?
Function ``_prepare_4d_causal_attention_mask_with_cache_position`` only uses ``sliding_window`` from the text config. This PR replaces the ``config`` argument with ``sliding_window`` for clarity. The change is also necessary for the model to go through ``torch.export.export``.
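As a rough illustration of why the scalar is all that is needed (a simplified sketch, not the actual helper), a sliding-window causal mask can be built from the sequence length and ``sliding_window`` alone, and a plain ``Optional[int]`` argument is much friendlier to ``torch.export.export`` than a whole config object:

```python
from typing import Optional

import torch


def make_causal_mask(seq_len: int, sliding_window: Optional[int] = None) -> torch.Tensor:
    # Plain causal mask: position i may attend to positions j <= i.
    mask = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))
    if sliding_window is not None:
        # Keep only the last `sliding_window` positions (including the current one).
        mask &= ~torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=-sliding_window)
    return mask
```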
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37528/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37528/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37527 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37527/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37527/comments | https://api.github.com/repos/huggingface/transformers/issues/37527/events | https://github.com/huggingface/transformers/pull/37527 | 2,996,275,800 | PR_kwDOCUB6oc6SqTlH | 37,527 | Fix missing return type for MLCD docs | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T12:33:37 | 2025-04-15T13:04:17 | 2025-04-15T13:04:17 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37527",
"html_url": "https://github.com/huggingface/transformers/pull/37527",
"diff_url": "https://github.com/huggingface/transformers/pull/37527.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37527.patch",
"merged_at": "2025-04-15T13:04:17"
} | # What does this PR do?
Fixes CI
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37527/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37527/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37526 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37526/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37526/comments | https://api.github.com/repos/huggingface/transformers/issues/37526/events | https://github.com/huggingface/transformers/issues/37526 | 2,996,158,460 | I_kwDOCUB6oc6ylb_8 | 37,526 | When I use a pipeline to run inference on the validation dataset, I get different results from the evaluation results during training! | {
"login": "kingman1980",
"id": 24912500,
"node_id": "MDQ6VXNlcjI0OTEyNTAw",
"avatar_url": "https://avatars.githubusercontent.com/u/24912500?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kingman1980",
"html_url": "https://github.com/kingman1980",
"followers_url": "https://api.github.com/users/kingman1980/followers",
"following_url": "https://api.github.com/users/kingman1980/following{/other_user}",
"gists_url": "https://api.github.com/users/kingman1980/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kingman1980/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kingman1980/subscriptions",
"organizations_url": "https://api.github.com/users/kingman1980/orgs",
"repos_url": "https://api.github.com/users/kingman1980/repos",
"events_url": "https://api.github.com/users/kingman1980/events{/privacy}",
"received_events_url": "https://api.github.com/users/kingman1980/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T11:50:48 | 2025-05-16T12:33:39 | 2025-05-16T12:33:37 | NONE | null | null | null | null | I am working on a three-class image classification task.
During training, at the 30th epoch, the evaluation gave this confusion matrix:
[[230, 0, 1], [1, 28, 2], [1, 10, 3727]]
Then I used a pipeline with the model's checkpoint from the 30th epoch to run inference on the same test dataset, and got this confusion matrix:
[[230, 1, 0], [0, 31, 0], [8, 166, 3564]]
The model is `timm/mobilenetv3_small_100.lamb_in1k`.
Does anybody know why? | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37526/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37526/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37525 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37525/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37525/comments | https://api.github.com/repos/huggingface/transformers/issues/37525/events | https://github.com/huggingface/transformers/pull/37525 | 2,996,061,631 | PR_kwDOCUB6oc6Spknp | 37,525 | fix: Restore explicit error surfacing for unexpected hub exceptions | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T11:11:14 | 2025-04-15T13:06:39 | 2025-04-15T12:54:11 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37525",
"html_url": "https://github.com/huggingface/transformers/pull/37525",
"diff_url": "https://github.com/huggingface/transformers/pull/37525.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37525.patch",
"merged_at": "2025-04-15T12:54:11"
} | Prior to PR #36033, unexpected exceptions (e.g., `ModuleNotFoundError`) during hub model loading were not swallowed silently: they either matched a specific `except` block or were re-raised.
After #36033, a catch-all `except Exception` block was introduced without a fallback `else`, causing unknown errors to be silently ignored and leading to misleading downstream behavior.
This commit adds an `else: raise e` to ensure that only explicitly handled exceptions are suppressed. All others are surfaced, restoring the pre-4.50 behavior and aiding debugging and dependency visibility.
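For illustration, the restored control flow can be sketched like this; the loader function and the handled exception types here are invented for the example, not the actual transformers code:

```python
# Hypothetical sketch of the restored error surfacing; the function and the
# handled exception types are illustrative, not the actual transformers code.

def load_with_explicit_surfacing(loader):
    try:
        return loader()
    except Exception as e:
        if isinstance(e, FileNotFoundError):
            # Explicitly handled: the file simply is not there.
            return None
        elif isinstance(e, ConnectionError):
            # Explicitly handled: a known connectivity problem.
            return None
        else:
            # The fix: anything unexpected (e.g. ModuleNotFoundError) is
            # re-raised instead of being silently swallowed.
            raise e

def missing_file_loader():
    raise FileNotFoundError("not on the hub")

def broken_loader():
    raise ModuleNotFoundError("missing optional dependency")

print(load_with_explicit_surfacing(missing_file_loader))  # handled case: prints None
try:
    load_with_explicit_surfacing(broken_loader)
except ModuleNotFoundError as e:
    print(f"surfaced: {e}")
```

The key point is the final `else`: without it, the unknown exception would fall through the `if`/`elif` chain and vanish.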
Fixes #37477, see my last comment there for more details.
Since I am new here, I appreciate reviews :)
Should a test for proper exceptions be written so this kind of regression does not happen again?
## Who can review?
@gante @Cyrilvallez @Wauplin | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37525/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37525/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37523 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37523/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37523/comments | https://api.github.com/repos/huggingface/transformers/issues/37523/events | https://github.com/huggingface/transformers/pull/37523 | 2,995,835,488 | PR_kwDOCUB6oc6SoypX | 37,523 | [chat template] fix security vulnerability | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T09:45:52 | 2025-04-17T07:21:37 | 2025-04-17T07:21:37 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37523",
"html_url": "https://github.com/huggingface/transformers/pull/37523",
"diff_url": "https://github.com/huggingface/transformers/pull/37523.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37523.patch",
"merged_at": "2025-04-17T07:21:37"
} | # What does this PR do?
As discussed internally, let's just use `urlparse` | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37523/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37523/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37522 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37522/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37522/comments | https://api.github.com/repos/huggingface/transformers/issues/37522/events | https://github.com/huggingface/transformers/pull/37522 | 2,995,707,179 | PR_kwDOCUB6oc6SoWld | 37,522 | internalize build_inputs_with_special_tokens and prepare_for_model | {
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github.com/users/itazap/followers",
"following_url": "https://api.github.com/users/itazap/following{/other_user}",
"gists_url": "https://api.github.com/users/itazap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itazap/subscriptions",
"organizations_url": "https://api.github.com/users/itazap/orgs",
"repos_url": "https://api.github.com/users/itazap/repos",
"events_url": "https://api.github.com/users/itazap/events{/privacy}",
"received_events_url": "https://api.github.com/users/itazap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-04-15T09:01:35 | 2025-06-02T09:20:19 | null | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37522",
"html_url": "https://github.com/huggingface/transformers/pull/37522",
"diff_url": "https://github.com/huggingface/transformers/pull/37522.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37522.patch",
"merged_at": null
} | Internalize `prepare_for_model` and `build_inputs_with_special_tokens`:
Users should encode using `__call__`:
### slow case
`__call__` --> calls `prepare_for_model` --> which calls `build_inputs_with_special_tokens`.
### fast case
`__call__` --> calls the Rust tokenizer's encode --> does not call `build_inputs_with_special_tokens` or `prepare_for_model`
Exposing these makes it unclear which method is the correct one to use for encoding, and adds redundancy to maintain. We already exercise them under the hood when testing `__call__`, `encode`, `batch_encode`, etc., so IMO it is safe to remove the `build_inputs_with_special_tokens` tests, at least from all fast files, and to rename `build_inputs_with_special_tokens` --> `_build_inputs_with_special_tokens` in slow.
* only modified llama fast for now but can add a commit with all fast files edited
* also updated the old language_modeling.py file; not sure if it is still relevant in general
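For illustration, here is a toy sketch of the two call paths; the classes below are invented for the example and are not the real tokenizer implementations:

```python
# Toy sketch of the slow vs. fast call paths; these classes are illustrative,
# not the real transformers tokenizers.

class SlowTokenizer:
    bos, eos = 0, 1

    def _build_inputs_with_special_tokens(self, ids):
        # Internal helper: wraps the token ids in special tokens.
        return [self.bos] + ids + [self.eos]

    def _prepare_for_model(self, ids):
        # Internal helper: delegates to _build_inputs_with_special_tokens.
        return {"input_ids": self._build_inputs_with_special_tokens(ids)}

    def __call__(self, ids):
        # Public entry point: the slow path goes through _prepare_for_model.
        return self._prepare_for_model(ids)

class FastTokenizer:
    bos, eos = 0, 1

    def __call__(self, ids):
        # Public entry point: the Rust backend adds special tokens itself
        # and never calls the Python helper methods.
        return {"input_ids": [self.bos] + ids + [self.eos]}

# Both paths produce the same result through the single public API:
slow_out = SlowTokenizer()([5, 6])
fast_out = FastTokenizer()([5, 6])
```

With only `__call__` public, both tokenizer flavors expose one consistent entry point, which is the point of internalizing the helpers.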
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37522/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37522/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/37521 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37521/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37521/comments | https://api.github.com/repos/huggingface/transformers/issues/37521/events | https://github.com/huggingface/transformers/issues/37521 | 2,995,630,848 | I_kwDOCUB6oc6yjbMA | 37,521 | DeformableDetrHungarianMatcher: fancy indexing fails | {
"login": "codingS3b",
"id": 31093529,
"node_id": "MDQ6VXNlcjMxMDkzNTI5",
"avatar_url": "https://avatars.githubusercontent.com/u/31093529?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codingS3b",
"html_url": "https://github.com/codingS3b",
"followers_url": "https://api.github.com/users/codingS3b/followers",
"following_url": "https://api.github.com/users/codingS3b/following{/other_user}",
"gists_url": "https://api.github.com/users/codingS3b/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codingS3b/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codingS3b/subscriptions",
"organizations_url": "https://api.github.com/users/codingS3b/orgs",
"repos_url": "https://api.github.com/users/codingS3b/repos",
"events_url": "https://api.github.com/users/codingS3b/events{/privacy}",
"received_events_url": "https://api.github.com/users/codingS3b/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-15T08:36:27 | 2025-04-15T12:41:34 | 2025-04-15T12:41:33 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.49.0
- Platform: Linux-3.10.0-1160.25.1.el7.x86_64-x86_64-with-glibc2.17
- Python version: 3.12.9
- Huggingface_hub version: 0.29.2
- Safetensors version: 0.5.3
- Accelerate version: not installed
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu118 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
### Who can help?
I assume this is a bug in the matching carried out by `DeformableDetrHungarianMatcher`, where indexing fails due to shape mismatches, but I am not totally sure whether I have the data prepared in the expected format. @qubvel @amyeroberts, could you possibly have a look?
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
import torch
import numpy as np
from transformers.loss.loss_deformable_detr import DeformableDetrHungarianMatcher
np.random.seed(42)
matcher = DeformableDetrHungarianMatcher()
batch_size = 2
num_queries = 10
num_classes = 2
# prepare the data according to the docstring of the matcher
outputs = {
    "logits": torch.randn(batch_size, num_queries, num_classes),  # batch_size, num_queries, num_classes
    "pred_boxes": torch.randn(batch_size, num_queries, 4),  # batch_size, num_queries, 4
}
# for each sample in the batch, a different number of objects can be present
targets = []
for sidx in range(batch_size):
    n_objects = np.random.randint(1, 4)
    d = {
        "class_labels": torch.ones(n_objects).to(torch.uint8),  # we have only one class, but the number of objects is random
        "boxes": torch.randn(n_objects, 4),
    }
    print(f"Sample {sidx}: {n_objects} ground-truth objects!")
    targets.append(d)
# this fails with IndexError: The shape of the mask [4] at index 0 does not match the shape of the indexed tensor [20, 2] at index 1
# the relevant line is `class_cost = pos_cost_class[:, target_ids] - neg_cost_class[:, target_ids]`
matcher(outputs, targets)
```
### Expected behavior
I would expect the matcher to not crash and provide a reasonable output instead. | {
"login": "codingS3b",
"id": 31093529,
"node_id": "MDQ6VXNlcjMxMDkzNTI5",
"avatar_url": "https://avatars.githubusercontent.com/u/31093529?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codingS3b",
"html_url": "https://github.com/codingS3b",
"followers_url": "https://api.github.com/users/codingS3b/followers",
"following_url": "https://api.github.com/users/codingS3b/following{/other_user}",
"gists_url": "https://api.github.com/users/codingS3b/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codingS3b/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codingS3b/subscriptions",
"organizations_url": "https://api.github.com/users/codingS3b/orgs",
"repos_url": "https://api.github.com/users/codingS3b/repos",
"events_url": "https://api.github.com/users/codingS3b/events{/privacy}",
"received_events_url": "https://api.github.com/users/codingS3b/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37521/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37521/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/37520 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37520/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37520/comments | https://api.github.com/repos/huggingface/transformers/issues/37520/events | https://github.com/huggingface/transformers/pull/37520 | 2,995,575,472 | PR_kwDOCUB6oc6Sn5vv | 37,520 | Fix BitsAndBytesConfig JSON serialization in TrainingArguments | {
"login": "astefanutti",
"id": 366207,
"node_id": "MDQ6VXNlcjM2NjIwNw==",
"avatar_url": "https://avatars.githubusercontent.com/u/366207?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/astefanutti",
"html_url": "https://github.com/astefanutti",
"followers_url": "https://api.github.com/users/astefanutti/followers",
"following_url": "https://api.github.com/users/astefanutti/following{/other_user}",
"gists_url": "https://api.github.com/users/astefanutti/gists{/gist_id}",
"starred_url": "https://api.github.com/users/astefanutti/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/astefanutti/subscriptions",
"organizations_url": "https://api.github.com/users/astefanutti/orgs",
"repos_url": "https://api.github.com/users/astefanutti/repos",
"events_url": "https://api.github.com/users/astefanutti/events{/privacy}",
"received_events_url": "https://api.github.com/users/astefanutti/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-04-15T08:17:04 | 2025-04-16T09:34:23 | 2025-04-16T09:18:17 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37520",
"html_url": "https://github.com/huggingface/transformers/pull/37520",
"diff_url": "https://github.com/huggingface/transformers/pull/37520.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37520.patch",
"merged_at": "2025-04-16T09:18:17"
} | # What does this PR do?
This makes sure the BitsAndBytesConfig object gets converted to a dict before the TrainingArguments are serialized to JSON.
This happens, for example, when using BitsAndBytes together with the TensorBoard integration.
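For illustration, a minimal stdlib-only sketch of the problem and the fix; `ToyQuantConfig` and `ToyTrainingArguments` stand in for the real classes and are not the actual transformers code:

```python
# Hedged sketch: a nested config object breaks json.dumps unless it is first
# converted to a plain dict. The classes here are toys, not the real ones.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class ToyQuantConfig:
    load_in_4bit: bool = True

    def to_dict(self):
        return asdict(self)

@dataclass
class ToyTrainingArguments:
    output_dir: str = "out"
    quant_config: object = field(default_factory=ToyQuantConfig)

    def to_sanitized_dict(self):
        d = {"output_dir": self.output_dir, "quant_config": self.quant_config}
        # The fix: convert any config object to a plain dict so that
        # json.dumps does not raise "Object of type ... is not JSON serializable".
        return {
            k: (v.to_dict() if hasattr(v, "to_dict") else v)
            for k, v in d.items()
        }

args = ToyTrainingArguments()
serialized = json.dumps(args.to_sanitized_dict())
print(serialized)
```

Without the `to_dict` conversion, `json.dumps` would raise a `TypeError` as soon as an integration (such as TensorBoard logging) tries to serialize the arguments.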
<!-- Remove if not applicable -->
Fixes #37518
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
@zach-huggingface @SunMarc @MekkCyber | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37520/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37520/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/37519 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/37519/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/37519/comments | https://api.github.com/repos/huggingface/transformers/issues/37519/events | https://github.com/huggingface/transformers/issues/37519 | 2,995,574,520 | I_kwDOCUB6oc6yjNb4 | 37,519 | [FSDP][torch.compile] accelerator.unwrap_model and trainer._save work incorrectly when FSDP + torch.compile | {
"login": "efsotr",
"id": 104755879,
"node_id": "U_kgDOBj5ypw",
"avatar_url": "https://avatars.githubusercontent.com/u/104755879?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/efsotr",
"html_url": "https://github.com/efsotr",
"followers_url": "https://api.github.com/users/efsotr/followers",
"following_url": "https://api.github.com/users/efsotr/following{/other_user}",
"gists_url": "https://api.github.com/users/efsotr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/efsotr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/efsotr/subscriptions",
"organizations_url": "https://api.github.com/users/efsotr/orgs",
"repos_url": "https://api.github.com/users/efsotr/repos",
"events_url": "https://api.github.com/users/efsotr/events{/privacy}",
"received_events_url": "https://api.github.com/users/efsotr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1862634478,
"node_id": "MDU6TGFiZWwxODYyNjM0NDc4",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Should%20Fix",
"name": "Should Fix",
"color": "FF0000",
"default": false,
"description": "This has been identified as a bug and should be fixed."
},
{
"id": 1990918270,
"node_id": "MDU6TGFiZWwxOTkwOTE4Mjcw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20First%20Issue",
"name": "Good First Issue",
"color": "bbf794",
"default": false,
"description": ""
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-04-15T08:16:39 | 2025-09-23T20:07:36 | 2025-05-06T15:51:29 | CONTRIBUTOR | null | null | null | null | ### System Info
transformers 4.51.3
accelerate 1.6.0
### Who can help?
@zach-huggingface @SunMarc
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
To use torch.compile, you need to either uninstall the kernels library or set the environment variable DISABLE_KERNEL_MAPPING to 1.
**train.py**
```python
from typing import cast
import torch
from transformers import HfArgumentParser, Trainer, TrainingArguments, LlamaForCausalLM, LlamaConfig
args = HfArgumentParser(TrainingArguments)
training_args = cast(TrainingArguments, args.parse_args_into_dataclasses())[0]
print(training_args, flush=True)
config = LlamaConfig(
    vocab_size=128,
    hidden_size=128,
    intermediate_size=128*2,
    num_hidden_layers=2
)
model = LlamaForCausalLM(config).cuda().bfloat16()
train_dataset = [{"input_ids": torch.randint(0, 128, (128,)),
                  "labels": torch.randint(0, 128, (128,))} for i in range(16)]
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,
)
trainer.train()
trainer.save_state()
```
**fsdp.yaml**
```yaml
compute_environment: LOCAL_MACHINE
debug: false
distributed_type: FSDP
fsdp_config:
  fsdp_sharding_strategy: FULL_SHARD
  fsdp_activation_checkpointing: false
  fsdp_use_orig_params: true
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
  fsdp_backward_prefetch_policy: BACKWARD_PRE
  fsdp_offload_params: false
  fsdp_state_dict_type: FULL_STATE_DICT
  fsdp_transformer_layer_cls_to_wrap: LlamaDecoderLayer,Embedding
mixed_precision: 'no'
enable_cpu_affinity: false
machine_rank: 0
main_training_function: main
num_machines: 1
num_processes: 2
rdzv_backend: static
same_network: true
tpu_env: []
tpu_use_cluster: false
tpu_use_sudo: false
use_cpu: false
```
**launch script**
```bash
export CUDA_VISIBLE_DEVICES=0,1
export DISABLE_KERNEL_MAPPING=1
OUTPUT_DIR=test_fsdp
mkdir -p $OUTPUT_DIR
OMP_NUM_THREADS=8 accelerate launch --main_process_port 40129 --config_file fsdp.yaml \
train.py \
--torch_compile_mode default \
--do_train \
--optim adamw_torch_fused \
--learning_rate 1e-3 \
--weight_decay 0 \
--lr_scheduler_type constant_with_warmup \
--warmup_ratio 0.1 \
--per_device_train_batch_size 4 \
--per_device_eval_batch_size 4 \
--eval_on_start 0 \
--eval_strategy epoch \
--eval_steps 1 \
--save_strategy epoch \
--save_only_model 1 \
--greater_is_better False \
--logging_strategy steps \
--logging_steps 1 \
--include_tokens_per_second \
--output_dir $OUTPUT_DIR \
--num_train_epochs 1 \
--seed 0 \
--report_to none \
> $OUTPUT_DIR/training.log 2>&1
```
### Expected behavior
The file `test_fsdp/checkpoint-2/config.json` should exist.
run
```python
from safetensors import safe_open
path = "test_fsdp/checkpoint-2/model.safetensors"
file = safe_open(path, framework="pt")
print(file.keys())
lm_head = "lm_head.weight"
if lm_head not in file.keys():
    lm_head = "_orig_mod." + lm_head
print(file.get_tensor(lm_head).shape)
```
expected to get
```
['lm_head.weight', 'model.embed_tokens.weight', 'model.layers.0.input_layernorm.weight', 'model.layers.0.mlp.down_proj.weight', 'model.layers.0.mlp.gate_proj.weight', 'model.layers.0.mlp.up_proj.weight', 'model.layers.0.post_attention_layernorm.weight', 'model.layers.0.self_attn.k_proj.weight', 'model.layers.0.self_attn.o_proj.weight', 'model.layers.0.self_attn.q_proj.weight', 'model.layers.0.self_attn.v_proj.weight', 'model.layers.1.input_layernorm.weight', 'model.layers.1.mlp.down_proj.weight', 'model.layers.1.mlp.gate_proj.weight', 'model.layers.1.mlp.up_proj.weight', 'model.layers.1.post_attention_layernorm.weight', 'model.layers.1.self_attn.k_proj.weight', 'model.layers.1.self_attn.o_proj.weight', 'model.layers.1.self_attn.q_proj.weight', 'model.layers.1.self_attn.v_proj.weight', 'model.norm.weight']
torch.Size([128, 128])
```
instead of
```
['_orig_mod.lm_head.weight', '_orig_mod.model.embed_tokens.weight', '_orig_mod.model.layers.0.input_layernorm.weight', '_orig_mod.model.layers.0.mlp.down_proj.weight', '_orig_mod.model.layers.0.mlp.gate_proj.weight', '_orig_mod.model.layers.0.mlp.up_proj.weight', '_orig_mod.model.layers.0.post_attention_layernorm.weight', '_orig_mod.model.layers.0.self_attn.k_proj.weight', '_orig_mod.model.layers.0.self_attn.o_proj.weight', '_orig_mod.model.layers.0.self_attn.q_proj.weight', '_orig_mod.model.layers.0.self_attn.v_proj.weight', '_orig_mod.model.layers.1.input_layernorm.weight', '_orig_mod.model.layers.1.mlp.down_proj.weight', '_orig_mod.model.layers.1.mlp.gate_proj.weight', '_orig_mod.model.layers.1.mlp.up_proj.weight', '_orig_mod.model.layers.1.post_attention_layernorm.weight', '_orig_mod.model.layers.1.self_attn.k_proj.weight', '_orig_mod.model.layers.1.self_attn.o_proj.weight', '_orig_mod.model.layers.1.self_attn.q_proj.weight', '_orig_mod.model.layers.1.self_attn.v_proj.weight', '_orig_mod.model.norm.weight']
torch.Size([8128])
```
If the `--eval_strategy epoch` in the launch script is changed to `--eval_strategy no`, then
```
['_orig_mod.lm_head.weight', '_orig_mod.model.embed_tokens.weight', '_orig_mod.model.layers.0.input_layernorm.weight', '_orig_mod.model.layers.0.mlp.down_proj.weight', '_orig_mod.model.layers.0.mlp.gate_proj.weight', '_orig_mod.model.layers.0.mlp.up_proj.weight', '_orig_mod.model.layers.0.post_attention_layernorm.weight', '_orig_mod.model.layers.0.self_attn.k_proj.weight', '_orig_mod.model.layers.0.self_attn.o_proj.weight', '_orig_mod.model.layers.0.self_attn.q_proj.weight', '_orig_mod.model.layers.0.self_attn.v_proj.weight', '_orig_mod.model.layers.1.input_layernorm.weight', '_orig_mod.model.layers.1.mlp.down_proj.weight', '_orig_mod.model.layers.1.mlp.gate_proj.weight', '_orig_mod.model.layers.1.mlp.up_proj.weight', '_orig_mod.model.layers.1.post_attention_layernorm.weight', '_orig_mod.model.layers.1.self_attn.k_proj.weight', '_orig_mod.model.layers.1.self_attn.o_proj.weight', '_orig_mod.model.layers.1.self_attn.q_proj.weight', '_orig_mod.model.layers.1.self_attn.v_proj.weight', '_orig_mod.model.norm.weight']
torch.Size([128, 128])
``` | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/37519/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/37519/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |