url string | repository_url string | labels_url string | comments_url string | events_url string | html_url string | id int64 | node_id string | number int64 | title string | user dict | labels list | state string | locked bool | assignee dict | assignees list | milestone null | comments list | created_at timestamp[ms] | updated_at timestamp[ms] | closed_at timestamp[ms] | author_association string | type dict | active_lock_reason null | draft bool | pull_request dict | body string | closed_by dict | reactions dict | timeline_url string | performed_via_github_app null | state_reason string | sub_issues_summary dict | issue_dependencies_summary dict | is_pull_request bool | is_closed bool |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/38427 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38427/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38427/comments | https://api.github.com/repos/huggingface/transformers/issues/38427/events | https://github.com/huggingface/transformers/pull/38427 | 3,096,538,766 | PR_kwDOCUB6oc6X67Ug | 38,427 | Create hug | {
"login": "Hulk733",
"id": 213313875,
"node_id": "U_kgDODLbpUw",
"avatar_url": "https://avatars.githubusercontent.com/u/213313875?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hulk733",
"html_url": "https://github.com/Hulk733",
"followers_url": "https://api.github.com/users/Hulk733/followers",
"following_url": "https://api.github.com/users/Hulk733/following{/other_user}",
"gists_url": "https://api.github.com/users/Hulk733/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Hulk733/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hulk733/subscriptions",
"organizations_url": "https://api.github.com/users/Hulk733/orgs",
"repos_url": "https://api.github.com/users/Hulk733/repos",
"events_url": "https://api.github.com/users/Hulk733/events{/privacy}",
"received_events_url": "https://api.github.com/users/Hulk733/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-28T08:39:04 | 2025-05-28T15:01:13 | 2025-05-28T15:01:13 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38427",
"html_url": "https://github.com/huggingface/transformers/pull/38427",
"diff_url": "https://github.com/huggingface/transformers/pull/38427.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38427.patch",
"merged_at": null
} | huug
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38427/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38427/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38426 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38426/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38426/comments | https://api.github.com/repos/huggingface/transformers/issues/38426/events | https://github.com/huggingface/transformers/pull/38426 | 3,096,536,854 | PR_kwDOCUB6oc6X665b | 38,426 | [janus] Fix failing tests on mi3XX | {
"login": "remi-or",
"id": 83456801,
"node_id": "MDQ6VXNlcjgzNDU2ODAx",
"avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/remi-or",
"html_url": "https://github.com/remi-or",
"followers_url": "https://api.github.com/users/remi-or/followers",
"following_url": "https://api.github.com/users/remi-or/following{/other_user}",
"gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}",
"starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/remi-or/subscriptions",
"organizations_url": "https://api.github.com/users/remi-or/orgs",
"repos_url": "https://api.github.com/users/remi-or/repos",
"events_url": "https://api.github.com/users/remi-or/events{/privacy}",
"received_events_url": "https://api.github.com/users/remi-or/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-28T08:38:26 | 2025-06-04T07:38:11 | 2025-06-04T07:38:10 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38426",
"html_url": "https://github.com/huggingface/transformers/pull/38426",
"diff_url": "https://github.com/huggingface/transformers/pull/38426.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38426.patch",
"merged_at": "2025-06-04T07:38:10"
} | This PR fixes a few test failures of the `janus` model on MI3XX:
- an AttributeError when trying to retrieve the BOI token from a `generation_kwargs` attribute that is not guaranteed to be present in the config;
- three multi-device errors (the device map starting with the wrong module, a module being split when it should not be, inputs landing on different devices);
- a test failure due to numerical differences, fixed with `Expectations`. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38426/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38426/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38425 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38425/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38425/comments | https://api.github.com/repos/huggingface/transformers/issues/38425/events | https://github.com/huggingface/transformers/issues/38425 | 3,096,489,576 | I_kwDOCUB6oc64kK5o | 38,425 | Can not load TencentBAC/Conan-embedding-v2 | {
"login": "shanekao-sks",
"id": 167391143,
"node_id": "U_kgDOCfovpw",
"avatar_url": "https://avatars.githubusercontent.com/u/167391143?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shanekao-sks",
"html_url": "https://github.com/shanekao-sks",
"followers_url": "https://api.github.com/users/shanekao-sks/followers",
"following_url": "https://api.github.com/users/shanekao-sks/following{/other_user}",
"gists_url": "https://api.github.com/users/shanekao-sks/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shanekao-sks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shanekao-sks/subscriptions",
"organizations_url": "https://api.github.com/users/shanekao-sks/orgs",
"repos_url": "https://api.github.com/users/shanekao-sks/repos",
"events_url": "https://api.github.com/users/shanekao-sks/events{/privacy}",
"received_events_url": "https://api.github.com/users/shanekao-sks/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-28T08:21:23 | 2025-05-28T14:58:03 | 2025-05-28T14:58:02 | NONE | null | null | null | null | ### System Info
Description
When attempting to load the “Conan-embedding-v2” model directly via transformers.AutoModel.from_pretrained, I get a ValueError indicating that the repo’s config.json lacks a model_type key. This prevents the Transformers library from inferring which model class to instantiate.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
from transformers import AutoModel
model = AutoModel.from_pretrained("TencentBAC/Conan-embedding-v2")
```
```
ValueError: Unrecognized model in TencentBAC/Conan-embedding-v2.
Should have a `model_type` key in its config.json, or contain one of the following strings in its name: albert, bart, bert, …, whisper, xlnet, …
```
### Expected behavior
AutoModel.from_pretrained("TencentBAC/Conan-embedding-v2") should load the model automatically, or at minimum provide guidance on how to set the correct model_type. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38425/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38425/timeline | null | not_planned | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38424 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38424/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38424/comments | https://api.github.com/repos/huggingface/transformers/issues/38424/events | https://github.com/huggingface/transformers/pull/38424 | 3,096,357,376 | PR_kwDOCUB6oc6X6Uwg | 38,424 | Update `CsmForConditionalGenerationIntegrationTest` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-28T07:33:13 | 2025-05-28T08:20:45 | 2025-05-28T08:20:43 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38424",
"html_url": "https://github.com/huggingface/transformers/pull/38424",
"diff_url": "https://github.com/huggingface/transformers/pull/38424.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38424.patch",
"merged_at": "2025-05-28T08:20:43"
} | # What does this PR do?
Missing `@require_read_token`.
Now all 4 tests are passing, good job @eustlb!
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38424/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38424/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38423 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38423/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38423/comments | https://api.github.com/repos/huggingface/transformers/issues/38423/events | https://github.com/huggingface/transformers/pull/38423 | 3,096,024,889 | PR_kwDOCUB6oc6X5Mw_ | 38,423 | [Qwen2-VL] Fix smart_resize bug | {
"login": "rdonggroq",
"id": 210547133,
"node_id": "U_kgDODIyxvQ",
"avatar_url": "https://avatars.githubusercontent.com/u/210547133?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rdonggroq",
"html_url": "https://github.com/rdonggroq",
"followers_url": "https://api.github.com/users/rdonggroq/followers",
"following_url": "https://api.github.com/users/rdonggroq/following{/other_user}",
"gists_url": "https://api.github.com/users/rdonggroq/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rdonggroq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rdonggroq/subscriptions",
"organizations_url": "https://api.github.com/users/rdonggroq/orgs",
"repos_url": "https://api.github.com/users/rdonggroq/repos",
"events_url": "https://api.github.com/users/rdonggroq/events{/privacy}",
"received_events_url": "https://api.github.com/users/rdonggroq/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-28T05:03:04 | 2025-06-09T21:19:47 | 2025-05-30T06:52:40 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38423",
"html_url": "https://github.com/huggingface/transformers/pull/38423",
"diff_url": "https://github.com/huggingface/transformers/pull/38423.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38423.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
This currently throws:
```python
import torch
from transformers import Qwen2VLImageProcessorFast
from transformers.image_utils import ChannelDimension
processor = Qwen2VLImageProcessorFast()
format = ChannelDimension.FIRST
image = torch.zeros((3, 100, 100))
size = {"shortest_edge": 100, "longest_edge": 100}
processor.preprocess(image, input_data_format=format, size=size)
```
<details>
<summary>Error</summary>
```
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
Cell In[38], line 8
6 image = torch.zeros((3, 100, 100))
7 size = {"shortest_edge":100, "longest_edge":100}
----> 8 processor.preprocess(image, input_data_format=format, size=size)
File /nix/store/xg7lc4jxm5p199b01nndbfqr3fy4p8g7-python3.10-transformers-4.50.3/lib/python3.10/site-packages/transformers/models/qwen2_vl/image_processing_qwen2_vl_fast.py:397, in Qwen2VLImageProcessorFast.preprocess(self, images, videos, do_resize, size, resample, do_rescale, rescale_factor, do_normalize, image_mean, image_std, min_pixels, max_pixels, patch_size, temporal_patch_size, merge_size, do_convert_rgb, return_tensors, data_format, input_data_format, device, **kwargs)
395 pixel_values, vision_grid_thws = [], []
396 for image in images:
--> 397 patches, image_grid_thw = self._preprocess(
398 image,
399 do_resize=do_resize,
400 size=size,
401 interpolation=interpolation,
402 do_rescale=do_rescale,
403 rescale_factor=rescale_factor,
404 do_normalize=do_normalize,
405 image_mean=image_mean,
406 image_std=image_std,
407 patch_size=patch_size,
408 temporal_patch_size=temporal_patch_size,
409 merge_size=merge_size,
410 do_convert_rgb=do_convert_rgb,
411 input_data_format=input_data_format,
412 device=device,
413 )
414 pixel_values.extend(patches)
415 vision_grid_thws.append(image_grid_thw)
File /nix/store/xg7lc4jxm5p199b01nndbfqr3fy4p8g7-python3.10-transformers-4.50.3/lib/python3.10/site-packages/transformers/models/qwen2_vl/image_processing_qwen2_vl_fast.py:209, in Qwen2VLImageProcessorFast._preprocess(self, images, do_resize, size, interpolation, do_rescale, rescale_factor, do_normalize, image_mean, image_std, patch_size, temporal_patch_size, merge_size, do_convert_rgb, input_data_format, device)
201 if do_resize:
202 resized_height, resized_width = smart_resize(
203 height,
204 width,
(...)
207 max_pixels=size["longest_edge"],
208 )
--> 209 stacked_images = F.resize(
210 stacked_images, size=(resized_height, resized_width), interpolation=interpolation
211 )
212 resized_images_grouped[shape] = stacked_images
213 resized_images = reorder_images(resized_images_grouped, grouped_images_index)
File /nix/store/cifrlch412l6cnpa06qaf8lrqbs47pzh-python3.10-torchvision-0.20.1/lib/python3.10/site-packages/torchvision/transforms/v2/functional/_geometry.py:188, in resize(inpt, size, interpolation, max_size, antialias)
185 _log_api_usage_once(resize)
187 kernel = _get_kernel(resize, type(inpt))
--> 188 return kernel(inpt, size=size, interpolation=interpolation, max_size=max_size, antialias=antialias)
File /nix/store/cifrlch412l6cnpa06qaf8lrqbs47pzh-python3.10-torchvision-0.20.1/lib/python3.10/site-packages/torchvision/transforms/v2/functional/_geometry.py:260, in resize_image(image, size, interpolation, max_size, antialias)
257 if need_cast:
258 image = image.to(dtype=torch.float32)
--> 260 image = interpolate(
261 image,
262 size=[new_height, new_width],
263 mode=interpolation.value,
264 align_corners=align_corners,
265 antialias=antialias,
266 )
268 if need_cast:
269 if interpolation == InterpolationMode.BICUBIC and dtype == torch.uint8:
270 # This path is hit on non-AVX archs, or on GPU.
File /nix/store/93vlnr4hqnr20y3fm9j952p9zrjr3dqp-python3.10-torch-2.5.1/lib/python3.10/site-packages/torch/nn/functional.py:4591, in interpolate(input, size, scale_factor, mode, align_corners, recompute_scale_factor, antialias)
4589 assert align_corners is not None
4590 if antialias:
-> 4591 return torch._C._nn._upsample_bicubic2d_aa(
4592 input, output_size, align_corners, scale_factors
4593 )
4594 return torch._C._nn.upsample_bicubic2d(
4595 input, output_size, align_corners, scale_factors
4596 )
4598 if input.dim() == 3 and mode == "bilinear":
RuntimeError: Input and output sizes should be greater than 0, but got input (H: 100, W: 100) output (H: 0, W: 0)
```
</details>
This PR incorporates [the fix](https://github.com/QwenLM/Qwen2.5-VL/commit/a30e36facd0a5131d9ed59e93210c7ac5de75adb) from the Qwen repo.
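For illustration, here is a simplified sketch of the pre-fix rounding logic (this is not the exact library code, and the default constants are assumptions) showing how the resized dimensions can floor to zero when `max_pixels` is small relative to the patch factor:

```python
import math

def smart_resize_sketch(height, width, factor=28, max_pixels=14 * 14 * 4 * 1280):
    # Round each dimension to the nearest multiple of the patch factor.
    h_bar = round(height / factor) * factor
    w_bar = round(width / factor) * factor
    # If the rounded area exceeds max_pixels, scale both sides down by sqrt(area ratio).
    if h_bar * w_bar > max_pixels:
        beta = math.sqrt((height * width) / max_pixels)
        # Flooring here can produce 0 when height / beta < factor -- the bug
        # hit in the traceback above (H: 0, W: 0).
        h_bar = math.floor(height / beta / factor) * factor
        w_bar = math.floor(width / beta / factor) * factor
    return h_bar, w_bar

print(smart_resize_sketch(100, 100, max_pixels=100))  # -> (0, 0)
```

Clamping the floored result to at least one patch (e.g. `max(factor, math.floor(...) * factor)`), as the linked Qwen commit does, avoids the zero-sized output.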
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
Looks like @simonJJJ may be relevant. | {
"login": "rdonggroq",
"id": 210547133,
"node_id": "U_kgDODIyxvQ",
"avatar_url": "https://avatars.githubusercontent.com/u/210547133?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rdonggroq",
"html_url": "https://github.com/rdonggroq",
"followers_url": "https://api.github.com/users/rdonggroq/followers",
"following_url": "https://api.github.com/users/rdonggroq/following{/other_user}",
"gists_url": "https://api.github.com/users/rdonggroq/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rdonggroq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rdonggroq/subscriptions",
"organizations_url": "https://api.github.com/users/rdonggroq/orgs",
"repos_url": "https://api.github.com/users/rdonggroq/repos",
"events_url": "https://api.github.com/users/rdonggroq/events{/privacy}",
"received_events_url": "https://api.github.com/users/rdonggroq/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38423/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38423/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38422 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38422/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38422/comments | https://api.github.com/repos/huggingface/transformers/issues/38422/events | https://github.com/huggingface/transformers/issues/38422 | 3,095,974,133 | I_kwDOCUB6oc64iND1 | 38,422 | Bug in error handling routine in save_pretrained | {
"login": "csehydrogen",
"id": 6292032,
"node_id": "MDQ6VXNlcjYyOTIwMzI=",
"avatar_url": "https://avatars.githubusercontent.com/u/6292032?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/csehydrogen",
"html_url": "https://github.com/csehydrogen",
"followers_url": "https://api.github.com/users/csehydrogen/followers",
"following_url": "https://api.github.com/users/csehydrogen/following{/other_user}",
"gists_url": "https://api.github.com/users/csehydrogen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/csehydrogen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/csehydrogen/subscriptions",
"organizations_url": "https://api.github.com/users/csehydrogen/orgs",
"repos_url": "https://api.github.com/users/csehydrogen/repos",
"events_url": "https://api.github.com/users/csehydrogen/events{/privacy}",
"received_events_url": "https://api.github.com/users/csehydrogen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-28T04:36:02 | 2025-05-29T13:58:17 | 2025-05-29T13:58:17 | NONE | null | null | null | null | ### System Info
* Python 3.10.12
* transformers 4.42.4
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
`save_pretrained` has the following error handling routine:
https://github.com/huggingface/transformers/blob/3b3ebcec4077f124f2cd0ec3cd5d028dc352a3e5/src/transformers/modeling_utils.py#L3749
Here, `shared_names` is `List[Set[str]]` so `set(shared_names)` raises `TypeError: unhashable type: 'set'`.
It should be fixed to
```
error_names.extend(shared_names)
```
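The failure mode can be shown in isolation (plain Python; the variable names mirror the issue, not the actual Transformers internals):

```python
# shared_names, as in modeling_utils, is a list of sets of tied-weight names
shared_names = [{"encoder.embed.weight", "decoder.embed.weight"}, {"lm_head.weight"}]

error_names = []
try:
    # buggy: sets are unhashable, so they cannot be elements of another set
    error_names.extend(set(shared_names))
except TypeError as e:
    print(e)  # unhashable type: 'set'

# fixed: extend the list with the sets directly
error_names.extend(shared_names)
print(error_names)
```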
A normal CPU/GPU backend never reaches this line, so it is hard to provide a reproducible example. I found the bug while writing a custom (private) backend; the error triggered when I implemented `.data_ptr()` incorrectly.
### Expected behavior
When something is wrong, `error_names` should be populated without the handler itself raising an error.
The error should be raised in the following lines:
https://github.com/huggingface/transformers/blob/3b3ebcec4077f124f2cd0ec3cd5d028dc352a3e5/src/transformers/modeling_utils.py#L3751-L3754 | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38422/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38422/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38421 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38421/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38421/comments | https://api.github.com/repos/huggingface/transformers/issues/38421/events | https://github.com/huggingface/transformers/pull/38421 | 3,095,938,370 | PR_kwDOCUB6oc6X45_O | 38,421 | [Qwen2.5-VL] Fix empty string input crash in processor | {
"login": "Flink-ddd",
"id": 180720690,
"node_id": "U_kgDOCsWUMg",
"avatar_url": "https://avatars.githubusercontent.com/u/180720690?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Flink-ddd",
"html_url": "https://github.com/Flink-ddd",
"followers_url": "https://api.github.com/users/Flink-ddd/followers",
"following_url": "https://api.github.com/users/Flink-ddd/following{/other_user}",
"gists_url": "https://api.github.com/users/Flink-ddd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Flink-ddd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Flink-ddd/subscriptions",
"organizations_url": "https://api.github.com/users/Flink-ddd/orgs",
"repos_url": "https://api.github.com/users/Flink-ddd/repos",
"events_url": "https://api.github.com/users/Flink-ddd/events{/privacy}",
"received_events_url": "https://api.github.com/users/Flink-ddd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-28T04:11:40 | 2025-05-29T02:12:22 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38421",
"html_url": "https://github.com/huggingface/transformers/pull/38421",
"diff_url": "https://github.com/huggingface/transformers/pull/38421.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38421.patch",
"merged_at": null
} | This PR fixes [#38417](https://github.com/huggingface/transformers/issues/38417).
When passing an empty string to the Qwen2.5 tokenizer with `return_tensors="pt"`, the original output was a `float32` tensor.
This patch ensures a consistent `torch.long` dtype by returning `torch.empty((1, 0), dtype=torch.long)` for empty input.
A test is included to validate the fix.
This is my first contribution to 🤗 Transformers. Happy to help and open to feedback! | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38421/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38421/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38420 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38420/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38420/comments | https://api.github.com/repos/huggingface/transformers/issues/38420/events | https://github.com/huggingface/transformers/pull/38420 | 3,095,858,580 | PR_kwDOCUB6oc6X4o9j | 38,420 | Fix an error in verify_tp_plan for keys without '.' | {
"login": "liwii",
"id": 26531068,
"node_id": "MDQ6VXNlcjI2NTMxMDY4",
"avatar_url": "https://avatars.githubusercontent.com/u/26531068?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/liwii",
"html_url": "https://github.com/liwii",
"followers_url": "https://api.github.com/users/liwii/followers",
"following_url": "https://api.github.com/users/liwii/following{/other_user}",
"gists_url": "https://api.github.com/users/liwii/gists{/gist_id}",
"starred_url": "https://api.github.com/users/liwii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/liwii/subscriptions",
"organizations_url": "https://api.github.com/users/liwii/orgs",
"repos_url": "https://api.github.com/users/liwii/repos",
"events_url": "https://api.github.com/users/liwii/events{/privacy}",
"received_events_url": "https://api.github.com/users/liwii/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-28T03:22:16 | 2025-05-28T07:30:43 | 2025-05-28T07:30:43 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38420",
"html_url": "https://github.com/huggingface/transformers/pull/38420",
"diff_url": "https://github.com/huggingface/transformers/pull/38420.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38420.patch",
"merged_at": "2025-05-28T07:30:43"
} | # What does this PR do?
Fixes #38419
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38420/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38420/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38419 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38419/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38419/comments | https://api.github.com/repos/huggingface/transformers/issues/38419/events | https://github.com/huggingface/transformers/issues/38419 | 3,095,856,123 | I_kwDOCUB6oc64hwP7 | 38,419 | `verify_tp_plan` function raises an error if a key without '.' is given | {
"login": "liwii",
"id": 26531068,
"node_id": "MDQ6VXNlcjI2NTMxMDY4",
"avatar_url": "https://avatars.githubusercontent.com/u/26531068?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/liwii",
"html_url": "https://github.com/liwii",
"followers_url": "https://api.github.com/users/liwii/followers",
"following_url": "https://api.github.com/users/liwii/following{/other_user}",
"gists_url": "https://api.github.com/users/liwii/gists{/gist_id}",
"starred_url": "https://api.github.com/users/liwii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/liwii/subscriptions",
"organizations_url": "https://api.github.com/users/liwii/orgs",
"repos_url": "https://api.github.com/users/liwii/repos",
"events_url": "https://api.github.com/users/liwii/events{/privacy}",
"received_events_url": "https://api.github.com/users/liwii/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-28T03:20:29 | 2025-07-16T09:13:06 | 2025-07-16T09:13:06 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.53.0.dev0
- Platform: Linux-5.10.0-34-cloud-amd64-x86_64-with-glibc2.31
- Python version: 3.9.2
- Huggingface_hub version: 0.32.2
- Safetensors version: 0.4.5
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.6.0+cu124 (NA)
- Tensorflow version (GPU?): 2.15.1 (False)
- Flax version (CPU?/GPU?/TPU?): 0.7.0 (cpu)
- Jax version: 0.4.13
- JaxLib version: 0.4.13
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
@ArthurZucker
(I listed you because you seem to be the reviewer of the original PR, feel free to re-assign this to other people in charge)
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Run the following script:
```
from transformers.models.auto.modeling_auto import (
AutoModelForSeq2SeqLM,
)
import logging
logger = logging.getLogger("transformers.modeling_utils")
logger.setLevel(logging.ERROR)
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-zh-en", revision="cf109095479db38d6df799875e34039d4938aaa6")
```
You see the following error:
```
Traceback (most recent call last):
File "/home/kokiryu/transformers/test_tp_plan.py", line 15, in <module>
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-zh-en", revision="cf109095479db38d6df799875e34039d4938aaa6")
File "/home/kokiryu/.local/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 586, in from_pretrained
return model_class.from_pretrained(
File "/home/kokiryu/.local/lib/python3.9/site-packages/transformers/modeling_utils.py", line 316, in _wrapper
return func(*args, **kwargs)
File "/home/kokiryu/.local/lib/python3.9/site-packages/transformers/modeling_utils.py", line 4697, in from_pretrained
) = cls._load_pretrained_model(
File "/home/kokiryu/.local/lib/python3.9/site-packages/transformers/modeling_utils.py", line 5113, in _load_pretrained_model
verify_tp_plan(expected_keys, getattr(model_to_load, "_tp_plan", None))
File "/home/kokiryu/.local/lib/python3.9/site-packages/transformers/integrations/tensor_parallel.py", line 904, in verify_tp_plan
param_name, _ = key.rsplit(".", 1) if "." in key else key
ValueError: too many values to unpack (expected 2)
```
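The failing line can be reproduced in isolation (plain Python, no Transformers needed; `key` stands in for a checkpoint key without a dot):

```python
key = "final_logits_bias"  # a parameter name with no '.'

# The conditional expression covers the whole right-hand side, so for keys
# without '.' this tuple-unpacks the string itself into two variables:
try:
    param_name, _ = key.rsplit(".", 1) if "." in key else key
except ValueError as e:
    print(e)  # too many values to unpack (expected 2)

# one possible fix: assign the key directly when there is no '.'
param_name = key.rsplit(".", 1)[0] if "." in key else key
print(param_name)  # final_logits_bias
```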
### Expected behavior
Keys without `'.'`, such as `'final_logits_bias'`, can be passed to `verify_tp_plan`. The current code ends up executing `param_name, _ = key` in that case, which raises a `ValueError`.
https://github.com/huggingface/transformers/blob/3b3ebcec4077f124f2cd0ec3cd5d028dc352a3e5/src/transformers/integrations/tensor_parallel.py#L903
It should instead assign `param_name = key` and emit the warning without raising an error. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38419/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38419/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38418 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38418/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38418/comments | https://api.github.com/repos/huggingface/transformers/issues/38418/events | https://github.com/huggingface/transformers/pull/38418 | 3,095,766,250 | PR_kwDOCUB6oc6X4VSi | 38,418 | Fix typo in tokenization_utils_base.py docstring | {
"login": "cwngan",
"id": 58377002,
"node_id": "MDQ6VXNlcjU4Mzc3MDAy",
"avatar_url": "https://avatars.githubusercontent.com/u/58377002?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cwngan",
"html_url": "https://github.com/cwngan",
"followers_url": "https://api.github.com/users/cwngan/followers",
"following_url": "https://api.github.com/users/cwngan/following{/other_user}",
"gists_url": "https://api.github.com/users/cwngan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cwngan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cwngan/subscriptions",
"organizations_url": "https://api.github.com/users/cwngan/orgs",
"repos_url": "https://api.github.com/users/cwngan/repos",
"events_url": "https://api.github.com/users/cwngan/events{/privacy}",
"received_events_url": "https://api.github.com/users/cwngan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-28T02:19:33 | 2025-05-28T14:52:48 | 2025-05-28T14:52:10 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38418",
"html_url": "https://github.com/huggingface/transformers/pull/38418",
"diff_url": "https://github.com/huggingface/transformers/pull/38418.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38418.patch",
"merged_at": "2025-05-28T14:52:10"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
Fix typo in docstring of `tokenization_utils_base.py`.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38418/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38418/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38417 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38417/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38417/comments | https://api.github.com/repos/huggingface/transformers/issues/38417/events | https://github.com/huggingface/transformers/issues/38417 | 3,095,730,420 | I_kwDOCUB6oc64hRj0 | 38,417 | Tokenizer returns float32 tensor for empty string input instead of long dtype | {
"login": "Goer17",
"id": 99397919,
"node_id": "U_kgDOBeyxHw",
"avatar_url": "https://avatars.githubusercontent.com/u/99397919?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Goer17",
"html_url": "https://github.com/Goer17",
"followers_url": "https://api.github.com/users/Goer17/followers",
"following_url": "https://api.github.com/users/Goer17/following{/other_user}",
"gists_url": "https://api.github.com/users/Goer17/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Goer17/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Goer17/subscriptions",
"organizations_url": "https://api.github.com/users/Goer17/orgs",
"repos_url": "https://api.github.com/users/Goer17/repos",
"events_url": "https://api.github.com/users/Goer17/events{/privacy}",
"received_events_url": "https://api.github.com/users/Goer17/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-28T01:54:25 | 2025-07-05T08:02:59 | 2025-07-05T08:02:59 | NONE | null | null | null | null | Hi team,
I found an unexpected behavior when using HuggingFace Transformers' tokenizer with an empty string as input. When I run the following code:
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-0.5B", trust_remote_code=True)
input_ids = tokenizer("", return_tensors="pt")['input_ids']
print(input_ids, input_ids.dtype)
# output:
# tensor([], size=(1, 0)) torch.float32
```
**Potential impact**:
For example, if this `float32` tensor is concatenated with other `torch.long` tensors using `torch.cat`, the result will be promoted to `float32`, causing all token IDs to become floats. This can break downstream code that expects integer token IDs.
Could you please take a look? Thank you!
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-0.5B", trust_remote_code=True)
input_ids = tokenizer("", return_tensors="pt")['input_ids']
print(input_ids, input_ids.dtype, sep="\n")
# output:
# tensor([], size=(1, 0))
# torch.float32
```
### Expected behavior
The tokenizer should always return torch.long tensors for input_ids, regardless of the input content. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38417/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38417/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38416 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38416/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38416/comments | https://api.github.com/repos/huggingface/transformers/issues/38416/events | https://github.com/huggingface/transformers/issues/38416 | 3,095,482,354 | I_kwDOCUB6oc64gU_y | 38,416 | Model implementation using Liger Kernel layers | {
"login": "helloworld1",
"id": 247316,
"node_id": "MDQ6VXNlcjI0NzMxNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/247316?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/helloworld1",
"html_url": "https://github.com/helloworld1",
"followers_url": "https://api.github.com/users/helloworld1/followers",
"following_url": "https://api.github.com/users/helloworld1/following{/other_user}",
"gists_url": "https://api.github.com/users/helloworld1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/helloworld1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/helloworld1/subscriptions",
"organizations_url": "https://api.github.com/users/helloworld1/orgs",
"repos_url": "https://api.github.com/users/helloworld1/repos",
"events_url": "https://api.github.com/users/helloworld1/events{/privacy}",
"received_events_url": "https://api.github.com/users/helloworld1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-05-27T23:18:13 | 2025-07-24T12:21:15 | null | CONTRIBUTOR | null | null | null | null | ### Feature request
In new and existing model implementations, check whether Liger Kernel is installed and the `use_liger_kernel` flag is on, and if so provide an alternative, more efficient layer implementation. This moves the monkey-patch process from Liger-Kernel into the transformers library.
```
self.norm = (
    LlamaRMSNorm(config.hidden_size, eps=config.rms_norm_eps)
    if not is_liger_kernel_enabled()
    else LigerLlamaRMSNorm(config.hidden_size, eps=config.rms_norm_eps)
)
```
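One way such a gate could look (a sketch only: `is_liger_kernel_enabled` and the `USE_LIGER_KERNEL` environment variable are hypothetical names for illustration, and Transformers would choose its own mechanism):

```python
import importlib.util
import os


def is_liger_kernel_enabled() -> bool:
    """Hypothetical gate: the package must be importable AND the flag set."""
    installed = importlib.util.find_spec("liger_kernel") is not None
    flag_on = os.environ.get("USE_LIGER_KERNEL", "0") == "1"
    return installed and flag_on


# Model code can then pick the efficient layer without a monkey patch:
print(is_liger_kernel_enabled())
```

Checking `find_spec` rather than importing keeps the gate cheap and avoids a hard dependency when Liger Kernel is not installed.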
### Motivation
Transformers changes model implementations between releases, and the Liger Kernel monkey-patch process has run into compatibility issues multiple times.
Example: https://github.com/linkedin/Liger-Kernel/issues/723
I would like to make Liger Kernel part of the transformers model implementations to keep the efficient layers stable. This would also give modellers the ability to use efficient kernels when building models.
### Your contribution
I can help PR the change | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38416/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38416/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/38415 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38415/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38415/comments | https://api.github.com/repos/huggingface/transformers/issues/38415/events | https://github.com/huggingface/transformers/pull/38415 | 3,094,871,933 | PR_kwDOCUB6oc6X1TLB | 38,415 | Add mi300 to amd daily ci workflows definition | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T18:52:21 | 2025-05-28T07:17:43 | 2025-05-28T07:17:41 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38415",
"html_url": "https://github.com/huggingface/transformers/pull/38415",
"diff_url": "https://github.com/huggingface/transformers/pull/38415.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38415.patch",
"merged_at": "2025-05-28T07:17:41"
} | null | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38415/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38415/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38414 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38414/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38414/comments | https://api.github.com/repos/huggingface/transformers/issues/38414/events | https://github.com/huggingface/transformers/pull/38414 | 3,094,787,967 | PR_kwDOCUB6oc6X1Azy | 38,414 | [docs] Format fix | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T18:21:13 | 2025-06-03T16:53:32 | 2025-06-03T16:53:23 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38414",
"html_url": "https://github.com/huggingface/transformers/pull/38414",
"diff_url": "https://github.com/huggingface/transformers/pull/38414.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38414.patch",
"merged_at": "2025-06-03T16:53:23"
} | Fixes formatting of table in caching docs | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38414/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38414/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38413 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38413/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38413/comments | https://api.github.com/repos/huggingface/transformers/issues/38413/events | https://github.com/huggingface/transformers/pull/38413 | 3,094,769,913 | PR_kwDOCUB6oc6X085T | 38,413 | Fix CircleCI not triggered when PR is opened from a branch of `huggingface/transformers` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T18:13:26 | 2025-05-28T09:25:45 | 2025-05-28T09:25:43 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38413",
"html_url": "https://github.com/huggingface/transformers/pull/38413",
"diff_url": "https://github.com/huggingface/transformers/pull/38413.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38413.patch",
"merged_at": "2025-05-28T09:25:43"
} | # What does this PR do?
This happens when a PR is opened by a team member (using a branch within `huggingface/transformers`). An empty commit could fix the issue. The code in question came from the work on `converting PR to draft automatically`.
However, we have stopped doing that for now, so let's remove the irrelevant code and fix this seemingly strange behavior. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38413/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38413/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38412 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38412/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38412/comments | https://api.github.com/repos/huggingface/transformers/issues/38412/events | https://github.com/huggingface/transformers/pull/38412 | 3,094,749,552 | PR_kwDOCUB6oc6X04ZR | 38,412 | Fix GLM4 checkpoints | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T18:04:53 | 2025-05-28T16:40:09 | 2025-05-28T16:40:09 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38412",
"html_url": "https://github.com/huggingface/transformers/pull/38412",
"diff_url": "https://github.com/huggingface/transformers/pull/38412.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38412.patch",
"merged_at": "2025-05-28T16:40:09"
} | # What does this PR do?
I can't find a checkpoint matching 9b + 0414 + hf; the closest one is `THUDM/GLM-4-9B-0414`.
All integration tests are passing now.
@yao-matrix If you can approve, I can merge quickly so you can update for XPU if you want. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38412/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38412/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38411 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38411/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38411/comments | https://api.github.com/repos/huggingface/transformers/issues/38411/events | https://github.com/huggingface/transformers/pull/38411 | 3,094,499,316 | PR_kwDOCUB6oc6X0CG0 | 38,411 | Disable mi210 scheduled CI | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T16:20:07 | 2025-05-28T08:35:43 | 2025-05-28T08:35:41 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38411",
"html_url": "https://github.com/huggingface/transformers/pull/38411",
"diff_url": "https://github.com/huggingface/transformers/pull/38411.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38411.patch",
"merged_at": "2025-05-28T08:35:41"
} | null | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38411/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38411/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38410 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38410/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38410/comments | https://api.github.com/repos/huggingface/transformers/issues/38410/events | https://github.com/huggingface/transformers/pull/38410 | 3,094,479,275 | PR_kwDOCUB6oc6Xz9uT | 38,410 | Change slack channel for mi250 CI | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T16:12:51 | 2025-05-28T07:20:36 | 2025-05-28T07:20:34 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38410",
"html_url": "https://github.com/huggingface/transformers/pull/38410",
"diff_url": "https://github.com/huggingface/transformers/pull/38410.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38410.patch",
"merged_at": "2025-05-28T07:20:34"
} | null | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38410/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38410/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38409 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38409/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38409/comments | https://api.github.com/repos/huggingface/transformers/issues/38409/events | https://github.com/huggingface/transformers/pull/38409 | 3,094,442,107 | PR_kwDOCUB6oc6Xz1lt | 38,409 | Make patch helper more helpful | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T15:58:44 | 2025-05-30T09:20:13 | 2025-05-30T09:19:42 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38409",
"html_url": "https://github.com/huggingface/transformers/pull/38409",
"diff_url": "https://github.com/huggingface/transformers/pull/38409.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38409.patch",
"merged_at": "2025-05-30T09:19:42"
} | # What does this PR do?
Late edit: makes the patch helper more helpful.
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38409/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38409/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38408 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38408/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38408/comments | https://api.github.com/repos/huggingface/transformers/issues/38408/events | https://github.com/huggingface/transformers/issues/38408 | 3,094,375,915 | I_kwDOCUB6oc64cG3r | 38,408 | accelerate + device_map auto = error | {
"login": "AaronZLT",
"id": 35474496,
"node_id": "MDQ6VXNlcjM1NDc0NDk2",
"avatar_url": "https://avatars.githubusercontent.com/u/35474496?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AaronZLT",
"html_url": "https://github.com/AaronZLT",
"followers_url": "https://api.github.com/users/AaronZLT/followers",
"following_url": "https://api.github.com/users/AaronZLT/following{/other_user}",
"gists_url": "https://api.github.com/users/AaronZLT/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AaronZLT/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AaronZLT/subscriptions",
"organizations_url": "https://api.github.com/users/AaronZLT/orgs",
"repos_url": "https://api.github.com/users/AaronZLT/repos",
"events_url": "https://api.github.com/users/AaronZLT/events{/privacy}",
"received_events_url": "https://api.github.com/users/AaronZLT/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-27T15:37:10 | 2025-07-07T16:22:11 | 2025-07-05T08:03:02 | CONTRIBUTOR | null | null | null | null | I use `accelerate launch` to run the training script. When the model is loaded with `device_map` set to `auto`, an error occurs before training because the trainer tries to wrap the pipeline-parallel (PP) model in DDP.
```bash
[llm toolkit]: *** Train ***
[rank0]: Traceback (most recent call last):
[rank0]:   File "/mnt/sdb/user_1/workspace/llm-toolkit/tmp/test_prepare/finetune.py", line 54, in <module>
[rank0]:     train(
[rank0]:   File "/mnt/sdb/user_1/workspace/llm-toolkit/llmtoolkit/train.py", line 144, in train
[rank0]:     train_result = trainer.train()
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/transformers/trainer.py", line 2245, in train
[rank0]:     return inner_training_loop(
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/transformers/trainer.py", line 2374, in _inner_training_loop
[rank0]:     model, self.optimizer = self.accelerator.prepare(self.model, self.optimizer)
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/accelerate/accelerator.py", line 1446, in prepare
[rank0]:     result = tuple(
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/accelerate/accelerator.py", line 1447, in <genexpr>
[rank0]:     self._prepare_one(obj, first_pass=True, device_placement=d) for obj, d in zip(args, device_placement)
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/accelerate/accelerator.py", line 1289, in _prepare_one
[rank0]:     return self.prepare_model(obj, device_placement=device_placement)
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/accelerate/accelerator.py", line 1595, in prepare_model
[rank0]:     model = torch.nn.parallel.DistributedDataParallel(
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/nn/parallel/distributed.py", line 827, in __init__
[rank0]:     _sync_module_states(
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/distributed/utils.py", line 317, in _sync_module_states
[rank0]:     _sync_params_and_buffers(process_group, module_states, broadcast_bucket_size, src)
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/distributed/utils.py", line 328, in _sync_params_and_buffers
[rank0]:     dist._broadcast_coalesced(
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/_compile.py", line 32, in inner
[rank0]:     return disable_fn(*args, **kwargs)
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/_dynamo/eval_frame.py", line 632, in _fn
[rank0]:     return fn(*args, **kwargs)
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/distributed/tensor/_api.py", line 340, in __torch_dispatch__
[rank0]:     return DTensor._op_dispatcher.dispatch(
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/distributed/tensor/_dispatch.py", line 166, in dispatch
[rank0]:     op_info = self.unwrap_to_op_info(op_call, args, kwargs)
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/distributed/tensor/_dispatch.py", line 371, in unwrap_to_op_info
[rank0]:     self._try_replicate_spec_for_scalar_tensor(op_call, arg, mesh)
[rank0]:   File "/mnt/sdb/user_1/anaconda3/envs/workspace/lib/python3.10/site-packages/torch/distributed/tensor/_dispatch.py", line 470, in _try_replicate_spec_for_scalar_tensor
[rank0]:     raise RuntimeError(
[rank0]: RuntimeError: aten.cat.default: got mixed torch.Tensor and DTensor, need to convert all torch.Tensor to DTensor before calling distributed operators!
[rank0]:[W527 23:23:59.836288935 ProcessGroupNCCL.cpp:1250] Warning: WARNING: process group has NOT been destroyed before we destruct ProcessGroupNCCL. On normal program exit, the application should call destroy_process_group to ensure that any pending NCCL operations have finished in this process. In rare cases this process can exit before this point and block the progress of another member of the process group. This constraint has always been present, but this warning has only been added since PyTorch 2.4 (function operator())
W0527 23:24:01.485000 3226366 site-packages/torch/distributed/elastic/multiprocessing/api.py:897] Sending process 3226692 closing signal SIGTERM
E0527 23:24:01.752000 3226366 site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 0 (pid: 3226691) of binary: /mnt/sdb/user_1/anaconda3/envs/workspace/bin/python
```
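The conflict can be sketched with a toy check (illustrative only, not accelerate's actual logic): DDP replicates the whole model on each rank, so it is only valid when the model lives on a single device, while `device_map="auto"` typically shards modules across several devices.

```python
def can_wrap_in_ddp(device_map: dict) -> bool:
    """Toy check: DDP wrapping is only valid when every module in the
    device_map sits on one accelerator device (offloaded parts aside)."""
    devices = {d for d in device_map.values() if d not in ("cpu", "disk")}
    return len(devices) <= 1

# A map that keeps everything on GPU 0 is DDP-compatible...
single_gpu = {"model.embed": 0, "model.layers": 0, "lm_head": 0}
# ...while an auto-sharded (pipeline-parallel) map is not.
sharded = {"model.embed": 0, "model.layers": 1, "lm_head": 1}
print(can_wrap_in_ddp(single_gpu))
print(can_wrap_in_ddp(sharded))
```

This is roughly the condition the trainer would need to test before calling `accelerator.prepare`, instead of unconditionally wrapping the model in `DistributedDataParallel`.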
@zach-huggingface @SunMarc
| {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38408/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38408/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38407 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38407/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38407/comments | https://api.github.com/repos/huggingface/transformers/issues/38407/events | https://github.com/huggingface/transformers/pull/38407 | 3,094,356,273 | PR_kwDOCUB6oc6XziuY | 38,407 | fix: handle no scheduler passed by user | {
"login": "McPatate",
"id": 9112841,
"node_id": "MDQ6VXNlcjkxMTI4NDE=",
"avatar_url": "https://avatars.githubusercontent.com/u/9112841?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/McPatate",
"html_url": "https://github.com/McPatate",
"followers_url": "https://api.github.com/users/McPatate/followers",
"following_url": "https://api.github.com/users/McPatate/following{/other_user}",
"gists_url": "https://api.github.com/users/McPatate/gists{/gist_id}",
"starred_url": "https://api.github.com/users/McPatate/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/McPatate/subscriptions",
"organizations_url": "https://api.github.com/users/McPatate/orgs",
"repos_url": "https://api.github.com/users/McPatate/repos",
"events_url": "https://api.github.com/users/McPatate/events{/privacy}",
"received_events_url": "https://api.github.com/users/McPatate/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T15:30:20 | 2025-05-30T09:00:47 | 2025-05-30T09:00:45 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38407",
"html_url": "https://github.com/huggingface/transformers/pull/38407",
"diff_url": "https://github.com/huggingface/transformers/pull/38407.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38407.patch",
"merged_at": "2025-05-30T09:00:45"
} | cc @kashif | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38407/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38407/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38406 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38406/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38406/comments | https://api.github.com/repos/huggingface/transformers/issues/38406/events | https://github.com/huggingface/transformers/pull/38406 | 3,094,285,804 | PR_kwDOCUB6oc6XzTT1 | 38,406 | [generate] add soft deprecations on custom generation methods | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T15:07:25 | 2025-06-02T10:11:46 | 2025-06-02T10:11:46 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38406",
"html_url": "https://github.com/huggingface/transformers/pull/38406",
"diff_url": "https://github.com/huggingface/transformers/pull/38406.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38406.patch",
"merged_at": "2025-06-02T10:11:46"
} | # What does this PR do?
On all generation methods except greedy decoding, sampling, beam search, and speculative decoding, adds a soft-deprecation message.
This message asks users to add `trust_remote_code=True` to their `generate` calls before `v4.55.0`, as the plan is to move those generation methods into `custom_generate` methods. Moving those methods into the Hub should massively reduce our CI burden 🧼 🧼 🧼 | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38406/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38406/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38405 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38405/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38405/comments | https://api.github.com/repos/huggingface/transformers/issues/38405/events | https://github.com/huggingface/transformers/pull/38405 | 3,094,190,582 | PR_kwDOCUB6oc6Xy-h0 | 38,405 | Add Dia model | {
"login": "buttercrab",
"id": 34997549,
"node_id": "MDQ6VXNlcjM0OTk3NTQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/34997549?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/buttercrab",
"html_url": "https://github.com/buttercrab",
"followers_url": "https://api.github.com/users/buttercrab/followers",
"following_url": "https://api.github.com/users/buttercrab/following{/other_user}",
"gists_url": "https://api.github.com/users/buttercrab/gists{/gist_id}",
"starred_url": "https://api.github.com/users/buttercrab/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/buttercrab/subscriptions",
"organizations_url": "https://api.github.com/users/buttercrab/orgs",
"repos_url": "https://api.github.com/users/buttercrab/repos",
"events_url": "https://api.github.com/users/buttercrab/events{/privacy}",
"received_events_url": "https://api.github.com/users/buttercrab/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T14:40:47 | 2025-06-26T11:04:56 | 2025-06-26T11:04:23 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38405",
"html_url": "https://github.com/huggingface/transformers/pull/38405",
"diff_url": "https://github.com/huggingface/transformers/pull/38405.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38405.patch",
"merged_at": "2025-06-26T11:04:23"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38405/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38405/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38404 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38404/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38404/comments | https://api.github.com/repos/huggingface/transformers/issues/38404/events | https://github.com/huggingface/transformers/pull/38404 | 3,094,061,091 | PR_kwDOCUB6oc6XyiUu | 38,404 | [cleanup] delete deprecated kwargs in qwen2_audio 🧹 | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T14:00:42 | 2025-05-27T15:08:56 | 2025-05-27T15:08:53 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38404",
"html_url": "https://github.com/huggingface/transformers/pull/38404",
"diff_url": "https://github.com/huggingface/transformers/pull/38404.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38404.patch",
"merged_at": "2025-05-27T15:08:53"
} | # What does this PR do?
Cleanup PR: deletes unused arguments, deprecated in https://github.com/huggingface/transformers/pull/36282 🧹 | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38404/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38404/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38403 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38403/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38403/comments | https://api.github.com/repos/huggingface/transformers/issues/38403/events | https://github.com/huggingface/transformers/issues/38403 | 3,093,928,283 | I_kwDOCUB6oc64aZlb | 38,403 | Qwen2.5-VL-7B-Instruct model keys are different between a saved model and downloaded one | {
"login": "harshit2997",
"id": 17030113,
"node_id": "MDQ6VXNlcjE3MDMwMTEz",
"avatar_url": "https://avatars.githubusercontent.com/u/17030113?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/harshit2997",
"html_url": "https://github.com/harshit2997",
"followers_url": "https://api.github.com/users/harshit2997/followers",
"following_url": "https://api.github.com/users/harshit2997/following{/other_user}",
"gists_url": "https://api.github.com/users/harshit2997/gists{/gist_id}",
"starred_url": "https://api.github.com/users/harshit2997/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/harshit2997/subscriptions",
"organizations_url": "https://api.github.com/users/harshit2997/orgs",
"repos_url": "https://api.github.com/users/harshit2997/repos",
"events_url": "https://api.github.com/users/harshit2997/events{/privacy}",
"received_events_url": "https://api.github.com/users/harshit2997/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-27T13:21:55 | 2025-05-27T13:35:47 | 2025-05-27T13:35:46 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.3
- `bitsandbytes` version: 0.45.5
- `vLLM` version: 0.8.5.post1
- Platform: Linux-5.15.0-1045-azure-x86_64-with-glibc2.35
- Python version: 3.10.16
- Huggingface_hub version: 0.32.1
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA A100 80GB PCIe
### Who can help?
I have been trying to get fast inference on Qwen-2.5-VL-7B-Instruct through vLLM and have been successful. But to get more throughput, I tried saving a bitsandbytes 4-bit quantized model locally and loading it into vLLM. It fails with a key mismatch. To root-cause it, I tried doing the same thing with just loading a transformers model, saving it locally, and then loading it into vLLM. It failed again. Both times the error message was:
```
ERROR 05-27 12:50:03 [core.py:396] File "/home/aiscuser/.conda/envs/slm/lib/python3.10/site-packages/vllm/model_executor/models/qwen2.py", line 405, in load_weights
ERROR 05-27 12:50:03 [core.py:396] param = params_dict[name]
ERROR 05-27 12:50:03 [core.py:396] KeyError: 'language_model.embed_tokens.weight'
```
Investigating more, when I download the model using `huggingface-cli` and check in the cache, the `model.safetensors.index.json` file has different keys as compared to the `model.safetensors.index.json` of the locally saved model (even unquantized).
Some lines from `model.safetensors.index.json` of the model from cache:
```
"lm_head.weight": "model-00005-of-00005.safetensors",
"model.embed_tokens.weight": "model-00001-of-00005.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
"visual.blocks.0.attn.proj.bias": "model-00001-of-00005.safetensors",
"visual.blocks.0.attn.proj.weight": "model-00001-of-00005.safetensors",
"visual.blocks.0.attn.qkv.bias": "model-00001-of-00005.safetensors",
"visual.blocks.0.attn.qkv.weight": "model-00001-of-00005.safetensors",
"visual.blocks.0.mlp.down_proj.bias": "model-00001-of-00005.safetensors",
"visual.blocks.0.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
```
Corresponding lines from `model.safetensors.index.json` of the locally saved model:
```
"lm_head.weight": "model-00004-of-00004.safetensors",
"model.language_model.embed_tokens.weight": "model-00001-of-00004.safetensors",
"model.language_model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
"model.language_model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
"model.language_model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
"model.language_model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
"model.language_model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
"model.visual.blocks.0.attn.proj.bias": "model-00001-of-00004.safetensors",
"model.visual.blocks.0.attn.proj.weight": "model-00001-of-00004.safetensors",
"model.visual.blocks.0.attn.qkv.bias": "model-00001-of-00004.safetensors",
"model.visual.blocks.0.attn.qkv.weight": "model-00001-of-00004.safetensors",
"model.visual.blocks.0.mlp.down_proj.bias": "model-00001-of-00004.safetensors",
"model.visual.blocks.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
```
This is also probably why vLLM can load the model when I just give it the HF model name but cannot load it from the locally saved model. But my question is why is this happening? Why is the locally saved model different?
Code used to locally save the unquantized model:
```
import torch
from transformers import Qwen2_5_VLForConditionalGeneration, AutoTokenizer, AutoProcessor, BitsAndBytesConfig
model_path = "Qwen/Qwen2.5-VL-7B-Instruct"
DEVICE = "cuda:0"
processor = AutoProcessor.from_pretrained(model_path, trust_remote_code=True, min_pixels=1024*28*28, use_fast=True)
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
model_path,
device_map=DEVICE,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
).to(DEVICE)
processor.tokenizer.padding_side = 'left'
model.save_pretrained("./qwen7b")
processor.save_pretrained("./qwen7b")
```
Code used to save the quantized model:
```
import torch
from transformers import Qwen2_5_VLForConditionalGeneration, AutoTokenizer, AutoProcessor, BitsAndBytesConfig
model_path = "Qwen/Qwen2.5-VL-7B-Instruct"
DEVICE = "cuda:0"
processor = AutoProcessor.from_pretrained(model_path, trust_remote_code=True, min_pixels=1024*28*28, use_fast=True)
quantization_config = BitsAndBytesConfig(
load_in_4bit=True,
bnb_4bit_use_double_quant=True,
bnb_4bit_quant_type="nf4",
bnb_4bit_compute_dtype=torch.bfloat16,
)
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
model_path,
device_map=DEVICE,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
quantization_config=quantization_config
).to(DEVICE)
processor.tokenizer.padding_side = 'left'
model.save_pretrained("./qwen7b_bnb4")
processor.save_pretrained("./qwen7b_bnb4")
```
Code used to load the model into vLLM
```
import torch
from transformers import AutoProcessor
from vllm import LLM, SamplingParams
model_path = "/home/aiscuser/qwen7b"
# model_path = "/home/aiscuser/qwen7b_bnb4"
processor = AutoProcessor.from_pretrained(model_path, trust_remote_code=True, min_pixels=1024*28*28, use_fast=True)
model = LLM(
model=model_path,
tokenizer_mode="auto",
limit_mm_per_prompt={"image": 1, "video": 0},
trust_remote_code=True,
tensor_parallel_size=1,
# quantization="bitsandbytes" # tried with and without this. Error is on non-quantization weights
)
```
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Steps to repro:
1. Run the script to save the model locally. Just running the one for unquantized should be fine.
2. Now run the script to try loading the model into vLLM.
### Expected behavior
Model keys should be same between downloaded and locally saved model. Subsequently, vLLM should load the model correctly. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38403/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38403/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38402 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38402/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38402/comments | https://api.github.com/repos/huggingface/transformers/issues/38402/events | https://github.com/huggingface/transformers/issues/38402 | 3,093,921,609 | I_kwDOCUB6oc64aX9J | 38,402 | Version 4.52.3 leads to error after bundling with pyinstaller | {
"login": "HiokKuek",
"id": 128700697,
"node_id": "U_kgDOB6vRGQ",
"avatar_url": "https://avatars.githubusercontent.com/u/128700697?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HiokKuek",
"html_url": "https://github.com/HiokKuek",
"followers_url": "https://api.github.com/users/HiokKuek/followers",
"following_url": "https://api.github.com/users/HiokKuek/following{/other_user}",
"gists_url": "https://api.github.com/users/HiokKuek/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HiokKuek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HiokKuek/subscriptions",
"organizations_url": "https://api.github.com/users/HiokKuek/orgs",
"repos_url": "https://api.github.com/users/HiokKuek/repos",
"events_url": "https://api.github.com/users/HiokKuek/events{/privacy}",
"received_events_url": "https://api.github.com/users/HiokKuek/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-27T13:20:00 | 2025-07-31T15:40:04 | 2025-07-20T08:02:38 | NONE | null | null | null | null | ### System Info
The new version (4.52.3) causes the following error when running a PyInstaller-bundled Python script that imports from the transformers library.
```
File "functions\classify.py", line 1, in <module>
from transformers import DistilBertTokenizer, DistilBertForSequenceClassification
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1412, in _handle_fromlist
File "transformers\utils\import_utils.py", line 2045, in __getattr__
File "transformers\utils\import_utils.py", line 2075, in _get_module
File "transformers\utils\import_utils.py", line 2073, in _get_module
File "importlib\__init__.py", line 90, in import_module
File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
File "PyInstaller\loader\pyimod02_importers.py", line 450, in exec_module
File "transformers\models\__init__.py", line 341, in <module>
File "transformers\utils\import_utils.py", line 2594, in define_import_structure
File "transformers\utils\import_utils.py", line 2305, in create_import_structure_from_path
FileNotFoundError: [WinError 3] The system cannot find the path specified: 'C:\\Users\\ernest\\Desktop\\URL-Extractor-Extension\\native_messaging_genai\\dist\\main\\_internal\\transformers\\models\\__init__.pyc'
[PYI-9628:ERROR] Failed to execute script 'main' due to unhandled exception!
```
Error goes away after downgrading to v4.51.3
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
1. Write a Python script that uses transformers, in my case `from transformers import AutoTokenizer, AutoModelForSequenceClassification`
2. Bundle it with PyInstaller
3. Run the resulting .exe file
### Expected behavior
The bundled executable should run the script normally (a native messaging host application in my case).
Instead, with the recent update, the application crashes with the error above when you run the .exe.
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38402/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38402/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38401 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38401/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38401/comments | https://api.github.com/repos/huggingface/transformers/issues/38401/events | https://github.com/huggingface/transformers/pull/38401 | 3,093,709,313 | PR_kwDOCUB6oc6XxV4S | 38,401 | Add report_repo_id to mi300 workflow | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T12:11:59 | 2025-05-27T14:35:09 | 2025-05-27T14:35:07 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38401",
"html_url": "https://github.com/huggingface/transformers/pull/38401",
"diff_url": "https://github.com/huggingface/transformers/pull/38401.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38401.patch",
"merged_at": "2025-05-27T14:35:07"
} | null | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38401/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38401/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38400 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38400/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38400/comments | https://api.github.com/repos/huggingface/transformers/issues/38400/events | https://github.com/huggingface/transformers/pull/38400 | 3,093,667,146 | PR_kwDOCUB6oc6XxMto | 38,400 | [WIP] Tokenizer Refactor | {
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github.com/users/itazap/followers",
"following_url": "https://api.github.com/users/itazap/following{/other_user}",
"gists_url": "https://api.github.com/users/itazap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itazap/subscriptions",
"organizations_url": "https://api.github.com/users/itazap/orgs",
"repos_url": "https://api.github.com/users/itazap/repos",
"events_url": "https://api.github.com/users/itazap/events{/privacy}",
"received_events_url": "https://api.github.com/users/itazap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-27T11:55:54 | 2025-05-27T13:26:29 | null | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38400",
"html_url": "https://github.com/huggingface/transformers/pull/38400",
"diff_url": "https://github.com/huggingface/transformers/pull/38400.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38400.patch",
"merged_at": null
} | ❗DRAFT PR DONT REVIEW❗
There is lots of duplicate code for `create_token_type_ids_from_sequences`; we can refactor by updating the base function!
However I think it should also be internalized (see https://github.com/huggingface/transformers/pull/37522) as it's only used internally by `prepare_for_model` for **slow tokenizers only**.
Questions:
- are users supposed to be using these functions directly in any case?
_internalize prepare_for_model and build_inputs_with_special_tokens:_
users should encode using `__call__`:
slow case
`__call__` --> calls `prepare_for_model` --> which calls `build_inputs_with_special_tokens`.
fast case
`__call__` --> calls the Rust tokenizer's `encode` --> does not call `build_inputs_with_special_tokens` or `prepare_for_model`
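The two call paths above can be sketched with toy classes (purely illustrative — names and token ids are made up, this is not the real transformers implementation) to show why only the slow path ever reaches `build_inputs_with_special_tokens`:

```python
# Toy sketch of the slow vs. fast encoding paths described above.
CLS, SEP = 101, 102  # illustrative special-token ids

class SlowTokenizer:
    def build_inputs_with_special_tokens(self, ids):
        return [CLS] + ids + [SEP]

    def prepare_for_model(self, ids):
        # slow path: __call__ -> prepare_for_model -> build_inputs_with_special_tokens
        return {"input_ids": self.build_inputs_with_special_tokens(ids)}

    def __call__(self, ids):
        return self.prepare_for_model(ids)

class FastTokenizer:
    def __call__(self, ids):
        # fast path: the Rust backend adds special tokens itself, so neither
        # prepare_for_model nor build_inputs_with_special_tokens is called
        return {"input_ids": [CLS] + ids + [SEP]}

slow_out = SlowTokenizer()([7, 8])
fast_out = FastTokenizer()([7, 8])
print(slow_out == fast_out)  # both produce {'input_ids': [101, 7, 8, 102]}
```

Since both paths produce the same encoding, testing them through `__call__` exercises the helper methods without needing to expose them publicly.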
Exposing these makes it unclear which method is correct to use for encoding, and adds redundancy to maintain. We already test these under the hood when testing `__call__`, `encode`, `batch_encode`, etc., so IMO it is safe to remove the `build_inputs_with_special_tokens` tests, at least from all fast files, and rename `build_inputs_with_special_tokens` --> `_build_inputs_with_special_tokens` in slow.
only modified llama fast for now but can add a commit with all fast files edited
also updated the old `language_modeling.py` file, not sure if it is still relevant in general
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38400/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38400/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38399 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38399/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38399/comments | https://api.github.com/repos/huggingface/transformers/issues/38399/events | https://github.com/huggingface/transformers/pull/38399 | 3,093,638,483 | PR_kwDOCUB6oc6XxGa4 | 38,399 | [generate] move `SinkCache` to a `custom_generate` repo | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T11:44:44 | 2025-06-02T10:13:30 | 2025-06-02T10:13:30 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38399",
"html_url": "https://github.com/huggingface/transformers/pull/38399",
"diff_url": "https://github.com/huggingface/transformers/pull/38399.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38399.patch",
"merged_at": "2025-06-02T10:13:30"
} | # What does this PR do?
Moves the deprecated `SinkCache` to a `custom_generate` repo: https://huggingface.co/transformers-community/sink_cache
Users importing `SinkCache` will see no issue for now. Users using `SinkCache` will get a message redirecting them to the Hub repo.
### Hub usage example:
```py
# requires `transformers>=4.52.0`
from transformers import AutoModelForCausalLM, AutoTokenizer
# Preparing model, tokenizer, and model inputs
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-0.6B")
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-0.6B", device_map="auto")
messages = [{"role": "user", "content": "Tell me a story about a cat."}]
text = tokenizer.apply_chat_template(
messages,
tokenize=False,
add_generation_prompt=True,
enable_thinking=False
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)
# Using sink cache
gen_out = model.generate(
# usual `generate` arguments
**model_inputs,
do_sample=False,
max_new_tokens=100,
return_dict_in_generate=True,
# sink cache arguments (default `window_length=256`)
custom_generate="transformers-community/sink_cache",
trust_remote_code=True,
)
print(tokenizer.batch_decode(gen_out.sequences, skip_special_tokens=True))
assert "sinkcache" in str(type(gen_out.past_key_values)).lower()
# ['user\nTell me a story about a cat.\nassistant\n<think>\n\n</think>\n\nOnce upon a time, in a cozy village nestled
# between rolling hills and a sparkling lake, there lived a cat named Luna. Luna was small and fluffy, with a curious
# eyes that sparkled with wonder. She had a soft, warm coat that shimmered like the morning sun, and her tail was
# always wagging in playful motions.\n\nOne day, while exploring the village, Luna noticed a curious sight: a young
# boy playing with a ball on the lake. She followed him closely, her heart racing']
``` | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38399/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38399/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38398 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38398/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38398/comments | https://api.github.com/repos/huggingface/transformers/issues/38398/events | https://github.com/huggingface/transformers/pull/38398 | 3,093,347,432 | PR_kwDOCUB6oc6XwGgs | 38,398 | Trigger doc-builder job after style bot | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T09:54:50 | 2025-05-28T15:15:36 | 2025-05-28T15:15:34 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38398",
"html_url": "https://github.com/huggingface/transformers/pull/38398",
"diff_url": "https://github.com/huggingface/transformers/pull/38398.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38398.patch",
"merged_at": "2025-05-28T15:15:34"
} | # What does this PR do?
If the style bot pushes a commit (fixing a style issue), it won't trigger the doc-builder afterward.
In order to make it trigger, we could use the `workflow_call` event, but we need to pass the safe commit (i.e. the new commit sha of the style fix) to prevent the attack. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38398/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38398/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38397 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38397/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38397/comments | https://api.github.com/repos/huggingface/transformers/issues/38397/events | https://github.com/huggingface/transformers/pull/38397 | 3,093,229,598 | PR_kwDOCUB6oc6XvsvK | 38,397 | guard size mismatch check to only quantized models | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T09:12:52 | 2025-05-27T09:45:05 | 2025-05-27T09:45:03 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38397",
"html_url": "https://github.com/huggingface/transformers/pull/38397",
"diff_url": "https://github.com/huggingface/transformers/pull/38397.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38397.patch",
"merged_at": "2025-05-27T09:45:03"
} | # What does this PR do?
This PR fixes https://github.com/huggingface/transformers/issues/36960
The issue is that a check that was only supposed to be done on quantized models was being done on all models.
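An illustrative sketch of the guard idea (function and key names are made up here, not the actual transformers internals): the strict size-mismatch check only applies to quantized models, since non-quantized loads may legitimately resize weights when `ignore_mismatched_sizes=True`.

```python
# Only run the strict size-mismatch check when the model is quantized.
def find_mismatched_keys(ckpt_shapes, model_shapes, is_quantized):
    if not is_quantized:
        # non-quantized models may resize weights via ignore_mismatched_sizes,
        # so skip the strict check entirely
        return []
    return [
        key
        for key, shape in ckpt_shapes.items()
        if key in model_shapes and model_shapes[key] != shape
    ]

ckpt = {"mask_embedder.weight": (256, 256)}
model = {"mask_embedder.weight": (512, 512)}
print(find_mismatched_keys(ckpt, model, is_quantized=False))  # []
print(find_mismatched_keys(ckpt, model, is_quantized=True))   # ['mask_embedder.weight']
```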
### To reproduce
```python
from transformers import Mask2FormerConfig, Mask2FormerForUniversalSegmentation
model_config = Mask2FormerConfig(mask_feature_size=512)
model = Mask2FormerForUniversalSegmentation.from_pretrained(
pretrained_model_name_or_path="facebook/mask2former-swin-tiny-coco-instance",
config=model_config,
ignore_mismatched_sizes=True,
)
``` | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38397/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38397/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38396 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38396/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38396/comments | https://api.github.com/repos/huggingface/transformers/issues/38396/events | https://github.com/huggingface/transformers/issues/38396 | 3,092,555,337 | I_kwDOCUB6oc64VKZJ | 38,396 | Can I disable all CI works in my forked version of Transformers? | {
"login": "ChengLyu",
"id": 5308679,
"node_id": "MDQ6VXNlcjUzMDg2Nzk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5308679?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ChengLyu",
"html_url": "https://github.com/ChengLyu",
"followers_url": "https://api.github.com/users/ChengLyu/followers",
"following_url": "https://api.github.com/users/ChengLyu/following{/other_user}",
"gists_url": "https://api.github.com/users/ChengLyu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ChengLyu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ChengLyu/subscriptions",
"organizations_url": "https://api.github.com/users/ChengLyu/orgs",
"repos_url": "https://api.github.com/users/ChengLyu/repos",
"events_url": "https://api.github.com/users/ChengLyu/events{/privacy}",
"received_events_url": "https://api.github.com/users/ChengLyu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T04:44:07 | 2025-05-28T18:06:31 | 2025-05-28T14:49:55 | CONTRIBUTOR | null | null | null | null | After I synced the `main` branch of Transformers in my fork, GitHub keeps running CI workflows and they fail. Can I disable them? Thanks. | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38396/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38396/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38395 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38395/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38395/comments | https://api.github.com/repos/huggingface/transformers/issues/38395/events | https://github.com/huggingface/transformers/pull/38395 | 3,092,331,528 | PR_kwDOCUB6oc6Xsp4U | 38,395 | Fix convert Phi4MM to HF Format Script | {
"login": "HkFromMY",
"id": 48499555,
"node_id": "MDQ6VXNlcjQ4NDk5NTU1",
"avatar_url": "https://avatars.githubusercontent.com/u/48499555?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HkFromMY",
"html_url": "https://github.com/HkFromMY",
"followers_url": "https://api.github.com/users/HkFromMY/followers",
"following_url": "https://api.github.com/users/HkFromMY/following{/other_user}",
"gists_url": "https://api.github.com/users/HkFromMY/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HkFromMY/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HkFromMY/subscriptions",
"organizations_url": "https://api.github.com/users/HkFromMY/orgs",
"repos_url": "https://api.github.com/users/HkFromMY/repos",
"events_url": "https://api.github.com/users/HkFromMY/events{/privacy}",
"received_events_url": "https://api.github.com/users/HkFromMY/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T01:49:36 | 2025-05-30T13:52:51 | 2025-05-28T13:22:05 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38395",
"html_url": "https://github.com/huggingface/transformers/pull/38395",
"diff_url": "https://github.com/huggingface/transformers/pull/38395.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38395.patch",
"merged_at": null
} | # What does this PR do?
Fixes
- Loading all `lora_B` weights as 0 due to a difference in the adapter key names.
Using the script below to load the converted weights
```python
model = AutoModelForCausalLM.from_pretrained(CONVERTED_MODEL_PATH, trust_remote_code=True)
model = PeftModel.from_pretrained(model, CONVERTED_LORA_PATH_SPEECH).cuda()
count = 0
lora_params = { n: p for n, p in model.named_parameters() if 'lora_B' in n }
for n, p in lora_params.items():
count += 1
print(n, p.sum())
print(f"Lora_B count: {count}")
```
will produce the following output
```python
base_model.model.model.layers.30.mlp.gate_up_proj.lora_B.default.weight tensor(0., device='cuda:0')
base_model.model.model.layers.30.mlp.down_proj.lora_B.default.weight tensor(0., device='cuda:0')
base_model.model.model.layers.31.self_attn.o_proj.lora_B.default.weight tensor(0., device='cuda:0')
base_model.model.model.layers.31.self_attn.qkv_proj.lora_B.default.weight tensor(0., device='cuda:0')
base_model.model.model.layers.31.mlp.gate_up_proj.lora_B.default.weight tensor(0., device='cuda:0')
base_model.model.model.layers.31.mlp.down_proj.lora_B.default.weight tensor(0., device='cuda:0')
```
By appending `base_model.model.` in front of the adapter keys' names, HF can load the adapter weights correctly for downstream finetuning, etc. Please note that the `lora_A` weights are loaded correctly, but the `lora_B` weights all have values of 0 in both `speech-lora` and `vision-lora`.
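A hypothetical sketch of the renaming described above (the helper name is made up for illustration): prefix the adapter state-dict keys with `base_model.model.` so they line up with the module names PEFT expects on the wrapped model.

```python
# Prefix adapter state-dict keys so PEFT can match them to the wrapped model.
PREFIX = "base_model.model."

def rename_adapter_keys(state_dict):
    # idempotent: keys that already carry the prefix are left untouched
    return {
        (key if key.startswith(PREFIX) else PREFIX + key): value
        for key, value in state_dict.items()
    }

adapter = {"model.layers.31.mlp.down_proj.lora_B.weight": [0.0, 0.0]}
renamed = rename_adapter_keys(adapter)
print(list(renamed))  # ['base_model.model.model.layers.31.mlp.down_proj.lora_B.weight']
```

With keys renamed this way, the weights resolve to the right modules instead of silently falling back to zero-initialized `lora_B` tensors.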
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38395/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38395/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38394 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38394/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38394/comments | https://api.github.com/repos/huggingface/transformers/issues/38394/events | https://github.com/huggingface/transformers/pull/38394 | 3,092,306,205 | PR_kwDOCUB6oc6Xskqt | 38,394 | Updated model card for OLMo2 | {
"login": "andyvu923",
"id": 204499861,
"node_id": "U_kgDODDBrlQ",
"avatar_url": "https://avatars.githubusercontent.com/u/204499861?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/andyvu923",
"html_url": "https://github.com/andyvu923",
"followers_url": "https://api.github.com/users/andyvu923/followers",
"following_url": "https://api.github.com/users/andyvu923/following{/other_user}",
"gists_url": "https://api.github.com/users/andyvu923/gists{/gist_id}",
"starred_url": "https://api.github.com/users/andyvu923/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/andyvu923/subscriptions",
"organizations_url": "https://api.github.com/users/andyvu923/orgs",
"repos_url": "https://api.github.com/users/andyvu923/repos",
"events_url": "https://api.github.com/users/andyvu923/events{/privacy}",
"received_events_url": "https://api.github.com/users/andyvu923/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-27T01:26:43 | 2025-05-28T00:29:41 | 2025-05-27T23:24:36 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38394",
"html_url": "https://github.com/huggingface/transformers/pull/38394",
"diff_url": "https://github.com/huggingface/transformers/pull/38394.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38394.patch",
"merged_at": "2025-05-27T23:24:36"
} | # What does this PR do?
#36979
This PR updates the model card for OLMo2 using the template provided in #36979.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. https://github.com/huggingface/transformers/issues/36979
## Who can review?
@stevhliu please let me know if any changes need to be made, thank you!
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38394/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38394/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38393 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38393/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38393/comments | https://api.github.com/repos/huggingface/transformers/issues/38393/events | https://github.com/huggingface/transformers/issues/38393 | 3,092,295,270 | I_kwDOCUB6oc64UK5m | 38,393 | PaliGemmaProcessor fails due to missing return_tensors in tokenizer call | {
"login": "sergiosgatidis",
"id": 52936169,
"node_id": "MDQ6VXNlcjUyOTM2MTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/52936169?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sergiosgatidis",
"html_url": "https://github.com/sergiosgatidis",
"followers_url": "https://api.github.com/users/sergiosgatidis/followers",
"following_url": "https://api.github.com/users/sergiosgatidis/following{/other_user}",
"gists_url": "https://api.github.com/users/sergiosgatidis/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sergiosgatidis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sergiosgatidis/subscriptions",
"organizations_url": "https://api.github.com/users/sergiosgatidis/orgs",
"repos_url": "https://api.github.com/users/sergiosgatidis/repos",
"events_url": "https://api.github.com/users/sergiosgatidis/events{/privacy}",
"received_events_url": "https://api.github.com/users/sergiosgatidis/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-27T01:16:34 | 2025-05-27T09:32:26 | 2025-05-27T09:32:25 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.3
- Platform: Linux-6.8.0-59-generic-x86_64-with-glibc2.39
- Python version: 3.10.16
- Huggingface_hub version: 0.32.1
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu118 (True)
- Tensorflow version (GPU?): 2.19.0 (True)
- Flax version (CPU?/GPU?/TPU?): 0.10.6 (cpu)
- Jax version: 0.6.0
- JaxLib version: 0.6.0
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: yes
- GPU type: NVIDIA RTX 6000 Ada Generation
### Who can help?
@ArthurZucker and @itazap
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
**Summary:**
When using `PaliGemmaProcessor` for multimodal fine-tuning with a suffix, the processor crashes with:
`AttributeError: 'list' object has no attribute 'masked_fill'`
This happens because `return_tensors="pt"` is not passed to the tokenizer internally. As a result, the tokenizer returns Python lists for `input_ids` and `token_type_ids`, and the processor assumes they're tensors, leading to a crash at:
`inputs["input_ids"].masked_fill(inputs["token_type_ids"] == 0, -100)`
Example:
````
import PIL.Image

from transformers import PaliGemmaForConditionalGeneration, PaliGemmaProcessor
model_id = 'google/paligemma2-3b-pt-224'
processor = PaliGemmaProcessor.from_pretrained(model_id)
examples = [
{
"prefix": "caption <loc0412><loc0269><loc0644><loc0546><seg015>",
"suffix": "RML",
"image": PIL.Image.new("RGB", (224, 224)),
},
{
"prefix": "detect Left Fourth Rib",
"suffix": "<loc0234><loc0621><loc0495><loc0796> Left Fourth Rib",
"image": PIL.Image.new("RGB", (224, 224)),
}
]
texts = ["<image>" + ex["prefix"] for ex in examples]
labels = [ex["suffix"] for ex in examples]
images = [ex["image"] for ex in examples]
tokens = processor(
text=texts,
images=images,
suffix=labels,
return_tensors="pt",
padding="longest"
)
````
This raises:
`AttributeError: 'list' object has no attribute 'masked_fill'`
**Proposed Fix:**
In the `__call__` method of `PaliGemmaProcessor`, the `return_tensors` argument is popped from `text_kwargs`:
`return_tensors = output_kwargs["text_kwargs"].pop("return_tensors", None)`
…but it is never passed to `self.tokenizer(...)`. Adding this argument to the tokenizer call may fix the issue:
`return_tensors=return_tensors,`
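The failure mode described above can be sketched without transformers at all. Below is a toy illustration (all helper names are hypothetical, not the actual library code): a wrapper pops `return_tensors` from its kwargs but never forwards it, so the inner tokenizer falls back to returning plain Python lists.

```python
# Toy illustration of the reported bug pattern (hypothetical helpers, not the
# actual transformers code): a wrapper pops `return_tensors` from its kwargs
# but never forwards it, so the inner tokenizer returns plain Python lists.

def toy_tokenizer(text, return_tensors=None):
    ids = [ord(c) % 100 for c in text]
    # Stand-in for tensor conversion: a tuple marks "tensor-like" output here.
    return tuple(ids) if return_tensors == "pt" else ids

def buggy_call(text, **kwargs):
    kwargs.pop("return_tensors", None)  # popped but never forwarded
    return toy_tokenizer(text, **kwargs)

def fixed_call(text, **kwargs):
    return_tensors = kwargs.pop("return_tensors", None)
    return toy_tokenizer(text, return_tensors=return_tensors, **kwargs)

# The buggy path yields a list (which has no .masked_fill); the fixed path
# yields the tensor-like type the downstream code expects.
assert isinstance(buggy_call("hi", return_tensors="pt"), list)
assert isinstance(fixed_call("hi", return_tensors="pt"), tuple)
```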
### Expected behavior
The processor should correctly pass `return_tensors="pt"` to the tokenizer so that all fields (e.g., `input_ids`, `token_type_ids`) are returned as PyTorch tensors, allowing downstream tensor operations like `.masked_fill()` to work without errors. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38393/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38393/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38392 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38392/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38392/comments | https://api.github.com/repos/huggingface/transformers/issues/38392/events | https://github.com/huggingface/transformers/issues/38392 | 3,092,263,805 | I_kwDOCUB6oc64UDN9 | 38,392 | "THUDM/glm-4-0414-9b-chat" is 404, but needed for Glm4IntegrationTest UT | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-27T00:43:14 | 2025-05-28T16:27:47 | 2025-05-28T16:27:33 | CONTRIBUTOR | null | null | null | null | ### System Info
N/A
### Who can help?
@ydshieh
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
As the title suggests, ["THUDM/glm-4-0414-9b-chat"](https://huggingface.co/THUDM/glm-4-0414-9b-chat) returns a 404, but it is needed for the Glm4IntegrationTest UT. Thanks.
### Expected behavior
load model successfully | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38392/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38392/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38391 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38391/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38391/comments | https://api.github.com/repos/huggingface/transformers/issues/38391/events | https://github.com/huggingface/transformers/pull/38391 | 3,092,216,630 | PR_kwDOCUB6oc6XsSbi | 38,391 | Add ColQwen2.5 to transformers 🤗 | {
"login": "qnguyen3",
"id": 42907738,
"node_id": "MDQ6VXNlcjQyOTA3NzM4",
"avatar_url": "https://avatars.githubusercontent.com/u/42907738?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qnguyen3",
"html_url": "https://github.com/qnguyen3",
"followers_url": "https://api.github.com/users/qnguyen3/followers",
"following_url": "https://api.github.com/users/qnguyen3/following{/other_user}",
"gists_url": "https://api.github.com/users/qnguyen3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qnguyen3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qnguyen3/subscriptions",
"organizations_url": "https://api.github.com/users/qnguyen3/orgs",
"repos_url": "https://api.github.com/users/qnguyen3/repos",
"events_url": "https://api.github.com/users/qnguyen3/events{/privacy}",
"received_events_url": "https://api.github.com/users/qnguyen3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T23:50:28 | 2025-08-21T03:18:10 | 2025-06-04T16:47:40 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38391",
"html_url": "https://github.com/huggingface/transformers/pull/38391",
"diff_url": "https://github.com/huggingface/transformers/pull/38391.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38391.patch",
"merged_at": null
} | # What does this PR do?
Add ColQwen2_5 to 🤗 transformers.
## Who can review?
- @yonigozlan
- @ArthurZucker
- @Cyrilvallez
## Additional details
The newly converted model weights are stored in qnguyen3/colqwen2_5-v0.2-hf.
## Progress checklist
- [x] Created script that successfully runs the forward() pass using the original repository and checkpoint
- [x] Successfully added the model skeleton to 🤗 Transformers
- [x] Successfully converted original checkpoint to 🤗 Transformers checkpoint
- [x] Successfully ran forward() pass in 🤗 Transformers that gives identical output to original checkpoint
- [x] Finished model tests in 🤗 Transformers
- [x] Successfully added tokenizer in 🤗 Transformers
- [x] Run end-to-end integration tests
- [x] Finished docs
- [x] Uploaded model weights to the Hub
- [x] Submitted the pull request
The implementation is production-ready and maintains full compatibility with the original ColQwen2.5 functionality. Looking forward to getting this merged! 🚀 | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38391/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38391/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38390 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38390/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38390/comments | https://api.github.com/repos/huggingface/transformers/issues/38390/events | https://github.com/huggingface/transformers/pull/38390 | 3,092,139,829 | PR_kwDOCUB6oc6XsCRV | 38,390 | add colqwen2_5 to transformers 🤗 | {
"login": "qnguyen3",
"id": 42907738,
"node_id": "MDQ6VXNlcjQyOTA3NzM4",
"avatar_url": "https://avatars.githubusercontent.com/u/42907738?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qnguyen3",
"html_url": "https://github.com/qnguyen3",
"followers_url": "https://api.github.com/users/qnguyen3/followers",
"following_url": "https://api.github.com/users/qnguyen3/following{/other_user}",
"gists_url": "https://api.github.com/users/qnguyen3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qnguyen3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qnguyen3/subscriptions",
"organizations_url": "https://api.github.com/users/qnguyen3/orgs",
"repos_url": "https://api.github.com/users/qnguyen3/repos",
"events_url": "https://api.github.com/users/qnguyen3/events{/privacy}",
"received_events_url": "https://api.github.com/users/qnguyen3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T22:24:58 | 2025-05-26T23:32:09 | 2025-05-26T23:32:09 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38390",
"html_url": "https://github.com/huggingface/transformers/pull/38390",
"diff_url": "https://github.com/huggingface/transformers/pull/38390.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38390.patch",
"merged_at": null
} | # What does this PR do?
Add ColQwen2_5 to 🤗 transformers.
## Who can review?
- @yonigozlan
- @ArthurZucker
- @Cyrilvallez
## Additional details
- The newly converted model weights are stored in `qnguyen3/colqwen2_5-v0.2-hf`.
## Progress checklist
- [x] Created script that successfully runs the forward() pass using the original repository and checkpoint
- [x] Successfully added the model skeleton to 🤗 Transformers
- [x] Successfully converted original checkpoint to 🤗 Transformers checkpoint
- [x] Successfully ran forward() pass in 🤗 Transformers that gives identical output to original checkpoint
- [x] Finished model tests in 🤗 Transformers
- [x] Successfully added tokenizer in 🤗 Transformers
- [x] Run end-to-end integration tests
- [x] Finished docs
- [x] Uploaded model weights to the Hub
- [x] Submitted the pull request
The implementation is production-ready and maintains full compatibility with the original ColQwen2.5 functionality. Looking forward to getting this merged! 🚀 | {
"login": "qnguyen3",
"id": 42907738,
"node_id": "MDQ6VXNlcjQyOTA3NzM4",
"avatar_url": "https://avatars.githubusercontent.com/u/42907738?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qnguyen3",
"html_url": "https://github.com/qnguyen3",
"followers_url": "https://api.github.com/users/qnguyen3/followers",
"following_url": "https://api.github.com/users/qnguyen3/following{/other_user}",
"gists_url": "https://api.github.com/users/qnguyen3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qnguyen3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qnguyen3/subscriptions",
"organizations_url": "https://api.github.com/users/qnguyen3/orgs",
"repos_url": "https://api.github.com/users/qnguyen3/repos",
"events_url": "https://api.github.com/users/qnguyen3/events{/privacy}",
"received_events_url": "https://api.github.com/users/qnguyen3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38390/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38390/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38389 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38389/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38389/comments | https://api.github.com/repos/huggingface/transformers/issues/38389/events | https://github.com/huggingface/transformers/pull/38389 | 3,092,077,693 | PR_kwDOCUB6oc6Xr1IS | 38,389 | Fix MedGemma torch compilation graph breaks in PaliGemma base class | {
"login": "rahulrshetty45",
"id": 209668615,
"node_id": "U_kgDODH9KBw",
"avatar_url": "https://avatars.githubusercontent.com/u/209668615?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rahulrshetty45",
"html_url": "https://github.com/rahulrshetty45",
"followers_url": "https://api.github.com/users/rahulrshetty45/followers",
"following_url": "https://api.github.com/users/rahulrshetty45/following{/other_user}",
"gists_url": "https://api.github.com/users/rahulrshetty45/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rahulrshetty45/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rahulrshetty45/subscriptions",
"organizations_url": "https://api.github.com/users/rahulrshetty45/orgs",
"repos_url": "https://api.github.com/users/rahulrshetty45/repos",
"events_url": "https://api.github.com/users/rahulrshetty45/events{/privacy}",
"received_events_url": "https://api.github.com/users/rahulrshetty45/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-26T21:21:58 | 2025-05-28T16:18:07 | 2025-05-28T16:18:07 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38389",
"html_url": "https://github.com/huggingface/transformers/pull/38389",
"diff_url": "https://github.com/huggingface/transformers/pull/38389.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38389.patch",
"merged_at": null
} | # Fix MedGemma torch compilation graph breaks
## What does this PR do?
This PR fixes a critical issue where MedGemma models (and other models inheriting from PaliGemma) fail to compile with `torch.compile()` due to graph breaks when accessing `_attn_implementation` during compilation.
## Problem
- **Issue**: #38333
- **Error**: Graph breaks occur in torch 2.7.0 when `torch.compile()` is used with MedGemma models
- **Root Cause**: The `_update_causal_mask` method in PaliGemma accesses `self.config.text_config._attn_implementation`, which causes graph breaks during torch compilation
- **Impact**: Users cannot use torch compilation with MedGemma models, significantly impacting performance
## Solution
### 🔧 **Changes Made**
1. **Modified PaliGemma._update_causal_mask method**:
- Added `is_torchdynamo_compiling()` check before accessing `_attn_implementation`
- This prevents the property access during torch compilation that was causing graph breaks
- Maintains full backward compatibility and existing functionality
### 📝 **Code Changes**
```python
# Before
if self.config.text_config._attn_implementation == "flash_attention_2":
# After
# Avoid accessing _attn_implementation during torch compilation to prevent graph breaks
if not is_torchdynamo_compiling() and self.config.text_config._attn_implementation == "flash_attention_2":
```
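The effect of the guard can be shown in isolation. The following minimal stand-in (hypothetical names, no torch required) mimics how the short-circuit skips the config-attribute access when a "compiling" flag is set:

```python
# Minimal stand-in for the guard (hypothetical names, no torch required):
# when a "compiling" flag is set, the boolean short-circuit returns before
# touching the config attribute, mirroring how is_torchdynamo_compiling()
# is used in the change above.

class Config:
    def __init__(self):
        self.lookups = 0

    @property
    def attn_implementation(self):
        self.lookups += 1  # the attribute access we want to avoid while compiling
        return "flash_attention_2"

def update_causal_mask(config, compiling):
    if not compiling and config.attn_implementation == "flash_attention_2":
        return "flash-attention path"
    return "default path"

cfg = Config()
assert update_causal_mask(cfg, compiling=False) == "flash-attention path"
assert cfg.lookups == 1
assert update_causal_mask(cfg, compiling=True) == "default path"
assert cfg.lookups == 1  # short-circuit skipped the attribute access
```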
## Testing
- ✅ Verified that PaliGemma models can be instantiated without errors
- ✅ Confirmed that the `_update_causal_mask` method works correctly in both compilation and non-compilation contexts
- ✅ Validated that existing functionality is preserved
- ✅ Checked that the fix prevents graph breaks during torch compilation
## Impact
- **Fixes**: MedGemma torch compilation issues
- **Affects**: All models inheriting from PaliGemma (including MedGemma, Gemma3 vision models)
- **Backward Compatibility**: ✅ Fully maintained
- **Performance**: ✅ Enables torch compilation for significant performance improvements
## Related Issues
- Fixes #38333
## Checklist
- [x] This PR fixes a bug
- [x] The fix is minimal and targeted
- [x] No breaking changes introduced
- [x] Backward compatibility maintained
- [x] Code follows existing patterns in the codebase
- [x] Fix addresses the root cause, not just symptoms
## Additional Notes
This is a **permanent fix** that addresses the root cause of the compilation issue, unlike temporary workarounds. The solution follows the established pattern in the transformers codebase of checking `is_torchdynamo_compiling()` to avoid operations that cause graph breaks during compilation.
The fix is minimal and safe, and it maintains full compatibility while enabling torch compilation for MedGemma and related models. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38389/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38389/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38388 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38388/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38388/comments | https://api.github.com/repos/huggingface/transformers/issues/38388/events | https://github.com/huggingface/transformers/pull/38388 | 3,091,972,623 | PR_kwDOCUB6oc6XrfEK | 38,388 | Fix Whisper inference regression with backward-compatible logprob calculation | {
"login": "rahulrshetty45",
"id": 209668615,
"node_id": "U_kgDODH9KBw",
"avatar_url": "https://avatars.githubusercontent.com/u/209668615?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rahulrshetty45",
"html_url": "https://github.com/rahulrshetty45",
"followers_url": "https://api.github.com/users/rahulrshetty45/followers",
"following_url": "https://api.github.com/users/rahulrshetty45/following{/other_user}",
"gists_url": "https://api.github.com/users/rahulrshetty45/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rahulrshetty45/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rahulrshetty45/subscriptions",
"organizations_url": "https://api.github.com/users/rahulrshetty45/orgs",
"repos_url": "https://api.github.com/users/rahulrshetty45/repos",
"events_url": "https://api.github.com/users/rahulrshetty45/events{/privacy}",
"received_events_url": "https://api.github.com/users/rahulrshetty45/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-26T19:57:54 | 2025-06-26T14:00:33 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38388",
"html_url": "https://github.com/huggingface/transformers/pull/38388",
"diff_url": "https://github.com/huggingface/transformers/pull/38388.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38388.patch",
"merged_at": null
} | ## Summary
This PR fixes the Whisper inference regression reported in issue #38378 by implementing a backward-compatible solution that allows users to choose between the legacy and new logprob calculation methods.
## Problem
A regression was introduced in transformers v4.52.0 (commit da334bcfa) that changed the average log probability calculation in `_retrieve_avg_logprobs`, causing different inference results for fine-tuned Whisper models across different versions.
**Original formula (< v4.52.0):** `sum_logprobs / (length + 1)`
**New formula (>= v4.52.0):** `sum_logprobs / len(tokens)`
This affected:
- Short-form transcription consistency
- Long-form transcription with timestamps
- Temperature fallback decisions
- Model hallucination patterns
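The divergence between the two formulas can be seen with hypothetical numbers (the token ids below are made up, not taken from a real Whisper run):

```python
# Hypothetical numbers illustrating how the two averaging formulas diverge.
sum_logprobs = -12.0
tokens = [50258, 50259, 50359, 50363, 440]  # 5 made-up decoded token ids
length = len(tokens)

legacy_avg = sum_logprobs / (length + 1)  # < v4.52.0 behavior
new_avg = sum_logprobs / len(tokens)      # >= v4.52.0 behavior

# -2.0 vs -2.4: a fallback threshold such as -2.2 would trigger under one
# formula but not the other, changing temperature-fallback decisions.
assert legacy_avg == -2.0
assert new_avg == -2.4
```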
## Solution
- **Added `use_legacy_logprob_calculation` parameter** to `WhisperConfig`
- **Defaults to `True`** for backward compatibility (no breaking changes)
- **Allows opt-in to new behavior** by setting the parameter to `False`
- **Comprehensive test coverage** for both calculation modes
- **Detailed documentation** explaining the fix and usage
## Changes Made
1. **Configuration (`configuration_whisper.py`)**:
- Added `use_legacy_logprob_calculation` parameter with default `True`
- Updated docstring with clear explanation
2. **Generation (`generation_whisper.py`)**:
- Modified `_retrieve_avg_logprobs` method to support both calculation modes
- Added detailed comments explaining the regression fix
3. **Tests (`test_whisper_regression.py`)**:
- Comprehensive test suite covering both legacy and new modes
- Regression scenario tests
- Deterministic behavior verification
4. **Documentation (`WHISPER_REGRESSION_FIX.md`)**:
- Complete explanation of the problem and solution
- Usage examples for both modes
- Migration guide for different user types
## Testing
- ✅ All existing tests pass
- ✅ New regression tests added
- ✅ Both calculation modes tested
- ✅ Backward compatibility verified
- ✅ Configuration handling tested
## Backward Compatibility
This change is **fully backward compatible**:
- Default behavior matches transformers < v4.52.0
- No breaking changes to existing APIs
- Users can opt into new behavior when ready
## Related Issues
Fixes #38378
## Checklist
- [x] I have read the contribution guidelines
- [x] My code follows the project's coding standards
- [x] I have added tests that prove my fix is effective
- [x] I have added necessary documentation
- [x] My changes generate no new warnings
- [x] Any dependent changes have been merged and published | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38388/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38388/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38387 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38387/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38387/comments | https://api.github.com/repos/huggingface/transformers/issues/38387/events | https://github.com/huggingface/transformers/pull/38387 | 3,091,825,959 | PR_kwDOCUB6oc6XrAB1 | 38,387 | Expanded `UserWarning` message on parameter edge case | {
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-26T18:22:53 | 2025-05-27T14:33:44 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38387",
"html_url": "https://github.com/huggingface/transformers/pull/38387",
"diff_url": "https://github.com/huggingface/transformers/pull/38387.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38387.patch",
"merged_at": null
} | # What does this PR do?
A more verbose explanation in the `UserWarning` gives users a better understanding of how to iterate on parameters.
Fixes https://github.com/huggingface/transformers/issues/36896
## Who can review?
@manueldeprada @Rocketknight1 @stevhliu @gante
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38387/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38387/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38386 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38386/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38386/comments | https://api.github.com/repos/huggingface/transformers/issues/38386/events | https://github.com/huggingface/transformers/pull/38386 | 3,091,672,642 | PR_kwDOCUB6oc6XqfH2 | 38,386 | [cli] cli usable without torch | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T16:41:55 | 2025-05-26T17:35:24 | 2025-05-26T16:54:18 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38386",
"html_url": "https://github.com/huggingface/transformers/pull/38386",
"diff_url": "https://github.com/huggingface/transformers/pull/38386.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38386.patch",
"merged_at": "2025-05-26T16:54:18"
} | # What does this PR do?
fixes import issues related to type hints, such that the CLIs are usable without torch
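The usual fix pattern for this class of problem — a sketch of the general technique, not the actual diff — is to confine `torch` to type-checking-only imports and use string annotations:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Resolved only by static type checkers; never imported at runtime,
    # so the CLI keeps working on installs without torch.
    import torch

def summarize(tensor: "torch.Tensor") -> str:
    # The quoted annotation is a forward reference, evaluated lazily (if ever),
    # and annotations are not enforced at runtime.
    return f"got a {type(tensor).__name__}"

print(summarize([1.0, 2.0]))
```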
supersedes #38356 | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38386/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38386/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38385 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38385/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38385/comments | https://api.github.com/repos/huggingface/transformers/issues/38385/events | https://github.com/huggingface/transformers/pull/38385 | 3,091,625,304 | PR_kwDOCUB6oc6XqU9m | 38,385 | Fix convert to original state dict for VLMs | {
"login": "hiyouga",
"id": 16256802,
"node_id": "MDQ6VXNlcjE2MjU2ODAy",
"avatar_url": "https://avatars.githubusercontent.com/u/16256802?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hiyouga",
"html_url": "https://github.com/hiyouga",
"followers_url": "https://api.github.com/users/hiyouga/followers",
"following_url": "https://api.github.com/users/hiyouga/following{/other_user}",
"gists_url": "https://api.github.com/users/hiyouga/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hiyouga/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hiyouga/subscriptions",
"organizations_url": "https://api.github.com/users/hiyouga/orgs",
"repos_url": "https://api.github.com/users/hiyouga/repos",
"events_url": "https://api.github.com/users/hiyouga/events{/privacy}",
"received_events_url": "https://api.github.com/users/hiyouga/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-26T16:12:52 | 2025-06-03T10:11:29 | 2025-05-27T10:28:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38385",
"html_url": "https://github.com/huggingface/transformers/pull/38385",
"diff_url": "https://github.com/huggingface/transformers/pull/38385.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38385.patch",
"merged_at": "2025-05-27T10:28:00"
} | # What does this PR do?
#37033 introduces the base models for all VLMs. The model weights will be converted by mapping the original keys according to
https://github.com/huggingface/transformers/blob/701caef704e356dc2f9331cc3fd5df0eccb4720a/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py#L1735-L1738
However, the previous implementation in Transformers could not properly convert the weights back, due to an existing bug (`replacement = re.sub(r"\(.*?\)", "", pattern)` should be `replacement = re.sub(r"\(.*?\)", "", replacement)`) and the lack of support for nested parentheses:
https://github.com/huggingface/transformers/blob/701caef704e356dc2f9331cc3fd5df0eccb4720a/src/transformers/modeling_utils.py#L3644-L3657
We want to provide a more accurate weight conversion implementation to prevent issues with third-party apps.
https://github.com/hiyouga/LLaMA-Factory/issues/8147
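A minimal, self-contained sketch of the regex issue (the mapping strings here are illustrative, not the actual Qwen2-VL conversion table):

```python
import re

# Illustrative reverse-conversion entry: `pattern` matches the new-style key,
# `replacement` rebuilds the original key; "(extra.)" marks an optional
# segment that should be stripped from the replacement before use.
pattern = r"model\.language_model\.(.*)"
replacement = r"model.(extra.)\1"
key = "model.language_model.layers.0.weight"

# Buggy: strips the groups from `pattern`, discarding `replacement` entirely
buggy = re.sub(r"\(.*?\)", "", pattern)
# Fixed: strips the optional group from `replacement`, keeping the back-reference
fixed = re.sub(r"\(.*?\)", "", replacement)

print(re.sub(pattern, fixed, key))  # model.layers.0.weight
print(re.sub(pattern, buggy, key))  # mangled: the intended target key is lost
```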
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker @zucchini-nlp | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38385/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38385/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38384 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38384/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38384/comments | https://api.github.com/repos/huggingface/transformers/issues/38384/events | https://github.com/huggingface/transformers/pull/38384 | 3,091,602,628 | PR_kwDOCUB6oc6XqQBH | 38,384 | update gemma tests | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T16:00:51 | 2025-05-26T17:54:06 | 2025-05-26T17:54:04 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38384",
"html_url": "https://github.com/huggingface/transformers/pull/38384",
"diff_url": "https://github.com/huggingface/transformers/pull/38384.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38384.patch",
"merged_at": "2025-05-26T17:54:04"
} | # What does this PR do?
All tests now pass, except the following one, which fails after #37070 ("Detect and fix most `_init_weights()` issues - make it work for composite models"):
> tests/models/gemma/test_modeling_gemma.py::GemmaModelTest::test_sdpa_equivalence
The difference went from 2e-3 to 4e-3, which is larger than the 3e-3 tolerance.
Let's deal with this in a separate PR. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38384/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38384/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38383 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38383/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38383/comments | https://api.github.com/repos/huggingface/transformers/issues/38383/events | https://github.com/huggingface/transformers/pull/38383 | 3,091,296,350 | PR_kwDOCUB6oc6XpNWv | 38,383 | for now disable compile | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T13:51:23 | 2025-05-26T14:04:32 | 2025-05-26T13:57:11 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38383",
"html_url": "https://github.com/huggingface/transformers/pull/38383",
"diff_url": "https://github.com/huggingface/transformers/pull/38383.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38383.patch",
"merged_at": "2025-05-26T13:57:11"
} | # What does this PR do?
We should disable compile on CB for now, as it breaks on Mac, until we have a decorator that only applies it for Mac devices. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38383/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38383/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38382 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38382/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38382/comments | https://api.github.com/repos/huggingface/transformers/issues/38382/events | https://github.com/huggingface/transformers/pull/38382 | 3,091,206,256 | PR_kwDOCUB6oc6Xo5cj | 38,382 | Better check in `initialize_weights` | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T13:21:26 | 2025-06-11T14:57:05 | 2025-05-26T14:20:23 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38382",
"html_url": "https://github.com/huggingface/transformers/pull/38382",
"diff_url": "https://github.com/huggingface/transformers/pull/38382.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38382.patch",
"merged_at": "2025-05-26T14:20:23"
} | # What does this PR do?
This has no impact on Transformers itself, but should help with remote code, and makes a bit more sense in general (See https://github.com/huggingface/transformers/issues/38358 as well)
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38382/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38382/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38381 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38381/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38381/comments | https://api.github.com/repos/huggingface/transformers/issues/38381/events | https://github.com/huggingface/transformers/issues/38381 | 3,091,203,647 | I_kwDOCUB6oc64QAY_ | 38,381 | [i18n-<languageCode>] Translating docs to <عربي> | {
"login": "migo-ali",
"id": 212225479,
"node_id": "U_kgDODKZNxw",
"avatar_url": "https://avatars.githubusercontent.com/u/212225479?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/migo-ali",
"html_url": "https://github.com/migo-ali",
"followers_url": "https://api.github.com/users/migo-ali/followers",
"following_url": "https://api.github.com/users/migo-ali/following{/other_user}",
"gists_url": "https://api.github.com/users/migo-ali/gists{/gist_id}",
"starred_url": "https://api.github.com/users/migo-ali/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/migo-ali/subscriptions",
"organizations_url": "https://api.github.com/users/migo-ali/orgs",
"repos_url": "https://api.github.com/users/migo-ali/repos",
"events_url": "https://api.github.com/users/migo-ali/events{/privacy}",
"received_events_url": "https://api.github.com/users/migo-ali/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2796628563,
"node_id": "MDU6TGFiZWwyNzk2NjI4NTYz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/WIP",
"name": "WIP",
"color": "234C99",
"default": false,
"description": "Label your PR/Issue with WIP for some long outstanding Issues/PRs that are work in progress"
}
] | closed | false | null | [] | null | [] | 2025-05-26T13:20:26 | 2025-07-27T15:49:08 | 2025-07-27T15:48:29 | NONE | null | null | null | null | <!)
<!--
Keep on adding more as you go 🔥
-- | {
"login": "migo-ali",
"id": 212225479,
"node_id": "U_kgDODKZNxw",
"avatar_url": "https://avatars.githubusercontent.com/u/212225479?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/migo-ali",
"html_url": "https://github.com/migo-ali",
"followers_url": "https://api.github.com/users/migo-ali/followers",
"following_url": "https://api.github.com/users/migo-ali/following{/other_user}",
"gists_url": "https://api.github.com/users/migo-ali/gists{/gist_id}",
"starred_url": "https://api.github.com/users/migo-ali/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/migo-ali/subscriptions",
"organizations_url": "https://api.github.com/users/migo-ali/orgs",
"repos_url": "https://api.github.com/users/migo-ali/repos",
"events_url": "https://api.github.com/users/migo-ali/events{/privacy}",
"received_events_url": "https://api.github.com/users/migo-ali/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38381/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38381/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38380 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38380/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38380/comments | https://api.github.com/repos/huggingface/transformers/issues/38380/events | https://github.com/huggingface/transformers/pull/38380 | 3,091,149,922 | PR_kwDOCUB6oc6XotIr | 38,380 | [WIP] Cache specific inputs | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T13:00:53 | 2025-05-26T14:42:28 | 2025-05-26T14:42:13 | MEMBER | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38380",
"html_url": "https://github.com/huggingface/transformers/pull/38380",
"diff_url": "https://github.com/huggingface/transformers/pull/38380.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38380.patch",
"merged_at": null
} | # What does this PR do?
WIP
Related issue: #38055
Allows each cache class to define its own extra `update` inputs, to allow e.g. the query states to be passed | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38380/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38380/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38379 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38379/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38379/comments | https://api.github.com/repos/huggingface/transformers/issues/38379/events | https://github.com/huggingface/transformers/pull/38379 | 3,091,023,818 | PR_kwDOCUB6oc6XoRcq | 38,379 | Use one `utils/notification_service.py` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T12:19:06 | 2025-05-26T14:15:32 | 2025-05-26T14:15:30 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38379",
"html_url": "https://github.com/huggingface/transformers/pull/38379",
"diff_url": "https://github.com/huggingface/transformers/pull/38379.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38379.patch",
"merged_at": "2025-05-26T14:15:30"
} | # What does this PR do?
So the quantization job also uses the same notification script. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38379/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38379/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38378 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38378/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38378/comments | https://api.github.com/repos/huggingface/transformers/issues/38378/events | https://github.com/huggingface/transformers/issues/38378 | 3,090,930,420 | I_kwDOCUB6oc64O9r0 | 38,378 | Transformers version causing my finetuned model to hallucinate | {
"login": "mohamedessam331952",
"id": 200995091,
"node_id": "U_kgDOC_rxEw",
"avatar_url": "https://avatars.githubusercontent.com/u/200995091?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mohamedessam331952",
"html_url": "https://github.com/mohamedessam331952",
"followers_url": "https://api.github.com/users/mohamedessam331952/followers",
"following_url": "https://api.github.com/users/mohamedessam331952/following{/other_user}",
"gists_url": "https://api.github.com/users/mohamedessam331952/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mohamedessam331952/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mohamedessam331952/subscriptions",
"organizations_url": "https://api.github.com/users/mohamedessam331952/orgs",
"repos_url": "https://api.github.com/users/mohamedessam331952/repos",
"events_url": "https://api.github.com/users/mohamedessam331952/events{/privacy}",
"received_events_url": "https://api.github.com/users/mohamedessam331952/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-26T11:45:56 | 2025-09-04T16:17:59 | 2025-09-04T16:17:59 | NONE | null | null | null | null | ### System Info
I fine-tuned whisper-large-v3 using transformers version 4.51.3 and found that different transformers versions lead to different inference results. Short-form transcription (dataset evaluation, for example) seems better on version 4.51.3 (I think after version 4.50.0), while timestamped long-form transcription is better on version 4.46.0 than on recent versions. I have no idea why, but I tested this on several audio files.
### Who can help?
@sanchit-gandhi
### Information
model link : [model](https://huggingface.co/DrAliGomaa/whisper-large-v3-ar-test)
### Tasks
- speech recognition
### Reproduction
Try inference with different transformers versions and you will see different results, especially in the timestamped long-form output.
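To make the discrepancy between versions concrete, a simple word-level edit distance over the two transcripts can quantify how much they disagree. This is a minimal sketch; the transcript strings are placeholders, not actual outputs of any transformers version:

```python
# Minimal word error rate (Levenshtein distance over words) to quantify how
# much two transcripts disagree. The example strings are placeholders, not
# real outputs from any transformers version.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,         # deletion
                dp[i][j - 1] + 1,         # insertion
                dp[i - 1][j - 1] + cost,  # substitution
            )
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

out_v446 = "the quick brown fox jumps over the lazy dog"
out_v451 = "the quick brown fox jumped over a lazy dog"
print(word_error_rate(out_v446, out_v451))
```

Running this metric on the same audio across versions would turn "the results look different" into a number that is easier to compare and report.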
### Expected behavior
The transformers version shouldn't affect how the model behaves. | {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38378/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38378/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38377 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38377/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38377/comments | https://api.github.com/repos/huggingface/transformers/issues/38377/events | https://github.com/huggingface/transformers/issues/38377 | 3,090,919,051 | I_kwDOCUB6oc64O66L | 38,377 | Why are model classes in unit tests imported from the transformers package instead of directly from the model file? Is there any special consideration? | {
"login": "ENg-122",
"id": 86090741,
"node_id": "MDQ6VXNlcjg2MDkwNzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/86090741?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ENg-122",
"html_url": "https://github.com/ENg-122",
"followers_url": "https://api.github.com/users/ENg-122/followers",
"following_url": "https://api.github.com/users/ENg-122/following{/other_user}",
"gists_url": "https://api.github.com/users/ENg-122/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ENg-122/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ENg-122/subscriptions",
"organizations_url": "https://api.github.com/users/ENg-122/orgs",
"repos_url": "https://api.github.com/users/ENg-122/repos",
"events_url": "https://api.github.com/users/ENg-122/events{/privacy}",
"received_events_url": "https://api.github.com/users/ENg-122/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-05-26T11:41:19 | 2025-05-26T11:41:19 | null | NONE | null | null | null | null | ### Feature request
Take the Qwen3MoE unit test as an example:

```python
if is_torch_available():
    import torch

    from transformers import (
        Qwen3MoeForCausalLM,
        Qwen3MoeForQuestionAnswering,
        Qwen3MoeForSequenceClassification,
        Qwen3MoeForTokenClassification,
        Qwen3MoeModel,
    )
```

Why not this:

```python
from src.transformers.models.qwen3_moe.modeling_qwen3_moe import (
    Qwen3MoeForCausalLM,
    Qwen3MoeForQuestionAnswering,
    Qwen3MoeForSequenceClassification,
    Qwen3MoeForTokenClassification,
    Qwen3MoeModel,
)
```
### Motivation
Unit tests should guard their own code files
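One practical reason (my assumption, not an authoritative answer): the top-level `transformers` namespace is a lazy re-export layer, so importing through it also verifies that each class is correctly registered in the public API. A toy sketch of the difference, using a made-up package rather than the actual transformers internals:

```python
import sys
import types

# Toy package whose top level re-exports a class from a submodule via a
# module-level __getattr__ (PEP 562). This loosely mimics a lazy export
# layer like the one `from transformers import Qwen3MoeModel` goes through;
# it is an illustration, not the real transformers machinery.
pkg = types.ModuleType("toypkg")
sub = types.ModuleType("toypkg.modeling")

class ToyModel:  # stands in for e.g. Qwen3MoeModel
    pass

sub.ToyModel = ToyModel
_EXPORTS = {"ToyModel": "toypkg.modeling"}

def _pkg_getattr(name):
    # Resolve the export table on demand.
    if name in _EXPORTS:
        return getattr(sys.modules[_EXPORTS[name]], name)
    raise AttributeError(name)

pkg.__getattr__ = _pkg_getattr
sys.modules["toypkg"] = pkg
sys.modules["toypkg.modeling"] = sub

# Importing through the top level exercises the export table: if "ToyModel"
# were missing from _EXPORTS, this import would fail even though the
# submodule is fine -- a breakage that a direct
# `from toypkg.modeling import ToyModel` would never catch.
from toypkg import ToyModel as PublicToyModel
assert PublicToyModel is ToyModel
```

So importing from the package in tests guards both the model file and the public import path users actually rely on.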
### Your contribution
No PR has been submitted yet | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38377/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38377/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/38376 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38376/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38376/comments | https://api.github.com/repos/huggingface/transformers/issues/38376/events | https://github.com/huggingface/transformers/pull/38376 | 3,090,738,331 | PR_kwDOCUB6oc6XnS2d | 38,376 | Protect `get_default_device` for torch<2.3 | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-26T10:31:42 | 2025-06-10T10:09:07 | 2025-05-26T13:00:09 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38376",
"html_url": "https://github.com/huggingface/transformers/pull/38376",
"diff_url": "https://github.com/huggingface/transformers/pull/38376.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38376.patch",
"merged_at": "2025-05-26T13:00:09"
} | # What does this PR do?
As per the title! It was reported in https://github.com/huggingface/transformers/issues/38329!
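A typical guard for this kind of version gap might look like the following. This is a sketch of the general pattern, not the actual PR diff, and stub objects stand in for real torch so it runs anywhere:

```python
import types

def get_default_device_compat(torch):
    """Return torch's default device, working around torch < 2.3,
    where torch.get_default_device does not exist yet."""
    if hasattr(torch, "get_default_device"):
        return torch.get_default_device()
    # Fallback sketch: an empty tensor is allocated on the current default
    # device, which should also honor torch.set_default_device on older
    # versions (assumption -- verify against the torch version you target).
    return torch.tensor([]).device

# Stub "torch" namespaces so the sketch runs without torch installed.
new_torch = types.SimpleNamespace(get_default_device=lambda: "cpu")
old_torch = types.SimpleNamespace(
    tensor=lambda data: types.SimpleNamespace(device="cpu")
)
assert get_default_device_compat(new_torch) == "cpu"
assert get_default_device_compat(old_torch) == "cpu"
```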
Interestingly, `set_default_device` has been around for a long time, but `get_default_device` has not! | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38376/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38376/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38375 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38375/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38375/comments | https://api.github.com/repos/huggingface/transformers/issues/38375/events | https://github.com/huggingface/transformers/issues/38375 | 3,090,713,860 | I_kwDOCUB6oc64OI0E | 38,375 | Unable to run run_instance_segmentation_no_trainer with HF Accelerate | {
"login": "gohjiayi",
"id": 36816180,
"node_id": "MDQ6VXNlcjM2ODE2MTgw",
"avatar_url": "https://avatars.githubusercontent.com/u/36816180?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gohjiayi",
"html_url": "https://github.com/gohjiayi",
"followers_url": "https://api.github.com/users/gohjiayi/followers",
"following_url": "https://api.github.com/users/gohjiayi/following{/other_user}",
"gists_url": "https://api.github.com/users/gohjiayi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gohjiayi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gohjiayi/subscriptions",
"organizations_url": "https://api.github.com/users/gohjiayi/orgs",
"repos_url": "https://api.github.com/users/gohjiayi/repos",
"events_url": "https://api.github.com/users/gohjiayi/events{/privacy}",
"received_events_url": "https://api.github.com/users/gohjiayi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-26T10:23:04 | 2025-07-05T08:03:07 | 2025-07-05T08:03:07 | NONE | null | null | null | null | ### System Info
I am trying to run [examples/pytorch/instance-segmentation/run_instance_segmentation_no_trainer.py](https://github.com/huggingface/transformers/blob/d1b92369ca193da49f9f7ecd01b08ece45c2c9aa/examples/pytorch/instance-segmentation/run_instance_segmentation_no_trainer.py) with HF Accelerate. I was able to run the Trainer API example successfully, but the No Trainer (Accelerate) version hits the bug below.
This is on version `4.52.0.dev0`. The only change I made was setting epochs=2.
The following error arose. When I prompted ChatGPT for more information, it suggested the potential issues below, but I have no idea what the root cause could be. I found no other related issues and the docs bot was not working. I would appreciate advice on how to run this example script, as I hope to adapt it for my task.
| **Category** | **Potential Issue** | **Explanation** | **Recommended Fix** |
|----------------------------|--------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------|
| **Model Config Mismatch** | Mismatch in `num_labels` vs checkpoint (81 vs 3) | Causes some layers (e.g., `class_predictor`) to be randomly initialized, might desync ranks | Set `config.num_labels = 3` **before** loading the model or use a matching checkpoint |
| **DDP Desynchronization** | Different logic across ranks (e.g., `if rank == 0:` doing extra things) | All ranks must call collectives in the same order and time | Ensure logic is **identical** across all ranks |
| **Evaluation in DDP** | Evaluation logic not synchronized | Can cause hanging during collective ops like `all_gather` | Skip evaluation for non-zero ranks or use `if rank == 0:` carefully |
| **GPU Communication** | NCCL timeout or deadlock due to driver/hardware/GIL issues | Long-running or stuck collectives cause watchdog termination | Set env vars: `NCCL_BLOCKING_WAIT=1`, `NCCL_ASYNC_ERROR_HANDLING=1`, and reduce batch size if needed |
| **Distributed Setup** | Improper `accelerate` or `torchrun` configuration | One process might be behaving incorrectly | Test with single GPU first: `CUDA_VISIBLE_DEVICES=0 accelerate launch --num_processes=1 ...` |
| **Deprecated Args** | `_max_size` passed to `Mask2FormerImageProcessor` | Harmless, but messy | Remove `_max_size` from processor initialization |
| **Resource Overload** | GPU memory, bandwidth, or CPU bottleneck | Can indirectly cause slowdowns or crashes | Monitor with `nvidia-smi`, lower batch size, reduce `num_workers` |
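The environment-variable suggestions from the table can be applied in the launching process before any distributed work starts. This is a sketch of the debugging setup only, values taken from the table above; it makes the hang diagnosable rather than fixing it:

```python
# Debugging environment for the NCCL timeout, assembled from the table's
# suggestions. These must be set before torch/accelerate initializes the
# process group, e.g. at the very top of the training script or in the shell.
import os

debug_env = {
    "NCCL_BLOCKING_WAIT": "1",             # fail collectives loudly instead of hanging
    "NCCL_ASYNC_ERROR_HANDLING": "1",      # surface async NCCL errors to the watchdog
    "TORCH_NCCL_TRACE_BUFFER_SIZE": "2000",  # non-zero enables FlightRecorder dumps
}
os.environ.update(debug_env)
print(os.environ["NCCL_BLOCKING_WAIT"])
```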
Error message below:
```
loading weights file model.safetensors from cache at /home/jiayi/.cache/huggingface/hub/models--facebook--mask2former-swin-tiny-coco-instance/snapshots/22c4a2f15dc88149b8b8d9f4d42c54431fbd66f6/model.safetensors
Instantiating SwinBackbone model under default dtype torch.float32.
All model checkpoint weights were used when initializing Mask2FormerForUniversalSegmentation.
Some weights of Mask2FormerForUniversalSegmentation were not initialized from the model checkpoint at facebook/mask2former-swin-tiny-coco-instance and are newly initialized because the shapes did not match:
- class_predictor.bias: found shape torch.Size([81]) in the checkpoint and torch.Size([3]) in the model instantiated
- class_predictor.weight: found shape torch.Size([81, 256]) in the checkpoint and torch.Size([3, 256]) in the model instantiated
- criterion.empty_weight: found shape torch.Size([81]) in the checkpoint and torch.Size([3]) in the model instantiated
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/transformers/utils/deprecation.py:172: UserWarning: The following named arguments are not valid for `Mask2FormerImageProcessor.__init__` and were ignored: '_max_size'
return func(*args, **kwargs)
loading configuration file preprocessor_config.json from cache at /home/jiayi/.cache/huggingface/hub/models--facebook--mask2former-swin-tiny-coco-instance/snapshots/22c4a2f15dc88149b8b8d9f4d42c54431fbd66f6/preprocessor_config.json
Using a slow image processor as `use_fast` is unset and a slow processor was saved with this model. `use_fast=True` will be the default behavior in v4.52, even if the model was saved with a slow processor. This will result in minor differences in outputs. You'll still be able to use a slow processor with `use_fast=False`.
/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/transformers/utils/deprecation.py:172: UserWarning: The following named arguments are not valid for `Mask2FormerImageProcessor.__init__` and were ignored: '_max_size'
return func(*args, **kwargs)
Image processor Mask2FormerImageProcessor {
"do_normalize": true,
"do_reduce_labels": true,
"do_rescale": true,
"do_resize": true,
"ignore_index": 255,
"image_mean": [
0.48500001430511475,
0.4560000002384186,
0.4059999883174896
],
"image_processor_type": "Mask2FormerImageProcessor",
"image_std": [
0.2290000021457672,
0.2239999920129776,
0.22499999403953552
],
"num_labels": 80,
"resample": 2,
"rescale_factor": 0.00392156862745098,
"size": {
"height": 256,
"width": 256
},
"size_divisor": 32
}
05/26/2025 17:59:13 - INFO - __main__ - ***** Running training *****
05/26/2025 17:59:13 - INFO - __main__ - Num examples = 1600
05/26/2025 17:59:13 - INFO - __main__ - Num Epochs = 2
05/26/2025 17:59:13 - INFO - __main__ - Instantaneous batch size per device = 8
05/26/2025 17:59:13 - INFO - __main__ - Total train batch size (w. parallel, distributed & accumulation) = 32
05/26/2025 17:59:13 - INFO - __main__ - Gradient Accumulation steps = 2
05/26/2025 17:59:13 - INFO - __main__ - Total optimization steps = 100
50%|███████████████████████████████████ | 50/100 [00:50<00:44, 1.13it/s]05/26/2025 18:00:03 - INFO - __main__ - ***** Running evaluation *****
[rank1]:[E526 18:10:06.330014458 ProcessGroupNCCL.cpp:632] [Rank 1] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=1516, OpType=_ALLGATHER_BASE, NumelIn=4, NumelOut=8, Timeout(ms)=600000) ran for 600075 milliseconds before timing out.
[rank1]:[E526 18:10:06.330380113 ProcessGroupNCCL.cpp:2268] [PG ID 0 PG GUID 0(default_pg) Rank 1] failure detected by watchdog at work sequence id: 1516 PG status: last enqueued work: 1531, last completed work: 1515
[rank1]:[E526 18:10:06.330405467 ProcessGroupNCCL.cpp:670] Stack trace of the failed collective not found, potentially because FlightRecorder is disabled. You can enable it by setting TORCH_NCCL_TRACE_BUFFER_SIZE to a non-zero value.
[rank1]:[E526 18:10:06.330496568 ProcessGroupNCCL.cpp:2103] [PG ID 0 PG GUID 0(default_pg) Rank 1] First PG on this rank to signal dumping.
[rank0]:[E526 18:10:06.348434228 ProcessGroupNCCL.cpp:632] [Rank 0] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=1520, OpType=_ALLGATHER_BASE, NumelIn=7, NumelOut=14, Timeout(ms)=600000) ran for 600093 milliseconds before timing out.
[rank0]:[E526 18:10:06.348758153 ProcessGroupNCCL.cpp:2268] [PG ID 0 PG GUID 0(default_pg) Rank 0] failure detected by watchdog at work sequence id: 1520 PG status: last enqueued work: 1531, last completed work: 1519
[rank0]:[E526 18:10:06.348778609 ProcessGroupNCCL.cpp:670] Stack trace of the failed collective not found, potentially because FlightRecorder is disabled. You can enable it by setting TORCH_NCCL_TRACE_BUFFER_SIZE to a non-zero value.
[rank0]:[E526 18:10:06.348837555 ProcessGroupNCCL.cpp:2103] [PG ID 0 PG GUID 0(default_pg) Rank 0] First PG on this rank to signal dumping.
[rank0]:[E526 18:10:07.235145069 ProcessGroupNCCL.cpp:1743] [PG ID 0 PG GUID 0(default_pg) Rank 0] Received a dump signal due to a collective timeout from this local rank and we will try our best to dump the debug info. Last enqueued NCCL work: 1531, last completed NCCL work: 1519.This is most likely caused by incorrect usages of collectives, e.g., wrong sizes used across ranks, the order of collectives is not same for all ranks or the scheduled collective, for some reason, didn't run. Additionally, this can be caused by GIL deadlock or other reasons such as network errors or bugs in the communications library (e.g. NCCL), etc.
[rank1]:[E526 18:10:07.235182687 ProcessGroupNCCL.cpp:1743] [PG ID 0 PG GUID 0(default_pg) Rank 1] Received a dump signal due to a collective timeout from this local rank and we will try our best to dump the debug info. Last enqueued NCCL work: 1531, last completed NCCL work: 1515.This is most likely caused by incorrect usages of collectives, e.g., wrong sizes used across ranks, the order of collectives is not same for all ranks or the scheduled collective, for some reason, didn't run. Additionally, this can be caused by GIL deadlock or other reasons such as network errors or bugs in the communications library (e.g. NCCL), etc.
[rank0]:[E526 18:10:07.235355691 ProcessGroupNCCL.cpp:1533] [PG ID 0 PG GUID 0(default_pg) Rank 0] ProcessGroupNCCL preparing to dump debug info. Include stack trace: 1
[rank1]:[E526 18:10:07.235369478 ProcessGroupNCCL.cpp:1533] [PG ID 0 PG GUID 0(default_pg) Rank 1] ProcessGroupNCCL preparing to dump debug info. Include stack trace: 1
[rank0]:[E526 18:10:07.496712146 ProcessGroupNCCL.cpp:684] [Rank 0] Some NCCL operations have failed or timed out. Due to the asynchronous nature of CUDA kernels, subsequent GPU operations might run on corrupted/incomplete data.
[rank0]:[E526 18:10:07.496742723 ProcessGroupNCCL.cpp:698] [Rank 0] To avoid data inconsistency, we are taking the entire process down.
[rank0]:[E526 18:10:07.499895520 ProcessGroupNCCL.cpp:1896] [PG ID 0 PG GUID 0(default_pg) Rank 0] Process group watchdog thread terminated with exception: [Rank 0] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=1520, OpType=_ALLGATHER_BASE, NumelIn=7, NumelOut=14, Timeout(ms)=600000) ran for 600093 milliseconds before timing out.
Exception raised from checkTimeout at /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:635 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0x98 (0x7f12fc3065e8 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libc10.so)
frame #1: c10d::ProcessGroupNCCL::WorkNCCL::checkTimeout(std::optional<std::chrono::duration<long, std::ratio<1l, 1000l> > >) + 0x23d (0x7f12fd621a1d in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #2: c10d::ProcessGroupNCCL::watchdogHandler() + 0xc80 (0x7f12fd6237a0 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #3: c10d::ProcessGroupNCCL::ncclCommWatchdog() + 0x14d (0x7f12fd624ead in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #4: <unknown function> + 0xd6df4 (0x7f13660c2df4 in /lib/x86_64-linux-gnu/libstdc++.so.6)
frame #5: <unknown function> + 0x8609 (0x7f1369f77609 in /lib/x86_64-linux-gnu/libpthread.so.0)
frame #6: clone + 0x43 (0x7f136a0b1353 in /lib/x86_64-linux-gnu/libc.so.6)
terminate called after throwing an instance of 'c10::DistBackendError'
[rank1]:[E526 18:10:07.500452346 ProcessGroupNCCL.cpp:684] [Rank 1] Some NCCL operations have failed or timed out. Due to the asynchronous nature of CUDA kernels, subsequent GPU operations might run on corrupted/incomplete data.
[rank1]:[E526 18:10:07.500470141 ProcessGroupNCCL.cpp:698] [Rank 1] To avoid data inconsistency, we are taking the entire process down.
0%| | 0/21 [10:04<?, ?it/s]
[rank1]:[E526 18:10:07.502423448 ProcessGroupNCCL.cpp:1896] [PG ID 0 PG GUID 0(default_pg) Rank 1] Process group watchdog thread terminated with exception: [Rank 1] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=1516, OpType=_ALLGATHER_BASE, NumelIn=4, NumelOut=8, Timeout(ms)=600000) ran for 600075 milliseconds before timing out.
Exception raised from checkTimeout at /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:635 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0x98 (0x7f3a12f885e8 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libc10.so)
frame #1: c10d::ProcessGroupNCCL::WorkNCCL::checkTimeout(std::optional<std::chrono::duration<long, std::ratio<1l, 1000l> > >) + 0x23d (0x7f3a142a3a1d in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #2: c10d::ProcessGroupNCCL::watchdogHandler() + 0xc80 (0x7f3a142a57a0 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #3: c10d::ProcessGroupNCCL::ncclCommWatchdog() + 0x14d (0x7f3a142a6ead in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #4: <unknown function> + 0xd6df4 (0x7f3a7cd44df4 in /lib/x86_64-linux-gnu/libstdc++.so.6)
frame #5: <unknown function> + 0x8609 (0x7f3a80bf9609 in /lib/x86_64-linux-gnu/libpthread.so.0)
frame #6: clone + 0x43 (0x7f3a80d33353 in /lib/x86_64-linux-gnu/libc.so.6)
terminate called after throwing an instance of 'c10::DistBackendError'
what(): [PG ID 0 PG GUID 0(default_pg) Rank 0] Process group watchdog thread terminated with exception: [Rank 0] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=1520, OpType=_ALLGATHER_BASE, NumelIn=7, NumelOut=14, Timeout(ms)=600000) ran for 600093 milliseconds before timing out.
Exception raised from checkTimeout at /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:635 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0x98 (0x7f12fc3065e8 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libc10.so)
frame #1: c10d::ProcessGroupNCCL::WorkNCCL::checkTimeout(std::optional<std::chrono::duration<long, std::ratio<1l, 1000l> > >) + 0x23d (0x7f12fd621a1d in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #2: c10d::ProcessGroupNCCL::watchdogHandler() + 0xc80 (0x7f12fd6237a0 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #3: c10d::ProcessGroupNCCL::ncclCommWatchdog() + 0x14d (0x7f12fd624ead in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #4: <unknown function> + 0xd6df4 (0x7f13660c2df4 in /lib/x86_64-linux-gnu/libstdc++.so.6)
frame #5: <unknown function> + 0x8609 (0x7f1369f77609 in /lib/x86_64-linux-gnu/libpthread.so.0)
frame #6: clone + 0x43 (0x7f136a0b1353 in /lib/x86_64-linux-gnu/libc.so.6)
Exception raised from ncclCommWatchdog at /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:1902 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0x98 (0x7f12fc3065e8 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x11b4a6e (0x7f12fd5f3a6e in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #2: <unknown function> + 0xe07bed (0x7f12fd246bed in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #3: <unknown function> + 0xd6df4 (0x7f13660c2df4 in /lib/x86_64-linux-gnu/libstdc++.so.6)
frame #4: <unknown function> + 0x8609 (0x7f1369f77609 in /lib/x86_64-linux-gnu/libpthread.so.0)
frame #5: clone + 0x43 (0x7f136a0b1353 in /lib/x86_64-linux-gnu/libc.so.6)
what(): [PG ID 0 PG GUID 0(default_pg) Rank 1] Process group watchdog thread terminated with exception: [Rank 1] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=1516, OpType=_ALLGATHER_BASE, NumelIn=4, NumelOut=8, Timeout(ms)=600000) ran for 600075 milliseconds before timing out.
Exception raised from checkTimeout at /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:635 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0x98 (0x7f3a12f885e8 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libc10.so)
frame #1: c10d::ProcessGroupNCCL::WorkNCCL::checkTimeout(std::optional<std::chrono::duration<long, std::ratio<1l, 1000l> > >) + 0x23d (0x7f3a142a3a1d in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #2: c10d::ProcessGroupNCCL::watchdogHandler() + 0xc80 (0x7f3a142a57a0 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #3: c10d::ProcessGroupNCCL::ncclCommWatchdog() + 0x14d (0x7f3a142a6ead in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #4: <unknown function> + 0xd6df4 (0x7f3a7cd44df4 in /lib/x86_64-linux-gnu/libstdc++.so.6)
frame #5: <unknown function> + 0x8609 (0x7f3a80bf9609 in /lib/x86_64-linux-gnu/libpthread.so.0)
frame #6: clone + 0x43 (0x7f3a80d33353 in /lib/x86_64-linux-gnu/libc.so.6)
Exception raised from ncclCommWatchdog at /pytorch/torch/csrc/distributed/c10d/ProcessGroupNCCL.cpp:1902 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0x98 (0x7f3a12f885e8 in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x11b4a6e (0x7f3a14275a6e in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #2: <unknown function> + 0xe07bed (0x7f3a13ec8bed in /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so)
frame #3: <unknown function> + 0xd6df4 (0x7f3a7cd44df4 in /lib/x86_64-linux-gnu/libstdc++.so.6)
frame #4: <unknown function> + 0x8609 (0x7f3a80bf9609 in /lib/x86_64-linux-gnu/libpthread.so.0)
frame #5: clone + 0x43 (0x7f3a80d33353 in /lib/x86_64-linux-gnu/libc.so.6)
W0526 18:10:09.067974 3031878 torch/distributed/elastic/multiprocessing/api.py:900] Sending process 3032005 closing signal SIGTERM
E0526 18:10:10.286100 3031878 torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: -6) local_rank: 0 (pid: 3032004) of binary: /raid/jiayi/safety_barrier_breach/mask2former_hf/venv/bin/python
Traceback (most recent call last):
File "/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/bin/accelerate", line 8, in <module>
sys.exit(main())
File "/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/accelerate/commands/accelerate_cli.py", line 50, in main
args.func(args)
File "/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/accelerate/commands/launch.py", line 1189, in launch_command
multi_gpu_launcher(args)
File "/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/accelerate/commands/launch.py", line 815, in multi_gpu_launcher
distrib_run.run(args)
File "/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/distributed/run.py", line 883, in run
elastic_launch(
File "/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/distributed/launcher/api.py", line 139, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
File "/raid/jiayi/safety_barrier_breach/mask2former_hf/venv/lib/python3.9/site-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
========================================================
original_run_instance_segmentation_no_trainer.py FAILED
--------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
--------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-05-26_18:10:09
host : DGX-Station
rank : 0 (local_rank: 0)
exitcode : -6 (pid: 3032004)
error_file: <N/A>
traceback : Signal 6 (SIGABRT) received by PID 3032004
========================================================
```
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Run the [examples/pytorch/instance-segmentation/run_instance_segmentation_no_trainer.py](https://github.com/huggingface/transformers/blob/d1b92369ca193da49f9f7ecd01b08ece45c2c9aa/examples/pytorch/instance-segmentation/run_instance_segmentation_no_trainer.py) script with the example command given in the README
### Expected behavior
The training is supposed to complete without any issue | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38375/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38375/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38374 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38374/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38374/comments | https://api.github.com/repos/huggingface/transformers/issues/38374/events | https://github.com/huggingface/transformers/pull/38374 | 3,090,712,405 | PR_kwDOCUB6oc6XnNE1 | 38,374 | [video utils] group and reorder by number of frames | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-26T10:22:27 | 2025-05-27T09:32:33 | 2025-05-27T09:32:33 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38374",
"html_url": "https://github.com/huggingface/transformers/pull/38374",
"diff_url": "https://github.com/huggingface/transformers/pull/38374.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38374.patch",
"merged_at": "2025-05-27T09:32:33"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/38352. Though we cannot handle videos of different shapes in transformers (yet!), it seems to be failing for vLLM
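The grouping described here can be sketched as follows. This is a minimal, hypothetical illustration (the function and the video representation are invented for the example, not the actual `video_utils` code): the key point is that the grouping key includes the frame count as well as the frame size, so only videos that can be stacked into one tensor end up in the same group.

```python
from collections import defaultdict

# Hypothetical sketch: each "video" is a list of frames, each frame a
# (height, width) tuple. Real code would key on tensor shapes instead.
def group_videos(videos):
    grouped = defaultdict(list)
    for video in videos:
        num_frames = len(video)
        frame_size = video[0]  # (height, width) of the first frame
        # Group by (num_frames, height, width), not just by frame size
        grouped[(num_frames, *frame_size)].append(video)
    return grouped

videos = [[(224, 224)] * 8, [(224, 224)] * 16, [(224, 224)] * 8]
groups = group_videos(videos)
print(len(groups))  # 2 -- same frame size, but two distinct frame counts
```

Grouping by frame size alone would have put all three videos in one group, which is exactly the case that fails when the frame counts differ.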
This PR ensures that videos are grouped not only by frame size but also by number of frames. A small test is added and is green for me locally | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38374/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38374/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38373 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38373/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38373/comments | https://api.github.com/repos/huggingface/transformers/issues/38373/events | https://github.com/huggingface/transformers/issues/38373 | 3,090,663,790 | I_kwDOCUB6oc64N8lu | 38,373 | Attention module assumes hidden_size == num_heads * head_dim, limiting head_dim flexibility | {
"login": "r01ex-ai",
"id": 169426751,
"node_id": "U_kgDOChk_Pw",
"avatar_url": "https://avatars.githubusercontent.com/u/169426751?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/r01ex-ai",
"html_url": "https://github.com/r01ex-ai",
"followers_url": "https://api.github.com/users/r01ex-ai/followers",
"following_url": "https://api.github.com/users/r01ex-ai/following{/other_user}",
"gists_url": "https://api.github.com/users/r01ex-ai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/r01ex-ai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/r01ex-ai/subscriptions",
"organizations_url": "https://api.github.com/users/r01ex-ai/orgs",
"repos_url": "https://api.github.com/users/r01ex-ai/repos",
"events_url": "https://api.github.com/users/r01ex-ai/events{/privacy}",
"received_events_url": "https://api.github.com/users/r01ex-ai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | closed | false | null | [] | null | [] | 2025-05-26T10:05:43 | 2025-05-26T10:25:01 | 2025-05-26T10:25:01 | NONE | null | null | null | null | ### Feature request
In the attention module (e.g., LLaMA or similar architectures), line 262:
attn_output = attn_output.reshape(*input_shape, -1)
implicitly assumes that hidden_size == num_heads * head_dim, which may not hold if one wants to configure a model with non-standard head sizes (e.g., head_dim = 48, num_heads = 32, hidden_size = 1536).
While this setup is mathematically valid, the code fails due to the use of input_shape from the input tensor.
Proposed fix:
Use attn_output = attn_output.transpose(1, 2).reshape(batch_size, seq_len, num_heads * head_dim) and ensure o_proj is adjusted accordingly.
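The proposed reshape can be sketched as follows. This is a hypothetical, self-contained illustration (the values and the standalone `o_proj` are invented for the example, not the actual modeling code), using a config where `hidden_size` differs from `num_heads * head_dim`:

```python
import torch
import torch.nn as nn

# Illustrative (hypothetical) values where hidden_size != num_heads * head_dim
batch_size, seq_len = 2, 5
hidden_size, num_heads, head_dim = 1024, 32, 48

# attn_output as produced by attention: (batch, num_heads, seq_len, head_dim)
attn_output = torch.randn(batch_size, num_heads, seq_len, head_dim)

# Proposed fix: reshape to num_heads * head_dim explicitly instead of relying
# on the input's hidden_size, then project back with an o_proj sized to match
attn_output = attn_output.transpose(1, 2).reshape(batch_size, seq_len, num_heads * head_dim)
o_proj = nn.Linear(num_heads * head_dim, hidden_size, bias=False)
out = o_proj(attn_output)
print(tuple(out.shape))  # (2, 5, 1024)
```

With the current `reshape(*input_shape, -1)`, the `-1` silently absorbs `num_heads * head_dim`, which only matches `hidden_size` in standard configurations.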
This would help make the model architecture more flexible and allow research use cases that explore non-standard attention configurations.
Or is this limitation intentional?
### Motivation
The attention implementation of Llama prevents flexible model architectures. Such flexibility is useful for pruning research and similar work.
### Your contribution
Proposed fix above. | {
"login": "r01ex-ai",
"id": 169426751,
"node_id": "U_kgDOChk_Pw",
"avatar_url": "https://avatars.githubusercontent.com/u/169426751?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/r01ex-ai",
"html_url": "https://github.com/r01ex-ai",
"followers_url": "https://api.github.com/users/r01ex-ai/followers",
"following_url": "https://api.github.com/users/r01ex-ai/following{/other_user}",
"gists_url": "https://api.github.com/users/r01ex-ai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/r01ex-ai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/r01ex-ai/subscriptions",
"organizations_url": "https://api.github.com/users/r01ex-ai/orgs",
"repos_url": "https://api.github.com/users/r01ex-ai/repos",
"events_url": "https://api.github.com/users/r01ex-ai/events{/privacy}",
"received_events_url": "https://api.github.com/users/r01ex-ai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38373/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38373/timeline | null | not_planned | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38372 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38372/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38372/comments | https://api.github.com/repos/huggingface/transformers/issues/38372/events | https://github.com/huggingface/transformers/pull/38372 | 3,090,579,348 | PR_kwDOCUB6oc6XmwC7 | 38,372 | [qwen-vl] Look for vocab size in text config | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-26T09:35:19 | 2025-05-28T07:32:27 | 2025-05-28T07:32:27 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38372",
"html_url": "https://github.com/huggingface/transformers/pull/38372",
"diff_url": "https://github.com/huggingface/transformers/pull/38372.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38372.patch",
"merged_at": "2025-05-28T07:32:27"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/38331. We had two PRs merged one after another; the current state of the PR, while backwards compatible, will fail when users create a new Qwen-architecture model following the new standards | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38372/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38372/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38371 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38371/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38371/comments | https://api.github.com/repos/huggingface/transformers/issues/38371/events | https://github.com/huggingface/transformers/pull/38371 | 3,090,568,649 | PR_kwDOCUB6oc6XmtxM | 38,371 | [aya vision] fix processor for vLLM | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-26T09:31:17 | 2025-06-02T11:53:23 | 2025-05-27T09:43:54 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38371",
"html_url": "https://github.com/huggingface/transformers/pull/38371",
"diff_url": "https://github.com/huggingface/transformers/pull/38371.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38371.patch",
"merged_at": "2025-05-27T09:43:54"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/38350
@DarkLight1337 , one quick question. I am trying to add a test on our side, so we don't break anything that vLLM relies on. In the case of aya-vision, the error log shows inputs as (`text="<image>"`, `images=None`), but doing so will fail on other processors where we have extra checks that "`num input images == num image tokens`"
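For reference, the check in question can be sketched roughly like this (a hypothetical simplification invented for illustration, not the actual processor code):

```python
# Hypothetical sketch of a processor's image-count consistency check
def check_images(text, images):
    num_tokens = text.count("<image>")
    num_images = 0 if images is None else len(images)
    if num_tokens != num_images:
        raise ValueError(f"got {num_tokens} image token(s) but {num_images} image(s)")

check_images("<image> Describe this.", ["img0"])  # matching counts: passes
try:
    # the vLLM-style call from the error log: image token but no images
    check_images("<image> Describe this.", None)
except ValueError as e:
    print(e)  # got 1 image token(s) but 0 image(s)
```

A call with `images=None` but an `<image>` placeholder in the text trips exactly this kind of check on processors that have it.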
I wonder how that is bypassed in vLLM and how I can add a test to cover the assumptions vLLM has about our processors | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38371/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38371/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38370 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38370/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38370/comments | https://api.github.com/repos/huggingface/transformers/issues/38370/events | https://github.com/huggingface/transformers/pull/38370 | 3,090,557,656 | PR_kwDOCUB6oc6XmraI | 38,370 | Fix all import errors based on older torch versions | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-26T09:26:03 | 2025-06-03T10:35:40 | 2025-05-26T10:11:55 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38370",
"html_url": "https://github.com/huggingface/transformers/pull/38370",
"diff_url": "https://github.com/huggingface/transformers/pull/38370.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38370.patch",
"merged_at": "2025-05-26T10:11:55"
} | # What does this PR do?
As per the title | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38370/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38370/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38369 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38369/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38369/comments | https://api.github.com/repos/huggingface/transformers/issues/38369/events | https://github.com/huggingface/transformers/pull/38369 | 3,090,529,737 | PR_kwDOCUB6oc6XmljU | 38,369 | [bugfix] fix flex-attention not supported on Ascend NPU, update BlockMask type annotations to str | {
"login": "FightingZhen",
"id": 26176607,
"node_id": "MDQ6VXNlcjI2MTc2NjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/26176607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FightingZhen",
"html_url": "https://github.com/FightingZhen",
"followers_url": "https://api.github.com/users/FightingZhen/followers",
"following_url": "https://api.github.com/users/FightingZhen/following{/other_user}",
"gists_url": "https://api.github.com/users/FightingZhen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/FightingZhen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FightingZhen/subscriptions",
"organizations_url": "https://api.github.com/users/FightingZhen/orgs",
"repos_url": "https://api.github.com/users/FightingZhen/repos",
"events_url": "https://api.github.com/users/FightingZhen/events{/privacy}",
"received_events_url": "https://api.github.com/users/FightingZhen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T09:13:36 | 2025-08-14T01:52:39 | 2025-05-26T11:56:38 | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38369",
"html_url": "https://github.com/huggingface/transformers/pull/38369",
"diff_url": "https://github.com/huggingface/transformers/pull/38369.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38369.patch",
"merged_at": null
} | # What does this PR do?
**1. Problem 1**
Flex-attention has not been fully verified on Ascend NPU yet. In PR #37866, the func `is_torch_flex_attn_available` defined in the code below does not contain a check for Ascend NPU. In this situation, when we use torch>=2.5.0, this func will return `True` on Ascend NPU, which is not correct.
https://github.com/huggingface/transformers/blob/d03a3ca69226a776d9ed69cd90cbe8f6ebe8fa34/src/transformers/utils/import_utils.py#L412
**2. Problem 2**
If func `is_torch_flex_attn_available` returns `False` on Ascend NPU as expected, the object `BlockMask` will not be imported in the code below
https://github.com/huggingface/transformers/blob/a5a0c7b88828a7273bdedbdcaa2c7a252084c0d8/src/transformers/masking_utils.py#L29
But this object is now used directly in type annotations in the code below, causing an import error.
https://github.com/huggingface/transformers/blob/a5a0c7b88828a7273bdedbdcaa2c7a252084c0d8/src/transformers/masking_utils.py#L585
**Therefore, this PR is committed to solve the above two problems**: it adds logic so that `is_torch_flex_attn_available` reports flex-attention as unsupported on Ascend NPU, and it updates the `BlockMask` type annotations to string format.
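The string-annotation part of the fix can be sketched as follows. This is a minimal, hypothetical illustration (the function name is invented, not the actual `masking_utils` code): a string annotation is a forward reference, so Python never evaluates the name at runtime and the module imports cleanly even when `BlockMask` was never imported.

```python
from typing import TYPE_CHECKING, Optional

if TYPE_CHECKING:
    # Evaluated only by static type checkers, never at runtime,
    # so this import cannot fail on platforms without flex-attention
    from torch.nn.attention.flex_attention import BlockMask

def make_mask(mask: Optional["BlockMask"] = None):
    # "BlockMask" is a plain string here; Python stores it unevaluated,
    # so the missing runtime import does not raise a NameError
    return mask

print(make_mask() is None)  # True
```

Type checkers still resolve `"BlockMask"` through the `TYPE_CHECKING` import, so no type information is lost.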
Fixes # (issue)
#38362
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
| {
"login": "FightingZhen",
"id": 26176607,
"node_id": "MDQ6VXNlcjI2MTc2NjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/26176607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FightingZhen",
"html_url": "https://github.com/FightingZhen",
"followers_url": "https://api.github.com/users/FightingZhen/followers",
"following_url": "https://api.github.com/users/FightingZhen/following{/other_user}",
"gists_url": "https://api.github.com/users/FightingZhen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/FightingZhen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FightingZhen/subscriptions",
"organizations_url": "https://api.github.com/users/FightingZhen/orgs",
"repos_url": "https://api.github.com/users/FightingZhen/repos",
"events_url": "https://api.github.com/users/FightingZhen/events{/privacy}",
"received_events_url": "https://api.github.com/users/FightingZhen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38369/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38369/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38368 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38368/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38368/comments | https://api.github.com/repos/huggingface/transformers/issues/38368/events | https://github.com/huggingface/transformers/pull/38368 | 3,090,484,856 | PR_kwDOCUB6oc6XmcES | 38,368 | 2/2 More cleaning for the `LlamaModel` keeping only the core | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T08:52:55 | 2025-09-11T09:06:08 | 2025-09-11T09:06:04 | COLLABORATOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38368",
"html_url": "https://github.com/huggingface/transformers/pull/38368",
"diff_url": "https://github.com/huggingface/transformers/pull/38368.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38368.patch",
"merged_at": null
} | # What does this PR do?
We remove the ForXXX classes that are not core, moving them to a more generic/general place, as they never change for most models | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38368/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38368/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38367 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38367/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38367/comments | https://api.github.com/repos/huggingface/transformers/issues/38367/events | https://github.com/huggingface/transformers/pull/38367 | 3,090,481,922 | PR_kwDOCUB6oc6XmbcB | 38,367 | 1/2 Last step cleaning the `LlamaModel` codebase | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T08:51:34 | 2025-07-07T15:20:15 | 2025-07-07T15:20:15 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38367",
"html_url": "https://github.com/huggingface/transformers/pull/38367",
"diff_url": "https://github.com/huggingface/transformers/pull/38367.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38367.patch",
"merged_at": null
} | # What does this PR do?
Output attention and hidden states logic can be reduced to basically attaching hooks to the `GradientCheckpointing` layer.
Refactored `can_return_tuple` as it can be much simpler :) only the top level will receive `False`
This makes the code simpler and cleaner! | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38367/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38367/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38366 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38366/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38366/comments | https://api.github.com/repos/huggingface/transformers/issues/38366/events | https://github.com/huggingface/transformers/pull/38366 | 3,090,457,609 | PR_kwDOCUB6oc6XmWOh | 38,366 | Fix Qwen2.5-VL Video Processor | {
"login": "yeliudev",
"id": 22849092,
"node_id": "MDQ6VXNlcjIyODQ5MDky",
"avatar_url": "https://avatars.githubusercontent.com/u/22849092?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yeliudev",
"html_url": "https://github.com/yeliudev",
"followers_url": "https://api.github.com/users/yeliudev/followers",
"following_url": "https://api.github.com/users/yeliudev/following{/other_user}",
"gists_url": "https://api.github.com/users/yeliudev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yeliudev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yeliudev/subscriptions",
"organizations_url": "https://api.github.com/users/yeliudev/orgs",
"repos_url": "https://api.github.com/users/yeliudev/repos",
"events_url": "https://api.github.com/users/yeliudev/events{/privacy}",
"received_events_url": "https://api.github.com/users/yeliudev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T08:41:21 | 2025-05-27T11:46:37 | 2025-05-27T11:46:37 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38366",
"html_url": "https://github.com/huggingface/transformers/pull/38366",
"diff_url": "https://github.com/huggingface/transformers/pull/38366.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38366.patch",
"merged_at": "2025-05-27T11:46:37"
} | # What does this PR do?
This PR fixes a bug in Qwen2_5_VLProcessor, which would consistently log "Unused or unrecognized kwargs: fps, return_tensors." during training. This issue seems to have been introduced in #35206.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@zucchini-nlp
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38366/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38366/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38365 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38365/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38365/comments | https://api.github.com/repos/huggingface/transformers/issues/38365/events | https://github.com/huggingface/transformers/pull/38365 | 3,090,456,660 | PR_kwDOCUB6oc6XmWBt | 38,365 | [paligemma] fix processor with suffix | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-26T08:40:54 | 2025-05-27T09:31:56 | 2025-05-27T09:31:56 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38365",
"html_url": "https://github.com/huggingface/transformers/pull/38365",
"diff_url": "https://github.com/huggingface/transformers/pull/38365.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38365.patch",
"merged_at": "2025-05-27T09:31:56"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/38341 and adds a test for that
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38365/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38365/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38364 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38364/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38364/comments | https://api.github.com/repos/huggingface/transformers/issues/38364/events | https://github.com/huggingface/transformers/pull/38364 | 3,090,339,200 | PR_kwDOCUB6oc6Xl86S | 38,364 | Lag kv cache | {
"login": "JoelSeniorLiang",
"id": 5005500,
"node_id": "MDQ6VXNlcjUwMDU1MDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/5005500?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JoelSeniorLiang",
"html_url": "https://github.com/JoelSeniorLiang",
"followers_url": "https://api.github.com/users/JoelSeniorLiang/followers",
"following_url": "https://api.github.com/users/JoelSeniorLiang/following{/other_user}",
"gists_url": "https://api.github.com/users/JoelSeniorLiang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JoelSeniorLiang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JoelSeniorLiang/subscriptions",
"organizations_url": "https://api.github.com/users/JoelSeniorLiang/orgs",
"repos_url": "https://api.github.com/users/JoelSeniorLiang/repos",
"events_url": "https://api.github.com/users/JoelSeniorLiang/events{/privacy}",
"received_events_url": "https://api.github.com/users/JoelSeniorLiang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-26T07:46:48 | 2025-06-09T03:13:02 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38364",
"html_url": "https://github.com/huggingface/transformers/pull/38364",
"diff_url": "https://github.com/huggingface/transformers/pull/38364.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38364.patch",
"merged_at": null
} | # What does this PR do?
Add a KV cache compression feature implementing the LagKV algorithm.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
#38312
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38364/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38364/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38363 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38363/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38363/comments | https://api.github.com/repos/huggingface/transformers/issues/38363/events | https://github.com/huggingface/transformers/pull/38363 | 3,090,289,724 | PR_kwDOCUB6oc6XlyJt | 38,363 | Fix the seamless_m4t cannot work on Gaudi | {
"login": "yuanwu2017",
"id": 34643241,
"node_id": "MDQ6VXNlcjM0NjQzMjQx",
"avatar_url": "https://avatars.githubusercontent.com/u/34643241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuanwu2017",
"html_url": "https://github.com/yuanwu2017",
"followers_url": "https://api.github.com/users/yuanwu2017/followers",
"following_url": "https://api.github.com/users/yuanwu2017/following{/other_user}",
"gists_url": "https://api.github.com/users/yuanwu2017/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuanwu2017/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuanwu2017/subscriptions",
"organizations_url": "https://api.github.com/users/yuanwu2017/orgs",
"repos_url": "https://api.github.com/users/yuanwu2017/repos",
"events_url": "https://api.github.com/users/yuanwu2017/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuanwu2017/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T07:27:09 | 2025-06-30T06:52:14 | 2025-06-25T10:40:02 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38363",
"html_url": "https://github.com/huggingface/transformers/pull/38363",
"diff_url": "https://github.com/huggingface/transformers/pull/38363.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38363.patch",
"merged_at": "2025-06-25T10:40:02"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "IlyasMoutawwakil",
"id": 57442720,
"node_id": "MDQ6VXNlcjU3NDQyNzIw",
"avatar_url": "https://avatars.githubusercontent.com/u/57442720?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/IlyasMoutawwakil",
"html_url": "https://github.com/IlyasMoutawwakil",
"followers_url": "https://api.github.com/users/IlyasMoutawwakil/followers",
"following_url": "https://api.github.com/users/IlyasMoutawwakil/following{/other_user}",
"gists_url": "https://api.github.com/users/IlyasMoutawwakil/gists{/gist_id}",
"starred_url": "https://api.github.com/users/IlyasMoutawwakil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/IlyasMoutawwakil/subscriptions",
"organizations_url": "https://api.github.com/users/IlyasMoutawwakil/orgs",
"repos_url": "https://api.github.com/users/IlyasMoutawwakil/repos",
"events_url": "https://api.github.com/users/IlyasMoutawwakil/events{/privacy}",
"received_events_url": "https://api.github.com/users/IlyasMoutawwakil/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38363/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38363/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38362 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38362/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38362/comments | https://api.github.com/repos/huggingface/transformers/issues/38362/events | https://github.com/huggingface/transformers/issues/38362 | 3,090,276,599 | I_kwDOCUB6oc64MeD3 | 38,362 | Bug:ImportError: cannot import name 'TransformGetItemToIndex' from 'torch._dynamo._trace_wrapped_higher_order_op' (/home/tiger/.local/lib/python3.10/site-packages/torch/_dynamo/_trace_wrapped_higher_order_op.py) | {
"login": "hswei88",
"id": 129183149,
"node_id": "U_kgDOB7MtrQ",
"avatar_url": "https://avatars.githubusercontent.com/u/129183149?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hswei88",
"html_url": "https://github.com/hswei88",
"followers_url": "https://api.github.com/users/hswei88/followers",
"following_url": "https://api.github.com/users/hswei88/following{/other_user}",
"gists_url": "https://api.github.com/users/hswei88/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hswei88/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hswei88/subscriptions",
"organizations_url": "https://api.github.com/users/hswei88/orgs",
"repos_url": "https://api.github.com/users/hswei88/repos",
"events_url": "https://api.github.com/users/hswei88/events{/privacy}",
"received_events_url": "https://api.github.com/users/hswei88/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-26T07:21:29 | 2025-05-26T12:03:00 | 2025-05-26T12:03:00 | NONE | null | null | null | null | ### System Info
transformers:4.53.0
torch:2.5.1+cpu
### Who can help?
@FightingZhen
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
I met an import error when using Ascend.
**Here is the minimal reproducible code.**
```python
if is_torch_flex_attn_available():
    from torch._dynamo._trace_wrapped_higher_order_op import TransformGetItemToIndex
    from torch.nn.attention.flex_attention import BlockMask, create_block_mask
```
**Exception log**
```
File "/home/tiger/new_verl/transformers/src/transformers/masking_utils.py", line 29, in <module>
    from torch._dynamo._trace_wrapped_higher_order_op import TransformGetItemToIndex
ImportError: cannot import name 'TransformGetItemToIndex' from 'torch._dynamo._trace_wrapped_higher_order_op' (/home/tiger/.local/lib/python3.10/site-packages/torch/_dynamo/_trace_wrapped_higher_order_op.py)
```
### Expected behavior
working | {
"login": "hswei88",
"id": 129183149,
"node_id": "U_kgDOB7MtrQ",
"avatar_url": "https://avatars.githubusercontent.com/u/129183149?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hswei88",
"html_url": "https://github.com/hswei88",
"followers_url": "https://api.github.com/users/hswei88/followers",
"following_url": "https://api.github.com/users/hswei88/following{/other_user}",
"gists_url": "https://api.github.com/users/hswei88/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hswei88/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hswei88/subscriptions",
"organizations_url": "https://api.github.com/users/hswei88/orgs",
"repos_url": "https://api.github.com/users/hswei88/repos",
"events_url": "https://api.github.com/users/hswei88/events{/privacy}",
"received_events_url": "https://api.github.com/users/hswei88/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38362/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38362/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38361 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38361/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38361/comments | https://api.github.com/repos/huggingface/transformers/issues/38361/events | https://github.com/huggingface/transformers/pull/38361 | 3,090,200,694 | PR_kwDOCUB6oc6XlfAX | 38,361 | test | {
"login": "zRzRzRzRzRzRzR",
"id": 93239683,
"node_id": "U_kgDOBY65gw",
"avatar_url": "https://avatars.githubusercontent.com/u/93239683?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zRzRzRzRzRzRzR",
"html_url": "https://github.com/zRzRzRzRzRzRzR",
"followers_url": "https://api.github.com/users/zRzRzRzRzRzRzR/followers",
"following_url": "https://api.github.com/users/zRzRzRzRzRzRzR/following{/other_user}",
"gists_url": "https://api.github.com/users/zRzRzRzRzRzRzR/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zRzRzRzRzRzRzR/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zRzRzRzRzRzRzR/subscriptions",
"organizations_url": "https://api.github.com/users/zRzRzRzRzRzRzR/orgs",
"repos_url": "https://api.github.com/users/zRzRzRzRzRzRzR/repos",
"events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/events{/privacy}",
"received_events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T06:44:20 | 2025-05-26T06:44:26 | 2025-05-26T06:44:26 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38361",
"html_url": "https://github.com/huggingface/transformers/pull/38361",
"diff_url": "https://github.com/huggingface/transformers/pull/38361.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38361.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "zRzRzRzRzRzRzR",
"id": 93239683,
"node_id": "U_kgDOBY65gw",
"avatar_url": "https://avatars.githubusercontent.com/u/93239683?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zRzRzRzRzRzRzR",
"html_url": "https://github.com/zRzRzRzRzRzRzR",
"followers_url": "https://api.github.com/users/zRzRzRzRzRzRzR/followers",
"following_url": "https://api.github.com/users/zRzRzRzRzRzRzR/following{/other_user}",
"gists_url": "https://api.github.com/users/zRzRzRzRzRzRzR/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zRzRzRzRzRzRzR/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zRzRzRzRzRzRzR/subscriptions",
"organizations_url": "https://api.github.com/users/zRzRzRzRzRzRzR/orgs",
"repos_url": "https://api.github.com/users/zRzRzRzRzRzRzR/repos",
"events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/events{/privacy}",
"received_events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38361/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38361/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38360 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38360/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38360/comments | https://api.github.com/repos/huggingface/transformers/issues/38360/events | https://github.com/huggingface/transformers/issues/38360 | 3,090,008,678 | I_kwDOCUB6oc64Lcpm | 38,360 | Bug: BlockMask type not defined in masking_utils.py | {
"login": "yijun-lee",
"id": 119404328,
"node_id": "U_kgDOBx33KA",
"avatar_url": "https://avatars.githubusercontent.com/u/119404328?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yijun-lee",
"html_url": "https://github.com/yijun-lee",
"followers_url": "https://api.github.com/users/yijun-lee/followers",
"following_url": "https://api.github.com/users/yijun-lee/following{/other_user}",
"gists_url": "https://api.github.com/users/yijun-lee/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yijun-lee/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yijun-lee/subscriptions",
"organizations_url": "https://api.github.com/users/yijun-lee/orgs",
"repos_url": "https://api.github.com/users/yijun-lee/repos",
"events_url": "https://api.github.com/users/yijun-lee/events{/privacy}",
"received_events_url": "https://api.github.com/users/yijun-lee/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-26T05:01:17 | 2025-05-26T11:55:01 | 2025-05-26T11:55:01 | CONTRIBUTOR | null | null | null | null | ### System Info
I tried to run `transformers env` but encountered an error. However, I was able to gather the following system information:
- `transformers` version: 4.53.0.dev0
- Platform: Linux-5.15.0-60-generic-x86_64-with-glibc2.31
- Python version: 3.10.12
- PyTorch version (GPU?): 2.3.1+cu121 (Yes)
- CUDA version: 12.2
- GPU: 2x NVIDIA A100-SXM4-80GB
### Who can help?
@ArthurZucker
@Cyrilvallez
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
### Error Case 1: ImportError due to missing BlockMask type definition
```python
from transformers import AutoTokenizer
gguf_model_id = 'unsloth/Llama-4-Scout-17B-16E-Instruct-GGUF'
gguf_file = 'Llama-4-Scout-17B-16E-Instruct-UD-IQ2_XXS.gguf'
gguf_tokenizer = AutoTokenizer.from_pretrained(gguf_model_id)
original_model_id = 'meta-llama/Llama-4-Scout-17B-16E-Instruct'
original_tokenizer = AutoTokenizer.from_pretrained(original_model_id)
input_text = "Hello World!"
gguf_token = gguf_tokenizer(input_text, return_tensors="pt")
original_token = original_tokenizer(input_text, return_tensors="pt")
```
Error message:
```bash
Traceback (most recent call last):
File "/root/transformers/tokenizer.py", line 1, in <module>
from transformers import AutoTokenizer
File "<frozen importlib.bootstrap>", line 1075, in _handle_fromlist
File "/root/transformers/src/transformers/utils/import_utils.py", line 2045, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/root/transformers/src/transformers/utils/import_utils.py", line 2075, in _get_module
raise e
File "/root/transformers/src/transformers/utils/import_utils.py", line 2073, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/root/transformers/src/transformers/models/auto/tokenization_auto.py", line 38, in <module>
from .auto_factory import LazyAutoMapping
File "/root/transformers/src/transformers/models/auto/auto_factory.py", line 40, in <module>
from ...generation import GenerationMixin
File "<frozen importlib.bootstrap>", line 1075, in _handle_fromlist
File "/root/transformers/src/transformers/utils/import_utils.py", line 2045, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/root/transformers/src/transformers/utils/import_utils.py", line 2075, in _get_module
raise e
File "/root/transformers/src/transformers/utils/import_utils.py", line 2073, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/root/transformers/src/transformers/generation/utils.py", line 49, in <module>
from ..masking_utils import create_masks_for_generate
File "/root/transformers/src/transformers/masking_utils.py", line 585, in <module>
attention_mask: Optional[Union[torch.Tensor, BlockMask]],
NameError: name 'BlockMask' is not defined
```
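The `NameError` above comes from `BlockMask` appearing in a type annotation without having been imported (it only exists in recent torch builds with flex attention). A minimal sketch of the guard-plus-string-annotation pattern that avoids the crash — the torch import path is an assumption, and the names are illustrative rather than the actual `masking_utils.py` code:

```python
from typing import Optional, Union

try:
    # BlockMask normally comes from torch's flex-attention module; this
    # import path is an assumption, and the fallback keeps the module
    # importable when the symbol is unavailable.
    from torch.nn.attention.flex_attention import BlockMask
except Exception:
    BlockMask = None  # sentinel so that importing this module never fails

def normalize_mask(attention_mask: "Optional[Union[object, BlockMask]]" = None):
    # The annotation is a string, so it is never evaluated at import time
    # and an undefined BlockMask can no longer raise NameError.
    return attention_mask

print(normalize_mask())  # None
```

With either pattern (guarded import or string annotations), `from transformers import AutoTokenizer` would succeed even on torch versions that lack `BlockMask`.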
### Error Case 2: transformers env command failure due to same import error
```bash
transformers env
```
Error message:
```bash
Traceback (most recent call last):
File "/usr/local/bin/transformers", line 5, in <module>
from transformers.commands.transformers_cli import main
File "/root/transformers/src/transformers/commands/transformers_cli.py", line 20, in <module>
from transformers.commands.chat import ChatCommand
File "/root/transformers/src/transformers/commands/chat.py", line 45, in <module>
from transformers import (
File "<frozen importlib.bootstrap>", line 1075, in _handle_fromlist
File "/root/transformers/src/transformers/utils/import_utils.py", line 2045, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/root/transformers/src/transformers/utils/import_utils.py", line 2075, in _get_module
raise e
File "/root/transformers/src/transformers/utils/import_utils.py", line 2073, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/root/transformers/src/transformers/models/auto/modeling_auto.py", line 21, in <module>
from .auto_factory import (
File "/root/transformers/src/transformers/models/auto/auto_factory.py", line 40, in <module>
from ...generation import GenerationMixin
File "<frozen importlib.bootstrap>", line 1075, in _handle_fromlist
File "/root/transformers/src/transformers/utils/import_utils.py", line 2045, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/root/transformers/src/transformers/utils/import_utils.py", line 2075, in _get_module
raise e
File "/root/transformers/src/transformers/utils/import_utils.py", line 2073, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/root/transformers/src/transformers/generation/utils.py", line 49, in <module>
from ..masking_utils import create_masks_for_generate
File "/root/transformers/src/transformers/masking_utils.py", line 585, in <module>
attention_mask: Optional[Union[torch.Tensor, BlockMask]],
NameError: name 'BlockMask' is not defined
```
### Expected behavior
### For Error Case 1:
This code is a test case for another issue; even if the tokenizer itself fails for other reasons, importing `AutoTokenizer` should not raise a `NameError` for `BlockMask`. The library should at least be able to import its modules cleanly before any tokenizer-specific failure occurs.
### For Error Case 2:
The `transformers env` command should successfully run and display the environment information in a formatted way.
| {
"login": "yijun-lee",
"id": 119404328,
"node_id": "U_kgDOBx33KA",
"avatar_url": "https://avatars.githubusercontent.com/u/119404328?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yijun-lee",
"html_url": "https://github.com/yijun-lee",
"followers_url": "https://api.github.com/users/yijun-lee/followers",
"following_url": "https://api.github.com/users/yijun-lee/following{/other_user}",
"gists_url": "https://api.github.com/users/yijun-lee/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yijun-lee/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yijun-lee/subscriptions",
"organizations_url": "https://api.github.com/users/yijun-lee/orgs",
"repos_url": "https://api.github.com/users/yijun-lee/repos",
"events_url": "https://api.github.com/users/yijun-lee/events{/privacy}",
"received_events_url": "https://api.github.com/users/yijun-lee/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38360/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38360/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38359 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38359/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38359/comments | https://api.github.com/repos/huggingface/transformers/issues/38359/events | https://github.com/huggingface/transformers/pull/38359 | 3,090,003,499 | PR_kwDOCUB6oc6Xk0NG | 38,359 | V.4.51.3 | {
"login": "Remorax",
"id": 26062692,
"node_id": "MDQ6VXNlcjI2MDYyNjky",
"avatar_url": "https://avatars.githubusercontent.com/u/26062692?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Remorax",
"html_url": "https://github.com/Remorax",
"followers_url": "https://api.github.com/users/Remorax/followers",
"following_url": "https://api.github.com/users/Remorax/following{/other_user}",
"gists_url": "https://api.github.com/users/Remorax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Remorax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Remorax/subscriptions",
"organizations_url": "https://api.github.com/users/Remorax/orgs",
"repos_url": "https://api.github.com/users/Remorax/repos",
"events_url": "https://api.github.com/users/Remorax/events{/privacy}",
"received_events_url": "https://api.github.com/users/Remorax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T04:57:36 | 2025-05-26T04:58:42 | 2025-05-26T04:58:42 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38359",
"html_url": "https://github.com/huggingface/transformers/pull/38359",
"diff_url": "https://github.com/huggingface/transformers/pull/38359.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38359.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Remorax",
"id": 26062692,
"node_id": "MDQ6VXNlcjI2MDYyNjky",
"avatar_url": "https://avatars.githubusercontent.com/u/26062692?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Remorax",
"html_url": "https://github.com/Remorax",
"followers_url": "https://api.github.com/users/Remorax/followers",
"following_url": "https://api.github.com/users/Remorax/following{/other_user}",
"gists_url": "https://api.github.com/users/Remorax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Remorax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Remorax/subscriptions",
"organizations_url": "https://api.github.com/users/Remorax/orgs",
"repos_url": "https://api.github.com/users/Remorax/repos",
"events_url": "https://api.github.com/users/Remorax/events{/privacy}",
"received_events_url": "https://api.github.com/users/Remorax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38359/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38359/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38358 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38358/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38358/comments | https://api.github.com/repos/huggingface/transformers/issues/38358/events | https://github.com/huggingface/transformers/issues/38358 | 3,089,954,410 | I_kwDOCUB6oc64LPZq | 38,358 | Invalid attribute access in `PreTrainedModel.initialize_weights` | {
"login": "DarkLight1337",
"id": 44970335,
"node_id": "MDQ6VXNlcjQ0OTcwMzM1",
"avatar_url": "https://avatars.githubusercontent.com/u/44970335?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DarkLight1337",
"html_url": "https://github.com/DarkLight1337",
"followers_url": "https://api.github.com/users/DarkLight1337/followers",
"following_url": "https://api.github.com/users/DarkLight1337/following{/other_user}",
"gists_url": "https://api.github.com/users/DarkLight1337/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DarkLight1337/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DarkLight1337/subscriptions",
"organizations_url": "https://api.github.com/users/DarkLight1337/orgs",
"repos_url": "https://api.github.com/users/DarkLight1337/repos",
"events_url": "https://api.github.com/users/DarkLight1337/events{/privacy}",
"received_events_url": "https://api.github.com/users/DarkLight1337/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-26T04:22:31 | 2025-05-27T09:00:42 | 2025-05-27T09:00:42 | NONE | null | null | null | null | ### System Info
transformers 4.52
### Who can help?
@ArthurZucker @hmellor
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
https://github.com/huggingface/transformers/blob/6e3063422c4b1c014aa60c32b9254fd2902f0f28/src/transformers/modeling_utils.py#L2660-L2661
This function fails if child modules define `_init_weights` instead of `_initialize_weights`. I guess this is the result of a rename operation.
For example, this occurs in the [`Resampler` module](https://huggingface.co/Qwen/Qwen-VL/blob/main/visual.py) of https://huggingface.co/Qwen/Qwen-VL
Discovered in https://github.com/vllm-project/vllm/pull/18678
Error log: https://buildkite.com/vllm/fastcheck/builds/25105/steps?sid=01970814-a020-4f10-9c0c-c37abbed6073
### Expected behavior
This function should check the `_initialize_weights` attribute, not `_init_weights`. Optionally, emit a warning if the model uses the old method name `_init_weights`. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38358/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38358/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38357 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38357/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38357/comments | https://api.github.com/repos/huggingface/transformers/issues/38357/events | https://github.com/huggingface/transformers/pull/38357 | 3,089,765,219 | PR_kwDOCUB6oc6XkBfq | 38,357 | fix typo: `tokenizer` -> `tokenize` | {
"login": "foldl",
"id": 4046440,
"node_id": "MDQ6VXNlcjQwNDY0NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/4046440?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/foldl",
"html_url": "https://github.com/foldl",
"followers_url": "https://api.github.com/users/foldl/followers",
"following_url": "https://api.github.com/users/foldl/following{/other_user}",
"gists_url": "https://api.github.com/users/foldl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/foldl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/foldl/subscriptions",
"organizations_url": "https://api.github.com/users/foldl/orgs",
"repos_url": "https://api.github.com/users/foldl/repos",
"events_url": "https://api.github.com/users/foldl/events{/privacy}",
"received_events_url": "https://api.github.com/users/foldl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T01:41:49 | 2025-05-26T15:29:47 | 2025-05-26T15:29:17 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38357",
"html_url": "https://github.com/huggingface/transformers/pull/38357",
"diff_url": "https://github.com/huggingface/transformers/pull/38357.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38357.patch",
"merged_at": "2025-05-26T15:29:16"
} | # What does this PR do?
This PR fixes a small but annoying typo.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38357/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38357/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38356 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38356/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38356/comments | https://api.github.com/repos/huggingface/transformers/issues/38356/events | https://github.com/huggingface/transformers/pull/38356 | 3,089,684,779 | PR_kwDOCUB6oc6Xjwmd | 38,356 | Fix missing imports in `transformers add-new-model-like` CLI command | {
"login": "aminejebbar",
"id": 183459962,
"node_id": "U_kgDOCu9geg",
"avatar_url": "https://avatars.githubusercontent.com/u/183459962?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aminejebbar",
"html_url": "https://github.com/aminejebbar",
"followers_url": "https://api.github.com/users/aminejebbar/followers",
"following_url": "https://api.github.com/users/aminejebbar/following{/other_user}",
"gists_url": "https://api.github.com/users/aminejebbar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aminejebbar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aminejebbar/subscriptions",
"organizations_url": "https://api.github.com/users/aminejebbar/orgs",
"repos_url": "https://api.github.com/users/aminejebbar/repos",
"events_url": "https://api.github.com/users/aminejebbar/events{/privacy}",
"received_events_url": "https://api.github.com/users/aminejebbar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-26T00:19:54 | 2025-05-26T21:41:36 | 2025-05-26T21:39:42 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38356",
"html_url": "https://github.com/huggingface/transformers/pull/38356",
"diff_url": "https://github.com/huggingface/transformers/pull/38356.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38356.patch",
"merged_at": null
} | While working on adding support for a custom model using the `transformers add-new-model-like` CLI command, I encountered an error due to missing imports in `src/transformers/commands/chat.py`. This caused the command to fail because essential components such as `TextIteratorStreamer`, `GenerationConfig`, `AutoTokenizer`, and `AutoModelForCausalLM` were not imported.
This PR fixes the issue by adding the necessary import statements, ensuring the command works correctly when adding new models. | {
"login": "aminejebbar",
"id": 183459962,
"node_id": "U_kgDOCu9geg",
"avatar_url": "https://avatars.githubusercontent.com/u/183459962?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aminejebbar",
"html_url": "https://github.com/aminejebbar",
"followers_url": "https://api.github.com/users/aminejebbar/followers",
"following_url": "https://api.github.com/users/aminejebbar/following{/other_user}",
"gists_url": "https://api.github.com/users/aminejebbar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aminejebbar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aminejebbar/subscriptions",
"organizations_url": "https://api.github.com/users/aminejebbar/orgs",
"repos_url": "https://api.github.com/users/aminejebbar/repos",
"events_url": "https://api.github.com/users/aminejebbar/events{/privacy}",
"received_events_url": "https://api.github.com/users/aminejebbar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38356/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38356/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38355 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38355/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38355/comments | https://api.github.com/repos/huggingface/transformers/issues/38355/events | https://github.com/huggingface/transformers/pull/38355 | 3,089,626,618 | PR_kwDOCUB6oc6XjlKx | 38,355 | enable large_gpu and torchao cases on XPU | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-25T22:48:34 | 2025-05-28T22:26:27 | 2025-05-28T08:30:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38355",
"html_url": "https://github.com/huggingface/transformers/pull/38355",
"diff_url": "https://github.com/huggingface/transformers/pull/38355.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38355.patch",
"merged_at": "2025-05-28T08:30:16"
} | 1. enable `cohere2` integration test cases on XPU
2. enable `torchao` test cases on XPU
@ydshieh @IlyasMoutawwakil , pls help review, thx. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38355/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38355/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38354 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38354/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38354/comments | https://api.github.com/repos/huggingface/transformers/issues/38354/events | https://github.com/huggingface/transformers/pull/38354 | 3,089,454,245 | PR_kwDOCUB6oc6XjDo8 | 38,354 | Uninstall `kernels` for AMD docker images | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-25T17:33:14 | 2025-05-25T17:46:19 | 2025-05-25T17:42:25 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38354",
"html_url": "https://github.com/huggingface/transformers/pull/38354",
"diff_url": "https://github.com/huggingface/transformers/pull/38354.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38354.patch",
"merged_at": "2025-05-25T17:42:25"
} | # What does this PR do?
Talked to @ivarflakstad offline; this caused AMD CI to fail (there were 6077 failures ...).
I will merge directly. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38354/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38354/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38353 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38353/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38353/comments | https://api.github.com/repos/huggingface/transformers/issues/38353/events | https://github.com/huggingface/transformers/issues/38353 | 3,089,399,948 | I_kwDOCUB6oc64JICM | 38,353 | please support java and scala | {
"login": "mullerhai",
"id": 6143404,
"node_id": "MDQ6VXNlcjYxNDM0MDQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/6143404?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mullerhai",
"html_url": "https://github.com/mullerhai",
"followers_url": "https://api.github.com/users/mullerhai/followers",
"following_url": "https://api.github.com/users/mullerhai/following{/other_user}",
"gists_url": "https://api.github.com/users/mullerhai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mullerhai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mullerhai/subscriptions",
"organizations_url": "https://api.github.com/users/mullerhai/orgs",
"repos_url": "https://api.github.com/users/mullerhai/repos",
"events_url": "https://api.github.com/users/mullerhai/events{/privacy}",
"received_events_url": "https://api.github.com/users/mullerhai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | closed | false | null | [] | null | [] | 2025-05-25T16:01:22 | 2025-05-26T15:14:48 | 2025-05-26T15:14:47 | NONE | null | null | null | null | ### Feature request
Java and Scala now both have deep learning frameworks such as DJL, javacpp-pytorch, and storch easyAi. We need to use transformers on the JVM platform. Transformers can currently be used from Python, JS, and Rust; please support Java and Scala as well.
### Motivation
can not wait
### Your contribution
please see the storch-transformers Git repository | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38353/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38353/timeline | null | not_planned | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38352 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38352/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38352/comments | https://api.github.com/repos/huggingface/transformers/issues/38352/events | https://github.com/huggingface/transformers/issues/38352 | 3,089,346,782 | I_kwDOCUB6oc64I7De | 38,352 | `video_utils.group_videos_by_shape` does not consider video length | {
"login": "DarkLight1337",
"id": 44970335,
"node_id": "MDQ6VXNlcjQ0OTcwMzM1",
"avatar_url": "https://avatars.githubusercontent.com/u/44970335?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DarkLight1337",
"html_url": "https://github.com/DarkLight1337",
"followers_url": "https://api.github.com/users/DarkLight1337/followers",
"following_url": "https://api.github.com/users/DarkLight1337/following{/other_user}",
"gists_url": "https://api.github.com/users/DarkLight1337/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DarkLight1337/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DarkLight1337/subscriptions",
"organizations_url": "https://api.github.com/users/DarkLight1337/orgs",
"repos_url": "https://api.github.com/users/DarkLight1337/repos",
"events_url": "https://api.github.com/users/DarkLight1337/events{/privacy}",
"received_events_url": "https://api.github.com/users/DarkLight1337/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-25T14:41:51 | 2025-05-27T09:32:34 | 2025-05-27T09:32:34 | NONE | null | null | null | null | ### System Info
transformers 4.52.3
### Who can help?
@zucchini-nlp @hmellor
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
The utility function `transformers.video_utils.group_videos_by_shape` fails to handle videos with the same image shape but varying length.
Example:
```py
import torch
from transformers.video_utils import group_videos_by_shape
video_1 = torch.zeros((4, 3, 336, 336))
video_2 = torch.zeros((5, 3, 336, 336))
grouped_videos, grouped_videos_index = group_videos_by_shape([video_1, video_2])
```
Discovered in https://github.com/vllm-project/vllm/pull/18678
Error log: https://buildkite.com/vllm/fastcheck/builds/25100/steps?jid=0197076f-fbbf-45c4-968f-6d6f154f4af9
### Expected behavior
The videos should be grouped by the full shape, not just `shape[-2::]` | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38352/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/38352/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38350 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38350/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38350/comments | https://api.github.com/repos/huggingface/transformers/issues/38350/events | https://github.com/huggingface/transformers/issues/38350 | 3,089,249,502 | I_kwDOCUB6oc64IjTe | 38,350 | Unable to pass images to Aya Vision processor | {
"login": "DarkLight1337",
"id": 44970335,
"node_id": "MDQ6VXNlcjQ0OTcwMzM1",
"avatar_url": "https://avatars.githubusercontent.com/u/44970335?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DarkLight1337",
"html_url": "https://github.com/DarkLight1337",
"followers_url": "https://api.github.com/users/DarkLight1337/followers",
"following_url": "https://api.github.com/users/DarkLight1337/following{/other_user}",
"gists_url": "https://api.github.com/users/DarkLight1337/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DarkLight1337/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DarkLight1337/subscriptions",
"organizations_url": "https://api.github.com/users/DarkLight1337/orgs",
"repos_url": "https://api.github.com/users/DarkLight1337/repos",
"events_url": "https://api.github.com/users/DarkLight1337/events{/privacy}",
"received_events_url": "https://api.github.com/users/DarkLight1337/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-25T11:57:30 | 2025-05-27T09:43:55 | 2025-05-27T09:43:55 | NONE | null | null | null | null | ### System Info
transformers 4.52
### Who can help?
@zucchini-nlp @hmellor
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
In the new version of transformers, [`_check_special_mm_tokens` is being called inside `AyaVisionProcessor`](https://github.com/huggingface/transformers/blob/main/src/transformers/models/aya_vision/processing_aya_vision.py#L230). However, [`_check_special_mm_tokens` assumes that the image placeholder `<image>` can be represented as a single token](https://github.com/huggingface/transformers/blob/main/src/transformers/processing_utils.py#L1661). This is not the case for [Aya Vision 8B](https://huggingface.co/CohereLabs/aya-vision-8b) which encodes `<image>` into `[35, 6504, 37]`. As a result, the validation always fails whenever an image is passed.
I discovered this issue when attempting to update transformers version in vLLM: https://github.com/vllm-project/vllm/pull/18678
Error log: ~~https://buildkite.com/vllm/fastcheck/builds/25098/steps?sid=019706c6-1a33-4922-9358-d72dfc525fe2~~ https://buildkite.com/vllm/fastcheck/builds/25098/steps?sid=019706c6-1a35-46ac-aa2b-8d6d811109fd
### Expected behavior
`_check_special_mm_tokens` should handle the case where the modality text takes up multiple tokens. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38350/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38350/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38349 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38349/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38349/comments | https://api.github.com/repos/huggingface/transformers/issues/38349/events | https://github.com/huggingface/transformers/pull/38349 | 3,089,148,899 | PR_kwDOCUB6oc6XiH3X | 38,349 | Hot fix for AMD CI workflow | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-25T08:48:46 | 2025-05-25T09:15:33 | 2025-05-25T09:15:31 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38349",
"html_url": "https://github.com/huggingface/transformers/pull/38349",
"diff_url": "https://github.com/huggingface/transformers/pull/38349.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38349.patch",
"merged_at": "2025-05-25T09:15:31"
} | # What does this PR do?
I made a mistake in #38298, and the Slack reports are currently broken for AMD due to a dict key error.
We should access `event_payload["workflow_run"]["event"]` instead where `event_payload` looks like
```python
{
"action": "completed",
"workflow": {
"created_at": "2024-04-23T08:01:31.000Z",
"html_url": "https://github.com/ydshieh/transformers/blob/main/.github/workflows/self-scheduled-amd-caller.yml",
"id": 95041645,
"name": "Self-hosted runner (AMD scheduled CI caller)",
"node_id": "W_kwDOEW1XxM4Fqjht",
"path": ".github/workflows/self-scheduled-amd-caller.yml",
"state": "active",
"updated_at": "2025-05-23T12:11:50.000Z",
"url": "https://api.github.com/repos/ydshieh/transformers/actions/workflows/95041645"
},
"workflow_run": {
"actor": {
"html_url": "https://github.com/ydshieh",
"id": 2521628,
"login": "ydshieh",
},
"artifacts_url": "https://api.github.com/repos/ydshieh/transformers/actions/runs/15211635268/artifacts",
"cancel_url": "https://api.github.com/repos/ydshieh/transformers/actions/runs/15211635268/cancel",
"conclusion": "success",
"created_at": "2025-05-23T13:37:05Z",
"display_title": "Self-hosted runner (AMD scheduled CI caller)",
"event": "schedule",
"head_branch": "main",
"head_commit": {
"author": {
"email": "2521628+ydshieh@users.noreply.github.com",
"name": "Yih-Dar"
}
},
"updated_at": "2025-05-23T13:37:12Z",
"url": "https://api.github.com/repos/ydshieh/transformers/actions/runs/15211635268",
"workflow_id": 95041645,
"workflow_url": "https://api.github.com/repos/ydshieh/transformers/actions/workflows/95041645"
}
}
``` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38349/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38349/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38348 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38348/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38348/comments | https://api.github.com/repos/huggingface/transformers/issues/38348/events | https://github.com/huggingface/transformers/issues/38348 | 3,089,055,604 | I_kwDOCUB6oc64Hz90 | 38,348 | Incorrect keypoint batch handling inside SuperGlueForKeypointMatching | {
"login": "i44p",
"id": 125213001,
"node_id": "U_kgDOB3aZSQ",
"avatar_url": "https://avatars.githubusercontent.com/u/125213001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/i44p",
"html_url": "https://github.com/i44p",
"followers_url": "https://api.github.com/users/i44p/followers",
"following_url": "https://api.github.com/users/i44p/following{/other_user}",
"gists_url": "https://api.github.com/users/i44p/gists{/gist_id}",
"starred_url": "https://api.github.com/users/i44p/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/i44p/subscriptions",
"organizations_url": "https://api.github.com/users/i44p/orgs",
"repos_url": "https://api.github.com/users/i44p/repos",
"events_url": "https://api.github.com/users/i44p/events{/privacy}",
"received_events_url": "https://api.github.com/users/i44p/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-25T05:30:55 | 2025-07-01T12:14:45 | 2025-07-01T12:14:45 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: Linux-6.14.6-arch1-1-x86_64-with-glibc2.41
- Python version: 3.12.10
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: not installed
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
### Who can help?
@qubvel @sbucaille
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. Install `pytorch`, `pillow`, and `transformers==4.51.3` using either pip or pixi.
2. Run the following script:
```py
import torch
from transformers import AutoImageProcessor, AutoModel
from PIL import Image
import requests
class Test:
def __init__(self):
self.processor = AutoImageProcessor.from_pretrained("magic-leap-community/superglue_outdoor")
self.model = AutoModel.from_pretrained("magic-leap-community/superglue_outdoor")
@torch.inference_mode()
def get_keypoints(
self,
series1: list[Image.Image],
series2: list[Image.Image]
):
images = []
for s1, s2 in zip(series1, series2):
images.append([s1, s2])
processor_inputs = self.processor(images, return_tensors="pt")
outputs = self.model(**processor_inputs)
image_sizes = [[(s1.height, s1.width), (s2.height, s2.width)]
for s1, s2 in zip(series1, series2)]
processed_outputs = self.processor.post_process_keypoint_matching(
outputs, image_sizes
)
return processed_outputs
url_image1 = "https://raw.githubusercontent.com/magicleap/SuperGluePretrainedNetwork/refs/heads/master/assets/phototourism_sample_images/united_states_capitol_98169888_3347710852.jpg"
image1 = Image.open(requests.get(url_image1, stream=True).raw)
url_image2 = "https://raw.githubusercontent.com/magicleap/SuperGluePretrainedNetwork/refs/heads/master/assets/phototourism_sample_images/united_states_capitol_26757027_6717084061.jpg"
image2 = Image.open(requests.get(url_image2, stream=True).raw)
test = Test()
kps = test.get_keypoints((image1, image1), (image2, image2))
assert torch.equal(kps[0]['keypoints0'], kps[1]['keypoints0'])
print("Assertion succeeded!")
```
### Expected behavior
The script executes successfully, `get_keypoints` returns two identical arrays, and the assertion succeeds.
I tried to use `SuperGlueForKeypointMatching` (added in #29886) for batch inference, but I found that while it works well with single images, it fails on batched inputs. I believe this is caused by incorrect concatenation inside `SuperGlueForKeypointMatching._match_image_pair`:
https://github.com/huggingface/transformers/blob/d0c9c66d1c09df3cd70bf036e813d88337b20d4c/src/transformers/models/superglue/modeling_superglue.py#L726-L727
Changing this seemingly fixed the issue for me.
```py
matches = torch.cat([matches0, matches1], dim=1).reshape(batch_size, 2, -1)
matching_scores = torch.cat([matching_scores0, matching_scores1], dim=1).reshape(batch_size, 2, -1)
```
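A minimal sketch of why this matters (plain Python lists standing in for the tensors, with hypothetical values): concatenating along the batch dimension and then reshaping interleaves matches from *different* batch elements, while concatenating along `dim=1` keeps each image pair's matches together.

```python
batch_size = 2
matches0 = [["a0"], ["b0"]]  # matches for the first image of each pair
matches1 = [["a1"], ["b1"]]  # matches for the second image of each pair

# dim=0-style concatenation followed by a (batch_size, 2, -1) reshape:
flat = matches0 + matches1  # [['a0'], ['b0'], ['a1'], ['b1']]
wrong = [flat[i * 2:(i + 1) * 2] for i in range(batch_size)]
# batch element 0 now holds the first-image matches of *both* pairs -> mis-paired

# dim=1-style concatenation keeps each pair's two images together:
right = [m0 + m1 for m0, m1 in zip(matches0, matches1)]

print(wrong)  # [[['a0'], ['b0']], [['a1'], ['b1']]]
print(right)  # [['a0', 'a1'], ['b0', 'b1']]
```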
| {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38348/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38348/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38347 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38347/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38347/comments | https://api.github.com/repos/huggingface/transformers/issues/38347/events | https://github.com/huggingface/transformers/issues/38347 | 3,089,040,935 | I_kwDOCUB6oc64HwYn | 38,347 | Significant WER Increase with Whisper Chunking Compared to Long-Form Transcription | {
"login": "Parvezkhan0",
"id": 137421089,
"node_id": "U_kgDOCDDhIQ",
"avatar_url": "https://avatars.githubusercontent.com/u/137421089?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Parvezkhan0",
"html_url": "https://github.com/Parvezkhan0",
"followers_url": "https://api.github.com/users/Parvezkhan0/followers",
"following_url": "https://api.github.com/users/Parvezkhan0/following{/other_user}",
"gists_url": "https://api.github.com/users/Parvezkhan0/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Parvezkhan0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Parvezkhan0/subscriptions",
"organizations_url": "https://api.github.com/users/Parvezkhan0/orgs",
"repos_url": "https://api.github.com/users/Parvezkhan0/repos",
"events_url": "https://api.github.com/users/Parvezkhan0/events{/privacy}",
"received_events_url": "https://api.github.com/users/Parvezkhan0/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-25T04:52:47 | 2025-07-30T08:03:19 | 2025-07-30T08:03:19 | NONE | null | null | null | null | ### System Info
I conducted several experiments using Whisper and Seamless M4Tv2 on the FLEURS dataset (with audio files concatenated into 5-minute samples). I used the batching feature by setting `chunk_length_s` to 30 seconds. Surprisingly, this chunking approach led to a 20% higher Word Error Rate (WER) across all languages compared to long-form (sequential) transcription.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Use the default Whisper pipeline on audio files that are several minutes long. When chunk_length_s is enabled (e.g., set to 30 seconds), the transcription quality significantly degrades compared to processing the entire audio sequentially without chunking.
### Expected behavior
A 20% relative increase seems excessively high to me. I'd expect chunking to cause only a slight drop in accuracy, perhaps a few percentage points at most, not a degradation this large.
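For concreteness, WER here means word-level edit distance divided by the reference length. Below is a minimal pure-Python sketch of that metric (assuming simple whitespace tokenization; this is not the evaluation code used in the experiments above):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word Error Rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,        # deletion
                dp[i][j - 1] + 1,        # insertion
                dp[i - 1][j - 1] + cost, # substitution (or match)
            )
    return dp[len(ref)][len(hyp)] / len(ref)

reference = "the quick brown fox jumps"
sequential_wer = wer(reference, "the quick brown fox jump")      # 1 edit / 5 words = 0.2
chunked_wer = wer(reference, "a quick brown fox fox jump")       # 3 edits / 5 words = 0.6
```

Under this metric, a "20% higher WER" can mean either an absolute or a relative gap, so it is worth stating which one when comparing chunked vs. sequential runs.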
| {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38347/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38347/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38346 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38346/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38346/comments | https://api.github.com/repos/huggingface/transformers/issues/38346/events | https://github.com/huggingface/transformers/issues/38346 | 3,088,901,753 | I_kwDOCUB6oc64HOZ5 | 38,346 | Why is return_assistant_tokens_mask and continue_final_message incompatible? | {
"login": "nyxkrage",
"id": 46626618,
"node_id": "MDQ6VXNlcjQ2NjI2NjE4",
"avatar_url": "https://avatars.githubusercontent.com/u/46626618?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nyxkrage",
"html_url": "https://github.com/nyxkrage",
"followers_url": "https://api.github.com/users/nyxkrage/followers",
"following_url": "https://api.github.com/users/nyxkrage/following{/other_user}",
"gists_url": "https://api.github.com/users/nyxkrage/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nyxkrage/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nyxkrage/subscriptions",
"organizations_url": "https://api.github.com/users/nyxkrage/orgs",
"repos_url": "https://api.github.com/users/nyxkrage/repos",
"events_url": "https://api.github.com/users/nyxkrage/events{/privacy}",
"received_events_url": "https://api.github.com/users/nyxkrage/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-24T23:44:13 | 2025-07-02T08:03:11 | 2025-07-02T08:03:11 | NONE | null | null | null | null | I'm currently authoring a new chat template, and while debugging encountered the check for this, however when uncommenting the check, the resulting mask and template both seem to still be correct. So I'm curious as to why or whether this check is needed at all?
I can see it was introduced in [the original PR](https://github.com/huggingface/transformers/pull/33198); however, there doesn't seem to be any justification or explanation for this assertion. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38346/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38346/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38345 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38345/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38345/comments | https://api.github.com/repos/huggingface/transformers/issues/38345/events | https://github.com/huggingface/transformers/pull/38345 | 3,088,834,532 | PR_kwDOCUB6oc6XhICb | 38,345 | Add GLPNImageProcessorFast and related tests | {
"login": "aryanchauhan31",
"id": 176995032,
"node_id": "U_kgDOCoy62A",
"avatar_url": "https://avatars.githubusercontent.com/u/176995032?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aryanchauhan31",
"html_url": "https://github.com/aryanchauhan31",
"followers_url": "https://api.github.com/users/aryanchauhan31/followers",
"following_url": "https://api.github.com/users/aryanchauhan31/following{/other_user}",
"gists_url": "https://api.github.com/users/aryanchauhan31/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aryanchauhan31/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aryanchauhan31/subscriptions",
"organizations_url": "https://api.github.com/users/aryanchauhan31/orgs",
"repos_url": "https://api.github.com/users/aryanchauhan31/repos",
"events_url": "https://api.github.com/users/aryanchauhan31/events{/privacy}",
"received_events_url": "https://api.github.com/users/aryanchauhan31/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-24T22:42:23 | 2025-05-24T23:12:36 | 2025-05-24T23:12:36 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38345",
"html_url": "https://github.com/huggingface/transformers/pull/38345",
"diff_url": "https://github.com/huggingface/transformers/pull/38345.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38345.patch",
"merged_at": null
} | This PR adds a `GLPNImageProcessorFast` for the GLPN model, aligned with Hugging Face's vision processing standards. It includes:
- Fast image processor implementation.
- Updates to `__init__.py` to register the processor.
- Unit tests covering PIL, NumPy, PyTorch input handling.
- Integration with auto image processor mechanism.
All tests in `tests/models/glpn/test_image_processing_glpn.py` pass ✅
| {
"login": "aryanchauhan31",
"id": 176995032,
"node_id": "U_kgDOCoy62A",
"avatar_url": "https://avatars.githubusercontent.com/u/176995032?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aryanchauhan31",
"html_url": "https://github.com/aryanchauhan31",
"followers_url": "https://api.github.com/users/aryanchauhan31/followers",
"following_url": "https://api.github.com/users/aryanchauhan31/following{/other_user}",
"gists_url": "https://api.github.com/users/aryanchauhan31/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aryanchauhan31/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aryanchauhan31/subscriptions",
"organizations_url": "https://api.github.com/users/aryanchauhan31/orgs",
"repos_url": "https://api.github.com/users/aryanchauhan31/repos",
"events_url": "https://api.github.com/users/aryanchauhan31/events{/privacy}",
"received_events_url": "https://api.github.com/users/aryanchauhan31/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38345/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38345/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38344 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38344/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38344/comments | https://api.github.com/repos/huggingface/transformers/issues/38344/events | https://github.com/huggingface/transformers/issues/38344 | 3,088,639,066 | I_kwDOCUB6oc64GORa | 38,344 | [Tests] Testing for ALBERT is quite slow | {
"login": "saqlain2204",
"id": 118016760,
"node_id": "U_kgDOBwjK-A",
"avatar_url": "https://avatars.githubusercontent.com/u/118016760?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saqlain2204",
"html_url": "https://github.com/saqlain2204",
"followers_url": "https://api.github.com/users/saqlain2204/followers",
"following_url": "https://api.github.com/users/saqlain2204/following{/other_user}",
"gists_url": "https://api.github.com/users/saqlain2204/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saqlain2204/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saqlain2204/subscriptions",
"organizations_url": "https://api.github.com/users/saqlain2204/orgs",
"repos_url": "https://api.github.com/users/saqlain2204/repos",
"events_url": "https://api.github.com/users/saqlain2204/events{/privacy}",
"received_events_url": "https://api.github.com/users/saqlain2204/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-24T18:03:20 | 2025-06-04T10:46:42 | 2025-05-30T14:22:33 | CONTRIBUTOR | null | null | null | null | I've noticed that the tests for the ALBERT model are running quite slowly. Would it be possible to reduce the model size to speed up the testing process? I’d be happy to implement this change if needed!
Thanks | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38344/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38344/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38343 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38343/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38343/comments | https://api.github.com/repos/huggingface/transformers/issues/38343/events | https://github.com/huggingface/transformers/pull/38343 | 3,088,328,864 | PR_kwDOCUB6oc6XfiCd | 38,343 | [WIP] Add OneformerFastImageProcessor | {
"login": "Player256",
"id": 92082372,
"node_id": "U_kgDOBX0QxA",
"avatar_url": "https://avatars.githubusercontent.com/u/92082372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Player256",
"html_url": "https://github.com/Player256",
"followers_url": "https://api.github.com/users/Player256/followers",
"following_url": "https://api.github.com/users/Player256/following{/other_user}",
"gists_url": "https://api.github.com/users/Player256/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Player256/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Player256/subscriptions",
"organizations_url": "https://api.github.com/users/Player256/orgs",
"repos_url": "https://api.github.com/users/Player256/repos",
"events_url": "https://api.github.com/users/Player256/events{/privacy}",
"received_events_url": "https://api.github.com/users/Player256/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-24T09:57:54 | 2025-07-23T11:26:02 | 2025-07-22T20:41:39 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38343",
"html_url": "https://github.com/huggingface/transformers/pull/38343",
"diff_url": "https://github.com/huggingface/transformers/pull/38343.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38343.patch",
"merged_at": "2025-07-22T20:41:39"
} | # What does this PR do?
Adds a fast image processor for the OneFormer model.
Fixes # (issue)
#36978
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38343/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38343/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38342 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38342/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38342/comments | https://api.github.com/repos/huggingface/transformers/issues/38342/events | https://github.com/huggingface/transformers/pull/38342 | 3,087,957,867 | PR_kwDOCUB6oc6XeRU8 | 38,342 | Add `GLPNImageProcessorFast` and corresponding tests for GLPN model | {
"login": "aryanchauhan31",
"id": 176995032,
"node_id": "U_kgDOCoy62A",
"avatar_url": "https://avatars.githubusercontent.com/u/176995032?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aryanchauhan31",
"html_url": "https://github.com/aryanchauhan31",
"followers_url": "https://api.github.com/users/aryanchauhan31/followers",
"following_url": "https://api.github.com/users/aryanchauhan31/following{/other_user}",
"gists_url": "https://api.github.com/users/aryanchauhan31/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aryanchauhan31/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aryanchauhan31/subscriptions",
"organizations_url": "https://api.github.com/users/aryanchauhan31/orgs",
"repos_url": "https://api.github.com/users/aryanchauhan31/repos",
"events_url": "https://api.github.com/users/aryanchauhan31/events{/privacy}",
"received_events_url": "https://api.github.com/users/aryanchauhan31/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-24T03:30:36 | 2025-05-24T16:56:00 | 2025-05-24T16:56:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38342",
"html_url": "https://github.com/huggingface/transformers/pull/38342",
"diff_url": "https://github.com/huggingface/transformers/pull/38342.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38342.patch",
"merged_at": null
} | # What does this PR do?
This PR adds a `Fast` version of the `GLPNImageProcessor`, enabling faster image preprocessing for the GLPN model via `BaseImageProcessorFast`. It mirrors the functionality of the original slow processor while offering better performance and compatibility with fast inference pipelines.
## Added
- `GLPNImageProcessorFast` in `src/transformers/models/glpn/image_processing_glpn_fast.py`
- Import routing and registration in:
- `src/transformers/models/glpn/__init__.py`
- `src/transformers/__init__.py`
- New unit tests in `tests/models/glpn/test_image_processing_glpn.py` to verify behavior across:
- PIL
- NumPy
- PyTorch tensor inputs
- 4-channel inputs
## Why is this needed?
This adds support for optimized image processing for GLPN, aligns the model with other vision models in the Transformers library that support fast processors, and improves downstream performance for tasks like monocular depth estimation.
## Tests
- ✅ All image processor tests pass (`test_call_pil`, `test_call_numpy`, `test_call_pytorch`, etc.)
- ✅ Compatibility with `AutoImageProcessor` confirmed.
| {
"login": "aryanchauhan31",
"id": 176995032,
"node_id": "U_kgDOCoy62A",
"avatar_url": "https://avatars.githubusercontent.com/u/176995032?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aryanchauhan31",
"html_url": "https://github.com/aryanchauhan31",
"followers_url": "https://api.github.com/users/aryanchauhan31/followers",
"following_url": "https://api.github.com/users/aryanchauhan31/following{/other_user}",
"gists_url": "https://api.github.com/users/aryanchauhan31/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aryanchauhan31/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aryanchauhan31/subscriptions",
"organizations_url": "https://api.github.com/users/aryanchauhan31/orgs",
"repos_url": "https://api.github.com/users/aryanchauhan31/repos",
"events_url": "https://api.github.com/users/aryanchauhan31/events{/privacy}",
"received_events_url": "https://api.github.com/users/aryanchauhan31/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38342/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38342/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38341 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38341/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38341/comments | https://api.github.com/repos/huggingface/transformers/issues/38341/events | https://github.com/huggingface/transformers/issues/38341 | 3,087,745,789 | I_kwDOCUB6oc64C0L9 | 38,341 | Processors do not pass on `return_tensors` to tokenizers properly. | {
"login": "shuheng-liu",
"id": 22414322,
"node_id": "MDQ6VXNlcjIyNDE0MzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/22414322?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shuheng-liu",
"html_url": "https://github.com/shuheng-liu",
"followers_url": "https://api.github.com/users/shuheng-liu/followers",
"following_url": "https://api.github.com/users/shuheng-liu/following{/other_user}",
"gists_url": "https://api.github.com/users/shuheng-liu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shuheng-liu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shuheng-liu/subscriptions",
"organizations_url": "https://api.github.com/users/shuheng-liu/orgs",
"repos_url": "https://api.github.com/users/shuheng-liu/repos",
"events_url": "https://api.github.com/users/shuheng-liu/events{/privacy}",
"received_events_url": "https://api.github.com/users/shuheng-liu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-24T00:01:46 | 2025-05-27T09:31:57 | 2025-05-27T09:31:57 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.3
- Platform: macOS-15.4.1-arm64-arm-64bit-Mach-O
- Python version: 3.13.0
- Huggingface_hub version: 0.32.0
- Safetensors version: 0.5.3
- Accelerate version: not installed
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.7.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
@ArthurZucker @zucchini-nlp
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
1. Install `pytorch`, `pillow`, and `transformers=4.52.3` using pip.
2. Execute the following script:
```python3
import torch
from transformers import AutoProcessor
processor = AutoProcessor.from_pretrained("google/paligemma-3b-pt-224")
batch_features = processor(
text="<image> What's in this image?",
images=torch.zeros(3, 224, 224),
suffix="Nothing",
return_tensors="pt"
)
```
This yields an `AttributeError` with `transformers==4.52.3`
```
File "/private/tmp/venv/lib/python3.13/site-packages/transformers/models/paligemma/processing_paligemma.py", line 313, in __call__
labels = inputs["input_ids"].masked_fill(inputs["token_type_ids"] == 0, -100)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'list' object has no attribute 'masked_fill'
```
### Expected behavior
The `batch_features` should be created without error.
There seems to be a recent bug in the `__call__` method of many processors, including e.g. `PaliGemmaProcessor`.
This is likely caused by
https://github.com/huggingface/transformers/blob/31f8a0fe8a7e2db1ee30bf32ed5976cd11f3283c/src/transformers/models/paligemma/processing_paligemma.py#L301-L307
which was changed in commit 32eca7197a8d2618417a0d665db38d0af3695a2c
I believe the intention was to call `.get()` instead of `.pop()` on `text_kwargs` on line 301. Calling `.pop()` removes `return_tensors` from `text_kwargs` in place, so the tokenizer is later called without it and returns `inputs["input_ids"]` as a list instead of PyTorch tensors. The `masked_fill` call below then fails on a list.
https://github.com/huggingface/transformers/blob/31f8a0fe8a7e2db1ee30bf32ed5976cd11f3283c/src/transformers/models/paligemma/processing_paligemma.py#L312-L313
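To illustrate the in-place mutation, here is a minimal, self-contained sketch. The `fake_tokenizer` below is a stand-in I made up for illustration, not the real `transformers` tokenizer:

```python
def fake_tokenizer(texts, **kwargs):
    """Stand-in for a tokenizer call: records whether `return_tensors` reached it."""
    return {
        "input_ids": [[1, 2, 3] for _ in texts],
        "got_return_tensors": kwargs.get("return_tensors"),
    }

# Buggy pattern: .pop() removes the key from the shared kwargs dict ...
text_kwargs = {"return_tensors": "pt", "padding": True}
return_tensors = text_kwargs.pop("return_tensors", None)
inputs_after_pop = fake_tokenizer(["hi"], **text_kwargs)
# ... so the tokenizer never sees it and falls back to plain lists
print(inputs_after_pop["got_return_tensors"])  # None

# Safe pattern: .get() reads the value without mutating text_kwargs
text_kwargs = {"return_tensors": "pt", "padding": True}
return_tensors = text_kwargs.get("return_tensors", None)
inputs_after_get = fake_tokenizer(["hi"], **text_kwargs)
print(inputs_after_get["got_return_tensors"])  # pt
```

The dict is shared between the caller and the tokenizer invocation, which is why the single `.pop()` silently changes the tokenizer's output type downstream.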
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38341/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38341/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38340 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38340/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38340/comments | https://api.github.com/repos/huggingface/transformers/issues/38340/events | https://github.com/huggingface/transformers/issues/38340 | 3,087,738,006 | I_kwDOCUB6oc64CySW | 38,340 | Errors using TinyLlama-1.1B-Chat-v1.0 and DirectML | {
"login": "tritium007",
"id": 206151406,
"node_id": "U_kgDODEme7g",
"avatar_url": "https://avatars.githubusercontent.com/u/206151406?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tritium007",
"html_url": "https://github.com/tritium007",
"followers_url": "https://api.github.com/users/tritium007/followers",
"following_url": "https://api.github.com/users/tritium007/following{/other_user}",
"gists_url": "https://api.github.com/users/tritium007/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tritium007/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tritium007/subscriptions",
"organizations_url": "https://api.github.com/users/tritium007/orgs",
"repos_url": "https://api.github.com/users/tritium007/repos",
"events_url": "https://api.github.com/users/tritium007/events{/privacy}",
"received_events_url": "https://api.github.com/users/tritium007/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-23T23:52:32 | 2025-07-15T14:10:57 | 2025-07-15T14:10:44 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.3
- Platform: Linux-5.15.167.4-microsoft-standard-WSL2-x86_64-with-glibc2.39
- Python version: 3.12.9
- Huggingface_hub version: 0.32.0
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.4.1+cu121 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
@SunMarc @zach-huggingface @ivarflakstad
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
1. Use torch_directml
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
import torch_directml

# Definitions inferred from the log output below (not shown in the original snippet):
MODEL_NAME = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
device = torch_directml.device()  # reported as privateuseone:0
used_directml = True
model_dtype = torch.float32

if used_directml:
    print(f"Loading model to CPU first (with dtype={model_dtype}), then moving to DirectML device: {device}")
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_NAME,
        torch_dtype=model_dtype,  # use the determined dtype
    )
    print(f"Model loaded initially (device: {getattr(model, 'device', 'N/A')}). Now moving to {device}...")
    model = model.to(device)
    print(f"Model moved to DirectML device: {device} (model.device: {getattr(model, 'device', 'N/A')})")
```
Final selected device for model loading: privateuseone:0, target model dtype: torch.float32
Loading model to CPU first (with dtype=torch.float32), then moving to DirectML device: privateuseone:0
Detailed error during LLM initialization: TypeError: argument of type 'NoneType' is not iterable
Traceback (most recent call last):
File "/home/afonso/skillspark/llmservice/utils/llm_integration.py", line 167, in initialize_llm
model = AutoModelForCausalLM.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/afonso/miniconda3/envs/directml/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/afonso/miniconda3/envs/directml/lib/python3.12/site-packages/transformers/modeling_utils.py", line 309, in _wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/afonso/miniconda3/envs/directml/lib/python3.12/site-packages/transformers/modeling_utils.py", line 4507, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/afonso/miniconda3/envs/directml/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 618, in __init__
self.model = LlamaModel(config)
^^^^^^^^^^^^^^^^^^
File "/home/afonso/miniconda3/envs/directml/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 379, in __init__
self.post_init()
File "/home/afonso/miniconda3/envs/directml/lib/python3.12/site-packages/transformers/modeling_utils.py", line 1968, in post_init
if v not in ALL_PARALLEL_STYLES:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'NoneType' is not iterable
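For context, the failing membership check can be reproduced in isolation: a `not in` test against `None` raises exactly this error (a minimal illustration only, not the actual transformers code):

```python
ALL_PARALLEL_STYLES = None  # what the post_init check effectively sees here
v = "colwise"

try:
    v not in ALL_PARALLEL_STYLES
except TypeError as e:
    err = str(e)

print(err)  # argument of type 'NoneType' is not iterable
```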
### Expected behavior
Model loads successfully with no errors. | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38340/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38340/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38339 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38339/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38339/comments | https://api.github.com/repos/huggingface/transformers/issues/38339/events | https://github.com/huggingface/transformers/pull/38339 | 3,087,683,467 | PR_kwDOCUB6oc6XdVPN | 38,339 | Bump minimum version of accelerate | {
"login": "antoniomika",
"id": 6239308,
"node_id": "MDQ6VXNlcjYyMzkzMDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/6239308?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/antoniomika",
"html_url": "https://github.com/antoniomika",
"followers_url": "https://api.github.com/users/antoniomika/followers",
"following_url": "https://api.github.com/users/antoniomika/following{/other_user}",
"gists_url": "https://api.github.com/users/antoniomika/gists{/gist_id}",
"starred_url": "https://api.github.com/users/antoniomika/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/antoniomika/subscriptions",
"organizations_url": "https://api.github.com/users/antoniomika/orgs",
"repos_url": "https://api.github.com/users/antoniomika/repos",
"events_url": "https://api.github.com/users/antoniomika/events{/privacy}",
"received_events_url": "https://api.github.com/users/antoniomika/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-23T22:52:07 | 2025-06-25T07:35:51 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38339",
"html_url": "https://github.com/huggingface/transformers/pull/38339",
"diff_url": "https://github.com/huggingface/transformers/pull/38339.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38339.patch",
"merged_at": null
} | # What does this PR do?
The minimum version of accelerate should now be `1.3.0`, as `keep_torch_compile` is passed to `unwrap_model`. If this isn't desirable, we can also update `unwrap_model` to take `**kwargs` to prevent this from occurring.
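As a sketch of the alternative mentioned above, the trainer could probe the installed `unwrap_model` signature and only pass `keep_torch_compile` when it is supported (a hypothetical compatibility shim; the helper name and stand-in functions are illustrative, not the actual transformers or accelerate code):

```python
import inspect

def call_unwrap(unwrap_fn, model, keep_torch_compile=True):
    # Pass keep_torch_compile only if this accelerate version accepts it
    # (the argument was added in accelerate 1.3.0).
    if "keep_torch_compile" in inspect.signature(unwrap_fn).parameters:
        return unwrap_fn(model, keep_torch_compile=keep_torch_compile)
    return unwrap_fn(model)

# Stand-ins for the old (< 1.3.0) and new unwrap_model signatures:
def old_unwrap(model):
    return ("old", model)

def new_unwrap(model, keep_torch_compile=True):
    return ("new", model, keep_torch_compile)

print(call_unwrap(old_unwrap, "m"))         # ('old', 'm')
print(call_unwrap(new_unwrap, "m", False))  # ('new', 'm', False)
```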
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@zach-huggingface most likely due to the change being for the trainer. The change was introduced by #37725, so @Joaquinecc as well.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38339/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38339/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38338 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38338/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38338/comments | https://api.github.com/repos/huggingface/transformers/issues/38338/events | https://github.com/huggingface/transformers/pull/38338 | 3,087,510,463 | PR_kwDOCUB6oc6XcvPM | 38,338 | Support CLIP w/ Registers & Fusion MLP Gate | {
"login": "zer0int",
"id": 132047210,
"node_id": "U_kgDOB97hag",
"avatar_url": "https://avatars.githubusercontent.com/u/132047210?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zer0int",
"html_url": "https://github.com/zer0int",
"followers_url": "https://api.github.com/users/zer0int/followers",
"following_url": "https://api.github.com/users/zer0int/following{/other_user}",
"gists_url": "https://api.github.com/users/zer0int/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zer0int/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zer0int/subscriptions",
"organizations_url": "https://api.github.com/users/zer0int/orgs",
"repos_url": "https://api.github.com/users/zer0int/repos",
"events_url": "https://api.github.com/users/zer0int/events{/privacy}",
"received_events_url": "https://api.github.com/users/zer0int/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-23T20:52:26 | 2025-05-26T14:33:48 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38338",
"html_url": "https://github.com/huggingface/transformers/pull/38338",
"diff_url": "https://github.com/huggingface/transformers/pull/38338.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38338.patch",
"merged_at": null
} | ## Adding new model (CLIP)
### A short description of the model and a link to the paper:
- An Indie-CLIP model - 1 hobby dev, 1 GPU, alas 0 papers (sorry! 🤗)
- Modification and fine-tune of openai/clip (only ViT modified; Text Encoder, Tokenizer: As pre-trained)
- Fusion MLP layers (11) / gates + 4 additional "Register Tokens" in the ViT (ViT-L/14)
- Inspired by paper: [Vision Transformers Need Registers](https://arxiv.org/abs/2309.16588)
- Extremely low modality gap, outperforms original ViT-L/14 on most benchmarks including zero-shot (see HF for all details)
- "Fixed" attention heatmaps (no more "misleading" (XAI) background attention on high-norm 'global information hoarding' outlier patches):

### Link to the implementation if it is open-sourced.
- Geometric Parametrization used during fine-tuning only (undo thereafter) -> 1 GPU, tiny batch size, yet no overfit
- [github.com/zer0int/CLIP-fine-tune-registers-gated](https://github.com/zer0int/CLIP-fine-tune-registers-gated)
### Link to the model weights if they are available.
- [huggingface.co/zer0int/CLIP-Registers-Gated_MLP-ViT-L-14/](https://huggingface.co/zer0int/CLIP-Registers-Gated_MLP-ViT-L-14/)
Thank you for your time & consideration - happy to make changes as needed! | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38338/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38338/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38337 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38337/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38337/comments | https://api.github.com/repos/huggingface/transformers/issues/38337/events | https://github.com/huggingface/transformers/pull/38337 | 3,087,501,318 | PR_kwDOCUB6oc6XctNZ | 38,337 | <spam> | {
"login": "huzaifa1-0",
"id": 133530582,
"node_id": "U_kgDOB_WD1g",
"avatar_url": "https://avatars.githubusercontent.com/u/133530582?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/huzaifa1-0",
"html_url": "https://github.com/huzaifa1-0",
"followers_url": "https://api.github.com/users/huzaifa1-0/followers",
"following_url": "https://api.github.com/users/huzaifa1-0/following{/other_user}",
"gists_url": "https://api.github.com/users/huzaifa1-0/gists{/gist_id}",
"starred_url": "https://api.github.com/users/huzaifa1-0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/huzaifa1-0/subscriptions",
"organizations_url": "https://api.github.com/users/huzaifa1-0/orgs",
"repos_url": "https://api.github.com/users/huzaifa1-0/repos",
"events_url": "https://api.github.com/users/huzaifa1-0/events{/privacy}",
"received_events_url": "https://api.github.com/users/huzaifa1-0/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-23T20:47:01 | 2025-05-27T20:53:31 | 2025-05-26T14:30:33 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38337",
"html_url": "https://github.com/huggingface/transformers/pull/38337",
"diff_url": "https://github.com/huggingface/transformers/pull/38337.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38337.patch",
"merged_at": null
} | # What does this PR do?
<!-- Replace this comment with your description -->
## Changes
- Updated `GitPython` from `<3.1.19` to `>=3.1.40` to address [CVE-XXXX](link-to-security-advisory) (if applicable).
- Improved code comments in `setup.py` for better readability.
- Fixed a typo in the package description.
## Motivation
<!-- Explain why these changes are needed -->
- The previous GitPython version had security vulnerabilities ([#12345](link-to-issue)).
- The `setup.py` file lacked clarity in dependency groupings, causing confusion for contributors.
## Testing
<!-- List steps you took to verify your changes -->
- Ran `make deps_table_update` to regenerate the dependency table.
- Installed the package locally with `pip install -e .[dev]` and confirmed no conflicts.
- Verified CI checks pass (if possible).
## Checklist
- [ ] My code follows the project’s style guidelines.
- [x] I have referenced any related issues (e.g., `Closes #12345`).
- [ ] My changes generate no new warnings.
---
### **Additional Tips**
1. **Link to Issues/CVEs**: If your PR fixes an issue or security vulnerability, reference it with `Closes #12345` or `Addresses CVE-XXXX`.
2. **Use Bullet Points**: Keep the description scannable.
3. **Avoid Jargon**: Assume the reviewer is unfamiliar with your specific changes.
4. **Follow the Project Template**: Hugging Face may have a PR template – fill it out completely.
---
### **Example Final Description**
```markdown
# What does this PR do?
This PR updates outdated dependencies in `setup.py` and improves documentation for better maintainability.
## Changes
- Bumps `GitPython` dependency from `<3.1.19` to `>=3.1.40` to resolve security warnings.
- Adds inline comments explaining the `extras` groups (e.g., `"torch-speech"`).
- Fixes a typo in the package description ("JAX" was misspelled).
## Motivation
- GitPython 3.1.19 had known vulnerabilities (CVE-2023-XXXX).
- New contributors often struggle with dependency groupings in `setup.py` ([#9876](https://github.com/huggingface/transformers/issues/9876)).
## Testing
- Ran `make deps_table_update` successfully.
- Verified installation with `pip install -e .[dev]` on Python 3.10.
## Checklist
- [x] My code follows the project’s style guidelines.
- [x] I have referenced related issues (Closes #9876).
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38337/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38337/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38336 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38336/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38336/comments | https://api.github.com/repos/huggingface/transformers/issues/38336/events | https://github.com/huggingface/transformers/pull/38336 | 3,087,382,607 | PR_kwDOCUB6oc6XcSvz | 38,336 | fix typos | {
"login": "DeVikingMark",
"id": 190900683,
"node_id": "U_kgDOC2Dpyw",
"avatar_url": "https://avatars.githubusercontent.com/u/190900683?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DeVikingMark",
"html_url": "https://github.com/DeVikingMark",
"followers_url": "https://api.github.com/users/DeVikingMark/followers",
"following_url": "https://api.github.com/users/DeVikingMark/following{/other_user}",
"gists_url": "https://api.github.com/users/DeVikingMark/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DeVikingMark/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DeVikingMark/subscriptions",
"organizations_url": "https://api.github.com/users/DeVikingMark/orgs",
"repos_url": "https://api.github.com/users/DeVikingMark/repos",
"events_url": "https://api.github.com/users/DeVikingMark/events{/privacy}",
"received_events_url": "https://api.github.com/users/DeVikingMark/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-23T19:48:18 | 2025-05-26T14:43:16 | 2025-05-26T14:42:37 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38336",
"html_url": "https://github.com/huggingface/transformers/pull/38336",
"diff_url": "https://github.com/huggingface/transformers/pull/38336.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38336.patch",
"merged_at": "2025-05-26T14:42:37"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38336/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38336/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38335 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38335/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38335/comments | https://api.github.com/repos/huggingface/transformers/issues/38335/events | https://github.com/huggingface/transformers/pull/38335 | 3,087,294,529 | PR_kwDOCUB6oc6Xb_Bf | 38,335 | Merge type hints from `microsoft/python-type-stubs` (post dropping support for Python 3.8) | {
"login": "Avasam",
"id": 1350584,
"node_id": "MDQ6VXNlcjEzNTA1ODQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1350584?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Avasam",
"html_url": "https://github.com/Avasam",
"followers_url": "https://api.github.com/users/Avasam/followers",
"following_url": "https://api.github.com/users/Avasam/following{/other_user}",
"gists_url": "https://api.github.com/users/Avasam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Avasam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Avasam/subscriptions",
"organizations_url": "https://api.github.com/users/Avasam/orgs",
"repos_url": "https://api.github.com/users/Avasam/repos",
"events_url": "https://api.github.com/users/Avasam/events{/privacy}",
"received_events_url": "https://api.github.com/users/Avasam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-23T19:10:55 | 2025-05-28T16:39:41 | 2025-05-28T16:21:40 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38335",
"html_url": "https://github.com/huggingface/transformers/pull/38335",
"diff_url": "https://github.com/huggingface/transformers/pull/38335.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38335.patch",
"merged_at": "2025-05-28T16:21:40"
} | # What does this PR do?
Reopening #23093 but simplified now that support for Python 3.8 has been dropped in https://github.com/huggingface/transformers/pull/34314
Merge type definitions from https://github.com/microsoft/python-type-stubs/tree/main/stubs/transformers-stubs so that Pylance can rely directly on the first-party `transformers` package.
I also ran `pyright --pythonversion=3.9` on all changed files to sanity check I'm not writing anything that will obviously break at runtime under Python 3.9
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [N/A] Did you write any new necessary tests? (shouldn't affect runtime, so all existing tests should pass as-is)
## Who can review?
🤷 The provided list doesn't mention typing / type hints
- I guess @Rocketknight1 who opened #16059 and seems to have reviewed other typing-related PRs
- @sgugger who started review on #23093 before I closed to wait on Python 3.9 | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38335/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38335/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38334 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38334/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38334/comments | https://api.github.com/repos/huggingface/transformers/issues/38334/events | https://github.com/huggingface/transformers/pull/38334 | 3,087,103,385 | PR_kwDOCUB6oc6XbUXl | 38,334 | [mllama] Allow `pixel_values` with `inputs_embeds` | {
"login": "dxoigmn",
"id": 1328,
"node_id": "MDQ6VXNlcjEzMjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/1328?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dxoigmn",
"html_url": "https://github.com/dxoigmn",
"followers_url": "https://api.github.com/users/dxoigmn/followers",
"following_url": "https://api.github.com/users/dxoigmn/following{/other_user}",
"gists_url": "https://api.github.com/users/dxoigmn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dxoigmn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dxoigmn/subscriptions",
"organizations_url": "https://api.github.com/users/dxoigmn/orgs",
"repos_url": "https://api.github.com/users/dxoigmn/repos",
"events_url": "https://api.github.com/users/dxoigmn/events{/privacy}",
"received_events_url": "https://api.github.com/users/dxoigmn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-23T17:46:11 | 2025-05-27T16:34:33 | 2025-05-27T16:33:56 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38334",
"html_url": "https://github.com/huggingface/transformers/pull/38334",
"diff_url": "https://github.com/huggingface/transformers/pull/38334.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38334.patch",
"merged_at": "2025-05-27T16:33:56"
} | # What does this PR do?
Allow the specification of both `pixel_values` and `inputs_embeds` at the same time in `MllamaModel.forward`.
Fixes #38326
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@zucchini-nlp | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38334/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38334/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38333 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38333/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38333/comments | https://api.github.com/repos/huggingface/transformers/issues/38333/events | https://github.com/huggingface/transformers/issues/38333 | 3,087,021,373 | I_kwDOCUB6oc64ADU9 | 38,333 | MedGemma worked fine prior to 4.52.3 release but now errors | {
"login": "harpreetsahota204",
"id": 40807278,
"node_id": "MDQ6VXNlcjQwODA3Mjc4",
"avatar_url": "https://avatars.githubusercontent.com/u/40807278?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/harpreetsahota204",
"html_url": "https://github.com/harpreetsahota204",
"followers_url": "https://api.github.com/users/harpreetsahota204/followers",
"following_url": "https://api.github.com/users/harpreetsahota204/following{/other_user}",
"gists_url": "https://api.github.com/users/harpreetsahota204/gists{/gist_id}",
"starred_url": "https://api.github.com/users/harpreetsahota204/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/harpreetsahota204/subscriptions",
"organizations_url": "https://api.github.com/users/harpreetsahota204/orgs",
"repos_url": "https://api.github.com/users/harpreetsahota204/repos",
"events_url": "https://api.github.com/users/harpreetsahota204/events{/privacy}",
"received_events_url": "https://api.github.com/users/harpreetsahota204/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-23T17:17:11 | 2025-06-28T14:02:04 | 2025-06-28T14:02:04 | NONE | null | null | null | null | ### System Info
Hi,
Seeing this error while using MedGemma. It worked yesterday on 4.51.3; I upgraded transformers today and now see this error:
```
Error: Unexpected type in sourceless builder transformers.models.gemma3.configuration_gemma3.Gemma3TextConfig
from user code:
  File "/home/harpreet/miniconda3/envs/fo_develop/lib/python3.11/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 1345, in forward
    outputs = self.model(
  File "/home/harpreet/miniconda3/envs/fo_develop/lib/python3.11/site-packages/transformers/utils/generic.py", line 969, in wrapper
    output = func(self, *args, **kwargs)
  File "/home/harpreet/miniconda3/envs/fo_develop/lib/python3.11/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 1205, in forward
    causal_mask = self._update_causal_mask(
  File "/home/harpreet/miniconda3/envs/fo_develop/lib/python3.11/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 1024, in _update_causal_mask
    if self.config.text_config._attn_implementation == "flash_attention_2":
  File "/home/harpreet/miniconda3/envs/fo_develop/lib/python3.11/site-packages/transformers/configuration_utils.py", line 211, in __getattribute__
    return super().__getattribute__(key)
Set TORCHDYNAMO_VERBOSE=1 for the internal stack trace (please do this especially if you're reporting a bug to PyTorch). For even more developer context, set TORCH_LOGS="+dynamo"
```
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Steps to reproduce:
Install transformers 4.51.3 and follow the MedGemma model card for usage.
Then, upgrade to the latest transformers and observe the error.
### Expected behavior
I'd expect the model to run per the model card | {
"login": "harpreetsahota204",
"id": 40807278,
"node_id": "MDQ6VXNlcjQwODA3Mjc4",
"avatar_url": "https://avatars.githubusercontent.com/u/40807278?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/harpreetsahota204",
"html_url": "https://github.com/harpreetsahota204",
"followers_url": "https://api.github.com/users/harpreetsahota204/followers",
"following_url": "https://api.github.com/users/harpreetsahota204/following{/other_user}",
"gists_url": "https://api.github.com/users/harpreetsahota204/gists{/gist_id}",
"starred_url": "https://api.github.com/users/harpreetsahota204/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/harpreetsahota204/subscriptions",
"organizations_url": "https://api.github.com/users/harpreetsahota204/orgs",
"repos_url": "https://api.github.com/users/harpreetsahota204/repos",
"events_url": "https://api.github.com/users/harpreetsahota204/events{/privacy}",
"received_events_url": "https://api.github.com/users/harpreetsahota204/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38333/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38333/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38332 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38332/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38332/comments | https://api.github.com/repos/huggingface/transformers/issues/38332/events | https://github.com/huggingface/transformers/pull/38332 | 3,086,967,113 | PR_kwDOCUB6oc6Xa1-g | 38,332 | Encoder-Decoder Gemma | {
"login": "bzhangGo",
"id": 17406686,
"node_id": "MDQ6VXNlcjE3NDA2Njg2",
"avatar_url": "https://avatars.githubusercontent.com/u/17406686?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bzhangGo",
"html_url": "https://github.com/bzhangGo",
"followers_url": "https://api.github.com/users/bzhangGo/followers",
"following_url": "https://api.github.com/users/bzhangGo/following{/other_user}",
"gists_url": "https://api.github.com/users/bzhangGo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bzhangGo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bzhangGo/subscriptions",
"organizations_url": "https://api.github.com/users/bzhangGo/orgs",
"repos_url": "https://api.github.com/users/bzhangGo/repos",
"events_url": "https://api.github.com/users/bzhangGo/events{/privacy}",
"received_events_url": "https://api.github.com/users/bzhangGo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-23T16:53:19 | 2025-07-12T15:22:51 | 2025-06-25T09:05:10 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38332",
"html_url": "https://github.com/huggingface/transformers/pull/38332",
"diff_url": "https://github.com/huggingface/transformers/pull/38332.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38332.patch",
"merged_at": "2025-06-25T09:05:10"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Add support for encoder-decoder Gemma (https://arxiv.org/abs/2504.06225)
## Before submitting
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
* Documentation is not updated yet.
- [ ] Did you write any new necessary tests?
* Nope. Instead, I tested the code locally.
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38332/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38332/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38331 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38331/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38331/comments | https://api.github.com/repos/huggingface/transformers/issues/38331/events | https://github.com/huggingface/transformers/issues/38331 | 3,086,954,969 | I_kwDOCUB6oc63_zHZ | 38,331 | AttributeError: 'Qwen2VLConfig' object has no attribute 'hidden_size' | {
"login": "Tcc0403",
"id": 76503978,
"node_id": "MDQ6VXNlcjc2NTAzOTc4",
"avatar_url": "https://avatars.githubusercontent.com/u/76503978?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Tcc0403",
"html_url": "https://github.com/Tcc0403",
"followers_url": "https://api.github.com/users/Tcc0403/followers",
"following_url": "https://api.github.com/users/Tcc0403/following{/other_user}",
"gists_url": "https://api.github.com/users/Tcc0403/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Tcc0403/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Tcc0403/subscriptions",
"organizations_url": "https://api.github.com/users/Tcc0403/orgs",
"repos_url": "https://api.github.com/users/Tcc0403/repos",
"events_url": "https://api.github.com/users/Tcc0403/events{/privacy}",
"received_events_url": "https://api.github.com/users/Tcc0403/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-23T16:48:02 | 2025-05-28T07:32:27 | 2025-05-28T07:32:27 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.53.0.dev0
- Platform: Linux-5.15.167.4-microsoft-standard-WSL2-x86_64-with-glibc2.35
- Python version: 3.13.1
- Huggingface_hub version: 0.32.0
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.6.0+cu124 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: no
- GPU type: NVIDIA GeForce RTX 3080
### Who can help?
@zucchini-nlp
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Running the following code gives an error:
```python
from transformers.models.qwen2_vl import Qwen2VLConfig, Qwen2VLForConditionalGeneration
# Lower model size to speed up initialization
config = Qwen2VLConfig(text_config={"hidden_size": 1024, "intermediate_size": 2048, "num_hidden_layers": 4})
model = Qwen2VLForConditionalGeneration(config)
```
Error message:
```bash
Traceback (most recent call last):
  File "/home/tcc/transformers/qwen2_vl_config_bug.py", line 6, in <module>
    model = Qwen2VLForConditionalGeneration(config)
  File "/home/tcc/transformers/.venv/lib/python3.13/site-packages/transformers/models/qwen2_vl/modeling_qwen2_vl.py", line 1744, in __init__
    self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False)
                             ^^^^^^^^^^^^^^^^^^
  File "/home/tcc/transformers/.venv/lib/python3.13/site-packages/transformers/configuration_utils.py", line 211, in __getattribute__
    return super().__getattribute__(key)
           ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^
AttributeError: 'Qwen2VLConfig' object has no attribute 'hidden_size'
```
### Expected behavior
The current `lm_head` is initialized with `config.vocab_size` and `config.hidden_size`.
https://github.com/huggingface/transformers/blob/1ed19360b1400bd849164e0b9be940e8342af6b1/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py#L1741-L1744
Since `Qwen2VLConfig` is passed to `Qwen2VLForConditionalGeneration`, perhaps we need to read from `config.text_config` instead.
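A minimal, self-contained sketch of the attribute-lookup problem — `TextConfig`/`VLMConfig` here are hypothetical stand-ins for the actual transformers config classes, illustrating only why the top-level lookup fails while the nested one works:

```python
# Hypothetical stand-ins for Qwen2VLTextConfig / Qwen2VLConfig
# (illustration only, not the real transformers classes).
class TextConfig:
    def __init__(self, hidden_size=1024, vocab_size=152064):
        self.hidden_size = hidden_size
        self.vocab_size = vocab_size

class VLMConfig:
    def __init__(self, text_config):
        # hidden_size/vocab_size live on the nested text_config, not on self
        self.text_config = text_config

config = VLMConfig(TextConfig())

try:
    config.hidden_size  # what the current lm_head init effectively does
except AttributeError:
    print("top-level lookup fails")

# reading from the nested text config works
print(config.text_config.hidden_size, config.text_config.vocab_size)
```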
#### More context
#37268 introduced a standardized config for the Qwen VLM family, and `lm_head` was properly initialized with `config.text_config.hidden_size/vocab_size`. However, in the follow-up PR #37033, the initialization of `lm_head` fell back to `config.hidden_size/vocab_size`.
Afaik [Qwen2_VL](https://github.com/huggingface/transformers/pull/37033/files#diff-09bc594f9680f1d042fd485106c68022d77b59831697a00b3b38f12a3e40f395R1734) and [Qwen2_5_VL](https://github.com/huggingface/transformers/pull/37033/files#diff-14a94f3f61380a81d3ca7d0e074db624b948131a1d69c0a2b337b0ec879249edR1861) are the only two models that have this issue. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38331/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38331/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38330 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38330/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38330/comments | https://api.github.com/repos/huggingface/transformers/issues/38330/events | https://github.com/huggingface/transformers/pull/38330 | 3,086,910,867 | PR_kwDOCUB6oc6Xape6 | 38,330 | [chat] use the checkpoint's `generation_config.json` as base parameterization | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-23T16:29:13 | 2025-05-27T10:36:01 | 2025-05-27T10:35:34 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38330",
"html_url": "https://github.com/huggingface/transformers/pull/38330",
"diff_url": "https://github.com/huggingface/transformers/pull/38330.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38330.patch",
"merged_at": "2025-05-27T10:35:34"
} | # What does this PR do?
See title.
The biggest issue it fixes is EOS token handling: we were defaulting generation-time EOS to the tokenizer's EOS (= 1 token), instead of the value set in the `generation_config.json` (>= 1 token).
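A toy sketch of the difference (the token ids below are made up, not tied to any real tokenizer): stopping against only the tokenizer's single EOS misses generations that end on a second EOS id listed in `generation_config.json`.

```python
# Hypothetical ids: the tokenizer exposes one EOS token, while the
# checkpoint's generation_config.json lists two.
tokenizer_eos = 151643
generation_config_eos = {151643, 151645}

def should_stop(last_token_id, eos_ids):
    """Generation stops when the last token is any configured EOS id."""
    return last_token_id in eos_ids

# The second EOS id is missed when only the tokenizer's EOS is used:
print(should_stop(151645, {tokenizer_eos}))        # False -> generation runs on
print(should_stop(151645, generation_config_eos))  # True  -> stops correctly
```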
As an example: [this model has 2 EOS tokens](https://huggingface.co/open-r1/OpenR1-Qwen-7B/blob/main/generation_config.json), but `chat` was ignoring one of them because of our choice of defaults. | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38330/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38330/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38329 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38329/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38329/comments | https://api.github.com/repos/huggingface/transformers/issues/38329/events | https://github.com/huggingface/transformers/issues/38329 | 3,086,846,753 | I_kwDOCUB6oc63_Ysh | 38,329 | Use of torch.get_default_device() of pull requests #37216: Causing Compatibility issues with Torch 2.2.0 | {
"login": "Sefurus",
"id": 77594207,
"node_id": "MDQ6VXNlcjc3NTk0MjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/77594207?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sefurus",
"html_url": "https://github.com/Sefurus",
"followers_url": "https://api.github.com/users/Sefurus/followers",
"following_url": "https://api.github.com/users/Sefurus/following{/other_user}",
"gists_url": "https://api.github.com/users/Sefurus/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Sefurus/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Sefurus/subscriptions",
"organizations_url": "https://api.github.com/users/Sefurus/orgs",
"repos_url": "https://api.github.com/users/Sefurus/repos",
"events_url": "https://api.github.com/users/Sefurus/events{/privacy}",
"received_events_url": "https://api.github.com/users/Sefurus/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-23T16:02:31 | 2025-05-28T10:46:19 | 2025-05-27T09:01:06 | NONE | null | null | null | null | ### System Info
- transformers version: 4.52.3 (but the problem occurs since the 4.51.3-BitNet-preview version)
- Platform: Microsoft Windows 11 Pro
- huggingface-hub version: 0.32.0
- safetensors: 0.5.3
- Python version: 3.9 (cpython-3.9.22-windows-x86_64-none)
- Tensorflow version (GPU?): not installed
- GPU type: NVIDIA QUADRO P1000
Dependencies:
*Installed via uv add*
- pillow==11.2.1
- torch==2.2.0+cu118
- transformers==4.52.3
- numpy==1.24.4
- accelerate==1.7.0
### Who can help?
@amyeroberts
@qubvel
@Cyrilvallez
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Using torch==2.2.0+cu118 with Transformers 4.52.3, run the first [example](https://huggingface.co/docs/transformers/main/en/model_doc/rt_detr_v2#overview) from the RT-DETRv2 page of the Transformers documentation. Actually, just running `transformers env` also gives the error.
```
import torch
import requests
from PIL import Image
from transformers import RTDetrV2ForObjectDetection, RTDetrImageProcessor
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
image_processor = RTDetrImageProcessor.from_pretrained("PekingU/rtdetr_v2_r18vd")
model = RTDetrV2ForObjectDetection.from_pretrained("PekingU/rtdetr_v2_r18vd")
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([(image.height, image.width)]), threshold=0.5)
for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
Give the next issue:
```
Traceback (most recent call last):
  File "e:\playground\prof_3\new_proof\main.py", line 11, in <module>
    model = RTDetrV2ForObjectDetection.from_pretrained("PekingU/rtdetr_v2_r18vd")
  File "E:\playground\prof_3\new_proof\.venv\lib\site-packages\transformers\modeling_utils.py", line 309, in _wrapper
    return func(*args, **kwargs)
  File "E:\playground\prof_3\new_proof\.venv\lib\site-packages\transformers\modeling_utils.py", line 4252, in from_pretrained
    device_in_context = get_torch_context_manager_or_global_device()
  File "E:\playground\prof_3\new_proof\.venv\lib\site-packages\transformers\modeling_utils.py", line 322, in get_torch_context_manager_or_global_device
    default_device = torch.get_default_device()
  File "E:\playground\prof_3\new_proof\.venv\lib\site-packages\torch\__init__.py", line 1932, in __getattr__
    raise AttributeError(f"module '{__name__}' has no attribute '{name}'")
AttributeError: module 'torch' has no attribute 'get_default_device'
```
This worked until version 4.51.3. But since the release of the 4.51.3-XXXX-preview versions and the official 4.52.0,
which include [#37216](https://github.com/huggingface/transformers/pull/37216), this issue is present. I believe that is because of the use of `torch.get_default_device()`: this function only exists since PyTorch 2.3.0, and PyTorch 2.2 doesn't have it (you can look at a similar issue in the official PyTorch GitHub: https://github.com/pytorch/pytorch/issues/126632).
The issue is "fixed" if you change:
`default_device = torch.get_default_device()`
to
`default_device = "cuda" if torch.cuda.is_available() else "cpu"`
I know it's ugly, but it works for this PyTorch version.
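A hedged sketch of a version-tolerant helper (the function name is illustrative, not the actual transformers code); `SimpleNamespace` objects stand in for new/old torch modules so both paths can be exercised without either torch version installed:

```python
from types import SimpleNamespace

def get_default_device_compat(torch_mod):
    # torch >= 2.3 has get_default_device(); prefer it when present
    if hasattr(torch_mod, "get_default_device"):
        return torch_mod.get_default_device()
    # torch < 2.3 fallback: same heuristic as the manual patch above
    return "cuda" if torch_mod.cuda.is_available() else "cpu"

# Stand-ins for a new and an old torch module (illustration only):
new_torch = SimpleNamespace(get_default_device=lambda: "cpu")
old_torch = SimpleNamespace(cuda=SimpleNamespace(is_available=lambda: False))

print(get_default_device_compat(new_torch))  # "cpu" via the real API
print(get_default_device_compat(old_torch))  # "cpu" via the fallback
```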
### Expected behavior
No exception should be thrown when loading the model. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38329/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38329/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38328 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38328/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38328/comments | https://api.github.com/repos/huggingface/transformers/issues/38328/events | https://github.com/huggingface/transformers/pull/38328 | 3,086,845,541 | PR_kwDOCUB6oc6XabAp | 38,328 | Align TP check | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-05-23T16:01:56 | 2025-06-03T10:35:50 | 2025-05-30T15:15:39 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38328",
"html_url": "https://github.com/huggingface/transformers/pull/38328",
"diff_url": "https://github.com/huggingface/transformers/pull/38328.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38328.patch",
"merged_at": "2025-05-30T15:15:39"
} | # What does this PR do?
We need to have the following check, `is_torch_greater_or_equal("2.5") and _torch_distributed_available`, when verifying that the tp_plan is correctly designed; otherwise `ALL_PARALLEL_STYLES` might be empty (e.g. for torch 2.4), and it will raise an error when loading a model like Llama that has a `_tp_plan`.
Saw in accelerate CI https://github.com/huggingface/accelerate/actions/runs/15199790246/job/42795709533?pr=3581
Might need a patch for that, but let's see if this impacts many users.
Partly fixed here https://github.com/huggingface/transformers/pull/38370
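
The gating condition described above can be sketched as follows. This is a minimal illustration, not the actual transformers implementation: `parse_version`, `should_validate_tp_plan`, and its parameters are hypothetical stand-ins for `is_torch_greater_or_equal("2.5")` and the `_torch_distributed_available` flag.

```python
def parse_version(v: str) -> tuple:
    # Turn "2.5" or "2.5.1" into a comparable tuple like (2, 5) or (2, 5, 1).
    return tuple(int(p) for p in v.split("."))

def should_validate_tp_plan(torch_version: str, distributed_available: bool) -> bool:
    # Only validate the tp_plan when ALL_PARALLEL_STYLES would actually be
    # populated: torch >= 2.5 AND torch.distributed available. On older torch
    # (e.g. 2.4) the styles list is empty, so validating would spuriously fail.
    return parse_version(torch_version) >= (2, 5) and distributed_available
```

With this guard, loading a model that carries a `_tp_plan` on torch 2.4 simply skips validation instead of raising.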
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38328/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38328/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38327 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38327/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38327/comments | https://api.github.com/repos/huggingface/transformers/issues/38327/events | https://github.com/huggingface/transformers/pull/38327 | 3,086,793,058 | PR_kwDOCUB6oc6XaPfo | 38,327 | Never fallback to eager implicitly | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-23T15:42:11 | 2025-05-23T17:48:03 | 2025-05-23T17:48:01 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38327",
"html_url": "https://github.com/huggingface/transformers/pull/38327",
"diff_url": "https://github.com/huggingface/transformers/pull/38327.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38327.patch",
"merged_at": "2025-05-23T17:48:01"
} | # What does this PR do?
Follow-up of https://github.com/huggingface/transformers/pull/38288. As we no longer fall back to eager implicitly, this aligns the mask creation to do the same (and removes the argument everywhere, since that was its only purpose). Also updates the warnings in the attention functions.
Also adds some models that slipped through in the previous PR. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38327/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38327/timeline | null | null | null | null | true | true |