url
string
repository_url
string
labels_url
string
comments_url
string
events_url
string
html_url
string
id
int64
node_id
string
number
int64
title
string
user
dict
labels
list
state
string
locked
bool
assignee
dict
assignees
list
milestone
null
comments
list
created_at
timestamp[ms]
updated_at
timestamp[ms]
closed_at
timestamp[ms]
author_association
string
type
dict
active_lock_reason
null
draft
bool
pull_request
dict
body
string
closed_by
dict
reactions
dict
timeline_url
string
performed_via_github_app
null
state_reason
string
sub_issues_summary
dict
issue_dependencies_summary
dict
is_pull_request
bool
is_closed
bool
https://api.github.com/repos/huggingface/transformers/issues/41542
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41542/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41542/comments
https://api.github.com/repos/huggingface/transformers/issues/41542/events
https://github.com/huggingface/transformers/pull/41542
3,509,763,626
PR_kwDOCUB6oc6tcrYw
41,542
Add conditional checks to _check_and_adjust_attn_implementation()
{ "login": "zheliuyu", "id": 190869220, "node_id": "U_kgDOC2Bu5A", "avatar_url": "https://avatars.githubusercontent.com/u/190869220?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zheliuyu", "html_url": "https://github.com/zheliuyu", "followers_url": "https://api.github.com/users/zheliuyu/followers", "following_url": "https://api.github.com/users/zheliuyu/following{/other_user}", "gists_url": "https://api.github.com/users/zheliuyu/gists{/gist_id}", "starred_url": "https://api.github.com/users/zheliuyu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zheliuyu/subscriptions", "organizations_url": "https://api.github.com/users/zheliuyu/orgs", "repos_url": "https://api.github.com/users/zheliuyu/repos", "events_url": "https://api.github.com/users/zheliuyu/events{/privacy}", "received_events_url": "https://api.github.com/users/zheliuyu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-13T11:56:11
2025-10-14T15:53:04
2025-10-14T13:00:08
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41542", "html_url": "https://github.com/huggingface/transformers/pull/41542", "diff_url": "https://github.com/huggingface/transformers/pull/41542.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41542.patch", "merged_at": "2025-10-14T13:00:07" }
# What does this PR do?

<!--
Congratulations! You've made it this far! You're not quite done yet though.

Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.

Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.

Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->

To prevent unnecessary downloads by `kernels`, avoid installing `kernels-community/flash-attn` and `kernels-community/vllm-flash-attn3` when `attn_implementation="flash_attention_2"` is specified on NPU.

## Test

### Script

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen3-0.6B",
    device_map="auto",
    torch_dtype="auto",
    attn_implementation="flash_attention_2",
).eval()
print("Operation successful")
```

### Before

```
`torch_dtype` is deprecated! Use `dtype` instead!
Fetching 0 files: 0it [00:00, ?it/s]
Traceback (most recent call last):
  File "/root/kernels-main/src/kernels/utils.py", line 144, in install_kernel
    return _load_kernel_from_path(repo_path, package_name, variant_locks)
  File "/root/kernels-main/src/kernels/utils.py", line 177, in _load_kernel_from_path
    raise FileNotFoundError(
FileNotFoundError: Kernel at path `/root/.cache/huggingface/hub/models--kernels-community--flash-attn/snapshots/90b3e941627659b28ff001c08b218315e1b7183b` does not have build: torch27-cxx11-cann81-aarch64-linux
```

### After

```
`torch_dtype` is deprecated! Use `dtype` instead!
Operation successful
```
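The conditional check this PR describes can be sketched as a small dispatch helper. The function name and backend set below are illustrative assumptions, not the real transformers internals (the actual logic lives in `_check_and_adjust_attn_implementation()`); the sketch only assumes that kernel-hub flash-attn builds are published for CUDA but not for NPU:

```python
# Hedged sketch of the guard described above; names are hypothetical.
# Assumption: kernels-community flash-attn builds exist only for CUDA,
# while NPU ships its own native flash-attention implementation.
KERNEL_HUB_BACKENDS = {"cuda"}


def adjust_attn_implementation(requested: str, device_type: str) -> str:
    """Resolve the attention implementation without triggering a kernel-hub
    download on backends that have no matching published build."""
    if requested == "flash_attention_2" and device_type not in KERNEL_HUB_BACKENDS:
        # Keep the native flash_attention_2 path; do not fetch
        # kernels-community/flash-attn (no build would match, see the
        # FileNotFoundError in the "Before" log).
        return "flash_attention_2"
    if requested == "flash_attention_2":
        # On CUDA the kernel-hub implementation can be swapped in.
        return "kernels-community/flash-attn"
    return requested
```

With this shape of check, an NPU user requesting `flash_attention_2` never reaches the kernel-hub download path at all.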
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41542/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41542/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41541
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41541/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41541/comments
https://api.github.com/repos/huggingface/transformers/issues/41541/events
https://github.com/huggingface/transformers/pull/41541
3,509,294,372
PR_kwDOCUB6oc6tbDID
41,541
Untangle config inheritance
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-13T09:57:13
2025-10-20T09:30:31
null
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41541", "html_url": "https://github.com/huggingface/transformers/pull/41541", "diff_url": "https://github.com/huggingface/transformers/pull/41541.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41541.patch", "merged_at": null }
# What does this PR do?

As per the title, deletes base config attributes that are not actually universal across models (e.g. token ids are used only by text models, and embedding tying is possible only if the model has embedding layers).

After this PR, we need to clean up generation-related params from config classes (https://github.com/huggingface/transformers/pull/41695), and then we can easily turn config classes into a `dataclass`. Using `dataclass` is one of the requirements for the hf hub type validation I am currently working on.
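The direction described above can be sketched with plain dataclasses. All class and field names here are illustrative, not the actual transformers hierarchy: the point is that text-only attributes (token ids, embedding tying) move into a mixin, leaving a base config lean enough to be a `dataclass`:

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical sketch of an untangled config hierarchy, not transformers code.
@dataclass
class BaseConfig:
    # Only attributes that every model genuinely shares stay here.
    model_type: str = ""


@dataclass
class TextConfigMixin:
    # Attributes that only make sense for text models with embeddings.
    bos_token_id: Optional[int] = None
    eos_token_id: Optional[int] = None
    tie_word_embeddings: bool = True


@dataclass
class TextModelConfig(TextConfigMixin, BaseConfig):
    vocab_size: int = 32000
```

A vision-only config would then inherit from `BaseConfig` alone and never carry token-id fields, which is also what makes type validation of the fields tractable.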
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41541/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41541/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41540
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41540/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41540/comments
https://api.github.com/repos/huggingface/transformers/issues/41540/events
https://github.com/huggingface/transformers/pull/41540
3,509,101,112
PR_kwDOCUB6oc6taZVQ
41,540
Added kernels from kernel hub for Bamba model
{ "login": "romitjain", "id": 11757603, "node_id": "MDQ6VXNlcjExNzU3NjAz", "avatar_url": "https://avatars.githubusercontent.com/u/11757603?v=4", "gravatar_id": "", "url": "https://api.github.com/users/romitjain", "html_url": "https://github.com/romitjain", "followers_url": "https://api.github.com/users/romitjain/followers", "following_url": "https://api.github.com/users/romitjain/following{/other_user}", "gists_url": "https://api.github.com/users/romitjain/gists{/gist_id}", "starred_url": "https://api.github.com/users/romitjain/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/romitjain/subscriptions", "organizations_url": "https://api.github.com/users/romitjain/orgs", "repos_url": "https://api.github.com/users/romitjain/repos", "events_url": "https://api.github.com/users/romitjain/events{/privacy}", "received_events_url": "https://api.github.com/users/romitjain/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-13T08:59:35
2025-10-17T08:27:51
null
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41540", "html_url": "https://github.com/huggingface/transformers/pull/41540", "diff_url": "https://github.com/huggingface/transformers/pull/41540.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41540.patch", "merged_at": null }
# What does this PR do?

Adds support for `mamba_ssm` and `causal_conv1d` kernels from the kernel hub in Bamba models.

Fixes https://github.com/huggingface/transformers/issues/41208

## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?
@vasqu @MekkCyber @drbh <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41540/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41540/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41539
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41539/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41539/comments
https://api.github.com/repos/huggingface/transformers/issues/41539/events
https://github.com/huggingface/transformers/issues/41539
3,509,035,329
I_kwDOCUB6oc7RJ6FB
41,539
All POETRY operations fail on latest version 4.57.0
{ "login": "bfuia", "id": 159527810, "node_id": "U_kgDOCYIzgg", "avatar_url": "https://avatars.githubusercontent.com/u/159527810?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bfuia", "html_url": "https://github.com/bfuia", "followers_url": "https://api.github.com/users/bfuia/followers", "following_url": "https://api.github.com/users/bfuia/following{/other_user}", "gists_url": "https://api.github.com/users/bfuia/gists{/gist_id}", "starred_url": "https://api.github.com/users/bfuia/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bfuia/subscriptions", "organizations_url": "https://api.github.com/users/bfuia/orgs", "repos_url": "https://api.github.com/users/bfuia/repos", "events_url": "https://api.github.com/users/bfuia/events{/privacy}", "received_events_url": "https://api.github.com/users/bfuia/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-13T08:40:49
2025-10-13T14:18:02
2025-10-13T14:18:02
NONE
null
null
null
null
### System Info

I import transformers (always the latest) in my poetry project, using poetry 2.1.2.

After this transformers release (4.57.0) I regenerated the poetry lock with `poetry lock`. When retrying to generate the lock again after other updates, it fails with the message: `Could not parse version constraint: <empty>`

### Who can help?

_No response_

### Information

- [ ] The official example scripts
- [ ] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

A simple search in the poetry.lock file shows that the latest transformers package requires `optax (<empty>)`, which causes the failure because poetry does not know how to parse this kind of version. I am confident this is the problem: with transformers commented out, the lock resolves fine, and with 4.56.2 from September it also works; in that case `optax (<empty>)` does not appear in the lock at all.

### Expected behavior

A developer should be able to use the latest transformers package version with poetry.
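The failure mode can be reproduced outside poetry with any strict PEP 508 requirement parser. This sketch assumes the `packaging` library is available and feeds it the literal `optax (<empty>)` string from the lock file:

```python
from packaging.requirements import InvalidRequirement, Requirement
from packaging.specifiers import InvalidSpecifier


def is_parseable(entry: str) -> bool:
    """Return True if `entry` is a valid PEP 508 dependency specifier.

    A parenthesized version spec must contain a parseable constraint;
    `optax (<empty>)` carries none, which is the same class of failure
    poetry hits when it tries to regenerate the lock.
    """
    try:
        Requirement(entry)
        return True
    except (InvalidRequirement, InvalidSpecifier):
        return False
```

A well-formed entry such as `optax (>=0.1)` parses fine; the empty placeholder does not, so the lock step aborts.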
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41539/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41539/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41538
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41538/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41538/comments
https://api.github.com/repos/huggingface/transformers/issues/41538/events
https://github.com/huggingface/transformers/pull/41538
3,508,811,198
PR_kwDOCUB6oc6tZa6G
41,538
examples/rag from original paper issue
{ "login": "pratap834", "id": 117664342, "node_id": "U_kgDOBwNqVg", "avatar_url": "https://avatars.githubusercontent.com/u/117664342?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pratap834", "html_url": "https://github.com/pratap834", "followers_url": "https://api.github.com/users/pratap834/followers", "following_url": "https://api.github.com/users/pratap834/following{/other_user}", "gists_url": "https://api.github.com/users/pratap834/gists{/gist_id}", "starred_url": "https://api.github.com/users/pratap834/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pratap834/subscriptions", "organizations_url": "https://api.github.com/users/pratap834/orgs", "repos_url": "https://api.github.com/users/pratap834/repos", "events_url": "https://api.github.com/users/pratap834/events{/privacy}", "received_events_url": "https://api.github.com/users/pratap834/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 9258341780, "node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop", "name": "Code agent slop", "color": "C59579", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-10-13T07:27:41
2025-10-13T14:16:21
2025-10-13T14:16:21
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41538", "html_url": "https://github.com/huggingface/transformers/pull/41538", "diff_url": "https://github.com/huggingface/transformers/pull/41538.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41538.patch", "merged_at": null }
# What does this PR do?

Fixes # (issue)

## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41538/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41538/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41537
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41537/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41537/comments
https://api.github.com/repos/huggingface/transformers/issues/41537/events
https://github.com/huggingface/transformers/pull/41537
3,508,788,137
PR_kwDOCUB6oc6tZV0L
41,537
examples/rag from original paper issue
{ "login": "pratap834", "id": 117664342, "node_id": "U_kgDOBwNqVg", "avatar_url": "https://avatars.githubusercontent.com/u/117664342?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pratap834", "html_url": "https://github.com/pratap834", "followers_url": "https://api.github.com/users/pratap834/followers", "following_url": "https://api.github.com/users/pratap834/following{/other_user}", "gists_url": "https://api.github.com/users/pratap834/gists{/gist_id}", "starred_url": "https://api.github.com/users/pratap834/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pratap834/subscriptions", "organizations_url": "https://api.github.com/users/pratap834/orgs", "repos_url": "https://api.github.com/users/pratap834/repos", "events_url": "https://api.github.com/users/pratap834/events{/privacy}", "received_events_url": "https://api.github.com/users/pratap834/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-13T07:21:11
2025-10-13T07:24:45
2025-10-13T07:24:45
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41537", "html_url": "https://github.com/huggingface/transformers/pull/41537", "diff_url": "https://github.com/huggingface/transformers/pull/41537.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41537.patch", "merged_at": null }
# What does this PR do?

Fixes # (issue)

## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
{ "login": "pratap834", "id": 117664342, "node_id": "U_kgDOBwNqVg", "avatar_url": "https://avatars.githubusercontent.com/u/117664342?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pratap834", "html_url": "https://github.com/pratap834", "followers_url": "https://api.github.com/users/pratap834/followers", "following_url": "https://api.github.com/users/pratap834/following{/other_user}", "gists_url": "https://api.github.com/users/pratap834/gists{/gist_id}", "starred_url": "https://api.github.com/users/pratap834/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pratap834/subscriptions", "organizations_url": "https://api.github.com/users/pratap834/orgs", "repos_url": "https://api.github.com/users/pratap834/repos", "events_url": "https://api.github.com/users/pratap834/events{/privacy}", "received_events_url": "https://api.github.com/users/pratap834/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41537/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41537/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41536
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41536/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41536/comments
https://api.github.com/repos/huggingface/transformers/issues/41536/events
https://github.com/huggingface/transformers/pull/41536
3,507,694,564
PR_kwDOCUB6oc6tVttx
41,536
[Qwen3VL] fix device mismatch error for FSDP2 training
{ "login": "HollowMan6", "id": 43995067, "node_id": "MDQ6VXNlcjQzOTk1MDY3", "avatar_url": "https://avatars.githubusercontent.com/u/43995067?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HollowMan6", "html_url": "https://github.com/HollowMan6", "followers_url": "https://api.github.com/users/HollowMan6/followers", "following_url": "https://api.github.com/users/HollowMan6/following{/other_user}", "gists_url": "https://api.github.com/users/HollowMan6/gists{/gist_id}", "starred_url": "https://api.github.com/users/HollowMan6/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HollowMan6/subscriptions", "organizations_url": "https://api.github.com/users/HollowMan6/orgs", "repos_url": "https://api.github.com/users/HollowMan6/repos", "events_url": "https://api.github.com/users/HollowMan6/events{/privacy}", "received_events_url": "https://api.github.com/users/HollowMan6/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-12T18:46:38
2025-10-14T10:29:06
2025-10-14T10:28:25
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41536", "html_url": "https://github.com/huggingface/transformers/pull/41536", "diff_url": "https://github.com/huggingface/transformers/pull/41536.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41536.patch", "merged_at": "2025-10-14T10:28:25" }
# What does this PR do? For FSDP2, parameters might be on a meta device, and the weight.device attribute may not accurately reflect where the actual computation will happen during forward passes. 
```log File "transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 776, in forward pos_embeds = self.fast_pos_embed_interpolate(grid_thw) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 745, in fast_pos_embed_interpolate pos_embeds = self.pos_embed(idx_tensor) * weight_tensor[:, :, None] ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "torch/nn/modules/module.py", line 1773, in _wrapped_call_impl return self._call_impl(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "torch/nn/modules/module.py", line 1879, in _call_impl return inner() ^^^^^^^ File "torch/nn/modules/module.py", line 1827, in inner result = forward_call(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "torch/nn/modules/sparse.py", line 192, in forward return F.embedding( ^^^^^^^^^^^^ File "torch/nn/functional.py", line 2546, in embedding return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ RuntimeError: Expected all tensors to be on the same device, but got index is on cpu, different from other tensors on cuda:0 (when checking argument in method wrapper_CUDA__index_select) ``` https://github.com/volcengine/verl/pull/3686#issuecomment-3380981817 Since the device of `grid_thw` depends on the user-side implementation (it is passed as an argument to the forward method), it is better to use `grid_thw`'s device to unify the devices of `idx_tensor` and `weight_tensor`. This gives user-side code more control and guarantees nothing can go wrong here: the caller can ensure the input grid is on the same device as the positional embedding weight, and since `grid_thw` is small, moving it adds negligible overhead. This was tested on [verl](https://github.com/volcengine/verl) and works fine. 
## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
@yonigozlan @molbap @ArthurZucker @Cyrilvallez @zucchini-nlp
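The device fix described above can be sketched as follows. This is a simplified, hypothetical version of the interpolation (the real `fast_pos_embed_interpolate` computes bilinear indices and weights from the grid); the point it illustrates is only that both the index and weight tensors are created on `grid_thw.device`, so the embedding lookup never mixes devices:

```python
import torch

def interpolate_pos_embeds(pos_embed: torch.nn.Embedding, grid_thw: torch.Tensor) -> torch.Tensor:
    # Build the index and weight tensors on grid_thw's device so the
    # embedding lookup never mixes CPU and CUDA tensors under FSDP2.
    device = grid_thw.device
    idx_tensor = torch.arange(4, device=device).expand(grid_thw.shape[0], 4)
    weight_tensor = torch.full((grid_thw.shape[0], 4), 0.25, device=device)
    return pos_embed(idx_tensor) * weight_tensor[:, :, None]

pos_embed = torch.nn.Embedding(16, 8)
grid_thw = torch.tensor([[1, 2, 2]])  # one video: (t, h, w)
pos_embeds = interpolate_pos_embeds(pos_embed, grid_thw)
```

Because the caller controls where `grid_thw` lives, user-side code can place it next to the positional embedding weight and the lookup stays on one device.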
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41536/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41536/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41535
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41535/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41535/comments
https://api.github.com/repos/huggingface/transformers/issues/41535/events
https://github.com/huggingface/transformers/pull/41535
3,507,692,154
PR_kwDOCUB6oc6tVtX1
41,535
[Qwen3VL] fix: hidden_states in place modification error
{ "login": "HollowMan6", "id": 43995067, "node_id": "MDQ6VXNlcjQzOTk1MDY3", "avatar_url": "https://avatars.githubusercontent.com/u/43995067?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HollowMan6", "html_url": "https://github.com/HollowMan6", "followers_url": "https://api.github.com/users/HollowMan6/followers", "following_url": "https://api.github.com/users/HollowMan6/following{/other_user}", "gists_url": "https://api.github.com/users/HollowMan6/gists{/gist_id}", "starred_url": "https://api.github.com/users/HollowMan6/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HollowMan6/subscriptions", "organizations_url": "https://api.github.com/users/HollowMan6/orgs", "repos_url": "https://api.github.com/users/HollowMan6/repos", "events_url": "https://api.github.com/users/HollowMan6/events{/privacy}", "received_events_url": "https://api.github.com/users/HollowMan6/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-12T18:45:09
2025-10-13T08:53:58
2025-10-13T08:50:14
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41535", "html_url": "https://github.com/huggingface/transformers/pull/41535", "diff_url": "https://github.com/huggingface/transformers/pull/41535.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41535.patch", "merged_at": "2025-10-13T08:50:14" }
# What does this PR do? ``` File "transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 941, in forward hidden_states = self._deepstack_process( ^^^^^^^^^^^^^^^^^^^^^^^^ File "transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 960, in _deepstack_process hidden_states[visual_pos_masks, :] = local_this ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^ RuntimeError: Output 0 of SliceBackward0 is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is forbidden. You can fix this by cloning the output of the custom Function. ``` ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? 
Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @yonigozlan @molbap @ArthurZucker @Cyrilvallez @zucchini-nlp
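The fix the error message itself suggests (cloning before the masked in-place write) can be sketched like this. The helper below is a hypothetical, simplified stand-in for `_deepstack_process`; the tensor names mirror the traceback:

```python
import torch

def deepstack_merge(hidden_states: torch.Tensor,
                    visual_pos_masks: torch.Tensor,
                    local_this: torch.Tensor) -> torch.Tensor:
    # Write into a clone: an in-place masked assignment on a view returned
    # by a custom autograd Function would override its custom backward,
    # which is exactly the RuntimeError reported above.
    hidden_states = hidden_states.clone()
    hidden_states[visual_pos_masks, :] = local_this
    return hidden_states

hidden = torch.randn(4, 8, requires_grad=True) * 2  # stand-in for an upstream view
mask = torch.tensor([True, False, True, False])
visual = torch.zeros(2, 8)
merged = deepstack_merge(hidden, mask, visual)
```

The clone costs one extra copy per layer that uses deepstack, but keeps autograd's view/in-place bookkeeping intact.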
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41535/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41535/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41534
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41534/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41534/comments
https://api.github.com/repos/huggingface/transformers/issues/41534/events
https://github.com/huggingface/transformers/pull/41534
3,507,251,407
PR_kwDOCUB6oc6tUNtM
41,534
Add VideoMAE video processor
{ "login": "Aki-07", "id": 95642646, "node_id": "U_kgDOBbNkFg", "avatar_url": "https://avatars.githubusercontent.com/u/95642646?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aki-07", "html_url": "https://github.com/Aki-07", "followers_url": "https://api.github.com/users/Aki-07/followers", "following_url": "https://api.github.com/users/Aki-07/following{/other_user}", "gists_url": "https://api.github.com/users/Aki-07/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aki-07/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aki-07/subscriptions", "organizations_url": "https://api.github.com/users/Aki-07/orgs", "repos_url": "https://api.github.com/users/Aki-07/repos", "events_url": "https://api.github.com/users/Aki-07/events{/privacy}", "received_events_url": "https://api.github.com/users/Aki-07/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-12T14:05:59
2025-10-13T13:42:28
2025-10-13T13:42:28
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41534", "html_url": "https://github.com/huggingface/transformers/pull/41534", "diff_url": "https://github.com/huggingface/transformers/pull/41534.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41534.patch", "merged_at": "2025-10-13T13:42:27" }
## What does this PR do? - add a dedicated `VideoMAEVideoProcessor` that decodes/samples videos via TorchCodec and emits `pixel_values` ready for VideoMAE models - document the new processor alongside the existing image processor so users can discover the GPU-friendly path - cover the processor with torchvision-gated regression tests to ensure serialization, sampling, and output naming stay stable Fixes #41520 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. (follow-up to the discussion with @zucchini-nlp on video processors) - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests?
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41534/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41534/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41533
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41533/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41533/comments
https://api.github.com/repos/huggingface/transformers/issues/41533/events
https://github.com/huggingface/transformers/issues/41533
3,507,227,594
I_kwDOCUB6oc7RDAvK
41,533
add_special_tokens and resize_token_embeddings result in an error
{ "login": "jialiangZ", "id": 54654343, "node_id": "MDQ6VXNlcjU0NjU0MzQz", "avatar_url": "https://avatars.githubusercontent.com/u/54654343?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jialiangZ", "html_url": "https://github.com/jialiangZ", "followers_url": "https://api.github.com/users/jialiangZ/followers", "following_url": "https://api.github.com/users/jialiangZ/following{/other_user}", "gists_url": "https://api.github.com/users/jialiangZ/gists{/gist_id}", "starred_url": "https://api.github.com/users/jialiangZ/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jialiangZ/subscriptions", "organizations_url": "https://api.github.com/users/jialiangZ/orgs", "repos_url": "https://api.github.com/users/jialiangZ/repos", "events_url": "https://api.github.com/users/jialiangZ/events{/privacy}", "received_events_url": "https://api.github.com/users/jialiangZ/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-12T13:50:40
2025-10-13T14:09:29
2025-10-13T14:08:57
NONE
null
null
null
null
### System Info I want to add a few special tokens to my Qwen2.5VL model as separators, and after executing the following code, I received the following error message. I don't know how to solve this problem. ``` bash [rank1]: Traceback (most recent call last): [rank1]: RuntimeError: shape '[-1, 151936]' is invalid for input of size 329273399 [rank0]: Traceback (most recent call last): [rank0]: RuntimeError: shape '[-1, 151936]' is invalid for input of size 217038339 [rank3]: Traceback (most recent call last): [rank3]: RuntimeError: shape '[-1, 151936]' is invalid for input of size 116936799 [rank2]: Traceback (most recent call last): [rank2]: RuntimeError: shape '[-1, 151936]' is invalid for input of size 215673318 Traceback (most recent call last): File "/home/hk-project-p0022189/tum_yvc3016/miniconda3/envs/qwen2_5-VL/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/errors/init.py", line 355, in wrapper raise ChildFailedError( torch.distributed.elastic.multiprocessing.errors.ChildFailedError: qwenvl/train/train_livecc.py FAILED Failures: <NO_OTHER_FAILURES> Root Cause (first observed failure): error_file: <N/A> traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html ``` ### Who can help? _No response_ ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) 
- [x] My own task or dataset (give details below) ### Reproduction ``` python import os import logging import pathlib import torch import transformers import json from typing import Dict import shutil import sys from pathlib import Path project_root = Path(__file__).parent.parent.parent sys.path.append(str(project_root)) import qwenvl.train.trainer from trainer import replace_qwen2_vl_attention_class from transformers import ( Qwen2VLForConditionalGeneration, ) from model_code.modeling_qwen2_5_vl import Qwen2_5_VLForConditionalGeneration # from qwenvl.data.data_qwen import make_supervised_data_module from qwenvl.data.lmm_dataset_for_batch import make_supervised_data_module from qwenvl.train.argument import ( ModelArguments, DataArguments, TrainingArguments, ) from transformers import AutoTokenizer, AutoProcessor, Qwen2VLImageProcessor, Trainer local_rank = None os.environ["TOKENIZERS_PARALLELISM"] = "false" def rank0_print(*args): if local_rank == 0: print(*args) def add_special_tokens_safely(tokenizer, new_tokens): """ Safely add new special tokens to the tokenizer while preserving the existing additional_special_tokens. Args: tokenizer: Hugging Face tokenizer model: the corresponding language model new_tokens: list of str, the new tokens to add Returns: bool: whether any new tokens were added """ # Collect every token currently in the vocabulary current_vocab = set(tokenizer.get_vocab().keys()) # Keep only the tokens that actually need to be added tokens_to_add = [t for t in new_tokens if t not in current_vocab] if not tokens_to_add: rank0_print("🟢 All specified tokens already exist in the vocabulary; nothing to add.") return False # Fetch the existing additional_special_tokens (e.g. <image>, <ref>) orig_special_tokens = tokenizer.special_tokens_map.get( "additional_special_tokens", [] ) # Merge: keep the existing tokens and append the new ones updated_special_tokens = orig_special_tokens + [ t for t in tokens_to_add if t not in orig_special_tokens ] rank0_print(f"📌 Adding new tokens: {tokens_to_add}") rank0_print(f"🔧 Total additional_special_tokens after update: {len(updated_special_tokens)}") # Use the add_special_tokens API (it deduplicates automatically) num_added = tokenizer.add_special_tokens( {"additional_special_tokens": updated_special_tokens} ) if 
num_added > 0: rank0_print(f"✅ Successfully added {num_added} new tokens to the vocabulary") return num_added > 0 def safe_save_model_for_hf_trainer(trainer: transformers.Trainer, output_dir: str): """Collects the state dict and dump to disk.""" if trainer.deepspeed: torch.cuda.synchronize() trainer.save_model(output_dir) return state_dict = trainer.model.state_dict() if trainer.args.should_save: cpu_state_dict = {key: value.cpu() for key, value in state_dict.items()} del state_dict trainer._save(output_dir, state_dict=cpu_state_dict) # noqa def set_model(model_args, model): if model_args.tune_mm_vision: for n, p in model.visual.named_parameters(): p.requires_grad = True else: for n, p in model.visual.named_parameters(): p.requires_grad = False if model_args.tune_mm_mlp: for n, p in model.visual.merger.named_parameters(): p.requires_grad = True else: for n, p in model.visual.merger.named_parameters(): p.requires_grad = False if model_args.tune_mm_llm: for n, p in model.model.named_parameters(): p.requires_grad = True model.lm_head.requires_grad = True else: for n, p in model.model.named_parameters(): p.requires_grad = False model.lm_head.requires_grad = False def train(attn_implementation="flash_attention_2"): global local_rank parser = transformers.HfArgumentParser( (ModelArguments, DataArguments, TrainingArguments) ) model_args, data_args, training_args = parser.parse_args_into_dataclasses() local_rank = training_args.local_rank os.makedirs(training_args.output_dir, exist_ok=True) model = Qwen2_5_VLForConditionalGeneration.from_pretrained( model_args.model_name_or_path, cache_dir=training_args.cache_dir, attn_implementation=attn_implementation, torch_dtype=(torch.bfloat16 if training_args.bf16 else None), ) processor = AutoProcessor.from_pretrained( model_args.model_name_or_path, ) tokenizer = AutoTokenizer.from_pretrained( model_args.model_name_or_path, cache_dir=training_args.cache_dir, model_max_length=training_args.model_max_length, padding_side="right", use_fast=False, ) 
data_args.image_processor = processor.image_processor data_args.model_type = "qwen2.5vl" if data_args.data_flatten: replace_qwen2_vl_attention_class() model.config.use_cache = False if training_args.gradient_checkpointing: if hasattr(model, "enable_input_require_grads"): model.enable_input_require_grads() else: def make_inputs_require_grad(module, input, output): output.requires_grad_(True) model.get_input_embeddings().register_forward_hook(make_inputs_require_grad) #################### newly added # Custom special tokens to add # Token ID # ----------------------- # <think> 151665 # </think> 151666 # <answer> 151667 # </answer> 151668 NEW_SPECIAL_TOKENS = ["<think>", "</think>", "<answer>", "</answer>"] was_updated = add_special_tokens_safely(tokenizer, NEW_SPECIAL_TOKENS) if was_updated: model.resize_token_embeddings(len(tokenizer)) else: rank0_print("ℹ️ No new tokens detected; continuing...") model.tokenizer = tokenizer # 👈 attach the tokenizer to the model model.randK = data_args.randK # 👈 attach randK to the model model.randF = data_args.randF # 👈 attach randF to the model model.dataset_use = data_args.dataset_use # 👈 attach dataset_use to the model set_model(model_args, model) # if torch.distributed.get_rank() == 0: # model.visual.print_trainable_parameters() # model.model.print_trainable_parameters() processor.tokenizer = tokenizer data_module = make_supervised_data_module(processor=processor, data_args=data_args) trainer = Trainer( model=model, processing_class=tokenizer, args=training_args, **data_module ) if list(pathlib.Path(training_args.output_dir).glob("checkpoint-*")): logging.info("checkpoint found, resume training") trainer.train(resume_from_checkpoint=True) else: trainer.train() trainer.save_state() data_args.image_processor.save_pretrained(training_args.output_dir) source_path = "chat_template.json" template_path = os.path.join(training_args.output_dir, "chat_template.json") shutil.copy2(source_path, template_path) model.config.use_cache = True safe_save_model_for_hf_trainer(trainer=trainer, 
output_dir=training_args.output_dir) if __name__ == "__main__": # train(attn_implementation="flash_attention_2") train() ``` ``` bash #!/bin/bash # Distributed training configuration NPROC_PER_NODE=4 MASTER_ADDR=${MASTER_ADDR:-"127.0.0.1"} MASTER_PORT=${MASTER_PORT:-$(shuf -i 20001-29999 -n 1)} NNODES=${WORLD_SIZE:-1} # Get the directory containing this script (e.g. pe_version) SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" # Go up two levels: Qwen2_5_vl_group/ PY_ROOT="$(dirname "$(dirname "$SCRIPT_DIR")")" # Set PYTHONPATH to point at the subdirectory you want to add export PYTHONPATH="$PY_ROOT:$PYTHONPATH" export VIDEO_MIN_PIXELS=78400 # 100*28*28. the minimum visual frame tokens sent to llm is 100 export FPS_MAX_FRAMES=60 # maximum number of frames for each video (480/60/2 = 4min) export VIDEO_MAX_PIXELS=2408448 # 24576*28*28. the maximum overall video tokens sent to llm is 24k (leave 8k for language) # DeepSpeed configuration deepspeed=./scripts/zero3.json # Model configuration if [ -d "../../../../model/Qwen2.5-VL-3B-Instruct" ]; then llm=../../../../model/Qwen2.5-VL-3B-Instruct # Using local model else llm=Qwen/Qwen2.5-VL-3B-Instruct # Using HuggingFace model ID fi # Training hyperparameters lr=2e-5 batch_size=1 grad_accum_steps=16 # Training entry point entry_file=qwenvl/train/train_livecc.py # Dataset configuration (replace with public dataset names) datasets=../../../../data/VideoEspresso/videoespresso_train_video_duration_le_40s_rewrited_completed.jsonl # Output configuration run_name="qwen2_5vl-batch-CoT-3B" output_dir=./output/qwen2_5vl-batch-CoT-3B-$(date +"%Y%m%d_%H%M") # Training arguments args=" --deepspeed ${deepspeed} \ --model_name_or_path "${llm}" \ --dataset_use ${datasets} \ --data_flatten False \ --tune_mm_vision False \ --tune_mm_mlp True \ --tune_mm_llm True \ --bf16 True \ --output_dir ${output_dir} \ --num_train_epochs 1 \ --per_device_train_batch_size ${batch_size} \ --per_device_eval_batch_size $((batch_size*2)) \ --gradient_accumulation_steps ${grad_accum_steps} \ --max_pixels 50176 \ --min_pixels 
784 \ --eval_strategy "no" \ --save_strategy "steps" \ --save_steps 100 \ --save_total_limit 1 \ --learning_rate ${lr} \ --weight_decay 0 \ --warmup_ratio 0.03 \ --max_grad_norm 1 \ --lr_scheduler_type "cosine" \ --logging_steps 1 \ --model_max_length 8192 \ --gradient_checkpointing True \ --dataloader_num_workers 16 \ --use_liger_kernel True \ --run_name ${run_name} \ --report_to none \ --randK False \ --randF False \ " # Launch training torchrun --nproc_per_node=${NPROC_PER_NODE} \ --master_addr=${MASTER_ADDR} \ --master_port=${MASTER_PORT} \ ${entry_file} ${args} ``` ### Expected behavior How to add tokens correctly
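For reference, here is a minimal sketch of what `resize_token_embeddings` does conceptually, using a plain `nn.Embedding` (no DeepSpeed): existing rows are kept and new rows are initialized for the added tokens. Note this is a hedged illustration, not the Transformers implementation; under DeepSpeed ZeRO-3 the embedding weights are sharded, and the `'[-1, 151936]'` error suggests some part of the setup (e.g. a tied `lm_head` or a fused loss kernel) still assumes the original vocabulary size of 151936 after the resize:

```python
import torch

def resize_embeddings(embed: torch.nn.Embedding, new_num_tokens: int) -> torch.nn.Embedding:
    # Conceptual equivalent of model.resize_token_embeddings():
    # copy the old rows, leave the extra rows randomly initialized.
    old_num, dim = embed.weight.shape
    resized = torch.nn.Embedding(new_num_tokens, dim)
    with torch.no_grad():
        resized.weight[:old_num] = embed.weight
    return resized

emb = torch.nn.Embedding(151936, 16)          # original Qwen2.5-VL vocab size
emb2 = resize_embeddings(emb, 151936 + 4)     # four added tokens: <think>, </think>, <answer>, </answer>
```

When the input and output embeddings are tied, both must reflect the new size, and with ZeRO-3 the resize has to run where parameters can be gathered, otherwise downstream reshapes against the old vocab size fail as in the traceback above.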
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41533/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41533/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41532
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41532/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41532/comments
https://api.github.com/repos/huggingface/transformers/issues/41532/events
https://github.com/huggingface/transformers/issues/41532
3,507,181,077
I_kwDOCUB6oc7RC1YV
41,532
where is examples/rag from original paper?
{ "login": "IgorKasianenko", "id": 17688220, "node_id": "MDQ6VXNlcjE3Njg4MjIw", "avatar_url": "https://avatars.githubusercontent.com/u/17688220?v=4", "gravatar_id": "", "url": "https://api.github.com/users/IgorKasianenko", "html_url": "https://github.com/IgorKasianenko", "followers_url": "https://api.github.com/users/IgorKasianenko/followers", "following_url": "https://api.github.com/users/IgorKasianenko/following{/other_user}", "gists_url": "https://api.github.com/users/IgorKasianenko/gists{/gist_id}", "starred_url": "https://api.github.com/users/IgorKasianenko/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/IgorKasianenko/subscriptions", "organizations_url": "https://api.github.com/users/IgorKasianenko/orgs", "repos_url": "https://api.github.com/users/IgorKasianenko/repos", "events_url": "https://api.github.com/users/IgorKasianenko/events{/privacy}", "received_events_url": "https://api.github.com/users/IgorKasianenko/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-12T13:17:53
2025-10-17T09:34:15
2025-10-17T09:34:15
NONE
null
null
null
null
### System Info https://arxiv.org/pdf/2005.11401 mentions https://github.com/huggingface/transformers/blob/main/examples/rag but it is not there. Add redirect if possible ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction Go to https://github.com/huggingface/transformers/blob/main/examples/rag ### Expected behavior some example instead of 404
{ "login": "IgorKasianenko", "id": 17688220, "node_id": "MDQ6VXNlcjE3Njg4MjIw", "avatar_url": "https://avatars.githubusercontent.com/u/17688220?v=4", "gravatar_id": "", "url": "https://api.github.com/users/IgorKasianenko", "html_url": "https://github.com/IgorKasianenko", "followers_url": "https://api.github.com/users/IgorKasianenko/followers", "following_url": "https://api.github.com/users/IgorKasianenko/following{/other_user}", "gists_url": "https://api.github.com/users/IgorKasianenko/gists{/gist_id}", "starred_url": "https://api.github.com/users/IgorKasianenko/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/IgorKasianenko/subscriptions", "organizations_url": "https://api.github.com/users/IgorKasianenko/orgs", "repos_url": "https://api.github.com/users/IgorKasianenko/repos", "events_url": "https://api.github.com/users/IgorKasianenko/events{/privacy}", "received_events_url": "https://api.github.com/users/IgorKasianenko/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41532/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41532/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41531
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41531/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41531/comments
https://api.github.com/repos/huggingface/transformers/issues/41531/events
https://github.com/huggingface/transformers/pull/41531
3,506,767,284
PR_kwDOCUB6oc6tSnbr
41,531
🌐 [i18n-KO] Translated `video_processor.md` to Korean
{ "login": "chelsseeey", "id": 152389483, "node_id": "U_kgDOCRVHaw", "avatar_url": "https://avatars.githubusercontent.com/u/152389483?v=4", "gravatar_id": "", "url": "https://api.github.com/users/chelsseeey", "html_url": "https://github.com/chelsseeey", "followers_url": "https://api.github.com/users/chelsseeey/followers", "following_url": "https://api.github.com/users/chelsseeey/following{/other_user}", "gists_url": "https://api.github.com/users/chelsseeey/gists{/gist_id}", "starred_url": "https://api.github.com/users/chelsseeey/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chelsseeey/subscriptions", "organizations_url": "https://api.github.com/users/chelsseeey/orgs", "repos_url": "https://api.github.com/users/chelsseeey/repos", "events_url": "https://api.github.com/users/chelsseeey/events{/privacy}", "received_events_url": "https://api.github.com/users/chelsseeey/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-12T07:18:04
2025-10-29T09:17:55
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41531", "html_url": "https://github.com/huggingface/transformers/pull/41531", "diff_url": "https://github.com/huggingface/transformers/pull/41531.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41531.patch", "merged_at": null }
# What does this PR do? Translated the `video_processor.md` file of the documentation to Korean. Thank you in advance for your review. Part of https://github.com/huggingface/transformers/issues/20179 ## Before reviewing - [X] Check for missing / redundant translations (번역 누락/중복 검사) - [X] Grammar Check (맞춤법 검사) - [X] Review or Add new terms to glossary (용어 확인 및 추가) - [X] Check Inline TOC (e.g. `[[lowercased-header]]`) - [X] Check live-preview for gotchas (live-preview로 정상작동 확인) ## Who can review? (Initial) <!-- 1. 위 체크가 모두 완료된 뒤에만 KREW 팀원들에게 리뷰를 요청하는 아래 주석을 노출해주세요!--> May you please review this PR? @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @maximizemaxwell, @D15M4S ## Before submitting - [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests), Pull Request section? - [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [X] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [X] Did you write any new necessary tests? ## Who can review? (Final) <!-- 2. KREW 팀원들의 리뷰가 끝난 후에 아래 주석을 노출해주세요! --> <!---transformers, course는 @stevhliu, agent-course는 @sergiopanieg smol-agents는 @albertvillanova입니다!---> @stevhliu May you please review this PR?
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41531/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41531/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41530
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41530/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41530/comments
https://api.github.com/repos/huggingface/transformers/issues/41530/events
https://github.com/huggingface/transformers/pull/41530
3,506,748,189
PR_kwDOCUB6oc6tSjmw
41,530
Fix: Correct loss normalization in training_step for multi-GPU training
{ "login": "KaparthyReddy", "id": 166050493, "node_id": "U_kgDOCeW6vQ", "avatar_url": "https://avatars.githubusercontent.com/u/166050493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaparthyReddy", "html_url": "https://github.com/KaparthyReddy", "followers_url": "https://api.github.com/users/KaparthyReddy/followers", "following_url": "https://api.github.com/users/KaparthyReddy/following{/other_user}", "gists_url": "https://api.github.com/users/KaparthyReddy/gists{/gist_id}", "starred_url": "https://api.github.com/users/KaparthyReddy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KaparthyReddy/subscriptions", "organizations_url": "https://api.github.com/users/KaparthyReddy/orgs", "repos_url": "https://api.github.com/users/KaparthyReddy/repos", "events_url": "https://api.github.com/users/KaparthyReddy/events{/privacy}", "received_events_url": "https://api.github.com/users/KaparthyReddy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-12T06:51:55
2025-10-13T14:02:31
2025-10-13T14:02:30
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41530", "html_url": "https://github.com/huggingface/transformers/pull/41530", "diff_url": "https://github.com/huggingface/transformers/pull/41530.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41530.patch", "merged_at": null }
# Fix: Correct loss normalization in training_step for multi-GPU training ## Description Fixes #37474 This PR corrects the loss aggregation logic in `Trainer.training_step` when training with multiple GPUs. ## Problem When `num_items_in_batch` is provided (e.g., for token-level loss normalization), each device computes loss as: ```python per_device_loss = sum_of_losses / total_items_across_all_devices
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41530/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41530/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41529
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41529/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41529/comments
https://api.github.com/repos/huggingface/transformers/issues/41529/events
https://github.com/huggingface/transformers/pull/41529
3,506,734,244
PR_kwDOCUB6oc6tShCa
41,529
Add num_logits_to_keep to GPT-2 and improve Llama implementation
{ "login": "KaparthyReddy", "id": 166050493, "node_id": "U_kgDOCeW6vQ", "avatar_url": "https://avatars.githubusercontent.com/u/166050493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaparthyReddy", "html_url": "https://github.com/KaparthyReddy", "followers_url": "https://api.github.com/users/KaparthyReddy/followers", "following_url": "https://api.github.com/users/KaparthyReddy/following{/other_user}", "gists_url": "https://api.github.com/users/KaparthyReddy/gists{/gist_id}", "starred_url": "https://api.github.com/users/KaparthyReddy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KaparthyReddy/subscriptions", "organizations_url": "https://api.github.com/users/KaparthyReddy/orgs", "repos_url": "https://api.github.com/users/KaparthyReddy/repos", "events_url": "https://api.github.com/users/KaparthyReddy/events{/privacy}", "received_events_url": "https://api.github.com/users/KaparthyReddy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-12T06:31:47
2025-10-13T13:47:44
2025-10-13T13:47:44
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41529", "html_url": "https://github.com/huggingface/transformers/pull/41529", "diff_url": "https://github.com/huggingface/transformers/pull/41529.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41529.patch", "merged_at": null }
# Add num_logits_to_keep to GPT-2 and improve Llama implementation ## Description Adds `num_logits_to_keep` parameter to GPT-2 for memory-efficient generation and simplifies the implementation in Llama. ## Changes ### GPT-2 - ✅ Added `num_logits_to_keep` parameter to `GPT2LMHeadModel.forward()` - ✅ Only computes logits for last N tokens when specified - ✅ Default behavior unchanged (computes all logits when `num_logits_to_keep=0`) - ✅ Added parameter documentation ### Llama - ✅ Simplified `num_logits_to_keep` implementation - ✅ Removed redundant slicing operation - ✅ Fixed edge case where `logits_to_keep=0` would incorrectly slice all tokens - ✅ Cleaner, more maintainable code ## Motivation During generation, models typically compute logits for ALL tokens in the sequence. However, for next-token prediction, we only need the last token's logits. This parameter allows: - 🚀 **Memory efficiency**: Reduce memory usage during generation - ⚡ **Speed**: Less computation needed - 🎯 **Flexibility**: Users can choose how many logits to keep ## Usage Example ```python from transformers import GPT2LMHeadModel, GPT2Tokenizer model = GPT2LMHeadModel.from_pretrained("gpt2") tokenizer = GPT2Tokenizer.from_pretrained("gpt2") inputs = tokenizer("Hello, I'm a language model", return_tensors="pt") # Default: compute all logits outputs = model(**inputs) print(outputs.logits.shape) # [1, sequence_length, vocab_size] # Memory efficient: only compute last token's logits outputs = model(**inputs, num_logits_to_keep=1) print(outputs.logits.shape) # [1, 1, vocab_size]
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41529/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41529/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41528
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41528/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41528/comments
https://api.github.com/repos/huggingface/transformers/issues/41528/events
https://github.com/huggingface/transformers/pull/41528
3,506,719,764
PR_kwDOCUB6oc6tSeX6
41,528
Add position encoding interpolation to DeiT
{ "login": "KaparthyReddy", "id": 166050493, "node_id": "U_kgDOCeW6vQ", "avatar_url": "https://avatars.githubusercontent.com/u/166050493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaparthyReddy", "html_url": "https://github.com/KaparthyReddy", "followers_url": "https://api.github.com/users/KaparthyReddy/followers", "following_url": "https://api.github.com/users/KaparthyReddy/following{/other_user}", "gists_url": "https://api.github.com/users/KaparthyReddy/gists{/gist_id}", "starred_url": "https://api.github.com/users/KaparthyReddy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KaparthyReddy/subscriptions", "organizations_url": "https://api.github.com/users/KaparthyReddy/orgs", "repos_url": "https://api.github.com/users/KaparthyReddy/repos", "events_url": "https://api.github.com/users/KaparthyReddy/events{/privacy}", "received_events_url": "https://api.github.com/users/KaparthyReddy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-12T06:08:48
2025-10-13T13:54:12
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41528", "html_url": "https://github.com/huggingface/transformers/pull/41528", "diff_url": "https://github.com/huggingface/transformers/pull/41528.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41528.patch", "merged_at": null }
# Add position encoding interpolation to DeiT ## Description Adds position encoding interpolation to DeiT models, enabling the use of pretrained checkpoints on images with different resolutions than those used during training. Contributes to #30579 ## Changes - ✅ Added `interpolate_pos_encoding` parameter to `DeiTModel`, `DeiTForImageClassification`, `DeiTForImageClassificationWithTeacher`, and `DeiTForMaskedImageModeling` - ✅ Implemented `interpolate_pos_encoding()` method in `DeiTEmbeddings` class - ✅ Properly handles both CLS and distillation tokens (unique to DeiT) - ✅ Added comprehensive test `test_model_with_different_image_size` - ✅ All tests passing ## Implementation Details DeiT has two special tokens (CLS + distillation), unlike ViT which only has one (CLS). The interpolation method: 1. Separates the CLS/distillation token embeddings from patch embeddings 2. Interpolates patch embeddings to match new image size 3. Recombines CLS/distillation tokens with interpolated patches ## Testing ```python # Example usage model = DeiTModel.from_pretrained("facebook/deit-base-distilled-patch16-224") # Use with 480x480 images instead of 224x224 outputs = model(large_images, interpolate_pos_encoding=True)
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41528/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41528/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41527
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41527/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41527/comments
https://api.github.com/repos/huggingface/transformers/issues/41527/events
https://github.com/huggingface/transformers/pull/41527
3,506,624,178
PR_kwDOCUB6oc6tSMu4
41,527
🌐 [i18n-KO] Translated selecting.md to Korean
{ "login": "maximizemaxwell", "id": 138701551, "node_id": "U_kgDOCERq7w", "avatar_url": "https://avatars.githubusercontent.com/u/138701551?v=4", "gravatar_id": "", "url": "https://api.github.com/users/maximizemaxwell", "html_url": "https://github.com/maximizemaxwell", "followers_url": "https://api.github.com/users/maximizemaxwell/followers", "following_url": "https://api.github.com/users/maximizemaxwell/following{/other_user}", "gists_url": "https://api.github.com/users/maximizemaxwell/gists{/gist_id}", "starred_url": "https://api.github.com/users/maximizemaxwell/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/maximizemaxwell/subscriptions", "organizations_url": "https://api.github.com/users/maximizemaxwell/orgs", "repos_url": "https://api.github.com/users/maximizemaxwell/repos", "events_url": "https://api.github.com/users/maximizemaxwell/events{/privacy}", "received_events_url": "https://api.github.com/users/maximizemaxwell/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-12T03:33:30
2025-10-22T23:52:25
null
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41527", "html_url": "https://github.com/huggingface/transformers/pull/41527", "diff_url": "https://github.com/huggingface/transformers/pull/41527.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41527.patch", "merged_at": null }
# What does this PR do? Translated the `selecting.md` file of the documentation to Korean. Thank you in advance for your review. Part of https://github.com/huggingface/transformers/issues/20179 ## Before reviewing - [X] Check for missing / redundant translations (번역 누락/중복 검사) - [X] Grammar Check (맞춤법 검사) - [X] Review or Add new terms to glossary (용어 확인 및 추가) - [X] Check Inline TOC (e.g. `[[lowercased-header]]`) - [X] Check live-preview for gotchas (live-preview로 정상작동 확인) ## Who can review? (Initial) <!-- 1. 위 체크가 모두 완료된 뒤에만 KREW 팀원들에게 리뷰를 요청하는 아래 주석을 노출해주세요!--> May you please review this PR? @4N3MONE, @yijun-lee, @jungnerd , @harheem ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? (Final) <!-- 2. KREW 팀원들의 리뷰가 끝난 후에 아래 주석을 노출해주세요! --> <!-- @stevhliu May you please review this PR? -->
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41527/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41527/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41526
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41526/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41526/comments
https://api.github.com/repos/huggingface/transformers/issues/41526/events
https://github.com/huggingface/transformers/issues/41526
3,506,175,434
I_kwDOCUB6oc7Q-_3K
41,526
Eager Attention Works with OneVision 0.5B but Not with 7B
{ "login": "varungupta31", "id": 51288316, "node_id": "MDQ6VXNlcjUxMjg4MzE2", "avatar_url": "https://avatars.githubusercontent.com/u/51288316?v=4", "gravatar_id": "", "url": "https://api.github.com/users/varungupta31", "html_url": "https://github.com/varungupta31", "followers_url": "https://api.github.com/users/varungupta31/followers", "following_url": "https://api.github.com/users/varungupta31/following{/other_user}", "gists_url": "https://api.github.com/users/varungupta31/gists{/gist_id}", "starred_url": "https://api.github.com/users/varungupta31/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/varungupta31/subscriptions", "organizations_url": "https://api.github.com/users/varungupta31/orgs", "repos_url": "https://api.github.com/users/varungupta31/repos", "events_url": "https://api.github.com/users/varungupta31/events{/privacy}", "received_events_url": "https://api.github.com/users/varungupta31/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-11T17:45:23
2025-10-13T07:13:16
2025-10-13T07:13:16
NONE
null
null
null
null
### System Info I am trying to extract attention weights from the model and thus need to use `eager` implementation. 7B fails gives garbage output, ```python import requests from PIL import Image import torch from transformers import AutoProcessor, LlavaOnevisionForConditionalGeneration model_id = "llava-hf/llava-onevision-qwen2-7b-ov-hf" model = LlavaOnevisionForConditionalGeneration.from_pretrained( model_id, torch_dtype=torch.float16, low_cpu_mem_usage=True, attn_implementation='eager', device_map='auto' ) processor = AutoProcessor.from_pretrained(model_id) # Define a chat history and use `apply_chat_template` to get correctly formatted prompt # Each value in "content" has to be a list of dicts with types ("text", "image") conversation = [ { "role": "user", "content": [ {"type": "text", "text": "What are these?"}, {"type": "image"}, ], }, ] prompt = processor.apply_chat_template(conversation, add_generation_prompt=True) image_file = "http://images.cocodataset.org/val2017/000000039769.jpg" raw_image = Image.open(requests.get(image_file, stream=True).raw) inputs = processor(images=raw_image, text=prompt, return_tensors='pt').to(model.device, torch.float16) output = model.generate(**inputs, max_new_tokens=200, do_sample=False) print(processor.decode(output[0][2:], skip_special_tokens=True)) ``` Output ```bash What are these?assistant ! ``` Changing, model from, ``` model_id = "llava-hf/llava-onevision-qwen2-7b-ov-hf" ``` to ``` model_id = "llava-hf/llava-onevision-qwen2-0.5b-ov-hf" ``` Works, ```bash What are these?assistant The image shows two cats, one larger and one smaller, lying on a pink blanket. The larger cat is a tabby with a striped coat, while the smaller cat is a calico with a mix of white, black, and brown fur. They are both resting comfortably on a pink blanket, with the larger cat on the left and the smaller cat on the right. ``` I am actually interested in doing attention extraction - which seems tricky with `sdpa` and flash attention, right? 
Can you please help me fix this? - why is this happening and if this is too problematic to fix, how do I get attention weights at each generation step? Thanks! @zucchini-nlp likely has an answer on this, I would really appreciate your time and inputs, thank you. ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction Run the code above - model _works_ but gives garbage outputs, as shown. Changing model to 0.5B variant with everything else constant, works as expected. ### Expected behavior Consistent outputs (non-garbage) when running Onevision 7B with eager attention.
{ "login": "varungupta31", "id": 51288316, "node_id": "MDQ6VXNlcjUxMjg4MzE2", "avatar_url": "https://avatars.githubusercontent.com/u/51288316?v=4", "gravatar_id": "", "url": "https://api.github.com/users/varungupta31", "html_url": "https://github.com/varungupta31", "followers_url": "https://api.github.com/users/varungupta31/followers", "following_url": "https://api.github.com/users/varungupta31/following{/other_user}", "gists_url": "https://api.github.com/users/varungupta31/gists{/gist_id}", "starred_url": "https://api.github.com/users/varungupta31/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/varungupta31/subscriptions", "organizations_url": "https://api.github.com/users/varungupta31/orgs", "repos_url": "https://api.github.com/users/varungupta31/repos", "events_url": "https://api.github.com/users/varungupta31/events{/privacy}", "received_events_url": "https://api.github.com/users/varungupta31/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41526/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41526/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41525
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41525/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41525/comments
https://api.github.com/repos/huggingface/transformers/issues/41525/events
https://github.com/huggingface/transformers/pull/41525
3,506,137,270
PR_kwDOCUB6oc6tQvkf
41,525
Fixed Type-hints in function definitions
{ "login": "Sai-Suraj-27", "id": 87087741, "node_id": "MDQ6VXNlcjg3MDg3NzQx", "avatar_url": "https://avatars.githubusercontent.com/u/87087741?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Sai-Suraj-27", "html_url": "https://github.com/Sai-Suraj-27", "followers_url": "https://api.github.com/users/Sai-Suraj-27/followers", "following_url": "https://api.github.com/users/Sai-Suraj-27/following{/other_user}", "gists_url": "https://api.github.com/users/Sai-Suraj-27/gists{/gist_id}", "starred_url": "https://api.github.com/users/Sai-Suraj-27/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Sai-Suraj-27/subscriptions", "organizations_url": "https://api.github.com/users/Sai-Suraj-27/orgs", "repos_url": "https://api.github.com/users/Sai-Suraj-27/repos", "events_url": "https://api.github.com/users/Sai-Suraj-27/events{/privacy}", "received_events_url": "https://api.github.com/users/Sai-Suraj-27/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-11T17:10:42
2025-10-13T09:48:38
2025-10-13T09:48:37
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41525", "html_url": "https://github.com/huggingface/transformers/pull/41525", "diff_url": "https://github.com/huggingface/transformers/pull/41525.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41525.patch", "merged_at": "2025-10-13T09:48:37" }
# What does this PR do? Although [RUF013](https://docs.astral.sh/ruff/rules/implicit-optional/#implicit-optional-ruf013) rule is [enabled](https://github.com/huggingface/transformers/blob/3927ffed31e3c0d2929bf98bd05b7c61fcc48b62/pyproject.toml#L35), it doesn't catch when we are using custom types ([known issue](https://github.com/astral-sh/ruff/issues/14018?utm_source=chatgpt.com)). Used [Pyrefly](https://github.com/facebook/pyrefly) to find these. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @zucchini-nlp @molbap
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41525/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41525/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41524
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41524/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41524/comments
https://api.github.com/repos/huggingface/transformers/issues/41524/events
https://github.com/huggingface/transformers/pull/41524
3,506,049,723
PR_kwDOCUB6oc6tQdOg
41,524
Add max_eval_batches argument to TrainingArguments
{ "login": "KaparthyReddy", "id": 166050493, "node_id": "U_kgDOCeW6vQ", "avatar_url": "https://avatars.githubusercontent.com/u/166050493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaparthyReddy", "html_url": "https://github.com/KaparthyReddy", "followers_url": "https://api.github.com/users/KaparthyReddy/followers", "following_url": "https://api.github.com/users/KaparthyReddy/following{/other_user}", "gists_url": "https://api.github.com/users/KaparthyReddy/gists{/gist_id}", "starred_url": "https://api.github.com/users/KaparthyReddy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KaparthyReddy/subscriptions", "organizations_url": "https://api.github.com/users/KaparthyReddy/orgs", "repos_url": "https://api.github.com/users/KaparthyReddy/repos", "events_url": "https://api.github.com/users/KaparthyReddy/events{/privacy}", "received_events_url": "https://api.github.com/users/KaparthyReddy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-11T16:08:35
2025-10-13T13:56:14
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41524", "html_url": "https://github.com/huggingface/transformers/pull/41524", "diff_url": "https://github.com/huggingface/transformers/pull/41524.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41524.patch", "merged_at": null }
# Add max_eval_batches argument to TrainingArguments

## Description
Adds a `max_eval_batches` parameter to `TrainingArguments` that allows users to limit the number of batches used during evaluation.

Fixes #31561

## Motivation
When working with large evaluation datasets, running evaluation on the entire dataset can be very slow. During development, hyperparameter tuning, or quick iteration, it's often sufficient to evaluate on a subset of the data. This is similar to PyTorch Lightning's `limit_val_batches` parameter.

## Changes
- ✅ Added `max_eval_batches` parameter to `TrainingArguments`
- ✅ Implemented batch limiting in `Trainer.evaluation_loop`
- ✅ Added test coverage
- ✅ Added documentation in parameter metadata

## Usage Example
```python
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="./output",
    evaluation_strategy="steps",
    eval_steps=100,
    max_eval_batches=50,  # Only evaluate on 50 batches
)

trainer = Trainer(
    model=model,
    args=training_args,
    eval_dataset=eval_dataset,
)

# Evaluation will stop after 50 batches instead of going through the entire dataset
trainer.evaluate()
```
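The early-exit behavior this PR describes can be sketched roughly as follows. This is a minimal stand-in, not the actual `Trainer.evaluation_loop` code: the function name, the plain-list "batches", and the `sum` stand-in for the forward pass are all illustrative.

```python
def limited_eval_loop(dataloader, max_eval_batches=None):
    """Collect per-batch results, stopping early when a batch budget is set."""
    per_batch_results = []
    for step, batch in enumerate(dataloader):
        # When a budget is configured, stop once it is exhausted.
        if max_eval_batches is not None and step >= max_eval_batches:
            break
        per_batch_results.append(sum(batch))  # stand-in for forward pass + metric
    return per_batch_results
```

With `max_eval_batches=None` the loop behaves exactly as before, which is why the feature is backward compatible.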
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41524/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41524/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41523
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41523/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41523/comments
https://api.github.com/repos/huggingface/transformers/issues/41523/events
https://github.com/huggingface/transformers/pull/41523
3,505,936,204
PR_kwDOCUB6oc6tQFYu
41,523
Add test coverage for ConvNextImageProcessorFast
{ "login": "KaparthyReddy", "id": 166050493, "node_id": "U_kgDOCeW6vQ", "avatar_url": "https://avatars.githubusercontent.com/u/166050493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaparthyReddy", "html_url": "https://github.com/KaparthyReddy", "followers_url": "https://api.github.com/users/KaparthyReddy/followers", "following_url": "https://api.github.com/users/KaparthyReddy/following{/other_user}", "gists_url": "https://api.github.com/users/KaparthyReddy/gists{/gist_id}", "starred_url": "https://api.github.com/users/KaparthyReddy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KaparthyReddy/subscriptions", "organizations_url": "https://api.github.com/users/KaparthyReddy/orgs", "repos_url": "https://api.github.com/users/KaparthyReddy/repos", "events_url": "https://api.github.com/users/KaparthyReddy/events{/privacy}", "received_events_url": "https://api.github.com/users/KaparthyReddy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-11T14:52:04
2025-10-11T14:53:14
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41523", "html_url": "https://github.com/huggingface/transformers/pull/41523", "diff_url": "https://github.com/huggingface/transformers/pull/41523.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41523.patch", "merged_at": null }
# Add test coverage for ConvNextImageProcessorFast

## Description
Completes test coverage for the existing `ConvNextImageProcessorFast` implementation by adding `image_processor_list` to enable testing both slow and fast processors.

## Changes
- ✅ Added `image_processor_list = [image_processing_class, fast_image_processing_class]` to the test class
- ✅ Enables all existing tests to run on both slow and fast processors
- ✅ All 19 tests passing

## Testing Results
```bash
RUN_SLOW=1 python -m pytest tests/models/convnext/test_image_processing_convnext.py -v
```
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41523/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41523/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41522
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41522/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41522/comments
https://api.github.com/repos/huggingface/transformers/issues/41522/events
https://github.com/huggingface/transformers/pull/41522
3,505,830,231
PR_kwDOCUB6oc6tPvkv
41,522
Fix _init_weights to safely skip int8 quantized weights
{ "login": "KaparthyReddy", "id": 166050493, "node_id": "U_kgDOCeW6vQ", "avatar_url": "https://avatars.githubusercontent.com/u/166050493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaparthyReddy", "html_url": "https://github.com/KaparthyReddy", "followers_url": "https://api.github.com/users/KaparthyReddy/followers", "following_url": "https://api.github.com/users/KaparthyReddy/following{/other_user}", "gists_url": "https://api.github.com/users/KaparthyReddy/gists{/gist_id}", "starred_url": "https://api.github.com/users/KaparthyReddy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KaparthyReddy/subscriptions", "organizations_url": "https://api.github.com/users/KaparthyReddy/orgs", "repos_url": "https://api.github.com/users/KaparthyReddy/repos", "events_url": "https://api.github.com/users/KaparthyReddy/events{/privacy}", "received_events_url": "https://api.github.com/users/KaparthyReddy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-11T13:28:13
2025-10-13T13:57:22
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41522", "html_url": "https://github.com/huggingface/transformers/pull/41522", "diff_url": "https://github.com/huggingface/transformers/pull/41522.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41522.patch", "merged_at": null }
This PR addresses a RuntimeError that occurs when loading W8A8 quantized Qwen2.5-VL models.

Changes:
- Modified `_init_weights` in `Qwen2_5_VLPreTrainedModel` to safely skip non-floating tensors (e.g., int8 quantized weights) during initialization.
- Ensures float tensors are initialized normally, while int8 or other non-float tensors are ignored, preventing errors from `normal_()` on integer dtypes.
- Improves compatibility for loading quantized models without vLLM and allows safe custom hooks for research purposes.

Note: This PR does not include the tester file `test_init_weights_safe.py`; that file exists only in the fork for private testing.
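The dtype guard described above can be sketched with a stand-in tensor type. The real change would call `is_floating_point()` on torch tensors inside `_init_weights`; `FakeTensor` and `init_weight` here are illustrative names, not code from the PR.

```python
from dataclasses import dataclass

@dataclass
class FakeTensor:
    # Stand-in for a torch tensor; only the pieces needed for the sketch.
    dtype: str
    initialized: bool = False

    def is_floating_point(self):
        return self.dtype.startswith("float")

def init_weight(tensor):
    """Initialize float tensors; leave int8 (quantized) tensors untouched."""
    if not tensor.is_floating_point():
        return tensor  # skip: normal_() would raise on integer dtypes
    tensor.initialized = True  # stand-in for tensor.normal_(mean=0.0, std=0.02)
    return tensor
```

The key point is that the guard runs before any in-place init op, so quantized checkpoints load without touching their int8 payloads.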
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41522/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41522/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41521
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41521/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41521/comments
https://api.github.com/repos/huggingface/transformers/issues/41521/events
https://github.com/huggingface/transformers/pull/41521
3,505,718,903
PR_kwDOCUB6oc6tPXlB
41,521
Fix forced_bos_token_id not set in generation_config
{ "login": "Addyk-24", "id": 174926659, "node_id": "U_kgDOCm0rQw", "avatar_url": "https://avatars.githubusercontent.com/u/174926659?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Addyk-24", "html_url": "https://github.com/Addyk-24", "followers_url": "https://api.github.com/users/Addyk-24/followers", "following_url": "https://api.github.com/users/Addyk-24/following{/other_user}", "gists_url": "https://api.github.com/users/Addyk-24/gists{/gist_id}", "starred_url": "https://api.github.com/users/Addyk-24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Addyk-24/subscriptions", "organizations_url": "https://api.github.com/users/Addyk-24/orgs", "repos_url": "https://api.github.com/users/Addyk-24/repos", "events_url": "https://api.github.com/users/Addyk-24/events{/privacy}", "received_events_url": "https://api.github.com/users/Addyk-24/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-11T12:19:20
2025-10-15T10:03:08
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41521", "html_url": "https://github.com/huggingface/transformers/pull/41521", "diff_url": "https://github.com/huggingface/transformers/pull/41521.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41521.patch", "merged_at": null }
# What does this PR do?

Fixes #41492

Fixes incorrect target-language generation during evaluation/validation in `run_translation.py` for multilingual translation models (mBART, M2M100).

### Problem
When fine-tuning multilingual models, `forced_bos_token_id` was only set in `model.config` but not in `model.generation_config`. During evaluation, `model.generate()` reads from `generation_config`, causing generation in the wrong language and artificially low BLEU scores (previously ~2-5, reflecting wrong-language output).

### Solution
Set `forced_bos_token_id` in both `model.config` and `model.generation_config`.

Results:
<img width="275" height="112" alt="eval_metrics" src="https://github.com/user-attachments/assets/5b499354-bdbc-4eb0-b07a-fcb75cc2690f" />
- ✅ Generates the correct language token id for the correct target language.
- ✅ A warning appears if you modify `model.config` directly for generation; using `model.generation_config` removes this warning and ensures Transformers v5+ uses the setting correctly.
- ✅ All evaluations complete without errors.
- ✅ Using `decoder_start_token_id` only (or both) causes empty outputs.
- ✅ With this fix, the target language ID is automatically handled during generation.

## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?
@zach-huggingface @CyrilVallez
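The core of the fix can be illustrated with plain namespace objects standing in for the model and its configs. `set_forced_bos` is an illustrative helper written for this sketch, and the token id is a made-up example, not code from the PR.

```python
from types import SimpleNamespace

def set_forced_bos(model, forced_bos_token_id):
    # Set the id on both configs: model.generate() reads generation_config,
    # while some legacy code paths still consult model.config.
    model.config.forced_bos_token_id = forced_bos_token_id
    model.generation_config.forced_bos_token_id = forced_bos_token_id
    return model

model = SimpleNamespace(
    config=SimpleNamespace(forced_bos_token_id=None),
    generation_config=SimpleNamespace(forced_bos_token_id=None),
)
set_forced_bos(model, 250004)  # illustrative target-language token id
```

Setting both keeps old and new code paths consistent, which is exactly why the bug only showed up at evaluation time.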
{ "login": "Addyk-24", "id": 174926659, "node_id": "U_kgDOCm0rQw", "avatar_url": "https://avatars.githubusercontent.com/u/174926659?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Addyk-24", "html_url": "https://github.com/Addyk-24", "followers_url": "https://api.github.com/users/Addyk-24/followers", "following_url": "https://api.github.com/users/Addyk-24/following{/other_user}", "gists_url": "https://api.github.com/users/Addyk-24/gists{/gist_id}", "starred_url": "https://api.github.com/users/Addyk-24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Addyk-24/subscriptions", "organizations_url": "https://api.github.com/users/Addyk-24/orgs", "repos_url": "https://api.github.com/users/Addyk-24/repos", "events_url": "https://api.github.com/users/Addyk-24/events{/privacy}", "received_events_url": "https://api.github.com/users/Addyk-24/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41521/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41521/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41520
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41520/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41520/comments
https://api.github.com/repos/huggingface/transformers/issues/41520/events
https://github.com/huggingface/transformers/issues/41520
3,505,415,635
I_kwDOCUB6oc7Q8GXT
41,520
Add GPU-Accelerated Video Decoding for VideoMAE, TimeSformer, and Similar Models
{ "login": "Aki-07", "id": 95642646, "node_id": "U_kgDOBbNkFg", "avatar_url": "https://avatars.githubusercontent.com/u/95642646?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aki-07", "html_url": "https://github.com/Aki-07", "followers_url": "https://api.github.com/users/Aki-07/followers", "following_url": "https://api.github.com/users/Aki-07/following{/other_user}", "gists_url": "https://api.github.com/users/Aki-07/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aki-07/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aki-07/subscriptions", "organizations_url": "https://api.github.com/users/Aki-07/orgs", "repos_url": "https://api.github.com/users/Aki-07/repos", "events_url": "https://api.github.com/users/Aki-07/events{/privacy}", "received_events_url": "https://api.github.com/users/Aki-07/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
closed
false
null
[]
null
[]
2025-10-11T08:15:22
2025-10-13T13:42:29
2025-10-13T13:42:29
CONTRIBUTOR
null
null
null
null
### Feature request
I'd like to propose adding a lightweight, GPU-friendly video decoding utility to Transformers. Right now, models like VideoMAE and TimeSformer depend on PIL or OpenCV for frame extraction, which keeps preprocessing on the CPU and slows down training and inference. A unified `VideoReader` module that wraps libraries such as PyAV or Decord could decode and batch frames directly as CUDA tensors, eliminating a major bottleneck in current video workflows and making it easier for users to stream large clips efficiently into Transformer-based video and multimodal models.

### Motivation
Most video pipelines in Transformers still rely on CPU-bound decoding through PIL or OpenCV. For large-batch or long-clip training, this quickly becomes the main bottleneck: GPU utilization stays low while the CPU decodes frames one by one. Libraries like PyAV and Decord already provide fast, batched video loading that can hand off tensors directly to the GPU. Integrating this capability natively would make training and inference with models such as VideoMAE, TimeSformer, or InternVideo far more efficient.

### Your contribution
I love working in this space; given the green light, I'll start working on it.
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41520/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41520/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41519
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41519/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41519/comments
https://api.github.com/repos/huggingface/transformers/issues/41519/events
https://github.com/huggingface/transformers/issues/41519
3,505,413,795
I_kwDOCUB6oc7Q8F6j
41,519
Add Shared Vision–Language Negative Sampling Utility for BLIP / LLaVA / IDEFICS
{ "login": "Aki-07", "id": 95642646, "node_id": "U_kgDOBbNkFg", "avatar_url": "https://avatars.githubusercontent.com/u/95642646?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aki-07", "html_url": "https://github.com/Aki-07", "followers_url": "https://api.github.com/users/Aki-07/followers", "following_url": "https://api.github.com/users/Aki-07/following{/other_user}", "gists_url": "https://api.github.com/users/Aki-07/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aki-07/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aki-07/subscriptions", "organizations_url": "https://api.github.com/users/Aki-07/orgs", "repos_url": "https://api.github.com/users/Aki-07/repos", "events_url": "https://api.github.com/users/Aki-07/events{/privacy}", "received_events_url": "https://api.github.com/users/Aki-07/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-10-11T08:13:34
2025-10-13T13:58:31
null
CONTRIBUTOR
null
null
null
null
### Feature request
Add a unified vision-language negative sampling utility for contrastive learning:
- Support hard negative mining and in-batch negative sampling.
- Compatible with BLIP, LLaVA, and IDEFICS training pipelines.
- Modular design for easy integration into existing VLM frameworks.

### Motivation
VLM training still relies on ad-hoc contrastive negative sampling code. A common utility to batch image–text negatives (with optional hard-negative mining) would simplify research pipelines and encourage standardization.

### Your contribution
If approved, I'll start working on this.

Contributor: [@Aki-07](https://github.com/Aki-07)
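The in-batch variant mentioned above can be sketched in a few lines: for each image-text pair, every other caption in the batch serves as a negative. The function name is an illustration for this sketch, not part of any proposed API.

```python
def in_batch_negative_indices(batch_size):
    """For each anchor index, return the indices of its in-batch negatives."""
    return {
        anchor: [j for j in range(batch_size) if j != anchor]
        for anchor in range(batch_size)
    }
```

Hard-negative mining would refine this by ranking the candidates by similarity instead of using all of them.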
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41519/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41519/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41518
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41518/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41518/comments
https://api.github.com/repos/huggingface/transformers/issues/41518/events
https://github.com/huggingface/transformers/issues/41518
3,505,411,105
I_kwDOCUB6oc7Q8FQh
41,518
Add Structured Prompt Templates Registry for LLM / VLM / Diffusion Tasks
{ "login": "Aki-07", "id": 95642646, "node_id": "U_kgDOBbNkFg", "avatar_url": "https://avatars.githubusercontent.com/u/95642646?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aki-07", "html_url": "https://github.com/Aki-07", "followers_url": "https://api.github.com/users/Aki-07/followers", "following_url": "https://api.github.com/users/Aki-07/following{/other_user}", "gists_url": "https://api.github.com/users/Aki-07/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aki-07/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aki-07/subscriptions", "organizations_url": "https://api.github.com/users/Aki-07/orgs", "repos_url": "https://api.github.com/users/Aki-07/repos", "events_url": "https://api.github.com/users/Aki-07/events{/privacy}", "received_events_url": "https://api.github.com/users/Aki-07/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-10-11T08:10:20
2025-10-13T15:06:20
null
CONTRIBUTOR
null
null
null
null
### Feature request
Introduce `transformers.prompt_templates` — a YAML-based registry and accessor API:

```python
from transformers import PromptTemplates

PromptTemplates.get("summarization")  # "Summarize the following text:"
PromptTemplates.list_tasks()          # ["summarization", "vqa", "ocr", ...]
```

- Templates stored as YAML/JSON under `src/transformers/prompt_templates/templates/`.
- Accessor + validation in `registry.py`.
- Optional CLI command `transformers-cli list-prompts`.
- Pipelines can import a template by task name instead of hard-coding prompt strings.

### Motivation
Every pipeline and model today embeds its own prompt strings (e.g., summarization, OCR, VQA). This duplication makes results inconsistent and hard to benchmark. A central registry of task-specific prompt templates would unify defaults and enable easy community additions.

### Your contribution
I'll implement the registry module, add unit tests and docs, and migrate 1–2 pipelines (summarization / captioning) to use it.

Contributor: [@Aki-07](https://github.com/Aki-07)
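A minimal in-memory sketch of the proposed accessor API is below; the template strings and task names are illustrative, and the proposal would load them from YAML files on disk rather than hard-code a dict.

```python
class PromptTemplates:
    # In-memory stand-in for templates loaded from YAML/JSON files.
    _templates = {
        "summarization": "Summarize the following text:",
        "vqa": "Answer the question about the image:",
    }

    @classmethod
    def get(cls, task):
        """Return the template for a task, with a clear error for unknown tasks."""
        try:
            return cls._templates[task]
        except KeyError:
            raise KeyError(f"No prompt template registered for task {task!r}") from None

    @classmethod
    def list_tasks(cls):
        """Return the sorted list of registered task names."""
        return sorted(cls._templates)
```

A registry like this keeps lookup trivial while leaving validation and file loading to `registry.py` as the proposal describes.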
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41518/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41518/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41517
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41517/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41517/comments
https://api.github.com/repos/huggingface/transformers/issues/41517/events
https://github.com/huggingface/transformers/issues/41517
3,505,388,270
I_kwDOCUB6oc7Q7_ru
41,517
Add MobileNetV2 Fast Image Processor
{ "login": "Aki-07", "id": 95642646, "node_id": "U_kgDOBbNkFg", "avatar_url": "https://avatars.githubusercontent.com/u/95642646?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aki-07", "html_url": "https://github.com/Aki-07", "followers_url": "https://api.github.com/users/Aki-07/followers", "following_url": "https://api.github.com/users/Aki-07/following{/other_user}", "gists_url": "https://api.github.com/users/Aki-07/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aki-07/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aki-07/subscriptions", "organizations_url": "https://api.github.com/users/Aki-07/orgs", "repos_url": "https://api.github.com/users/Aki-07/repos", "events_url": "https://api.github.com/users/Aki-07/events{/privacy}", "received_events_url": "https://api.github.com/users/Aki-07/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
closed
false
null
[]
null
[]
2025-10-11T07:55:14
2025-10-11T07:57:55
2025-10-11T07:57:55
CONTRIBUTOR
null
null
null
null
### Feature request
Add a fast image processor for **MobileNetV2** to bring it in line with other vision models (e.g., ViT, ConvNeXt) that already have fast variants. This will improve performance for all MobileNetV2-based checkpoints and multimodal processors that depend on it.

### Scope
- Mirror current preprocessing steps (resize, crop, normalize).
- Base on the existing `BaseImageProcessorFast` class.
- Register it in the model's auto mapping for automatic detection.

### Motivation
MobileNetV2 is one of the most widely used lightweight vision models, both as a standalone backbone and inside multimodal processors. However, it still lacks a fast image processor implementation. Adding `MobileNetV2ImageProcessorFast` will:
- Enable torch-native preprocessing, avoiding PIL bottlenecks.
- Significantly reduce preprocessing latency in training and inference pipelines.
- Improve GPU utilization, especially in batched or streaming inference.
- Ensure feature parity with other vision models (ViT, ConvNeXt, ResNet) that already provide fast variants.

This improvement benefits the many downstream users and deployments that rely on MobileNetV2-based models, while keeping the implementation lightweight and consistent with Hugging Face's design for fast processors.

### Your contribution
My planned steps:
- Create the new `MobileNetV2ImageProcessorFast` based on the existing `BaseImageProcessorFast` design pattern.
- Ensure full feature parity with the current (slow) MobileNetV2 image processor, including resizing, cropping, normalization, and label reduction for segmentation tasks.
- Register the processor in the MobileNetV2 module and auto mapping so it's automatically used by any MobileNetV2 checkpoint.
- Add/update tests to validate consistency between the slow and fast versions.

Contributor: @Aki-07

Happy to take full ownership of this feature once approved.
{ "login": "Aki-07", "id": 95642646, "node_id": "U_kgDOBbNkFg", "avatar_url": "https://avatars.githubusercontent.com/u/95642646?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aki-07", "html_url": "https://github.com/Aki-07", "followers_url": "https://api.github.com/users/Aki-07/followers", "following_url": "https://api.github.com/users/Aki-07/following{/other_user}", "gists_url": "https://api.github.com/users/Aki-07/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aki-07/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aki-07/subscriptions", "organizations_url": "https://api.github.com/users/Aki-07/orgs", "repos_url": "https://api.github.com/users/Aki-07/repos", "events_url": "https://api.github.com/users/Aki-07/events{/privacy}", "received_events_url": "https://api.github.com/users/Aki-07/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41517/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41517/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41516
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41516/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41516/comments
https://api.github.com/repos/huggingface/transformers/issues/41516/events
https://github.com/huggingface/transformers/issues/41516
3,505,305,779
I_kwDOCUB6oc7Q7riz
41,516
when finetune deepseek v3.1 with lora,an ValueError was reported: QuantizationMethod.FP8 do not support training
{ "login": "Francisapzii", "id": 101245339, "node_id": "U_kgDOBgjhmw", "avatar_url": "https://avatars.githubusercontent.com/u/101245339?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Francisapzii", "html_url": "https://github.com/Francisapzii", "followers_url": "https://api.github.com/users/Francisapzii/followers", "following_url": "https://api.github.com/users/Francisapzii/following{/other_user}", "gists_url": "https://api.github.com/users/Francisapzii/gists{/gist_id}", "starred_url": "https://api.github.com/users/Francisapzii/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Francisapzii/subscriptions", "organizations_url": "https://api.github.com/users/Francisapzii/orgs", "repos_url": "https://api.github.com/users/Francisapzii/repos", "events_url": "https://api.github.com/users/Francisapzii/events{/privacy}", "received_events_url": "https://api.github.com/users/Francisapzii/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-10-11T06:43:50
2025-10-16T18:38:56
null
NONE
null
null
null
null
### System Info

The error was as below. Please help.

```
Traceback (most recent call last):
  File "/hail/cjs/finetune/ft0/lora.py", line 231, in <module>
    model, tokenizer = finetune_with_lora(model_name, output_dir)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/hail/cjs/finetune/ft0/lora.py", line 156, in finetune_with_lora
    trainer = Trainer(
              ^^^^^^^^
  File "/hail/cjs/anaconda3/envs/lora1/lib/python3.11/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/hail/cjs/anaconda3/envs/lora1/lib/python3.11/site-packages/transformers/trainer.py", line 572, in __init__
    raise ValueError(
ValueError: The model you are trying to fine-tune is quantized with QuantizationMethod.FP8 but that quantization method do not support training. Please open an issue on GitHub: https://github.com/huggingface/transformers to request the support for training support for QuantizationMethod.FP8
```

### Who can help?

_No response_

### Information

- [ ] The official example scripts
- [ ] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

Fine-tune "deepseek-ai/DeepSeek-V3.1-Terminus" with LoRA; this error will happen.

### Expected behavior

Please fix this error.
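For context, the guard that produces this error can be sketched roughly as follows. This is a simplified illustration, not the actual `Trainer` source; the entries in the supported set are placeholders, not the real `QuantizationMethod` values:

```python
# Illustrative sketch of a "is this quantized model trainable?" guard.
# The set below is a placeholder; the real check lives inside
# transformers' Trainer and its quantizer classes.
TRAINABLE_QUANT_METHODS = {"bitsandbytes_4bit", "bitsandbytes_8bit"}

def check_quantized_model_trainable(quant_method: str) -> None:
    """Raise ValueError when the quantization method has no training support."""
    if quant_method not in TRAINABLE_QUANT_METHODS:
        raise ValueError(
            f"The model you are trying to fine-tune is quantized with "
            f"{quant_method} but that quantization method does not support training."
        )

# An FP8 checkpoint such as DeepSeek-V3.1 would trip the guard:
try:
    check_quantized_model_trainable("fp8")
except ValueError as err:
    print(type(err).__name__)  # ValueError
```

In other words, the error is raised before training even starts: adding LoRA adapters does not bypass it, because the base model's quantization method itself is flagged as non-trainable.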
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41516/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41516/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41515
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41515/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41515/comments
https://api.github.com/repos/huggingface/transformers/issues/41515/events
https://github.com/huggingface/transformers/pull/41515
3,504,483,436
PR_kwDOCUB6oc6tLSUv
41,515
[ci] Disable workflows with secrets and custom runners to run on fork
{ "login": "HollowMan6", "id": 43995067, "node_id": "MDQ6VXNlcjQzOTk1MDY3", "avatar_url": "https://avatars.githubusercontent.com/u/43995067?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HollowMan6", "html_url": "https://github.com/HollowMan6", "followers_url": "https://api.github.com/users/HollowMan6/followers", "following_url": "https://api.github.com/users/HollowMan6/following{/other_user}", "gists_url": "https://api.github.com/users/HollowMan6/gists{/gist_id}", "starred_url": "https://api.github.com/users/HollowMan6/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HollowMan6/subscriptions", "organizations_url": "https://api.github.com/users/HollowMan6/orgs", "repos_url": "https://api.github.com/users/HollowMan6/repos", "events_url": "https://api.github.com/users/HollowMan6/events{/privacy}", "received_events_url": "https://api.github.com/users/HollowMan6/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T20:55:17
2025-10-16T14:59:25
2025-10-16T14:59:25
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41515", "html_url": "https://github.com/huggingface/transformers/pull/41515", "diff_url": "https://github.com/huggingface/transformers/pull/41515.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41515.patch", "merged_at": null }
# What does this PR do?

Skip those workflows by checking if the repository is owned by `huggingface`. Tested by directly pushing to the fork: https://github.com/HollowMan6/transformers/commit/14be7f3c250c3ba1887877fc1df67b2e87559759 and opening a PR on the fork side:

- https://github.com/HollowMan6/transformers/pull/1

## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

@ydshieh
{ "login": "HollowMan6", "id": 43995067, "node_id": "MDQ6VXNlcjQzOTk1MDY3", "avatar_url": "https://avatars.githubusercontent.com/u/43995067?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HollowMan6", "html_url": "https://github.com/HollowMan6", "followers_url": "https://api.github.com/users/HollowMan6/followers", "following_url": "https://api.github.com/users/HollowMan6/following{/other_user}", "gists_url": "https://api.github.com/users/HollowMan6/gists{/gist_id}", "starred_url": "https://api.github.com/users/HollowMan6/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HollowMan6/subscriptions", "organizations_url": "https://api.github.com/users/HollowMan6/orgs", "repos_url": "https://api.github.com/users/HollowMan6/repos", "events_url": "https://api.github.com/users/HollowMan6/events{/privacy}", "received_events_url": "https://api.github.com/users/HollowMan6/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41515/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41515/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41514
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41514/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41514/comments
https://api.github.com/repos/huggingface/transformers/issues/41514/events
https://github.com/huggingface/transformers/pull/41514
3,503,050,135
PR_kwDOCUB6oc6tGWoJ
41,514
delete some tokenizer tests using pickle
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T13:54:34
2025-10-14T12:50:53
2025-10-14T12:50:52
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41514", "html_url": "https://github.com/huggingface/transformers/pull/41514", "diff_url": "https://github.com/huggingface/transformers/pull/41514.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41514.patch", "merged_at": "2025-10-14T12:50:52" }
# What does this PR do? There is no room for `pickle` within `transformers`!
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41514/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41514/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41513
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41513/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41513/comments
https://api.github.com/repos/huggingface/transformers/issues/41513/events
https://github.com/huggingface/transformers/pull/41513
3,502,993,717
PR_kwDOCUB6oc6tGKkr
41,513
Remove references to AutoModelForVision2Seq
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T13:41:07
2025-10-13T16:00:10
2025-10-13T16:00:08
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41513", "html_url": "https://github.com/huggingface/transformers/pull/41513", "diff_url": "https://github.com/huggingface/transformers/pull/41513.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41513.patch", "merged_at": "2025-10-13T16:00:07" }
Since `AutoModelForVision2Seq` is deprecated for `v5`, this PR redirects our code to point at `AutoModelForImageTextToText` to cut down on unnecessary warnings. Fixes #41509
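The general shape of such a deprecation redirect can be sketched in plain Python. The class and function names below are illustrative stand-ins, not the actual transformers shim:

```python
import warnings

class AutoModelForImageTextToText:
    """Stand-in for the replacement auto class (illustrative only)."""

def auto_model_for_vision2seq():
    """Sketch of a deprecation shim: emit a FutureWarning, return the new class."""
    warnings.warn(
        "AutoModelForVision2Seq is deprecated and will be removed in v5; "
        "use AutoModelForImageTextToText instead.",
        FutureWarning,
        stacklevel=2,
    )
    return AutoModelForImageTextToText
```

Callers migrating ahead of v5 simply swap the class name; the shim only exists so that older code keeps working, at the cost of a warning on every use.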
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41513/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41513/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41512
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41512/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41512/comments
https://api.github.com/repos/huggingface/transformers/issues/41512/events
https://github.com/huggingface/transformers/pull/41512
3,502,706,946
PR_kwDOCUB6oc6tFLhZ
41,512
Remove outdated flags
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T12:34:34
2025-10-10T12:43:54
2025-10-10T12:34:47
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41512", "html_url": "https://github.com/huggingface/transformers/pull/41512", "diff_url": "https://github.com/huggingface/transformers/pull/41512.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41512.patch", "merged_at": "2025-10-10T12:34:47" }
# What does this PR do? Those features were removed recently, but these models were added in the meantime, so they still have the flags.
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41512/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41512/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41511
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41511/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41511/comments
https://api.github.com/repos/huggingface/transformers/issues/41511/events
https://github.com/huggingface/transformers/pull/41511
3,502,609,986
PR_kwDOCUB6oc6tE2Kn
41,511
[don't merge yet] Remove some custom datasets defined in codebase
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-10T12:07:03
2025-10-13T12:53:08
null
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41511", "html_url": "https://github.com/huggingface/transformers/pull/41511", "diff_url": "https://github.com/huggingface/transformers/pull/41511.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41511.patch", "merged_at": null }
# What does this PR do? This should have been gone a long, long time ago ...
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41511/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41511/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41510
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41510/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41510/comments
https://api.github.com/repos/huggingface/transformers/issues/41510/events
https://github.com/huggingface/transformers/pull/41510
3,502,490,958
PR_kwDOCUB6oc6tEb_S
41,510
Fix detectron2 import
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T11:31:28
2025-10-10T13:40:59
2025-10-10T11:33:47
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41510", "html_url": "https://github.com/huggingface/transformers/pull/41510", "diff_url": "https://github.com/huggingface/transformers/pull/41510.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41510.patch", "merged_at": "2025-10-10T11:33:47" }
# What does this PR do? Detectron2 does not uninstall correctly for some reason (probably the setup, as the lib needs to be installed from source), so we need this. cc @ydshieh
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41510/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41510/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41509
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41509/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41509/comments
https://api.github.com/repos/huggingface/transformers/issues/41509/events
https://github.com/huggingface/transformers/issues/41509
3,502,437,111
I_kwDOCUB6oc7QwvL3
41,509
Warning when using the "image-to-text" pipeline
{ "login": "intexcor", "id": 142020129, "node_id": "U_kgDOCHcOIQ", "avatar_url": "https://avatars.githubusercontent.com/u/142020129?v=4", "gravatar_id": "", "url": "https://api.github.com/users/intexcor", "html_url": "https://github.com/intexcor", "followers_url": "https://api.github.com/users/intexcor/followers", "following_url": "https://api.github.com/users/intexcor/following{/other_user}", "gists_url": "https://api.github.com/users/intexcor/gists{/gist_id}", "starred_url": "https://api.github.com/users/intexcor/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/intexcor/subscriptions", "organizations_url": "https://api.github.com/users/intexcor/orgs", "repos_url": "https://api.github.com/users/intexcor/repos", "events_url": "https://api.github.com/users/intexcor/events{/privacy}", "received_events_url": "https://api.github.com/users/intexcor/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-10T11:14:56
2025-10-13T16:00:09
2025-10-13T16:00:09
NONE
null
null
null
null
### System Info

transformers=4.57.0
python 3.13

### Who can help?

_No response_

### Information

- [x] The official example scripts
- [x] My own modified scripts

### Tasks

- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)

### Reproduction

```python
from transformers import pipeline

pipe = pipeline("image-to-text", model="intexcp/donut")
```

### Expected behavior

Bug fixes
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41509/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41509/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41508
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41508/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41508/comments
https://api.github.com/repos/huggingface/transformers/issues/41508/events
https://github.com/huggingface/transformers/pull/41508
3,502,170,916
PR_kwDOCUB6oc6tDV69
41,508
Bump to hfh 1.0.0.rc5 to fix test
{ "login": "Wauplin", "id": 11801849, "node_id": "MDQ6VXNlcjExODAxODQ5", "avatar_url": "https://avatars.githubusercontent.com/u/11801849?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Wauplin", "html_url": "https://github.com/Wauplin", "followers_url": "https://api.github.com/users/Wauplin/followers", "following_url": "https://api.github.com/users/Wauplin/following{/other_user}", "gists_url": "https://api.github.com/users/Wauplin/gists{/gist_id}", "starred_url": "https://api.github.com/users/Wauplin/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Wauplin/subscriptions", "organizations_url": "https://api.github.com/users/Wauplin/orgs", "repos_url": "https://api.github.com/users/Wauplin/repos", "events_url": "https://api.github.com/users/Wauplin/events{/privacy}", "received_events_url": "https://api.github.com/users/Wauplin/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T10:00:09
2025-10-10T10:12:10
2025-10-10T10:12:09
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41508", "html_url": "https://github.com/huggingface/transformers/pull/41508", "diff_url": "https://github.com/huggingface/transformers/pull/41508.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41508.patch", "merged_at": "2025-10-10T10:12:09" }
Should fix the currently failing CI. See https://github.com/huggingface/huggingface_hub/pull/3433 for info. (related to [internal slack thread](https://huggingface.slack.com/archives/C01NE71C4F7/p1760022191957959))
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41508/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41508/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41507
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41507/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41507/comments
https://api.github.com/repos/huggingface/transformers/issues/41507/events
https://github.com/huggingface/transformers/pull/41507
3,502,138,986
PR_kwDOCUB6oc6tDO8u
41,507
[kernels] rm mra kernels
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T09:49:55
2025-10-14T11:34:06
2025-10-14T11:34:04
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41507", "html_url": "https://github.com/huggingface/transformers/pull/41507", "diff_url": "https://github.com/huggingface/transformers/pull/41507.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41507.patch", "merged_at": "2025-10-14T11:34:04" }
# What does this PR do? Removes the mra kernels and uses the kernels from the Hub instead: https://huggingface.co/kernels-community/mra
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41507/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41507/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41506
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41506/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41506/comments
https://api.github.com/repos/huggingface/transformers/issues/41506/events
https://github.com/huggingface/transformers/pull/41506
3,502,082,825
PR_kwDOCUB6oc6tDCyW
41,506
[SAM] Fix typing hints
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T09:31:12
2025-10-13T09:52:00
2025-10-13T09:52:00
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41506", "html_url": "https://github.com/huggingface/transformers/pull/41506", "diff_url": "https://github.com/huggingface/transformers/pull/41506.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41506.patch", "merged_at": "2025-10-13T09:52:00" }
# What does this PR do? As per title
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41506/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41506/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41505
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41505/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41505/comments
https://api.github.com/repos/huggingface/transformers/issues/41505/events
https://github.com/huggingface/transformers/pull/41505
3,502,017,115
PR_kwDOCUB6oc6tC06p
41,505
🚨 [v5] `generate` delegates default cache initialization to the model
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T09:09:27
2025-10-13T12:20:52
2025-10-13T12:20:48
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41505", "html_url": "https://github.com/huggingface/transformers/pull/41505", "diff_url": "https://github.com/huggingface/transformers/pull/41505.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41505.patch", "merged_at": "2025-10-13T12:20:48" }
# What does this PR do? See PR title. Now that all traces of legacy caches have been removed, we can trust the model to initialize its own cache! This means we no longer need to set `cache_implementation="xxx"` defaults in new models, assuming the model's forward pass defaults to the right cache class. Also fixes related bugs, uncovered by not feeding a cache to the model.
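The delegation described in the PR body can be sketched with stand-in classes. Every name below is hypothetical (this is not the real transformers API); it only illustrates the pattern of `generate` omitting the cache and the model's forward pass creating its own default.

```python
class DynamicCache:
    """Stand-in for the model's default cache class."""
    def __init__(self):
        self.layers = []

class ToyModel:
    # The model, not generate(), owns the default cache choice.
    default_cache_class = DynamicCache

    def forward(self, input_ids, past_key_values=None):
        if past_key_values is None:  # delegation point: model builds its own cache
            past_key_values = self.default_cache_class()
        return input_ids, past_key_values

def generate(model, input_ids):
    # generate() simply omits the cache and trusts the model's default.
    _, cache = model.forward(input_ids)
    return cache

cache = generate(ToyModel(), [1, 2, 3])
print(type(cache).__name__)  # DynamicCache
```

With this shape, a new model picks its cache class once in `forward`, and `generate` never needs a per-model `cache_implementation` default.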
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41505/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41505/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41504
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41504/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41504/comments
https://api.github.com/repos/huggingface/transformers/issues/41504/events
https://github.com/huggingface/transformers/pull/41504
3,502,004,725
PR_kwDOCUB6oc6tCyOc
41,504
Revert `local_rank` deletion and some cleaning
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T09:05:21
2025-10-10T10:23:06
2025-10-10T10:23:04
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41504", "html_url": "https://github.com/huggingface/transformers/pull/41504", "diff_url": "https://github.com/huggingface/transformers/pull/41504.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41504.patch", "merged_at": "2025-10-10T10:23:04" }
# What does this PR do? This PR removes some bits that I forgot when removing `logging_dir`. Also, we need to keep `local_rank`, since `torch.distributed.launch` injects `local_rank` into the script. I will deprecate it once torch removes it from their codebase and we no longer support that version of pytorch, which is a very long time away.
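The reason `local_rank` has to stay is that `torch.distributed.launch` passes `--local_rank=<n>` to every worker script it spawns, so the script's argument parser must accept it. A minimal stdlib sketch of that injection (the parser below is illustrative, not the actual `TrainingArguments` definition):

```python
import argparse

parser = argparse.ArgumentParser()
# torch.distributed.launch appends this flag to each worker's argv
parser.add_argument("--local_rank", type=int, default=-1,
                    help="injected by torch.distributed.launch")

# Simulate what a launched worker on GPU 3 would receive:
args = parser.parse_args(["--local_rank=3"])
print(args.local_rank)  # 3
```

If the argument were deleted, every script launched this way would crash with an "unrecognized arguments" error before training even starts.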
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41504/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41504/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41503
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41503/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41503/comments
https://api.github.com/repos/huggingface/transformers/issues/41503/events
https://github.com/huggingface/transformers/pull/41503
3,501,937,472
PR_kwDOCUB6oc6tCkWT
41,503
Fix some tests
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T08:41:49
2025-10-10T09:05:11
2025-10-10T09:05:09
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41503", "html_url": "https://github.com/huggingface/transformers/pull/41503", "diff_url": "https://github.com/huggingface/transformers/pull/41503.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41503.patch", "merged_at": "2025-10-10T09:05:09" }
# What does this PR do? This PR removes some last remnants of the old cache format, and fixes related tests. I also updated contrastive search on the hub https://huggingface.co/transformers-community/contrastive-search/discussions/3 to fix the related tests with the new format
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41503/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41503/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41502
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41502/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41502/comments
https://api.github.com/repos/huggingface/transformers/issues/41502/events
https://github.com/huggingface/transformers/pull/41502
3,501,735,980
PR_kwDOCUB6oc6tB5fn
41,502
:globe_with_meridians: [i18n-KO] Translated `ko-LFM2.md` to Korean
{ "login": "ssum21", "id": 116950962, "node_id": "U_kgDOBviHsg", "avatar_url": "https://avatars.githubusercontent.com/u/116950962?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ssum21", "html_url": "https://github.com/ssum21", "followers_url": "https://api.github.com/users/ssum21/followers", "following_url": "https://api.github.com/users/ssum21/following{/other_user}", "gists_url": "https://api.github.com/users/ssum21/gists{/gist_id}", "starred_url": "https://api.github.com/users/ssum21/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ssum21/subscriptions", "organizations_url": "https://api.github.com/users/ssum21/orgs", "repos_url": "https://api.github.com/users/ssum21/repos", "events_url": "https://api.github.com/users/ssum21/events{/privacy}", "received_events_url": "https://api.github.com/users/ssum21/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T07:36:31
2025-10-16T18:29:04
2025-10-16T18:29:04
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41502", "html_url": "https://github.com/huggingface/transformers/pull/41502", "diff_url": "https://github.com/huggingface/transformers/pull/41502.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41502.patch", "merged_at": "2025-10-16T18:29:04" }
<!-- Please title the PR ":globe_with_meridians: [i18n-KO] Translated `<your_file>.md` to Korean" -->

# What does this PR do?

Translated the `lfm2.md` file of the documentation to Korean. Thank you in advance for your review.

Part of https://github.com/huggingface/transformers/issues/20179

## Before reviewing

- [x] Check for missing / redundant translations
- [x] Grammar check
- [x] Review or add new terms to the glossary
- [x] Check inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas

## Who can review? (Initial)

<!-- 1. Only reveal the review request below once all the checks above are complete! -->
May you please review this PR? @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?

## Who can review? (Final)

<!-- 2. Only reveal the comment below after the KREW team review is done! -->
<!-- transformers and course: @stevhliu; agent-course: @sergiopanieg; smol-agents: @albertvillanova -->
@stevhliu May you please review this PR?
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41502/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41502/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41501
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41501/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41501/comments
https://api.github.com/repos/huggingface/transformers/issues/41501/events
https://github.com/huggingface/transformers/issues/41501
3,501,684,923
I_kwDOCUB6oc7Qt3i7
41,501
Paged attention does not work
{ "login": "yuyijiong", "id": 73890704, "node_id": "MDQ6VXNlcjczODkwNzA0", "avatar_url": "https://avatars.githubusercontent.com/u/73890704?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yuyijiong", "html_url": "https://github.com/yuyijiong", "followers_url": "https://api.github.com/users/yuyijiong/followers", "following_url": "https://api.github.com/users/yuyijiong/following{/other_user}", "gists_url": "https://api.github.com/users/yuyijiong/gists{/gist_id}", "starred_url": "https://api.github.com/users/yuyijiong/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yuyijiong/subscriptions", "organizations_url": "https://api.github.com/users/yuyijiong/orgs", "repos_url": "https://api.github.com/users/yuyijiong/repos", "events_url": "https://api.github.com/users/yuyijiong/events{/privacy}", "received_events_url": "https://api.github.com/users/yuyijiong/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-10-10T07:20:29
2025-10-16T11:58:16
null
NONE
null
null
null
null
### System Info

- `transformers` version: 4.57.0
- Platform: Linux-5.15.0-107-generic-x86_64-with-glibc2.35
- Python version: 3.12.11
- Huggingface_hub version: 0.35.3
- Safetensors version: 0.6.2
- Accelerate version: 1.10.1
- Accelerate config:
  - compute_environment: LOCAL_MACHINE
  - distributed_type: MULTI_GPU
  - mixed_precision: bf16
  - use_cpu: False
  - debug: False
  - num_processes: 8
  - machine_rank: 0
  - num_machines: 1
  - gpu_ids: all
  - rdzv_backend: static
  - same_network: True
  - main_training_function: main
  - enable_cpu_affinity: False
  - downcast_bf16: no
  - tpu_use_cluster: False
  - tpu_use_sudo: False
  - tpu_env: []
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu128 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA H20

### Who can help?

@vasqu @ArthurZucker @CyrilVallez

### Information

- [ ] The official example scripts
- [x] My own modified scripts

### Tasks

- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

When I set `attn_implementation` to `"paged_attention"`, I hit the error below even with very simple inference code. This error is similar to #39525.

```python
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import pandas as pd
import torch
from transformers import AutoTokenizer, Qwen3ForCausalLM, AutoConfig, AutoModelForCausalLM

if __name__ == '__main__':
    model_path = "/share/models/Qwen3-4B-Thinking-2507"
    tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True, add_bos_token=False, add_eos_token=False)
    model = AutoModelForCausalLM.from_pretrained(
        model_path,
        dtype=torch.bfloat16,
        trust_remote_code=True,
        # config=config,
        attn_implementation="paged_attention",  # "flash_attention_2", "sdpa"
        device_map="cuda",
    ).eval()

    prompt = "How are you today?"
    chat_prompt = tokenizer.apply_chat_template([{"role": "user", "content": prompt}], tokenize=False, add_generation_prompt=True)
    chat_prompt_ids = tokenizer(chat_prompt, return_tensors="pt")["input_ids"].to(model.device)
    output = model.generate(input_ids=chat_prompt_ids, max_new_tokens=500, num_beams=1, do_sample=False, temperature=1.0, use_cache=True, return_dict_in_generate=True, output_logits=True)
    output_text = tokenizer.decode(output['sequences'][0][chat_prompt_ids.size(1):])
    print(output_text)
```

Error:

```
  File "/root/miniconda3/envs/yyj/lib/python3.12/site-packages/transformers/models/qwen3/modeling_qwen3.py", line 260, in forward
    hidden_states, _ = self.self_attn(
                       ^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/yyj/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/yyj/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/yyj/lib/python3.12/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/yyj/lib/python3.12/site-packages/transformers/models/qwen3/modeling_qwen3.py", line 216, in forward
    attn_output, attn_weights = attention_interface(
                                ^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/yyj/lib/python3.12/site-packages/transformers/integrations/flash_paged.py", line 85, in paged_attention_forward
    cu_seq_lens_q.to(torch.int32),
    ^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'to'
```

### Expected behavior

Fix the bug.
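The crash happens because the paged attention kernel expects cumulative query sequence lengths (`cu_seq_lens_q`), but the generate path never populated them, so the kernel receives `None` and fails on `.to(...)`. A minimal stdlib sketch (not the actual transformers fix) of a defensive pattern is to derive the cumulative lengths from per-sequence lengths when they are missing; the function name here is hypothetical.

```python
from itertools import accumulate

def ensure_cu_seq_lens(cu_seq_lens, seq_lens):
    """Return cumulative sequence lengths, deriving them if absent."""
    if cu_seq_lens is not None:
        return cu_seq_lens
    # [0, len_0, len_0 + len_1, ...] -- the layout flash-attn-style varlen kernels expect
    return [0] + list(accumulate(seq_lens))

print(ensure_cu_seq_lens(None, [4, 2, 3]))  # [0, 4, 6, 9]
```

In the real code path these would be int32 tensors, but the invariant is the same: the kernel must never see `None` where a cumulative-lengths buffer is required.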
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41501/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41501/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41500
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41500/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41500/comments
https://api.github.com/repos/huggingface/transformers/issues/41500/events
https://github.com/huggingface/transformers/pull/41500
3,501,659,884
PR_kwDOCUB6oc6tBpiL
41,500
[QoL] modular conversion shows LoC saved
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T07:10:44
2025-10-10T09:55:24
2025-10-10T09:55:23
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41500", "html_url": "https://github.com/huggingface/transformers/pull/41500", "diff_url": "https://github.com/huggingface/transformers/pull/41500.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41500.patch", "merged_at": "2025-10-10T09:55:23" }
# What does this PR do? Tiny thing. Simply print the LoC saved when doing modular conversion. <img width="1528" height="81" alt="image" src="https://github.com/user-attachments/assets/a6bab118-5dc9-4550-991e-eb8acaa657f4" /> ## Why? Because we sometimes hesitate too much (me first). Modular isn't magic (it's close), and it can't do everything. I sometimes want to over-optimize a conversion that's already leagues better than if I were using `modeling` directly.
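The metric itself is simple: compare the line count of the handwritten modular file against the generated modeling file. A rough sketch of how it could be computed (function and file names are illustrative, not the actual modular converter code):

```python
def loc_saved(modular_lines: list[str], generated_lines: list[str]) -> int:
    """Lines the modular file saves over the fully expanded modeling file."""
    return len(generated_lines) - len(modular_lines)

# e.g. a tiny modular file that expands into a much larger modeling file
modular = ["class MyModel(LlamaModel):", "    pass"]
generated = ["class MyModel:"] + ["    ..."] * 40
print(f"Saved {loc_saved(modular, generated)} LoC")  # Saved 39 LoC
```

Printing this number after each conversion makes the payoff of modular concrete at a glance.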
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41500/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41500/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41499
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41499/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41499/comments
https://api.github.com/repos/huggingface/transformers/issues/41499/events
https://github.com/huggingface/transformers/pull/41499
3,501,646,922
PR_kwDOCUB6oc6tBmvP
41,499
fix bnb model loading
{ "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/jiqing-feng/followers", "following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}", "gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions", "organizations_url": "https://api.github.com/users/jiqing-feng/orgs", "repos_url": "https://api.github.com/users/jiqing-feng/repos", "events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}", "received_events_url": "https://api.github.com/users/jiqing-feng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T07:05:44
2025-10-10T08:28:09
2025-10-10T08:27:29
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41499", "html_url": "https://github.com/huggingface/transformers/pull/41499", "diff_url": "https://github.com/huggingface/transformers/pull/41499.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41499.patch", "merged_at": "2025-10-10T08:27:29" }
```python from transformers import pipeline pipe = pipeline("text-generation", model="hugging-quants/Meta-Llama-3.1-8B-Instruct-BNB-NF4") ``` Before this PR: ``` Traceback (most recent call last): File "/home/jiqing/transformers/src/transformers/pipelines/base.py", line 262, in load_model model = model_class.from_pretrained(model, **fp32_kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/modeling_utils.py", line 272, in _wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/modeling_utils.py", line 4669, in from_pretrained hf_quantizer, config, dtype, device_map = get_hf_quantizer( ^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/quantizers/auto.py", line 301, in get_hf_quantizer config.quantization_config = AutoHfQuantizer.merge_quantization_configs( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/quantizers/auto.py", line 210, in merge_quantization_configs quantization_config = AutoQuantizationConfig.from_dict(quantization_config) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/quantizers/auto.py", line 130, in from_dict raise ValueError( ValueError: Unknown quantization type, got bitsandbytes - supported types are: ['awq', 'bitsandbytes_4bit', 'bitsandbytes_8bit', 'gptq', 'aqlm', 'quanto', 'quark', 'fp_quant', 'eetq', 'higgs', 'hqq', 'compressed-tensors', 'fbgemm_fp8', 'torchao', 'bitnet', 'vptq', 'spqr', 'fp8', 'auto-round', 'mxfp4'] ``` After this PR, it works. Hi @SunMarc. Please review this PR. Thanks!
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41499/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41499/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41498
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41498/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41498/comments
https://api.github.com/repos/huggingface/transformers/issues/41498/events
https://github.com/huggingface/transformers/issues/41498
3,501,348,442
I_kwDOCUB6oc7QslZa
41,498
mamba2 mixer different outputs between fast, slow, training, and inference paths.
{ "login": "tsengalb99", "id": 33385672, "node_id": "MDQ6VXNlcjMzMzg1Njcy", "avatar_url": "https://avatars.githubusercontent.com/u/33385672?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tsengalb99", "html_url": "https://github.com/tsengalb99", "followers_url": "https://api.github.com/users/tsengalb99/followers", "following_url": "https://api.github.com/users/tsengalb99/following{/other_user}", "gists_url": "https://api.github.com/users/tsengalb99/gists{/gist_id}", "starred_url": "https://api.github.com/users/tsengalb99/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tsengalb99/subscriptions", "organizations_url": "https://api.github.com/users/tsengalb99/orgs", "repos_url": "https://api.github.com/users/tsengalb99/repos", "events_url": "https://api.github.com/users/tsengalb99/events{/privacy}", "received_events_url": "https://api.github.com/users/tsengalb99/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-10-10T04:46:26
2025-10-11T08:31:05
null
NONE
null
null
null
null
### System Info ``` - huggingface_hub version: 0.31.4 - Platform: Linux-5.15.0-140-generic-x86_64-with-glibc2.35 - Python version: 3.13.0 - Running in iPython ?: No - Running in notebook ?: No - Running in Google Colab ?: No - Running in Google Colab Enterprise ?: No - Token path ?: /home/alberttseng/.cache/huggingface/token - Has saved token ?: True - Who am I ?: at676 - Configured git credential helpers: - FastAI: N/A - Tensorflow: N/A - Torch: 2.9.0.dev20250803+cu129 - Jinja2: 3.1.6 - Graphviz: N/A - keras: N/A - Pydot: N/A - Pillow: 11.2.1 - hf_transfer: N/A - gradio: N/A - tensorboard: N/A - numpy: 2.2.6 - pydantic: 2.10.3 - aiohttp: 3.11.18 - hf_xet: N/A - ENDPOINT: https://huggingface.co - HF_HUB_CACHE: /home/alberttseng/.cache/huggingface/hub - HF_ASSETS_CACHE: /home/alberttseng/.cache/huggingface/assets - HF_TOKEN_PATH: /home/alberttseng/.cache/huggingface/token - HF_STORED_TOKENS_PATH: /home/alberttseng/.cache/huggingface/stored_tokens - HF_HUB_OFFLINE: False - HF_HUB_DISABLE_TELEMETRY: False - HF_HUB_DISABLE_PROGRESS_BARS: None - HF_HUB_DISABLE_SYMLINKS_WARNING: False - HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False - HF_HUB_DISABLE_IMPLICIT_TOKEN: False - HF_HUB_ENABLE_HF_TRANSFER: False - HF_HUB_ETAG_TIMEOUT: 10 - HF_HUB_DOWNLOAD_TIMEOUT: 10 ``` ### Who can help? @ArthurZucker @Cyrilvallez ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [x] My own task or dataset (give details below) ### Reproduction In https://github.com/huggingface/transformers/blob/0720e206c6ba28887e4d60ef60a6a089f6c1cc76/src/transformers/models/mamba2/modeling_mamba2.py#L656, when given the same input to `hidden_states` and setting the other arguments to `None`, I get different results for the fast path (`self.cuda_kernels_forward`) with `self.training = True`, the fast path with `self.training = False`, and the slow path. 
### Expected behavior These should all be equivalent.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41498/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41498/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41497
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41497/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41497/comments
https://api.github.com/repos/huggingface/transformers/issues/41497/events
https://github.com/huggingface/transformers/pull/41497
3,501,186,963
PR_kwDOCUB6oc6tAHdQ
41,497
Add final contribution report and training file
{ "login": "ali1baba1", "id": 191934682, "node_id": "U_kgDOC3Cw2g", "avatar_url": "https://avatars.githubusercontent.com/u/191934682?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ali1baba1", "html_url": "https://github.com/ali1baba1", "followers_url": "https://api.github.com/users/ali1baba1/followers", "following_url": "https://api.github.com/users/ali1baba1/following{/other_user}", "gists_url": "https://api.github.com/users/ali1baba1/gists{/gist_id}", "starred_url": "https://api.github.com/users/ali1baba1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ali1baba1/subscriptions", "organizations_url": "https://api.github.com/users/ali1baba1/orgs", "repos_url": "https://api.github.com/users/ali1baba1/repos", "events_url": "https://api.github.com/users/ali1baba1/events{/privacy}", "received_events_url": "https://api.github.com/users/ali1baba1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 9258341780, "node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop", "name": "Code agent slop", "color": "C59579", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-10-10T02:59:54
2025-10-11T01:34:28
2025-10-10T12:00:37
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41497", "html_url": "https://github.com/huggingface/transformers/pull/41497", "diff_url": "https://github.com/huggingface/transformers/pull/41497.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41497.patch", "merged_at": null }
## What does this PR do? This pull request adds my final contribution for the Hugging Face Transformers community project. It includes my experiment summary and the dataset used for fine-tuning a Masked Language Model (MLM). ## Who can review? Any member of the Transformers maintainers team. ## Motivation This contribution documents the results of my experiment using the no-trainer MLM script. It aims to share training data, model results, and experiment setup for community reference. ## Changes - Added `final_contribution_report.md` containing: - Model setup, training commands, and hyperparameters - Results and evaluation metrics - Observations and conclusions - Added `train.txt` used for fine-tuning ## Additional context This PR completes my final submission for the community contribution process.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41497/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41497/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41496
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41496/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41496/comments
https://api.github.com/repos/huggingface/transformers/issues/41496/events
https://github.com/huggingface/transformers/pull/41496
3,501,173,681
PR_kwDOCUB6oc6tAEvV
41,496
Allow optuna's catch kwargs passthrough
{ "login": "nicha-api", "id": 115199576, "node_id": "U_kgDOBt3OWA", "avatar_url": "https://avatars.githubusercontent.com/u/115199576?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nicha-api", "html_url": "https://github.com/nicha-api", "followers_url": "https://api.github.com/users/nicha-api/followers", "following_url": "https://api.github.com/users/nicha-api/following{/other_user}", "gists_url": "https://api.github.com/users/nicha-api/gists{/gist_id}", "starred_url": "https://api.github.com/users/nicha-api/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nicha-api/subscriptions", "organizations_url": "https://api.github.com/users/nicha-api/orgs", "repos_url": "https://api.github.com/users/nicha-api/repos", "events_url": "https://api.github.com/users/nicha-api/events{/privacy}", "received_events_url": "https://api.github.com/users/nicha-api/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-10T02:48:40
2025-10-10T13:58:41
2025-10-10T13:58:07
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41496", "html_url": "https://github.com/huggingface/transformers/pull/41496", "diff_url": "https://github.com/huggingface/transformers/pull/41496.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41496.patch", "merged_at": "2025-10-10T13:58:07" }
# What does this PR do? Allow the `catch` kwarg to pass through in `Trainer.hyperparameter_search(backend="optuna")` in order to expose Optuna's `Study.optimize(..., catch=...)` argument, by popping the `catch` kwarg out before calling Optuna's `study.optimize()`. This PR addresses feature request #41463
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41496/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41496/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41495
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41495/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41495/comments
https://api.github.com/repos/huggingface/transformers/issues/41495/events
https://github.com/huggingface/transformers/pull/41495
3,500,913,797
PR_kwDOCUB6oc6s_P5F
41,495
Rm yoso kernel
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T23:41:30
2025-10-10T08:50:15
2025-10-10T08:50:13
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41495", "html_url": "https://github.com/huggingface/transformers/pull/41495", "diff_url": "https://github.com/huggingface/transformers/pull/41495.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41495.patch", "merged_at": "2025-10-10T08:50:13" }
# What does this PR do? Removing the yoso kernels, and using https://huggingface.co/kernels-community/yoso instead
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41495/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41495/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41494
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41494/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41494/comments
https://api.github.com/repos/huggingface/transformers/issues/41494/events
https://github.com/huggingface/transformers/issues/41494
3,500,893,664
I_kwDOCUB6oc7Qq2Xg
41,494
Incorrect tokenizer created for gemma gguf files
{ "login": "amychen85", "id": 4982156, "node_id": "MDQ6VXNlcjQ5ODIxNTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4982156?v=4", "gravatar_id": "", "url": "https://api.github.com/users/amychen85", "html_url": "https://github.com/amychen85", "followers_url": "https://api.github.com/users/amychen85/followers", "following_url": "https://api.github.com/users/amychen85/following{/other_user}", "gists_url": "https://api.github.com/users/amychen85/gists{/gist_id}", "starred_url": "https://api.github.com/users/amychen85/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/amychen85/subscriptions", "organizations_url": "https://api.github.com/users/amychen85/orgs", "repos_url": "https://api.github.com/users/amychen85/repos", "events_url": "https://api.github.com/users/amychen85/events{/privacy}", "received_events_url": "https://api.github.com/users/amychen85/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
{ "login": "Isotr0py", "id": 41363108, "node_id": "MDQ6VXNlcjQxMzYzMTA4", "avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Isotr0py", "html_url": "https://github.com/Isotr0py", "followers_url": "https://api.github.com/users/Isotr0py/followers", "following_url": "https://api.github.com/users/Isotr0py/following{/other_user}", "gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}", "starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions", "organizations_url": "https://api.github.com/users/Isotr0py/orgs", "repos_url": "https://api.github.com/users/Isotr0py/repos", "events_url": "https://api.github.com/users/Isotr0py/events{/privacy}", "received_events_url": "https://api.github.com/users/Isotr0py/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "Isotr0py", "id": 41363108, "node_id": "MDQ6VXNlcjQxMzYzMTA4", "avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Isotr0py", "html_url": "https://github.com/Isotr0py", "followers_url": "https://api.github.com/users/Isotr0py/followers", "following_url": "https://api.github.com/users/Isotr0py/following{/other_user}", "gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}", "starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions", "organizations_url": "https://api.github.com/users/Isotr0py/orgs", "repos_url": "https://api.github.com/users/Isotr0py/repos", "events_url": "https://api.github.com/users/Isotr0py/events{/privacy}", "received_events_url": "https://api.github.com/users/Isotr0py/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
[]
2025-10-09T23:27:25
2025-10-27T04:51:51
null
NONE
null
null
null
null
### System Info - `transformers` version: 4.57.0 - Platform: Linux-5.15.0-144-generic-x86_64-with-glibc2.35 - Python version: 3.10.12 - Huggingface_hub version: 0.34.4 - Safetensors version: 0.5.3 - Accelerate version: 0.34.2 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator?): 2.3.1+cu121 (NA) - Tensorflow version (GPU?): 2.17.0 (False) - Flax version (CPU?/GPU?/TPU?): not installed (NA) - Jax version: not installed - JaxLib version: not installed - Using distributed or parallel set-up in script?: NA ### Who can help? @yijun-lee @Isotr0py ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction ``` from transformers import AutoTokenizer t1 = AutoTokenizer.from_pretrained("unsloth/gemma-3-4b-it-GGUF", gguf_file="gemma-3-4b-it-Q8_0.gguf") x1 = t1.tokenize("<bos>What is eunoia?") print(f"{x1=}") t2 = AutoTokenizer.from_pretrained("google/gemma-3-4b-it") x2 = t2.tokenize("<bos>What is eunoia?") print(f"{x2=}") ``` ### Expected behavior The print out of the x1 and x2 should be the same. However, ``` x1=['<bos>', 'Wh', 'at', '▁is', '▁eu', 'no', 'ia', '?'] x2=['<bos>', 'What', '▁is', '▁e', 'uno', 'ia', '?'] ``` Looking more into it, the tokenizer created for HF model (t2) is BPE while the tokenizer created for the GGUF model (t1) is Unigram.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41494/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41494/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41493
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41493/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41493/comments
https://api.github.com/repos/huggingface/transformers/issues/41493/events
https://github.com/huggingface/transformers/pull/41493
3,500,778,419
PR_kwDOCUB6oc6s-zZZ
41,493
[kernels] Remove RWKV kernel finally !
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T22:19:42
2025-10-10T08:32:07
2025-10-10T08:32:06
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41493", "html_url": "https://github.com/huggingface/transformers/pull/41493", "diff_url": "https://github.com/huggingface/transformers/pull/41493.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41493.patch", "merged_at": "2025-10-10T08:32:05" }
# What does this PR do? Cleans the rwkv kernel, after adding the kernel to `kernels-community` : https://huggingface.co/kernels-community/rwkv
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41493/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41493/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41492
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41492/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41492/comments
https://api.github.com/repos/huggingface/transformers/issues/41492/events
https://github.com/huggingface/transformers/issues/41492
3,500,775,496
I_kwDOCUB6oc7QqZhI
41,492
For finetuning MBart-based model, setting decoder_start_token_id in model.config is NOT ENOUGH.
{ "login": "Bmingg", "id": 82941091, "node_id": "MDQ6VXNlcjgyOTQxMDkx", "avatar_url": "https://avatars.githubusercontent.com/u/82941091?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Bmingg", "html_url": "https://github.com/Bmingg", "followers_url": "https://api.github.com/users/Bmingg/followers", "following_url": "https://api.github.com/users/Bmingg/following{/other_user}", "gists_url": "https://api.github.com/users/Bmingg/gists{/gist_id}", "starred_url": "https://api.github.com/users/Bmingg/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Bmingg/subscriptions", "organizations_url": "https://api.github.com/users/Bmingg/orgs", "repos_url": "https://api.github.com/users/Bmingg/repos", "events_url": "https://api.github.com/users/Bmingg/events{/privacy}", "received_events_url": "https://api.github.com/users/Bmingg/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-10-09T22:18:01
2025-10-11T13:21:56
null
NONE
null
null
null
null
### System Info Context: finetuning an MBart model with run_translation.py. An easy fix is to set `decoder_start_token_id` in `model.generation_config` as well. Both settings worked outside of run_translation.py, but not setting this in run_translation.py causes validation/evaluation to fail miserably. ### Who can help? _No response_ ### Information - [x] The official example scripts - [x] My own modified scripts ### Tasks - [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction Run run_translation.py to finetune with a validation file. ### Expected behavior Due to the bug, the reported BLEU scores are terrible, which makes it hard to monitor finetuning.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41492/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41492/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41491
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41491/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41491/comments
https://api.github.com/repos/huggingface/transformers/issues/41491/events
https://github.com/huggingface/transformers/pull/41491
3,500,719,119
PR_kwDOCUB6oc6s-nLC
41,491
Add skip_unnecessary_grad_clip to TrainingArguments for optimized gradient clipping
{ "login": "vaibhavgarg230", "id": 170336277, "node_id": "U_kgDOCicgFQ", "avatar_url": "https://avatars.githubusercontent.com/u/170336277?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vaibhavgarg230", "html_url": "https://github.com/vaibhavgarg230", "followers_url": "https://api.github.com/users/vaibhavgarg230/followers", "following_url": "https://api.github.com/users/vaibhavgarg230/following{/other_user}", "gists_url": "https://api.github.com/users/vaibhavgarg230/gists{/gist_id}", "starred_url": "https://api.github.com/users/vaibhavgarg230/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vaibhavgarg230/subscriptions", "organizations_url": "https://api.github.com/users/vaibhavgarg230/orgs", "repos_url": "https://api.github.com/users/vaibhavgarg230/repos", "events_url": "https://api.github.com/users/vaibhavgarg230/events{/privacy}", "received_events_url": "https://api.github.com/users/vaibhavgarg230/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-09T21:47:27
2025-10-10T10:53:07
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41491", "html_url": "https://github.com/huggingface/transformers/pull/41491", "diff_url": "https://github.com/huggingface/transformers/pull/41491.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41491.patch", "merged_at": null }
# What does this PR do? This PR adds an opt-in `skip_unnecessary_grad_clip` argument to `TrainingArguments`, optimizing the Trainer's gradient clipping for better efficiency. When enabled, the Trainer computes and logs the gradient norm every step, but skips calling `clip_grad_norm_` if the norm is already below `max_grad_norm`. This prevents unnecessary computation for models/trainers with consistently low gradient norms, while always maintaining logging. **Maintainer requests addressed:** - Grad norm is logged every step, even when clipping is skipped. - Default behavior is unchanged (the flag is `False` by default). **Motivation:** - Addresses [issue #41431] - Improves performance for training with stable gradients or CPU-bound training. - Keeps code backward-compatible and easy to opt into. **Dependencies:** - No new dependencies. **Tests added:** - New test: `tests/trainer/test_gradient_clipping.py` - Verifies clipping is skipped when grad norm is under threshold. - Verifies grad norm is still correctly logged. - Verifies the default and opt-in cases. **Documentation:** - Argument and code block are documented as per contributing guidelines. --- ## Before submitting - [x] Discussed/approved via issue or maintainer - [x] Documentation updated - [x] All tests and repo checks run locally --- ## Who can review? - Trainer logic: @zach-huggingface @SunMarc --- Thanks for reviewing! Feedback and suggestions are very welcome.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41491/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41491/timeline
null
null
null
null
true
false
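The conditional-clipping logic described in the PR body above can be sketched in plain Python (no `torch`); `compute_grad_norm` and `maybe_clip_` are hypothetical helper names standing in for the norm computation and `clip_grad_norm_` call inside the Trainer:

```python
import math

def compute_grad_norm(grads):
    # L2 norm over all gradient values, as clip_grad_norm_ would compute it.
    return math.sqrt(sum(g * g for g in grads))

def maybe_clip_(grads, max_grad_norm, skip_unnecessary_grad_clip=False):
    """Return the grad norm (always, so it can still be logged) and clip the
    gradients in place only when the norm actually exceeds the threshold."""
    norm = compute_grad_norm(grads)
    if skip_unnecessary_grad_clip and norm <= max_grad_norm:
        return norm  # norm already under threshold: skip the clipping work
    if norm > max_grad_norm:
        scale = max_grad_norm / (norm + 1e-6)
        for i in range(len(grads)):
            grads[i] *= scale
    return norm

grads = [3.0, 4.0]  # norm = 5.0, gets scaled down to max_grad_norm
norm = maybe_clip_(grads, max_grad_norm=1.0)
```

The key property the maintainers asked for is visible here: the norm is computed and returned on every call, even on the skip path, so logging is never affected.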
https://api.github.com/repos/huggingface/transformers/issues/41490
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41490/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41490/comments
https://api.github.com/repos/huggingface/transformers/issues/41490/events
https://github.com/huggingface/transformers/pull/41490
3,499,987,210
PR_kwDOCUB6oc6s8LJg
41,490
Fix _init_weights to safely skip int8 tensors in Qwen2_5_VL model
{ "login": "KaparthyReddy", "id": 166050493, "node_id": "U_kgDOCeW6vQ", "avatar_url": "https://avatars.githubusercontent.com/u/166050493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaparthyReddy", "html_url": "https://github.com/KaparthyReddy", "followers_url": "https://api.github.com/users/KaparthyReddy/followers", "following_url": "https://api.github.com/users/KaparthyReddy/following{/other_user}", "gists_url": "https://api.github.com/users/KaparthyReddy/gists{/gist_id}", "starred_url": "https://api.github.com/users/KaparthyReddy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KaparthyReddy/subscriptions", "organizations_url": "https://api.github.com/users/KaparthyReddy/orgs", "repos_url": "https://api.github.com/users/KaparthyReddy/repos", "events_url": "https://api.github.com/users/KaparthyReddy/events{/privacy}", "received_events_url": "https://api.github.com/users/KaparthyReddy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-09T17:13:44
2025-10-11T13:09:59
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41490", "html_url": "https://github.com/huggingface/transformers/pull/41490", "diff_url": "https://github.com/huggingface/transformers/pull/41490.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41490.patch", "merged_at": null }
### Summary This PR fixes the `_init_weights()` method in `Qwen2_5_VLForConditionalGeneration` to safely skip int8 tensors during initialization. Previously, applying `normal_()` on int8 weights caused a RuntimeError when loading quantized models. ### Changes - Added dtype check in `_init_weights()` to only initialize floating-point tensors (`float16`, `float32`, `bfloat16`). - Ensures int8 weights from quantized models are skipped safely. - Verified fix by successfully loading a quantized Qwen2.5-VL model on CPU. ### Motivation Quantized models (W8A8, int8 weights) could not be loaded directly due to the previous `_init_weights()` implementation. This fix allows smooth loading without a RuntimeError, making the model compatible with LLMCompressor-quantized models. ### Verification - Model loaded successfully on CPU after applying the fix. - `_init_weights()` safely ignores int8 tensors.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41490/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41490/timeline
null
null
null
null
true
false
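The dtype guard described in the PR above can be sketched independently of `torch`: dtypes here are plain strings, the parameter store is a dict, and the zero-fill is a hypothetical stand-in for the `normal_()` initialization:

```python
# Only these dtypes get initialized; everything else (e.g. int8 from a
# quantized checkpoint) is left untouched, mirroring the PR's dtype check.
FLOAT_DTYPES = {"float16", "float32", "bfloat16"}

def init_weights(params):
    """Initialize only floating-point tensors; skip quantized int8 weights."""
    for name, (dtype, values) in params.items():
        if dtype not in FLOAT_DTYPES:
            continue  # int8 weights would crash normal_(): skip safely
        # Stand-in for normal_() initialization on a float tensor.
        params[name] = (dtype, [0.0 for _ in values])
    return params

params = {
    "proj.weight": ("float32", [1.5, -2.0]),
    "proj.weight_quant": ("int8", [12, -7]),
}
init_weights(params)
```

After the call, only the float32 entry is re-initialized; the int8 entry keeps its quantized values, which is the behavior the PR verifies when loading the quantized model on CPU.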
https://api.github.com/repos/huggingface/transformers/issues/41489
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41489/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41489/comments
https://api.github.com/repos/huggingface/transformers/issues/41489/events
https://github.com/huggingface/transformers/pull/41489
3,499,877,251
PR_kwDOCUB6oc6s70rh
41,489
More trainer cleaning
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T16:39:15
2025-10-10T09:55:45
2025-10-10T09:55:43
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41489", "html_url": "https://github.com/huggingface/transformers/pull/41489", "diff_url": "https://github.com/huggingface/transformers/pull/41489.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41489.patch", "merged_at": "2025-10-10T09:55:43" }
# What does this PR do? This PR removes 2 unused args: `_n_gpu` and `mp_parameters`
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41489/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41489/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41488
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41488/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41488/comments
https://api.github.com/repos/huggingface/transformers/issues/41488/events
https://github.com/huggingface/transformers/pull/41488
3,499,846,115
PR_kwDOCUB6oc6s7uGN
41,488
Create 1
{ "login": "yrabasalaev-netizen", "id": 237085136, "node_id": "U_kgDODiGh0A", "avatar_url": "https://avatars.githubusercontent.com/u/237085136?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yrabasalaev-netizen", "html_url": "https://github.com/yrabasalaev-netizen", "followers_url": "https://api.github.com/users/yrabasalaev-netizen/followers", "following_url": "https://api.github.com/users/yrabasalaev-netizen/following{/other_user}", "gists_url": "https://api.github.com/users/yrabasalaev-netizen/gists{/gist_id}", "starred_url": "https://api.github.com/users/yrabasalaev-netizen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yrabasalaev-netizen/subscriptions", "organizations_url": "https://api.github.com/users/yrabasalaev-netizen/orgs", "repos_url": "https://api.github.com/users/yrabasalaev-netizen/repos", "events_url": "https://api.github.com/users/yrabasalaev-netizen/events{/privacy}", "received_events_url": "https://api.github.com/users/yrabasalaev-netizen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-09T16:28:59
2025-10-09T16:28:59
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41488", "html_url": "https://github.com/huggingface/transformers/pull/41488", "diff_url": "https://github.com/huggingface/transformers/pull/41488.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41488.patch", "merged_at": null }
1 # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41488/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41488/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41487
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41487/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41487/comments
https://api.github.com/repos/huggingface/transformers/issues/41487/events
https://github.com/huggingface/transformers/pull/41487
3,499,843,152
PR_kwDOCUB6oc6s7tej
41,487
Migrate transformers cli to Typer
{ "login": "Wauplin", "id": 11801849, "node_id": "MDQ6VXNlcjExODAxODQ5", "avatar_url": "https://avatars.githubusercontent.com/u/11801849?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Wauplin", "html_url": "https://github.com/Wauplin", "followers_url": "https://api.github.com/users/Wauplin/followers", "following_url": "https://api.github.com/users/Wauplin/following{/other_user}", "gists_url": "https://api.github.com/users/Wauplin/gists{/gist_id}", "starred_url": "https://api.github.com/users/Wauplin/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Wauplin/subscriptions", "organizations_url": "https://api.github.com/users/Wauplin/orgs", "repos_url": "https://api.github.com/users/Wauplin/repos", "events_url": "https://api.github.com/users/Wauplin/events{/privacy}", "received_events_url": "https://api.github.com/users/Wauplin/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 9105758243, "node_id": "LA_kwDOCUB6oc8AAAACHr7YIw", "url": "https://api.github.com/repos/huggingface/transformers/labels/for_v5?", "name": "for_v5?", "color": "35BC94", "default": false, "description": "" } ]
closed
false
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
[]
2025-10-09T16:28:01
2025-10-16T11:29:43
2025-10-16T11:29:42
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41487", "html_url": "https://github.com/huggingface/transformers/pull/41487", "diff_url": "https://github.com/huggingface/transformers/pull/41487.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41487.patch", "merged_at": "2025-10-16T11:29:42" }
This PR migrates the `transformers` CLI to Typer. Typer is a package built on top of click by the creator of FastAPI. It is now a dependency of `huggingface_hub`, meaning it's also a dependency of `transformers`. Typer simplifies arg definition and ensures consistency using type annotations, which should help with maintenance. The benefit for users is the built-in autocompletion feature that lets someone type `transformers chat [TAB][TAB]` to see what the options are. The `--help` section is also improved. By migrating to Typer, a longer-term goal is to delegate to `huggingface_hub` some aspects of the installation and auto-update of the CLI. This will come later and doesn't have to be correlated with the v5 release. ### CLI `--help` ```sh
transformers --help
Usage: transformers [OPTIONS] COMMAND [ARGS]...

  Transformers CLI

Options:
  --install-completion  Install completion for the current shell.
  --show-completion     Show completion for the current shell, to copy it or
                        customize the installation.
  --help                Show this message and exit.

Commands:
  add-fast-image-processor  Add a fast image processor to a model.
  add-new-model-like        Add a new model to the library, based on an...
  chat                      Chat with a model from the command line.
  download                  Download a model and its tokenizer from the Hub.
  env                       Print information about the environment.
  run                       Run a pipeline on a given input file.
  serve                     Run a FastAPI server to serve models...
  version                   Print CLI version.
```
### Side notes Noted down some stuff while working on it. Can be addressed in later PRs. 1. Any command, even a simple `transformers env`, currently takes a long time to start, in both the previous and the new CLI. This is due to the `torch` import, whether it's used or not. I do think this is not good UX, especially if we want to have something like `transformers chat` as the entrypoint for any openai-compatible server. This is not really related to the CLI only, but to lazy imports in general.
I tracked it down to `is_torch_available` actually importing the package, not just checking its existence. 2. The current `transformers serve` + `transformers chat` twin commands are really nice. One starts a server, the other launches a chat interface. However, I feel that the current UX for `chat` is too bloated since it covers both the case where a server is already started AND starting a new server from a model id (or path). I do think that `transformers chat` should only be used to consume an existing API. It would make the whole implementation much cleaner and the interface less bloated for the end user (currently there are 4-5 arguments just to provide model name, path, address, port, and host, instead of a single "url" argument). Since this would be a breaking change, I think v5 is the perfect time to address it. => **EDIT:** this is now done in this PR. `transformers chat` does not serve a model now (making it much simpler) ``` transformers chat https://router.huggingface.co/v1 HuggingFaceTB/SmolLM3-3B ``` 3. The `transformers serve` feature is currently only available as a CLI. I do believe it would be best to move it to its own module so that someone could call it programmatically (e.g. `from transformers import serve` in a notebook). ### TODO - [x] (minor) do not import `typer_factory` from private internal (requires an update in `huggingface_hub` first) - [x] delete `./commands` folder and remove `transformers-legacy` CLI (that I currently use for testing) - [x] adapt remaining CLI tests - [x] `transformers chat` UI-only (not serving) - [ ] do not use classes for `chat` and `serve`? + expose them as modules (e.g. `from transformers import chat, serve`)
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41487/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41487/timeline
null
null
null
null
true
true
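Typer itself may not be installed everywhere, so here is a minimal stand-in using stdlib `argparse` subcommands to illustrate the command layout the PR's `--help` output shows (the `env` and `version` command names come from that output; the bodies are placeholders, not the real CLI's behavior):

```python
import argparse

def build_parser():
    # Top-level parser with one subparser per command, analogous to how
    # Typer maps each decorated function to a subcommand.
    parser = argparse.ArgumentParser(prog="transformers", description="Transformers CLI")
    sub = parser.add_subparsers(dest="command", required=True)
    sub.add_parser("env", help="Print information about the environment.")
    sub.add_parser("version", help="Print CLI version.")
    return parser

def main(argv):
    args = build_parser().parse_args(argv)
    # Placeholder dispatch; the real commands do much more.
    if args.command == "version":
        return "transformers CLI (placeholder version)"
    if args.command == "env":
        return "environment info (placeholder)"
```

Typer derives the same structure from function signatures and type annotations instead of explicit parser wiring, which is the maintenance benefit the PR description highlights.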
https://api.github.com/repos/huggingface/transformers/issues/41486
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41486/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41486/comments
https://api.github.com/repos/huggingface/transformers/issues/41486/events
https://github.com/huggingface/transformers/pull/41486
3,499,829,317
PR_kwDOCUB6oc6s7qjz
41,486
[`CI`] Fix copies on main
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T16:24:23
2025-10-09T16:38:19
2025-10-09T16:38:14
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41486", "html_url": "https://github.com/huggingface/transformers/pull/41486", "diff_url": "https://github.com/huggingface/transformers/pull/41486.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41486.patch", "merged_at": "2025-10-09T16:38:14" }
As per title
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41486/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41486/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41485
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41485/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41485/comments
https://api.github.com/repos/huggingface/transformers/issues/41485/events
https://github.com/huggingface/transformers/pull/41485
3,499,818,961
PR_kwDOCUB6oc6s7ocD
41,485
Fix smolvlm2 dtype mismatch final
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-09T16:21:42
2025-10-10T18:04:35
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41485", "html_url": "https://github.com/huggingface/transformers/pull/41485", "diff_url": "https://github.com/huggingface/transformers/pull/41485.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41485.patch", "merged_at": null }
# Fix SmolVLM2 quantization dtype mismatch ## What does this PR do? Fixes #41453 - SmolVLM2 cannot be used with quantization due to dtype mismatch error. **Problem**: When loading SmolVLM2 with BitsAndBytesConfig and bfloat16, the `inputs_merger` function fails with: ``` RuntimeError: Index put requires the source and destination dtypes match, got BFloat16 for the destination and Float for the source. ``` **Root Cause**: - Quantization forces `inputs_embeds` to `torch.bfloat16` (from BitsAndBytesConfig) - Vision encoder outputs `image_hidden_states` in `torch.float32` - Direct assignment between incompatible dtypes causes the crash **Solution**: Added dtype conversion to ensure `image_hidden_states` matches `inputs_embeds` dtype before assignment: ```python # Before (BROKEN): image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :] # After (FIXED): # Ensure dtype compatibility for quantization image_hidden_states = image_hidden_states.to(dtype=inputs_embeds.dtype) image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :] ``` **Changes**: - Modified `src/transformers/models/smolvlm/modeling_smolvlm.py` - Added dtype conversion in `inputs_merger` function - Updated `src/transformers/models/smolvlm/modular_smolvlm.py` - Aligned modular file with same fix - Added test in `tests/models/smolvlm/test_modeling_smolvlm.py` - `test_quantization_dtype_compatibility()` with `@slow` decorator **Testing**: The fix has been thoroughly tested and verified to resolve the quantization dtype mismatch issue without breaking existing functionality. Fixes #41453 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? 
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? @yonigozlan @molbap - This affects vision models and quantization functionality
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41485/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41485/timeline
null
null
null
null
true
false
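The failure mode and fix in the PR above generalize beyond `torch`: assigning values of one element type into a buffer of another fails unless the source is converted first. A stdlib `array` sketch of the same pattern (convert before indexed assignment, mirroring the `.to(dtype=...)` cast in the PR; note `int(v)` truncates, unlike a real tensor dtype cast):

```python
from array import array

dest = array("i", [0, 0, 0])  # int buffer (stands in for bfloat16 inputs_embeds)
src = [1.5, 2.5, 3.5]         # float values (stands in for float32 image_hidden_states)

# Direct assignment of a float into an int array raises TypeError, the
# analogue of "Index put requires the source and destination dtypes match".
try:
    dest[0] = src[0]
except TypeError:
    pass

# Fix: convert the source to the destination's element type before assignment.
converted = [int(v) for v in src]
for i, v in enumerate(converted):
    dest[i] = v
```

In the actual fix the conversion is a single `image_hidden_states.to(dtype=inputs_embeds.dtype)` call, which is cheap when the dtypes already match (a no-op) and otherwise avoids the RuntimeError.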
https://api.github.com/repos/huggingface/transformers/issues/41484
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41484/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41484/comments
https://api.github.com/repos/huggingface/transformers/issues/41484/events
https://github.com/huggingface/transformers/pull/41484
3,499,676,992
PR_kwDOCUB6oc6s7KHP
41,484
add Trainer import to .md in appropriate cell block for training.ipynb transformers_doc
{ "login": "benkeene", "id": 39920749, "node_id": "MDQ6VXNlcjM5OTIwNzQ5", "avatar_url": "https://avatars.githubusercontent.com/u/39920749?v=4", "gravatar_id": "", "url": "https://api.github.com/users/benkeene", "html_url": "https://github.com/benkeene", "followers_url": "https://api.github.com/users/benkeene/followers", "following_url": "https://api.github.com/users/benkeene/following{/other_user}", "gists_url": "https://api.github.com/users/benkeene/gists{/gist_id}", "starred_url": "https://api.github.com/users/benkeene/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/benkeene/subscriptions", "organizations_url": "https://api.github.com/users/benkeene/orgs", "repos_url": "https://api.github.com/users/benkeene/repos", "events_url": "https://api.github.com/users/benkeene/events{/privacy}", "received_events_url": "https://api.github.com/users/benkeene/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T15:41:42
2025-10-10T12:04:50
2025-10-10T12:04:07
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41484", "html_url": "https://github.com/huggingface/transformers/pull/41484", "diff_url": "https://github.com/huggingface/transformers/pull/41484.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41484.patch", "merged_at": "2025-10-10T12:04:07" }
# What does this PR do?

transformers_doc/en/training.ipynb does not import `Trainer` in the code block

```python
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    compute_metrics=compute_metrics,
)
trainer.train()
```

which is imported from docs/source/en/training.md in the huggingface/transformers repo. Added the import in the source .md file.

Current behavior:

```
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
Cell In[7], line 1
----> 1 trainer = Trainer(
      2     model=model,
      3     args=training_args,
      4     train_dataset=dataset["train"],
      5     eval_dataset=dataset["test"],
      6     compute_metrics=compute_metrics,
      7 )
      8 trainer.train()

NameError: name 'Trainer' is not defined
```

Expected behavior: the training should begin.

Adding the import directly to transformers_doc/en/training.ipynb fixes the issue, but as mentioned in the pull request prompt, changes should be made in the huggingface/transformers repository.

Fixes # (issue)

Missing import in transformers_doc/en/training.ipynb: `from transformers import Trainer`. 
## Before submitting - [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. 
Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41484/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41484/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41483
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41483/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41483/comments
https://api.github.com/repos/huggingface/transformers/issues/41483/events
https://github.com/huggingface/transformers/pull/41483
3,499,644,613
PR_kwDOCUB6oc6s7DFS
41,483
Fixed tiny incorrect imports in `glm4v`
{ "login": "Sai-Suraj-27", "id": 87087741, "node_id": "MDQ6VXNlcjg3MDg3NzQx", "avatar_url": "https://avatars.githubusercontent.com/u/87087741?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Sai-Suraj-27", "html_url": "https://github.com/Sai-Suraj-27", "followers_url": "https://api.github.com/users/Sai-Suraj-27/followers", "following_url": "https://api.github.com/users/Sai-Suraj-27/following{/other_user}", "gists_url": "https://api.github.com/users/Sai-Suraj-27/gists{/gist_id}", "starred_url": "https://api.github.com/users/Sai-Suraj-27/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Sai-Suraj-27/subscriptions", "organizations_url": "https://api.github.com/users/Sai-Suraj-27/orgs", "repos_url": "https://api.github.com/users/Sai-Suraj-27/repos", "events_url": "https://api.github.com/users/Sai-Suraj-27/events{/privacy}", "received_events_url": "https://api.github.com/users/Sai-Suraj-27/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T15:32:45
2025-10-10T08:59:50
2025-10-10T08:57:01
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41483", "html_url": "https://github.com/huggingface/transformers/pull/41483", "diff_url": "https://github.com/huggingface/transformers/pull/41483.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41483.patch", "merged_at": "2025-10-10T08:57:01" }
# What does this PR do? Fixes tiny incorrect imports in `glm4v`. These are the correct classes https://github.com/huggingface/transformers/blob/34b861abd11074fd32362b9a25c1cc582fa0b941/src/transformers/models/qwen2_vl/processing_qwen2_vl.py#L39 https://github.com/huggingface/transformers/blob/34b861abd11074fd32362b9a25c1cc582fa0b941/src/transformers/models/qwen2_vl/processing_qwen2_vl.py#L48 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? cc: @zucchini-nlp @gante @Cyrilvallez (Similar to https://github.com/huggingface/transformers/pull/41354)
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41483/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41483/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41482
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41482/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41482/comments
https://api.github.com/repos/huggingface/transformers/issues/41482/events
https://github.com/huggingface/transformers/pull/41482
3,499,602,994
PR_kwDOCUB6oc6s66LP
41,482
Fix smolvlm2 dtype mismatch
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T15:20:29
2025-10-09T16:19:51
2025-10-09T16:19:51
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41482", "html_url": "https://github.com/huggingface/transformers/pull/41482", "diff_url": "https://github.com/huggingface/transformers/pull/41482.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41482.patch", "merged_at": null }
# Fix SmolVLM2 quantization dtype mismatch

## What does this PR do?

Fixes #41453 - SmolVLM2 cannot be used with quantization due to a dtype mismatch error.

**Problem**: When loading SmolVLM2 with BitsAndBytesConfig and bfloat16, the `inputs_merger` function fails with:

```
RuntimeError: Index put requires the source and destination dtypes match, got BFloat16 for the destination and Float for the source.
```

**Root cause**:
- Quantization forces `inputs_embeds` to `torch.bfloat16` (from BitsAndBytesConfig)
- The vision encoder outputs `image_hidden_states` in `torch.float32`
- Direct assignment between incompatible dtypes causes the crash

**Solution**: Added a dtype conversion to ensure `image_hidden_states` matches the `inputs_embeds` dtype before assignment:

```python
# Before (broken):
image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :]

# After (fixed) - ensure dtype compatibility for quantization:
image_hidden_states = image_hidden_states.to(dtype=inputs_embeds.dtype)
image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :]
```

**Changes**:
- Modified `src/transformers/models/smolvlm/modeling_smolvlm.py` - added the dtype conversion in the `inputs_merger` function
- Updated `src/transformers/models/smolvlm/modular_smolvlm.py` - aligned the modular file with the same fix
- Added a test in `tests/models/smolvlm/test_modeling_smolvlm.py` - `test_quantization_dtype_compatibility()` with the `@slow` decorator

**Testing**: The fix has been tested and verified to resolve the quantization dtype mismatch without breaking existing functionality.

## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? 
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? @yonigozlan @molbap - This affects vision models and quantization functionality
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41482/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41482/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41481
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41481/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41481/comments
https://api.github.com/repos/huggingface/transformers/issues/41481/events
https://github.com/huggingface/transformers/pull/41481
3,499,517,628
PR_kwDOCUB6oc6s6n5c
41,481
Fix/smolvlm2 quantization dtype mismatch clean
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T14:52:50
2025-10-09T15:02:16
2025-10-09T15:02:16
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41481", "html_url": "https://github.com/huggingface/transformers/pull/41481", "diff_url": "https://github.com/huggingface/transformers/pull/41481.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41481.patch", "merged_at": null }
# Fix SmolVLM2 quantization dtype mismatch

## What does this PR do?

Fixes #41453 - SmolVLM2 cannot be used with quantization due to a dtype mismatch error.

**Problem**: When loading SmolVLM2 with BitsAndBytesConfig and bfloat16, the `inputs_merger` function fails with:

```
RuntimeError: Index put requires the source and destination dtypes match, got BFloat16 for the destination and Float for the source.
```

**Root cause**:
- Quantization forces `inputs_embeds` to `torch.bfloat16` (from BitsAndBytesConfig)
- The vision encoder outputs `image_hidden_states` in `torch.float32`
- Direct assignment between incompatible dtypes causes the crash

**Solution**: Added a dtype conversion to ensure `image_hidden_states` matches the `inputs_embeds` dtype before assignment:

```python
# Before (broken):
image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :]

# After (fixed) - ensure dtype compatibility for quantization:
image_hidden_states = image_hidden_states.to(dtype=inputs_embeds.dtype)
image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :]
```

**Changes**:
- Modified `src/transformers/models/smolvlm/modeling_smolvlm.py` - added the dtype conversion in the `inputs_merger` function
- Updated `src/transformers/models/smolvlm/modular_smolvlm.py` - aligned the modular file with the same fix
- Added a test in `tests/models/smolvlm/test_modeling_smolvlm.py` - `test_quantization_dtype_compatibility()` with the `@slow` decorator

**Testing**: The fix has been tested and verified to resolve the quantization dtype mismatch without breaking existing functionality.

## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? 
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? @yonigozlan @molbap @zucchini-nlp - This affects vision models and quantization functionality
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41481/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41481/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41480
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41480/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41480/comments
https://api.github.com/repos/huggingface/transformers/issues/41480/events
https://github.com/huggingface/transformers/pull/41480
3,499,484,892
PR_kwDOCUB6oc6s6g-R
41,480
Fix/smolvlm2 quantization dtype mismatch
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T14:42:20
2025-10-09T15:23:46
2025-10-09T15:00:09
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41480", "html_url": "https://github.com/huggingface/transformers/pull/41480", "diff_url": "https://github.com/huggingface/transformers/pull/41480.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41480.patch", "merged_at": null }
# Fix SmolVLM2 quantization dtype mismatch

## What does this PR do?

Fixes #41453 - SmolVLM2 cannot be used with quantization due to a dtype mismatch error.

**Problem**: When loading SmolVLM2 with BitsAndBytesConfig and bfloat16, the `inputs_merger` function fails with:

```
RuntimeError: Index put requires the source and destination dtypes match, got BFloat16 for the destination and Float for the source.
```

**Root cause**:
- Quantization forces `inputs_embeds` to `torch.bfloat16` (from BitsAndBytesConfig)
- The vision encoder outputs `image_hidden_states` in `torch.float32`
- Direct assignment between incompatible dtypes causes the crash

**Solution**: Added a dtype conversion to ensure `image_hidden_states` matches the `inputs_embeds` dtype before assignment:

```python
# Before (broken):
image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :]

# After (fixed) - ensure dtype compatibility for quantization:
image_hidden_states = image_hidden_states.to(dtype=inputs_embeds.dtype)
image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :]
```

**Changes**:
- Modified `src/transformers/models/smolvlm/modeling_smolvlm.py` - added the dtype conversion in the `inputs_merger` function
- Updated `src/transformers/models/smolvlm/modular_smolvlm.py` - aligned the modular file with the same fix
- Added a test in `tests/models/smolvlm/test_modeling_smolvlm.py` - `test_quantization_dtype_compatibility()` with the `@slow` decorator

**Testing**: The fix has been tested and verified to resolve the quantization dtype mismatch without breaking existing functionality.

## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? 
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? @yonigozlan @molbap @zucchini-nlp - This affects vision models and quantization functionality
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41480/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41480/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41479
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41479/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41479/comments
https://api.github.com/repos/huggingface/transformers/issues/41479/events
https://github.com/huggingface/transformers/pull/41479
3,499,421,064
PR_kwDOCUB6oc6s6TI5
41,479
Remove SigOpt
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T14:26:00
2025-10-09T16:05:57
2025-10-09T16:05:55
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41479", "html_url": "https://github.com/huggingface/transformers/pull/41479", "diff_url": "https://github.com/huggingface/transformers/pull/41479.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41479.patch", "merged_at": "2025-10-09T16:05:55" }
# What does this PR do?

This PR removes the SigOpt integration, as the library is no longer maintained by Intel. Users of this integration also had to pass an API key, so with the service down the integration cannot work at all. I disabled the tests a few months ago when they started failing, and now it is better to just remove the integration.
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41479/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41479/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41478
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41478/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41478/comments
https://api.github.com/repos/huggingface/transformers/issues/41478/events
https://github.com/huggingface/transformers/pull/41478
3,499,415,486
PR_kwDOCUB6oc6s6R6z
41,478
Fix/smolvlm2 quantization dtype mismatch
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T14:24:47
2025-10-09T15:23:46
2025-10-09T14:38:22
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41478", "html_url": "https://github.com/huggingface/transformers/pull/41478", "diff_url": "https://github.com/huggingface/transformers/pull/41478.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41478.patch", "merged_at": null }
# Fix SmolVLM2 quantization dtype mismatch ## What does this PR do? Fixes #41453 - SmolVLM2 cannot be used with quantization due to dtype mismatch error. **Problem**: When loading SmolVLM2 with BitsAndBytesConfig and bfloat16, the `inputs_merger` function fails with: ``` RuntimeError: Index put requires the source and destination dtypes match, got BFloat16 for the destination and Float for the source. ``` **Root Cause**: - Quantization forces `inputs_embeds` to `torch.bfloat16` (from BitsAndBytesConfig) - Vision encoder outputs `image_hidden_states` in `torch.float32` - Direct assignment between incompatible dtypes causes the crash **Solution**: Added dtype conversion to ensure `image_hidden_states` matches `inputs_embeds` dtype before assignment: ```python # Before (BROKEN): image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :] # After (FIXED): # Ensure dtype compatibility for quantization image_hidden_states = image_hidden_states.to(dtype=inputs_embeds.dtype) image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :] ``` **Changes**: - Modified `src/transformers/models/smolvlm/modeling_smolvlm.py` - Added dtype conversion in `inputs_merger` function - Updated `src/transformers/models/smolvlm/modular_smolvlm.py` - Aligned modular file with same fix - Added test in `tests/models/smolvlm/test_modeling_smolvlm.py` - `test_quantization_dtype_compatibility()` with `@slow` decorator **Testing**: The fix has been thoroughly tested and verified to resolve the quantization dtype mismatch issue without breaking existing functionality. Fixes #41453 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? 
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? @yonigozlan @molbap @zucchini-nlp - This affects vision models and quantization functionality
{ "login": "omsherikar", "id": 180152315, "node_id": "U_kgDOCrzn-w", "avatar_url": "https://avatars.githubusercontent.com/u/180152315?v=4", "gravatar_id": "", "url": "https://api.github.com/users/omsherikar", "html_url": "https://github.com/omsherikar", "followers_url": "https://api.github.com/users/omsherikar/followers", "following_url": "https://api.github.com/users/omsherikar/following{/other_user}", "gists_url": "https://api.github.com/users/omsherikar/gists{/gist_id}", "starred_url": "https://api.github.com/users/omsherikar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/omsherikar/subscriptions", "organizations_url": "https://api.github.com/users/omsherikar/orgs", "repos_url": "https://api.github.com/users/omsherikar/repos", "events_url": "https://api.github.com/users/omsherikar/events{/privacy}", "received_events_url": "https://api.github.com/users/omsherikar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41478/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41478/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41477
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41477/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41477/comments
https://api.github.com/repos/huggingface/transformers/issues/41477/events
https://github.com/huggingface/transformers/pull/41477
3,499,314,590
PR_kwDOCUB6oc6s58AQ
41,477
Cleaning hub kernels
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T14:00:06
2025-10-09T14:32:20
2025-10-09T14:32:18
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41477", "html_url": "https://github.com/huggingface/transformers/pull/41477", "diff_url": "https://github.com/huggingface/transformers/pull/41477.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41477.patch", "merged_at": "2025-10-09T14:32:18" }
# What does this PR do? Cleaning up some comments and renaming all instances of `load_and_register_kernel` to `load_and_register_attn_kernel`, since it is used only for attention kernels
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41477/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41477/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41476
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41476/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41476/comments
https://api.github.com/repos/huggingface/transformers/issues/41476/events
https://github.com/huggingface/transformers/pull/41476
3,499,137,716
PR_kwDOCUB6oc6s5VRJ
41,476
Pickle - part 2
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T13:16:11
2025-10-09T13:46:59
2025-10-09T13:46:54
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41476", "html_url": "https://github.com/huggingface/transformers/pull/41476", "diff_url": "https://github.com/huggingface/transformers/pull/41476.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41476.patch", "merged_at": "2025-10-09T13:46:54" }
# What does this PR do? Require ``` if not strtobool(os.environ.get("TRUST_REMOTE_CODE", "False")): raise ValueError( "This part uses `pickle.load` which is insecure and will execute arbitrary code that is potentially " "malicious. It's recommended to never unpickle data that could have come from an untrusted source, or " "that could have been tampered with. If you already verified the pickle data and decided to use it, " "you can set the environment variable `TRUST_REMOTE_CODE` to `True` to allow it." ) ``` I limit this PR to conversion scripts only. For other usage of pickle in our codebase, I will try to see if there is any alternative.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41476/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41476/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41475
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41475/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41475/comments
https://api.github.com/repos/huggingface/transformers/issues/41475/events
https://github.com/huggingface/transformers/pull/41475
3,499,088,182
PR_kwDOCUB6oc6s5KhH
41,475
Remove DISABLE_KERNEL_MAPPING flag
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T13:03:01
2025-10-10T08:19:27
2025-10-10T08:19:25
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41475", "html_url": "https://github.com/huggingface/transformers/pull/41475", "diff_url": "https://github.com/huggingface/transformers/pull/41475.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41475.patch", "merged_at": "2025-10-10T08:19:25" }
# What does this PR do? Removes `DISABLE_KERNEL_MAPPING` from the Docker files, since kernels are opt-in for now and we don't use them by default
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41475/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41475/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41474
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41474/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41474/comments
https://api.github.com/repos/huggingface/transformers/issues/41474/events
https://github.com/huggingface/transformers/pull/41474
3,498,885,786
PR_kwDOCUB6oc6s4eyj
41,474
🚨 [v5] Toggle the serialization format in processors
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T12:06:14
2025-10-16T08:19:22
2025-10-16T08:19:22
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41474", "html_url": "https://github.com/huggingface/transformers/pull/41474", "diff_url": "https://github.com/huggingface/transformers/pull/41474.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41474.patch", "merged_at": "2025-10-16T08:19:22" }
# What does this PR do? As per title, in v5 we will start saving in the new nested format. The loading is still supported though for old-format configs Also, we don't need an `optional_attributes` field that does not do what it is called. Since it is only `chat template`, let's just manually pop and set it. For audio tokenizers, it'd be nice to refactor a bit, I think we could try to move it in `self.attributes` Fixes https://github.com/huggingface/transformers/issues/40447 by deleting ambiguously named `optional_attributes`
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41474/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41474/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41473
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41473/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41473/comments
https://api.github.com/repos/huggingface/transformers/issues/41473/events
https://github.com/huggingface/transformers/pull/41473
3,498,828,973
PR_kwDOCUB6oc6s4ScC
41,473
Set `truncation` to `False` in Qwen3Omni to avoid default truncation
{ "login": "BakerBunker", "id": 17872844, "node_id": "MDQ6VXNlcjE3ODcyODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/17872844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BakerBunker", "html_url": "https://github.com/BakerBunker", "followers_url": "https://api.github.com/users/BakerBunker/followers", "following_url": "https://api.github.com/users/BakerBunker/following{/other_user}", "gists_url": "https://api.github.com/users/BakerBunker/gists{/gist_id}", "starred_url": "https://api.github.com/users/BakerBunker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BakerBunker/subscriptions", "organizations_url": "https://api.github.com/users/BakerBunker/orgs", "repos_url": "https://api.github.com/users/BakerBunker/repos", "events_url": "https://api.github.com/users/BakerBunker/events{/privacy}", "received_events_url": "https://api.github.com/users/BakerBunker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T11:48:53
2025-10-10T09:56:04
2025-10-10T09:55:18
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41473", "html_url": "https://github.com/huggingface/transformers/pull/41473", "diff_url": "https://github.com/huggingface/transformers/pull/41473.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41473.patch", "merged_at": "2025-10-10T09:55:18" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes https://github.com/QwenLM/Qwen3-Omni/issues/40 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41473/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41473/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41472
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41472/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41472/comments
https://api.github.com/repos/huggingface/transformers/issues/41472/events
https://github.com/huggingface/transformers/issues/41472
3,498,589,872
I_kwDOCUB6oc7QiD6w
41,472
Python 3.13.8 breaks DeBERTa-v2 and SEW-D model loading
{ "login": "albertvillanova", "id": 8515462, "node_id": "MDQ6VXNlcjg1MTU0NjI=", "avatar_url": "https://avatars.githubusercontent.com/u/8515462?v=4", "gravatar_id": "", "url": "https://api.github.com/users/albertvillanova", "html_url": "https://github.com/albertvillanova", "followers_url": "https://api.github.com/users/albertvillanova/followers", "following_url": "https://api.github.com/users/albertvillanova/following{/other_user}", "gists_url": "https://api.github.com/users/albertvillanova/gists{/gist_id}", "starred_url": "https://api.github.com/users/albertvillanova/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/albertvillanova/subscriptions", "organizations_url": "https://api.github.com/users/albertvillanova/orgs", "repos_url": "https://api.github.com/users/albertvillanova/repos", "events_url": "https://api.github.com/users/albertvillanova/events{/privacy}", "received_events_url": "https://api.github.com/users/albertvillanova/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-09T10:34:40
2025-10-16T08:10:10
2025-10-16T08:10:10
MEMBER
null
null
null
null
Python 3.13.8 breaks DeBERTa-v2 and SEW-D model loading. ### Describe the bug Python 3.13.8 contains a regression in the `inspect` module that breaks PyTorch JIT script compilation when comments appear between decorators and function definitions. This makes DeBERTa-v2 and SEW-D models completely unusable on Python 3.13.8. ### Reproduction ```python # Python 3.13.8 from transformers import AutoModel # This fails with IndentationError model = AutoModel.from_pretrained("microsoft/deberta-v2-xlarge") ``` Error: ```python IndentationError: expected an indented block after function definition on line 3 ``` ### Root Cause The issue is in these files: - transformers/models/deberta_v2/modeling_deberta_v2.py (lines 105-108, 111-114, 117-120) - transformers/models/sew_d/modeling_sew_d.py (similar pattern) Both use: ```python @torch.jit.script # Copied from transformers.models.deberta.modeling_deberta.xxx def function_name(...): ... ``` Python 3.13.8's `inspect.BlockFinder` incorrectly treats the comment as the start of a lambda/generator expression, truncating the source before the function body, which then fails to parse. - ✅ Works in Python 3.13.7 and earlier - ❌ Broken in Python 3.13.8 - 🔄 Expected to be fixed in Python 3.13.9 ### Upstream Bug This is a Python bug tracked at: - https://github.com/python/cpython/issues/139783 ### Impact Affected: - ❌ All DeBERTa-v2 models cannot be loaded - ❌ All SEW-D models cannot be loaded - ❌ Any downstream library using these models (e.g., trl, llm-blender) Not Affected: - ✅ Other model architectures work fine - ✅ Config loading works for all models ### Workaround - For users: Use Python 3.13.7 or earlier, or wait for Python 3.13.9 - For transformers maintainers (optional): Consider removing the comment between decorator and function: move comment above the decorator ```python # Copied from transformers.models.deberta.modeling_deberta.xxx @torch.jit.script def function_name(...): ... 
``` ### Environment - Python: 3.13.8 - transformers version: 4.57.0 - PyTorch: 2.8.0 ### Additional Context This issue was discovered while using trl library's PairRMJudge, which uses llm-blender, which loads DeBERTa models via AutoModel.from_pretrained(). Related discussions: - TRL issue: - https://github.com/huggingface/trl/issues/4239 - CPython bug: - https://github.com/python/cpython/issues/139783
{ "login": "albertvillanova", "id": 8515462, "node_id": "MDQ6VXNlcjg1MTU0NjI=", "avatar_url": "https://avatars.githubusercontent.com/u/8515462?v=4", "gravatar_id": "", "url": "https://api.github.com/users/albertvillanova", "html_url": "https://github.com/albertvillanova", "followers_url": "https://api.github.com/users/albertvillanova/followers", "following_url": "https://api.github.com/users/albertvillanova/following{/other_user}", "gists_url": "https://api.github.com/users/albertvillanova/gists{/gist_id}", "starred_url": "https://api.github.com/users/albertvillanova/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/albertvillanova/subscriptions", "organizations_url": "https://api.github.com/users/albertvillanova/orgs", "repos_url": "https://api.github.com/users/albertvillanova/repos", "events_url": "https://api.github.com/users/albertvillanova/events{/privacy}", "received_events_url": "https://api.github.com/users/albertvillanova/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41472/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41472/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41471
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41471/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41471/comments
https://api.github.com/repos/huggingface/transformers/issues/41471/events
https://github.com/huggingface/transformers/pull/41471
3,498,546,564
PR_kwDOCUB6oc6s3Ugt
41,471
Update GLM-4.6 doc
{ "login": "zRzRzRzRzRzRzR", "id": 93239683, "node_id": "U_kgDOBY65gw", "avatar_url": "https://avatars.githubusercontent.com/u/93239683?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zRzRzRzRzRzRzR", "html_url": "https://github.com/zRzRzRzRzRzRzR", "followers_url": "https://api.github.com/users/zRzRzRzRzRzRzR/followers", "following_url": "https://api.github.com/users/zRzRzRzRzRzRzR/following{/other_user}", "gists_url": "https://api.github.com/users/zRzRzRzRzRzRzR/gists{/gist_id}", "starred_url": "https://api.github.com/users/zRzRzRzRzRzRzR/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zRzRzRzRzRzRzR/subscriptions", "organizations_url": "https://api.github.com/users/zRzRzRzRzRzRzR/orgs", "repos_url": "https://api.github.com/users/zRzRzRzRzRzRzR/repos", "events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/events{/privacy}", "received_events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T10:22:19
2025-10-11T05:35:07
2025-10-09T16:18:05
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41471", "html_url": "https://github.com/huggingface/transformers/pull/41471", "diff_url": "https://github.com/huggingface/transformers/pull/41471.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41471.patch", "merged_at": "2025-10-09T16:18:05" }
Updates the short model description and the technical blog link for GLM-4.6.
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41471/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41471/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41470
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41470/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41470/comments
https://api.github.com/repos/huggingface/transformers/issues/41470/events
https://github.com/huggingface/transformers/pull/41470
3,498,394,828
PR_kwDOCUB6oc6s2zkZ
41,470
[kernels] Cleanup deta kernel
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T09:38:55
2025-10-09T11:17:44
2025-10-09T11:17:42
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41470", "html_url": "https://github.com/huggingface/transformers/pull/41470", "diff_url": "https://github.com/huggingface/transformers/pull/41470.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41470.patch", "merged_at": "2025-10-09T11:17:42" }
# What does this PR do? Cleans up the `deta` kernel, since it is the same as the `deformable_detr` kernel cleaned up in https://github.com/huggingface/transformers/pull/36853, and makes the necessary modeling changes (even though the model is deprecated).
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41470/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41470/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41469
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41469/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41469/comments
https://api.github.com/repos/huggingface/transformers/issues/41469/events
https://github.com/huggingface/transformers/pull/41469
3,498,372,645
PR_kwDOCUB6oc6s2usV
41,469
Remove check_quantized_param
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T09:32:42
2025-10-18T00:43:23
2025-10-18T00:43:17
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41469", "html_url": "https://github.com/huggingface/transformers/pull/41469", "diff_url": "https://github.com/huggingface/transformers/pull/41469.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41469.patch", "merged_at": null }
# What does this PR do? As the title says.
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41469/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41469/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41468
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41468/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41468/comments
https://api.github.com/repos/huggingface/transformers/issues/41468/events
https://github.com/huggingface/transformers/pull/41468
3,498,339,171
PR_kwDOCUB6oc6s2net
41,468
Remove KERAS_NLP_IMPORT_ERROR
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T09:22:23
2025-10-09T12:01:43
2025-10-09T11:58:31
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41468", "html_url": "https://github.com/huggingface/transformers/pull/41468", "diff_url": "https://github.com/huggingface/transformers/pull/41468.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41468.patch", "merged_at": "2025-10-09T11:58:31" }
# What does this PR do? Removes `KERAS_NLP_IMPORT_ERROR`, which is no longer used anywhere.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41468/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41468/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41467
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41467/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41467/comments
https://api.github.com/repos/huggingface/transformers/issues/41467/events
https://github.com/huggingface/transformers/pull/41467
3,498,332,026
PR_kwDOCUB6oc6s2l8H
41,467
Add check for kernel config
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T09:20:01
2025-10-16T13:01:50
2025-10-16T13:01:50
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41467", "html_url": "https://github.com/huggingface/transformers/pull/41467", "diff_url": "https://github.com/huggingface/transformers/pull/41467.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41467.patch", "merged_at": null }
# What does this PR do? Raises an error when `kernel_config` is provided without explicitly setting `use_kernels` to `True`.
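The guard this PR describes could look roughly like the sketch below. The parameter names `kernel_config` and `use_kernels` come from the PR description; the surrounding function and its signature are hypothetical, not the actual transformers code path.

```python
def validate_kernel_args(use_kernels: bool, kernel_config=None):
    """Reject a kernel_config that would otherwise be silently ignored.

    Sketch of the check described in the PR: a `kernel_config` only takes
    effect when `use_kernels=True`, so passing one without the flag is
    almost certainly a user error and should fail loudly.
    """
    if kernel_config is not None and not use_kernels:
        raise ValueError(
            "`kernel_config` was provided but `use_kernels` is False; "
            "set `use_kernels=True` to apply the kernel configuration."
        )
```

Failing fast here is preferable to a warning, since a silently dropped config can change performance without any visible signal.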
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41467/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41467/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41466
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41466/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41466/comments
https://api.github.com/repos/huggingface/transformers/issues/41466/events
https://github.com/huggingface/transformers/pull/41466
3,498,303,181
PR_kwDOCUB6oc6s2f5O
41,466
Try to remove `pickle` - `BloomTokenizerFast`
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T09:10:39
2025-10-10T08:52:53
2025-10-10T08:52:51
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41466", "html_url": "https://github.com/huggingface/transformers/pull/41466", "diff_url": "https://github.com/huggingface/transformers/pull/41466.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41466.patch", "merged_at": "2025-10-10T08:52:51" }
# What does this PR do? I can see the effect as the value is changed as expected from ``` (Pdb) self.backend_tokenizer.pre_tokenizer Sequence(pretokenizers=[Split(pattern=Regex(" ?[^(\s|[.,!?…。,、।۔،])]+"), behavior=Isolated, invert=False), ByteLevel(add_prefix_space=False, trim_offsets=True, use_regex=False)]) ``` to ``` (Pdb) self.backend_tokenizer.pre_tokenizer Sequence(pretokenizers=[Split(pattern=Regex(" ?[^(\s|[.,!?…。,、।۔،])]+"), behavior=Isolated, invert=False), ByteLevel(add_prefix_space=True, trim_offsets=True, use_regex=False)]) ```
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41466/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41466/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41465
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41465/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41465/comments
https://api.github.com/repos/huggingface/transformers/issues/41465/events
https://github.com/huggingface/transformers/issues/41465
3,498,216,107
I_kwDOCUB6oc7Qgoqr
41,465
Support s_aux in GPT-OSS
{ "login": "LoserCheems", "id": 124847097, "node_id": "U_kgDOB3ED-Q", "avatar_url": "https://avatars.githubusercontent.com/u/124847097?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LoserCheems", "html_url": "https://github.com/LoserCheems", "followers_url": "https://api.github.com/users/LoserCheems/followers", "following_url": "https://api.github.com/users/LoserCheems/following{/other_user}", "gists_url": "https://api.github.com/users/LoserCheems/gists{/gist_id}", "starred_url": "https://api.github.com/users/LoserCheems/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LoserCheems/subscriptions", "organizations_url": "https://api.github.com/users/LoserCheems/orgs", "repos_url": "https://api.github.com/users/LoserCheems/repos", "events_url": "https://api.github.com/users/LoserCheems/events{/privacy}", "received_events_url": "https://api.github.com/users/LoserCheems/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-10-09T08:47:28
2025-10-09T14:51:11
null
CONTRIBUTOR
null
null
null
null
### Feature request Flash Attention currently can’t consume the `s_aux`/learnable bias head that many architectures (e.g. GPT-OSS) rely on—see the open discussion in [Dao-AILab/flash-attention#1797](https://github.com/Dao-AILab/flash-attention/issues/1797). We’d like Transformers to expose an optional backend hook for [Flash-DMATTN](https://github.com/SmallDoges/flash-dmattn), which already supports the same Flash-Attention-style API plus learnable bias tensors out of the box. Integrating Flash-DMATTN as a selectable backend would unblock models that need `s_aux` today while keeping the existing Flash Attention pathways unchanged for users who don’t opt in. ### Motivation Right now, Flash Attention is the default fast path for many HF models, but the missing `s_aux` handling prevents us from enabling learnable-bias features (attention sinks, trainable mask bias, flash-decoding helpers) in production. Downstream projects either fall back to slower SDPA or maintain private forks. Flash-DMATTN already ships kernels with full bias gradients and has parity with Flash Attention on the rest of the API surface, so wiring it in would immediately restore feature coverage without waiting on upstream CUDA updates. This request is prompted by the stalled upstream issue above and by our own deployments that need learnable bias with long-context inference. ### Your contribution We can provide a PR that: - Adds a new optional backend selector (similar to `flash_attn_2` / `flash_attn_3`) that dispatches to Flash-DMATTN when installed. - Introduces lightweight capability checks so models that request learnable bias automatically choose the compatible backend. - Supplies documentation snippets and equivalence benchmarks (matching the Flash-DMATTN suite) to ease review.
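The capability-check dispatch proposed in the contribution section could be sketched as below. Every name here (`BACKEND_CAPS`, `supports_s_aux`, the backend strings) is illustrative only; this is not existing transformers API, just a shape the proposed selector might take.

```python
# Hypothetical capability table: which attention backends can consume an
# `s_aux` / learnable-bias tensor. Entries and flags are illustrative.
BACKEND_CAPS = {
    "flash_attention_2": {"supports_s_aux": False},
    "flash_attention_3": {"supports_s_aux": False},
    "flash_dmattn": {"supports_s_aux": True},
    "sdpa": {"supports_s_aux": True},
}


def select_attention_backend(requested: str, needs_s_aux: bool) -> str:
    """Pick an attention backend, falling back when `s_aux` is required.

    If the requested backend cannot consume the learnable bias, return a
    compatible backend instead of silently dropping the bias term.
    """
    if not needs_s_aux or BACKEND_CAPS[requested]["supports_s_aux"]:
        return requested
    for name, caps in BACKEND_CAPS.items():
        if caps["supports_s_aux"]:
            return name
    raise RuntimeError("no attention backend supports s_aux")
```

The key design point is that models declaring a learnable-bias requirement never reach an incompatible kernel, which matches the issue's "lightweight capability checks" bullet.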
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41465/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41465/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41464
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41464/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41464/comments
https://api.github.com/repos/huggingface/transformers/issues/41464/events
https://github.com/huggingface/transformers/pull/41464
3,497,862,133
PR_kwDOCUB6oc6s1AWK
41,464
Fix auto model configuration for encoder of perceptionlm
{ "login": "fschlatt", "id": 23191892, "node_id": "MDQ6VXNlcjIzMTkxODky", "avatar_url": "https://avatars.githubusercontent.com/u/23191892?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fschlatt", "html_url": "https://github.com/fschlatt", "followers_url": "https://api.github.com/users/fschlatt/followers", "following_url": "https://api.github.com/users/fschlatt/following{/other_user}", "gists_url": "https://api.github.com/users/fschlatt/gists{/gist_id}", "starred_url": "https://api.github.com/users/fschlatt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/fschlatt/subscriptions", "organizations_url": "https://api.github.com/users/fschlatt/orgs", "repos_url": "https://api.github.com/users/fschlatt/repos", "events_url": "https://api.github.com/users/fschlatt/events{/privacy}", "received_events_url": "https://api.github.com/users/fschlatt/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-09T06:50:18
2025-10-09T12:08:04
2025-10-09T12:08:03
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41464", "html_url": "https://github.com/huggingface/transformers/pull/41464", "diff_url": "https://github.com/huggingface/transformers/pull/41464.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41464.patch", "merged_at": "2025-10-09T12:08:03" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes #41387 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @ArthurZucker @Cyrilvallez
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41464/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41464/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41463
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41463/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41463/comments
https://api.github.com/repos/huggingface/transformers/issues/41463/events
https://github.com/huggingface/transformers/issues/41463
3,497,524,185
I_kwDOCUB6oc7Qd_vZ
41,463
Pop catch kwarg in run_hp_search_optuna() for flexible Optuna error handling
{ "login": "nicha-api", "id": 115199576, "node_id": "U_kgDOBt3OWA", "avatar_url": "https://avatars.githubusercontent.com/u/115199576?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nicha-api", "html_url": "https://github.com/nicha-api", "followers_url": "https://api.github.com/users/nicha-api/followers", "following_url": "https://api.github.com/users/nicha-api/following{/other_user}", "gists_url": "https://api.github.com/users/nicha-api/gists{/gist_id}", "starred_url": "https://api.github.com/users/nicha-api/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nicha-api/subscriptions", "organizations_url": "https://api.github.com/users/nicha-api/orgs", "repos_url": "https://api.github.com/users/nicha-api/repos", "events_url": "https://api.github.com/users/nicha-api/events{/privacy}", "received_events_url": "https://api.github.com/users/nicha-api/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
closed
false
null
[]
null
[]
2025-10-09T03:39:46
2025-10-14T14:03:25
2025-10-14T14:03:25
CONTRIBUTOR
null
null
null
null
### Feature request Allow `catch` kwarg passthrough in `Trainer.hyperparameter_search(backend="optuna")` in order to expose Optuna’s `Study.optimize(..., catch=...)` argument, by popping the `catch` kwarg before calling Optuna's `study.optimize()`. ### Motivation The current Hugging Face Transformers integration with Optuna does not expose Optuna’s `Study.optimize(..., catch=...)` argument. As a result, when a trial raises an exception (e.g., CUDA OOM, data mismatch, tokenizer failure), the entire hyperparameter search stops prematurely instead of marking the trial as failed or pruned. Adding support for a `catch` parameter, forwarded to Optuna’s backend, would allow users to continue tuning even when individual trials encounter errors. ### Your contribution [PR#41496](https://github.com/huggingface/transformers/pull/41496)
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41463/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41463/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41462
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41462/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41462/comments
https://api.github.com/repos/huggingface/transformers/issues/41462/events
https://github.com/huggingface/transformers/issues/41462
3,497,371,098
I_kwDOCUB6oc7QdaXa
41,462
Support encoder text classification for sequence to sequence models like BART and T5
{ "login": "cbhyphen", "id": 12734117, "node_id": "MDQ6VXNlcjEyNzM0MTE3", "avatar_url": "https://avatars.githubusercontent.com/u/12734117?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cbhyphen", "html_url": "https://github.com/cbhyphen", "followers_url": "https://api.github.com/users/cbhyphen/followers", "following_url": "https://api.github.com/users/cbhyphen/following{/other_user}", "gists_url": "https://api.github.com/users/cbhyphen/gists{/gist_id}", "starred_url": "https://api.github.com/users/cbhyphen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cbhyphen/subscriptions", "organizations_url": "https://api.github.com/users/cbhyphen/orgs", "repos_url": "https://api.github.com/users/cbhyphen/repos", "events_url": "https://api.github.com/users/cbhyphen/events{/privacy}", "received_events_url": "https://api.github.com/users/cbhyphen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-10-09T01:59:49
2025-10-22T20:18:38
null
NONE
null
null
null
null
### Feature request

Add support for encoder-only text classification for models like BART and T5.

### Motivation

The current sequence classifiers for BART and T5 run the full encoder-decoder: the decoder is used to produce the sentence representation for the classification head. The following paper shows that mean pooling of the encoder's final hidden states may be a better option (at least for T5): https://arxiv.org/abs/2108.08877. In addition, dropping the decoder would reduce the size of the model and speed up both fine-tuning and inference.

### Your contribution

PR submission
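The proposed pooling can be sketched in a few lines (a minimal NumPy sketch with assumed shapes, not the actual T5/BART classification classes): mean-pool the encoder's final hidden states over non-padding tokens, then apply a linear classification head, with no decoder involved.

```python
import numpy as np

def mean_pool_classify(hidden_states, attention_mask, W, b):
    """Mean-pool encoder hidden states over non-padding tokens, then classify.

    hidden_states: (batch, seq_len, hidden); attention_mask: (batch, seq_len);
    W: (hidden, num_labels); b: (num_labels,). All names are illustrative.
    """
    mask = attention_mask[..., None].astype(hidden_states.dtype)
    summed = (hidden_states * mask).sum(axis=1)      # (batch, hidden)
    counts = mask.sum(axis=1).clip(min=1.0)          # avoid divide-by-zero
    pooled = summed / counts                         # masked mean, (batch, hidden)
    return pooled @ W + b                            # logits, (batch, num_labels)

rng = np.random.default_rng(0)
h = rng.standard_normal((2, 4, 8))                   # fake encoder outputs
m = np.array([[1, 1, 1, 0], [1, 1, 0, 0]])           # second sequence is padded
W = rng.standard_normal((8, 3))
b = np.zeros(3)
logits = mean_pool_classify(h, m, W, b)              # shape (2, 3)
```

The masking matters: padded positions must be excluded from the mean, otherwise the pooled representation depends on sequence length.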
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41462/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41462/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41461
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41461/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41461/comments
https://api.github.com/repos/huggingface/transformers/issues/41461/events
https://github.com/huggingface/transformers/pull/41461
3,496,894,449
PR_kwDOCUB6oc6sx1Rt
41,461
Fix Marian ONNX export to match PyTorch greedy decoding
{ "login": "prajeeta15", "id": 96904203, "node_id": "U_kgDOBcakCw", "avatar_url": "https://avatars.githubusercontent.com/u/96904203?v=4", "gravatar_id": "", "url": "https://api.github.com/users/prajeeta15", "html_url": "https://github.com/prajeeta15", "followers_url": "https://api.github.com/users/prajeeta15/followers", "following_url": "https://api.github.com/users/prajeeta15/following{/other_user}", "gists_url": "https://api.github.com/users/prajeeta15/gists{/gist_id}", "starred_url": "https://api.github.com/users/prajeeta15/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/prajeeta15/subscriptions", "organizations_url": "https://api.github.com/users/prajeeta15/orgs", "repos_url": "https://api.github.com/users/prajeeta15/repos", "events_url": "https://api.github.com/users/prajeeta15/events{/privacy}", "received_events_url": "https://api.github.com/users/prajeeta15/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 9258341780, "node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop", "name": "Code agent slop", "color": "C59579", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-10-08T21:04:45
2025-10-09T11:43:05
2025-10-09T11:43:05
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41461", "html_url": "https://github.com/huggingface/transformers/pull/41461", "diff_url": "https://github.com/huggingface/transformers/pull/41461.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41461.patch", "merged_at": null }
Issue #41430

Summary: This PR resolves the mismatch between Marian ONNX decoding and Hugging Face `generate()`. ONNX outputs now match PyTorch token-by-token for greedy decoding.

What changed:
- Added `MarianDecoderOnnxWrapper` (decoder-only) to:
  1. Pass `encoder_hidden_states` and `encoder_attention_mask` at every step (cross-attention enabled)
  2. Apply `lm_head` + `final_logits_bias` to the logits (required by Marian)
  3. Provide a clean, ONNX-friendly forward method for export
- Added a self-contained ONNX parity test (`tests/onnx/marian_onnx_test.py`) that:
  1. Exports decoder-only ONNX with dynamic axes
  2. Starts decoding from `decoder_start_token_id`
  3. Greedily decodes by feeding the full decoded sequence step-by-step (no cache reuse)
  4. Asserts token-by-token equality between ONNX and PyTorch outputs

Why this (might) work:
- Marian's learned bias and cross-attention are required to replicate greedy decoding exactly.
- The test mirrors `generate()`'s path, ensuring ONNX produces identical tokens.

Impact:
- No changes to core model behavior.
- The wrapper is opt-in and only used for ONNX export.

Additional files / examples:
- `examples/onnx/run_local_marian_decoder_test.py`: a standalone script to test the Marian ONNX decoder locally with any input sentence. It demonstrates step-by-step decoding and token-by-token parity with PyTorch `generate()` output.
- `onnx/run_local_decoder_test_mbart.py`: same as above but for MBart, which had similar issues.

Resolves the reported Marian ONNX mismatch.
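The parity test's greedy loop (feed the full decoded sequence each step, take the argmax over logits plus the final bias, no cache) can be sketched with a toy decoder standing in for the ONNX session. Everything here is a hypothetical illustration; the real test runs the exported Marian decoder instead of `toy_decoder`.

```python
import numpy as np

VOCAB = 5

def toy_decoder(token_ids, final_logits_bias):
    # Deterministic fake logits: prefers (last token + 1) % VOCAB,
    # mimicking "logits = lm_head(hidden) + final_logits_bias" in Marian.
    logits = np.zeros(VOCAB)
    logits[(token_ids[-1] + 1) % VOCAB] = 1.0
    return logits + final_logits_bias

def greedy_decode(start_id, steps, bias):
    seq = [start_id]                       # decoder_start_token_id
    for _ in range(steps):
        logits = toy_decoder(seq, bias)    # full sequence fed each step (no cache reuse)
        seq.append(int(np.argmax(logits))) # greedy: pick the highest-scoring token
    return seq

bias = np.zeros(VOCAB)
seq = greedy_decode(start_id=0, steps=4, bias=bias)
```

In the real test, the same loop is run once against PyTorch and once against ONNX Runtime, and the two token sequences are asserted equal element-wise.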
Before:

<img width="1556" height="217" alt="Screenshot 2025-10-05 132250" src="https://github.com/user-attachments/assets/518b98bf-8bf7-408e-972e-10de93aa8da2" />

After:

<img width="1362" height="174" alt="Screenshot 2025-10-09 015547" src="https://github.com/user-attachments/assets/95049844-5748-4cf2-979d-57ded75a8b20" />
<img width="1026" height="710" alt="Screenshot 2025-10-09 002838" src="https://github.com/user-attachments/assets/4b8e1c4a-4d75-4349-8d61-673b74ab3732" />

I am just trying to learn and thought this might work. It might have some errors, but let me know how to improve on them. Thank you :)
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41461/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41461/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41460
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41460/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41460/comments
https://api.github.com/repos/huggingface/transformers/issues/41460/events
https://github.com/huggingface/transformers/pull/41460
3,496,750,095
PR_kwDOCUB6oc6sxWJe
41,460
Apply PR #40546
{ "login": "Galigator", "id": 9264392, "node_id": "MDQ6VXNlcjkyNjQzOTI=", "avatar_url": "https://avatars.githubusercontent.com/u/9264392?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Galigator", "html_url": "https://github.com/Galigator", "followers_url": "https://api.github.com/users/Galigator/followers", "following_url": "https://api.github.com/users/Galigator/following{/other_user}", "gists_url": "https://api.github.com/users/Galigator/gists{/gist_id}", "starred_url": "https://api.github.com/users/Galigator/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Galigator/subscriptions", "organizations_url": "https://api.github.com/users/Galigator/orgs", "repos_url": "https://api.github.com/users/Galigator/repos", "events_url": "https://api.github.com/users/Galigator/events{/privacy}", "received_events_url": "https://api.github.com/users/Galigator/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T20:12:52
2025-10-08T20:15:26
2025-10-08T20:15:26
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41460", "html_url": "https://github.com/huggingface/transformers/pull/41460", "diff_url": "https://github.com/huggingface/transformers/pull/41460.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41460.patch", "merged_at": null }
# What does this PR do?

<!--
Congratulations! You've made it this far! You're not quite done yet though.

Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.

Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.

Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->

<!-- Remove if not applicable -->

Fixes # (issue)

## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
<!--
Your PR will be replied to more quickly if you can figure out the right person to tag with @

If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.

Models:
- text models: @ArthurZucker @Cyrilvallez
- vision models: @yonigozlan @molbap
- audio models: @eustlb @ebezzam @vasqu
- multimodal models: @zucchini-nlp
- graph models: @clefourrier

Library:
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- continuous batching: @remi-or @ArthurZucker @McPatate
- pipelines: @Rocketknight1
- tokenizers: @ArthurZucker and @itazap
- trainer: @zach-huggingface @SunMarc
- attention: @vasqu @ArthurZucker @CyrilVallez
- model loading (from pretrained, etc): @CyrilVallez
- distributed: @3outeille @ArthurZucker @S1ro1
- CIs: @ydshieh

Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
- kernels: @MekkCyber @drbh
- peft: @BenjaminBossan @githubnemo

Devices/Backends:
- AMD ROCm: @ivarflakstad
- Intel XPU: @IlyasMoutawwakil
- Ascend NPU: @ivarflakstad

Documentation: @stevhliu

Research projects are not maintained and should be taken as is.
-->
{ "login": "Galigator", "id": 9264392, "node_id": "MDQ6VXNlcjkyNjQzOTI=", "avatar_url": "https://avatars.githubusercontent.com/u/9264392?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Galigator", "html_url": "https://github.com/Galigator", "followers_url": "https://api.github.com/users/Galigator/followers", "following_url": "https://api.github.com/users/Galigator/following{/other_user}", "gists_url": "https://api.github.com/users/Galigator/gists{/gist_id}", "starred_url": "https://api.github.com/users/Galigator/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Galigator/subscriptions", "organizations_url": "https://api.github.com/users/Galigator/orgs", "repos_url": "https://api.github.com/users/Galigator/repos", "events_url": "https://api.github.com/users/Galigator/events{/privacy}", "received_events_url": "https://api.github.com/users/Galigator/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41460/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41460/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41459
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41459/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41459/comments
https://api.github.com/repos/huggingface/transformers/issues/41459/events
https://github.com/huggingface/transformers/pull/41459
3,496,484,488
PR_kwDOCUB6oc6swctc
41,459
[docs] Standardize model docs
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-08T18:36:51
2025-10-28T08:19:19
null
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41459", "html_url": "https://github.com/huggingface/transformers/pull/41459", "diff_url": "https://github.com/huggingface/transformers/pull/41459.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41459.patch", "merged_at": null }
- standardizes all model docs to include a generated summary of the abstract/blog, a `Pipeline` and `AutoModel` or `ModelForTask` example, and usage tips
- updates `add_dates.py` to also add contributor names at the top
- removes PyTorch badges because we are PyTorch-first now
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41459/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41459/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41458
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41458/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41458/comments
https://api.github.com/repos/huggingface/transformers/issues/41458/events
https://github.com/huggingface/transformers/pull/41458
3,496,389,947
PR_kwDOCUB6oc6swIst
41,458
Adding ScatterMoE kernel support for Granite models.
{ "login": "shawntan", "id": 119799, "node_id": "MDQ6VXNlcjExOTc5OQ==", "avatar_url": "https://avatars.githubusercontent.com/u/119799?v=4", "gravatar_id": "", "url": "https://api.github.com/users/shawntan", "html_url": "https://github.com/shawntan", "followers_url": "https://api.github.com/users/shawntan/followers", "following_url": "https://api.github.com/users/shawntan/following{/other_user}", "gists_url": "https://api.github.com/users/shawntan/gists{/gist_id}", "starred_url": "https://api.github.com/users/shawntan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/shawntan/subscriptions", "organizations_url": "https://api.github.com/users/shawntan/orgs", "repos_url": "https://api.github.com/users/shawntan/repos", "events_url": "https://api.github.com/users/shawntan/events{/privacy}", "received_events_url": "https://api.github.com/users/shawntan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-08T18:05:32
2025-10-20T17:34:31
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41458", "html_url": "https://github.com/huggingface/transformers/pull/41458", "diff_url": "https://github.com/huggingface/transformers/pull/41458.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41458.patch", "merged_at": null }
# What does this PR do?

Adds ScatterMoE kernel support for Granite MoE models. Started in #40365 but has significantly deviated in approach, so starting a new pull request.

## Before submitting
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

@MekkCyber already started to provide some comments in #40365.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41458/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41458/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41457
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41457/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41457/comments
https://api.github.com/repos/huggingface/transformers/issues/41457/events
https://github.com/huggingface/transformers/pull/41457
3,496,373,582
PR_kwDOCUB6oc6swFW0
41,457
Fix doc
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T17:59:30
2025-10-08T18:15:58
2025-10-08T18:13:21
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41457", "html_url": "https://github.com/huggingface/transformers/pull/41457", "diff_url": "https://github.com/huggingface/transformers/pull/41457.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41457.patch", "merged_at": "2025-10-08T18:13:21" }
# What does this PR do?
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41457/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41457/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41456
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41456/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41456/comments
https://api.github.com/repos/huggingface/transformers/issues/41456/events
https://github.com/huggingface/transformers/pull/41456
3,496,115,546
PR_kwDOCUB6oc6svOOr
41,456
Add vision contribution guide
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T16:24:56
2025-10-20T16:56:50
2025-10-20T16:56:48
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41456", "html_url": "https://github.com/huggingface/transformers/pull/41456", "diff_url": "https://github.com/huggingface/transformers/pull/41456.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41456.patch", "merged_at": "2025-10-20T16:56:48" }
# What does this PR do?

This updates the vision contribution guide with a checklist, to help external contributors merge their work faster and to reduce the load on maintainers. The checklist is very minimal and will need to be completed with specific model implementation guides following the new API.
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41456/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41456/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41455
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41455/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41455/comments
https://api.github.com/repos/huggingface/transformers/issues/41455/events
https://github.com/huggingface/transformers/issues/41455
3,495,880,613
I_kwDOCUB6oc7QXuel
41,455
RuntimeError with dtype=torch.float16 and dtype=torch.bfloat16 in HQQ 4-bit, 8-bit quantized Granite4 models on RTX3050 TI
{ "login": "prathamesh-chavan-22", "id": 158864599, "node_id": "U_kgDOCXgU1w", "avatar_url": "https://avatars.githubusercontent.com/u/158864599?v=4", "gravatar_id": "", "url": "https://api.github.com/users/prathamesh-chavan-22", "html_url": "https://github.com/prathamesh-chavan-22", "followers_url": "https://api.github.com/users/prathamesh-chavan-22/followers", "following_url": "https://api.github.com/users/prathamesh-chavan-22/following{/other_user}", "gists_url": "https://api.github.com/users/prathamesh-chavan-22/gists{/gist_id}", "starred_url": "https://api.github.com/users/prathamesh-chavan-22/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/prathamesh-chavan-22/subscriptions", "organizations_url": "https://api.github.com/users/prathamesh-chavan-22/orgs", "repos_url": "https://api.github.com/users/prathamesh-chavan-22/repos", "events_url": "https://api.github.com/users/prathamesh-chavan-22/events{/privacy}", "received_events_url": "https://api.github.com/users/prathamesh-chavan-22/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-08T15:16:56
2025-10-17T06:24:13
null
CONTRIBUTOR
null
null
null
null
### Environment

- Transformers version: 4.57.0
- PyTorch version: 2.8.0+cu128
- Accelerate version: 1.10.1
- GPU: Nvidia RTX 3050 Ti, 4 GB VRAM

### Steps to reproduce

1. Load the HQQ-quantized Granite model using `dtype=torch.float16` or `dtype=torch.bfloat16`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, HqqConfig

# 1️⃣ Quantization config
quant_config = HqqConfig(nbits=8, group_size=64)

# 2️⃣ Load model
model_name = "ibm-granite/granite-4.0-h-micro"
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    dtype=torch.bfloat16,
    device_map="cuda",  # "auto" or "cuda" if single GPU
    quantization_config=quant_config,
)

# 3️⃣ Load tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name)

# 4️⃣ Prepare prompt
prompt = "Translate the following English sentence to French: Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

# 5️⃣ Generate output
with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,   # optional: enable sampling for more diverse output
        temperature=0.7,  # optional: adjust creativity
        top_p=0.9,        # optional: nucleus sampling
    )

# 6️⃣ Decode output
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```

### Observed behavior

`RuntimeError: expected mat1 and mat2 to have the same dtype (struct c10::BFloat16 != float)`

### Expected behavior

The model should generate outputs without dtype errors.

### Workaround

1. Cast inputs to float32 before generation:

```python
inputs = {k: v.float() for k, v in inputs.items()}
```

2. Use `dtype=torch.float32`, or don't pass `dtype` at all.

### Notes

1. The docs show `dtype=torch.float16`.
2. The error occurs in some PyTorch/Accelerate setups even on GPUs that support float16/bfloat16.

@MekkCyber
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41455/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41455/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41454
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41454/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41454/comments
https://api.github.com/repos/huggingface/transformers/issues/41454/events
https://github.com/huggingface/transformers/issues/41454
3,495,742,177
I_kwDOCUB6oc7QXMrh
41,454
Huggingface wav2vec2-Conformer conversion script fails for Fairseq models
{ "login": "Getmany1", "id": 26164540, "node_id": "MDQ6VXNlcjI2MTY0NTQw", "avatar_url": "https://avatars.githubusercontent.com/u/26164540?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Getmany1", "html_url": "https://github.com/Getmany1", "followers_url": "https://api.github.com/users/Getmany1/followers", "following_url": "https://api.github.com/users/Getmany1/following{/other_user}", "gists_url": "https://api.github.com/users/Getmany1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Getmany1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Getmany1/subscriptions", "organizations_url": "https://api.github.com/users/Getmany1/orgs", "repos_url": "https://api.github.com/users/Getmany1/repos", "events_url": "https://api.github.com/users/Getmany1/events{/privacy}", "received_events_url": "https://api.github.com/users/Getmany1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 6470596964, "node_id": "LA_kwDOCUB6oc8AAAABga15ZA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Audio", "name": "Audio", "color": "760453", "default": false, "description": "" } ]
open
false
null
[]
null
[]
2025-10-08T14:40:56
2025-10-13T08:37:11
null
NONE
null
null
null
null
### System Info - `transformers` version: 4.57.0.dev0 - Platform: Linux-4.18.0-553.74.1.el8_10.x86_64-x86_64-with-glibc2.28 - Python version: 3.10.14 - Huggingface_hub version: 1.0.0.rc2 - Safetensors version: 0.6.2 - Accelerate version: not installed - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator?): 2.5.1+cu124 (CUDA) - Using distributed or parallel set-up in script?: <fill in> - Using GPU in script?: No - GPU type: Tesla V100-SXM2-32GB ### Who can help? @eustlb @amyeroberts ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction The official conversion script (`transformers/models/wav2vec2_conformer/convert_wav2vec2_conformer_original_pytorch_checkpoint_to_pytorch.py`) fails to properly convert Fairseq wav2vec2-Conformer models. Note that standard Wav2Vec2 models with the Transformer encoder convert successfully - this issue is specific to models with the Conformer encoder. **Test Model:** [GetmanY1/wav2vec2-conformer-rel-pos-base-fi-ft](https://huggingface.co/GetmanY1/wav2vec2-conformer-rel-pos-base-fi-ft) (includes both original Fairseq checkpoint and converted model for reproducibility) **Fairseq Model Configuration:** ```yaml model: _name: wav2vec2 quantize_targets: true latent_temp: [2.0, 0.5, 0.999995] extractor_mode: default layer_norm_first: false final_dim: 256 dropout_input: 0.1 dropout_features: 0.1 feature_grad_mult: 0.1 encoder_embed_dim: 768 encoder_layers: 12 encoder_ffn_embed_dim: 3072 encoder_attention_heads: 12 activation_fn: gelu dropout: 0.1 attention_dropout: 0.1 activation_dropout: 0.0 encoder_layerdrop: 0.0 layer_type: conformer attn_type: espnet pos_enc_type: rel_pos ``` **Steps to reproduce the behavior** 1. 
Run the conversion script: ```bash python3 transformers/models/wav2vec2_conformer/convert_wav2vec2_conformer_original_pytorch_checkpoint_to_pytorch.py \ --pytorch_dump_folder_path="/converted" \ --checkpoint_path="/path/to/checkpoint_best.pt" \ --dict_path="/path/to/dict.ltr.txt" ``` 2. **Error 1:** `AttributeError: 'ParametrizedConv1d' object has no attribute 'weight_g'` **Fix:** Update lines 99 and 101 in `convert_wav2vec2_conformer_original_pytorch_checkpoint_to_pytorch.py`: ```python # Line 99: Change hf_pointer.weight_g.data = value # To: hf_pointer.parametrizations.weight.original0.data = value # Line 101: Change hf_pointer.weight_v.data = value # To: hf_pointer.parametrizations.weight.original1.data = value ``` (Related: [Issue #26796](https://github.com/huggingface/transformers/issues/26796), which probably wasn't fixed in this particular conversion script.) 3. **Error 2:** Shape mismatch for pos_conv weights: `ValueError: Shape of hf wav2vec2_conformer.encoder.pos_conv_embed.conv.weight_g is torch.Size([768, 48, 128]), but should be torch.Size([1, 1, 128]) for w2v_encoder.w2v_model.encoder.pos_conv.0.weight_g` **Temporary workaround:** Comment out the shape validation (lines 90-94) in `convert_wav2vec2_conformer_original_pytorch_checkpoint_to_pytorch.py`: ```python if hf_shape != value.shape: raise ValueError( f"Shape of hf {key + '.' + weight_type if weight_type is not None else ''} is {hf_shape}, but should be" f" {value.shape} for {full_name}" ) ``` This lets the script run without errors, but the saved converted model doesn't work correctly and only outputs blanks. --- **P.S. 
Common Fairseq Setup Issues:** If you encounter: `omegaconf.errors.ConfigKeyError: Key 'multi_corpus_keys' not in 'AudioFinetuningConfig'` **Fix:** Install the latest Fairseq from source: `pip install --no-build-isolation git+https://github.com/pytorch/fairseq` If you get: `TypeError: ConformerEncoder.build_encoder_layer() got an unexpected keyword argument 'layer_idx'` **Fix:** Add `**kwargs` to two methods in `fairseq/models/wav2vec/wav2vec2.py`: ```python # Line 1182: Change def build_encoder_layer(self, args): # To: def build_encoder_layer(self, args, **kwargs): # Line 1225: Change def extract_features(self, x, padding_mask=None, tgt_layer=None): # To: def extract_features(self, x, padding_mask=None, tgt_layer=None, **kwargs): ``` (The fix was proposed in https://github.com/facebookresearch/fairseq/issues/5386, but has still not been merged into the main branch) ### Expected behavior The converted HuggingFace model should produce outputs matching the Fairseq model. The original Fairseq model runs correctly. 
Here's a minimal example: ```python import torch import fairseq from datasets import load_dataset # Paths (update these) MODEL_PATH = "path/to/checkpoint_best.pt" DICT_PATH = "path/to/dict.ltr.txt" def load_dict(path): vocab = {0: '<s>', 1: '<pad>', 2: '</s>', 3: '<unk>'} with open(path) as f: for line in f: token = line.strip().split()[0] vocab[len(vocab)] = token return vocab def decode_ctc(logits, vocab): preds = logits.argmax(dim=-1)[0] blank_idx = logits.shape[-1] - 1 tokens = [] prev = None for p in preds: p = p.item() if p != blank_idx and p != prev and p >= 4: tokens.append(vocab[p]) prev = p return ''.join(tokens).replace('|', ' ').strip() device = torch.device('cuda' if torch.cuda.is_available() else 'cpu') dataset = load_dataset("google/fleurs", "fi_fi", split="test", streaming=True) sample = next(iter(dataset)) audio = sample["audio"]["array"] print(f"Reference: {sample['transcription']}") models, _, _ = fairseq.checkpoint_utils.load_model_ensemble_and_task([MODEL_PATH]) model = models[0].eval().to(device) audio_tensor = torch.from_numpy(audio).float().unsqueeze(0).to(device) with torch.no_grad(): out = model(source=audio_tensor, padding_mask=None) logits = out["encoder_out"].transpose(0, 1) if isinstance(out, dict) else out[0] vocab = load_dict(DICT_PATH) transcript = decode_ctc(logits, vocab) print(f"Fairseq transcription: {transcript}") ``` I also tried to convert a wav2vec2-conformer-base with rotary positional embeddings instead of relative embeddings, but the converted model still works incorrectly, yielding >80% WER. ### **Impact** This issue affects anyone trying to convert Fairseq wav2vec2-Conformer models to HuggingFace format, making it difficult to share these models with the broader research and developer community through HuggingFace Hub. --- I've tried to include all necessary information to verify and reproduce the issue. Please let me know if you need any additional details.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41454/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41454/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41453
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41453/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41453/comments
https://api.github.com/repos/huggingface/transformers/issues/41453/events
https://github.com/huggingface/transformers/issues/41453
3,495,732,527
I_kwDOCUB6oc7QXKUv
41,453
SmolVLM2 cannot be used quantized
{ "login": "anderlem", "id": 120205010, "node_id": "U_kgDOByou0g", "avatar_url": "https://avatars.githubusercontent.com/u/120205010?v=4", "gravatar_id": "", "url": "https://api.github.com/users/anderlem", "html_url": "https://github.com/anderlem", "followers_url": "https://api.github.com/users/anderlem/followers", "following_url": "https://api.github.com/users/anderlem/following{/other_user}", "gists_url": "https://api.github.com/users/anderlem/gists{/gist_id}", "starred_url": "https://api.github.com/users/anderlem/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/anderlem/subscriptions", "organizations_url": "https://api.github.com/users/anderlem/orgs", "repos_url": "https://api.github.com/users/anderlem/repos", "events_url": "https://api.github.com/users/anderlem/events{/privacy}", "received_events_url": "https://api.github.com/users/anderlem/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-10-08T14:38:43
2025-10-13T09:49:10
null
NONE
null
null
null
null
### System Info When loading SmolVLM2 quantized with a BitsAndBytesConfig in bf16 and training, I get `RuntimeError: Index put requires the source and destination dtypes match, got BFloat16 for the destination and Float for the source.` at ``` File "lib/python3.12/site-packages/transformers/models/smolvlm/modeling_smolvlm.py", line 648, in inputs_merger image_embeds[image_mask] = image_hidden_states[block_idx[image_mask], local_idx[image_mask], :] ``` image_hidden_states does not seem to match the dtype of image_embeds. I fixed it by inserting `image_hidden_states = image_hidden_states.to(dtype=inputs_embeds.dtype)` above that line. ### Who can help? @yonigozlan @molbap ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction ``` bnb_config: BitsAndBytesConfig = BitsAndBytesConfig( load_in_4bit=True, bnb_4bit_use_double_quant=True, bnb_4bit_quant_type="nf4", bnb_4bit_compute_dtype=torch.bfloat16 ) AutoModelForImageTextToText.from_pretrained( "HuggingFaceTB/SmolVLM2-2.2B-Instruct", quantization_config=bnb_config, torch_dtype=torch.bfloat16 ) ``` ### Expected behavior No error.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41453/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41453/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41452
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41452/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41452/comments
https://api.github.com/repos/huggingface/transformers/issues/41452/events
https://github.com/huggingface/transformers/pull/41452
3,495,622,741
PR_kwDOCUB6oc6stjPO
41,452
Update hqq.md
{ "login": "prathamesh-chavan-22", "id": 158864599, "node_id": "U_kgDOCXgU1w", "avatar_url": "https://avatars.githubusercontent.com/u/158864599?v=4", "gravatar_id": "", "url": "https://api.github.com/users/prathamesh-chavan-22", "html_url": "https://github.com/prathamesh-chavan-22", "followers_url": "https://api.github.com/users/prathamesh-chavan-22/followers", "following_url": "https://api.github.com/users/prathamesh-chavan-22/following{/other_user}", "gists_url": "https://api.github.com/users/prathamesh-chavan-22/gists{/gist_id}", "starred_url": "https://api.github.com/users/prathamesh-chavan-22/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/prathamesh-chavan-22/subscriptions", "organizations_url": "https://api.github.com/users/prathamesh-chavan-22/orgs", "repos_url": "https://api.github.com/users/prathamesh-chavan-22/repos", "events_url": "https://api.github.com/users/prathamesh-chavan-22/events{/privacy}", "received_events_url": "https://api.github.com/users/prathamesh-chavan-22/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T14:12:14
2025-10-08T14:44:57
2025-10-08T14:44:56
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41452", "html_url": "https://github.com/huggingface/transformers/pull/41452", "diff_url": "https://github.com/huggingface/transformers/pull/41452.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41452.patch", "merged_at": "2025-10-08T14:44:56" }
Fixed a typo in the documentation of HQQ, a quantization method. Documentation: @stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41452/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41452/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41451
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41451/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41451/comments
https://api.github.com/repos/huggingface/transformers/issues/41451/events
https://github.com/huggingface/transformers/pull/41451
3,495,582,298
PR_kwDOCUB6oc6stadX
41,451
Build a CI docker image with FA3
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-08T14:02:03
2025-10-09T12:43:02
null
COLLABORATOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41451", "html_url": "https://github.com/huggingface/transformers/pull/41451", "diff_url": "https://github.com/huggingface/transformers/pull/41451.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41451.patch", "merged_at": null }
# What does this PR do? Build a CI docker image with FA3. It is available as `huggingface/transformers-all-latest-gpu:fa3`. A build was triggered at [this run](https://github.com/huggingface/transformers/actions/runs/18347217144/job/52257062261). Let's see how it ends.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41451/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 1, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41451/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41450
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41450/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41450/comments
https://api.github.com/repos/huggingface/transformers/issues/41450/events
https://github.com/huggingface/transformers/pull/41450
3,495,413,243
PR_kwDOCUB6oc6ss1u4
41,450
Allow modifications of attributes in modular
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T13:17:08
2025-10-08T15:34:36
2025-10-08T15:34:35
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41450", "html_url": "https://github.com/huggingface/transformers/pull/41450", "diff_url": "https://github.com/huggingface/transformers/pull/41450.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41450.patch", "merged_at": null }
# What does this PR do? Often, attributes are not derived from model names. Modular files based on a given model then become bloated because a tiny change in an attribute requires the entire thing to be rewritten. See https://github.com/huggingface/transformers/pull/41224/files for an example. This PR introduces an attribute renamer, mapping an arbitrary attribute name to another. ```python class DINOv3ViTForImageClassification(Dinov2ForImageClassification): _attribute_renames = {"dinov2": "dinov3"} ``` On the attached PR, this expands the modeling file correctly.
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41450/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41450/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41449
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41449/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41449/comments
https://api.github.com/repos/huggingface/transformers/issues/41449/events
https://github.com/huggingface/transformers/pull/41449
3,495,305,196
PR_kwDOCUB6oc6sseW-
41,449
Fix trainer simple tests
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T12:49:08
2025-10-16T14:19:08
2025-10-15T12:09:00
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41449", "html_url": "https://github.com/huggingface/transformers/pull/41449", "diff_url": "https://github.com/huggingface/transformers/pull/41449.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41449.patch", "merged_at": "2025-10-15T12:09:00" }
# What does this PR do? This PR should fix all simple trainer tests and some DeepSpeed tests. The only remaining tests to fix are the DeepSpeed ZeRO-2 gradient accumulation tests, but it is strange that they are failing ... cc @IlyasMoutawwakil maybe you have an idea, as you worked on it for `HPU`
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41449/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41449/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41448
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41448/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41448/comments
https://api.github.com/repos/huggingface/transformers/issues/41448/events
https://github.com/huggingface/transformers/pull/41448
3,495,225,330
PR_kwDOCUB6oc6ssM9p
41,448
🚨 [v5] Rename left traces of `past_key_value` in BERT-like models
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T12:25:14
2025-10-09T14:39:01
2025-10-09T08:44:44
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41448", "html_url": "https://github.com/huggingface/transformers/pull/41448", "diff_url": "https://github.com/huggingface/transformers/pull/41448.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41448.patch", "merged_at": "2025-10-09T08:44:44" }
# What does this PR do? As per title, finalizes the work from https://github.com/huggingface/transformers/pull/41425 and renames `past_key_value` to `past_key_values` in all files without a deprecation cycle. It was a find and replace, so tests might fail 😅
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41448/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41448/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41447
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41447/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41447/comments
https://api.github.com/repos/huggingface/transformers/issues/41447/events
https://github.com/huggingface/transformers/issues/41447
3,495,127,956
I_kwDOCUB6oc7QU2uU
41,447
ImportError: cannot import name 'Qwen3VLMoeForConditionalGeneration' from 'transformers'
{ "login": "EIIvy", "id": 48089890, "node_id": "MDQ6VXNlcjQ4MDg5ODkw", "avatar_url": "https://avatars.githubusercontent.com/u/48089890?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EIIvy", "html_url": "https://github.com/EIIvy", "followers_url": "https://api.github.com/users/EIIvy/followers", "following_url": "https://api.github.com/users/EIIvy/following{/other_user}", "gists_url": "https://api.github.com/users/EIIvy/gists{/gist_id}", "starred_url": "https://api.github.com/users/EIIvy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EIIvy/subscriptions", "organizations_url": "https://api.github.com/users/EIIvy/orgs", "repos_url": "https://api.github.com/users/EIIvy/repos", "events_url": "https://api.github.com/users/EIIvy/events{/privacy}", "received_events_url": "https://api.github.com/users/EIIvy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-08T11:56:39
2025-10-15T09:24:55
null
NONE
null
null
null
null
I downloaded the latest transformers==4.57.0, and the following import fails: from transformers import Qwen3VLMoeForConditionalGeneration ImportError: cannot import name 'Qwen3VLMoeForConditionalGeneration' from 'transformers'
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41447/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41447/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41446
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41446/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41446/comments
https://api.github.com/repos/huggingface/transformers/issues/41446/events
https://github.com/huggingface/transformers/pull/41446
3,495,117,304
PR_kwDOCUB6oc6sr1fq
41,446
Enable non-streaming mode in `transformers serve`
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T11:53:19
2025-10-15T07:37:28
2025-10-15T07:37:26
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41446", "html_url": "https://github.com/huggingface/transformers/pull/41446", "diff_url": "https://github.com/huggingface/transformers/pull/41446.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41446.patch", "merged_at": "2025-10-15T07:37:26" }
Needs this to be merged first: https://github.com/huggingface/transformers/pull/41444 Tests and docs need to be added before undrafting
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41446/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41446/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41445
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41445/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41445/comments
https://api.github.com/repos/huggingface/transformers/issues/41445/events
https://github.com/huggingface/transformers/pull/41445
3,495,093,267
PR_kwDOCUB6oc6srwXq
41,445
[`from_pretrained`] Small refactor `from_pretrained`: move around unrelated stuff
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T11:45:17
2025-10-15T13:13:10
2025-10-13T14:33:32
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41445", "html_url": "https://github.com/huggingface/transformers/pull/41445", "diff_url": "https://github.com/huggingface/transformers/pull/41445.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41445.patch", "merged_at": "2025-10-13T14:33:32" }
# What does this PR do? As we are moving towards having a dynamic weight loader, we need a simple `from_pretrained` call. This should help.
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41445/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41445/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41444
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41444/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41444/comments
https://api.github.com/repos/huggingface/transformers/issues/41444/events
https://github.com/huggingface/transformers/pull/41444
3,495,048,352
PR_kwDOCUB6oc6srmov
41,444
Streaming should be handled at the request level rather than at the instance level
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T11:30:55
2025-10-10T08:24:57
2025-10-10T08:24:55
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41444", "html_url": "https://github.com/huggingface/transformers/pull/41444", "diff_url": "https://github.com/huggingface/transformers/pull/41444.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41444.patch", "merged_at": "2025-10-10T08:24:55" }
null
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41444/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41444/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41443
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41443/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41443/comments
https://api.github.com/repos/huggingface/transformers/issues/41443/events
https://github.com/huggingface/transformers/pull/41443
3,495,038,108
PR_kwDOCUB6oc6srkbY
41,443
Qwen3: fix bool attention_mask in eager_attention_forward
{ "login": "ZhengKai91", "id": 18749478, "node_id": "MDQ6VXNlcjE4NzQ5NDc4", "avatar_url": "https://avatars.githubusercontent.com/u/18749478?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ZhengKai91", "html_url": "https://github.com/ZhengKai91", "followers_url": "https://api.github.com/users/ZhengKai91/followers", "following_url": "https://api.github.com/users/ZhengKai91/following{/other_user}", "gists_url": "https://api.github.com/users/ZhengKai91/gists{/gist_id}", "starred_url": "https://api.github.com/users/ZhengKai91/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ZhengKai91/subscriptions", "organizations_url": "https://api.github.com/users/ZhengKai91/orgs", "repos_url": "https://api.github.com/users/ZhengKai91/repos", "events_url": "https://api.github.com/users/ZhengKai91/events{/privacy}", "received_events_url": "https://api.github.com/users/ZhengKai91/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-08T11:27:37
2025-10-08T12:34:23
2025-10-08T12:34:23
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41443", "html_url": "https://github.com/huggingface/transformers/pull/41443", "diff_url": "https://github.com/huggingface/transformers/pull/41443.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41443.patch", "merged_at": null }
This PR fixes a silent numerical/logic bug that occurs when a boolean attention_mask is supplied to the eager attention path. The current implementation simply adds the mask to the attention scores, which is correct for float masks that already contain −∞ values, but produces completely wrong logits when the mask is of type torch.bool. The fix converts a boolean mask to float and fills the masked-out positions with −∞ before the addition, guaranteeing that softmax will assign them zero probability. No breaking change. @vasqu @ArthurZucker @Cyrilvallez
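The fix described in the body above can be sketched as a small helper. This is a minimal illustration, not the PR's actual diff: the function name `to_additive_mask` and the convention that `True` means "attend" are assumptions for the example. It converts a boolean mask into the additive float mask that the eager attention path expects, filling masked-out positions with a very large negative value (effectively −∞ after softmax) instead of naively adding the boolean tensor to the scores.

```python
import torch

def to_additive_mask(attention_mask: torch.Tensor, dtype: torch.dtype = torch.float32) -> torch.Tensor:
    # Hypothetical helper illustrating the fix: a boolean mask
    # (assumed convention: True = attend, False = masked out) becomes an
    # additive float mask. Masked positions receive the dtype's minimum
    # value, so softmax assigns them (effectively) zero probability.
    if attention_mask.dtype == torch.bool:
        return torch.zeros_like(attention_mask, dtype=dtype).masked_fill(
            ~attention_mask, torch.finfo(dtype).min
        )
    # Float masks are assumed to already contain large negative values
    # at masked positions, so they pass through unchanged.
    return attention_mask

# Example: the last key position is masked out.
bool_mask = torch.tensor([[True, True, False]])
print(to_additive_mask(bool_mask))
```

Adding the result to the attention scores before softmax then behaves identically for bool and float masks, which is the invariant the PR restores.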
{ "login": "ZhengKai91", "id": 18749478, "node_id": "MDQ6VXNlcjE4NzQ5NDc4", "avatar_url": "https://avatars.githubusercontent.com/u/18749478?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ZhengKai91", "html_url": "https://github.com/ZhengKai91", "followers_url": "https://api.github.com/users/ZhengKai91/followers", "following_url": "https://api.github.com/users/ZhengKai91/following{/other_user}", "gists_url": "https://api.github.com/users/ZhengKai91/gists{/gist_id}", "starred_url": "https://api.github.com/users/ZhengKai91/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ZhengKai91/subscriptions", "organizations_url": "https://api.github.com/users/ZhengKai91/orgs", "repos_url": "https://api.github.com/users/ZhengKai91/repos", "events_url": "https://api.github.com/users/ZhengKai91/events{/privacy}", "received_events_url": "https://api.github.com/users/ZhengKai91/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41443/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41443/timeline
null
null
null
null
true
true